GitLab: how can I programmatically upload artifacts to the GitLab registry

I have a JFrog backup with the below format:
./repositories
./repositories/digital
./repositories/digital/admin
./repositories/digital/admin/revealjs-digital-signage
./repositories/digital/admin/revealjs-digital-signage/v1.artifactory-metadata
./repositories/digital/admin/revealjs-digital-signage/v1.artifactory-metadata/artifactory-folder.xml
./repositories/digital/admin/revealjs-digital-signage/_uploads
./repositories/digital/admin/revealjs-digital-signage/v1
./repositories/digital/admin/revealjs-digital-signage/v1/sha256__6b8971be6dd8206db197c075c2078e750466e8c8086a6781363b3b509236b127.artifactory-metadata
./repositories/digital/admin/revealjs-digital-signage/v1/sha256__6b8971be6dd8206db197c075c2078e750466e8c8086a6781363b3b509236b127.artifactory-metadata/properties.xml
./repositories/digital/admin/revealjs-digital-signage/v1/sha256__6b8971be6dd8206db197c075c2078e750466e8c8086a6781363b3b509236b127.artifactory-metadata/artifactory-file.xml
./repositories/digital/admin/revealjs-digital-signage/v1/sha256__c5ec31d8205545276d3ec1e2a8a77308aa52865dda22241b8e32d4e46daaf82a
./repositories/digital/admin/revealjs-digital-signage/v1/sha256__91cedd97621ca699948e
I need a way or a tool that can migrate the JFrog artifacts to GitLab.

You can tell the runner to upload any files or directories you need like this:
artifacts:
  paths:
    - ./repositories/
The full documentation for the artifact keyword is here: https://docs.gitlab.com/ee/ci/yaml/#artifacts
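Note that artifacts: only attaches files to a CI job; it does not push them into a package or container registry. If the goal is to get the backed-up files into GitLab's package registry programmatically, the generic packages API is one route. A minimal sketch, assuming GitLab 13.5 or later, a hypothetical project ID 123, and a personal access token in $GITLAB_TOKEN:

# Upload every file from the backup to the generic package registry.
# The package name "jfrog-backup" and the version "1.0.0" are placeholders.
find ./repositories -type f | while read -r file; do
  curl --header "PRIVATE-TOKEN: $GITLAB_TOKEN" \
       --upload-file "$file" \
       "https://gitlab.example.com/api/v4/projects/123/packages/generic/jfrog-backup/1.0.0/$(basename "$file")"
done

If the JFrog repository actually holds Docker images (the sha256__ blobs and the v1 tag directory suggest so), pulling the images and pushing them to the GitLab container registry with docker or skopeo is likely a better fit than a file-by-file upload.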

Related

Grab all artifacts from gitlab-ci dir and make them an artifact

I am trying to publish as artifacts all the .html files in a specific directory, $TOOLS_PATH/terraform_results/html, that are generated during a build job when my GitLab pipeline runs.
When this runs in the GitLab pipeline I get a warning:
Uploading artifacts... WARNING: tools/terraform_results/html/*: no matching files
I'm configuring this in my .gitlab-ci.yml file via:
artifacts:
  paths:
    - "$TOOLS_PATH/terraform_results/html/*"
What am I doing wrong?
Drop the quotes and wildcard and indent the paths list. This will zip everything in the html folder. You can use wildcards like *.html to filter filenames if needed:
artifacts:
  paths:
    - $TOOLS_PATH/terraform_results/html/
If that doesn't work, read the gitlab-ci.yml reference on artifacts:paths to make sure you didn't miss anything. For example:
Paths are relative to the project directory ($CI_PROJECT_DIR) and can’t directly link outside it.
Make sure that the tools directory is located in the root of your project directory. You can confirm the directory exists by adding ls $CI_PROJECT_DIR to your scripts section and checking the runner logs.
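For instance, a quick debugging sketch (the job name and the step that generates the reports are hypothetical):

build:
  stage: build
  script:
    - ls $CI_PROJECT_DIR                 # confirm the tools directory is where you expect
    - ./generate_terraform_reports.sh    # placeholder for whatever writes the html files
  artifacts:
    paths:
      - $TOOLS_PATH/terraform_results/html/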
For .html files only:
artifacts:
  paths:
    - $TOOLS_PATH/terraform_results/html/*.html

What is the best way to have a .env file at pipeline job level in Azure DevOps

Could you please suggest the best way to make a .env file available to my Azure DevOps pipeline? (Note: we are not pushing the .env file to the Git repository.)
My Node.js utility application code base is in Azure DevOps Git.
I have an Azure build pipeline (YAML version) which is a CI pipeline (doing just compile & test).
The unit tests make API calls which need an API secret token to run.
These tokens are stored in a .env file (I use the dotenv package for Node.js), but we are not pushing the .env file to Git.
So how should I make the .env file available to my CI pipeline?
You can use secure files on Azure DevOps.
1. First, upload the .env file to an Azure DevOps secure file:
Go to your Azure DevOps project portal: Pipelines --> Library --> Secure files --> +Secure file.
2. Then add a Download Secure File task to your YAML pipeline to download the .env file to the agent:

- task: DownloadSecureFile@1
  inputs:
    secureFile: '.env'
If the task is given the name mySecureFile, its path can be referenced in the pipeline as $(mySecureFile.secureFilePath). Alternatively, downloaded secure files can be found in the directory given by $(Agent.TempDirectory).
3. Then you can use the Copy Files task to copy the .env file to the right place.
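Putting the steps together, a minimal sketch (the target folder is an assumption; point it wherever your app expects the .env file):

steps:
  - task: DownloadSecureFile@1
    name: mySecureFile              # exposes $(mySecureFile.secureFilePath)
    inputs:
      secureFile: '.env'
  - task: CopyFiles@2
    inputs:
      SourceFolder: '$(Agent.TempDirectory)'
      Contents: '.env'
      TargetFolder: '$(Build.SourcesDirectory)'   # assumed: dotenv loads .env from the repo root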

GitLab: How to trigger a script when a file is changed?

I have a repository in GitLab, and what I wish to have is a setup in which, when a specific file in a specific branch is changed, a script/job is triggered that reads the new version of the changed file and performs operations based on it.
That script can be on another machine, accessed through SSH, or it can live inside the same repository and be executed somehow.
Is there any way to do this with GitLab CI/CD?
Edit: I'm using GitLab Enterprise Edition 11.2.3-ee aadca99
You can use only:changes / except:changes to do this.
It was introduced in GitLab 11.4 and works with files and directories within your repository. Example:
docker build:
  script: docker build -t my-image:$CI_COMMIT_REF_SLUG .
  only:
    changes:
      - Dockerfile
      - docker/scripts/*
      - dockerfiles/**/*
      - more_scripts/*.{rb,py,sh}
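Tying this back to the question, a hedged sketch of a job that runs only when one watched file changes and then kicks off a script on another machine over SSH. The file path, host, and remote script are hypothetical, and the runner would need SSH access (for example a key provided through a CI variable):

process_file:
  script:
    # React to the new version of the watched file; host and path are placeholders.
    - ssh deploy@other-machine '/opt/scripts/process.sh'
  only:
    refs:
      - master
    changes:
      - config/watched-file.yml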

GitLab CI Web Deployment

So we are currently moving away from our current deployment provider, Beanstalk, which is great, but we are on the top tier and we keep running out of space or hitting our repository limits. We are moving away regardless, so please do not suggest another SaaS provider.
I personally use GitLab for my own projects and a few company projects, and it's amazing; we use a self-hosted version on a local server in our company building.
We have CI set up and currently use the following deployment code (minified to just the development deployment). This uses the shell executor for deploying, as we deploy to an existing Linux server.
variables:
  HOSTNAME: '<hostname>'
  USERNAME: '<username>'
  PASSWORD: '<password>'
  PATH_DEV: '/path/to/www'

# Define the stages (we can add as many as we want)
stages:
  # - build
  - deploy

# The code for development deployment
deploy_dev:
  stage: deploy
  script:
    - echo "Deploying to development environment..."
    - rm .gitlab-ci.yml
    - rsync -urltvz --filter=':- .gitignore' --exclude=".git" -e "sshpass -p$PASSWORD ssh -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null" * $USERNAME@$HOSTNAME:$PATH_DEV
    - echo "Finished deploying."
  environment:
    name: Development
    url: http://dev.domain.com
  only:
    - envdev
The Problem:
When we use the above code to deploy, it's perfect and works really well, deploying all the code after optimisation etc., but we have found a little bug.
When you delete a file, the rsync command will not delete it from the server. I did some searching and found the --delete flag you can add, and it worked, but it deleted all the user-uploaded content as well. So I added the .gitignore into the filtering, so rsync would ignore the files listed there (which are usually user-generated content, configuration files, and/or libraries such as npm packages). This was fine until users started uploading files through the media manager in our framework, which stores them in a folder that is not in the .gitignore file and can't be, because it also contains files we ship ourselves so that they're editable by the user. So now I am unsure how to manage this.
What we are looking for is a CI setup that uploads only file changes to the server: it would search through the latest commits, find the files that have changed, and push only those files. Of course I would like to keep doing this with GitLab CI, so any ideas, examples, or tutorials would be amazing.
Thanks in advance.
~ Danny
Maybe this helps: https://github.com/banago/PHPloy
It looks like this tool was designed for PHP projects, but I think it can be used for other kinds of web deployment too.
How it works:
PHPloy stores a file called .revision on your server. This file contains the hash of the commit that you have deployed to that server. When you run phploy, it downloads that file and compares the commit reference in it with the commit you are trying to deploy to find out which files to upload. PHPloy also stores a .revision file for each submodule in your repository.
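If you would rather stay with rsync, a hedged alternative is to combine --delete with a protect filter, so files removed from the repository are cleaned up on the server while the user-upload directory is left untouched. A minimal sketch; uploads/ is a hypothetical path that should be replaced with your media manager's folder:

# 'P' (protect) shields matching files on the receiver from --delete;
# 'uploads/***' matches the directory itself and everything inside it.
rsync -urltvz --delete \
  --filter='P uploads/***' \
  --filter=':- .gitignore' \
  --exclude='.git' \
  -e "sshpass -p$PASSWORD ssh -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null" \
  ./ "$USERNAME@$HOSTNAME:$PATH_DEV"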

GitLab Pages deployment step fails after successful build

I am trying to host a reveal.js presentation via GitLab Pages. The repository can be found here: https://gitlab.com/JanGregor/demo-slides
My .gitlab-ci.yml is fairly simple:
image: node:4.2.2

pages:
  cache:
    paths:
      - node_modules/
  script:
    - npm install
    - node_modules/.bin/gulp
  artifacts:
    paths:
      - build
  only:
    - master
After a commit to master, though, something goes wrong. The pages job itself is executed and runs just fine. It even shows in the logs that my build directory has been scanned and that the artifacts have been found.
Oddly, the subsequent pages:deploy task fails. It only says:
pages failed to extract
Any help would be greatly appreciated, since I have no clue where to look next. The documentation itself isn't really helpful when trying to implement a deployment flow with npm.
Thanks in advance, folks!
Apparently a page can only be published from a folder under the artifacts that is called public.
From the GitLab Pages documentation:
To make use of GitLab Pages, the contents of .gitlab-ci.yml must follow the rules below:
A special job named pages must be defined
Any static content which will be served by GitLab Pages must be placed under a public/ directory
artifacts with a path to the public/ directory must be defined
Also mentioned (somewhat tangentially) in the "GitLab Pages from A to Z" guide:
... and GitLab Pages will only consider files in a directory called public.
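Applied to the .gitlab-ci.yml above, a minimal fix is to publish the gulp output under public/, here by renaming the build directory at the end of the script (assuming gulp writes all of its output to build/):

image: node:4.2.2

pages:
  cache:
    paths:
      - node_modules/
  script:
    - npm install
    - node_modules/.bin/gulp
    # GitLab Pages only serves files from a directory named "public"
    - mv build public
  artifacts:
    paths:
      - public
  only:
    - master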
