Artifact not available in script - gitlab

My pipeline has 4 stages:
build - Should only happen on merge requests
test - Should only happen on merge requests
report - Should only happen on merge into master
release - Should only happen on merge into master
BUILD: During the build phase I build my test container and upload it to the container registry.
TEST: During the test phase I run the tests within the container, copy out the coverage report from the container and artifact the entire report directory.
REPORT: During the reporting stage I want to copy the artifact from my test stage into a Gitlab page directory so we can view the report.
RELEASE: Runs terraform plan/apply and builds the production container.
Since my report and release stages are detached, I'm unable to upload the artifact that was created in a different stage. My workaround is to upload the current coverage report to /public/<commit-sha> and then move it to /public when it successfully merges into master. This might not be the best solution, but I have limited knowledge of gitlab's pipelines.
The issue I'm having is pretty weird.
pages:
  stage: report
  dependencies:
    - unittest
  script:
    - if [ "$CI_COMMIT_REF_NAME" == "master" ]; then mv public/$CI_COMMIT_SHA public/; else mv coverage/ public/$CI_COMMIT_SHA; fi
  artifacts:
    paths:
      - public
    expire_in: 30 days
This complains that mv: can't rename 'coverage/': No such file or directory
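One way to narrow this down is to print the ref and list what the dependencies actually delivered before the existing if/else runs (a diagnostic sketch only, not a fix):

script:
  - echo "ref is $CI_COMMIT_REF_NAME"
  - ls -la . coverage/ public/ || true   # show which artifact directories this job actually received
  - if [ "$CI_COMMIT_REF_NAME" == "master" ]; then mv public/$CI_COMMIT_SHA public/; else mv coverage/ public/$CI_COMMIT_SHA; fi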
However, this works perfectly fine:
pages:
  stage: report
  dependencies:
    - unittest
  script:
    - mv coverage/ public
  artifacts:
    paths:
      - public
    expire_in: 30 days
If there's an easier solution to pass artifacts between jobs, that would be great, but I'm not sure if I'm missing something really obvious in my script.
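For what it's worth, within a single pipeline the needs keyword is a more direct way to pull another job's artifacts (a minimal sketch, assuming the test job is named unittest as above; note it does not help across detached merge request pipelines and branch pipelines, which is the situation described here):

pages:
  stage: report
  needs:
    - job: unittest
      artifacts: true
  script:
    - mv coverage/ public
  artifacts:
    paths:
      - public
    expire_in: 30 days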

Related

Is there a way to exclude jobs from Gitlab stages?

I have a long gitlab CI in a mono repo kind of structure. It looks as follows:
Project A
--- .gitlab-ci.yaml (contains thousands of jobs across 3 stages (plan, test and apply) and also includes the gitlab-ci folder)
--- gitlab-ci/my-ci.yaml (a folder that contains specific CI files; this one has 2 stages, plan and apply, for my-new-code)
--- my-new-code (folder which I recently pushed)
--- other folders
I have added my-ci.yaml as a separate file in the gitlab-ci folder. But what happens is that those thousands of jobs from the .gitlab-ci.yaml run along with the relevant job when I push changes only to the my-new-code folder.
I understand that those jobs run in those stages because they don't have any rules/only/except statements, and it is impractical for me to add one to each of them.
Is there a way I can exclude these jobs from running in the 2 stages when changes are made to the my-new-code folder?
You can add a changes condition with folder or file names to a job.
As you can see below, the job will run only if changes are made in the src folder; you can use the same pattern for your jobs:
test_pylint:
  stage: test
  image: python:3.7
  allow_failure: true
  before_script:
    - pip install pylint pylint-junit
    - pip install -e .
    - python -V
  script:
    - pylint --output-format=pylint_junit.JUnitReporter src/ | tee rspec.xml
  artifacts:
    paths:
      - rspec.xml
    reports:
      junit: rspec.xml
    when: always
    expire_in: 1 week
  rules:
    - if: '($CI_PIPELINE_SOURCE == "merge_request_event")'
      when: always
      changes:
        - src/**/*
If you want to exclude jobs based on a folder, consider using except: changes, which skips the job when the listed files or folders change. The example below will not allow the job to run if any changes are made to md files:
build:
  script: npm run build
  except:
    changes:
      - "*.md"
PS: These examples are for merge request pipelines; you can frame the if condition the same way for commit-based (branch) pipelines.
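For example, a branch (push) pipeline variant of the same rule might look like this (a sketch only; the master branch name is assumed, adjust to your setup):

test_pylint:
  stage: test
  script:
    - pylint src/
  rules:
    - if: '$CI_PIPELINE_SOURCE == "push" && $CI_COMMIT_BRANCH == "master"'
      changes:
        - src/**/*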

Gitlab ci issue with passing artifacts to Downstream pipeline with trigger and needs keywords

I am working on a multi-pipeline project and am using the trigger keyword to trigger a downstream pipeline, but I'm not able to pass artifacts created in the upstream project. I am using needs to get the artifact, like so:
Downstream Pipeline block to get artifacts:
needs:
  - project: workspace/build
    job: build
    ref: master
    artifacts: true
Upstream Pipeline block to trigger:
build:
  stage: build
  artifacts:
    paths:
      - ./policies
    expire_in: 2h
  only:
    - master
  script:
    - echo 'Test'
  allow_failure: false

triggerUpstream:
  stage: deploy
  only:
    - master
  trigger:
    project: workspace/deploy
But I am getting the following error:
This job depends on other jobs with expired/erased artifacts:
I'm not sure what's wrong.
Looks like there is a problem sharing artifacts between pipelines as well as between projects. It is a known bug and has been reported here:
https://gitlab.com/gitlab-org/gitlab/-/issues/228586
You can find a workaround there, but since it requires adding an access token to the project, it is not the best solution.
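The workaround in that issue amounts to downloading the artifact through the GitLab jobs API with a token instead of relying on needs; a minimal sketch (the PRIVATE_API_TOKEN variable and the project ID 1234 are placeholders you would need to supply):

fetch-artifacts:
  stage: build
  script:
    # download the latest master artifacts of the upstream 'build' job via the API
    - 'curl --location --header "PRIVATE-TOKEN: $PRIVATE_API_TOKEN" "https://gitlab.com/api/v4/projects/1234/jobs/artifacts/master/download?job=build" --output artifacts.zip'
    - unzip artifacts.zip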
Your upstream pipeline job build is set to store its artifacts for only 2 hours (from the expire_in: 2h line). Your downstream pipeline must have run at least 2 hours after the artifacts were created, so the artifact had expired and been erased, generating that error.
To solve it, you can either update the expire_in field to however long you need the artifacts to stay around (for example, if you know the downstream pipeline will run up to 5 days later, set it to 5d), or rerun the build job to recreate them.
You can read more about the expire_in keyword and artifacts in general in the docs.
It isn't a problem with expired artifacts; the error is incorrect. In my case I am able to download the artifacts as a zip directly from the UI on the executed job. My expire_in is set to 1 week, yet I am still getting this message.

Gitlab CI: build for CI and Merge request but publish only CI to pages

I have a .gitlab-ci.yml file which I want to use to run a script for merge request validation. The same script should be used in CI, but only there should the result be published to gitlab pages. Also, only for the CI run should the result be cached.
This is a simplified version of the current .gitlab-ci.yml:
pages:
  stage: deploy
  script:
    - mkdir public/
    - touch public/file.txt
  artifacts:
    paths:
      - public
  only:
    - master
  cache:
    paths:
      - fdroid
(The real-world code is in the fdroid-firefox gitlab repo.)
There are 2 ways the pipeline gets triggered. Depending on which, I do or do not want to publish to pages:
by merge request validation. In this case, I want to execute the script part, but I don't want to publish or cache the result (otherwise, anyone with permission to create a merge request could overwrite the gitlab pages content).
by CI (which is triggered both after check-in to the master branch and on a schedule). In this case, I want the result to be cached and the gitlab pages to be updated.
I already tried splitting up the stages:
stages:
  - build
  - deploy

build_repo:
  stage: build
  script:
    - mkdir public/
    - touch public/file.txt

pages:
  stage: deploy
  script: echo "publish to Gitlab pages"
  artifacts:
    paths:
      - public
  only:
    - master
  cache:
    paths:
      - fdroid
(Original .gitlab-ci.yml file)
But by doing this, the pages job in the deploy stage failed because it does not have access to the result of the build stage. The pages job shows an error symbol, and the tooltip says missing pages artifacts. (real world log)
The log says:
Uploading artifacts for successful job
00:01
Uploading artifacts...
WARNING: public: no matching files
ERROR: No files to upload
What am I doing wrong that I don't have access to the result of the build stage?
How can I run the script section in both cases but still deploy to pages only from master branch?
You don't save the public path as an artifact in your build job, and that's why it is missing in the pages job at the next deploy stage.
You have this:
build_repo:
  stage: build
  script:
    - your script
Try to save artifacts in your build job like this:
build_repo:
  stage: build
  script:
    - your script
  artifacts:
    when: always
    paths:
      - public
That way they will be passed on to the next stage, deploy, and the pages job can see them.
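Putting that together with the only: master requirement from the question, the whole file might look roughly like this (a sketch: build_repo runs for merge requests as well, while pages keeps publishing and caching restricted to master):

stages:
  - build
  - deploy

build_repo:
  stage: build
  script:
    - mkdir public/
    - touch public/file.txt
  artifacts:
    paths:
      - public

pages:
  stage: deploy
  script: echo "publish to Gitlab pages"
  artifacts:
    paths:
      - public
  only:
    - master
  cache:
    paths:
      - fdroid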

GitLab CI - Run pipeline when the contents of a file changes

I have a mono-repo with several projects (not my design choice).
Each project has a .gitlab-ci.yml set up to run a pipeline when a "version" file is changed. This is nice because a user can check in to stage or master (for a hot-fix) and a build is created and deployed to a test environment.
The problem is when a user does a merge from master to stage and commits back to stage (to pull in any hot-fixes). This causes ALL the pipelines to run; even for projects that do not have actual content changes.
How do I allow the pipeline to run from master and/or stage but ONLY when the contents of the "version" file change? Like when a user changes the version number.
Here is an example of the .gitlab-ci.yml (I have 5 of these, 1 for each project in the mono-repo)
#
# BUILD-AND-TEST - initial build
#
my-project-build-and-test:
  stage: build-and-test
  script:
    - cd $MY_PROJECT_DIR
    - dotnet restore
    - dotnet build
  only:
    changes:
      - "MyProject/.gitlab-ci.VERSION.yml"
  # no needs: here because this is the first step

#
# PUBLISH
#
my-project-publish:
  stage: publish
  script:
    - cd $MY_PROJECT_DIR
    - dotnet publish --output $MY_PROJECT_OUTPUT_PATH --configuration Release
  only:
    changes:
      - "MyProject/.gitlab-ci.VERSION.yml"
  needs:
    - my-project-build-and-test

... and so on ...
I am still new to git, GitLab, and CI/pipelines. Any help would be appreciated! (I have little say in changing the mono-repo)
The following .gitlab-ci.yml will run the test_job only if the file version changes.
test_job:
  script: echo hello world
  rules:
    - changes:
        - version
See https://docs.gitlab.com/ee/ci/yaml/#ruleschanges
See also
Run jobs only/except for modifications on a path or file
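Since the question also wants this limited to the master and stage branches, rules:changes can be combined with an if clause (a sketch; branch names and the version file path are taken from the question):

my-project-build-and-test:
  stage: build-and-test
  script:
    - dotnet build
  rules:
    - if: '$CI_COMMIT_BRANCH == "master" || $CI_COMMIT_BRANCH == "stage"'
      changes:
        - "MyProject/.gitlab-ci.VERSION.yml"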

Deploying a certain build with gitlab

My CI has two main steps: build and deploy. The result of build is that an artifact is uploaded to maven nexus. Currently the manual deploy step just takes the latest artifact from nexus and deploys it.
stages:
  - build
  - deploy

full:
  stage: build
  image: ubuntu
  script:
    - // Build and upload to nexus here

deploy:
  stage: deploy
  script:
    - // Take latest artifact from nexus and deploy
  when: manual
But to me it doesn't make much sense to always deploy the latest build from every pipeline. I think ideally the deploy step of each pipeline should deploy the artifact that was built by that same pipeline's build task. Otherwise the deploy step of every pipeline will do exactly the same thing regardless of when it is started.
So I have two questions.
1) How can I make my deploy step deploy the version that was built by the same run?
2) If I still want to keep the "deploy latest" functionality, does gitlab support adding a task separate from each pipeline? As I explained, this step doesn't make a lot of sense inside the pipeline; I imagine it living in a separate, dedicated place.
Not too familiar with maven and nexus, but assuming you can name the artifact before you push it, you can embed one of the built-in environment variables that indicates which pipeline it's from.
ie:
...
Build:
  stage: build
  script:
    - ./buildAsNormal.sh > build$CI_PIPELINE_ID.extension
    - ./pushAsNormal.sh

Deploy:
  stage: deploy
  script:
    - ./deployAsNormal #(but specify the build$CI_PIPELINE_ID.extension file)
There are a lot of CI env variables you can use that are extremely useful. The full list of them is here. The difference with $CI_PIPELINE_ID and $CI_JOB_ID is that the pipeline id is constant for all jobs in the pipeline, no matter when they execute. That means the pipeline id will be the same even if you run a manual step a week after the automated steps. The job id is specific to each job.
Regarding your comment, using artifacts: can solve your problem.
You can put the version number in a file and read the file in the next stage:
stages:
  - build
  - deploy

full:
  stage: build
  image: ubuntu
  script:
    - echo "1.0.0" > version
    - // Build and upload to nexus here
  artifacts:
    paths:
      - version
    expire_in: 1 week

deploy:
  stage: deploy
  script:
    - VERSION=$(cat version)
    - // Take the artifact from nexus using VERSION variable and deploy
  when: manual
An alternative is to build, push to nexus, and use artifacts: to pass the result of the build to the deploy job:
stages:
  - build
  - deploy

full:
  stage: build
  image: ubuntu
  script:
    - // Build and put the result in the out/ directory
    - // Upload the result from out/ to nexus
  artifacts:
    paths:
      - out/
    expire_in: 1 week

deploy:
  stage: deploy
  script:
    - // Take the artifact from the out/ directory and deploy it
  when: manual
