How to use the "cache" in GitLab?

I have created the following test pipeline in GitLab:
job1:
  stage: stage1
  rules:
    - if: $RUN != "run2"
      when: always
    - when: never
  script:
    - echo "Job stage1 updating file dates.txt"
    - date >> data/dates.txt
  cache:
    key: statusfiles
    paths:
      - data/dates.txt

job2:
  stage: stage2
  rules:
    - if: $RUN == "run2"
      when: always
  script:
    - echo "Running stage 2"
    - cat data/dates.txt
  cache:
    key: statusfiles
    paths:
      - data/dates.txt
where I want to use the "cache" feature of GitLab.
Here I first run job1, which updates the file dates.txt by appending an entry, so the file then contains two lines.
However, when I run a new pipeline with job2 alone, the file contains only ONE line. It seems to be the original, unmodified file.
How can I "upload" or "save" the file into the cache in job1, so that a later run of job2 can use the updated file?

Test first whether the documentation section "Share caches between jobs in the same branch" is relevant.
To have jobs in each branch use the same cache, define a cache with the key $CI_COMMIT_REF_SLUG:
cache:
  key: $CI_COMMIT_REF_SLUG
This configuration prevents you from accidentally overwriting the cache.
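For illustration, a minimal sketch of a job using that per-branch key (the job name and cached path are hypothetical):
cache-per-branch-demo:
  script:
    - mkdir -p data/
    - date >> data/dates.txt   # appends one line per run
  cache:
    key: $CI_COMMIT_REF_SLUG   # one cache per branch
    paths:
      - data/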
In your case, from the discussion, and from "How archiving and extracting works":
stages:
  - build
  - test

before_script:
  - echo "Hello"

job A:
  stage: build
  rules:
    - if: $RUN != "run3"
      when: always
    - when: never
  script:
    - mkdir -p data/
    - date > data/hello.txt
  cache:
    key: build-cache
    paths:
      - data/
  after_script:
    - echo "World"

job B:
  stage: test
  rules:
    - if: $RUN == "run3"
  script:
    - cat data/hello.txt
  cache:
    key: build-cache
    paths:
      - data/
By using a single runner on a single machine, you avoid the issue where job B might execute on a different runner than job A.
This setup lets the cache be reused between stages, but only if execution goes from the build stage to the test stage on the same runner/machine. Otherwise, the cache might not be available.
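To make the direction of the cache traffic explicit, you can additionally set cache:policy; this is a sketch on top of the jobs above (push uploads the cache without downloading it first, pull downloads without re-uploading):
job A:
  stage: build
  script:
    - mkdir -p data/
    - date > data/hello.txt
  cache:
    key: build-cache
    paths:
      - data/
    policy: push   # only archive and upload the cache when the job ends

job B:
  stage: test
  script:
    - cat data/hello.txt
  cache:
    key: build-cache
    paths:
      - data/
    policy: pull   # only download and extract the cache when the job starts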

Related

Gitlab CI - Cannot get local includes to work

Gitlab version: 14.3.3
I cannot get all jobs to be created in the pipeline; the running-pipelines view loads indefinitely.
I'm expecting the following jobs:
On a MR:
build:merge-request-pipeline-dynamic_script
build:shared-pipeline-dump-variables
test:shared-pipeline-test-job-artifacting-always
test:shared-pipeline-test-job-artifacting-always-multi
On Main
build:main-pipeline-dynamic_script
build:shared-pipeline-dump-variables
test:shared-pipeline-test-job-artifacting-always
test:shared-pipeline-test-job-artifacting-always-multi
# .gitlab-ci.yml
image: ruby:2.3

# Global rules on all pipelines
workflow:
  rules:
    # Mask and prevent builds on noci commit SHA's
    - if: $CI_COMMIT_TITLE =~ /^noci/
      when: never

# Specific pipelines with specific rules
include:
  - local: main-pipeline.yml
    rules:
      # Any commits to `main` i.e. we have merged something in
      - if: $CI_MERGE_REQUEST_TARGET_BRANCH_NAME == $CI_DEFAULT_BRANCH
  - local: merge-request-pipeline.yml
    rules:
      # Merge requests going in will build
      - if: $CI_PIPELINE_SOURCE == "merge_request_event"
  # Catch all pipeline that will just build everything that we want all the time
  # NB: This will obviously only be included if we pass the global workflow check
  - local: shared-pipeline.yml

# main-pipeline.yml
variables:
  BUILD_TYPE: commit to main

main-pipeline-dynamic_script:
  stage: build
  script:
    - echo "Were running directly in main!"
    - echo "USER VARIABLE LIST (SHOULD BE PRESENT) - BUILD_TYPE - $BUILD_TYPE"

# merge-request-pipeline.yml
variables:
  BUILD_TYPE: merge-request

merge-request-pipeline-dynamic_script:
  stage: build
  script:
    - echo "Were running on a merge request!"
    - echo "USER VARIABLE LIST (SHOULD BE PRESENT) - BUILD_TYPE - $BUILD_TYPE"

# shared-pipeline.yml
variables:
  NO_DEFAULT:
    description: This has no default. It should interpolate as blank.
  WITH_DEFAULT:
    description: This has a default. It should interpolate as default.
    value: default

shared-pipeline-dump-variables:
  stage: build
  script:
    # ... removed, not important

shared-pipeline-test-job-artifacting-always:
  stage: test
  script:
    - # redundant

shared-pipeline-test-job-artifacting-always-multi:
  stage: test
  script:
    - echo "This job will artifact logs"
    - echo "log1" > log1.log
    - echo "log2" > log2.log
    - # removed redundant
  artifacts:
    paths:
      - log1.log
      - log2.log
      - log3.log
      - log/log4.log
      - log/log5.log
      - log/log6doesnotexist.log

GitLab CI: issue with using manual and rules together

I am trying to set up CI in GitLab so that:
the second job (pushdev) is available to run manually only after devjob has run successfully.
the third job (pushtostage) only runs if the file has changed.
The way the jobs are set up, the second and third jobs always run. What is missing in the pipeline spec?
devjob:
  image: node:16
  stage: publishdev
  script:
    - echo "running validation checks"
    - npm run validate
  rules:
    - changes:
        - ./src/myfile.txt
    - when: manual

# - this job needs to run after "devjob" has run successfully
#   and myfile.txt has changed
# - "needs" artifacts from the "lint" job
pushdev:
  image: node:16
  stage: publishdev
  needs: ["devjob", "lint"]
  script:
    - echo "Pushing changes after validation to dev"
    - npm run pushdev
  rules:
    - changes:
        - ./src/myfile.txt
      when: on_success
    - when: manual

pushtostage:
  image: node:16
  stage: pushstage
  script:
    - echo "Pushing changes to stage"
  rules:
    - changes:
        - ./src/myfile.txt
    - when: manual
I changed your sample to look like this:
stages:
  - publishdev
  - pushstage

default:
  image: ubuntu:20.04

lint:
  stage: publishdev
  script:
    - echo "lint job"

devjob:
  stage: publishdev
  script:
    - echo "running validation checks"
  rules:
    - changes:
        - README.md
      when: manual
      allow_failure: false

pushdev:
  stage: publishdev
  needs: ["devjob", "lint"]
  script:
    - echo "Pushing changes after validation to dev"
  rules:
    - changes:
        - README.md
      when: manual
      allow_failure: false

pushtostage:
  stage: pushstage
  script:
    - echo "Pushing changes to stage"
  rules:
    - changes:
        - README.md
      when: manual
      allow_failure: false
I added allow_failure: false, because allow_failure defaults to true for manual jobs.
I also merged your rules, because in GitLab each list item under rules is one rule:
"Rules are evaluated when the pipeline is created, and evaluated in order until the first match."
In your .gitlab-ci.yml, the first job devjob is manual, so it always counts as a success; and the first rule of your second job pushdev (changes with when: on_success) always matches, so it always runs.
In my version, the rules of devjob are merged: when the file changes, the job is manual and not allowed to fail, and so on for the other jobs.
The sample code is in Files · try-rules-stackoverflow-72594854-manual · GitLab PlayGround / Workshop / Tryci · GitLab.
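The key difference between the two rule layouts, side by side (paths taken from the question):
# two separate rules: the first matches whenever the file changed and uses
# when: on_success, so the job starts automatically
rules:
  - changes:
      - ./src/myfile.txt
    when: on_success
  - when: manual

# one merged rule: the job only appears when the file changed, must then be
# started manually, and blocks the pipeline until it succeeds
rules:
  - changes:
      - ./src/myfile.txt
    when: manual
    allow_failure: false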

How to exclude gitlab-ci.yml changes from triggering a job

I am unable to find a solution for how to ignore changes made to .gitlab-ci.yml so that they do not trigger a job. So far I have tried the options below:
except:
  changes:
    - .gitlab-ci.yml
and
only:
  - Branch A
but every time I make changes to the .gitlab-ci.yml file, jobs for Stage B get added to the pipeline and show as skipped.
Below are the jobs defined in .gitlab-ci.yml. Do you have any suggestions?
I do not want Stage B jobs added to the pipeline when:
i) a push touches .gitlab-ci.yml (either by editing the file manually or via git push)
ii) any merge request modifies .gitlab-ci.yml
stages:
  - A
  - B

Stage A:
  stage: A
  script:
    - echo "TEST"
  rules:
    - if: '$CI_COMMIT_TAG =~ /^\d+\.\d+\.DEV\d+/'
  tags:
    - runner

Stage B:
  stage: B
  script:
    - echo "TEST"
  when: manual
  tags:
    - runner
With this setup, Stage B is not added if .gitlab-ci.yml is modified:
stages:
  - A
  - B

Stage A:
  stage: A
  script:
    - echo "Stage A"
  tags:
    - runner

Stage B:
  stage: B
  script:
    - echo "Stage B"
  rules:
    - changes:
        - ".gitlab-ci.yml"
      when: never
    - when: manual
  tags:
    - runner
Otherwise, Stage B shows up in the pipeline and can be run manually. Tested with GitLab CE 14.1.0.
Could you try using a workflow rule? It determines whether the pipeline is created at all.
P.S.: Someone complained a couple of years ago about not being able to start manual jobs after such an exception, but it looks like that was a bug. I can't find the issue mentioned in the post.
Edit:
This configuration skips any commit that changes README.md or .gitlab-ci.yml:
workflow:
  rules:
    - changes:
        - README.md
        - .gitlab-ci.yml
      when: never
    - when: always  # catch-all: with workflow:rules, a pipeline is only created when a rule matches

Run a specific job in GitLab CI based on a condition

I have a repo QA/tests, and I want to run all of its jobs when there is a push to this repo.
I used a script to generate the jobs dynamically:
job-generator:
  stage: generate
  tags:
    - kuber
  script:
    - scripts/generate-job.sh > generated-job.yml
  artifacts:
    paths:
      - generated-job.yml

main:
  trigger:
    include:
      - artifact: generated-job.yml
        job: job-generator
    strategy: depend
As the next step, I have another repo products/first, and I want to run one specific job in QA/tests on every push to products/first, so I tried:
stages:
  - test

tests:
  stage: test
  variables:
    TARGET: first
  trigger:
    project: QA/tests
    branch: master
    strategy: depend
Then I tried to define a global TARGET: all variable in my main gitlab-ci.yml and override it with the TARGET: first in the above YAML.
generate-job.sh:
#!/bin/bash
PRODUCTS=("first" "second" "third")

for P in "${PRODUCTS[@]}"; do
  cat << EOF
$P:
  stage: test
  tags:
    - kuber
  script:
    - echo -e "Hello from $P"
  rules:
    - if: '"$TARGET" == "all"'
      when: always
    - if: '"$TARGET" == $P'
      when: always
EOF
done
But no results: the downstream pipeline doesn't have any jobs at all!
Any idea?
I am not sure if this is helpful, but from the outside this looks like an over-complicated approach. I have to say I have limited knowledge, and my answer is based on these assumptions:
the QA/tests repository contains certain test cases for all repositories
QA/tests has the sole purpose of containing the tests, not an overview of the projects etc.
My suggestion
As QA/tests only contains tests which should be executed against each project, I would create a Docker image out of it which contains all the tests and can actually execute them (let's call it qa-tests:latest).
Within my projects I would add a job which uses this image, together with the project's source code, and executes the tests:
qa-test:
  image: qa-tests:latest
  script:
    - echo "command to execute scripts"
  # add rules here accordingly
This would solve the issue for each push into the repositories. For easier usage, I could create a QA-Tests.gitlab-ci.yml file which the sub-projects can include with:
include:
  - project: QA/tests
    file: QA-Tests.gitlab-ci.yml
This way you do not need to update the repositories if the CI snippet changes.
Finally, to trigger the execution on each push, you only need to trigger the pipelines of the sub-projects from QA/tests, as sketched below.
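A minimal sketch of such a multi-project trigger job in QA/tests (the job name is hypothetical; the project path is taken from the question):
trigger-first:
  trigger:
    project: products/first   # downstream project whose pipeline is started
    branch: master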
Disclaimer
As I said, I have only a limited view, as the goal is described but not the motivation. With this approach you remove some of the trigger calls, mainly the ones going from the sub-projects to QA/tests. It produces a clear structure, but it might not fit your needs.
I solved it with:
gitlab-ci.yml:
variables:
  TARGET: all

job-generator:
  stage: generate
  tags:
    - kuber
  script:
    - scripts/generate-job.sh > generated-job.yml
  artifacts:
    paths:
      - generated-job.yml

main:
  variables:
    CHILD_TARGET: $TARGET
  trigger:
    include:
      - artifact: generated-job.yml
        job: job-generator
    strategy: depend
and used CHILD_TARGET in my generate-job.sh:
#!/bin/bash
PRODUCTS=("first" "second" "third")

for P in "${PRODUCTS[@]}"; do
  cat << EOF
$P:
  stage: test
  tags:
    - kuber
  script:
    - echo -e "Hello from $P"
  rules:
    - if: '\$CHILD_TARGET == "all"'
      when: always
    - if: '\$CHILD_TARGET == "$P"'
      when: always
EOF
done
So I could call it from other projects like this:
stages:
  - test

e2e-tests:
  stage: test
  variables:
    TARGET: first
  trigger:
    project: QA/tests
    branch: master
    strategy: depend

Gitlab CI/CD: use multiple when conditions

I have a GitLab CI/CD configuration file like this:
image: docker:git

stages:
  - develop
  - production

default:
  before_script:
    - apk update && apk upgrade && apk add git curl

deploy:
  stage: develop
  script:
    - echo "Hello World"

backup:
  stage: develop
  when:
    - manual
    - on_success

remove:
  stage: develop
  when:
    - delayed
    - on_success
  start_in: 30 minutes
In my case, job deploy runs automatically, and job backup must run manually, only after job deploy has completed successfully. But this configuration doesn't work, and I get an error with the message:
Found errors in your .gitlab-ci.yml:
  jobs:backup when should be one of:
    on_success
    on_failure
    always
    manual
    delayed
How can I use multiple when option arguments in my case?
Basically you can't, because when does not accept an array. You can work around it with needs, though this only works if you run your jobs in different stages.
image: docker:git

stages:
  - deploy
  - backup
  - remove

deploy:develop:
  stage: deploy
  script:
    - exit 1

backup:develop:
  stage: backup
  script:
    - echo "backup"
  when: manual
  needs: ["deploy:develop"]

remove:develop:
  stage: remove
  script:
    - echo "remove"
  when: delayed
  needs: ["backup:develop"]
  start_in: 30 minutes
