I would like to run a particular job if a particular file has been changed.
Consider the following .gitlab-ci.yml:
stages:
  - test

always:
  stage: test
  image: alpine
  script:
    - env | sort

change:
  stage: test
  image: alpine
  rules:
    - changes:
        paths:
          - dir/file
  script:
    - echo "dir/file"
If I modify and commit to the default (main) branch:
- dir/file: it triggers both the always and the change job
- any other file: it triggers only the always job

as expected.
Instead, if I modify and commit to any other branch (i.e. create a merge request), it runs both jobs even if I do not modify the dir/file file.
To summarise:
[x] modify and commit dir/file: the always and change jobs run
[x] modify and commit a file other than dir/file: only the always job runs
[ ] using the WebUI, modify a file other than dir/file and commit to a new branch: the always and change jobs run
Am I missing something?
Thanks
Update 1
stages:
  - test

always:
  stage: test
  image: alpine
  script:
    - env | sort

change:
  stage: test
  image: alpine
  rules:
    - if: $CI_PIPELINE_SOURCE != "merge_request_event"
      changes:
        paths:
          - dir/file
  script:
    - echo "dir/file"
Same as above:
[x] modify and commit dir/file: the always and change jobs run
[x] modify and commit a file other than dir/file: only the always job runs
[ ] using the WebUI, modify a file other than dir/file and commit to a new branch: the always and change jobs run
First, check the indentation. It should be:

change:
  stage: test
  image: alpine
  rules:
    - changes:
        paths:
          - dir/file
        ^^ # this is important
"If I modify and commit to any other branch (ie create a merge request)"
Second, you can add additional criteria:
change:
  stage: test
  image: alpine
  rules:
    - if: $CI_PIPELINE_SOURCE != "merge_request_event"
      changes:
        paths:
          - dir/file
      ^^ # this is important
That would exclude the case where a file is changed as part of a merge request. (On a newly pushed branch there is no previous commit to compare against, so changes evaluates to true; that is why the change job also ran when you did not touch dir/file.)
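If instead you want the job to run in merge request pipelines when the file changes, the same pattern works with the condition inverted; in merge request pipelines, changes compares against the target branch, so it behaves predictably on new branches. A sketch (untested, reusing the question's job):

```yaml
change:
  stage: test
  image: alpine
  rules:
    - if: $CI_PIPELINE_SOURCE == "merge_request_event"
      changes:
        paths:
          - dir/file
  script:
    - echo "dir/file"
```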
Related
I am trying to set up CI in GitLab so that:
- the second job (pushdev) will be available for running manually, but only after the devjob has run successfully.
- the third job pushtostage will only run if the file has changed.
The way the jobs are set up, the second and third jobs always run. What is missing in the pipeline spec?
devjob:
  image: node:16
  stage: publishdev
  script:
    - echo "running validation checks"
    - npm run validate
  rules:
    - changes:
        - ./src/myfile.txt
    - when: manual

# - this job needs to run after "devjob" has run successfully
#   and myfile.txt has changed
# - "needs" artifacts from the "lint" job
pushdev:
  image: node:16
  stage: publishdev
  needs: [ "devjob", "lint" ]
  script:
    - echo "Pushing changes after validation to dev"
    - npm run pushdev
  rules:
    - changes:
        - ./src/myfile.txt
      when: on_success
    - when: manual

pushtostage:
  image: node:16
  stage: pushstage
  script:
    - echo "Pushing changes to stage"
  rules:
    - changes:
        - ./src/myfile.txt
    - when: manual
I changed your sample to look like this:
stages:
  - publishdev
  - pushstage

default:
  image: ubuntu:20.04

lint:
  stage: publishdev
  script:
    - echo "lint job"

devjob:
  stage: publishdev
  script:
    - echo "running validation checks"
  rules:
    - changes:
        - README.md
      when: manual
      allow_failure: false

pushdev:
  stage: publishdev
  needs: [ "devjob", "lint" ]
  script:
    - echo "Pushing changes after validation to dev"
  rules:
    - changes:
        - README.md
      when: manual
      allow_failure: false

pushtostage:
  stage: pushstage
  script:
    - echo "Pushing changes to stage"
  rules:
    - changes:
        - README.md
      when: manual
      allow_failure: false
I added allow_failure: false because for manual jobs allow_failure defaults to true.
I also merged your rules, because in GitLab each - entry under rules is one rule:
"Rules are evaluated when the pipeline is created, and evaluated in order until the first match."
In your .gitlab-ci.yml, the first job devjob is manual, so it always counts as a success; and in your second job pushdev, the first rule (changes with when: on_success) always matches, so it always runs.
In my version, the first job devjob merges your rules: it becomes a manual, allow_failure: false job that is only added when the file changes - and so on for the other jobs.
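The effect of merging the rule entries can be seen side by side (a sketch using the README.md path from my example):

```yaml
# Two separate rule entries: the second entry (`- when: manual`)
# matches every pipeline, so the job is always added.
rules:
  - changes:
      - README.md
  - when: manual

# One merged rule entry: the job is added (as a manual job)
# only when README.md has changed.
rules:
  - changes:
      - README.md
    when: manual
    allow_failure: false
```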
The sample code is at Files · try-rules-stackoverflow-72594854-manual · GitLab PlayGround / Workshop / Tryci · GitLab.
I am trying to run the below script as gitlab-ci.yml, but somehow I am not able to trigger the CI/CD pipeline with a merge request.
Steps I am following:
1. Create a branch from feature/new and name it feature/abc.
2. Make test changes to feature/abc.
3. Create a merge request with target branch feature/new.
stages: # List of stages for jobs, and their order of execution
  - deployment
  - testing

workflow:
  rules:
    - if: '$CI_PIPELINE_SOURCE == "merge_request_event" && $CI_MERGE_REQUEST_TARGET_BRANCH_NAME == "feature/new"'

deployment-dev: # This job runs in the deploy stage.
  stage: deployment
  image: google/cloud-sdk
  services:
    - docker:dind
  script:
    - echo hello
  environment:
    name: main/$CI_COMMIT_REF_NAME
  when: on_success

testing-dev: # This job also runs in the test stage.
  stage: testing
  image: google/cloud-sdk
  services:
    - docker:dind
  script:
    - echo testing in progress
I want my pipelines to run:
- the backend_tests job for any push with changes in the backend/ folder,
- the frontend_tests job for any push with changes in the frontend/ folder,
- the old_code_tests job for a push with changes in the backend/ folder, or when an MR is created and there are changes in the backend folder on the source branch.
My problem is that with the following .gitlab-ci.yml, the old_code_tests job will run for any push to a branch with an open MR if there are already changes in the backend folder - even if the push did not introduce such changes. I have no idea how to avoid this. The job is created correctly when the MR is being created.
In other words: if there already is an MR, the branch contains backend changes, and I push frontend-only changes, the old_code_test job should NOT run - but it unexpectedly runs with the given configuration.
I do not want to run the old_code_tests job if there is no MR for the given branch - and that part works.
stages:
  - frontend_tests
  - backend_tests
  - old_code

old_code_test:
  extends: .test_old_code_template
  stage: old_code
  needs: []
  script:
    - echo "Test old code"
  rules:
    - if: '$CI_PIPELINE_SOURCE == "merge_request_event"'
    - if: '$CI_PIPELINE_SOURCE == "push"'
      when: never
    - changes:
        - backend/**/*

backend_tests:
  stage: backend_tests
  needs: []
  extends: .backend_template
  script:
    - echo "Test backend"
  rules:
    - if: '$CI_PIPELINE_SOURCE == "push" && $CI_OPEN_MERGE_REQUESTS'
      when: never
    - changes:
        - backend/**/*

frontend_tests:
  stage: frontend_tests
  needs: []
  extends: .frontend_template
  script:
    - echo "Frontend test"
  rules:
    - if: '$CI_PIPELINE_SOURCE == "push" && $CI_OPEN_MERGE_REQUESTS'
      when: never
    - changes:
        - frontend/**/*
Try using the only keyword in your old_code stage:

only:
  changes:
    - backend/**/*
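Note that only/except is deprecated in favour of rules. A rules-based sketch of the same idea, restricted to merge request pipelines (where changes compares against the target branch rather than the previous push), might look like this - an untested assumption based on the question's paths:

```yaml
old_code_test:
  stage: old_code
  script:
    - echo "Test old code"
  rules:
    - if: '$CI_PIPELINE_SOURCE == "merge_request_event"'
      changes:
        - backend/**/*
```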
I have a repo QA/tests in which I want to run all of the jobs whenever there is a push to this repo.
I used a script to generate the jobs dynamically:
job-generator:
  stage: generate
  tags:
    - kuber
  script:
    - scripts/generate-job.sh > generated-job.yml
  artifacts:
    paths:
      - generated-job.yml

main:
  trigger:
    include:
      - artifact: generated-job.yml
        job: job-generator
    strategy: depend
At the next step, I have another repo products/first, from which I want to run a specific job in QA/tests on every push to products/first, so I tried:
stages:
  - test

tests:
  stage: test
  variables:
    TARGET: first
  trigger:
    project: QA/tests
    branch: master
    strategy: depend
Then I tried to define a global TARGET: all variable in my main gitlab-ci.yml and override it with the TARGET: first in the above YAML.
generate-job.sh:

#!/bin/bash
PRODUCTS=("first" "second" "third")
for P in "${PRODUCTS[@]}"; do
cat << EOF
$P:
  stage: test
  tags:
    - kuber
  script:
    - echo -e "Hello from $P"
  rules:
    - if: '"$TARGET" == "all"'
      when: always
    - if: '"$TARGET" == $P'
      when: always
EOF
done
But no results: the downstream pipeline doesn't have any jobs at all!
Any idea?
I am not sure if this is helpful, but from the outside this looks like an over-complicated approach. I have to say I have limited knowledge, and my answer is based on these assumptions:
- the QA/tests repository contains certain test cases for all repositories
- QA/tests has the sole purpose of containing the tests, not an overview of the projects etc.
My Suggestion
As QA/tests only contains tests which should be executed against each project, I would create a Docker image out of it which contains all the tests and can actually execute them (let's call it qa-tests:latest).
Within my projects I would add a step which uses this image, together with the source code of the project, and executes the tests:
qa-test:
  image: qa-tests:latest
  script:
    - echo "command to execute scripts"
  # add rules here accordingly
This would solve the issue for each push into the repositories. For easier usage, I could create a QA-Tests.gitlab-ci.yml file which can be included by the sub-projects with
include:
  - project: QA/tests
    file: QA-Tests.gitlab-ci.yml
This way you do not need to make updates within the repositories when the CI snippet changes.
Finally, to trigger the execution on each push, you only need to trigger the pipelines of the sub-projects from QA/tests.
Disclaimer
As I said, I have only a limited view, as the goal is described but not the motivation. With this approach you remove some of the direct calls - mainly the ones triggering from the sub-projects to QA/tests. And it creates a clear structure, but it might not fit your needs.
I solved it with:
gitlab-ci.yml:
variables:
  TARGET: all

job-generator:
  stage: generate
  tags:
    - kuber
  script:
    - scripts/generate-job.sh > generated-job.yml
  artifacts:
    paths:
      - generated-job.yml

main:
  variables:
    CHILD_TARGET: $TARGET
  trigger:
    include:
      - artifact: generated-job.yml
        job: job-generator
    strategy: depend
and used CHILD_TARGET in my generate-job.sh:
#!/bin/bash
PRODUCTS=("first" "second" "third")
for P in "${PRODUCTS[@]}"; do
cat << EOF
$P:
  stage: test
  tags:
    - kuber
  script:
    - echo -e "Hello from $P"
  rules:
    - if: '\$CHILD_TARGET == "all"'
      when: always
    - if: '\$CHILD_TARGET == "$P"'
      when: always
EOF
done
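The escaping matters because the heredoc is expanded while the YAML is being generated, not when the child pipeline runs. A minimal sketch of the difference, in plain bash outside GitLab:

```shell
#!/bin/bash
TARGET="all"   # set at generation time, like the pipeline variable

# Unescaped: $TARGET is expanded while generating the YAML, so the
# child pipeline receives a hard-coded string instead of a variable.
unescaped=$(cat << EOF
- if: '"$TARGET" == "all"'
EOF
)
echo "$unescaped"   # - if: '"all" == "all"'

# Escaped: \$CHILD_TARGET survives generation verbatim, so GitLab
# evaluates it when the child pipeline actually runs.
escaped=$(cat << EOF
- if: '\$CHILD_TARGET == "all"'
EOF
)
echo "$escaped"     # - if: '$CHILD_TARGET == "all"'
```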
So I could call it from other projects like this:
stages:
  - test

e2e-tests:
  stage: test
  variables:
    TARGET: first
  trigger:
    project: QA/tests
    branch: master
    strategy: depend
I have a pipeline with 3 stages: build, deploy-test and deploy-prod. I want the stages to have the following behavior:
- always run build
- run deploy-test automatically when on master, or manually when on other branches
- run deploy-prod manually, only available on the master branch
My pipeline configuration seems to achieve that, but I have a problem when trying to merge branches into master. I don't want to have to execute the deploy-test stage on every branch before merging. Right now I am required to, as the merge button is disabled with the message "Pipeline blocked. The pipeline for this merge request requires a manual action to proceed". The Pipelines must succeed setting in the project is disabled.
I tried adding an additional rule to prevent the deploy-test stage from running in merge requests, but it didn't change anything:
rules:
  - if: '$CI_MERGE_REQUEST_ID'
    when: never
  - if: '$CI_COMMIT_BRANCH == "master"'
    when: on_success
  - when: manual
Full pipeline configuration:
stages:
  - build
  - deploy-test
  - deploy-prod

build:
  stage: build
  script:
    - echo "build"

deploy-test:
  stage: deploy-test
  script:
    - echo "deploy-test"
  rules:
    - if: '$CI_COMMIT_BRANCH == "master"'
      when: on_success
    - when: manual

deploy-prod:
  stage: deploy-prod
  script:
    - echo "deploy-prod"
  only:
    - master
The only way I got it to work was to enable Skipped pipelines are considered successful in Settings > General > Merge requests > Merge checks, and to mark the manual step with allow_failure: true:
upload:
  stage: 'upload'
  rules:
    # Only allow uploads for a pipeline source whitelisted here.
    # See: https://docs.gitlab.com/ee/ci/jobs/job_control.html#common-if-clauses-for-rules
    - if: $CI_COMMIT_BRANCH
      when: 'manual'
      allow_failure: true
After this, clicking the Merge when pipeline succeeds button will merge the MR without any manual interaction.
I've opened a merge request from branch "mybranch" into "master" with the following .gitlab-ci.yml:
image: alpine

stages:
  - build
  - deploy-test
  - deploy-prod

build:
  stage: build
  script:
    - echo "build"

# run deploy-test automatically when on master or manually when on other branches
# Don't run on merge requests
deploy-test:
  stage: deploy-test
  script:
    - echo "deploy-test"
  rules:
    - if: $CI_MERGE_REQUEST_ID
      when: never
    - if: '$CI_COMMIT_BRANCH == "master"'
      when: on_success
    - when: manual

# run deploy-prod manually, only available on master branch
deploy-prod:
  stage: deploy-prod
  script:
    - echo "deploy-prod"
  rules:
    - if: '$CI_COMMIT_BRANCH == "master"'
      when: manual
Notes:
- only is deprecated, so I replaced it with rules and if
- I added the alpine image to make the jobs run faster (slimmer container); it doesn't affect the logic
When I pushed changes to branch "mybranch", GitLab did the following:
- showed a blue "Merge when pipeline succeeds" button on my MR
- ran the "build" stage
- skipped the "deploy-prod" stage (only available on the "master" branch)
- gave me a manual "play" button to run the "deploy-test" job on "mybranch"
- at this point, the pipeline status was "blocked" and the MR showed "Pipeline blocked. The pipeline for this merge request requires a manual action to proceed"
- I then manually started the "deploy-test" job by selecting the Play icon in the Pipelines screen
- the pipeline status indicator changed to "running" and then to "passed"
- my merge request showed the pipeline passed and gave me the green "Merge" button
There are a number of variables that are available to the pipeline at runtime - see the Predefined variables reference.
Some are available specifically for pipelines associated with merge requests - see Predefined variables for merge request pipelines.
You can utilize one or more of these variables to determine whether you want to run the deploy-test job for a given merge request.
For example, you could put the phrase "skip_cicd" in your merge request title, access it through the CI_MERGE_REQUEST_TITLE variable, and create a rule. Your pipeline would look somewhat like this (please do test the rule, I have edited the pipeline off the top of my head and could be wrong) -
stages:
  - build
  - deploy-test
  - deploy-prod

build:
  stage: build
  script:
    - echo "build"

deploy-test:
  stage: deploy-test
  script:
    - echo "deploy-test"
  rules:
    # =~ matches a regex; == does not support wildcard patterns
    - if: '$CI_MERGE_REQUEST_TITLE =~ /skip_cicd/'
      when: never
    - if: '$CI_COMMIT_BRANCH == "master"'
      when: on_success
    - when: manual

deploy-prod:
  stage: deploy-prod
  script:
    - echo "deploy-prod"
  only:
    - master