How to add logical negation rules in .gitlab-ci.yml (GitLab)

We need to support pipelines like this in GitLab:
when changes are committed under /sdk/, the SDK pipeline should run;
when anything outside /sdk/ is committed, the main pipeline should run.
The SDK pipeline script is written as below:
run_sdk_build_pipeline:
  stage: trigger
  trigger:
    strategy: depend
    include: "$CI_PROJECT_DIR/.gitlab-ci/pipelines/sdk.gitlab-ci.yml"
  rules:
    - changes:
        - sdk/**/*
The main pipeline script is written as below:
run_main_pipeline:
  stage: trigger
  trigger:
    strategy: depend
    include: "$CI_PROJECT_DIR/.gitlab-ci/pipelines/main.gitlab-ci.yml"
  rules:
    - changes:
        # want to trigger it when changes are committed anywhere except /sdk/
How do we write this rule condition?
Any help appreciated!

In your case, the only:changes / except:changes examples may help. Applied to the main pipeline (the one that should run for everything except /sdk/):
run_main_pipeline:
  stage: trigger
  trigger:
    strategy: depend
    include: "$CI_PROJECT_DIR/.gitlab-ci/pipelines/main.gitlab-ci.yml"
  except:
    changes:
      - sdk/**/*
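Since only/except are deprecated on newer GitLab versions in favour of rules, the same negation can presumably be expressed by matching the sdk/ paths first with when: never and then falling through to a catch-all. A sketch, untested:

```yaml
run_main_pipeline:
  stage: trigger
  trigger:
    strategy: depend
    include: "$CI_PROJECT_DIR/.gitlab-ci/pipelines/main.gitlab-ci.yml"
  rules:
    # skip this job when any sdk/ file changed
    - changes:
        - sdk/**/*
      when: never
    # otherwise, run
    - when: always
```

One caveat: changes matches if any changed file matches the glob, so a commit touching both sdk/ and non-sdk files would hit the first rule and skip the main pipeline.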

Related

Overriding external gitlab.yml file

Suppose we have a .gitlab-ci.yml file that reads in a common.gitlab-ci.complete.yml file:
include: common.gitlab-ci.complete.yml
Suppose that common.gitlab-ci.complete.yml has the following stages:
stages:
  - build
  - unit-test
  - mutation-test
In the .gitlab-ci.yml file, how do we ignore the unit-test and mutation-test stages? Would it be something like:
include: common.gitlab-ci.complete.yml
stages:
  - build
  #- unit-test
  #- mutation-test
Would these stages override the ones in the common.gitlab-ci.complete.yml file?
(I have not yet checked this code for accuracy. Treat it as pseudocode.)
You're asking to remove all jobs in a stage. If you look at the documentation for the include: keyword, you'll note that the YAML is always merged with your current CI script. So no, you can't just add a statement like "Disable all jobs in stage A".
Here's an idea... recall that
If a stage is defined but no jobs use it, the stage is not visible in the pipeline...
https://docs.gitlab.com/ee/ci/yaml/#stages
So what you could do is add a mechanism for disabling all jobs in that stage. Then the stage will disappear as well. That mechanism might be checking a variable in rules.
So if you wrote the common.gitlab-ci.complete.yml with templates that check a variable to disable the job...
stages:
  - build
  - unit-test
  - package

.unit-test-stage:
  stage: unit-test
  rules:
    - if: $DISABLE_UNIT_TESTS
      when: never
    # fallback rule: without it, no rule would ever match and the
    # job would never be added to the pipeline at all
    - when: on_success

unit-test-job-a:
  extends: .unit-test-stage
  script:
    - echo "do stuff"

unit-test-job-b:
  extends: .unit-test-stage
  script:
    - echo "do more stuff"
Then maybe your top-level .gitlab-ci.yml would simply set that known variable in its global variables section like this:
include: common.gitlab-ci.complete.yml

variables:
  DISABLE_UNIT_TESTS: 1

gitlab-ci.yml only on master branch

I have a gitlab-ci.yml file like this and want to run it only on the master branch. If there is a push to the develop branch, the pipeline should NOT start.
I tried the 'only' keyword, but it shows an error.
stages:
  - info
  - build
  - test
  - review
  - cleanup
  - deploy-dev
  - integration-test
  - deploy-test
  - system-test
  - deploy-production

only:
  refs:
    - master
To define a trigger rule for the whole pipeline you can use the workflow keyword, like this:
workflow:
  rules:
    - if: $CI_COMMIT_TAG
      when: never
    - if: $CI_COMMIT_BRANCH == 'master'
This has to be at the "root" of your YAML, as it is not part of any job in particular.
In the example above, I am telling the pipeline to avoid running when a repository tag is pushed and to run only when a commit is made to the master branch.
You can use this as a base and add other conditions to trigger the stages in the pipeline; the complete documentation on this matter can be found here: https://docs.gitlab.com/ee/ci/yaml/#workflow
I think you have an indentation problem here.
It should be something like:
stages:
  - job_1
  - job_2
  - ....
  - job_n

job_1:
  stage: job_1
  ....
  only:
    refs:
      - master

job_2:
  stage: job_2
  ....
  only:
    refs:
      - master
....
You need to define the target branch for each job, since only applies per job, not globally.
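Note that only/except are deprecated in current GitLab in favour of rules; assuming the same intent (run each job only on master), a per-job equivalent might look like this sketch:

```yaml
job_1:
  stage: job_1
  script:
    - echo "runs only on master"
  rules:
    # job is added to the pipeline only for commits on master
    - if: $CI_COMMIT_BRANCH == "master"
```

As with only, this has to be repeated (or shared via extends and a hidden template job) for every job that should be restricted.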

Run a specific job in GitLab CI based on a condition

I have a repo QA/tests and I want to run all its jobs when there is a push to this repo.
I used a script to generate the jobs dynamically:
job-generator:
  stage: generate
  tags:
    - kuber
  script:
    - scripts/generate-job.sh > generated-job.yml
  artifacts:
    paths:
      - generated-job.yml

main:
  trigger:
    include:
      - artifact: generated-job.yml
        job: job-generator
    strategy: depend
Next, I have another repo products/first, and I want to run a specific job in QA/tests at every push to products/first, so I tried:
stages:
  - test

tests:
  stage: test
  variables:
    TARGET: first
  trigger:
    project: QA/tests
    branch: master
    strategy: depend
Then I tried to define a global TARGET: all variable in my main gitlab-ci.yml and override it with TARGET: first in the YAML above.
generate-job.sh:
#!/bin/bash
PRODUCTS=("first" "second" "third")
for P in "${PRODUCTS[@]}"; do
cat << EOF
$P:
  stage: test
  tags:
    - kuber
  script:
    - echo -e "Hello from $P"
  rules:
    - if: '"$TARGET" == "all"'
      when: always
    - if: '"$TARGET" == $P'
      when: always
EOF
done
But no results: the downstream pipeline doesn't have any jobs at all!
Any ideas?
I am not sure if this is helpful, but from the outside this looks like an over-complicated approach. I have to say I have limited knowledge, and my answer is based on these assumptions:
- the QA/tests repository contains certain test cases for all repositories
- QA/tests has the sole purpose of containing the tests, not an overview of the projects etc.
My suggestion
As QA/tests only contains tests which should be executed against each project, I would create a Docker image out of it which contains all the tests and can actually execute them (let's call it qa-tests:latest).
Within my projects I would add a step which uses this image with the project's source code and executes the tests:
qa-test:
  image: qa-tests:latest
  script:
    - echo "command to execute scripts"
  # add rules here accordingly
This would solve the issue with each push into the repositories. For easier usage, I could create a QA-Tests.gitlab-ci.yml file which the sub-projects can include with:
include:
  - project: QA/tests
    file: QA-Tests.gitlab-ci.yml
This way you do not need to update each repository if the CI snippet changes.
Finally, to trigger the execution on each push, you only need to trigger the pipelines of the sub-projects from QA/tests.
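That last fan-out step could be a set of multi-project trigger jobs in QA/tests. A sketch; the project paths and the trigger stage are assumptions, not part of the original setup:

```yaml
# in QA/tests .gitlab-ci.yml: fan out to each sub-project on push
# (assumes a "trigger" stage is declared in stages)
trigger-first:
  stage: trigger
  trigger:
    project: products/first   # assumed sub-project path
    branch: master

trigger-second:
  stage: trigger
  trigger:
    project: products/second  # assumed sub-project path
    branch: master
```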
Disclaimer
As I said, I have only a limited view, as the goal is described but not the motivation. With this approach you remove some of the trigger calls, mainly the ones from sub-projects to QA/tests, and it creates a clear structure, but it might not fit your needs.
I solved it with:
gitlab-ci.yml:
variables:
  TARGET: all

job-generator:
  stage: generate
  tags:
    - kuber
  script:
    - scripts/generate-job.sh > generated-job.yml
  artifacts:
    paths:
      - generated-job.yml

main:
  variables:
    CHILD_TARGET: $TARGET
  trigger:
    include:
      - artifact: generated-job.yml
        job: job-generator
    strategy: depend
and use CHILD_TARGET in my generate-job.sh:
#!/bin/bash
PRODUCTS=("first" "second" "third")
for P in "${PRODUCTS[@]}"; do
cat << EOF
$P:
  stage: test
  tags:
    - kuber
  script:
    - echo -e "Hello from $P"
  rules:
    - if: '\$CHILD_TARGET == "all"'
      when: always
    - if: '\$CHILD_TARGET == "$P"'
      when: always
EOF
done
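The escaping is the crux of the fix: in an unquoted heredoc, $VAR is expanded immediately by the shell that generates the file, while \$VAR survives as a literal $VAR for GitLab to evaluate later in the child pipeline's rules. A minimal sketch of the difference (TARGET and CHILD_TARGET stand in for the variables above):

```shell
#!/bin/bash
# Unquoted heredoc: $TARGET is expanded now by the generating shell,
# while the escaped \$CHILD_TARGET is emitted literally, so the
# downstream pipeline sees it as a CI variable reference.
TARGET="all"
cat << EOF
expanded-now: $TARGET
expanded-later: \$CHILD_TARGET
EOF
```

This is why the question's unescaped rules never reached the child pipeline as variable checks: they were already expanded to fixed strings at generation time.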
So I could call it from other projects like this:
stages:
  - test

e2e-tests:
  stage: test
  variables:
    TARGET: first
  trigger:
    project: QA/tests
    branch: master
    strategy: depend

GitLab CI ignores jobs in child pipeline

I built a .gitlab-ci.yaml which looks like this and triggers two child pipelines:
stages:
  - prepare
  - triggers

include: 'global-gitlab-ci.yaml'
...
frontend:
  stage: triggers
  trigger:
    include: frontend-gitlab-ci.yaml

backend:
  stage: triggers
  trigger:
    include: backend-gitlab-ci.yaml
The child pipelines both look like this:
stages:
  - build
  - test

include: 'global-gitlab-ci.yaml'

test_frontend:
  stage: test
  image: ...
  script:
    - ...

build_frontend:
  stage: build
  image: ...
  script:
    - ...
When I run these pipelines, I get the following: the frontend pipeline (as well as the backend pipeline) only shows one job; the second one is ignored.
What's wrong? Is a child pipeline not supposed to contain a complete pipeline with several jobs?
(Btw, global-gitlab-ci.yaml just contains default: definitions.)

Trigger a pipeline when specific file is changed

I am creating a new CI pipeline that should be triggered anytime a .bicep file is changed, and should then zip up all of the files.
# Pipeline is triggered anytime there is a change to .bicep files
trigger:
  branches:
    include:
      - "feature/*"

pool:
  vmImage: ubuntu-latest

steps:
  - script: echo Hello, world!
    displayName: 'Run a one-line script'
This pipeline works and is triggered anytime a change is made in the feature branch.
To target any .bicep files I am trying:
trigger:
  branches:
    include:
      - "feature/*"
  paths:
    include:
      - '**/*.bicep'
I also tried specifying the entire path that holds the files:
trigger:
  branches:
    include:
      - "feature/*"
  paths:
    include:
      - "src/Deployment/IaC/Bicep/*"
When I make a change to a .bicep file in the feature branch, the pipeline is never triggered so I know my syntax is wrong.
Wildcards are no longer supported in Azure Pipelines path filters.
Instead, just set the relative path to your Bicep folder, like so:
paths:
  include:
    - src/Deployment/IaC/Bicep
see : https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/azure-repos-git?tabs=yaml&view=azure-devops#paths
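Combining the question's branch filter with the answer's wildcard-free path filter, the full trigger block would presumably look like:

```yaml
# trigger only for pushes to feature/* branches that touch the Bicep folder
trigger:
  branches:
    include:
      - feature/*
  paths:
    include:
      - src/Deployment/IaC/Bicep
```

Note that the path filter matches the whole directory, so non-.bicep files under that folder will also trigger the pipeline.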
