GitLab CI ignores jobs in child pipeline

I built a .gitlab-ci.yml which looks like this and triggers two child pipelines:
stages:
  - prepare
  - triggers

include: 'global-gitlab-ci.yaml'

...

frontend:
  stage: triggers
  trigger:
    include: frontend-gitlab-ci.yaml

backend:
  stage: triggers
  trigger:
    include: backend-gitlab-ci.yaml
The child pipelines both look like this:
stages:
  - build
  - test

include: 'global-gitlab-ci.yaml'

test_frontend:
  stage: test
  image: ...
  script:
    - ...

build_frontend:
  stage: build
  image: ...
  script:
    - ...
When I run these pipelines, I get the following: the frontend pipeline (as well as the backend pipeline) only shows one job; the second one is ignored.
What's the matter? Is a child pipeline not supposed to contain a complete pipeline with several jobs?
(Btw global-gitlab-ci.yaml just contains default: definitions)

Related

How to add logical negation rules in .gitlab-ci.yml

We need to support the following pipelines in GitLab:
when changes are committed in /sdk/, the sdk pipeline will run;
when changes are committed anywhere except /sdk/, the main pipeline will run.
The SDK pipeline trigger job is written as below:
run_sdk_build_pipeline:
  stage: trigger
  trigger:
    strategy: depend
    include: "$CI_PROJECT_DIR/.gitlab-ci/pipelines/sdk.gitlab-ci.yml"
  rules:
    - changes:
        - sdk/**/*
The main pipeline trigger job is written as below:
run_main_pipeline:
  stage: trigger
  trigger:
    strategy: depend
    include: "$CI_PROJECT_DIR/.gitlab-ci/pipelines/main.gitlab-ci.yml"
  rules:
    - changes:
        # want to trigger it when changes are committed anywhere except /sdk/*
How do I write this rule condition? Any help would be appreciated!
In your case, the only:changes / except:changes examples may help:
run_sdk_build_pipeline:
  stage: trigger
  trigger:
    strategy: depend
    include: "$CI_PROJECT_DIR/.gitlab-ci/pipelines/sdk.gitlab-ci.yml"
  except:
    changes:
      - sdk/**/*
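If you prefer to stay with the newer rules: syntax instead of only/except, a sketch of the same negation is possible with an early when: never rule (paths taken from the question; rules are evaluated in order and the first match wins):

run_main_pipeline:
  stage: trigger
  trigger:
    strategy: depend
    include: "$CI_PROJECT_DIR/.gitlab-ci/pipelines/main.gitlab-ci.yml"
  rules:
    - changes:
        - sdk/**/*
      when: never        # skip the main pipeline when the changes touch sdk/
    - when: always       # otherwise, run it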

Run a specific job in GitLab CI based on a condition

I have a repo QA/tests in which I want to run all the jobs whenever there is a push to this repo.
I use a script to generate the jobs dynamically:
job-generator:
  stage: generate
  tags:
    - kuber
  script:
    - scripts/generate-job.sh > generated-job.yml
  artifacts:
    paths:
      - generated-job.yml

main:
  trigger:
    include:
      - artifact: generated-job.yml
        job: job-generator
    strategy: depend
Next, I have another repo products/first from which I want to run one specific job in QA/tests on every push to products/first, so I tried:
stages:
  - test

tests:
  stage: test
  variables:
    TARGET: first
  trigger:
    project: QA/tests
    branch: master
    strategy: depend
Then I tried to define a global TARGET: all variable in my main .gitlab-ci.yml and override it with TARGET: first in the above YAML.
generate-job.sh:
#!/bin/bash
PRODUCTS=("first" "second" "third")

for P in "${PRODUCTS[@]}"; do
cat << EOF
$P:
  stage: test
  tags:
    - kuber
  script:
    - echo -e "Hello from $P"
  rules:
    - if: '"$TARGET" == "all"'
      when: always
    - if: '"$TARGET" == $P'
      when: always
EOF
done
But no luck: the downstream pipeline doesn't have any jobs at all!
Any ideas?
I am not sure if this is helpful, but this looks like an overcomplicated approach from the outside. I have to say I have limited knowledge and my answer is based on these assumptions:
- the QA/tests repository contains certain test cases for all repositories
- QA/tests has the sole purpose of containing the tests, not an overview of the projects etc.
My Suggestion
As QA/tests only contains tests which should be executed against each project, I would create a Docker image out of it which contains all the tests and can actually execute them (let's call it qa-tests:latest).
Within my projects I would add a job which uses this image with the project's source code and executes the tests:
qa-test:
  image: qa-tests:latest
  script:
    - echo "command to execute scripts"
  # add rules here accordingly
This would solve the issue of running on each push to the repositories. For easier usage, I could create a QA-Tests.gitlab-ci.yml file which can be included by the sub-projects with:
include:
  - project: QA/tests
    file: QA-Tests.gitlab-ci.yml
This way you do not need to make updates within the repositories if the CI snippet changes.
Finally, to trigger the execution on each push, you only need to trigger the pipelines of the sub-projects from QA/tests.
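For example, a minimal sketch of such a trigger job inside QA/tests (the project path products/first comes from the question; the stage name is an assumption):

trigger-first:
  stage: trigger           # assumed stage name
  trigger:
    project: products/first
    branch: master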
Disclaimer
As I said, I have only a limited view, as the goal is described but not the motivation. With this approach you remove some of the cross-project trigger calls, mainly the ones triggering from the sub-projects to QA/tests. It also gives a clear structure, but it might not fit your needs.
I solved it with:
.gitlab-ci.yml:
variables:
  TARGET: all

job-generator:
  stage: generate
  tags:
    - kuber
  script:
    - scripts/generate-job.sh > generated-job.yml
  artifacts:
    paths:
      - generated-job.yml

main:
  variables:
    CHILD_TARGET: $TARGET
  trigger:
    include:
      - artifact: generated-job.yml
        job: job-generator
    strategy: depend
and use CHILD_TARGET in my generate-job.sh:
#!/bin/bash
PRODUCTS=("first" "second" "third")

for P in "${PRODUCTS[@]}"; do
cat << EOF
$P:
  stage: test
  tags:
    - kuber
  script:
    - echo -e "Hello from $P"
  rules:
    - if: '\$CHILD_TARGET == "all"'
      when: always
    - if: '\$CHILD_TARGET == "$P"'
      when: always
EOF
done
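For reference, the script then emits YAML along these lines for each product (the backslash keeps $CHILD_TARGET literal in the generated file, while $P is expanded at generation time), e.g. for first:

first:
  stage: test
  tags:
    - kuber
  script:
    - echo -e "Hello from first"
  rules:
    - if: '$CHILD_TARGET == "all"'
      when: always
    - if: '$CHILD_TARGET == "first"'
      when: always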
So I could call it from other projects like this (the TARGET set on the trigger job is passed to the downstream QA/tests pipeline, which forwards it to the generated child pipeline as CHILD_TARGET):
stages:
  - test

e2e-tests:
  stage: test
  variables:
    TARGET: first
  trigger:
    project: QA/tests
    branch: master
    strategy: depend

GitLab CI parent/child pipelines with complex subfolders

I have a problem with GitLab CI (Community Edition, version 14.1.2) and a complex pipeline on my monorepo.
My structure is client/server:
root/
---- server/
-------- lib/
----------- libA/
----------- libB/
----------- libC/
-------- applications/
----------- appA/
----------- appB/
----------- appC/
---- client/
-------- applications/
------------- appA/
------------- appB/
...
Every folder (root, server, lib, libA, libB, libC etc.) has its own ".gitlab-ci.yml".
Root ".gitlab-ci.yml" is:
stages:
  - build
  - test

build-server:
  stage: build
  trigger:
    include:
      - local: 'server/.gitlab-ci.yml'
  rules:
    - changes:
        - server/**/*

build-client:
  stage: build
  trigger:
    include:
      - local: 'client/.gitlab-ci.yml'
  rules:
    - changes:
        - client/**/*
Server ".gitlab-ci.yml" is:
stages:
  - build
  - test

build-lib:
  stage: build
  trigger:
    include:
      - local: 'lib/.gitlab-ci.yml'
  rules:
    - changes:
        - lib/**/*

build-applications:
  stage: build
  trigger:
    include:
      - local: 'applications/.gitlab-ci.yml'
  rules:
    - changes:
        - applications/**/*
lib ".gitlab-ci.yml" is:
stages:
  - build
  - test

build-libA:
  stage: build
  script:
    - echo "Execute libA build!"
  rules:
    - changes:
        - libA/**/*

build-libB:
  stage: build
  script:
    - echo "Execute libB build!"
  rules:
    - changes:
        - libB/**/*
If I change a file inside libA, only the ".gitlab-ci.yml" in the root folder is triggered; the other subfolders do not detect the file changes and do not trigger the build.
The purpose of this configuration is that, for example, when I change a file inside libA, the pipeline detects the change and builds libA.
Can someone help me resolve this? I hope the structure and the problem are clear. Thanks.
UPDATE
I'm using GitLab 14.1.0.
Thanks to DavidC for the answer, but his solution did not solve my problem; in particular, $CI_PROJECT_PATH in the trigger does not seem to work.
After some time I finally got a solution (which can be evolved with variables).
Root ".gitlab-ci.yml" is:
stages:
  - build
  - test

build-server:
  stage: build
  trigger:
    include:
      - local: '/server/.gitlab-ci.yml'
  rules:
    - changes:
        - server/**/*

build-client:
  stage: build
  trigger:
    include:
      - local: '/client/.gitlab-ci.yml'
  rules:
    - changes:
        - client/**/*
Server ".gitlab-ci.yml" is:
stages:
  - build
  - test

build-lib:
  stage: build
  trigger:
    include:
      - local: '/server/lib/.gitlab-ci.yml'
  rules:
    - changes:
        - server/lib/**/*

build-applications:
  stage: build
  trigger:
    include:
      - local: '/server/applications/.gitlab-ci.yml'
  rules:
    - changes:
        - server/applications/**/*
lib ".gitlab-ci.yml" is:
stages:
  - build
  - test

build-libA:
  stage: build
  script:
    - echo "Execute libA build!"
  rules:
    - changes:
        - server/lib/libA/**/*

build-libB:
  stage: build
  script:
    - echo "Execute libB build!"
  rules:
    - changes:
        - server/lib/libB/**/*
Pay attention to this line from the GitLab documentation: "Parent and child pipelines were introduced with a maximum depth of one child pipeline level, which was subsequently increased to two. A parent pipeline can activate many child pipelines and these child pipelines can activate their own child pipelines. It is not possible to activate another level of child pipeline." See: https://docs.gitlab.com/ee/ci/pipelines/parent_child_pipelines.html#nested-child-pipelines
Thanks for the help!
It seems like the GitLab child-pipeline execution context path is the root directory of your repository, and is not relative to the path of the child pipeline's gitlab-ci.yml file.
The only solution so far seems to be either to pass the path of the directory you want as a variable to your child pipeline or to define it directly in the .gitlab-ci.yml.
Example:
Root ".gitlab-ci.yml" is:
stages:
  - build
  - test

build-server:
  stage: build
  variables:
    CI_ROOT_DIR: server   # you don't need to provide it, if you define it in the server/.gitlab-ci.yml file
  trigger:
    include:
      - local: '$CI_ROOT_DIR/.gitlab-ci.yml'
  rules:
    - changes:
        - $CI_ROOT_DIR/**/*

build-client:
  stage: build
  variables:
    CI_ROOT_DIR: client
  trigger:
    include:
      - local: '$CI_ROOT_DIR/.gitlab-ci.yml'
  rules:
    - changes:
        - $CI_ROOT_DIR/**/*
Server ".gitlab-ci.yml" is:
stages:
  - build
  - test

variables:
  CI_ROOT_DIR: $CI_PROJECT_PATH/server   # default

build-lib:
  stage: build
  variables:
    CI_ROOT_DIR: $CI_ROOT_DIR/lib   # you don't need to provide it, if you define it in the server/lib/.gitlab-ci.yml file
  trigger:
    include:
      - local: '$CI_ROOT_DIR/.gitlab-ci.yml'
  rules:
    - changes:
        - $CI_ROOT_DIR/**/*

build-applications:
  stage: build
  variables:
    CI_ROOT_DIR: $CI_ROOT_DIR/applications   # you don't need to provide it, if you define it in the server/applications/.gitlab-ci.yml file
  trigger:
    include:
      - local: '$CI_ROOT_DIR/.gitlab-ci.yml'
  rules:
    - changes:
        - $CI_ROOT_DIR/**/*
lib ".gitlab-ci.yml" is:
stages:
  - build
  - test

variables:
  CI_ROOT_DIR: $CI_PROJECT_PATH/server/lib   # default

build-libA:
  stage: build
  script:
    - echo "Execute libA build!"
  rules:
    - changes:
        - $CI_ROOT_DIR/libA/**/*

build-libB:
  stage: build
  script:
    - echo "Execute libB build!"
  rules:
    - changes:
        - $CI_ROOT_DIR/libB/**/*
It would be better, though, if it were possible to choose the execution context of the pipeline when it is triggered by the parent pipeline, or to have a CI_CHILD_PIPELINE_DIR variable among GitLab's predefined environment variables.

How do I make a stage pass only if one or more jobs succeed in GitLab CI?

I have a .gitlab-ci.yml that looks like this:
image: "python:3.7"
.python-tag:
tags:
- python
before_script:
- python --version
- pip install -r requirements.txt
- export PYTHONPATH=${PYTHONPATH}:./src
- python -c "import sys;print(sys.path)"
stages:
- Static Analysis
- Local Tests
- Integration Tests
- Deploy
mypy:
stage: Static Analysis
extends:
- .python-tag
script:
- mypy .
pytest-smoke:
stage: Local Tests
extends:
- .python-tag
script:
- pytest -m smoke
int-tests-1:
stage: Integration Tests
when: manual
allow_failure: false
trigger:
project: tests/gitlab-integration-testing-integration-tests
strategy: depend
int-tests-2:
stage: Integration Tests
when: manual
allow_failure: false
trigger:
project: tests/gitlab-integration-testing-integration-tests
strategy: depend
deploy:
stage: Deploy
extends:
- .python-tag
script:
- echo "Deployed!"
The Integration Tests stage has multiple jobs in it that take a decent chunk of time to run. It is unusual that all of the integration tests need to be run, which is why we put a manual flag on them; the specific ones needed are run manually.
How do I make it so that the Deploy stage requires that one or more of the jobs in Integration Tests has passed? I can either require all of them, like I have now, or none of them, by removing allow_failure: false from the integration test jobs.
I want to require that at least one has passed.
If each integration test job generates an artifact only when the job is successful:

artifacts:
  paths:
    - success.txt
script:
  - ...   # run the tests, then generate the success.txt file

you should be able to test whether the file exists in the next stage. GitLab downloads artifacts from jobs in earlier stages automatically; if you want to be explicit about which jobs the deploy job takes the file from, list them:

dependencies:
  - int-tests-1
  - int-tests-2
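Putting it together, a minimal sketch (note that the integration jobs in the question are trigger jobs, which cannot run a script or upload artifacts themselves, so this assumes script-based jobs or small companion jobs that create the marker; run-int-tests.sh is a hypothetical command):

int-tests-1:
  stage: Integration Tests
  when: manual
  allow_failure: false
  script:
    - ./run-int-tests.sh     # hypothetical integration test command
    - touch success.txt      # only reached when the tests pass
  artifacts:
    paths:
      - success.txt

deploy:
  stage: Deploy
  extends:
    - .python-tag
  script:
    - test -f success.txt    # fail the deploy unless at least one integration job uploaded the marker
    - echo "Deployed!"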

Azure DevOps - two dependent YAML pipelines in one repository

I have two .yml files in my repo: one for build, one for deployment. The main reason I would like to keep the build separate from the deployment is that I also want to store variables for environments in my repo, i.e. in variables-dev.yml and variables-prod.yml files, so there is no need to create a new build every time (which includes running tests, building the Docker image, etc.).
The file build.yml:
trigger:
  paths:
    exclude:
      - build.yml
      - deploy.yml

stages:
  - stage: build
    jobs:
      ...
And the deploy.yml, which I want to be triggered only on completion of the build pipeline. That's why I add the exclusion of all paths at the top, but add a trigger on the pipeline resource.
trigger:
  paths:
    exclude:
      - '*'

resources:
  pipelines:
    - pipeline: build
      source: build
      trigger:
        branches:
          include:
            - '*'

stages:
  - stage: dev
    variables:
      - template: variables-dev.yml
    jobs:
      - deployment: deploy_dev
        environment: 'dev'
        strategy:
          runOnce:
            deploy:
              steps:
                ...

  - stage: prod
    dependsOn: dev
    variables:
      - template: variables-prod.yml
    jobs:
      - deployment: deploy_prod
        environment: 'prod'
        strategy:
          runOnce:
            deploy:
              steps:
                ...
Unfortunately it does not seem to work. The top trigger blocks the lower trigger, and if I remove the top trigger then the deploy pipeline is triggered at the same time as the build one.
You have to start your deploy.yml with trigger: none:
trigger: none

resources:
  pipelines:
    - pipeline: ci-pipeline
      source: my-build-pipeline
      trigger:
        enabled: true
        branches:
          include:
            - master
Set the triggers for the second .yml to none, then add this setting in the "Triggers" section of the UI. It will stage your builds as you describe.
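Adapted to the pipelines from the question, the top of deploy.yml could look like this (a sketch; it assumes the build pipeline's definition in Azure DevOps is named build, and it reuses the resources block the question already has):

trigger: none              # never trigger on pushes; only run via the pipeline resource below

resources:
  pipelines:
    - pipeline: build      # local alias for the resource
      source: build        # name of the build pipeline definition in Azure DevOps
      trigger:
        branches:
          include:
            - '*'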
