I have a problem with GitLab CI (Community Edition, version 14.1.2) and a complex pipeline in my monorepo.
My structure is client/server:
root/
---- server/
-------- lib/
----------- libA/
----------- libB/
----------- libC/
-------- applications/
----------- appA/
----------- appB/
----------- appC/
---- client/
-------- applications/
------------- appA/
------------- appB/
...
Every folder (root, server, lib, libA, libB, libC, etc.) has its own ".gitlab-ci.yml".
Root ".gitlab-ci.yml" is:
stages:
  - build
  - test

build-server:
  stage: build
  trigger:
    include:
      - local: 'server/.gitlab-ci.yml'
  rules:
    - changes:
        - server/**/*

build-client:
  stage: build
  trigger:
    include:
      - local: 'client/.gitlab-ci.yml'
  rules:
    - changes:
        - client/**/*
Server ".gitlab-ci.yml" is:
stages:
  - build
  - test

build-lib:
  stage: build
  trigger:
    include:
      - local: 'lib/.gitlab-ci.yml'
  rules:
    - changes:
        - lib/**/*

build-applications:
  stage: build
  trigger:
    include:
      - local: 'applications/.gitlab-ci.yml'
  rules:
    - changes:
        - applications/**/*
lib ".gitlab-ci.yml" is:
stages:
  - build
  - test

build-libA:
  stage: build
  script:
    - echo "Execute libA build!"
  rules:
    - changes:
        - libA/**/*

build-libB:
  stage: build
  script:
    - echo "Execute libB build!"
  rules:
    - changes:
        - libB/**/*
If I change a file inside libA, only the root ".gitlab-ci.yml" is triggered; the other subfolders do not detect the file change and do not trigger their builds.
The purpose of this configuration is that, for example, when I change a file inside libA, the pipeline detects the change and builds libA.
Can someone help me resolve this? I hope the structure and the problem are clear. Thanks.
UPDATE
I'm using GitLab 14.1.0.
Thanks to DavidC for the answer, but your solution did not solve my problem; in particular, $CI_PROJECT_PATH does not seem to work with trigger.
After some time I finally got a solution (which can be evolved with variables).
Root ".gitlab-ci.yml" is:
stages:
  - build
  - test

build-server:
  stage: build
  trigger:
    include:
      - local: '/server/.gitlab-ci.yml'
  rules:
    - changes:
        - server/**/*

build-client:
  stage: build
  trigger:
    include:
      - local: '/client/.gitlab-ci.yml'
  rules:
    - changes:
        - client/**/*
Server ".gitlab-ci.yml" is:
stages:
  - build
  - test

build-lib:
  stage: build
  trigger:
    include:
      - local: '/server/lib/.gitlab-ci.yml'
  rules:
    - changes:
        - server/lib/**/*

build-applications:
  stage: build
  trigger:
    include:
      - local: '/server/applications/.gitlab-ci.yml'
  rules:
    - changes:
        - server/applications/**/*
lib ".gitlab-ci.yml" is:
stages:
  - build
  - test

build-libA:
  stage: build
  script:
    - echo "Execute libA build!"
  rules:
    - changes:
        - server/lib/libA/**/*

build-libB:
  stage: build
  script:
    - echo "Execute libB build!"
  rules:
    - changes:
        - server/lib/libB/**/*
Pay attention to this line from the GitLab documentation: "Parent and child pipelines were introduced with a maximum depth of one child pipeline level, which was subsequently increased to two. A parent pipeline can activate many child pipelines and these child pipelines can activate their own child pipelines. It is not possible to activate another level of child pipeline." (see https://docs.gitlab.com/ee/ci/pipelines/parent_child_pipelines.html#nested-child-pipelines). In this structure, root → server → lib already uses both levels, so the lib pipeline cannot trigger further children of its own.
Thanks for the help!
It seems that the GitLab child-pipeline execution context is the root directory of your repository, not the directory containing the child pipeline's gitlab-ci.yml file.
The only solution so far seems to be to either pass the path of the directory you want as a variable to your child pipeline, or to define it directly in its .gitlab-ci.yml.
Example:
Root ".gitlab-ci.yml" is:
stages:
  - build
  - test

build-server:
  stage: build
  variables:
    CI_ROOT_DIR: server  # you don't need to provide it if you define it in the server/.gitlab-ci.yml file
  trigger:
    include:
      - local: '$CI_ROOT_DIR/.gitlab-ci.yml'
  rules:
    - changes:
        - $CI_ROOT_DIR/**/*

build-client:
  stage: build
  variables:
    CI_ROOT_DIR: client
  trigger:
    include:
      - local: '$CI_ROOT_DIR/.gitlab-ci.yml'
  rules:
    - changes:
        - $CI_ROOT_DIR/**/*
Server ".gitlab-ci.yml" is:
stages:
  - build
  - test

variables:
  CI_ROOT_DIR: $CI_PROJECT_PATH/server  # default

build-lib:
  stage: build
  variables:
    CI_ROOT_DIR: $CI_ROOT_DIR/lib  # you don't need to provide it if you define it in the server/lib/.gitlab-ci.yml file
  trigger:
    include:
      - local: '$CI_ROOT_DIR/.gitlab-ci.yml'
  rules:
    - changes:
        - $CI_ROOT_DIR/**/*

build-applications:
  stage: build
  variables:
    CI_ROOT_DIR: $CI_ROOT_DIR/applications  # you don't need to provide it if you define it in the server/applications/.gitlab-ci.yml file
  trigger:
    include:
      - local: '$CI_ROOT_DIR/.gitlab-ci.yml'
  rules:
    - changes:
        - $CI_ROOT_DIR/**/*
lib ".gitlab-ci.yml" is:
stages:
  - build
  - test

variables:
  CI_ROOT_DIR: $CI_PROJECT_PATH/server/lib  # default

build-libA:
  stage: build
  script:
    - echo "Execute libA build!"
  rules:
    - changes:
        - $CI_ROOT_DIR/libA/**/*

build-libB:
  stage: build
  script:
    - echo "Execute libB build!"
  rules:
    - changes:
        - $CI_ROOT_DIR/libB/**/*
It would be better, though, if it were possible to choose the execution context of the pipeline when it is triggered by the parent pipeline, or if GitLab offered a predefined CI_CHILD_PIPELINE_DIR environment variable.
Related
I have a problem with GitLab CI child pipelines.
I need to trigger a CI pipeline automatically after each commit in a repo that contains more than one app, and I need to detect which folders/files were modified in order to know which app's pipeline to trigger.
Example of structure
Main/
---- applicationsA/
-------- appA1/
-------- appA2/
-------- appA3/
---- applicationsB/
-------- appB1/
-------- appB2/
-------- appB3/
Main ".gitlab-ci.yml" is:
workflow:
  rules:
    - if: '$CI_PIPELINE_SOURCE == "web"'
      variables:
        APPNAME: $APPNAME

stages:
  - child-pipelines

appA1:
  stage: child-pipelines
  trigger:
    include:
      - local: applicationsA/appA1/.gitlab-ci.yml
    strategy: depend
  rules:
    - if: '$APPNAME == "appA1" && $CI_PIPELINE_SOURCE == "web"'

appA2:
  stage: child-pipelines
  trigger:
    include:
      - local: applicationsA/appA2/.gitlab-ci.yml
    strategy: depend
  rules:
    - if: '$APPNAME == "appA2" && $CI_PIPELINE_SOURCE == "web"'
...
appA1 ".gitlab-ci.yml" is:
stages:
  - build
  - test

build-appA1:
  stage: build
  script:
    - echo "Execute appA1 build!"

publish-appA1:
  stage: build
  script:
    - echo "Execute appA1 publish!"
appA2 ".gitlab-ci.yml" is:
stages:
  - build
  - test

build-appA2:
  stage: build
  script:
    - echo "Execute appA2 build!"

publish-appA2:
  stage: build
  script:
    - echo "Execute appA2 publish!"
The purpose of this configuration is that, for example, when I change a file inside app**, the pipeline detects the change and builds that app.
You can use rules:changes with a glob pattern and only run a certain job if anything changes in the specific app folder:
appA1:
  stage: child-pipelines
  trigger:
    include:
      - local: applicationsA/appA1/.gitlab-ci.yml
    strategy: depend
  rules:
    - if: '$APPNAME == "appA1" && $CI_PIPELINE_SOURCE == "web"'
      changes:
        - applicationsA/appA1/**/*
I have a monorepo where each package should be built as a docker image.
I created a trigger job for each package that triggers a child pipeline.
In the MR, my changes rule is being ignored and all child pipelines are triggered.
.gitlab-ci.yml
---
workflow:
  rules:
    - if: $CI_MERGE_REQUEST_ID || $CI_COMMIT_BRANCH

trigger-package-a:
  stage: build
  trigger:
    include: .gitlab/ci/packages/package-gitlab-ci.yml
    strategy: depend
  rules:
    - changes:
        - "packages/package-a/**/*"
  variables:
    PACKAGE: package-a

trigger-package-b:
  stage: build
  trigger:
    include: .gitlab/ci/packages/package-gitlab-ci.yml
    strategy: depend
  rules:
    - changes:
        - "packages/package-b/**/*"
  variables:
    PACKAGE: package-b

done_job:
  stage: deploy
  script:
    - "echo DONE"
    - "cat config.json"

stages:
  - build
  - deploy
package-gitlab-ci.yml
workflow:
  rules:
    - if: $CI_MERGE_REQUEST_ID
    - changes:
        - "packages/${PACKAGE}/**/*"

stages:
  - bootstrap
  - validate

cache:
  key: "${PACKAGE}_${CI_COMMIT_REF_SLUG}"
  paths:
    - packages/${PACKAGE}/node_modules/
  policy: pull

install-package:
  stage: bootstrap
  script:
    - echo ${PACKAGE}
    - echo "{\"package\":\"${PACKAGE}\"}" > config.json
    - "cd packages/${PACKAGE}/"
    - yarn install --frozen-lockfile
  artifacts:
    paths:
      - config.json
  cache:
    key: "${PACKAGE}_${CI_COMMIT_REF_SLUG}"
    paths:
      - packages/${PACKAGE}/node_modules/
    policy: pull-push

lint-package:
  script:
    - yarn lint
  stage: validate
  needs: [install-package]
  before_script:
    - "cd packages/${PACKAGE}/"

test-package:
  stage: validate
  needs: [lint-package]
  before_script:
    - "echo working on ${PACKAGE}"
    - "cd packages/${PACKAGE}/"
  rules:
    - if: $CI_MERGE_REQUEST_ID
  script:
    - yarn test
It looks like your downstream pipeline is defining a workflow with 2 independent rules: if and changes. This may cause the jobs to be included whenever the first condition, the if, is met, i.e. whenever it is an MR pipeline. Try removing the dash in front of changes, as in the example here, to treat this as a single rule:
workflow:
  rules:
    - if: $CI_MERGE_REQUEST_ID
      changes:
        - "packages/${PACKAGE}/**/*"
EDIT: A recent issue states that rules:changes does not work as expected with trigger. So you may actually need to remove the changes rule from the upstream pipeline and solve this in the downstream pipeline.
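A minimal sketch of that workaround, reusing the job names from the question (whether variables such as ${PACKAGE} expand inside rules:changes depends on your GitLab version, so treat that part as an assumption):

# Upstream .gitlab-ci.yml: trigger unconditionally; no changes rule here.
trigger-package-a:
  stage: build
  variables:
    PACKAGE: package-a
  trigger:
    include: .gitlab/ci/packages/package-gitlab-ci.yml
    strategy: depend

# Downstream package-gitlab-ci.yml: gate the whole child pipeline instead.
workflow:
  rules:
    - if: $CI_MERGE_REQUEST_ID
      changes:
        - "packages/${PACKAGE}/**/*"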
Side note, not directly related to your issue: the GitLab docs provide a workflow template to run branch or MR pipelines without creating duplicate pipelines. You can use it in your upstream pipeline if it helps:
workflow:
  rules:
    - if: '$CI_PIPELINE_SOURCE == "merge_request_event"'
    - if: '$CI_COMMIT_BRANCH && $CI_OPEN_MERGE_REQUESTS'
      when: never
    - if: '$CI_COMMIT_BRANCH'
I am trying to trigger the GitLab CI/CD pipeline of projectA from projectB.
gitlab-ci.yml of projectB:
stages:
  - deploy

staging:
  stage: deploy
  trigger:
    project: projectA
    branch: main
And my gitlab-ci.yml of projectA is:
image: docker:19.03.13

stages:
  - build
  - staging

fromprojectB:  # <--- when I trigger the pipeline from B, I want only this job to run in the build stage
  stage: build
  script:
    - ..............

fromprojectC:
  stage: build

fromprojectD:
  stage: build
  script:
    - .......

deploy-to-stage:
  stage: staging
  script:
    - ............
I want only the fromprojectB job among the build-stage jobs to run when the pipeline is triggered from projectB.
How can I do this?
You can pass environment variables from projectB and then use them in rules in projectA.
Example based on your input:
projectA:

stages:
  - build

fromprojectB:
  stage: build
  rules:
    - if: '$RUN_JOB_B'
      when: always
    - when: never
  allow_failure: true
  script:
    - echo "Compiling the code from project B"
    - echo "Compile complete."

fromprojectC:
  stage: build
  rules:
    - if: '$RUN_JOB_C'
      when: always
    - when: never
  allow_failure: true
  script:
    - echo "Compiling the code from project C"
    - echo "Compile complete."
and projectB:
stages:
  - build

build-job:
  stage: build
  variables:
    RUN_JOB_B: "true"
  trigger:
    project: itersive/internal/rd/projecta
    branch: main
(Screenshots: the resulting pipelines in projectB and in projectA.)
In this setup you cannot run a pipeline in projectA on its own: all jobs require environment variables (see their rules sections).
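If you still need to be able to run projectA's pipeline on its own, one option (a sketch, not part of the original answer) is an extra rule that permits a manual run when the pipeline is started from the web UI:

fromprojectB:
  stage: build
  rules:
    - if: '$RUN_JOB_B'
      when: always
    - if: '$CI_PIPELINE_SOURCE == "web"'
      when: manual
    - when: never
  allow_failure: true
  script:
    - echo "Compiling the code from project B"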
I have a repo QA/tests, and I want to run all of its jobs when there is a push to this repo.
I used a script to generate the jobs dynamically:
job-generator:
  stage: generate
  tags:
    - kuber
  script:
    - scripts/generate-job.sh > generated-job.yml
  artifacts:
    paths:
      - generated-job.yml

main:
  trigger:
    include:
      - artifact: generated-job.yml
        job: job-generator
    strategy: depend
Next, I have another repo, products/first, and I want every push to products/first to run a specific job in QA/tests, so I tried:
stages:
  - test

tests:
  stage: test
  variables:
    TARGET: first
  trigger:
    project: QA/tests
    branch: master
    strategy: depend
Then I tried to define a global TARGET: all variable in my main gitlab-ci.yml and override it with TARGET: first in the YAML above.
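For clarity, a hypothetical reconstruction of that attempt, i.e. the default defined in QA/tests' main gitlab-ci.yml and meant to be overridden by the upstream trigger job:

variables:
  TARGET: all  # hypothetical default; products/first overrides it via its trigger job's variables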
generate-job.sh:
#!/bin/bash
PRODUCTS=("first" "second" "third")

for P in "${PRODUCTS[@]}"; do
cat << EOF
$P:
  stage: test
  tags:
    - kuber
  script:
    - echo -e "Hello from $P"
  rules:
    - if: '"$TARGET" == "all"'
      when: always
    - if: '"$TARGET" == $P'
      when: always
EOF
done
But no results: the downstream pipeline doesn't have any jobs at all!
Any idea?
I am not sure if this is helpful, but from the outside this looks like an over-complicated approach. I have to say I have limited knowledge, and my answer is based on these assumptions:
- the QA/tests repository contains certain test cases for all repositories
- QA/tests has the sole purpose of containing the tests, not an overview of the projects, etc.
My Suggestion
As QA/tests only contains tests which should be executed against each project, I would create a Docker image out of it which contains all the tests and can actually execute them (let's call it qa-tests:latest).
Within my projects I would add a step which uses this image with the source code of the project and executes the tests:
qa-test:
  image: qa-tests:latest
  script:
    - echo "command to execute scripts"
  # add rules here accordingly
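For completeness, a rough sketch of how QA/tests itself might build and publish that image; the job name and Docker-in-Docker setup are assumptions, and the registry variables are GitLab's predefined ones:

build-qa-image:
  image: docker:19.03.13
  services:
    - docker:19.03.13-dind
  script:
    - docker login -u $CI_REGISTRY_USER -p $CI_REGISTRY_PASSWORD $CI_REGISTRY
    - docker build -t $CI_REGISTRY_IMAGE/qa-tests:latest .
    - docker push $CI_REGISTRY_IMAGE/qa-tests:latest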
This solves the issue for each push into the repositories. For easier usage, I could create a QA-Tests.gitlab-ci.yml file which the sub-projects can include with:
include:
  - project: QA/tests
    file: QA-Tests.gitlab-ci.yml
This way you do not need to update the repositories when the CI snippet changes.
Finally, to trigger the execution on each push, you only need to trigger the pipelines of the subprojects from QA/tests, as in the sketch below.
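A sketch of such a trigger job in QA/tests' own pipeline (the project path products/first is taken from the question; adjust it to your namespace):

stages:
  - trigger

trigger-first:
  stage: trigger
  trigger:
    project: products/first
    branch: master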
Disclaimer
As I said, I have only a limited view, since the goal is described but not the motivation. This approach removes some of the direct calls, mainly the ones triggering from sub-projects to QA/tests, and it creates a clear structure, but it might not fit your needs.
I solved it with:
gitlab-ci.yml:
variables:
  TARGET: all

job-generator:
  stage: generate
  tags:
    - kuber
  script:
    - scripts/generate-job.sh > generated-job.yml
  artifacts:
    paths:
      - generated-job.yml

main:
  variables:
    CHILD_TARGET: $TARGET
  trigger:
    include:
      - artifact: generated-job.yml
        job: job-generator
    strategy: depend
and use CHILD_TARGET in my generate-job.sh:
#!/bin/bash
PRODUCTS=("first" "second" "third")

for P in "${PRODUCTS[@]}"; do
cat << EOF
$P:
  stage: test
  tags:
    - kuber
  script:
    - echo -e "Hello from $P"
  rules:
    - if: '\$CHILD_TARGET == "all"'
      when: always
    - if: '\$CHILD_TARGET == "$P"'
      when: always
EOF
done
So I could call it from other projects like this:
stages:
  - test

e2e-tests:
  stage: test
  variables:
    TARGET: first
  trigger:
    project: QA/tests
    branch: master
    strategy: depend
I have two .yml files in my repo: one for build, one for deployment. The main reason I would like to keep the build separate from the deployment is that I also store per-environment variables in my repo, i.e. in variables-dev.yml and variables-prod.yml files, so there is no need to create a new build every time (which would include running tests, the Docker image build, etc.).
The file build.yml:
trigger:
  paths:
    exclude:
      - build.yml
      - deploy.yml

stages:
  - stage: build
    jobs:
      ...
And the deploy.yml, which I want to be triggered only on completion of the build pipeline. That's why I add the exclusion of all paths first, but add a trigger on the pipeline resource.
trigger:
  paths:
    exclude:
      - '*'

resources:
  pipelines:
    - pipeline: build
      source: build
      trigger:
        branches:
          include:
            - '*'

stages:
  - stage: dev
    variables:
      - template: variables-dev.yml
    jobs:
      - deployment: deploy_dev
        environment: 'dev'
        strategy:
          runOnce:
            deploy:
              steps:
                ...
  - stage: prod
    dependsOn: dev
    variables:
      - template: variables-prod.yml
    jobs:
      - deployment: deploy_prod
        environment: 'prod'
        strategy:
          runOnce:
            deploy:
              steps:
                ...
Unfortunately it does not seem to work: the top trigger blocks the lower trigger, and if I remove the top trigger then the deploy pipeline is triggered at the same time as the build one.
You have to start your deploy.yml with trigger: none:
trigger: none

resources:
  pipelines:
    - pipeline: ci-pipeline
      source: my-build-pipeline
      trigger:
        enabled: true
        branches:
          include:
            - master
Set the triggers for the second .yml to none, then add this setting in the "Triggers" section of the UI. It will stage your builds as you describe.