GitLab version: 14.3.3
I cannot get all of the jobs to be created in the pipeline; instead I get an infinite loading spinner on running pipelines.
I'm expecting:
On an MR:
build:merge-request-pipeline-dynamic_script
build:shared-pipeline-dump-variables
test:shared-pipeline-test-job-artifacting-always
test:shared-pipeline-test-job-artifacting-always-multi
On main:
build:main-pipeline-dynamic_script
build:shared-pipeline-dump-variables
test:shared-pipeline-test-job-artifacting-always
test:shared-pipeline-test-job-artifacting-always-multi
# .gitlab-ci.yml
image: ruby:2.3

# Global rules on all pipelines
workflow:
  rules:
    # Mask and prevent builds on noci commit SHA's
    - if: $CI_COMMIT_TITLE =~ /^noci/
      when: never

# Specific pipelines with specific rules
include:
  - local: main-pipeline.yml
    rules:
      # Any commits to `main` i.e. we have merged something in
      - if: $CI_MERGE_REQUEST_TARGET_BRANCH_NAME == $CI_DEFAULT_BRANCH
  - local: merge-request-pipeline.yml
    rules:
      # Merge requests going in will build
      - if: $CI_PIPELINE_SOURCE == "merge_request_event"
  # Catch all pipeline that will just build everything that we want all the time
  # NB: This will obviously only be included if we pass the global workflow check
  - local: shared-pipeline.yml
# main-pipeline.yml
variables:
  BUILD_TYPE: commit to main

main-pipeline-dynamic_script:
  stage: build
  script:
    - echo "We're running directly in main!"
    - echo "USER VARIABLE LIST (SHOULD BE PRESENT) - BUILD_TYPE - $BUILD_TYPE"

# merge-request-pipeline.yml
variables:
  BUILD_TYPE: merge-request

merge-request-pipeline-dynamic_script:
  stage: build
  script:
    - echo "We're running on a merge request!"
    - echo "USER VARIABLE LIST (SHOULD BE PRESENT) - BUILD_TYPE - $BUILD_TYPE"
# shared-pipeline.yml
variables:
  NO_DEFAULT:
    description: This has no default. It should interpolate as blank.
  WITH_DEFAULT:
    description: This has a default. It should interpolate as default.
    value: default

shared-pipeline-dump-variables:
  stage: build
  script:
    # ... removed not important

shared-pipeline-test-job-artifacting-always:
  stage: test
  script:
    - # redundant

shared-pipeline-test-job-artifacting-always-multi:
  stage: test
  script:
    - echo "This job will artifact logs"
    - echo "log1" > log1.log
    - echo "log2" > log2.log
    - # removed redundant
  artifacts:
    paths:
      - log1.log
      - log2.log
      - log3.log
      - log/log4.log
      - log/log5.log
      - log/log6doesnotexist.log
Related
Although this seems like it should be easy, it looks like I am doing something silly somewhere. I am trying to automate my project and have decided to run one of two jobs, one for staging and one for production. I want my deploy-on-staging job to run when my release tag ends with -stage and my deploy-on-prod job to run when my release tag ends with -prod. For this I have used the only keyword (instead of rules, which was becoming a pain for me), but only is not working as I expect.
The issue is that even with the only keyword, both of my jobs still run. Any pointer would be very helpful.
I am pasting the part of my gitlab-ci.yml that should be useful for resolving this issue. Please ping me if you need anything else from the file.
Here is my gitlab-ci section:
variables:
  TagName: ${CI_COMMIT_TAG}

deploy-on-staging:
  stage: deploy
  # rules:
  #   - if: '$TagName == "*-stage"'
  #   - if: '$TagName == "*-prod"'
  #     when: manual
  image: ubuntu:20.04
  # tags:
  #   - docker-executor
  before_script:
    - apt-get update
  script:
    - echo "It's here in stage"
    - echo "$TagName =~ '/^*-stage/'"
  only:
    - tags
    - "$TagName =~ '/^*-stage/'"

deploy-on-prod:
  stage: deploy
  image: ubuntu:20.04
  # tags:
  #   - docker-executor
  # rules:
  #   - if: '$TagName == "*-prod"'
  # before_script:
  #   - apt-get update
  script:
    - echo "It's here in prod"
  only:
    - tags
    - $TagName =~ '/^*-prod/'
only/except are deprecated: https://docs.gitlab.com/ee/ci/yaml/#only--except
To achieve what you want with rules, use an example like this:
rules:
  - if: $CI_COMMIT_TAG =~ /-stage/i
    when: always
  - if: $CI_COMMIT_TAG =~ /-prod/i
    when: never
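Applied to the two jobs from the question, a minimal sketch could look like the following. This is my illustration rather than part of the original answer; it assumes the -stage/-prod tag suffixes from the question and uses anchored regexes so only the suffix matches:

deploy-on-staging:
  stage: deploy
  image: ubuntu:20.04
  script:
    - echo "Deploying $CI_COMMIT_TAG to staging"
  rules:
    # runs only for tag pipelines where the tag ends in -stage
    - if: $CI_COMMIT_TAG =~ /-stage$/

deploy-on-prod:
  stage: deploy
  image: ubuntu:20.04
  script:
    - echo "Deploying $CI_COMMIT_TAG to production"
  rules:
    # runs only for tag pipelines where the tag ends in -prod
    - if: $CI_COMMIT_TAG =~ /-prod$/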
I am trying to set up CI in GitLab so that:
the second job (pushdev) will be available to run manually only after devjob has run successfully.
the third job pushtostage will only run if the file has changed.
The way the jobs are set up, the second and third jobs always run. What is missing in the pipeline spec?
devjob:
  image: node:16
  stage: publishdev
  script:
    - echo "running validation checks"
    - npm run validate
  rules:
    - changes:
        - ./src/myfile.txt
    - when: manual

# - this job needs to run after "devjob" has run successfully
#   and myfile.txt has changed
# - "needs" artifacts from the "lint" job
pushdev:
  image: node:16
  stage: publishdev
  needs: [ "devjob", "lint" ]
  script:
    - echo "Pushing changes after validation to dev"
    - npm run pushdev
  rules:
    - changes:
        - ./src/myfile.txt
      when: on_success
    - when: manual

pushtostage:
  image: node:16
  stage: pushstage
  script:
    - echo "Pushing changes to stage"
  rules:
    - changes:
        - ./src/myfile.txt
    - when: manual
I changed your sample to look like this:
stages:
  - publishdev
  - pushstage

default:
  image: ubuntu:20.04

lint:
  stage: publishdev
  script:
    - echo "lint job"

devjob:
  stage: publishdev
  script:
    - echo "running validation checks"
  rules:
    - changes:
        - README.md
      when: manual
      allow_failure: false

pushdev:
  stage: publishdev
  needs: [ "devjob", "lint" ]
  script:
    - echo "Pushing changes after validation to dev"
  rules:
    - changes:
        - README.md
      when: manual
      allow_failure: false

pushtostage:
  stage: pushstage
  script:
    - echo "Pushing changes to stage"
  rules:
    - changes:
        - README.md
      when: manual
      allow_failure: false
I added allow_failure: false because allow_failure defaults to true for manual jobs.
I also merged your rules, because in GitLab each - entry under rules is one rule:
Rules are evaluated when the pipeline is created, and evaluated in order until the first match.
In your .gitlab-ci.yml the first job devjob is manual, so it always counts as a success, and in your second job pushdev the first rule (changes together with when: on_success) always matches, so it always runs.
In my version, the first job devjob merges your rules: it is only added when the file changes, it is manual, and it is not allowed to fail, and so on for the other jobs.
The sample code is in Files · try-rules-stackoverflow-72594854-manual · GitLab PlayGround / Workshop / Tryci · GitLab.
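To make the "each - under rules is one rule" point concrete, here is a small sketch of my own (not part of the linked sample) contrasting the two forms:

# Two separate rules: when README.md has not changed, the second rule
# still matches, so the job is added to every pipeline as a manual job.
job-two-rules:
  script: [ "echo two rules" ]
  rules:
    - changes:
        - README.md
    - when: manual

# One merged rule: the job is only added to the pipeline when README.md
# changed, and it then waits for a manual start and blocks the pipeline.
job-one-rule:
  script: [ "echo one rule" ]
  rules:
    - changes:
        - README.md
      when: manual
      allow_failure: false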
I have a problem with GitLab CI child pipelines.
I need to trigger a CI pipeline automatically after each commit in a repo that contains more than one app, and it needs to detect which folders/files were modified in order to know which app's pipeline to trigger.
Example of structure
Main/
---- applicationsA/
-------- appA1/
-------- appA2/
-------- appA3/
---- applicationsB/
-------- appB1/
-------- appB2/
-------- appB3/
Main ".gitlab-ci.yml" is:
workflow:
  rules:
    - if: '$CI_PIPELINE_SOURCE == "web"'

variables:
  APPNAME: $APPNAME

stages:
  - child-pipelines

appA1:
  stage: child-pipelines
  trigger:
    include:
      - local: applicationA/appA1/gitlab-ci.yml
    strategy: depend
  rules:
    - if: $APPNAME == "appA1" && $CI_PIPELINE_SOURCE == "web"

appA2:
  stage: child-pipelines
  trigger:
    include:
      - local: applicationA/appA2/gitlab-ci.yml
    strategy: depend
  rules:
    - if: $APPNAME == "appA2" && $CI_PIPELINE_SOURCE == "web"
...
appA1 ".gitlab-ci.yml" is:
stages:
  - build
  - test

build-appA1:
  stage: build
  script:
    - echo "Execute appA1 build!"

publish-appA1:
  stage: build
  script:
    - echo "Execute appA1 publish!"
appA2 ".gitlab-ci.yml" is:
stages:
  - build
  - test

build-appA2:
  stage: build
  script:
    - echo "Execute appA2 build!"

publish-appA2:
  stage: build
  script:
    - echo "Execute appA2 publish!"
The purpose of this configuration is that, for example, when I change a file inside app**, the pipeline detects the change and builds that app.
You can use rules:changes with a glob pattern and only run a certain job if anything changes in the specific app folder:
appA1:
  stage: child-pipelines
  trigger:
    include:
      - local: applicationA/appA1/gitlab-ci.yml
    strategy: depend
  rules:
    - if: '$APPNAME == "appA1" && $CI_PIPELINE_SOURCE == "web"'
      changes:
        - Main/applicationsA/appA1/**/*
I have created the following test pipeline in gitlab:
job1:
  stage: stage1
  rules:
    - if: $RUN != "run2"
      when: always
    - when: never
  script:
    - echo "Job stage1 updating file dates.txt"
    - date >> data/dates.txt
  cache:
    key: statusfiles
    paths:
      - data/dates.txt

job2:
  stage: stage2
  rules:
    - if: $RUN == "run2"
      when: always
  script:
    - echo "Running stage 2"
    - cat data/dates.txt
  cache:
    key: statusfiles
    paths:
      - data/dates.txt
where I want to use the "cache" feature of gitlab.
Here I first run job1, which updates the file dates.txt and adds an entry to this example file, so it contains two lines.
However, when I run a new pipeline with job2 alone, the file contains only ONE line. It seems to be the original, unmodified file.
How can I "upload" or "save" the file into the cache in job1, so that I can use the updated file in a later run of job2?
First test whether the section "Share caches between jobs in the same branch" is relevant:
To have jobs in each branch use the same cache, define a cache with the key $CI_COMMIT_REF_SLUG:

cache:
  key: $CI_COMMIT_REF_SLUG

This configuration prevents you from accidentally overwriting the cache.
In your case, from the discussion, and from "How archiving and extracting works":
stages:
  - build
  - test

before_script:
  - echo "Hello"

job A:
  stage: build
  rules:
    - if: $RUN != "run3"
      when: always
    - when: never
  script:
    - mkdir -p data/
    - date > data/hello.txt
  cache:
    key: build-cache
    paths:
      - data/
  after_script:
    - echo "World"

job B:
  stage: test
  rules:
    - if: $RUN == "run3"
  script:
    - cat data/hello.txt
  cache:
    key: build-cache
    paths:
      - data/
By using a single runner on a single machine, you don’t have the issue where job B might execute on a runner different from job A.
This setup guarantees the cache can be reused between stages.
It only works if execution goes from the build stage to the test stage on the same runner/machine; otherwise, the cache might not be available.
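If you want to make the direction of the cache flow explicit, a minimal sketch (my addition, not part of the original answer) is to push the cache from the producing job and only pull it in the consuming job via cache:policy:

job A:
  stage: build
  script:
    - mkdir -p data/
    - date > data/hello.txt
  cache:
    key: build-cache
    paths:
      - data/
    policy: push   # only upload the cache at the end of the job

job B:
  stage: test
  script:
    - cat data/hello.txt
  cache:
    key: build-cache
    paths:
      - data/
    policy: pull   # only download the cache at the start of the job

This avoids job B needlessly re-uploading an unchanged cache, but it does not change the fact that the cache still has to be reachable from the runner that executes job B.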
I have a pipeline with 3 stages: build, deploy-test and deploy-prod. I want the stages to have the following behavior:
always run build
run deploy-test automatically when on master or manually when on other branches
run deploy-prod manually, only available on master branch
My pipeline configuration seems to achieve that, but I have a problem when trying to merge branches into master. I don't want to execute the deploy-test stage on every branch before merging, but right now I am required to, because the merge button is disabled with the message "Pipeline blocked. The pipeline for this merge request requires a manual action to proceed." The setting Pipelines must succeed is disabled in the project.
I tried adding an additional rule to prevent the deploy-test stage from running in merge requests, but it didn't change anything:
rules:
  - if: '$CI_MERGE_REQUEST_ID'
    when: never
  - if: '$CI_COMMIT_BRANCH == "master"'
    when: on_success
  - when: manual
Full pipeline configuration:
stages:
  - build
  - deploy-test
  - deploy-prod

build:
  stage: build
  script:
    - echo "build"

deploy-test:
  stage: deploy-test
  script:
    - echo "deploy-test"
  rules:
    - if: '$CI_COMMIT_BRANCH == "master"'
      when: on_success
    - when: manual

deploy-prod:
  stage: deploy-prod
  script:
    - echo "deploy-prod"
  only:
    - master
The only way I got it to work was to enable ☑️ Skipped pipelines are considered successful in Settings > General > Merge requests > Merge checks
and to mark the manual step as allow_failure:
upload:
  stage: 'upload'
  rules:
    # Only allow uploads for a pipeline source whitelisted here.
    # See: https://docs.gitlab.com/ee/ci/jobs/job_control.html#common-if-clauses-for-rules
    - if: $CI_COMMIT_BRANCH
      when: 'manual'
      allow_failure: true
After this, clicking the Merge when pipeline succeeds button will merge the MR without any manual interaction.
I've opened a merge request from branch "mybranch" into "master" with the following .gitlab-ci.yml:
image: alpine

stages:
  - build
  - deploy-test
  - deploy-prod

build:
  stage: build
  script:
    - echo "build"

# run deploy-test automatically when on master or manually when on other branches
# Don't run on merge requests
deploy-test:
  stage: deploy-test
  script:
    - echo "deploy-test"
  rules:
    - if: $CI_MERGE_REQUEST_ID
      when: never
    - if: '$CI_COMMIT_BRANCH == "master"'
      when: on_success
    - when: manual

# run deploy-prod manually, only available on master branch
deploy-prod:
  stage: deploy-prod
  script:
    - echo "deploy-prod"
  rules:
    - if: '$CI_COMMIT_BRANCH == "master"'
      when: manual
Notes:
only is deprecated, so I replaced it with rules:if
I added the Alpine image to make the jobs run faster (slimmer container); it doesn't affect the logic
When I pushed changes to branch "mybranch", GitLab did the following:
showed a blue "Merge when pipeline succeeds" button on my MR
ran "build" stage
skipped "deploy-prod" stage (only available on "master" branch)
gave me a manual "play" button to run the job on "mybranch"
at this point, the pipeline status is "blocked" and the MR is showing "Pipeline blocked. The pipeline for this merge request requires a manual action to proceed"
now I manually start the "deploy-test" stage by selecting the Play icon in the Pipelines screen
pipeline status indicator changes to "running" and then to "passed"
my merge request shows the pipeline passed and gives me the green "Merge" button
There are a number of variables that are available to the pipeline at runtime - Predefined variables reference
Some are available specifically for pipelines associated with merge requests - Predefined variables for merge request pipelines
You can use one or more of these variables to determine whether you want to run the deploy-test job for that merge request.
For example, you could mention the phrase "skip_cicd" in your merge request title, access it with the CI_MERGE_REQUEST_TITLE variable, and create a rule. Your pipeline would look somewhat like this (please do test the rule; I have edited the pipeline off the top of my head and could be wrong):
stages:
  - build
  - deploy-test
  - deploy-prod

build:
  stage: build
  script:
    - echo "build"

deploy-test:
  stage: deploy-test
  script:
    - echo "deploy-test"
  rules:
    - if: '$CI_MERGE_REQUEST_TITLE =~ /skip_cicd/'
      when: never
    - if: '$CI_COMMIT_BRANCH == "master"'
      when: on_success
    - when: manual

deploy-prod:
  stage: deploy-prod
  script:
    - echo "deploy-prod"
  only:
    - master