Struggling to Create GitLab-CI Pipeline that includes 'schedules' and 'changes' declarations

I have a GitLab-CI pipeline in place for my Katalon Studio automation tests, and I would like it to have the following functionality:
Various nightly schedules that run based on schedule variables being present.
A changes declaration so pushes to the repo only trigger a pipeline run if certain files have been touched.
I have the scheduled portion running as expected, but I am struggling to pair it with the 'changes' declaration so that a push only runs the pipeline IF certain files have been changed. Can someone help? I am guessing this is an issue with my YAML formatting.
Here is an example snippet from my current .gitlab-ci.yml:
Example Tests:
  stage: Example
  tags:
    - aws-medium-runner
  script:
    - MY_SCRIPT
  rules:
    - if: $SCHEDULE_A == "true" # tied to schedule A in scheduler tool
      when: always
    - if: '$CI_PIPELINE_SOURCE == "push"'
    changes: # Only run on pushes if changes have been made to certain directories
    - Test\ Cases/Example/*
    - Object\ Repository/Example/*
    - Test\ Suites/Example/*
    - Scripts/Example/*
    when: always
  dependencies:
    - Set Release Version

You are missing 2 spaces of indentation on the changes: part. It should work once you add those (see the example below).
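A minimal sketch of the two-rule version with only the nesting fixed (changes: and when: indented under the second - if: item; the backslash escapes in the paths are dropped here, matching the combined example further below):

rules:
  - if: $SCHEDULE_A == "true" # tied to schedule A in scheduler tool
    when: always
  - if: '$CI_PIPELINE_SOURCE == "push"'
    changes: # Only run on pushes if changes have been made to certain directories
      - Test Cases/Example/*
      - Object Repository/Example/*
      - Test Suites/Example/*
      - Scripts/Example/*
    when: always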
And in case you want to combine the 2 rules, you can merge them into a single rule:
Example Tests:
  stage: Example
  tags:
    - aws-medium-runner
  script:
    - MY_SCRIPT
  rules:
    - if: '$CI_PIPELINE_SOURCE == "push" || $SCHEDULE_A == "true"'
      changes: # Only run on pushes if changes have been made to certain directories
        - Test Cases/Example/*
        - Object Repository/Example/*
        - Test Suites/Example/*
        - Scripts/Example/*
      when: always
  dependencies:
    - Set Release Version

Related

How to run Job when Pipeline was triggered manually

I set up jobs to run only when pushing/merging to the branch "dev", but I also want to be able to run them when I trigger that pipeline manually. Something like this:
test:
  stage: test
  <this step should be run always>

build:
  stage: build
  rules:
    - if: $CI_COMMIT_REF_NAME == "dev"
    - if: <also run if the pipeline was run manually, but skip if it was triggered by something else>
This job is defined in a child "trigger" pipeline. This is what the parent looks like:
include:
  - template: 'Workflows/MergeRequest-Pipelines.gitlab-ci.yml'

stages:
  - triggers

microservice_a:
  stage: triggers
  trigger:
    include: microservice_a/.gitlab-ci.microservice_a.yml
    strategy: depend
  rules:
    - changes:
        - microservice_a/*
The effect I want to achieve is:
Run test in all cases
Run build in the child pipeline only when pushing/merging to "dev"
Also run the build job when the pipeline is run manually
Do not run the build job in any other case (like an MR)
The rules examples in the GitLab documentation showcase:
job:
  script: echo "Hello, Rules!"
  rules:
    - if: '$CI_PIPELINE_SOURCE == "merge_request_event"'
      when: manual
      allow_failure: true
    - if: '$CI_PIPELINE_SOURCE == "schedule"'
The when: manual should be enough in your case: it ensures the job does not run unless a user starts it.
Bonus question: This job is defined in a child "trigger" pipeline
Then it is related to gitlab-org/gitlab issue 201938, which was supposed to be fixed with GitLab 13.5 (Oct. 2020), but that only allows manual actions for parent-child pipelines (illustrated by this thread).
Double-check the environment variables as set in your child job:
echo $CI_JOB_MANUAL
If it is true, that indicates the job is part of a manually triggered pipeline.
While issue 22448 ("$CI_JOB_MANUAL should be set in all dependent jobs") points to this option not working, it includes a workaround.
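Putting the pieces together, a rough sketch of how the build job's rules could look under that approach (the dev condition comes from the question; the trailing when: manual rule with allow_failure is just one way to express "only run it if a user starts it", not necessarily the author's exact setup):

build:
  stage: build
  rules:
    - if: '$CI_COMMIT_REF_NAME == "dev"'  # run automatically when pushing/merging to dev
    - when: manual                        # in every other case, wait for a user to start the job
      allow_failure: true                 # so the pipeline is not blocked if nobody does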

Gitlab CI: Using wildcard with include keyword

I am using the configuration below to trigger a pipeline if any changes are made in the folders under the root directory.
trigger_serviceA:
  stage: triggers
  rules:
    - if: '$CI_COMMIT_BRANCH == "dev"'
      changes:
        - serviceA/*
      when: always
  trigger:
    include: serviceA/.gitlab-ci.yml
    strategy: depend
However, the pipeline is not triggered if there are changes in the subfolders under serviceA.
When using only:changes, a single wildcard does not cover subdirectories; you have to use /**/* instead.
Although the documentation does not say so, I could imagine it is the same for rules:changes, so you may want to try that out.
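A minimal sketch of the trigger job from the question with the recursive glob swapped in (everything else unchanged):

trigger_serviceA:
  stage: triggers
  rules:
    - if: '$CI_COMMIT_BRANCH == "dev"'
      changes:
        - serviceA/**/*   # ** also matches files in subfolders of serviceA
      when: always
  trigger:
    include: serviceA/.gitlab-ci.yml
    strategy: depend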

gitlab: `changes` to files does not re-run if a previous stage failed

Imagine a simple .gitlab-ci.yml file with the following:
stages:
  - test
  - build

test_job:
  stage: test
  rules:
    - if: '$CI_COMMIT_BRANCH && $CI_COMMIT_BRANCH != $CI_DEFAULT_BRANCH'
  script:
    - exit 1

build_job:
  stage: build
  rules:
    - if: '$CI_COMMIT_BRANCH && $CI_COMMIT_BRANCH != $CI_DEFAULT_BRANCH'
      changes:
        - web-app/**/*
  script:
    - echo "Building..."
The first time this pipeline is run, you will see both jobs, but the build_job will not run because the test_job failed (exit 1).
Correcting the exit 1 (to exit 0, for example) and re-running, you will now only see the test_job, because as far as GitLab is concerned the web-app files haven't changed, yet the build has never run successfully.
So how do you ensure the build_job is run to success?
In the scenario you described, your second commit contains only one change, to your .gitlab-ci.yml file, so your changes: rule correctly causes the build_job to be excluded from the pipeline, exactly as you have configured it.
For branch pipelines, GitLab will not consider previous commits when evaluating the changes: rules.
However, with pipelines for merge requests, all changes in the merge request are tested. Which sounds like the behavior you are expecting.
If your goal is efficient pipelines for tracking your feature branches, your best course(s) of action might be something like this:
Always run the build job on branch pipelines
Only apply the changes: rule for pipelines for merge requests
Optionally, exclude branch pipelines when an MR is open
Optionally, adopt a workflow whereby you open (draft) merge requests when creating your feature branches
An optimized CI configuration that does not skip the build job on branches may look like this:
build_job:
  rules:
    - if: '$CI_PIPELINE_SOURCE == "merge_request_event"'
      changes: # in the case of merge requests, we can be more efficient
        - web-app/**/*
    - if: '$CI_COMMIT_BRANCH && $CI_OPEN_MERGE_REQUESTS'
      when: never # skip branch pipelines when MR pipelines exist
    - if: '$CI_COMMIT_BRANCH' # but in branch pipelines, build every time
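If you prefer to make the "skip branch pipelines when an MR is open" decision once for the whole pipeline rather than per job, a sketch of the same idea at the workflow level (based on the common pattern for avoiding duplicate pipelines) could look like this:

workflow:
  rules:
    - if: '$CI_PIPELINE_SOURCE == "merge_request_event"'  # run merge request pipelines
    - if: '$CI_COMMIT_BRANCH && $CI_OPEN_MERGE_REQUESTS'
      when: never                                         # drop branch pipelines that would duplicate an MR pipeline
    - if: '$CI_COMMIT_BRANCH'                             # keep plain branch pipelines otherwise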

CI pipeline doesn't run automatically when included .yml contains workflow rules

I am trying to include an external GitLab CI YAML file in my project's local .gitlab-ci.yml file. The external YAML, which is in my other GitLab project ci-cd > templates, contains some workflow rules:
# ci-cd.yml
workflow:
  rules:
    - if: '$TRACK != null' # TRACK is the environment type (staging/production)
      when: always
    - if: '$CI_PIPELINE_SOURCE =~ /^trigger|pipeline|web|api$/'
      when: always
Below is my project local .gitlab-ci.yml:
include:
  - '/.gitlab-ci-test.yml'
  - project: 'ci-cd/templates'
    file: 'ci-cd.yml'
...
The problem is that none of the jobs I have defined inside the locally included .gitlab-ci-test.yml get triggered when I push changes to GitLab, even when a job has when: always set. It seems like the workflow rules in the external ci-cd.yml are not letting the jobs run.
I've also tried adding a local workflow rule to .gitlab-ci.yml that evaluates to true, because the GitLab workflow keyword docs say
When no rules evaluate to true, the pipeline does not run.
That means that if any one of the rules evaluates to true, the pipeline should run, which did not happen when I added a local workflow rule.
EDIT - the external file that has the workflow rules is used by many projects, so it can't be modified to add "push" to the $CI_PIPELINE_SOURCE rule. My intention is to leave it as a global rule and 'override' it locally in my project.
I hope I was clear in the issue description. I would appreciate any help!
You are missing the push event for $CI_PIPELINE_SOURCE in your workflow rule.
workflow:
  rules:
    - if: '$TRACK != null' # TRACK is the environment type (staging/production)
      when: always
    - if: '$CI_PIPELINE_SOURCE =~ /^trigger|pipeline|push|web|api$/'
      when: always
EDIT: if you are not able to change the workflow rule in the included file, the only option I see is to duplicate the workflow in your .gitlab-ci.yml and add the missing push there.
workflow rules take precedence over all other rules, and it is not possible to merge two workflow blocks. If a workflow block is used in both an included .yml file and the .gitlab-ci.yml itself, the workflow from the .gitlab-ci.yml is used. You can check this under CI/CD -> Editor -> View merged YAML in GitLab.
include:
  - '/.gitlab-ci-test.yml'
  - project: 'ci-cd/templates'
    file: 'ci-cd.yml'

workflow:
  rules:
    - if: '$TRACK != null' # TRACK is the environment type (staging/production)
      when: always
    - if: '$CI_PIPELINE_SOURCE =~ /^trigger|pipeline|push|web|api$/'
      when: always
...

Gitlab-CI run Stage conditionally

As you can see below, I have two stages:
1- Build
2- Deploy
And I have 3 kinds of branches:
1- master
2- test
3- Other branches created by developers (development)
I want to run build for all branches, but I have conditions for deploy:
1- If the branch is test, run deploy.
2- If the branch is any other branch, don't run deploy.
3- If the branch is master, run deploy only manually.
But these three conditions don't work properly with my gitlab-ci.yml:
stages:
  - build
  - deploy

default:
  image: node:latest
  before_script:
    - |
      if [ "$CI_BUILD_REF_NAME" == "master" ]; then
        ENV="prod"
      elif [ "$CI_BUILD_REF_NAME" == "test" ]; then
        ENV="test"
      else
        ENV="dev"
      fi
    - npm install

build:
  stage: build
  script:
    - cp ./src/env-${ENV}.js ./src/env.js
    - npm run build-compressed

deploy:
  stage: deploy
  rules: # this section doesn't work properly
    - if: '$ENV == "test"'
      when: always
    - if: '$ENV == "prod"'
      when: manual
    - when: never
  script:
    - cp ./src/env-${ENV}.js ./src/env.js
    - npm run build-compressed
    - npm run publish:${ENV}
I recommend doing such a differentiation via rules, as you can easily override variables with rules; see the GitLab CI docs.
Below you can find a short example:
rules:
  - if: $CI_COMMIT_BRANCH =~ /test/
    variables: # Override ENV defined
      ENV: "test" # at the job level.
  - if: $CI_COMMIT_BRANCH =~ /master/
    variables: # Override ENV defined
      ENV: "prod" # at the job level.
  - variables: # not 100% sure about this one; as when can stand without an if clause, I assume variables can too :D
      ENV: "dev"
Furthermore, I would also do the rule check based on the branch within the deploy stage, as it is easier to verify the branch than to ensure at a later stage that the ENV variable is still set properly.
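A rough sketch of what that could look like for the deploy job from the question (branch names are taken from the question; the rules:variables overrides follow the example above and are illustrative only):

deploy:
  stage: deploy
  rules:
    - if: '$CI_COMMIT_BRANCH == "test"'   # deploy automatically on test
      variables:
        ENV: "test"
    - if: '$CI_COMMIT_BRANCH == "master"' # deploy only manually on master
      when: manual
      variables:
        ENV: "prod"
    - when: never                         # never deploy from other branches
  script:
    - cp ./src/env-${ENV}.js ./src/env.js
    - npm run build-compressed
    - npm run publish:${ENV}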
Also important to notice is the order of evaluation. Just because there is a before_script in the default section does not mean it is evaluated just once for the whole pipeline; the before_script is evaluated once per job.
Additionally, the rules are evaluated before the job runs, as GitLab needs to determine whether the job should be executed at all. This means you are not able to reference a variable set in the before_script from within rules.
If you want to pass a variable from the first job on to the second job, you need to persist it in a dotenv artifact; this way, the variables are available before the second stage starts. (I am still not sure whether you can use them for evaluation in rules; that is something you need to check.)
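As a rough sketch of the dotenv approach, under the assumption that the value is written to a file named build.env (the file name is illustrative, not taken from the question):

build:
  stage: build
  script:
    - cp ./src/env-${ENV}.js ./src/env.js
    - npm run build-compressed
    - echo "ENV=${ENV}" >> build.env   # persist the value computed in before_script
  artifacts:
    reports:
      dotenv: build.env                # ENV becomes available as a variable in later jobs

deploy:
  stage: deploy
  script:
    - npm run publish:${ENV}           # ENV here comes from the dotenv report of the build job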
I hope my additional clarifications, based on your comments, help you to find the best way.
