Run a pre job before other jobs in a GitLab pipeline - gitlab

I want to run a job each time a new pipeline gets triggered. It's a kind of preparation job that should always be executed before every other job defined inside the .gitlab-ci.yml.
For example:
stages:
  - build
  - test

my-prep-job:
  stage: .pre
  script:
    # this is the job I want to run every time a pipeline gets triggered, before the other jobs
    # it also produces an artifact that I want to use in the rest of the jobs
    - ...
  artifacts:
    ...

Build:
  stage: build
  ...

Test:
  stage: test
  ...
Please let me know if this is possible or if there is another way to do it.
Thanks in advance.
Edit
I did try adding .pre under stages.
The thing is, I had to rewrite the rules and set the stage of my-prep-job to .pre as well.
stages:
  - .pre   # I did add it over here
  - build
  - test
Also, I had to add the rules to this job as well so that it would not run on its own on just a normal commit/push.
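Roughly, a minimal sketch of what I ended up with (the rule condition here is just an illustrative assumption; my real rules differ):
stages:
  - .pre
  - build
  - test

my-prep-job:
  stage: .pre
  rules:
    - if: $CI_PIPELINE_SOURCE == "schedule"   # assumed example; my real rules differ
  script:
    - ...
  artifacts:
    ...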
Is there any way to extend the ".pre" stage of a GitLab pipeline?

You could use !reference tags to include certain keyword sections.
For example:
.pre:
  script:
    - echo from pre

example:
  stage: test
  script:
    - !reference [.pre, script]
    - ...
This will include the script section of .pre in the example job.
You can use !reference for most job keywords, like artifacts or rules.
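The same pattern works for rules; for example, a minimal sketch (the condition is just an assumed placeholder):
.shared-rules:
  rules:
    - if: $CI_PIPELINE_SOURCE == "web"   # assumed placeholder condition

example:
  stage: test
  rules:
    - !reference [.shared-rules, rules]
  script:
    - echo from example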

Related

How to run a particular stage in GitLab after the execution of child pipelines?

I'm using GitLab, and the CI config has the following stages:
stages:
  - test
  - child_pipeline1
  - child_pipeline2
  - promote-test-reports
At any time, only one of the child pipelines will run after the test stage, i.e. either child_pipeline1 or child_pipeline2, not both.
Now, I have added another stage called promote-test-reports which I would like to run at the end, i.e. after the successful execution of either child pipeline.
But I'm completely blocked here. This promote-test-reports stage comes from a template which I have included in this main CI config file like this:
# Include the template file
include:
  - project: arcesium/internal/vteams/commons/commons-helper
    ref: promote-child-artifacts
    file: 'templates/promote-child-artifacts.yml'
I'm overriding the GitLab project token in this same main file like below:
test:
  stage: promote-test-reports
  trigger:
    include: adapter/child_pipelin1.yml
    strategy: depend
  variables:
    GITLAB_PRIVATE_TOKEN: $GITLAB_TOKEN
As you can see in the above job definition in the main CI config file, I'm trying to use strategy: depend to wait for the successful execution of child_pipeline1 and then run this stage, but it throws an error (jobs:test config contains unknown keys: trigger). This approach is not working because I'm using script in the main definition of this job (promote-test-reports) in the template, and as per the documentation, script and strategy cannot go together.
The following is the definition of this job in the template:
test:
  stage: promote-test-reports
  image: "495283289134.dkr.ecr.us-east-1.amazonaws.com/core/my-linux:centos7"
  before_script:
    - "yum install unzip -y"
  variables:
    GITLAB_PRIVATE_TOKEN: $GITLAB_TOKEN
  allow_failure: true
  script:
    - 'cd $CI_PROJECT_DIR'
    - 'echo running'
  when: always
  rules:
    - if: $CI_PIPELINE_SOURCE == 'web'
  artifacts:
    reports:
      junit: "**/testresult.xml"
      coverage_report:
        coverage_format: cobertura
        path: "**/**/coverage.xml"
  coverage: '/TOTAL\s+\d+\s+\d+\s+(\d+%)/'
The idea of using the strategy attribute failed, and I cannot remove the script logic from the template either. What is an alternate way of running my job (promote-test-reports) at the end? Remember, it is an OR condition: it runs after either child_pipeline1 or child_pipeline2.
I would really appreciate your help.
Finally, I was able to do it by putting strategy: depend on the child pipeline trigger jobs. I was doing it incorrectly earlier by putting it on the promote-test-reports stage.
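A rough sketch of that layout, for reference (the trigger job name and the rules for choosing which child runs are assumptions/omitted):
stages:
  - test
  - child_pipeline1
  - child_pipeline2
  - promote-test-reports

trigger-child1:                       # hypothetical bridge job name
  stage: child_pipeline1
  trigger:
    include: adapter/child_pipelin1.yml
    strategy: depend                  # later stages wait until the child pipeline finishes

# the template's test job keeps its script and simply runs in the last stage
test:
  stage: promote-test-reports
  script:
    - 'echo running'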

How can I skip a job based on the changes made in .gitlab-ci.yml?

I have two jobs in GitLab CI/CD: build-job (which takes ~10 minutes) and deploy-job (which takes ~2 minutes). In a pipeline, build-job succeeded while deploy-job failed because of a typo in the script. How can I fix this typo and run only deploy-job, instead of rerunning build-job?
My current .gitlab-ci.yml is as follows:
stages:
  - build
  - deploy

build-job:
  image: ...
  stage: build
  only:
    refs:
      - main
    changes:
      - .gitlab-ci.yml
      - pubspec.yaml
      - test/**/*
      - lib/**/*
  script:
    - ...
  artifacts:
    paths:
      - ...
    expire_in: 1 hour

deploy-job:
  image: ...
  stage: deploy
  dependencies:
    - build-job
  only:
    refs:
      - main
    changes:
      - .gitlab-ci.yml
      - pubspec.yaml
      - test/**/*
      - lib/**/*
  script:
    - ...
I imagine something like:
1. the triggerer (me) fixes the typo in deploy-job's script and pushes the changes
2. the pipeline is triggered
3. the runner looks at the files described by build-job/only/changes and detects a single change in the .gitlab-ci.yml file
4. the runner looks at the .gitlab-ci.yml file and detects there IS a change, but since this change does not belong to the build-job section AND the job previously succeeded, this job is skipped
5. the runner looks at the files described by deploy-job/only/changes and detects a single change in the .gitlab-ci.yml file
6. the runner looks at the .gitlab-ci.yml file and detects there is a change, and since this change belongs to the deploy-job section, this job is executed
This way, only deploy-job is executed. Can this be done, either using rules or only?
There is a straightforward way; you just have to have access to the GitLab CI interface.
1. Push and let the pipeline start normally.
2. Cancel the first job.
3. Run the desired one (deploy).
Another way is to create a new job, intended to be run only when needed, in which you set up a deploy without dependencies. Just copy the current deploy code without the dependencies section. Also add manual mode with rules, as in this example:
job:
  script: echo "Hello, Rules!"
  rules:
    - if: $CI_PIPELINE_SOURCE == "merge_request_event"
      when: manual
      allow_failure: true
As in the docs.
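Applied to this case, a hedged sketch (the job name is hypothetical and the rule condition is an assumed stand-in for the existing only: conditions):
deploy-manual-job:                    # hypothetical job name
  image: ...
  stage: deploy
  dependencies: []                    # no dependency on build-job artifacts
  rules:
    - if: $CI_COMMIT_BRANCH == "main"   # assumed stand-in for only:refs: main
      when: manual
      allow_failure: true
  script:
    - ...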

GitLab CI: how to make sure a job executes only if the previous job did?

I have 2 stages with multiple jobs, and the jobs in the first stage have some rules that tell them whether they need to run or not. What I am trying to do is tell some of the jobs in the second stage to execute only if the relevant job in the first stage ran.
I don't want to reuse the same rules I used for the first-stage jobs, to prevent conflicts.
Is there a way to do that?
stages:
  - build
  - deploy

Build0:
  stage: build
  extends:
    - .Build0Rules
    - .Build0Make

Build1:
  stage: build
  extends:
    - .Build1Rules
    - .Build1Make

Deploy0:
  stage: deploy
  dependencies:
    - Build0
  script:
    - bash gitlab-ci/deploy0.sh

Deploy1:
  stage: deploy
  dependencies:
    - Build1
  script:
    - bash gitlab-ci/deploy1.sh
Thank you in advance :)
No, you cannot specify that a job should be added to the pipeline only if another job was added to the pipeline. Each job can specify whether it is added to the pipeline using only/except conditions or rules, but these are not able to reference other jobs.
It is possible to generate a pipeline YAML file and then trigger it, but I think this would not be ideal because of the amount of work involved.
stages:
  - Build
  - Deploy

build:
  stage: Build
  script:
    - do something...
  artifacts:
    paths:
      - deploy-pipeline-gitlab-ci.yml

deploy:
  stage: Deploy
  trigger:
    include:
      - artifact: deploy-pipeline-gitlab-ci.yml
        job: build
    strategy: depend
I would recommend using similar only/except conditions or rules on each job to build the pipeline that you want.
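For example, a minimal sketch of keeping a build job and its deploy job in sync by sharing one hidden rules block (the hidden job name and the condition are assumptions):
.build0-rules:                          # hypothetical shared rules block
  rules:
    - if: $CI_COMMIT_BRANCH == "main"   # assumed example condition

Build0:
  stage: build
  extends:
    - .build0-rules
    - .Build0Make

Deploy0:
  stage: deploy
  extends:
    - .build0-rules
  dependencies:
    - Build0
  script:
    - bash gitlab-ci/deploy0.sh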
Yes, you can. You should check the keyword needs, which allows you to do what you want: execute a job based on the execution of other jobs, ignoring stage order.
The documentation: https://docs.gitlab.com/ee/ci/yaml/#needs
Here is also an example of how to build a DAG (directed acyclic graph) using needs: https://about.gitlab.com/blog/2020/12/10/basics-of-gitlab-ci-updated/#directed-acyclic-graphs-get-faster-and-more-flexible-pipelines
In your case:
Deploy0:
  stage: deploy
  needs: ["Build0"]
  script:
    - bash gitlab-ci/deploy0.sh

Deploy1:
  stage: deploy
  needs: ["Build1"]
  script:
    - bash gitlab-ci/deploy1.sh
Note you can also specify multiple jobs in the needs keyword:
needs: ["build0", "test0", "test1"]

Is there any way to dynamically edit a variable in one job and then pass it to a trigger/bridge job in Gitlab CI?

I need to pass a file path to a trigger job where the file path is found within a specified json file in a separate job. Something along the lines of this...
stages:
  - run_downstream_pipeline

variables:
  FILE_NAME: default_file.json

.get_path:
  stage: run_downstream_pipeline
  needs: []
  only:
    - schedules
    - triggers
    - web
  script:
    - apt-get install jq
    - FILE_PATH=$(jq '.file_path' $FILE_NAME)

run_pipeline:
  extends: .get_path
  variables:
    PATH: $FILE_PATH
  trigger:
    project: my/project
    branch: staging
    strategy: depend
I can't seem to find any workaround for this, as using extends won't work since GitLab won't allow a script section in a trigger job.
I thought about trying to use the GitLab API trigger method, but I want the status of the downstream pipeline to actually show up in the pipeline UI, and I want the upstream pipeline to depend on the status of the downstream pipeline, which, from my understanding, is not possible when triggering via the API.
Any advice would be appreciated. Thanks!
You can use artifacts:reports:dotenv for setting variables dynamically for subsequent jobs.
stages:
  - one
  - two

my_job:
  stage: "one"
  script:
    - FILE_PATH=$(jq '.file_path' $FILE_NAME) # In the script, extract the file path.
    - echo "FILE_PATH=${FILE_PATH}" >> variables.env # Add the value to a dotenv file.
  artifacts:
    reports:
      dotenv: "variables.env"

example:
  stage: two
  script: "echo $FILE_PATH"

another_job:
  stage: two
  trigger:
    project: my/project
    branch: staging
    strategy: depend

Variables in the dotenv file will automatically be present for jobs in subsequent stages (or jobs that declare needs: for the job).
You can also pull artifacts into child pipelines, in general.
But be warned: you probably don't want to override the PATH variable, since that's a special variable the shell uses to find executables.
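To get the dynamic value into the downstream pipeline itself, one sketch is to make the bridge job need the job that produced the dotenv report. This assumes that dotenv variables from a needed job (with artifacts: true) are forwarded to the triggered pipeline, which is worth verifying against the downstream-pipeline docs for your GitLab version:
another_job:
  stage: two
  needs:
    - job: my_job
      artifacts: true     # assumption: makes the dotenv variable available to the bridge job
  trigger:
    project: my/project
    branch: staging
    strategy: depend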

Added needs relations to the GitLab CI YAML but got an error: the job was not added to the pipeline

I am trying to add needs between jobs in the GitLab CI YAML configuration file.
stages:
  - build
  - test
  - package
  - deploy

maven-build:
  stage: build
  only:
    - merge_requests
    - master
    - branches
  ...

test:
  stage: test
  needs: [ "maven-build" ]
  only:
    - merge_requests
    - master
  ...

docker-build:
  stage: package
  needs: [ "test" ]
  only:
    - master
  ...

deploy-stage:
  stage: deploy
  needs: [ "docker-build" ]
  only:
    - master
  ...

deploy-prod:
  stage: deploy
  needs: [ "docker-build" ]
  only:
    - master
  when: manual
  ...
I have used the GitLab CI online lint tool to check my syntax; it is correct. But when I pushed the code, it always complains:
'test' job needs 'maven-build' job, but it was not added to the pipeline. You can also test your .gitlab-ci.yml in CI Lint
The GitLab CI did not run at all.
Update: Finally I made it work. I think the needs position is sensitive; moving all needs directly under stage made it work. My original script had some other configuration between them.
CI jobs that depend on each other need to have the same limitations!
In your case, that would mean sharing the same only targets:
stages:
  - build
  - test

maven-build:
  stage: build
  only:
    - merge_requests
    - master
    - branches

test:
  stage: test
  needs: [ "maven-build" ]
  only:
    - merge_requests
    - master
    - branches
That should work, from my experience ^^
"Finally I made it work. I think the needs position is sensitive; moving all needs directly under stage made it work."
Actually... that might no longer be the case with GitLab 14.2 (August 2021):
Stageless pipelines
Using the needs keyword in your pipeline configuration helps to reduce cycle times by ignoring stage ordering and running jobs without waiting for others to complete.
Previously, needs could only be used between jobs on different stages.
In this release, we’ve removed this limitation so you can define a needs relationship between any job you want.
As a result, you can now create a complete CI/CD pipeline without using stages by including needs in every job to implicitly configure the execution order.
This lets you define a less verbose pipeline that takes less time to create and can run even faster.
See Documentation and Issue.
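A minimal sketch of such a stageless pipeline (job names are just illustrative):
# No stages: block; needs alone defines the execution order.
build:
  needs: []
  script:
    - echo build

unit-test:
  needs: ["build"]
  script:
    - echo test

deploy:
  needs: ["unit-test"]
  script:
    - echo deploy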
The rules in both jobs should be the same; otherwise, GitLab cannot create a job dependency between the jobs when the trigger rules are different.
I don't know why, but if the jobs are in different stages (as in my case), you have to define the jobs that will run later with a "." at the start.
Another interesting thing is that GitLab's own CI/CD Lint online editor does not complain that there is an error, so you have to start the pipeline to see it.
Below, notice the "." in ".success_notification" and ".failure_notification":
stages:
  - prepare
  - build_and_test
  - deploy
  - notification

#SOME CODE

build-StandaloneWindows64:
  <<: *build
  image: $IMAGE:$UNITY_VERSION-windows-mono-$IMAGE_VERSION
  variables:
    BUILD_TARGET: StandaloneWindows64

.success_notification:
  needs:
    - job: "build-StandaloneWindows64"
      artifacts: true
  stage: notification
  script:
    - wget https://raw.githubusercontent.com/DiscordHooks/gitlab-ci-discord-webhook/master/send.sh
    - chmod +x send.sh
    - ./send.sh success $WEBHOOK_URL
  when: on_success

.failure_notification:
  needs:
    - job: "build-StandaloneWindows64"
      artifacts: true
  stage: notification
  script:
    - wget https://raw.githubusercontent.com/DiscordHooks/gitlab-ci-discord-webhook/master/send.sh
    - chmod +x send.sh
    - ./send.sh failure $WEBHOOK_URL
  when: on_failure

#SOME CODE
