GitLab runner skips artifacts for failed job with when: always when using "extends"

I have a "parent" job defined in a shared (between projects) yaml "default-ci.yml":
.build_solution:
  stage: exe_builds
  # Declare variables to be overridden
  variables:
    LVSLN: null
    OUTPUT_DIR: null
    # optionally use CONFIGURATION to specify a configuration
  script:
    - . $SCRIPTS_DIR\build_solution.ps1
      -SolutionPath "$LVSLN"
      -OutputDir "$OUTPUT_DIR"
  artifacts:
    when: always
    paths:
      - $ESCAPED_ARTIFACTS_DIR\
    expire_in: 1 week
In the project-specific yaml I define values for those variables:
include: "default-ci.yml"

Build My Project:
  extends: .build_solution
  variables:
    LVSLN: build.lvsln
    OUTPUT_DIR: build
With it set up as above, artifacts are only stored for successful jobs, despite the parent specifying when: always. At the end of a failed job's summary, it goes straight from the after script to "Cleaning up project directory and file based variables", with no "Uploading artifacts for failed job" (in other words, it is not even trying to find/upload artifacts).
But when I move the artifacts section into the child yaml, so that it becomes:
include: "default-ci.yml"

Build My Project:
  extends: .build_solution
  variables:
    LVSLN: build.lvsln
    OUTPUT_DIR: build
  artifacts:
    when: always
    paths:
      - $ESCAPED_ARTIFACTS_DIR\
    expire_in: 1 week
I do get artifacts from failed jobs, with "Uploading artifacts for failed job" appearing in the summary.
The artifacts section should be the same for all projects extending .build_solution, so I do not want to have to define it in each of the children; it should be defined in the parent.
It looks like the parent's artifacts section is not being applied or is being overridden, but I can't find why or where that would happen when the child has no artifacts section. The rest of the .build_solution job works as defined in the parent: I see output from my build_solution.ps1 script and it was passed the correct parameters.
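For what it's worth, GitLab's pipeline editor has a full configuration view that shows the result after includes and extends are merged, which should reveal whether the artifacts section survives the merge. And one hedged workaround that keeps the artifacts defined once in the shared file is the !reference tag (GitLab 13.9+); a sketch using the names from the question, with .default_artifacts as a new hidden job:

# default-ci.yml
.default_artifacts:
  artifacts:
    when: always
    paths:
      - $ESCAPED_ARTIFACTS_DIR\
    expire_in: 1 week

# project yaml
include: "default-ci.yml"

Build My Project:
  extends: .build_solution
  variables:
    LVSLN: build.lvsln
    OUTPUT_DIR: build
  # copies the artifacts section verbatim from the hidden job in the shared file
  artifacts: !reference [.default_artifacts, artifacts]

This still adds one line to each child, but the artifacts definition itself lives only in default-ci.yml.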

Related

How to run a particular stage in GitLab after the execution of child pipelines?

I'm using GitLab and the CI config has the following stages:
stages:
  - test
  - child_pipeline1
  - child_pipeline2
  - promote-test-reports
At any time, only one of the child pipelines will run after the test stage, i.e. either child_pipeline1 or child_pipeline2, not both.
Now, I have added another stage called promote-test-reports which I would like to run at the end, i.e. after the successful execution of either child pipeline.
But I'm completely blocked here. This promote-test-reports comes from a template which I have included in this main CI config file like:
# Include the template file
include:
  - project: arcesium/internal/vteams/commons/commons-helper
    ref: promote-child-artifacts
    file: 'templates/promote-child-artifacts.yml'
I'm overriding the GitLab project token in the same main file like below:
test:
  stage: promote-test-reports
  trigger:
    include: adapter/child_pipelin1.yml
    strategy: depend
  variables:
    GITLAB_PRIVATE_TOKEN: $GITLAB_TOKEN
In the stage definition above, I'm trying to use strategy: depend to wait for the successful execution of child_pipeline1 and then run this stage, but it throws an error (jobs:test config contains unknown keys: trigger). This approach cannot work because the template's definition of this stage (promote-test-reports) uses scripts, and as per the documentation, scripts and strategy cannot go together.
Following is the definition of this stage in the template:
test:
  stage: promote-test-reports
  image: "495283289134.dkr.ecr.us-east-1.amazonaws.com/core/my-linux:centos7"
  before_script:
    - "yum install unzip -y"
  variables:
    GITLAB_PRIVATE_TOKEN: $GITLAB_TOKEN
  allow_failure: true
  script:
    - 'cd $CI_PROJECT_DIR'
    - 'echo running'
  when: always
  rules:
    - if: $CI_PIPELINE_SOURCE == 'web'
  artifacts:
    reports:
      junit: "**/testresult.xml"
      coverage_report:
        coverage_format: cobertura
        path: "**/**/coverage.xml"
  coverage: '/TOTAL\s+\d+\s+\d+\s+(\d+%)/'
The idea of using the strategy attribute failed, and I cannot remove the script logic from the template either. May I know an alternate way of running my job (promote-test-reports) at the end? Remember it is an OR condition: it runs after either child_pipeline1 or child_pipeline2.
I would really appreciate your help.
Finally, I was able to do it by putting strategy: depend on the child pipeline trigger jobs. I was doing it incorrectly earlier by putting it on the promote-test-reports stage.
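A minimal sketch of that arrangement, with file paths assumed and job names simplified from the question (optional needs require GitLab 13.10+):

child_pipeline1:
  stage: child_pipeline1
  trigger:
    include: adapter/child_pipeline1.yml
    strategy: depend   # this trigger job only succeeds once its child pipeline succeeds

child_pipeline2:
  stage: child_pipeline2
  trigger:
    include: adapter/child_pipeline2.yml
    strategy: depend

promote-test-reports:
  stage: promote-test-reports
  needs:
    - job: child_pipeline1
      optional: true   # optional, since only one of the two child pipelines runs
    - job: child_pipeline2
      optional: true
  script:
    - echo running

Because strategy: depend sits on the trigger jobs, the template's promote-test-reports job can keep its script section untouched.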

How to run a child pipeline in GitLab

I have two projects in GitLab and I want to start the pipelines of three other projects from the one project. It is important to me that the pipelines run in a certain order.
ProjectRoot -> ProjectA -> ProjectB -> ...
Each project has its own gitlab-ci.yml with corresponding content.
ProjectRoot
...
projecta:
  stage: projecta
  trigger:
    include: 'app/projecta'
    ref: 'test'
    file: '/.gitlab-ci.yml'
    strategy: depend
  needs:
    - root
  when: on_success
...
ProjectA
test:
  stage: test
  script:
    - cat requirements.txt
  allow_failure: true
The pipeline is triggered and I also see the job in Gitlab, but it exits with an error.
cat: can't open 'requirements.txt': No such file or directory
Why doesn't the root pipeline access the content of the project, and how can I build this so that the content is loaded from the project and I can keep a certain order?
The projects should run one after the other, and only if the previous job was successful.
With curl I can get the pipeline to run, but then the next job does not wait for the previous one.
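The trigger:include form only pulls a configuration file into a child pipeline of the root project, so the triggered jobs still run against the root project's checkout, where requirements.txt does not exist. A hedged sketch of a multi-project trigger instead, which runs ProjectA's own pipeline inside ProjectA's repository (project path and branch assumed):

projecta:
  stage: projecta
  trigger:
    project: 'app/projecta'   # multi-project trigger: the pipeline runs in ProjectA's repo
    branch: test
    strategy: depend          # this job only succeeds when ProjectA's pipeline succeeds
  needs:
    - root

Chaining each project's trigger job with needs and strategy: depend this way keeps the order and stops the chain on the first failure.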

How can I skip a job based on the changes made in .gitlab-ci.yml?

I have two jobs in GitLab CI/CD: build-job (which takes ~10 minutes) and deploy-job (which takes ~2 minutes). In a pipeline, build-job succeeded while deploy-job failed because of a typo in the script. How can I fix this typo and run only deploy-job, instead of rerunning build-job?
My current .gitlab-ci.yml is as follows:
stages:
  - build
  - deploy

build-job:
  image: ...
  stage: build
  only:
    refs:
      - main
    changes:
      - .gitlab-ci.yml
      - pubspec.yaml
      - test/**/*
      - lib/**/*
  script:
    - ...
  artifacts:
    paths:
      - ...
    expire_in: 1 hour

deploy-job:
  image: ...
  stage: deploy
  dependencies:
    - build-job
  only:
    refs:
      - main
    changes:
      - .gitlab-ci.yml
      - pubspec.yaml
      - test/**/*
      - lib/**/*
  script:
    - ...
I imagine something like:
the triggerer (me) fixes the typo in deploy-job's script and pushes the changes
the pipeline is triggered
the runner looks at the files described by build-job/only/changes and detects a single change in the .gitlab-ci.yml file
the runner looks at the .gitlab-ci.yml file and detects there IS a change, but since this change does not belong to the build-job section AND the job previously succeeded, this job is skipped
the runner looks at the files described by deploy-job/only/changes and detects a single change in the .gitlab-ci.yml file
the runner looks at the .gitlab-ci.yml file and detects there is a change, and since this change belongs to the deploy-job section, this job is executed
This way, only deploy-job is executed. Can this be done, either using rules or only?
There is a straightforward way; you just need access to the GitLab CI interface.
Push and let the pipeline start normally.
Cancel the first job.
Run the desired one (deploy).
Another way is to create a new job, intended to be run only when needed, in which you set up a deploy with no dependencies. Just copy the current deploy code without the dependencies, and also add manual mode with:
job:
  script: echo "Hello, Rules!"
  rules:
    - if: $CI_PIPELINE_SOURCE == "merge_request_event"
      when: manual
      allow_failure: true
As in the docs.
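Putting the two suggestions together, a sketch of such a manual, dependency-free deploy job; the rule condition is an assumption standing in for the original's only: refs: main, and the image and script placeholders are elided as in the question:

deploy-job-manual:
  image: ...
  stage: deploy
  dependencies: []   # explicitly skip artifact download from earlier jobs
  script:
    - ...
  rules:
    - if: $CI_COMMIT_BRANCH == "main"   # assumed equivalent of only: refs: main
      when: manual        # only runs when started by hand from the pipeline UI
      allow_failure: true # keeps the pipeline from blocking on this manual job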

Is there any way to dynamically edit a variable in one job and then pass it to a trigger/bridge job in Gitlab CI?

I need to pass a file path to a trigger job where the file path is found within a specified json file in a separate job. Something along the lines of this...
stages:
  - run_downstream_pipeline

variables:
  FILE_NAME: default_file.json

.get_path:
  stage: run_downstream_pipeline
  needs: []
  only:
    - schedules
    - triggers
    - web
  script:
    - apt-get install jq
    - FILE_PATH=$(jq '.file_path' $FILE_NAME)

run_pipeline:
  extends: .get_path
  variables:
    PATH: $FILE_PATH
  trigger:
    project: my/project
    branch: staging
    strategy: depend
I can't seem to find any workaround to do this, as using extends won't work since GitLab won't allow a script section in a trigger job.
I thought about using the GitLab API trigger method, but I want the status of the downstream pipeline to actually show up in the pipeline UI, and I want the upstream pipeline to depend on the status of the downstream pipeline, which from my understanding is not possible when triggering via the API.
Any advice would be appreciated. Thanks!
You can use artifacts:reports:dotenv for setting variables dynamically for subsequent jobs.
stages:
  - one
  - two

my_job:
  stage: "one"
  script:
    - FILE_PATH=$(jq '.file_path' $FILE_NAME) # In the script, extract the file path.
    - echo "FILE_PATH=${FILE_PATH}" >> variables.env # Add the value to a dotenv file.
  artifacts:
    reports:
      dotenv: "variables.env"

example:
  stage: two
  script: "echo $FILE_PATH"

another_job:
  stage: two
  trigger:
    project: my/project
    branch: staging
    strategy: depend
Variables in the dotenv file will automatically be present for jobs in subsequent stages (or that declare needs: for the job).
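For example, a sketch of the needs: variant, using the job names from above:

example_with_needs:
  stage: two
  needs: ["my_job"]   # fetches my_job's artifacts, including the dotenv report
  script: "echo $FILE_PATH"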
You can also pull artifacts into child pipelines, in general.
But be warned you probably don't want to override the PATH variable, since that's a special variable the shell uses to locate executables.
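A sketch of the same trigger job with a safer, hypothetical variable name in place of PATH:

another_job:
  stage: two
  variables:
    SOURCE_FILE_PATH: $FILE_PATH   # renamed from PATH so the shell's search path stays intact
  trigger:
    project: my/project
    branch: staging
    strategy: depend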

Get GitLab parent project details in child project

I am using the below two GitLab repositories:
Parent GitLab repo - application code, for example an Angular application
Child GitLab repo - for the GitLab pipeline; has only a gitlab-ci.yml file which contains the script to run the pipeline
I am calling the pipeline/child-project gitlab-ci.yml file from the parent using the steps below.
Parent GitLab repo - gitlab-ci.yml file
include:
  - project: 'my-group/child-project'
    ref: master
    file: '/templates/.gitlab-ci-template.yml'
Child-project - gitlab-ci.yml file
stages:
  - test
  - build

before_script:
  - export PARENT_PROJECT_NAME=?
  - export PARENT_PROJECT_PIPELINE_ID=?
  - export PARENT_PROJECT_BRANCH_NAME=?

job 1:
  stage: test
  script:
    - echo "Running test for project ${PARENT_PROJECT_NAME}"
    - node_modules/.bin/ng test

release_job:
  stage: build
  script: node_modules/.bin/ng build --prod
  artifacts:
    name: "project-$CI_COMMIT_REF_NAME"
    paths:
      - dist/
  only:
    - tags
How can I get the parent project details, like the parent project name, pipeline id, and branch name, in the child project which is running the pipeline?
One way is to define the variables in the parent project and use them in the child project, but is there any other way to directly access the parent project's details in the child project?
In your example, include is not the same as trigger. include just merges all the files together into one giant pipeline, so you should be able to access any variables you want from the included files, as long as the variable's scope is correct.
If you are actually looking to pass details to a child pipeline from a parent pipeline, you could add a job that exports the variables and details you want to a dotenv file, then have the child pipeline access that dotenv. This keeps the code dynamic instead of hard-coding the variables and directly passing them to the child pipelines:
export-parent-details:
  script:
    - echo "PARENT_PROJECT_NAME=?" >> build.env
    - echo "PARENT_PROJECT_PIPELINE_ID=?" >> build.env
    - echo "PARENT_PROJECT_BRANCH_NAME=?" >> build.env
  artifacts:
    reports:
      dotenv: build.env

trigger-child:
  stage: docker_hub
  trigger:
    include:
      - project: 'my-group/child-project'
        ref: master
        file: '/templates/.gitlab-ci-template.yml'
  # use this variable in the child pipeline to download artifacts from the parent pipeline
  variables:
    PARENT_PIPELINE_ID: $CI_PIPELINE_ID
Then inside the child jobs, you should be able to access the parent's artifacts:
child-job:
  needs:
    - pipeline: $PARENT_PIPELINE_ID
      job: export-parent-details
  script:
    - echo $PARENT_PROJECT_NAME
See
https://docs.gitlab.com/ee/ci/yaml/README.html#artifact-downloads-to-child-pipelines
https://docs.gitlab.com/ee/ci/multi_project_pipelines.html#pass-cicd-variables-to-a-downstream-pipeline-by-using-variable-inheritance
Another option could be to make an API call to get the parent project's details, since the runners have a read-only token under $CI_JOB_TOKEN. This method depends on repo access privileges and on the details you want.
curl -H "JOB-TOKEN: $CI_JOB_TOKEN" "https://gitlab.com/api/v4/{whatever the api call is}"
Since you're including the child project's configuration instead of triggering it, the two pipeline definition files are merged into one before the pipeline starts, so there'd be no practical difference between this method and having the content of the child project's definition in the parent's.
Because of this, all of the predefined variables will be based on the parent project if the pipeline runs there. For example, variables like $CI_COMMIT_REF_NAME and $CI_PROJECT_NAME will point to the parent project and the parent project's branches.
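So a job in the included template can read those details directly; a minimal sketch (the job name is hypothetical, the variables are standard predefined GitLab CI variables):

show-parent-details:
  stage: test
  script:
    - echo "Project: $CI_PROJECT_NAME"       # resolves to the including (parent) project
    - echo "Pipeline ID: $CI_PIPELINE_ID"
    - echo "Branch: $CI_COMMIT_REF_NAME"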
