Get GitLab parent project details in child project

I am using the two GitLab repositories below:
Parent GitLab repo - application code, for example an Angular application
Child GitLab repo - for the GitLab pipeline; it has only a gitlab-ci.yml file which contains the script to run the pipeline
I am calling the child project's gitlab-ci.yml file from the parent using the steps below.
Parent GitLab repo - gitlab-ci.yml file:
include:
  - project: 'my-group/child-project'
    ref: master
    file: '/templates/.gitlab-ci-template.yml'
Child-project - gitlab-ci.yml file:
stages:
  - test
  - build

before_script:
  - export PARENT_PROJECT_NAME=?
  - export PARENT_PROJECT_PIPELINE_ID=?
  - export PARENT_PROJECT_BRANCH_NAME=?

job 1:
  stage: test
  script:
    - echo "Running test for project ${PARENT_PROJECT_NAME}"
    - node_modules/.bin/ng test

release_job:
  stage: build
  script: node_modules/.bin/ng build --prod
  artifacts:
    name: "project-$CI_COMMIT_REF_NAME"
    paths:
      - dist/
  only:
    - tags
How can I get the parent repo details like the parent project name, pipeline ID & branch name in the child project which is running the pipeline?
One way is to define the variables in the parent project and use them in the child project, but is there any other way to directly access the parent project's details in the child project?

In your example, include is not the same as trigger. include just merges all the files together into one giant pipeline, so you should be able to access any variables you want from the included files, as long as the variable's scope is correct.
If you are actually looking to pass details to a child pipeline from a parent pipeline, you could add a job that exports the variables and details you want to a dotenv file, then have the child pipeline access that dotenv. This keeps the values dynamic instead of hard-coding the variables and passing them directly to the child pipelines:
export-parent-details:
  script:
    - echo "PARENT_PROJECT_NAME=?" >> build.env
    - echo "PARENT_PROJECT_PIPELINE_ID=?" >> build.env
    - echo "PARENT_PROJECT_BRANCH_NAME=?" >> build.env
  artifacts:
    reports:
      dotenv: build.env

trigger-child:
  stage: docker_hub
  trigger:
    include:
      - project: 'my-group/child-project'
        ref: master
        file: '/templates/.gitlab-ci-template.yml'
  # use this variable in the child pipeline to download artifacts from the parent pipeline
  variables:
    PARENT_PIPELINE_ID: $CI_PIPELINE_ID
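As a concrete sketch, the "?" placeholders could be filled with GitLab's predefined variables, which hold the parent project's values while this job runs in the parent pipeline (assuming those are the details you want to forward):

export-parent-details:
  script:
    # predefined variables resolve to the parent project here, since the job runs in the parent pipeline
    - echo "PARENT_PROJECT_NAME=$CI_PROJECT_NAME" >> build.env
    - echo "PARENT_PROJECT_PIPELINE_ID=$CI_PIPELINE_ID" >> build.env
    - echo "PARENT_PROJECT_BRANCH_NAME=$CI_COMMIT_REF_NAME" >> build.env
  artifacts:
    reports:
      dotenv: build.env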
Then inside the child jobs, you should be able to access the parent pipeline's artifacts:
child-job:
  needs:
    - pipeline: $PARENT_PIPELINE_ID
      job: export-parent-details
  script:
    - echo $PARENT_PROJECT_NAME
See
https://docs.gitlab.com/ee/ci/yaml/README.html#artifact-downloads-to-child-pipelines
https://docs.gitlab.com/ee/ci/multi_project_pipelines.html#pass-cicd-variables-to-a-downstream-pipeline-by-using-variable-inheritance
Another option could be to make an API call to get the parent project's details, since the runners have a read-only token under $CI_JOB_TOKEN. This method depends on repo access privileges and the details you want:
curl -H "JOB-TOKEN: $CI_JOB_TOKEN" "https://gitlab.com/api/v4/{whatever the api call is}"
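For instance, a sketch against the standard Projects API (note that not every endpoint accepts job-token authentication, so depending on what you query you may need a personal or project access token instead):

# fetch details of the current project from the GitLab API v4
curl -H "JOB-TOKEN: $CI_JOB_TOKEN" "https://gitlab.com/api/v4/projects/$CI_PROJECT_ID"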

Since you're including the child project's configuration instead of triggering it, the two pipeline definition files are merged into one before the pipeline starts, so there would be no practical difference between this method and having the content of the child project's definition in the parent's.
Because of this, all of the predefined variables will be based on the parent project if the pipeline runs there. For example, variables like $CI_COMMIT_REF_NAME and $CI_PROJECT_NAME will point to the parent project and the parent project's branches.
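For example, a job inside the included template would report the parent's details directly (a minimal sketch):

template-job:
  script:
    - echo "Project:  $CI_PROJECT_NAME"    # the parent project's name
    - echo "Pipeline: $CI_PIPELINE_ID"     # the parent pipeline's ID
    - echo "Branch:   $CI_COMMIT_REF_NAME" # the parent's branch or tag name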

Related

How to run a particular stage in GitLab after the execution of child pipelines?

I'm using GitLab and the CI config has the following stages:
stages:
  - test
  - child_pipeline1
  - child_pipeline2
  - promote-test-reports
At any time, only one of the child pipelines will run after the test stage, i.e. either child_pipeline1 or child_pipeline2, and not both at a time.
Now, I have added another stage called promote-test-reports which I would like to run at the end, i.e. after the successful execution of either child pipeline.
But I'm completely blocked here. This promote-test-reports is coming from a template which I have included in the main CI config file like this:
# Include the template file
include:
  - project: arcesium/internal/vteams/commons/commons-helper
    ref: promote-child-artifacts
    file: 'templates/promote-child-artifacts.yml'
I'm overriding the GitLab project token in this same main file like below:
test:
  stage: promote-test-reports
  trigger:
    include: adapter/child_pipelin1.yml
    strategy: depend
  variables:
    GITLAB_PRIVATE_TOKEN: $GITLAB_TOKEN
If you look at the above stage definition in the main CI config file, I'm trying to use strategy: depend to wait for successful execution of child_pipeline1 and then run this stage, but it throws an error (jobs:test config contains unknown keys: trigger). This approach does not work because I'm using scripts in the main definition of this stage (promote-test-reports) in the template, and as per the documentation, script and strategy cannot go together.
The following is the definition of this stage in the template:
test:
  stage: promote-test-reports
  image: "495283289134.dkr.ecr.us-east-1.amazonaws.com/core/my-linux:centos7"
  before_script:
    - "yum install unzip -y"
  variables:
    GITLAB_PRIVATE_TOKEN: $GITLAB_TOKEN
  allow_failure: true
  script:
    - 'cd $CI_PROJECT_DIR'
    - 'echo running'
  when: always
  rules:
    - if: $CI_PIPELINE_SOURCE == 'web'
  artifacts:
    reports:
      junit: "**/testresult.xml"
      coverage_report:
        coverage_format: cobertura
        path: "**/**/coverage.xml"
  coverage: '/TOTAL\s+\d+\s+\d+\s+(\d+%)/'
The idea of using the strategy attribute failed. I cannot remove the script logic from the template either. May I know an alternate way of running my job (promote-test-reports) at the end? Remember, it is an OR condition: it runs after either child_pipeline1 or child_pipeline2.
I would really appreciate your help.
Finally, I was able to do it by putting strategy: depend on the child pipeline trigger jobs. I was doing it incorrectly earlier by putting it on the stage promote-test-reports.
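A minimal sketch of what that looks like (the second trigger job and its file are hypothetical, mirroring the first; the template's test job stays unchanged in the last stage):

child_pipeline1:
  stage: child_pipeline1
  trigger:
    include: adapter/child_pipelin1.yml
    strategy: depend   # this job waits for its child pipeline to finish

child_pipeline2:
  stage: child_pipeline2
  trigger:
    include: adapter/child_pipelin2.yml   # hypothetical second child definition
    strategy: depend

# promote-test-reports is the last stage, so the template's test job starts
# only after whichever child pipeline actually ran has completed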

GitLab runner skips artifacts for failed job with when: always when using "extends"

I have a "parent" job defined in a shared (between projects) yaml "default-ci.yml":
.build_solution:
  stage: exe_builds
  # Declare variables to be overridden
  variables:
    LVSLN: null
    OUTPUT_DIR: null
    # optionally use CONFIGURATION to specify a configuration
  script:
    - . $SCRIPTS_DIR\build_solution.ps1
      -SolutionPath "$LVSLN"
      -OutputDir "$OUTPUT_DIR"
  artifacts:
    when: always
    paths:
      - $ESCAPED_ARTIFACTS_DIR\
    expire_in: 1 week
In the specific project yaml I define values for the variables
include: "default-ci.yml"
Build My Project:
extends: .build_solution
variables:
LVSLN: build.lvsln
OUTPUT_DIR: build
With it set up as above, the artifacts are only stored for successful jobs, despite the parent stating when: always. At the end of the job summary of a failed job, it goes straight from the after-script to "Cleaning up project directory and file based variables": no "Uploading artifacts for failed job" (in other words, it's not even trying to find/upload artifacts).
But, when I move the artifacts section to the child yml, so it becomes
include: "default-ci.yml"
Build My Project:
extends: .build_solution
variables:
LVSLN: build.lvsln
OUTPUT_DIR: build
artifacts:
when: always
paths:
- $ESCAPED_ARTIFACTS_DIR\
expire_in: 1 week
I do get artifacts from failed jobs, with "Uploading artifacts for failed job" appearing in the summary.
The artifacts section should be the same for all projects extending .build_solution, so I do not want to have to define it in each of the children; it should be defined in the parent.
It looks like the parent's artifacts section is not applying or is being overridden, but I can't find why or where that would happen when the child has no artifacts section. The rest of the .build_solution job works as defined in the parent, since I see output from my build_solution.ps1 script and it was passed the correct parameters.

Is there any way to dynamically edit a variable in one job and then pass it to a trigger/bridge job in GitLab CI?

I need to pass a file path to a trigger job where the file path is found within a specified json file in a separate job. Something along the lines of this...
stages:
  - run_downstream_pipeline

variables:
  FILE_NAME: default_file.json

.get_path:
  stage: run_downstream_pipeline
  needs: []
  only:
    - schedules
    - triggers
    - web
  script:
    - apt-get install jq
    - FILE_PATH=$(jq '.file_path' $FILE_NAME)

run_pipeline:
  extends: .get_path
  variables:
    PATH: $FILE_PATH
  trigger:
    project: my/project
    branch: staging
    strategy: depend
I can't seem to find any workaround for this, as using extends won't work since GitLab won't allow a script section in a trigger job.
I thought about trying to use the GitLab API trigger method, but I want the status of the downstream pipeline to actually show up in the pipeline UI, and I want the upstream pipeline to depend on the status of the downstream pipeline, which from my understanding is not possible when triggering via the API.
Any advice would be appreciated. Thanks!
You can use artifacts:reports:dotenv to set variables dynamically for subsequent jobs:
stages:
  - one
  - two

my_job:
  stage: "one"
  script:
    - FILE_PATH=$(jq '.file_path' $FILE_NAME) # In the script, read the file path from the JSON file.
    - echo "FILE_PATH=${FILE_PATH}" >> variables.env # Add the value to a dotenv file.
  artifacts:
    reports:
      dotenv: "variables.env"

example:
  stage: two
  script: "echo $FILE_PATH"

another_job:
  stage: two
  trigger:
    project: my/project
    branch: staging
    strategy: depend
Variables in the dotenv file will automatically be present for jobs in subsequent stages (or that declare needs: for the job).
You can also pull artifacts into child pipelines, in general.
But be warned you probably don't want to override the PATH variable, since that's a special variable used to help you find builtin binaries.
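For instance, forwarding the value under a different, hypothetical name keeps the builtin PATH intact:

another_job:
  stage: two
  variables:
    JSON_FILE_PATH: $FILE_PATH  # hypothetical name instead of PATH; passed on to the downstream pipeline
  trigger:
    project: my/project
    branch: staging
    strategy: depend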

Set up GitLab CI/CD environment variables via the UI

I am new to GitLab CI/CD settings. I don't want to put sensitive credentials (like API keys, passwords...) into my branch. For this, GitLab (and other CI/CD services) can set environment variables.
What I have done so far:
Via the UI, go to Project ⇒ Settings ⇒ CI/CD ⇒ Variables and add the variables there.
Then try to load the file with all your config values (e.g. with dotenv):
require("dotenv");
module.exports = process.env.NODE_ENV.trim() === "production" ? _config.production : _config.development;
The current .gitlab-ci.yml file is:
image: node:8.9.0

cache:
  paths:
    - node_modules/

stages:
  - ver
  - init
  - test
  - build
  - deploy

ver:
  stage: ver
  script:
    - node -v

init:
  stage: init
  script:
    - npm install
  tags:
    - dev_server
  only:
    - dev
  variables:
    ENV_PRODUCTION: "/builds/AkdiD/8/abcde/projectName/ENV_PRODUCTION"

test:
  stage: test
  script:
    - npm test

build:
  stage: build
  script:
    - echo "BUILD_VERSION=production" >> build.env
  artifacts:
    reports:
      dotenv: build.env

deploy:
  stage: deploy
  script:
    - npm run killcurrent
    - echo $ENV_PRODUCTION
    - echo $BUILD_VERSION
    - npm run staging
  tags:
    - dev_server
  only:
    - dev
Question: where do I need to keep this ENV_PRODUCTION file name (the YAML file or some other place) so that the server picks up that value?
I edited the variable like this, but the server is still not fetching these variables. Should I change/put something in the .gitlab-ci.yml file?
Setting up a custom environment variable of type File (GitLab 11.11+) does not seem to be the way to reference/set a list of variables, including ones with sensitive information.
A variable of type File is generally there to represent, for instance, a certificate.
You should define variables, possibly group-level environment variables:
You can define per-project or per-group variables that are set in the pipeline environment.
Group-level variables are stored out of the repository (not in .gitlab-ci.yml) and are securely passed to GitLab Runner, which makes them available during a pipeline run.
For Premium users who do not use an external key store or who use GitLab’s integration with HashiCorp Vault, we recommend using group environment variables to store secrets like passwords, SSH keys, and credentials.
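For example, once ENV_PRODUCTION is added under Project ⇒ Settings ⇒ CI/CD ⇒ Variables (or at group level), a job can read it directly; nothing is stored in the repository (a minimal sketch):

deploy:
  stage: deploy
  script:
    - echo $ENV_PRODUCTION   # injected by the runner from the CI/CD settings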

GitLab CI - Run pipeline when the contents of a file changes

I have a mono-repo with several projects (not my design choice).
Each project has a .gitlab-ci.yml setup to run a pipeline when a "version" file is changed. This is nice because a user can check-in to stage or master (for a hot-fix) and a build is created and deployed to a test environment.
The problem is when a user does a merge from master to stage and commits back to stage (to pull in any hot-fixes). This causes ALL the pipelines to run; even for projects that do not have actual content changes.
How do I allow the pipeline to run from master and/or stage but ONLY when the contents of the "version" file change? Like when a user changes the version number.
Here is an example of the .gitlab-ci.yml (I have 5 of these, 1 for each project in the mono-repo)
#
# BUILD-AND-TEST - initial build
#
my-project-build-and-test:
  stage: build-and-test
  script:
    - cd $MY_PROJECT_DIR
    - dotnet restore
    - dotnet build
  only:
    changes:
      - "MyProject/.gitlab-ci.VERSION.yml"
  # no needs: here because this is the first step

#
# PUBLISH
#
my-project-publish:
  stage: publish
  script:
    - cd $MY_PROJECT_DIR
    - dotnet publish --output $MY_PROJECT_OUTPUT_PATH --configuration Release
  only:
    changes:
      - "MyProject/.gitlab-ci.VERSION.yml"
  needs:
    - my-project-build-and-test

... and so on ...
I am still new to git, GitLab, and CI/pipelines. Any help would be appreciated! (I have little say in changing the mono-repo)
The following .gitlab-ci.yml will run test_job only if the version file changes:
test_job:
  script: echo hello world
  rules:
    - changes:
        - version
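If you also want to limit this to the master and stage branches, as described in the question, rules can combine if with changes (a sketch using the branch names from the question):

test_job:
  script: echo hello world
  rules:
    # run only on master or stage, and only when the version file changed
    - if: '$CI_COMMIT_BRANCH == "master" || $CI_COMMIT_BRANCH == "stage"'
      changes:
        - version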
See https://docs.gitlab.com/ee/ci/yaml/#ruleschanges
See also
Run jobs only/except for modifications on a path or file
