GitLab Templating: Multiple before_script blocks in default?

This doesn't work correctly:
template1.yml

default:
  before_script:
    - echo "hello from one"

template2.yml

default:
  before_script:
    - echo "hello from two"

.gitlab-ci.yml (the actual pipeline)

include:
  - project: 'templates'
    file:
      - 'template1.yml'
      - 'template2.yml'

build1:
  stage: build
  script:
    - echo "hello from the pipeline"
Output (note that the output from template1 is missing):

hello from two
hello from the pipeline
I understand that GitLab may consider the ordering 'ambiguous' when merging multiple blocks into before_script, but it could use the order specified in the pipeline's include: section.
Does anyone know of a way that I can arbitrarily include templates in my pipeline in such a way that they can contribute to the default 'before_script' block?
It all must run sequentially in the same container (be in the same 'stage')
For example, YAML anchors require us to specify the names of the anchors rather than just the file the anchors live in, so that doesn't work...
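One near-miss worth sketching, assuming GitLab 13.9 or later (which, as far as I know, introduced !reference tags that do survive include): wrap each template's lines in a hidden job and stitch them together in the consuming job. Like anchors, this still names hidden jobs rather than just files, so treat it as the closest supported approximation rather than a direct answer; the hidden-job names below are invented.

template1.yml

.from-one:
  before_script:
    - echo "hello from one"

template2.yml

.from-two:
  before_script:
    - echo "hello from two"

.gitlab-ci.yml

include:
  - project: 'templates'
    file:
      - 'template1.yml'
      - 'template2.yml'

build1:
  stage: build
  before_script:
    # both snippets run sequentially in this job's container
    - !reference [.from-one, before_script]
    - !reference [.from-two, before_script]
  script:
    - echo "hello from the pipeline"

The execution order is then whatever order the !reference lines are written in, independent of the include order.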

Related

Is it possible to include / extend anchor from another .gitlab-ci.yml file?

I am trying to reuse an anchor from one YAML file in another YAML file.
parent-gitlab-ci.yml

.basic_check1: &basic_check1
  script:
    - echo "basic check1"

.basic_check2: &basic_check2
  script:
    - echo "basic check2"

child-gitlab-ci.yml

include:
  - local: 'parent-gitlab-ci.yml'

stages:
  - test

job1:
  stage: test
  script:
    - *basic_check1
    - *basic_check2
But I am getting the error 'This GitLab CI configuration is invalid: Unknown alias: basic_check1'. Is there any solution for this? I want to include the script from both anchors. I tried using a template job with a before_script as a hack, but it doesn't solve my problem since I can't have two before_scripts in a job to mimic the two anchors above.
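Anchors can't work here: each included file is parsed as a separate YAML document, so aliases never resolve across include. The supported replacement, assuming GitLab 13.9 or later, is the !reference tag; a minimal sketch against the files above:

child-gitlab-ci.yml

include:
  - local: 'parent-gitlab-ci.yml'

stages:
  - test

job1:
  stage: test
  script:
    # each !reference expands to the referenced hidden job's script lines
    - !reference [.basic_check1, script]
    - !reference [.basic_check2, script]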

Unexpected behaviour of "rules" in GitLab CI

I have some problems with understanding how and why "rules" in GitLab CI work.
I have written a minimal code showing the issue in a GitLab project: https://gitlab.com/daniel.grabowski/gitlab-ci-rules-problems It contains two directories ("files1/" and "files2/") with some files in them and a .gitlab-ci.yml file.
My configuration
Here's the CI configuration:
stages:
  - build

workflow:
  rules:
    - if: $CI_PIPELINE_SOURCE == "push"

.job_tpl:
  image: alpine:latest
  stage: build
  variables:
    TARGET_BRANCH: $CI_DEFAULT_BRANCH
  rules:
    - if: $CI_COMMIT_BRANCH == $TARGET_BRANCH
      changes:
        - $FILES_DIR/**/*
      variables:
        JOB_ENV: "prod"
    - if: $CI_COMMIT_BRANCH != $TARGET_BRANCH
      changes:
        - $FILES_DIR/**/*
      when: manual
      allow_failure: true
      variables:
        JOB_ENV: "dev"
  script:
    - echo "CI_COMMIT_BRANCH=$CI_COMMIT_BRANCH"
    - echo "TARGET_BRANCH=$TARGET_BRANCH"
    - echo "JOB_ENV=$JOB_ENV"

files1 job:
  extends: .job_tpl
  variables:
    FILES_DIR: files1

files2 job:
  extends: .job_tpl
  variables:
    FILES_DIR: files2
As you can see in the above code, I'm using workflow to run only "branch pipelines", and I have two "twin" jobs, each configured to watch for changes in one of the project's directories. The TARGET_BRANCH variable is of course unnecessary in the demo project, but I need something like it in the real one, and it shows one of my problems. Additionally, the jobs behave differently depending on the branch for which they are run.
My expectations
What I want to achieve is:
Each of the jobs should be added to a pipeline only when I push changes to the files1/ or files2/ directory, respectively.
When I push changes to a branch other than "main", a manual job responsible for the changed directory should be added to the pipeline.
When I merge changes to the "main" branch, a job responsible for the changed directory should be added to the pipeline and started automatically.
Test scenario
I create a new branch from "main", make a change in files1/test.txt and push the branch to GitLab.
what I expect: a pipeline created with only "files1 job" runnable manually
what I get: a pipeline with both jobs (both manual). Actually, I've found an explanation of this behaviour here: https://docs.gitlab.com/ee/ci/jobs/job_control.html#jobs-or-pipelines-run-unexpectedly-when-using-changes - "The changes rule always evaluates to true when pushing a new branch or a new tag to GitLab."
On the same branch I make another change in files1/test.txt and push again.
what I expect: a pipeline created with only "files1 job" runnable manually
what I get: exactly what I expect since the branch isn't a "new" one
I create a Merge Request from my branch to main and make the merge.
what I expect: a pipeline created with only "files1 job" which starts automatically
what I get: a pipeline created with only "files1 job" but a manual one
My questions/problems
Can you suggest any way to bypass the issue with "changes" always evaluating to "true" on new branches? It actually behaves exactly as I want when I don't use "rules", but let's assume I need "rules". (A possible workaround is sketched after this section.)
Why do the jobs run as "manual" on the main branch, in spite of the "if" condition in which both the CI_COMMIT_BRANCH and TARGET_BRANCH variables are (or should be) set to "main"? To debug it I'm printing those vars in the job's "script", and when I run it in a "main" pipeline I get:
$ echo "CI_COMMIT_BRANCH=$CI_COMMIT_BRANCH"
CI_COMMIT_BRANCH=main
$ echo "TARGET_BRANCH=$TARGET_BRANCH"
TARGET_BRANCH=main
$ echo "JOB_ENV=$JOB_ENV"
JOB_ENV=dev
so theoretically CI should enter into the "automatic" job path.
Generally I find the CI "rules" quite inconvenient and confusing, but as I understand it GitLab prefers them to the "only/except" solution, so I'm trying to refactor my CI/CD to use them, which will fail if I don't find a solution for the above difficulties :(
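For the first problem, one workaround worth trying, assuming a GitLab release that has rules:changes:compare_to (introduced around 14.7, if I remember correctly): diff against the default branch explicitly, so a freshly pushed branch compares against main instead of matching everything. A sketch for one of the jobs, with the default branch assumed to be "main" (hard-coded here, since variable support inside compare_to only arrived in later releases, as far as I know):

files1 job:
  extends: .job_tpl
  variables:
    FILES_DIR: files1
  rules:
    - if: $CI_COMMIT_BRANCH != $CI_DEFAULT_BRANCH
      changes:
        # compare against the default branch instead of the previous push
        compare_to: 'refs/heads/main'   # assumed default branch name
        paths:
          - files1/**/*
      when: manual
      allow_failure: true

As for the second problem, a plausible explanation is that rules:if does not expand nested variables: at rule-evaluation time $TARGET_BRANCH still holds the literal string $CI_DEFAULT_BRANCH, so the first rule never matches, while the job's script later prints the fully expanded value. Comparing against $CI_DEFAULT_BRANCH directly in the if clauses avoids the indirection.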

Is there any way to dynamically edit a variable in one job and then pass it to a trigger/bridge job in Gitlab CI?

I need to pass a file path to a trigger job where the file path is found within a specified json file in a separate job. Something along the lines of this...
stages:
  - run_downstream_pipeline

variables:
  FILE_NAME: default_file.json

.get_path:
  stage: run_downstream_pipeline
  needs: []
  only:
    - schedules
    - triggers
    - web
  script:
    - apt-get install jq
    - FILE_PATH=$(jq '.file_path' $FILE_NAME)

run_pipeline:
  extends: .get_path
  variables:
    PATH: $FILE_PATH
  trigger:
    project: my/project
    branch: staging
    strategy: depend
I can't seem to find any workaround for this, since using extends won't work: GitLab won't allow a script section in a trigger job.
I thought about using the GitLab API trigger method, but I want the status of the downstream pipeline to actually show up in the pipeline UI, and I want the upstream pipeline to depend on the status of the downstream pipeline, which from my understanding is not possible when triggering via the API.
Any advice would be appreciated. Thanks!
You can use artifacts:reports:dotenv for setting variables dynamically for subsequent jobs.
stages:
  - one
  - two

my_job:
  stage: "one"
  script:
    - FILE_PATH=$(jq '.file_path' $FILE_NAME)         # extract the path from the JSON file
    - echo "FILE_PATH=${FILE_PATH}" >> variables.env  # add the value to a dotenv file
  artifacts:
    reports:
      dotenv: "variables.env"

example:
  stage: two
  script: "echo $FILE_PATH"

another_job:
  stage: two
  trigger:
    project: my/project
    branch: staging
    strategy: depend
Variables in the dotenv file will automatically be present for jobs in subsequent stages (or that declare needs: for the job).
You can also pull artifacts into child pipelines, in general.
But be warned: you probably don't want to override the PATH variable, since that's a special variable the shell uses to locate executables.
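To get the value into the downstream pipeline itself, it should also be possible to expand the dotenv variable in the trigger job's variables: block, which GitLab forwards downstream. A sketch, assuming a reasonably recent GitLab (I haven't pinned down exactly when trigger jobs gained dotenv support); DOWNSTREAM_FILE_PATH is a made-up name:

forward_job:
  stage: two
  needs:
    - job: my_job
      artifacts: true   # makes the dotenv variables available in this trigger job
  variables:
    DOWNSTREAM_FILE_PATH: $FILE_PATH   # expanded from my_job's dotenv report, then passed downstream
  trigger:
    project: my/project
    branch: staging
    strategy: depend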

How to configure condition based pipeline | Azure Pipeline

I have come across a scenario where I want to build source code depending on the source directory.
I have two languages in the same git repository (dotnet & Python).
I want to build the source code using a single Azure Pipeline:
If the commit touches both (dotnet & Python), all the tasks should get executed,
and if the commit touches only a specific directory, only the respective language should get built.
Please let me know how I can achieve this using condition, or if there are any other alternatives.
Below is my azure-pipelines.yml
#Trigger and pool name configuration
variables:
  name: files

steps:
  - script: $(files) = git diff-tree --no-commit-id --name-only -r $(Build.SourceVersion)
    displayName: 'display Last Committed Files'
    ## Here I am getting changed files
    ## Dotnet/Server.cs
    ## Py/Hello.py

  - task: PythonScript@0  ## It should only get called when there are changes in /Py
    inputs:
      scriptSource: 'inline'
      script: 'print(''Hello, FromPython!'')'
    condition: eq('${{ variables.files }}', '/Py')

  - task: DotNetCoreCLI@2  ## It should only get called when there are changes in /Dotnet
    inputs:
      command: 'build'
      projects: '**/*.csproj'
    condition: eq('${{ variables.files }}', '/Dotnet')
Any help will be appreciated
I don't think it's possible to do directly what you want: ${{ variables.files }} is a template expression, expanded when the pipeline is compiled, before any task executes. So if you set pipeline variables in any particular task, even the first one, it's already too late for that syntax; the values have to be produced and tested at runtime.
If you really want to do this, you probably have to go scripting all the way. So you set the variables in the first script using the ##vso[task.setvariable] logging-command syntax:
(if there are C# files) echo '##vso[task.setvariable variable=DotNet]true'
(if there are Py files) echo '##vso[task.setvariable variable=Python]true'
And then in other scripts you evaluate them like:
if $(DotNet) = 'true' then dotnet build
Something along these lines. That will probably be quite fiddly, so maybe it would make sense to reconsider the flow at some higher level, but without extra context it's hard to say.
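A sketch of that approach, assuming the Dotnet/ and Py/ layout from the question and a Linux agent. As it happens, conditions written with the runtime variables['Name'] syntax (as opposed to ${{ }}) are evaluated just before each step runs, so they can also consume variables set by an earlier step, which keeps the tasks themselves unchanged:

steps:
  - script: |
      CHANGED=$(git diff-tree --no-commit-id --name-only -r $(Build.SourceVersion))
      echo "Changed files: $CHANGED"
      # set runtime variables for later steps via logging commands
      if echo "$CHANGED" | grep -q '^Dotnet/'; then
        echo '##vso[task.setvariable variable=DotNet]true'
      fi
      if echo "$CHANGED" | grep -q '^Py/'; then
        echo '##vso[task.setvariable variable=Python]true'
      fi
    displayName: 'Detect changed directories'

  - task: PythonScript@0
    inputs:
      scriptSource: 'inline'
      script: 'print(''Hello, FromPython!'')'
    condition: eq(variables['Python'], 'true')   # runtime condition, not ${{ }}

  - task: DotNetCoreCLI@2
    inputs:
      command: 'build'
      projects: '**/*.csproj'
    condition: eq(variables['DotNet'], 'true')   # runtime condition, not ${{ }}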

Get gitlab parent project details in child project

I am using the two GitLab repositories below:
Parent GitLab repo - application code, for example an Angular application
Child GitLab repo - for the GitLab pipeline; has only a gitlab-ci.yml file containing the script to run the pipeline
I am including the child project's gitlab-ci.yml file from the parent using the steps below.
Parent GitLab repo - gitlab-ci.yml file

include:
  - project: 'my-group/child-project'
    ref: master
    file: '/templates/.gitlab-ci-template.yml'

Child project - gitlab-ci.yml file

stages:
  - test
  - build

before_script:
  - export PARENT_PROJECT_NAME=?
  - export PARENT_PROJECT_PIPELINE_ID=?
  - export PARENT_PROJECT_BRANCH_NAME=?

job 1:
  stage: test
  script:
    - echo "Running test for project ${PARENT_PROJECT_NAME}"
    - node_modules/.bin/ng test

release_job:
  stage: build
  script: node_modules/.bin/ng build --prod
  artifacts:
    name: "project-$CI_COMMIT_REF_NAME"
    paths:
      - dist/
  only:
    - tags
How can I get the parent repo's details, like the parent project name, pipeline id and branch name, in the child project which is running the pipeline?
One way is to define the variables in the parent project and use them in the child project, but is there any other way to directly access the parent project's details in the child project?
In your example, include is not the same as trigger. include just merges all the files together into one giant pipeline, so you should be able to access any variables you want from the included files, as long as the variable's scope is correct.
If you are actually looking to pass details to a child pipeline from a parent pipeline, you could add a job that exports the variables and details you want to a dotenv file, then have the child pipeline access that dotenv. This keeps the code dynamic instead of hard-coding the variables and directly passing them to the child pipelines:
export-parent-details:
  script:
    - echo "PARENT_PROJECT_NAME=?" >> build.env
    - echo "PARENT_PROJECT_PIPELINE_ID=?" >> build.env
    - echo "PARENT_PROJECT_BRANCH_NAME=?" >> build.env
  artifacts:
    reports:
      dotenv: build.env

trigger-child:
  stage: docker_hub
  trigger:
    include:
      - project: 'my-group/child-project'
        ref: master
        file: '/templates/.gitlab-ci-template.yml'
  # use this variable in the child pipeline to download artifacts from the parent pipeline
  variables:
    PARENT_PIPELINE_ID: $CI_PIPELINE_ID
Then, inside the child pipeline's jobs, you can access the parent's artifacts:
child-job:
  needs:
    - pipeline: $PARENT_PIPELINE_ID
      job: export-parent-details
  script:
    - echo $PARENT_PROJECT_NAME
See
https://docs.gitlab.com/ee/ci/yaml/README.html#artifact-downloads-to-child-pipelines
https://docs.gitlab.com/ee/ci/multi_project_pipelines.html#pass-cicd-variables-to-a-downstream-pipeline-by-using-variable-inheritance
Another option could be to make an API call to get the parent project's details, since the runners have a read-only token under $CI_JOB_TOKEN. This method depends on repo access privileges and the details you want:
curl --header "JOB-TOKEN: $CI_JOB_TOKEN" "https://gitlab.com/api/v4/{whatever the api call is}"
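For instance, one concrete call might be the projects endpoint (the endpoint itself is real; whether a job token is allowed to read it depends on your GitLab version and the project's job-token permissions, so this is only a guess at a usable call):

curl --header "JOB-TOKEN: $CI_JOB_TOKEN" "https://gitlab.com/api/v4/projects/$CI_PROJECT_ID"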
Since you’re including the child project’s configuration instead of triggering it, the two pipeline definition files are merged and become one before the pipeline starts, so there’s be no practical difference between this method and having the content of the child project’s definition in the parent’s.
Because of this, all of the predefined variables will be based on the parent project if the pipeline runs there. For example, variables like $CI_COMMIT_REF_NAME, $CI_PROJECT_NAME will point to the parent project and the parent project's branches.
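So, with the include approach from the question, the child template can simply read GitLab's predefined variables; they already describe the parent. A minimal sketch of the template's before_script, filling in the question's placeholders with predefined variables:

before_script:
  # predefined variables describe the project the pipeline actually runs in
  - export PARENT_PROJECT_NAME=$CI_PROJECT_NAME
  - export PARENT_PROJECT_PIPELINE_ID=$CI_PIPELINE_ID
  - export PARENT_PROJECT_BRANCH_NAME=$CI_COMMIT_REF_NAME
  - echo "Running for ${PARENT_PROJECT_NAME} (pipeline ${PARENT_PROJECT_PIPELINE_ID}) on branch ${PARENT_PROJECT_BRANCH_NAME}"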
