Is it possible to include / extend anchor from another .gitlab-ci.yml file? - gitlab

I am trying to reuse an anchor from one YAML file in another YAML file.
parent-gitlab-ci.yml
.basic_check1: &basic_check1
  script:
    - echo "basic check1"

.basic_check2: &basic_check2
  script:
    - echo "basic check2"
child-gitlab-ci.yml
include:
  - local: 'parent-gitlab-ci.yml'

stages:
  - test

job1:
  stage: test
  script:
    - *basic_check1
    - *basic_check2
But I am getting the error 'This GitLab CI configuration is invalid: Unknown alias: basic_check1'. Is there any solution for this? I want to include the script from both anchors. I tried using a template job with before_script as a hack, but it doesn't solve my problem because I can't have two before_script sections in a job to mimic the two anchors above.
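For what it's worth, newer GitLab versions (13.9 and later) provide the !reference custom YAML tag, which is designed to reuse configuration from hidden jobs across included files, unlike plain YAML anchors. A minimal sketch of the child file using it, assuming the same parent-gitlab-ci.yml as above:
include:
  - local: 'parent-gitlab-ci.yml'

stages:
  - test

job1:
  stage: test
  script:
    # pull only the script lists out of the hidden jobs defined in the parent file
    - !reference [.basic_check1, script]
    - !reference [.basic_check2, script]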

Related

How to dynamically set parallel:matrix in gitlab yaml?

I have a .NET solution with multiple projects, and for each project I have a separate test project. Currently, whenever I add a new project, I add a separate test project for it and need to manually add a new test to the pipeline's test step.
I want to write a test step that runs all test projects in parallel, but without me having to manually add a new test. Recently, I discovered GitLab has a parallel:matrix keyword, which seems like a step in the right direction. I am already working on using it instead of separate implementations of a reusable script, but if possible I also want to dynamically find the tests in my test folder.
Current re-usable test script:
.test: &test
  allow_failure: false
  dependencies:
    - build
  image: mcr.microsoft.com/dotnet/sdk:6.0
  script:
    - echo ${TEST_NAME}
    - echo ${RESULT_FILE_NAME}
    - dotnet test --no-restore ./Tests/${TEST_NAME} -l "JUnit;LogFilePath=../../TestResults/${RESULT_FILE_NAME}.xml"
Example implementation:
Test1:
  <<: *test
  stage: test
  variables:
    TEST_NAME: "test1"
    RESULT_FILE_NAME: "test1_results"
  artifacts:
    paths:
      - ./TestResults/
What I'm trying to achieve:
test:
  stage: test
  dependencies:
    - build
  image: mcr.microsoft.com/dotnet/sdk:6.0
  before_script:
    - TEST_NAMES=["test1", "test2"] # Want to find these dynamically
  script:
    - ls
    - echo ${TEST_NAME}
    - echo ${RESULT_FILE_NAME}
    - dotnet test --no-restore ./Tests/${TEST_NAME} -l "JUnit;LogFilePath=../../TestResults/${TEST_NAME}.xml"
  parallel:
    matrix:
      - TEST_NAME: TEST_NAMES
In my current test step (added as exp_test until it can fully replace test), I'm expecting 2 parallel tests to run, but instead only 1 runs, named after the variable itself rather than using the variable as an array.
I found one answer here that suggests dynamically creating a child pipeline YAML, but I want to see if it's possible to use parallel:matrix for this.
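For context, parallel:matrix only expands literal value lists written in the YAML itself; a variable whose value happens to be an array is treated as a single string, which is why only one job runs. A minimal sketch of the static form, with test1 and test2 as placeholder project names:
test:
  stage: test
  dependencies:
    - build
  image: mcr.microsoft.com/dotnet/sdk:6.0
  script:
    - dotnet test --no-restore ./Tests/${TEST_NAME} -l "JUnit;LogFilePath=../../TestResults/${TEST_NAME}.xml"
  parallel:
    matrix:
      # each value in the list becomes its own parallel job
      - TEST_NAME: ["test1", "test2"]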

GitLab Templating: Multiple before_script blocks in default?

This doesn't work correctly:
template1.yml
default:
  before_script:
    - echo "hello from one"
template2.yml
default:
  before_script:
    - echo "hello from two"
.gitlab-ci.yml (the actual pipeline)
include:
  - project: 'templates'
    file:
      - 'template1.yml'
      - 'template2.yml'

build1:
  stage: build
  script:
    - echo "hello from the pipeline"
output (note the output from template1 is missing):
hello from two
hello from the pipeline
I understand that GitLab may believe there is 'ambiguity' in the ordering when merging blocks into before_script, but it could use the order specified in the include section of the pipeline.
Does anyone know of a way that I can arbitrarily include templates in my pipeline in such a way that they can contribute to the default 'before_script' block?
It all must run sequentially in the same container (be in the same 'stage')
For example, using YAML anchors requires us to specify the names of the anchors rather than just the file the anchors live in, so that doesn't work...
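For what it's worth, the !reference tag (GitLab 13.9+) can stitch script fragments from several included templates into one job, though it still requires naming each fragment rather than just the file. A rough sketch, assuming each template exposes its commands through a hidden job (.one and .two are made-up names here):
template1.yml
.one:
  before_script:
    - echo "hello from one"
.gitlab-ci.yml
include:
  - project: 'templates'
    file:
      - 'template1.yml'
      - 'template2.yml'

build1:
  stage: build
  before_script:
    # the fragments run in the order listed here
    - !reference [.one, before_script]
    - !reference [.two, before_script]
  script:
    - echo "hello from the pipeline"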

How to use the project namespace in the environment URL of GitLab CI?

I'd like to parameterize the environment URL of one of my GitLab CI job in a project which belongs to a subgroup.
If for example I have:
CI_PROJECT_PATH = mygroup/mysubgroup/myproject
CI_PROJECT_NAMESPACE = mygroup/mysubgroup
CI_PROJECT_NAME = myproject
I'd like to have the URL to be something like https://mygroup.gitlab.com/-/mysubgroup/myproject/-/jobs/12345/artifacts/public/index.html.
But I cannot find a way to do this since there is no predefined variable for the "sub-namespace" (here mysubgroup) and there is no variable substitution in environment.url as far as I can see.
I tried things like this in my gitlab-ci.yml:
build:
  stage: build
  image: bash:latest
  script:
    - export # print the available variables
  environment:
    name: test
    url: ${CI_SERVER_PROTOCOL}://${CI_PROJECT_ROOT_NAMESPACE}.${CI_PAGES_DOMAIN}/-/${CI_PROJECT_PATH#${CI_PROJECT_ROOT_NAMESPACE}/}/-/jobs/$CI_JOB_ID/artifacts/public/index.html
but the result is https://mygroup.gitlab.com/-/${CI_PROJECT_PATH#myproject/}/-/jobs/12345/artifacts/public/index.html.
References:
https://stackoverflow.com/a/58402821/1064669
https://gitlab.com/gitlab-org/gitlab/-/issues/350902
job:environment:url is processed directly by GitLab, which does not support this kind of parameter expansion.
You'll need to use bash or a similar shell for this to work as intended:
build:
  stage: build
  image: bash:latest
  script:
    - echo "DEPLOY_URL=${CI_SERVER_PROTOCOL}://${CI_PROJECT_ROOT_NAMESPACE}.${CI_PAGES_DOMAIN}/-/${CI_PROJECT_PATH#${CI_PROJECT_ROOT_NAMESPACE}/}/-/jobs/$CI_JOB_ID/artifacts/public/index.html" > deploy.env
    - cat deploy.env
  artifacts:
    reports:
      dotenv: deploy.env
  environment:
    name: test
    url: $DEPLOY_URL
Based on the example given here: https://docs.gitlab.com/ee/ci/environments/#example-of-setting-dynamic-environment-urls
I think you need to escape the inner curly braces like this: (${CI_PROJECT_ROOT_NAMESPACE} should become $`{CI_PROJECT_ROOT_NAMESPACE`})
So the whole thing becomes:
url: ${CI_SERVER_PROTOCOL}://${CI_PROJECT_ROOT_NAMESPACE}.${CI_PAGES_DOMAIN}/-/${CI_PROJECT_PATH#$`{CI_PROJECT_ROOT_NAMESPACE`}/}/-/jobs/$CI_JOB_ID/artifacts/public/index.html

Is there any way to dynamically edit a variable in one job and then pass it to a trigger/bridge job in Gitlab CI?

I need to pass a file path to a trigger job where the file path is found within a specified json file in a separate job. Something along the lines of this...
stages:
  - run_downstream_pipeline

variables:
  FILE_NAME: default_file.json

.get_path:
  stage: run_downstream_pipeline
  needs: []
  only:
    - schedules
    - triggers
    - web
  script:
    - apt-get install jq
    - FILE_PATH=$(jq '.file_path' $FILE_NAME)

run_pipeline:
  extends: .get_path
  variables:
    PATH: $FILE_PATH
  trigger:
    project: my/project
    branch: staging
    strategy: depend
I can't seem to find any workaround for this, since using extends won't work: GitLab won't allow a script section in a trigger job.
I thought about using the GitLab API trigger method, but I want the status of the downstream pipeline to actually show up in the pipeline UI, and I want the upstream pipeline to depend on the status of the downstream pipeline, which from my understanding is not possible when triggering via the API.
Any advice would be appreciated. Thanks!
You can use artifacts:reports:dotenv for setting variables dynamically for subsequent jobs.
stages:
  - one
  - two

my_job:
  stage: "one"
  script:
    - FILE_PATH=$(jq '.file_path' $FILE_NAME)        # Compute the value in the script.
    - echo "FILE_PATH=${FILE_PATH}" >> variables.env # Add the value to a dotenv file.
  artifacts:
    reports:
      dotenv: "variables.env"

example:
  stage: two
  script: "echo $FILE_PATH"

another_job:
  stage: two
  trigger:
    project: my/project
    branch: staging
    strategy: depend
Variables in the dotenv file will automatically be present for jobs in subsequent stages (or that declare needs: for the job).
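As an aside, a job can also opt in explicitly with needs, which pulls the dotenv artifact regardless of stage layout; a minimal sketch reusing the jobs above (uses_needs is a made-up job name):
uses_needs:
  stage: two
  needs:
    - job: my_job
      artifacts: true # fetch the dotenv report so FILE_PATH is available
  script: "echo $FILE_PATH"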
You can also pull artifacts into child pipelines, in general.
But be warned: you probably don't want to override the PATH variable, since that's a special variable the shell uses to locate executables.

Moving scripts into a separate file in gitlab-ci.yml to avoid code duplication and including it from several files

I am trying to set up CI with minimal code duplication using .gitlab-ci.yml.
To do that, I am separating the configuration into separate files and reusing the parts that are common.
I have a separate repository with GitLab CI settings, gitlab-ci, and several projects that use it to form their own CI pipelines.
Contents of gitlab-ci repository
template_jobs.yml:
.sample:
  rules:
    - if: '$CI_PIPELINE_SOURCE == "push"'
      when: on_success
    - when: never
jobs_architectureA.yml:
include:
  - local: '/template_jobs.yml'

.script_core: &script_core
  - echo "Running stage"

test_archA:
  extends:
    - .sample
  stage: test
  tags:
    - architectureA
  script:
    - *script_core
jobs_architectureB.yml:
include:
  - local: '/template_jobs.yml'

.script_core: &script_core
  - echo "Running stage"

test_archB:
  extends:
    - .sample
  stage: test
  tags:
    - architectureB
  script:
    - *script_core
Project with code contents:
In the actual project (separate repositories per project, and I have a lot of them), I have the following:
.gitlab-ci.yml:
stages:
  - test

include:
  - project: 'gitlab-ci'
    file: '/jobs_architectureA.yml'
  - project: 'gitlab-ci'
    file: '/jobs_architectureB.yml'
This configuration works fine and allows including only some architectures for some modules while sharing rules between the job templates.
However, it's easy to notice one piece of duplicated code: both jobs_architectureA.yml and jobs_architectureB.yml contain a common section:
.script_core: &script_core
  - echo "Running stage"
It would be ideal to move it into a separate file, template_scripts.yml, and include it from both jobs_architectureA.yml and jobs_architectureB.yml. However, that results in invalid YAML (at least from GitLab's point of view).
From that, I conclude that I can share the rules because they are used via the extends keyword; however, I cannot do the same with the scripts, since they rely on the &/* anchoring mechanism at the YAML level.
Ideally, I want something along the lines of:
Contents of the ideal (conceptually) gitlab-ci repository
template_jobs.yml:
.sample:
  rules:
    - if: '$CI_PIPELINE_SOURCE == "push"'
      when: on_success
    - when: never
template_scripts.yml:
.script_core: &script_core
  - echo "Running stage"
jobs_architectureA.yml:
include:
  - local: '/template_jobs.yml'
  - local: '/template_scripts.yml'

test_archA:
  extends:
    - .sample
  stage: test
  tags:
    - architectureA
  script:
    - *script_core # this becomes invalid, as script_core is in the other file, even though it is included at the top
jobs_architectureB.yml:
include:
  - local: '/template_jobs.yml'
  - local: '/template_scripts.yml'

test_archB:
  extends:
    - .sample
  stage: test
  tags:
    - architectureB
  script:
    - *script_core # this becomes invalid, as script_core is in the other file, even though it is included at the top
Am I doing something wrong?
Am I hitting a limitation of the Gitlab mechanic? Is it the implementation of the include directive in this specific YML type, that limits me?
Do I have options to achieve something close to the desired behaviour?
Note: while this might not look like a big deal, in reality I have many more pieces to the scripts, and the actual script is much larger. Thus, the code is currently duplicated all over the place, which is very prone to mistakes.
My solution is to not include template_jobs.yml and template_scripts.yml directly in jobs_architectureA.yml, but only in the "final" .gitlab-ci.yml.
Taking your example, /template_jobs.yml and /template_scripts.yml do not change.
jobs_architectureA.yml loses the include:
test_archA:
  extends:
    - .sample
  stage: test
  tags:
    - architectureA
  script:
    - *script_core
and .gitlab-ci.yml becomes:
stages:
  - test

include:
  - local: '/template_jobs.yml'
  - local: '/template_scripts.yml'
  - project: 'gitlab-ci'
    file: '/jobs_architectureA.yml'
  - project: 'gitlab-ci'
    file: '/jobs_architectureB.yml'
in reality, I have many more pieces to the scripts, and the actual script is much larger
Adding to Cyril's solution, GitLab 13.12 (May 2021) can help scale those includes:
Support wildcards when including YAML CI/CD configuration files
The includes: keyword for CI/CD pipelines lets you break one long .gitlab-ci.yml file into multiple smaller files to increase readability.
It also makes it easier to reuse configuration in multiple places.
Frequently there are multiple files included into a single pipeline, and they all might be stored in the same place.
In this release, we add support to use the * wildcard with the local includes: keyword. You can now make your includes: sections more dynamic, less verbose, and easier to read; check out how we are dogfooding it in GitLab.
See Documentation and Issue.
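For example, a local wildcard include might look like this (assuming the reusable snippets are stored in a configs/ directory of the same repository):
include: 'configs/*.yml' # pulls in every YAML file under configs/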
The extends: keyword can be used with multiple jobs to extend from (on recent versions of GitLab). You can simply do the following:
include:
  - local: '/template_jobs.yml'

.script_core:
  script:
    - echo "Running stage"

test_archA:
  extends:
    - .sample
    - .script_core
  stage: test
  tags:
    - architectureA
And after that, you can move .script_core into an included file.
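A minimal sketch of that final layout, assuming the hidden job moves into template_scripts.yml in the gitlab-ci repository (extends, unlike anchors, works across included files):
template_scripts.yml
.script_core:
  script:
    - echo "Running stage"
jobs_architectureA.yml
include:
  - local: '/template_jobs.yml'
  - local: '/template_scripts.yml'

test_archA:
  extends:
    - .sample      # contributes the rules
    - .script_core # contributes the script
  stage: test
  tags:
    - architectureA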
