GitLab pipeline: variables vs include conflict

I would like to implement a manually triggered pipeline where the user must specify the value of 2 variables (from a given set of values). Since GitLab does not support a combo box yet, I found the following workaround:
variables:
  DEPLOYMENT:
    value: "NONE/env1/env2/env3"
    description: "Choose deployment location"
  BUILDMODE:
    value: "package/install"
    description: "Available options - package (local build, no deployment to nexus) or install (local build with deployment to nexus)"
This works as expected and the variables are displayed before launching the pipeline, so the user can edit their values and choose a single option.
Unfortunately, the variables are no longer shown when the following import is added to the .gitlab-ci.yml file:
include:
  - project: "path_to_my_project"
    ref: $CI_COMMIT_BRANCH
    file:
      - "pipeline/scripts.yaml"
Do you have any idea why the import changes the behaviour of the variables above? (If I comment out these lines, the variables are shown as expected.)

Related

How do I label pipelines in GitLab?

How do I add a label to the GitLab pipelines when they run?
This would be extremely helpful when you run a few nightly (scheduled) pipelines for different configurations on the main branch. For example, we run a nightly main-branch pipeline with several submodules, each pinned to a point in their development (a commit SHA), and I want to label that 'MAIN'. We run a second pipeline that I want to label 'HEADs', which pulls the HEADs of all the submodules to see whether changes will break the main trunk when they are merged in.
Currently it shows:
Last commit message.
Pipeline #
commit SHA
Branch name
'Scheduled'
That is helpful, but it is very difficult to tell them apart because only the pipeline # changes between the pipelines.
I have good news!!
Our friends at GitLab have been working on this feature. There is now a way to label your pipeline in release 15.5.1-ee.0!
It uses the workflow keyword with the new sub-keyword name:
workflow:
  name: 'Pipeline for branch: $CI_COMMIT_BRANCH'
You can even use the workflow:rules pair to have different names for your pipeline:
variables:
  PIPELINE_NAME: 'Default pipeline name'

workflow:
  name: '$PIPELINE_NAME'
  rules:
    - if: '$CI_PIPELINE_SOURCE == "merge_request_event"'
      variables:
        PIPELINE_NAME: 'MR pipeline: $CI_COMMIT_BRANCH'
    - if: '$CI_MERGE_REQUEST_LABELS =~ /pipeline:run-in-ruby3/'
      variables:
        PIPELINE_NAME: 'Ruby 3 pipeline'
Find the docs here: https://docs.gitlab.com/ee/ci/yaml/#workflow
This feature is disabled by default in 15.5 because it is so new.
You can enable the feature flag, which is named pipeline_name.
See this link to enable: https://docs.gitlab.com/ee/administration/feature_flags.html
(You need to use the Rails Console to enable it. Pretty easy.)
Note: Remember that the workflow keyword affects the entire pipeline instance.
This seems to be officially supported with GitLab 15.7 (December 2022)
Add custom names to pipelines with workflow:name:
For some projects, the same pipeline can be configured to run differently for different variables or conditions, creating very distinct outcomes for successful pipelines.
It can be hard for you to determine which version of that pipeline ran since there is no indication about the inputs used for that particular run.
While labels like scheduled and API help, it is sometimes still difficult to identify specific pipelines.
Now you can set a pipeline name using the keyword workflow:name to better identify the pipeline with a string, a CI/CD variable, or a combination of both.
See Documentation and Issue.
Note:
If the name is an empty string, the pipeline is not assigned a name.
A name consisting of only CI/CD variables could evaluate to an empty string if all the variables are also empty.
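To tie this back to the nightly MAIN/HEADs scenario from the question, here is a minimal sketch. It assumes you define a CI/CD variable (called NIGHTLY_CONFIG here, a made-up name) in each pipeline schedule, set to MAIN or HEADs:
# Sketch only: NIGHTLY_CONFIG is assumed to be set per pipeline schedule (e.g. MAIN or HEADs)
workflow:
  name: 'Nightly build: $NIGHTLY_CONFIG'
  rules:
    - if: '$CI_PIPELINE_SOURCE == "schedule"'
    - if: '$CI_PIPELINE_SOURCE != "schedule"'
      variables:
        NIGHTLY_CONFIG: 'manual'   # non-scheduled runs fall back to a generic label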

How to inherit GitLab variables across projects?

How can I have GitLab fill a global variable from a CI/CD secret, and then inherit this global variable in other projects?
templates/commons.yml:
variables:
  TEST_VAR: $FILLED_FROM_SECRETS
project/.gitlab-ci.yml:
include:
  - project: '/templates'
    ref: master
    file:
      - 'commons.yml'

test:
  stage: test
  script:
    - echo $TEST_VAR
Result: the variable is never set. Why?
(Of course the FILLED_FROM_SECRETS variable is set in the commons project.)
The problem you have is that include: only brings in the contents of the YAML file, not the project-level settings or variables.
As possible alternatives, you can:
Set the variable in the template directly (not recommended for sensitive values)
Set variables on your own self-hosted runners (note that variables cannot be masked this way)
Set instance CI/CD variables
Set a required CI configuration to forcibly include a template in all projects (that template can include the variables you need) (note that variables cannot be masked this way)
Set group CI/CD variables (where all your projects live under a common group)
Retrieve your secrets using the Vault integration or as part of your job script (see the sketch after this list)
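As a minimal sketch of the Vault route (assuming the HashiCorp Vault integration is already configured for the project and that the secret lives at shared/test_var in a kv-v2 engine, both of which are made up for this example):
# Sketch only: the Vault path, field name, and engine mount below are assumptions
test:
  stage: test
  secrets:
    TEST_VAR:
      vault: shared/test_var/value@kv-v2   # <path>/<field>@<engine mount>
  script:
    - echo "By default the secret is provided as a file, so TEST_VAR holds that file's path"
Note that secrets: injects the value as a file by default, so $TEST_VAR contains a path rather than the secret itself.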
With the include keyword, the included files are merged with your .gitlab-ci.yml, and then your .gitlab-ci.yml is executed in the repository where the pipeline is triggered. Therefore, only global variables in this repository or variables inherited from any parent groups are known. That's why TEST_VAR is not substituted with the value from the secret: the variable is defined in another repository.
According to the docs, the syntax you used requires you to provide the full path of your project (everything after gitlab.com/, i.e. group/project).
Assuming your project path is gitlab.com/group/my_project, choose one of the following:
include:
  - project: 'group/my_project'
    ref: master
    file:
      - 'templates/commons.yml'
  # or simply, if the 'templates' folder lives in the same project as your gitlab-ci.yml file
  - '/templates/commons.yml'

test:
  stage: test
  script:
    - echo $TEST_VAR
I have personally used both ways in my work, but the docs show other ways to implement this that you can have a look at.

Object reference not set to an instance of an object in Azure Pipelines

I'm trying to import variable groups depending on the current branch in an Azure build pipeline, but I get this error: 'Object reference not set to an instance of an object'.
I have simplified the case and I get this error when I have both of the lines (condition and import) in my .yaml file:
variables:
  ${{ if eq('bbb', 'master') }}:
  - group: variables-shared
If I remove the condition, everything works as expected. If I remove the group import, I get other errors related to the undefined variable below (that is expected).
I am interested in why I get this error.
I also had this exact issue. The reasoning in the currently accepted answer is not correct, and you can in fact use conditional insertion in your azure-pipelines.yml file to conditionally include a particular variable group.
When the docs say that conditional insertion requires the use of template syntax, they're referring to template expression syntax, not the use of templates directly. Per the previous link, template expression syntax is the ${{ }} syntax for expanding expressions.
As gleaned from this example from the docs, the problem with the example in the question is actually a syntax error.
Incorrect:
variables:
  ${{ if eq('bbb', 'master') }}:
  - group: variables-shared
Correct:
variables:
- ${{ if eq('bbb', 'master') }}:
  - group: variables-shared
Note the leading - before the $ char on the second line. I've tested this in my own pipelines and it works, although it does freak out the YAML syntax checker in my IDE.
I actually discovered a hack yesterday for debugging this obnoxiously vague and wide-ranging error message.
For the build that fails with this message, if you click "Run new" and try to run the job manually by clicking "Run", it will typically give you a much more specific error message at that point.
For instance:
If I remove the condition, everything works as expected.
I am interested in why I get this error.
Check the Expressions doc and you will find this: Conditionals only work when using template syntax.
That's why you cannot use a condition for your variable group.
The workaround is to use a template to store your variables rather than a variable group.
Please refer to Variable reuse:
# File: vars.yml
variables:
  favoriteVeggie: 'brussels sprouts'

# File: azure-pipelines.yml
variables:
- template: vars.yml # Template reference

steps:
- script: echo My favorite vegetable is ${{ variables.favoriteVeggie }}.
I found two other potential causes of "An error occurred while loading the yaml build pipeline. Object reference not set to an instance of an object.":
"stages:" was missing in a stage template before the stage
A template file reference didn't exist (- template: 'template.yml')
I also had an 'object reference not set to an instance of an object' error, with code 606802. The build pipeline itself had no errors at all.
The error was caused by a pre-validation build where one parameter had no default value.
After adding the default value, the PR validation build succeeded.
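As an illustration of that last fix, a hypothetical parameter (the name environment is made up here) before and after adding a default:
# Before: no default value, so a PR pre-validation build cannot resolve the parameter
# parameters:
# - name: environment
#   type: string

# After: with a default, the validation build succeeds
parameters:
- name: environment
  type: string
  default: 'dev'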

Merge inner parameter struct when using a template - Azure Pipelines

I have a lot of default parameters in my template. I want to categorize them.
# template.yml
parameters:
  azure:
    name: cargo_test          # Default job name
    displayName: Cargo test   # Default displayName
    condition: true           # Job condition
    strategy:                 # Default strategy to test on Windows, MacOS and Linux.
      matrix:
        Linux:
          vmImage: ubuntu-16.04
        MacOS:
          vmImage: macOS-10.13
        Windows:
          vmImage: vs2017-win2016
  name: job_name
  default_parameter1: default1
  default_parameter2: default2
# rest of code
- job: A
  template: template.yml
  parameters:
    azure:
      name: test_name
This causes parameters.azure to contain only the single field name. I want to overwrite parameters.azure.name, not the whole parameters.azure struct. Is this possible in Azure Pipelines?
I want to overwrite azure.name, not the whole azure struct.
It seems you are worried that if you overwrite just one parameter in a .yml file that uses template.yml, it will affect the whole azure struct, right?
If so, you don't need to worry about this. template.yml defines lots of parameters; when you use it in another .yml file with name: test_name, it only overwrites the value of the parameter name, with no effect on the other parameters, and the overwrite only applies to the current job.
For example, in your use-template.yml:
- job: A
  template: template.yml
  parameters:
    azure:
      name: test_name

- job: B
  template: template.yml
  parameters:
    azure:
      condition: failed()
Overwriting name only affects that parameter's value in job A. After job A finishes, name is back to cargo_test for job B.
In short, the configuration in template.yml is fixed; using it in other yml files does not affect template.yml itself. So you don't need to worry about how to categorize parameters, which is not supported at the moment.
You can check a simple example in the official doc: Job templates. If I have misunderstood your idea, feel free to correct me.
Updated:
Since we can get the value with parameters.azure.name, Azure DevOps should support categorizing parameters this way. Also, after testing, I got the same result as you: if you overwrite parameters.azure.name, the remaining parameters at the same level as parameters.azure.name are all empty. I think this is an issue that our product group needs to fix.
I have raised this issue on our official Developer Community: When I overwrite the template parameter, the value is empty. You can follow that ticket to get a notification once there is any update.
In addition, there seems to be no other workaround to achieve this parameter categorization; please be patient while the issue is fixed. Once a fix is released, our engineers will announce it in that ticket.
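Until that is addressed, the only workaround sketch I can offer (my assumption, not an official fix) is to re-state the entire nested mapping whenever you override any part of it, since the nested object is replaced rather than merged:
# use-template.yml (sketch): repeat every azure.* field you still need,
# because the nested mapping passed here replaces the template's defaults wholesale
jobs:
- template: template.yml
  parameters:
    azure:
      name: test_name            # the one value you actually wanted to change
      displayName: Cargo test    # remaining fields copied from the template defaults
      condition: true
      strategy:
        matrix:
          Linux:
            vmImage: ubuntu-16.04
          MacOS:
            vmImage: macOS-10.13
          Windows:
            vmImage: vs2017-win2016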

Is it possible to use a variable in the ref property of resources:repository for Azure DevOps YAML?

I have two Azure DevOps Git branches:
master
feature/mybranch
I have a multi-stage build pipeline defined in yaml, where some of the steps are templated into separate .yml files.
In my outer azure-pipelines.yml I reference a repository where my template .yml's live:
resources:
  repositories:
  - repository: templates
    type: git
    name: MyProject/MyRepo
When I'm building in the 'master' branch everything is good, as by default the repository will look in refs/heads/master.
When I'm working in the feature branch and I want to test experimental changes to my template .yml files, I don't want the pipeline to fetch them from the master branch; I want it to use the files from the branch I am working on.
The following works and allows me to do this:
resources:
  repositories:
  - repository: templates
    type: git
    name: MyProject/MyRepo
    ref: refs/heads/feature/mybranch
However, when I merge this back to master, I obviously don't want 'ref:' still pointing at the feature branch, so I'd like to generate the value of 'ref:' dynamically with a variable.
I've tried using ref: $(Build.SourceBranch) where $(Build.SourceBranch) should expand to 'refs/heads/feature/mybranch'
But it doesn't work. Error:
62638: "/azure-pipelines.yml: Could not get the latest source version for repository MySolution hosted on Azure Repos using ref refs/heads/$(Build.SourceBranch)."
Instead of referencing the repo in resources, use an inline checkout as described here:
https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/multi-repo-checkout?view=azure-devops#checking-out-a-specific-ref
- checkout: git://MyProject/MyRepo#features/tools
And this YAML element allows the use of template expressions with variables or parameters, e.g.
- checkout: git://${{ variables.repoName}}#${{ variables.branchRef }}
OR
- checkout: git://${{ parameters.repoName}}#${{ parameters.branchRef }}
And you can change that dynamically
Or the other alternative is use script task as below
- script: |
# note checkout: git://$(System.TeamProject)/${{ parameters.repoName }}#${{ parameters.repoRef }} this does not work if this task is run multiple times in same pipeline
# see here for more details :https://developercommunity.visualstudio.com/t/checkout-fails-with-an-error-occurred-while-loadin/1005991#T-N1270459
repoDir=$(Agent.BuildDirectory)/${{ parameters.repoName }}
/bin/rm -rf $repoDir
url_with_token=$(echo $(System.CollectionUri) | sed -e "s/https\:\/\//https\:\/\/$(System.AccessToken)\#/g")
git clone $url_with_token/$(System.TeamProject)/_git/${{ parameters.repoName }} $repoDir
cd $repoDir
git fetch origin '${{ parameters.repoRef }}':'localBranch'
git checkout localBranch
name: clone_script
displayName: Checkout using script ${{ parameters.repoName }}#${{ parameters.repoRef }}
I have added the above as a template, along with its usage, in a gist to make it easy to re-use.
Hope that helps.
https://gist.github.com/scorpionlion/1773d08b62ca5875cc2fd6dcdd0394d2
Is it possible to use a variable in the ref property of resources:repository for Azure DevOps YAML?
For this question, the answer is Yes, it's possible.
As for why you receive that error message: the variable you used, $(Build.SourceBranch), is the wrong one. You should use $(Build.SourceBranchName).
Normally, for ref, we would enter master or any other feature branch, such as:
ref: refs/heads/master
This may make you think it is the same as the value of $(Build.SourceBranch). It looks the same, I know, but it is different. In fact, the server reads the exact branch name, not the branch path, which we can clearly see with the classic editor type.
Looking at the classic editor type, we can see that we should enter the exact branch name here.
So, as the predefined variables are defined, the value of $(Build.SourceBranch) is the branch path, while $(Build.SourceBranchName) represents the exact branch name.
So, if you want it to execute successfully, you need to use $(Build.SourceBranchName). It worked on my side.
Hope this also helps you stay away from the error message.
Edit:
The complete script that worked for me is:
resources:
  repositories:
  - repository: templates
    type: git
    name: MyApp/MyconApp
    ref: $(Build.SourceBranchName)
The Azure docs state:
Variables can't be used to define a repository in a YAML statement.
So that seems to place some limitations on what you can do here. Perhaps there is a workaround that still allows you to do what you want.
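One possible workaround sketch (my assumption, not something the quoted docs promise): runtime variables such as $(Build.SourceBranch) cannot be used here, but compile-time template expressions are expanded before the repository resource is resolved, so a pipeline parameter (templatesRef is a made-up name) can drive ref:
# Sketch: 'templatesRef' is a hypothetical pipeline parameter; ${{ }} expressions
# are expanded at compile time, before the repository resource is resolved
parameters:
- name: templatesRef
  type: string
  default: refs/heads/master

resources:
  repositories:
  - repository: templates
    type: git
    name: MyProject/MyRepo
    ref: ${{ parameters.templatesRef }}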
