Azure Build.SourceVersionMessage returns Null on pipeline task level - azure

In my Azure pipeline I use the Build.SourceVersionMessage variable to get the last commit message. Based on that I want to decide whether to build a docker image (if the commit message contains 'BUILD-DOCKER', then build the docker image):
...
- task: Docker@0
  condition: and(succeeded(), contains(variables['Build.SourceVersionMessage'], 'BUILD-DOCKER'))
...
The problem is that during pipeline execution the commit message is null:
Evaluating: and(succeeded(), contains(variables['Build.SourceVersionMessage'], 'BUILD-DOCKER'))
Expanded: and(True, contains(Null, 'BUILD-DOCKER'))
Result: False
Any idea why it is null?
Additionally, other variables such as Build.SourceBranch are resolved properly.

You did not do anything wrong. Apologies: this is an issue that was caused on our side.
For design reasons based on security considerations, this variable was deleted from the system by us. Our team has prepared the fix (reverting this deletion) and the PR is in progress.
The deployment will be released as soon as possible. Once the release is finished, the variable will shortly be injected into the environment again.
Watch this ticket to get timely updates from our engineers.
In the meantime, you can use the script below to check which variables the system still provides:
- task: Bash@3
  inputs:
    targetType: 'inline'
    script: 'env | sort'
As a temporary workaround that would not affect your build process, please choose one of the variables from its output to use in your condition.
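Until the fix ships, another workaround is to read the message from git yourself and publish it as a runtime variable for later steps. A minimal sketch, assuming the job checks out the repository; the variable name CommitMessage is one I made up:

steps:
  - bash: |
      # Read the subject line of the last commit (logging commands are line-based,
      # so the multi-line body is deliberately left out)
      MSG=$(git log -1 --pretty=%s)
      # Publish it as a runtime variable named CommitMessage (hypothetical name)
      echo "##vso[task.setvariable variable=CommitMessage]$MSG"
    displayName: Capture commit message

  - task: Docker@0
    condition: and(succeeded(), contains(variables['CommitMessage'], 'BUILD-DOCKER'))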

Non-constant Variables in Gitlab Pipelines

Surely many of you have encountered this, and I would like to share my hacky solution. Essentially, during the CI/CD process of a GitLab pipeline, most parameters are passed through "variables". There are two issues I've encountered with that:
Those parameters cannot be altered in real time. Say I want to execute jobs based on information from previous jobs; that information would always need to be saved in the cache, as opposed to being written to CI/CD variables.
The execution of jobs is evaluated before the scripts run, so the "rules" will only ever apply to the original parameters. Trouble arises when those values are only available at runtime.
For complex pipelines one would want to pick and choose the tests automatically without having to respecify parameters every time. In my case I delivered a data product, and depending on its content, different steps had to be taken. How do we deal with these issues?
Changing parameters in real time:
https://docs.gitlab.com/ee/api/project_level_variables.html This API provides a way of interacting with CI/CD variables. It will not work for variables defined at the head of the YML files under the variables tag; rather, it is a way to access the "custom CI/CD variables" described at https://docs.gitlab.com/ee/ci/variables/#custom-cicd-variables. This way, custom variables can be created, altered, and deleted while a pipeline is running. The only thing needed is a PRIVATE-TOKEN that has access rights to the API (I believe my token has all the rights).
job:
  stage: example
  script:
    - 'curl --request PUT --header "PRIVATE-TOKEN: $ACCESS_TOKEN" "${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/variables/VARIABLE_NAME" --form "value=abc"'
On to the next problem: altering the variables won't let us actually control downstream jobs like this, because the "rules" block is evaluated before the pipeline actually runs. Hence it will use the variable's value from before the curl request is sent.
job2:
  stage: after_example
  rules:
    - if: $VARIABLE_NAME == "abc"
  script:
    - env
The way to avoid that is child pipelines. Child pipelines are initialized inside the parent pipeline and check the environment variables anew. A full example should illustrate my point.
variables:
  PARAMETER: "Can't be changed"

stages:
  - example
  - after_example
  - finally

job_1:
  # Changing "VARIABLE_NAME" during runtime to "abc"; VARIABLE_NAME has to exist
  stage: example
  script:
    - 'curl --request PUT --header "PRIVATE-TOKEN: $ACCESS_TOKEN" "${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/variables/VARIABLE_NAME" --form "value=abc"'

job_2.1:
  # This won't get triggered, as we assume "abc" was not the value of VARIABLE_NAME before job_1
  stage: after_example
  rules:
    - if: $VARIABLE_NAME == "abc"
  script:
    - env

job_3:
  stage: after_example
  trigger:
    include:
      - local: .downstream.yml
    strategy: depend

job_4:
  stage: finally
  script:
    - echo "Make sure to properly clean up your variables to a default value"

# inside .downstream.yml
stages:
  - downstream

job_2.2:
  # This will happen because the child pipeline is initialized after job_1
  stage: downstream
  rules:
    - if: $VARIABLE_NAME == "abc"
  script:
    - env
This snippet probably won't run as-is, but it illustrates my point rather nicely. Job 2 should be executed based on an action that happens in job 1. While the variable will have been updated by the time we reach job_2.1, the rules check happens beforehand, so job_2.1 will never be executed. Child pipelines do their rules check during the runtime of the parent pipeline, which is why job_2.2 does run.
This variant is quite hacky and probably really inefficient, but for all intents and purposes it gets the job done.
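For the cleanup that job_4 hints at, the same variables API can reset the value; a sketch reusing the PRIVATE-TOKEN approach from above:

job_4:
  stage: finally
  script:
    # Reset VARIABLE_NAME so later pipelines start from a known default
    - 'curl --request PUT --header "PRIVATE-TOKEN: $ACCESS_TOKEN" "${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/variables/VARIABLE_NAME" --form "value=default"'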

How do I label pipelines in GitLab?

How do I add a label to the GitLab pipelines when they run?
This would be extremely helpful when you run a few nightly (scheduled) pipelines for different configurations on the main branch. For example, we run a nightly main-branch pipeline with several submodules, each set at a point in their development (a commit SHA), and I want to label that 'MAIN'. We run a second pipeline that I want to label 'HEADs', which pulls all of the HEADs of the submodules to see if changes will break the main trunk when they are merged in.
Currently it shows:
Last commit message.
Pipeline #
commit SHA
Branch name
'Scheduled'
That is helpful, but it is very difficult to tell them apart because only the pipeline # changes between the pipelines.
I have good news!
Our friends at GitLab have been working on this feature: there is now a way to label your pipeline, as of release 15.5.1-ee.0!
It uses the workflow keyword with the new name keyword:
workflow:
  name: 'Pipeline for branch: $CI_COMMIT_BRANCH'
You can even use workflow:rules to give your pipeline different names:
variables:
  PIPELINE_NAME: 'Default pipeline name'

workflow:
  name: '$PIPELINE_NAME'
  rules:
    - if: '$CI_PIPELINE_SOURCE == "merge_request_event"'
      variables:
        PIPELINE_NAME: 'MR pipeline: $CI_COMMIT_BRANCH'
    - if: '$CI_MERGE_REQUEST_LABELS =~ /pipeline:run-in-ruby3/'
      variables:
        PIPELINE_NAME: 'Ruby 3 pipeline'
Find the docs here: https://docs.gitlab.com/ee/ci/yaml/#workflow
This feature is disabled by default in 15.5 because it is so new.
You can enable the feature flag, which is named pipeline_name.
See this link to enable: https://docs.gitlab.com/ee/administration/feature_flags.html
(You need to use the Rails Console to enable it. Pretty easy.)
Note: Remember that the workflow keyword affects the entire pipeline instance.
This seems to be officially supported with GitLab 15.7 (December 2022)
Add custom names to pipelines with workflow:name:
For some projects, the same pipeline can be configured to run differently for different variables or conditions, creating very distinct outcomes for successful pipelines.
It can be hard for you to determine which version of that pipeline ran since there is no indication about the inputs used for that particular run.
While labels like scheduled and API help, it is sometimes still difficult to identify specific pipelines.
Now you can set a pipeline name using the keyword workflow:name to better identify the pipeline with a string, a CI/CD variable, or a combination of both.
See Documentation and Issue.
Note:
If the name is an empty string, the pipeline is not assigned a name.
A name consisting of only CI/CD variables could evaluate to an empty string if all the variables are also empty.
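Applied to the nightly-schedule scenario from the question, a sketch like the following could give each schedule its own label (SCHEDULE_KIND is a hypothetical variable you would set on each pipeline schedule):

variables:
  PIPELINE_NAME: 'Default pipeline name'

workflow:
  name: '$PIPELINE_NAME'
  rules:
    # SCHEDULE_KIND is a hypothetical per-schedule variable, set in each schedule's settings
    - if: '$CI_PIPELINE_SOURCE == "schedule" && $SCHEDULE_KIND == "main"'
      variables:
        PIPELINE_NAME: 'MAIN'
    - if: '$CI_PIPELINE_SOURCE == "schedule" && $SCHEDULE_KIND == "heads"'
      variables:
        PIPELINE_NAME: 'HEADs'
    - when: always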

Merge inner parameter struct when using template - azure pipelines

I have a lot of default parameters in my template. I want to categorize them.
# template.yml
parameters:
  azure:
    name: cargo_test # Default job name
    displayName: Cargo test # Default displayName
    condition: true # Job condition
    strategy: # Default strategy to test on Windows, MacOS and Linux.
      matrix:
        Linux:
          vmImage: ubuntu-16.04
        MacOS:
          vmImage: macOS-10.13
        Windows:
          vmImage: vs2017-win2016
  name: job_name
  default_parameter1: default1
  default_parameter2: default2
# rest of code
- job: A
  template: template.yml
  parameters:
    azure:
      name: test_name
This causes parameters.azure to contain only the one field, name. I want to overwrite parameters.azure.name, not the whole parameters.azure struct. Is this possible in Azure Pipelines?
I want to overwrite azure.name, not the whole azure struct.
It seems you are worried that if you overwrite just one parameter in a .yml file that uses template.yml, it will affect the whole azure struct, right?
If so, you don't need to worry. template.yml defines lots of parameters; when another .yml file passes name: test_name, it only overwrites the value of the name parameter, with no effect on the other parameters, and the override applies only to the current job.
For example, in your use-template.yml:

- job: A
  template: template.yml
  parameters:
    azure:
      name: test_name

- job: B
  template: template.yml
  parameters:
    azure:
      condition: failed()
The overwriting of name will only affect that parameter's value in job A. After job A finishes, the value of name goes back to cargo_test for job B.
In a word, the configuration in template.yml is fixed; using it from another yml has no effect on template.yml itself. So you don't need to worry about how to categorize parameters, which we do not support so far.
You can check this simple example in the official doc: Job templates. If I have misunderstood your idea, feel free to correct me.
Updated:
Since we can get the value with parameters.azure.name, Azure DevOps should support this kind of parameter categorization. Also, after testing, I got the same result as you: if you overwrite parameters.azure.name, the rest of the parameters at the same level as parameters.azure.name are all empty. I think this is an issue that our Product Group needs to fix.
I have raised this issue report on our official Developer Community: When I overwrite the template parameter, the value becomes empty. You can follow that ticket to get a notification whenever there is an update.
In addition, there seems to be no other workaround to achieve parameter categorization; please be patient until this issue is fixed. Once the fix is released, our engineers will announce it in that ticket.
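Until nested parameters merge properly, a common pattern is to keep template parameters flat, each with its own default, so any single one can be overridden independently. A minimal sketch with illustrative names:

# template.yml
parameters:
  azure_name: cargo_test
  azure_displayName: Cargo test

jobs:
  - job: ${{ parameters.azure_name }}
    displayName: ${{ parameters.azure_displayName }}
    steps:
      - script: echo Running ${{ parameters.azure_name }}

# azure-pipelines.yml
jobs:
  - template: template.yml
    parameters:
      azure_name: test_name # only this default is overridden

This trades the categorization away for predictable overrides; a name prefix such as azure_ keeps related parameters grouped visually.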

How to change pipeline variables for usage in the next build in Azure DevOps

The case I have is as follows:
I've created an Azure DevOps pipeline with a pipeline variable, let's say 'variable A', whose value is 1. During the build, I change the value of 'variable A' to 2.
When the build runs for the second time, I want the value of 'variable A' to be 2, since I set it to 2 during the previous build, but it is back to 1.
These are the methods I tried without success:
Method 1:
Write-Host "##vso[task.setvariable variable=A;]2"
Method 2:
$env:A = 2
The only thing that works, though I don't think this is the way to go, is to get the whole build definition via the REST API and put it back with the value of the variable changed:
Get
Update
Is there any other solution to this problem?
If you're specifically looking for an increasing number, you can also use counters. Note that these only work in YAML-based build definitions.
The format is as follows:
You can use any of the supported expressions for setting a variable. Here is an example of setting a variable to act as a counter that starts at 100, gets incremented by 1 for every run, and gets reset to 100 every day.
jobs:
- job:
  variables:
    a: $[counter(format('{0:yyyyMMdd}', pipeline.startTime), 100)]
  steps:
  - bash: echo $(a)
For more information about counters and other expressions, see expressions.
The counter is stored per pipeline and is keyed by the prefix you provide in the counter expression. The expression above uses the yyyyMMdd format to generate a prefix which is unique every day.
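For contrast, a sketch of a counter with a fixed prefix, which keeps incrementing across runs instead of resetting daily (the prefix string 'rev' is arbitrary):

variables:
  # fixed prefix, so this counter never resets; starts at 100
  b: $[counter('rev', 100)]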
For UI-driven build definitions, using the REST API to update the whole definition would indeed work, though it's really hard to cover all the possibilities concerning parallelism.
How to change pipeline variables for usage in the next build in Azure DevOps
I am afraid you have to use the REST API to change the value of that pipeline variable.
That is because when you use the script "##vso[task.setvariable variable=testvar;]testvalue" to overwrite it, the overwritten value only applies to the current build.
When you execute the build again, it will still pull the value stored in the pipeline variable.
So we have to update the value of that variable in the definition itself. We need to use the REST API (Definitions - Update) to update the build pipeline definition's variable from a build task:
Similar thread: How to modify Azure DevOps release definition variable from a release task?
Note: change the API call to the build definitions endpoint:
PUT https://dev.azure.com/{organization}/{project}/_apis/build/definitions/{definitionId}?api-version=5.0
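For illustration, here is a minimal sketch of that approach from inside a pipeline step; it assumes the build service account is allowed to edit the pipeline, and it uses the variable name A from the question:

steps:
- powershell: |
    # Fetch the current definition, change variable A, and save the definition back
    $url = "$(System.CollectionUri)$(System.TeamProject)/_apis/build/definitions/$(System.DefinitionId)?api-version=5.0"
    $headers = @{ Authorization = "Bearer $env:SYSTEM_ACCESSTOKEN" }
    $def = Invoke-RestMethod -Uri $url -Headers $headers -Method Get
    $def.variables.A.value = "2"
    $body = $def | ConvertTo-Json -Depth 100
    Invoke-RestMethod -Uri $url -Headers $headers -Method Put -Body $body -ContentType "application/json"
  env:
    SYSTEM_ACCESSTOKEN: $(System.AccessToken)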
Hope this helps.
I have found that the easiest way to update variable values during a pipeline execution is to use the Azure CLI, having also tried other methods with little or no success.
In a YAML pipeline, this may look something like so:
jobs:
  - job: Update_Version
    steps:
      - task: AzureCLI@2
        inputs:
          azureSubscription: [your_subscription_id]
          scriptType: 'pscore'
          scriptLocation: 'inlineScript'
          inlineScript: |
            # set environment variable for current process
            $env:AZURE_DEVOPS_EXT_PAT = $env:SYSTEM_ACCESSTOKEN
            $oldVersionNumber = $(version-number)
            $newVersionNumber = $oldVersionNumber + 1
            az pipelines variable-group variable update --group-id [your_variable_group_id] --name version-number --organization [your_organization_url] --project [your_project_name] --value $newVersionNumber
        env:
          SYSTEM_ACCESSTOKEN: $(System.AccessToken)
The pipeline build service may also need permission to execute this command. To check, go to Pipelines -> Library -> Variable groups then edit the variable group containing your variable. Click on the Security button and make sure the user Project Collection Build Service has the Administrator role.
More information on the Azure CLI command can be found here. There is also another form of the command used to update variables that are not in a variable group, as described here.
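For completeness, the variant for a pipeline-level variable that is not in a variable group looks roughly like this (the pipeline id and bracketed values are placeholders):

az pipelines variable update --pipeline-id [your_pipeline_id] --name version-number --value $newVersionNumber --organization [your_organization_url] --project [your_project_name]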

How can I prevent a gitlab job running on push event

I have a simple .gitlab-ci.yml file that I thought would run a job only when scheduled. However, it is getting fired on push events as well.
Can anyone please tell me the correct way to specify that a job should run only when scheduled?
This is my .gitlab-ci.yml file:
job:on-schedule:
  only:
    - schedules
    - branches
  script:
    - /usr/local/bin/phpunit -c phpunit_config.xml
Thanks
According to the GitLab documentation, branches means "When a branch is pushed".
https://docs.gitlab.com/ce/ci/yaml/README.html#only-and-except-simplified
So including branches in your only: section causes the pipeline job to also run on pushes to any branch.
You can either remove the branches entry, or, if you want to restrict to pushes of a specific branch, extend the branches entry to include the project and branch name (branches@<project>/<branch>).
My suggestion is to reduce your YML to:
job:on-schedule:
  only:
    - schedules
  script:
    - /usr/local/bin/phpunit -c phpunit_config.xml
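On newer GitLab versions, where rules has replaced only/except, a sketch of the equivalent constraint would be:

job:on-schedule:
  rules:
    - if: '$CI_PIPELINE_SOURCE == "schedule"'
  script:
    - /usr/local/bin/phpunit -c phpunit_config.xml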
