Azure Release Pipeline - Variable for 'Triggered' Branch Name

So I have been working off of master for a while, and just recently added a 'release' branch that I will be working off of from now on.
In my Release Pipeline I have a PowerShell script that sets a custom variable using predefined variables.
$branchName = $Env:BUILD_SOURCEBRANCHNAME
$buildNumber = $Env:BUILD_BUILDNUMBER
$release = $branchName + "." + $buildNumber.ToString()
$pipeline.variables.NameVar.value = $release
If I push code to my release branch, this script will run at the end of my pipeline, and the variable should be changed to release.xxxx, but it is changed to master.xxxx.
Is there a reason the build variable build.sourcebranchname does not return my release branch name, and instead returns master? The build.buildnumber variable returns the correct value.

Try going to Get sources and check whether you have selected the correct branch.
I've tested your script and it returned the expected result.

So I figured out my issue. After adding the new release branch, I needed to edit my build pipeline to set up multiple branches, basically following this doc from Microsoft.
Adding release/* as a branch filter allowed the build pipeline to build on the release branch, and not just the master branch. From that point on, when I used the build variables in my release pipeline, they all returned the proper values.
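For reference, a minimal sketch of the equivalent branch filter in a YAML build pipeline (the classic editor exposes the same include list under Triggers; the branch names here are just the ones from this question):
trigger:
  branches:
    include:
      - master
      - release/*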

Related

How do I label pipelines in GitLab?

How do I add a label to the GitLab pipelines when they run?
This would be extremely helpful when you run a few nightly (scheduled) pipelines for different configurations on the main branch. For example, we run a nightly main-branch pipeline with several submodules, each pinned to a point in its development (a commit SHA), and I want to label that one 'MAIN'. We run a second pipeline that I want to label 'HEADs', which pulls the HEAD of every submodule to see whether their changes will break the main trunk when they are merged in.
Currently it shows:
Last commit message
Pipeline #
Commit SHA
Branch name
'Scheduled'
That is helpful, but it is very difficult to tell them apart because only the pipeline # changes between the pipelines.
I have good news!!
Our friends at GitLab have been working on this feature. There is now a way to label your pipeline in release 15.5.1-ee.0!
It uses the workflow section with the new name keyword:
workflow:
  name: 'Pipeline for branch: $CI_COMMIT_BRANCH'
You can even use workflow:rules to set different names for your pipeline:
variables:
  PIPELINE_NAME: 'Default pipeline name'

workflow:
  name: '$PIPELINE_NAME'
  rules:
    - if: '$CI_PIPELINE_SOURCE == "merge_request_event"'
      variables:
        PIPELINE_NAME: 'MR pipeline: $CI_COMMIT_BRANCH'
    - if: '$CI_MERGE_REQUEST_LABELS =~ /pipeline:run-in-ruby3/'
      variables:
        PIPELINE_NAME: 'Ruby 3 pipeline'
Find the docs here: https://docs.gitlab.com/ee/ci/yaml/#workflow
This feature is disabled by default in 15.5 because it is so new.
You can enable the feature flag, which is named pipeline_name.
See this link to enable: https://docs.gitlab.com/ee/administration/feature_flags.html
(You need to use the Rails Console to enable it. Pretty easy.)
Note: Remember that the workflow keyword affects the entire pipeline instance.
This seems to be officially supported with GitLab 15.7 (December 2022)
Add custom names to pipelines with workflow:name:
For some projects, the same pipeline can be configured to run differently for different variables or conditions, creating very distinct outcomes for successful pipelines.
It can be hard for you to determine which version of that pipeline ran since there is no indication about the inputs used for that particular run.
While labels like scheduled and API help, it is sometimes still difficult to identify specific pipelines.
Now you can set a pipeline name using the keyword workflow:name to better identify the pipeline with a string, a CI/CD variable, or a combination of both.
See Documentation and Issue.
Note:
If the name is an empty string, the pipeline is not assigned a name.
A name consisting of only CI/CD variables could evaluate to an empty string if all the variables are also empty.
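As a hedged illustration of that second note (the variable names below are made up for the example): if both variables are unset when the pipeline is created, the name evaluates to an empty string and the pipeline is left unnamed.
workflow:
  name: '$PIPELINE_PREFIX$PIPELINE_SUFFIX'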

gitlab-ci check if branch is a tag or not and then executes commands based on branch type

I would like to check if a branch is a tag or not in a single stage in gitlab-ci.yml. Then, based on whether it is a tag or not, a specific set of commands is executed:
IF tag:
    A SET OF COMMANDS
ELSE:
    ANOTHER SET OF COMMANDS
I am aware that only: -tags can be used to check if the ref is a tag, but how can I check this and then apply different sets of commands based on the ref type in a single gitlab-ci stage? In this case, if it is not a tag, I would like to apply a different set of commands.
some_stage:
  only:
    - tags
  script:
    - |
      SET OF COMMANDS
You can use GitLab CI predefined variables:
CI_COMMIT_BRANCH (the commit branch name; available in branch pipelines, including pipelines for the default branch, but not in merge request pipelines or tag pipelines)
CI_COMMIT_TAG (the commit tag name; available only in pipelines for tags)
The CI_COMMIT_BRANCH variable will hold the branch name only if the pipeline is triggered from a branch; otherwise it holds no value.
The same applies to the CI_COMMIT_TAG variable, which holds a value only when the pipeline is triggered from a tag.
You can use rules to check these variables and decide if you want to execute a job or not.
For example, if you want to build Docker images from tags only:
build_docker:
  rules:
    - if: '$CI_COMMIT_TAG != null && $CI_COMMIT_TAG =~ /SOME_REGEX/'
      when: on_success
Update: you can also do the check on these variables inside the script, but I recommend splitting the logic into two jobs, one for tag pipelines and another for branch pipelines, and specifying with rules when each job is executed.
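If you do want to keep it in a single job, a minimal sketch of the in-script check could look like this (the echo lines merely stand in for the question's two command sets):
some_stage:
  script:
    - |
      if [ -n "$CI_COMMIT_TAG" ]; then
        echo "tag pipeline for $CI_COMMIT_TAG"        # A SET OF COMMANDS
      else
        echo "branch pipeline for $CI_COMMIT_BRANCH"  # ANOTHER SET OF COMMANDS
      fi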
One last thing: there is a deprecated variable called CI_BUILD_TAG that you may see instead if you are using an old version of GitLab.

Add GitLab CI job to pipeline based on script command result

I have a GitLab CI pipeline with a 'migration' job which I want to be added only if certain files changed between the current commit and the master branch, but in my current project I'm forced to use GitLab CI pipelines for the push event, which complicates things.
The docs on rules:changes clearly state that it will glitch and not work properly without an MR (my case of a push event), so that's out of the question.
The docs on rules:if state that it only works with environment variables. But the docs on passing CI/CD variables to another job clearly state that
These variables cannot be used as CI/CD variables to configure a
pipeline, but they can be used in job scripts.
So, now I'm stuck. I could just skip running the job in question by overriding the script and checking for file changes there, but what I want is to not add the job to the pipeline in the first place.
While you can't add a job alone to a pipeline based on the output of a script, you can add a child pipeline dynamically based on the script output. The method of using rules: with dynamic variables won't work because rules: are evaluated at the time the pipeline is created, as you found in the docs.
However, you can achieve the same effect using the dynamic child pipelines feature. The idea is that you dynamically create the YAML for the desired pipeline in a job; that YAML is then used to create a child pipeline, which your pipeline can depend on.
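A minimal sketch of that approach, assuming the migration jobs live in a separate migration-pipeline.yml and that origin/master is available in the clone (all file, path, and job names here are illustrative, not from the original answer):
generate-config:
  stage: build
  script:
    # Emit a child pipeline config containing the migration job only when
    # the relevant files differ from master.
    - |
      if git diff --name-only origin/master...HEAD | grep -q '^db/migrations/'; then
        cp migration-pipeline.yml child-pipeline.yml
      else
        echo 'no-op: {script: ["echo skipping migration"]}' > child-pipeline.yml
      fi
  artifacts:
    paths:
      - child-pipeline.yml

run-child:
  stage: deploy
  trigger:
    include:
      - artifact: child-pipeline.yml
        job: generate-config
    strategy: depend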
Sadly, adding/removing a GitLab job based on variables created by a previous job is not possible within a given pipeline.
A way to achieve this is to break your current pipeline into an upstream and a downstream pipeline.
The upstream will have 2 jobs:
The first one will use your script to define a variable.
The second one will trigger the downstream, passing this variable along.
Upstream
check_val:
  ...
  script:
    # ... script imposes the logic with the needed checks
    # ... if true
    - echo "MY_CONDITIONAL_VAR=true" >> var.env
    # ... if false
    - echo "MY_CONDITIONAL_VAR=false" >> var.env
  artifacts:
    reports:
      dotenv: var.env

trigger_your_original_pipeline:
  ...
  variables:
    MY_CONDITIONAL_VAR: "$MY_CONDITIONAL_VAR"
  trigger:
    project: "project_namespace/project"
The downstream would be your original pipeline
Downstream
...
migration:
  ...
  rules:
    - if: '$MY_CONDITIONAL_VAR == "true"'
Now MY_CONDITIONAL_VAR will be available at the start of the downstream pipeline, so you can use rules to decide whether or not to add the migration job.

Override a defined CI/CD variable in a gitlab job

I am currently facing an issue using GitLab CI. I am using GitLab 13.12 Community Edition.
I want to be able to override the value of a CI/CD variable from within a gitlab-ci.yml job.
You might say that I could just pass the value from one job to another, but the goal is to change that value so that, on the next pipeline, all my jobs will use the updated one.
To be precise, I want to be able to change MY_CICD_VARIABLE_TO_CHANGE, which I've defined in Project > CI/CD > Variables, and update this value from a gitlab-ci job.
Something like this, but not only changing it for the current pipeline, actually updating it:
change_variable_value:
  stage: "change_variable"
  image: myimage
  script:
    - $MY_CICD_VARIABLE_TO_CHANGE="value_changed"
I tried every single solution specified here:
https://docs.gitlab.com/13.12/ee/ci/variables/index.html#override-a-defined-cicd-variable
Nothing seems to work.
I also tried to use artifacts, but it seems that I am not able to pass artifacts between 2 distinct pipelines; they have to be part of the same pipeline, even by curling the API (at least in the Community Edition).
Any idea is more than welcome :)
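Not from an answer in this thread, but for what it's worth, the "curling the API" route the question mentions would look roughly like this, using the project-level variables API. API_TOKEN is an assumed CI/CD variable holding a token with api scope, and the job is only a sketch:
change_variable_value:
  stage: "change_variable"
  image: curlimages/curl:latest
  script:
    # Update the project-level variable so that subsequent pipelines pick up the new value.
    - >
      curl --request PUT
      --header "PRIVATE-TOKEN: $API_TOKEN"
      --form "value=value_changed"
      "$CI_API_V4_URL/projects/$CI_PROJECT_ID/variables/MY_CICD_VARIABLE_TO_CHANGE"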

Azure Pipelines - Setting Custom Variable to Build Number

In my Azure Build Pipeline (classic, not YAML) I set my build number to be the branch name and then a revision number variable. This was my process for that:
Pipelines -> Pipelines -> {my pipeline} -> Edit -> Options -> Build Number Format
$(SourceBranchName)$(Rev:.r)
In my testing, that works great.
Now, in my Release Pipeline, the first script I run is a PowerShell script that takes the build number, and applies it to a local variable (MyBuild) I created. The script is as follows:
Write-Host "Pipeline = $($pipeline | ConvertTo-Json -Depth 100)"
$buildNumber = $Env:BUILD_BUILDNUMBER
$pipeline.variables.MyBuild.value = $buildNumber
This variable is used later in the pipeline to create a folder that houses my release files.
$(BuildDirectory)/$(MyBuild)/Debug
For some reason, my variable is always one build behind. For example, if my build number is master.5, the folder created by my Release Pipeline is master.4. I have tried changing the order of the scripts in the pipeline, but that doesn't solve anything. It is weird because my Build Pipeline is correct (always named properly, e.g. master.1, master.2, master.3, etc.), but my Release Pipeline variable is always one revision behind.
PowerShell script to update the custom build number:
- powershell: |
    [string]$version = "$(Build.Repository.Name)_SomeCustomData_$(Build.BuildId)"
    Write-Output "##vso[build.updatebuildnumber]$version"
  displayName: Set Build Number
I tested it and it works well. Below is my reproduction, which you can refer to.
In the release pipeline:
Write-Host '##vso[task.setvariable variable=MyBuild]$(Build.BuildNumber)'
md $(Agent.ReleaseDirectory)/$env:MyBuild/Debug
Select the build as the release artifact source, set the default version to Latest, and enable the Continuous deployment trigger. This creates a release every time a new build is available.
Test result:
In addition, the point I am confused about is how you use $(BuildDirectory) in the release pipeline. Agent.BuildDirectory is
the local path on the agent where all folders for a given build pipeline are created. This predefined variable should not be available in the release pipeline; we should use Agent.ReleaseDirectory instead. You can refer to the predefined variables documentation.
