I have a lot of default parameters in my template. I want to categorize them.
# template.yml
parameters:
  azure:
    name: cargo_test           # Default job name
    displayName: Cargo test    # Default displayName
    condition: true            # Job condition
    strategy:                  # Default strategy to test on Windows, macOS and Linux.
      matrix:
        Linux:
          vmImage: ubuntu-16.04
        MacOS:
          vmImage: macOS-10.13
        Windows:
          vmImage: vs2017-win2016
  name: job_name
  default_parameter1: default1
  default_parameter2: default2
# rest of code
- job: A
  template: template.yml
  parameters:
    azure:
      name: test_name
This causes parameters.azure to contain only the single field name. I want to overwrite parameters.azure.name, not the whole parameters.azure struct. Is that possible in Azure Pipelines?
I want to overwrite azure.name, not the whole azure struct.
It seems that you are worried that if you overwrite just one parameter in a .yml file which uses template.yml, it will affect the whole azure struct, right?
If so, you don't need to worry about this. Your template.yml file defines lots of parameters. When you use it in another .yml file with name: test_name, it only overwrites the value of the parameter name, has no effect on the other parameters, and the overwrite only applies to the current job.
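For context, here is a minimal sketch of how template.yml might consume these values inside its job (the job body is an assumption, since the question only shows the parameters section):

jobs:
- job: ${{ parameters.azure.name }}
  displayName: ${{ parameters.azure.displayName }}
  condition: ${{ parameters.azure.condition }}
  strategy: ${{ parameters.azure.strategy }}
  pool:
    vmImage: $(vmImage)   # set by the matrix entries above
  steps:
  - script: cargo test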
For example, if in your use-template.yml:
- job: A
  template: template.yml
  parameters:
    azure:
      name: test_name
- job: B
  template: template.yml
  parameters:
    azure:
      condition: failed()
The overwriting of name only affects the value of that parameter (name) in job A. After job A finishes, the value of name goes back to cargo_test for job B.
In a word, the configuration in template.yml is fixed; using it from another .yml file has no effect on template.yml itself. So you don't need to worry about how to categorize parameters, which is not something that is supported so far.
You can check a simple example in the official docs: Job templates. If I have misunderstood your idea, feel free to correct me.
Updated:
Since we can get the value with parameters.azure.name, Azure DevOps should support this kind of parameter categorization. And after testing, I got the same result as you: if you overwrite parameters.azure.name, the other parameters at the same level as parameters.azure.name are all empty. I think this is an issue that our Product Group needs to fix.
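To illustrate the behaviour observed above, the template effectively receives something like this when only azure.name is passed (the comments just restate the test result, nothing more):

parameters:
  azure:
    name: test_name    # the overridden value comes through
    # displayName, condition and strategy arrive empty instead of keeping their defaults from template.yml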
I have raised an issue report on our official Developer Community: When I overwrite the template parameter, the value becomes empty. You can follow that ticket so that you get a notification once there is any update.
In addition, there seems to be no other workaround to achieve this parameter categorization. Please be patient until the issue is fixed; once the fix is released, our engineers will announce it in that ticket.
Related
I'm trying to use a variables YAML file where I store a variable containing the cron syntax for a build. I wish to use this variable for multiple build pipelines, and want to be able to change the time/day of the build without having to go into each pipeline and change the schedule in each YAML pipeline individually.
However, my current attempt hits an error:
variable.yml
variables:
- name: cronSyntax
  value: "0 9 * * *"
azure-pipelines.yml
variables:
- template: variable.yml
schedules:
  cron: ${{ cronSyntax }}
  etc
I have also tried $(cronSyntax), but neither seems to work. Is it just the case that I cannot use variables within the schedules section in YAML? Any help greatly appreciated.
Thanks
Looks like you can't use pipeline variables when specifying schedules.
See the official documentation here: https://learn.microsoft.com/en-us/azure/devops/pipelines/process/scheduled-triggers?view=azure-devops&tabs=yaml#scheduled-triggers
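For reference, the schedule only works with the cron string written out literally. A minimal example (the branch name here is an assumption):

schedules:
- cron: "0 9 * * *"       # must be a literal string; pipeline variables are not expanded here
  displayName: Daily 9 AM build
  branches:
    include:
    - master
  always: true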
I would like to implement a manually triggered pipeline where the user must specify the value for 2 variables (from a given set of values). Since GitLab does not support a combo box yet, I found the following workaround:
variables:
  DEPLOYMENT:
    value: "NONE/env1/env2/env3"
    description: "Choose deployment location"
  BUILDMODE:
    value: "package/install"
    description: "Available options - package (local build, no deployment to nexus) or install (local build with deployment to nexus)"
This works as expected and the variables are displayed before launching the pipeline so that the user can edit their values and choose only one option.
Unfortunately, the variables are no longer shown when the following import is added to the gitlab-ci.yaml file:
include:
  - project: "path_to_my_project"
    ref: $CI_COMMIT_BRANCH
    file:
      - "pipeline/scripts.yaml"
Do you have any idea why the import changes the behaviour of the variables above? (If I comment out these lines, the variables are shown as expected.)
I'm wondering if using extends: in global variables: is supposed to work.
The documentation here states that:
Keyword type: Job keyword. You can use it ONLY as part of a job.
(my highlighting).
However, if I use something like
stages:
  - test

.common_variables:
  __FOO: "foo"

variables:
  extends: .common_variables
  __BAR: "bar"

test:
  stage: test
  script:
    - env
  rules:
    - when: always
then I find both the __FOO and __BAR variables defined on the runner. However, I also find an extends variable (set to .common_variables) defined on the runner.
So I wonder: is this expected behaviour that one can rely on, with the presence of the extends variable just a minor side effect, or is it a quirk of the GitLab version that we use, and the keyword really is supported only inside jobs as the docs state?
P.S. Note that I specifically ask about extends because I plan to off-load the shared variables into a separate file in a separate repo, to be able to reuse them in the pipelines of multiple repositories.
Yes, it works like that.
But you can also read up on how it can be used correctly if you want to run several scripts in one YAML file.
You can read more details here
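For the use case mentioned in the P.S. (sharing variables across repositories), a commonly used alternative is to include a variables file from the other project instead of relying on extends. A minimal sketch, where the project path and file name are assumptions:

# .gitlab-ci.yml in the consuming project
include:
  - project: "my-group/shared-ci"
    ref: main
    file: "variables.yml"

# variables.yml in my-group/shared-ci
variables:
  __FOO: "foo"
  __BAR: "bar"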
I have two Azure DevOps Git branches:
master
feature/mybranch
I have a multi-stage build pipeline defined in YAML, where some of the steps are templated into separate .yml files.
In my outer azure-pipelines.yml I reference a repository where my template .yml files live:
resources:
  repositories:
  - repository: templates
    type: git
    name: MyProject/MyRepo
When I'm building from the master branch everything is good, as by default the repository resource will look at refs/heads/master.
When I'm working in the feature branch and want to test experimental changes to my template .yml files, I don't want it to fetch them from the master branch; I want it to use the files from the branch I am working in.
The following works and allows me to do this:
resources:
  repositories:
  - repository: templates
    type: git
    name: MyProject/MyRepo
    ref: refs/heads/feature/mybranch
However, when I merge this back to master, I obviously don't want 'ref:' still pointing at the feature branch, so I'd like to generate the value of 'ref:' dynamically with a variable.
I've tried using ref: $(Build.SourceBranch) where $(Build.SourceBranch) should expand to 'refs/heads/feature/mybranch'
But it doesn't work. Error:
62638: "/azure-pipelines.yml: Could not get the latest source version for repository MySolution hosted on Azure Repos using ref refs/heads/$(Build.SourceBranch)."
Instead of referencing the repo in resources, use inline checkout as described here
https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/multi-repo-checkout?view=azure-devops#checking-out-a-specific-ref
- checkout: git://MyProject/MyRepo#features/tools
And this YAML element allows the use of template expressions with variables or parameters, e.g.
- checkout: git://${{ variables.repoName }}#${{ variables.branchRef }}
OR
- checkout: git://${{ parameters.repoName }}#${{ parameters.branchRef }}
And you can change that dynamically.
The other alternative is to use a script task, as below:
- script: |
    # note: checkout: git://$(System.TeamProject)/${{ parameters.repoName }}#${{ parameters.repoRef }} does not work if this task is run multiple times in the same pipeline
    # see here for more details: https://developercommunity.visualstudio.com/t/checkout-fails-with-an-error-occurred-while-loadin/1005991#T-N1270459
    repoDir=$(Agent.BuildDirectory)/${{ parameters.repoName }}
    /bin/rm -rf $repoDir
    url_with_token=$(echo $(System.CollectionUri) | sed -e "s/https\:\/\//https\:\/\/$(System.AccessToken)\#/g")
    git clone $url_with_token/$(System.TeamProject)/_git/${{ parameters.repoName }} $repoDir
    cd $repoDir
    git fetch origin '${{ parameters.repoRef }}':'localBranch'
    git checkout localBranch
  name: clone_script
  displayName: Checkout using script ${{ parameters.repoName }}#${{ parameters.repoRef }}
I have added the above as a template, along with its usage, in a gist to make it easily re-usable.
Hope that helps
https://gist.github.com/scorpionlion/1773d08b62ca5875cc2fd6dcdd0394d2
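As a rough sketch of the consuming side (not taken from the gist; the file and parameter names are assumptions), using such a step template could look like:

steps:
- template: checkout-script.yml      # hypothetical template file wrapping the script above
  parameters:
    repoName: MyRepo
    repoRef: refs/heads/feature/mybranch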
Is it possible to use a variable in the ref property of
resources:repository for Azure DevOps YAML?
For this question, the answer is yes, it's possible.
As for why you receive that error message: the variable you used, $(Build.SourceBranch), is simply the wrong one. You should use $(Build.SourceBranchName).
Normally, for ref we would enter master or another feature branch, such as
ref: refs/heads/master
This may make you think it is the same as the value of $(Build.SourceBranch). It looks the same, I know, but it is different. In fact, the server reads the exact branch name, not the branch path, which we can figure out clearly from the classic editor:
Judging from the classic editor, we can see that the exact branch name should be entered here.
So, as documented for the predefined variables, the value of $(Build.SourceBranch) is the branch path, while $(Build.SourceBranchName) represents the exact branch name.
So, if you want it to run successfully, you need to use $(Build.SourceBranchName). It works on my side.
Hope this also helps you stay away from the error message.
Edit:
The complete script that worked for me is:
resources:
  repositories:
  - repository: templates
    type: git
    name: MyApp/MyconApp
    ref: $(Build.SourceBranchName)
The Azure docs state:
Variables can't be used to define a repository in a YAML statement.
So that seems to place some limitations on what you can do here. Perhaps there is a workaround that still allows you to do what you want.
The case I have is as follows:
I've created an Azure DevOps pipeline with a pipeline variable, let's say variable A. The value of variable A is 1. During the build, I change the value of variable A to 2.
When the build runs for the second time I want the value of variable A to be 2, but it is back to 1, even though I set it to 2 during the previous build.
These are the methods I tried without success:
Method 1:
Write-Host "##vso[task.setvariable variable=A;]2"
Method 2:
$env:A = 2
The only thing that works, although I don't think it is the way to go, is to get the whole build definition via the REST API and put it back with the value of the variable changed.
Get
Update
Is there any other solution to this problem?
If you're specifically looking for an increasing number, then you can also use counters. These only work in YAML-based build definitions.
The format is as follows:
You can use any of the supported expressions for setting a variable. Here is an example of setting a variable to act as a counter that starts at 100, gets incremented by 1 for every run, and gets reset to 100 every day.
jobs:
- job:
  variables:
    a: $[counter(format('{0:yyyyMMdd}', pipeline.startTime), 100)]
  steps:
  - bash: echo $(a)
For more information about counters and other expressions, see expressions.
The counter is stored for the pipeline and is based on the prefix you provide in the counter expression. The expression above uses yyyyMMdd to generate a prefix which is unique every day.
For UI-driven build definitions, using the REST API to update the whole definition would indeed work, though it's really hard to work around all the possibilities concerning parallelism.
How to change pipeline variables for usage in the next build in Azure DevOps
I am afraid you have to use the REST API to change the value of that pipeline variable.
That is because when you use the script "##vso[task.setvariable variable=testvar;]testvalue" to overwrite it, the overwritten value only applies to the current build run.
When you execute the build again, it will still pull the value from the pipeline variable as defined on the definition.
So, we have to update the value of that variable on the web portal side; to do that from a build task, we need the REST API (Definitions - Update) to update the variable in the build pipeline definition:
Similar thread: How to modify Azure DevOps release definition variable from a release task?
Note: change the API call to the build definitions endpoint:
PUT https://dev.azure.com/{organization}/{project}/_apis/build/definitions/{definitionId}?api-version=5.0
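A minimal sketch of what such a task could look like, assuming the pipeline's own definition ID is available via System.DefinitionId and the build service identity is allowed to edit the definition (the PowerShell body is illustrative, not taken from the linked thread):

- powershell: |
    # Fetch the current definition, change the variable, and put the definition back.
    $url = "$(System.CollectionUri)$(System.TeamProject)/_apis/build/definitions/$(System.DefinitionId)?api-version=5.0"
    $headers = @{ Authorization = "Bearer $(System.AccessToken)" }
    $definition = Invoke-RestMethod -Uri $url -Headers $headers -Method Get
    $definition.variables.A.value = "2"   # 'A' is the pipeline variable from the question
    Invoke-RestMethod -Uri $url -Headers $headers -Method Put -Body ($definition | ConvertTo-Json -Depth 100) -ContentType "application/json"
  displayName: Persist new value of variable A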
Hope this helps.
I have found that the easiest way to update variable values during a pipeline execution is to use the Azure CLI, having also tried other methods with little or no success.
In a YAML pipeline, this may look something like this:
jobs:
- job: Update_Version
  steps:
  - task: AzureCLI@2
    inputs:
      azureSubscription: [your_subscription_id]
      scriptType: 'pscore'
      scriptLocation: 'inlineScript'
      inlineScript: |
        # set environment variable for current process
        $env:AZURE_DEVOPS_EXT_PAT = $env:SYSTEM_ACCESSTOKEN
        $oldVersionNumber = $(version-number)
        $newVersionNumber = $oldVersionNumber + 1
        az pipelines variable-group variable update --group-id [your_variable_group_id] --name version-number --organization [your_organization_url] --project [your_project_name] --value $newVersionNumber
    env:
      SYSTEM_ACCESSTOKEN: $(System.AccessToken)
The pipeline build service may also need permission to execute this command. To check, go to Pipelines -> Library -> Variable groups then edit the variable group containing your variable. Click on the Security button and make sure the user Project Collection Build Service has the Administrator role.
More information on the Azure CLI command can be found here. There is also another form of the command used to update variables that are not in a variable group, as described here.
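If the variable lives directly on the pipeline rather than in a variable group, the pipeline-scoped form of the command looks roughly like this (the pipeline ID is a placeholder):

az pipelines variable update --pipeline-id [your_pipeline_id] --name version-number --value $newVersionNumber --organization [your_organization_url] --project [your_project_name]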