is it possible to update a "global" yml variable from a template?
I have a YAML file with variables, which references a template:
variables:
  BuildConfiguration: 'release'

jobs:
- template: Templates/test.yml
  parameters:
    asd: true
Then, can I update the BuildConfiguration variable inside the test.yml?
I can read the value with $(BuildConfiguration), and that works. But when I try to update it, the value doesn't change:

"##vso[task.setvariable variable=BuildConfiguration;]true"
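For reference, a minimal sketch of what a `setvariable` issued from a template can and cannot do: it updates the value for subsequent steps in the same job at runtime, but it cannot rewrite the compile-time value of the top-level YAML variable (the step contents here are illustrative):

```yaml
# Templates/test.yml -- minimal sketch
parameters:
- name: asd
  type: boolean
  default: false

steps:
# Runtime update: affects later tasks in this job only
- script: echo "##vso[task.setvariable variable=BuildConfiguration]debug"
# Macro syntax is expanded just before this task runs, so it sees 'debug'
- script: echo $(BuildConfiguration)
```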
I am trying to loop through user-defined variables in an Azure DevOps YAML pipeline.
The variables have been created through the UI:
Below is the YAML pipeline code that I'm using:
trigger:
- dev
- main

pr:
- dev

pool:
  vmImage: ubuntu-latest

stages:
- stage:
  jobs:
  - job: TestVars
    steps:
    - ${{ each var in variables }}:
      - script: |
          echo ${{ var.key }}
          echo ${{ var.value }}
        displayName: ${{ var.key }}
When running the above pipeline only system and build variables are listed (e.g. system, system.hostType, build.queuedBy, etc.).
Any help to loop through user-defined variables would be much appreciated.
Unfortunately, there is no luck fetching the variables defined in the UI. However, if your variables are not secrets, you can move them into the YAML, and they will show up in the loop.
- stage:
  variables:
    myyamlvar: 1000 # this will show up in the loop
  jobs:
  - job: TestVars
    steps:
    - ${{ each var in variables }}:
      - script: |
          echo ${{ var.key }}
          echo ${{ var.value }}
        displayName: ${{ var.key }}
Alternatively, instead of using a compile-time expression, you can list variables using a runtime construct, for example:
- job: TestRuntimeVars
  steps:
  - script: |
      for var in $(compgen -e); do
        echo $var ${!var};
      done
This will list all variables including ones defined in the UI.
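On a Windows agent, the same runtime listing can be sketched in PowerShell (the job name and pool image here are illustrative):

```yaml
- job: TestRuntimeVarsWindows
  pool:
    vmImage: windows-latest
  steps:
  - powershell: |
      # Enumerate every environment variable the agent exposes at runtime,
      # including non-secret variables defined in the UI
      Get-ChildItem Env: | ForEach-Object { Write-Host "$($_.Name) = $($_.Value)" }
```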
The Microsoft docs link you provided specifies that:
"Unlike a normal variable, they are not automatically decrypted into
environment variables for scripts. You need to explicitly map secret
variables."
However, one workaround could be to run an Azure CLI task and get the pipeline variables using az pipelines variable list.
Assuming your intention is to get the actual secret values, that may not suffice. Having said that, you should consider a variable group even if you're not using it in other pipelines, since a group can be linked to an Azure KeyVault and map its secrets as variables. You can store your sensitive values in a KeyVault and link it to the variable group, which can then be used like regular variables in your pipeline.
Or you can access KeyVault secrets right from the AzureKeyVault pipeline task.
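A minimal sketch of the latter approach; the service connection, vault, and secret names below are placeholders:

```yaml
steps:
- task: AzureKeyVault@2
  inputs:
    azureSubscription: 'my-service-connection'  # placeholder service connection
    KeyVaultName: 'my-keyvault'                 # placeholder vault name
    SecretsFilter: '*'
# Each fetched secret is now available as a pipeline variable:
- script: echo "Endpoint is $(mySecretName)"    # placeholder secret name
```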
To expand on the answer below: it is a bit roundabout, but you can use the Azure DevOps CLI. This may be overkill, but it does the job.
trigger:
- main

pool:
  vmImage: ubuntu-latest

steps:
- bash: az --version
  displayName: 'Show Azure CLI version'

- bash: az devops configure --defaults organization=$(System.TeamFoundationCollectionUri) project=$(System.TeamProject) --use-git-aliases true
  displayName: 'Set default Azure DevOps organization and project'

- bash: |
    az pipelines variable list --pipeline-id $(System.DefinitionId)
  displayName: 'List pipeline variables'
  env:
    AZURE_DEVOPS_EXT_PAT: $(System.AccessToken)
This approach was taken from:
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/variables?view=azure-devops&tabs=yaml%2Cbatch#list-variables
If the agent is self-hosted, you may need to install the Azure DevOps CLI.
I've spent quite a bit of time on this now and can't seem to get it right, so I need to ask for advice.
I've got a GitHub repo which contains a series of ZIP files. For each ZIP file I want to execute a template in DevOps; there could be 1, there could be 10. For each file I want to perform the same checks.
I wrote a PowerShell task that scans and gathers the list of ZIP files. I spent considerable time attempting to use output variables, but it seems that you can't leverage an array at all with variables in Azure Pipelines.
steps:
- ${{ each var in variables }}:
  - task: PowerShell@2
    displayName: 'Checking variables ${{ var.key }}'
    inputs:
      targetType: Inline
      script: |
        Write-Host ${{ var.name }}
        Write-Host ${{ var.key }}
        Write-Host ${{ var.value }}
        Write-Host "${{ convertToJson(var) }}"
I was able to loop through all existing Variables in the pipeline but it appears task variables are not added to this list?
Write-Host "##vso[task.setvariable variable=filePath;isOutput=true]$($YamlVar.FullName)"
Using the above task.setvariable I can set and access a variable, but it's a single string; obviously, to loop through a template a number of times and perform the same steps, it has to work with the ${{ each var in variables }}: structure.
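The single-string limitation can be worked around at runtime by serializing the array into the one string that setvariable supports and deserializing it in a later step; this does not unlock the compile-time `${{ each }}` loop, and all names below are illustrative:

```yaml
steps:
- task: PowerShell@2
  name: gather
  inputs:
    targetType: Inline
    script: |
      # Collect the ZIP file names and pack them into a one-line JSON string
      $zips = Get-ChildItem -Filter *.zip | Select-Object -ExpandProperty Name
      $json = $zips | ConvertTo-Json -Compress
      Write-Host "##vso[task.setvariable variable=allFiles;isOutput=true]$json"
- task: PowerShell@2
  inputs:
    targetType: Inline
    script: |
      # Unpack the JSON string back into an array at runtime
      $files = '$(gather.allFiles)' | ConvertFrom-Json
      foreach ($f in $files) { Write-Host "Processing $f" }
```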
I also managed to write a variables.yml and add it to an artifact, but I can't seem to find a way to access my new artifact to potentially leverage it.
- job: B
  dependsOn: A1
  pool:
    vmImage: windows-latest
  variables:
  - template: ./variables.yml
  steps:
  - ${{ each var in variables }}:
    - task: PowerShell@2
      displayName: 'Checking variables ${{ var.key }}'
      inputs:
        targetType: Inline
        script: |
          Write-Host ${{ var.name }}
          Write-Host ${{ var.key }}
          Write-Host ${{ var.value }}
          Write-Host "${{ convertToJson(var) }}"
UPDATE
This is the closest answer so far, but it seems that when using
- ${{ each var in variables }}:
the variable or parameter is not processed, so the macro $(allfiles) is interpreted as-is instead of being expanded and leveraged.
https://stackoverflow.com/a/59451690/16318957
If I could somehow just have my .yml artifact added to the build and then reference it, this would work, but it appears I can't reference a template in a step that hasn't executed a download stage.
This would allow me to create a template file with the parameters and variables hard-coded, then publish the file and use that artifact to do the right number of steps.
When attempting to download my artifacts I'm not getting any files.
UPDATE
I've solved this one: I essentially passed the object array into the template by using the DevOps API to call a separate pipeline. I run a PowerShell script, passing in the variables required.
- task: PowerShell@2
  inputs:
    filePath: './PowerShell/deployment.ps1'
    arguments: '-variable "TEST" -step "TEST_DEPLOYMENT" -env "TEST" -pat "$(pat)"'
Then essentially adjust the JSON variable and call the pipeline from the API with your array list. Your JSON needs to represent the values as JSON text.
$JSON = @'
{
    "variables": {
        "file1": {
            "isSecret": false,
            "value": "test11"
        },
        "file2": {
            "isSecret": false,
            "value": "test21"
        }
    },
    "templateParameters": {
        <zipfilefilenameshere>
    }
}
'@
Then trigger the pipeline with the API passing your object array.
$uripipeline = "https://dev.azure.com/$($org)/$($proj)/_apis/pipelines/$($id)/runs?api-version=6.0-preview.1"
When your object is sent, it needs to be structured like JSON text.
Example:
"[\"file1.zip\",\"file2.zip\"]"
I managed to get this to work by creating a separate pipeline, then calling the pipeline from the original and passing in my object array.
I'm using this API call to make it work.
https://dev.azure.com/$($org)/$($proj)/_apis/pipelines/$($id)/runs?api-version=6.0-preview.1
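Assuming `$org`, `$proj`, `$id`, and `$JSON` are populated as in the snippets above, the call itself can be sketched as follows; the authentication method shown is one option, not necessarily the asker's:

```yaml
- task: PowerShell@2
  inputs:
    targetType: Inline
    script: |
      # Trigger the second pipeline via the Runs API, passing the JSON body built above
      $uri = "https://dev.azure.com/$($org)/$($proj)/_apis/pipelines/$($id)/runs?api-version=6.0-preview.1"
      $headers = @{ Authorization = "Bearer $(System.AccessToken)" }
      Invoke-RestMethod -Uri $uri -Method Post -Headers $headers -ContentType 'application/json' -Body $JSON
```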
Some examples in my update at the end of the original question.
I have a Java project which has a gradle.properties file. I'm extracting variables defined in gradle.properties as:
##vso[task.setvariable variable=myVariable;]`my script to extract it from gradle.properties`
Then I'm using a template from another repository that needs that variable, but I can't use it within a task. When I use it within - script: echo $variable as a step instead of a task, it works.
When I try to use it within a task, it sees the variable as the literal $variable, not its value.
Maybe there is a better way to extract variables into an Azure pipeline instead of this approach?
Check the error message:
We get the error before the pipeline runs the bash task. Since the variable parampass has not been created yet, the parameter receives the literal value $(parampass) instead of the variable value.
Check this doc:
In a pipeline, template expression variables ${{ variables.var }} get processed at compile time, before runtime starts. Macro syntax variables $(var) get processed during runtime before a task runs. Runtime expressions $[variables.var] also get processed during runtime but were designed for use with conditions and expressions.
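The timing difference can be sketched with a minimal pipeline; the step contents are illustrative:

```yaml
variables:
  compiled: 'fixed'

steps:
# Set a variable at runtime; template expressions can never see it
- bash: echo "##vso[task.setvariable variable=runtimeVar]set-at-runtime"
- bash: |
    echo "${{ variables.compiled }}"   # template expression: resolved at compile time
    echo "$(runtimeVar)"               # macro syntax: resolved just before this task runs
# Runtime expressions are designed for conditions like this one:
- bash: echo "runs only when runtimeVar matches"
  condition: eq(variables['runtimeVar'], 'set-at-runtime')
```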
As a workaround:
pipeline.yml
pool:
  vmImage: ubuntu-20.04

resources:
  repositories:
  - repository: common
    type: git
    name: organisation/repo-name

variables:
- name: parampass
  value: xxx

stages:
- stage: "Build"
  jobs:
  - job: "Build"
    steps:
    - template: templatename.yml@common
      parameters:
        par1: ${{ variables.parampass }}
Probably you do not provide the variable to your template.
Example execution of a template with a provided parameter:

- template: my/path/to/myTemplate.yml@MyAnotherRepositoryResourceName
  parameters:
    projectsToBeTested: $(someVariable)
And an example template accepting parameters:

steps:
- task: DotNetCoreCLI@2
  displayName: 'Just testing'
  inputs:
    command: test
    projects: ${{ parameters.projectsToBeTested }}
Please provide more information if it does not help.
Code looks like this:
pipeline.yml
pool:
  vmImage: ubuntu-20.04

resources:
  repositories:
  - repository: common
    type: git
    name: organisation/repo-name

stages:
- stage: "Build"
  jobs:
  - job: "Build"
    steps:
    - bash: |
        echo "##vso[task.setvariable variable=parampass]anything"
    - template: templatename.yml@common
      parameters:
        par1: $(parampass)
templatename.yml
parameters:
- name: par1

steps:
- task: SonarCloudPrepare@1
  displayName: SonarCloud analysis prepare
  inputs:
    SonarCloud: ${{ parameters.par1 }}
    organization: 'orgname'
    scannerMode: 'Other'
    extraProperties: |
      # Additional properties that will be passed to the scanner,
      # Put one key=value per line, example:
      # sonar.exclusions=**/*.bin
      sonar.projectKey= # same param pass case
      sonar.projectName= # same param pass case
Generally, it does not matter whether I pass parameters or use the template as if its code were part of the pipeline itself. The output is always something like "$(parampass) could not be found".
Please consider the following:
- job: Backend
  steps:
  - template: $(ClassLibraryTemplate)
    parameters:
      projectName: 'Core'
      solutionPath: 'Source/Core.sln'
ClassLibraryTemplate is defined as a pipeline variable. But when I run the build, it fails because the variable is not replaced by its value and the template is not found.
Is it not possible to store the template name in a variable ?
For an Azure DevOps YAML pipeline, the template gets processed at compile time. However, $(ClassLibraryTemplate) gets processed at runtime. That's why it fails.
More information: Understand variable syntax
You could define a variable or parameter in your YAML pipeline, then use a template expression. For a parameter, you could specify the value in the pop-up window when you queue/run the pipeline.
For example:
parameters:
- name: temName
  displayName: template name
  type: string
  default: steps/test.yml

trigger:
- none

variables:
- name: tem
  value: steps/build.yml

jobs:
- job: Linux
  pool:
    vmImage: 'ubuntu-16.04'
  steps:
  - template: ${{ variables.tem }}
  - template: ${{ parameters.temName }}
I am deploying Service Fabric Application packages and I have several (~15) devtest environments, any one of which can be used to test a code fix. I can pass in the Service Connection so deploying the final package is not the issue. What I can't figure out is how to set the other environment specific variables based on the target environment.
I tried using the Service Connection name to pick one of several variable template files:
variables:
- name: envTemplateFile
  ${{ if eq( variables['DevConnection'], 'Environ01' ) }}:
    value: ../Templates/DEV01-Variables-Template.yml
  ${{ if eq( variables['DevConnection'], 'Environ02' ) }}:
    value: ../Templates/DEV02-Variables-Template.yml
... (snip) ...

variables:
- template: ${{ variables.envTemplateFile }}
But UI variables are not set at compile time. So the template expressions see blank values and fail.
I could use a pipeline variable but then QA would have to make a file change and check it in each time they want to deploy to a different environment than last time.
What I currently have is an empty variable template and a powershell script that sets the values based on different script names.
- task: PowerShell@2
  inputs:
    targetType: 'filePath'
    filePath: '$(Build.ArtifactStagingDirectory)\drop\Deployment\Code\Scripts\Set-$(DevConnection)Variables.ps1'
    #arguments: # Optional
  displayName: Set environment variables
There has got to be a better way than this. Please.
There is no direct way to achieve this, as the template expression is parsed at compile time.
However, I have a workaround which needs no additional PowerShell script and avoids making a file change and checking it in to your repo each time.
Since all your devtest environments have the same deployment steps, you can create a steps template YAML to hold the deployment steps.
Then you can modify your azure-pipelines.yml like below example:
jobs:
- job: A
  pool:
    vmImage: 'windows-latest'
  steps:
  - powershell: |
      $con = "$(connection)"
      if($con -eq "environ1"){echo "##vso[task.setvariable variable=variablegroup;isOutput=true]environ1"}
      if($con -eq "environ2"){echo "##vso[task.setvariable variable=variablegroup;isOutput=true]environ2"}
    name: setvarStep
  - script: echo '$(setvarStep.variablegroup)'

- job: environ1
  pool:
    vmImage: 'windows-latest'
  dependsOn: A
  condition: eq(dependencies.A.outputs['setvarStep.variablegroup'], 'environ1')
  variables:
  - template: environ1.yaml
  steps:
  - template: deploy-jobs.yaml

- job: environ2
  pool:
    vmImage: 'windows-latest'
  dependsOn: A
  condition: eq(dependencies.A.outputs['setvarStep.variablegroup'], 'environ2')
  variables:
  - template: environ2.yml
  steps:
  - template: deploy-jobs.yaml
The above YAML pipeline uses dependencies and conditions. The first job A outputs a variable according to the variable (e.g. $(connection)) you specify when running the pipeline. The following jobs have conditions that evaluate the output variable: if the condition is satisfied, the job executes; otherwise it is skipped.
What we decided to do was add a Powershell script step that sets the variables based on a string passed in.
- task: PowerShell@2
  inputs:
    targetType: 'filePath'
    filePath: $(Build.ArtifactStagingDirectory)\drop\Deployment\Code\Scripts\Set-DefaultValues.ps1
  displayName: Set default pipeline variables
Then we load the appropriate file and loop through the variables, setting each in turn.
param(
    [string]
    $EnvironmentName
)

$environmentValues = @{}

switch ($EnvironmentName) {
    'DEV98' { . '.\Dev98-Values.ps1' }
    'DEV99' { . '.\Dev99-Values.ps1' }
}

foreach ($keyName in $environmentValues.Keys) {
    Write-Output "##vso[task.setvariable variable=$($keyName)]$($environmentValues[$keyName])"
}
This allows us to put the environment-specific variables in a plain PowerShell hashtable file and dot-source it.
$environmentValues = @{
    currentYear = '2020';
    has_multiple_nodetypes = 'false';
    protocol = 'http';
    endpoint = 'vm-dev98.cloudapp.com';
    ... snip ...
}
So QA has an easier time maintaining the different environment files.
Hope this helps others out there.