Azure Pipeline Loops

I've spent quite a bit of time on this now and can't seem to get it right, so I need to ask for advice.
I've got a GitHub repo which contains a series of ZIP files. For each ZIP file I want to execute a template in DevOps; there could be 1, there could be 10. For each file I want to perform the same checks.
I wrote a PowerShell task that scans the repo and gathers the list of ZIP files. I spent considerable time attempting to use output variables and variables, but it seems that you can't leverage an array at all using those in Azure Pipelines.
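For context, the scan step in question might look roughly like the sketch below; the task name scanZips and the output variable allfiles are illustrative, not from the original pipeline.

- task: PowerShell@2
  name: scanZips
  displayName: 'Gather ZIP files'
  inputs:
    targetType: inline
    script: |
      # Collect every .zip in the repo and emit the list as a single JSON-string
      # output variable (an actual array cannot be set as an output variable).
      $zips = Get-ChildItem -Path "$(Build.SourcesDirectory)" -Filter '*.zip' -Recurse
      $json = ConvertTo-Json -InputObject @($zips.FullName) -Compress
      Write-Host "##vso[task.setvariable variable=allfiles;isOutput=true]$json"

The ${{ each }} attempt below then tries to consume the pipeline's variables directly: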
steps:
- ${{ each var in variables }}:
  - task: PowerShell@2
    displayName: 'Checking variables ${{ var.key }}'
    inputs:
      targetType: Inline
      script: |
        Write-Host ${{ var.name }}
        Write-Host ${{ var.key }}
        Write-Host ${{ var.value }}
        Write-Host "${{ convertToJson(var) }}"
I was able to loop through all existing variables in the pipeline, but it appears that variables set by tasks are not added to this list?
Write-Host "##vso[task.setvariable variable=filePath;isOutput=true]$YamlVar.FullName"
Using the above task.setvariable I can set and access a variable but it's a single string, obviously to loop through a template a number of times and perform the same steps it's got to work with the ${{ each var in variables }}: structure.
I also managed to write a variables.yml and add it to an artifact, but I can't seem to find a way to access my new artifact to potentially leverage it.
- job: B
  dependsOn: A1
  pool:
    vmImage: windows-latest
  variables:
  - template: ./variables.yml
  steps:
  - ${{ each var in variables }}:
    - task: PowerShell@2
      displayName: 'Checking variables ${{ var.key }}'
      inputs:
        targetType: Inline
        script: |
          Write-Host ${{ var.name }}
          Write-Host ${{ var.key }}
          Write-Host ${{ var.value }}
          Write-Host "${{ convertToJson(var) }}"
UPDATE
This is the closest answer so far, but it seems that when using
- ${{ each var in variables }}:
the variable or parameter is not processed, so the macro syntax $(allfiles) is interpreted literally (as-is) instead of being expanded and leveraged.
https://stackoverflow.com/a/59451690/16318957
If I could somehow just have my .yml artifact added to the build and then reference it, that would work, but it appears I can't reference a template in a step when it only exists after a download stage has executed.
This would allow me to create a template file with the parameters and variables hard-coded, then publish the file and use that artifact to drive the right number of steps.
When attempting to download my artifacts, I'm not getting any files.
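For reference, a minimal publish/download round trip between jobs might look like the sketch below (the artifact name generated-vars is illustrative). Note that even when the file is downloaded at runtime, a - template: reference is still resolved at compile time, which is why the downloaded .yml can't be consumed as a template.

# In job A1: publish the generated variables.yml
- task: PublishPipelineArtifact@1
  inputs:
    targetPath: '$(Build.ArtifactStagingDirectory)/variables.yml'
    artifact: 'generated-vars'

# In job B: download it again
- task: DownloadPipelineArtifact@2
  inputs:
    artifact: 'generated-vars'
    path: '$(Pipeline.Workspace)/generated-vars'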
UPDATE
I've solved this one: I essentially passed the object array into the template by using the DevOps API to call a separate pipeline. I run a PowerShell script, passing in the required variables.
- task: PowerShell@2
  inputs:
    filePath: './PowerShell/deployment.ps1'
    arguments: '-variable "TEST" -step "TEST_DEPLOYMENT" -env "TEST" -pat "$(pat)"'
Then adjust the JSON variable and call the pipeline from the API with your array list. Your JSON needs to pass the objects as JSON text:
$JSON = @'
{
    "variables": {
        "file1": {
            "isSecret": false,
            "value": "test11"
        },
        "file2": {
            "isSecret": false,
            "value": "test21"
        }
    },
    "templateParameters": {
        <zipfilefilenameshere>
    }
}
'@
Then trigger the pipeline with the API, passing your object array.
$uriPipeline = "https://dev.azure.com/$($org)/$($proj)/_apis/pipelines/$($id)/runs?api-version=6.0-preview.1"
When your object is sent, it needs to be structured as JSON text.
Example:
"[\"file1.zip\",\"file2.zip\"]"

I managed to get this to work by creating a separate pipeline, then calling that pipeline from the original and passing in my object array.
I'm using this API call to make it work:
https://dev.azure.com/$($org)/$($proj)/_apis/pipelines/$($id)/runs?api-version=6.0-preview.1
Some examples are in the update at the end of the original question.

Related

Azure Pipeline: Update variables according to parameter

I have some very simple variables which I would like to change according to the environment.
I have written the code below in very different ways (including indentation), but none was fruitful. The alternatives I see are:
Use variable groups (trying to avoid having too many of them)
Write a bash script which updates the variables (that will work, but I think it's not a super neat solution)
variables:
  - group: secrets
  - name: hello
    value: world
  ${{ if eq(parameters.environment, 'dev') }}:
    - name: RabbitMQ_replicaCount
      value: 3
  ${{ if eq(parameters.environment, 'test') }}:
    RabbitMQ_replicaCount: '1'
Any other ideas will be appreciated :)
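For reference, the compile-time conditional insertion the question is attempting does work when the environment arrives as a template parameter and each condition inserts a complete list item; a minimal sketch (the parameter name and values are assumed):

parameters:
- name: environment
  type: string
  default: dev

variables:
- group: secrets
- name: hello
  value: world
- ${{ if eq(parameters.environment, 'dev') }}:
  - name: RabbitMQ_replicaCount
    value: 3
- ${{ if eq(parameters.environment, 'test') }}:
  - name: RabbitMQ_replicaCount
    value: 1

This only works because parameters are available at compile time; the script-based approach below is the alternative when the value is only known at runtime.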
I would rather go with a PowerShell/Bash script for this task. Why? The part of the build where manipulation is needed, like setting or overriding a variable based on branch or environment, can be handled more cleanly in a script than in the build YAML itself. It also avoids unnecessarily elongating the YAML.
Step 1: Define a variable in the build pipeline with the default environment name, and perhaps another variable whose value you want to set based on a condition.
Step 2: Add a .yml file (let's name it BuildEnv.yml) to your repo which contains your PowerShell/Bash code:
steps:
- powershell: |
    if ($BuildEnv -ne "Test") {
      Write-Host "##vso[task.setvariable variable=BuildEnv]Dev"
      Write-Host "##vso[task.setvariable variable=RabbitMQ_replicaCount]11"
    }
  displayName: 'Override Build Env'
# MORE CODE HERE
Step 3: Plug your .yml into the build pipeline as a template:
trigger:
  branches:
    include:
    - master

name: $(date:yyyy-MM-dd_HH.mm)_$(rev:.r)

stages:
- stage: Build_Stage
  displayName: Build_Stage
  jobs:
  - job: Build_Job
    pool:
      name: ABC
    steps:
    - template: ..\BuildEnv.yml
    # REST CODE
That's it. You are done.
Reference: Template usage in Azure DevOps builds - https://learn.microsoft.com/en-us/azure/devops/pipelines/process/templates?view=azure-devops

How to loop through user-defined variables in a YAML pipeline?

I am trying to loop through user-defined variables in an Azure DevOps YAML pipeline.
The variables have been created through the UI. Below is the YAML pipeline code that I'm using:
trigger:
- dev
- main

pr:
- dev

pool:
  vmImage: ubuntu-latest

stages:
- stage:
  jobs:
  - job: TestVars
    steps:
    - ${{ each var in variables }}:
      - script: |
          echo ${{ var.key }}
          echo ${{ var.value }}
        displayName: ${{ var.key }}
When running the above pipeline only system and build variables are listed (e.g. system, system.hostType, build.queuedBy, etc.).
Any help to loop through user-defined variables would be much appreciated.
Unfortunately, no luck fetching the variables defined in the UI. However, if your variables are non-secret, you can bring them over into the YAML, and they will show up in the loop:
- stage:
  variables:
    myyamlvar: 1000 # this will show up in the loop
  jobs:
  - job: TestVars
    steps:
    - ${{ each var in variables }}:
      - script: |
          echo ${{ var.key }}
          echo ${{ var.value }}
        displayName: ${{ var.key }}
Alternatively, instead of using a compile time expression, you can list variables using a runtime construct, for example:
- job: TestRuntimeVars
  steps:
  - script: |
      for var in $(compgen -e); do
        echo $var ${!var};
      done
This will list all variables including ones defined in the UI.
From the Microsoft docs link you provided, it specifies that:
"Unlike a normal variable, they are not automatically decrypted into
environment variables for scripts. You need to explicitly map secret
variables."
However, one workaround could potentially be to run an Azure CLI task and get the pipeline variables using az pipelines variable list.
If your intention is to get the actual values, though, that may not suffice. Having said that, you should consider a variable group even if you're not using it in other pipelines, since the group can be linked to an Azure Key Vault and map the secrets as variables. You can store your sensitive values in a Key Vault and link it to the variable group, which can then be used like regular variables in your pipeline.
Or you can access Key Vault secrets right from the AzureKeyVault pipeline task.
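A rough sketch of that last option (the service connection and vault names are placeholders), assuming the connection already has access to the vault:

steps:
- task: AzureKeyVault@2
  inputs:
    azureSubscription: 'my-service-connection'   # placeholder service connection name
    KeyVaultName: 'my-keyvault'                  # placeholder vault name
    SecretsFilter: '*'                           # or a comma-separated list of secret names
    RunAsPreJob: false
# Fetched secrets then become pipeline variables, e.g. $(MySecretName)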
To expand on the answer below: it is a bit roundabout, but you can use the Azure DevOps CLI. This may be a bit overkill, but it does do the job.
trigger:
- main

pool:
  vmImage: ubuntu-latest

steps:
- bash: az --version
  displayName: 'Show Azure CLI version'

- bash: az devops configure --defaults organization=$(System.TeamFoundationCollectionUri) project=$(System.TeamProject) --use-git-aliases true
  displayName: 'Set default Azure DevOps organization and project'

- bash: |
    az pipelines variable list --pipeline-id $(System.DefinitionId)
  displayName: 'Show build list variables'
  env:
    AZURE_DEVOPS_EXT_PAT: $(System.AccessToken)
This approach was taken from:
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/variables?view=azure-devops&tabs=yaml%2Cbatch#list-variables
If the agent is self-hosted, you may need to install the Azure DevOps CLI extension.
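On a self-hosted agent that already has the Azure CLI, adding the extension is typically a single command (a sketch; run it on the agent or as an extra pipeline step):

- bash: az extension add --name azure-devops
  displayName: 'Install Azure DevOps CLI extension'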

How can I use a variable created inside the pipeline (from a gradle.properties file) within a template as a task

I have a Java project which has a gradle.properties file. I'm extracting variables defined in gradle.properties as:
##vso[task.setvariable variable=myVariable;]`my script to extract it from gradle.properties`
Then I'm using a template from another repository that needs that variable, but I can't use it within a task; when I try to use it within - script: echo $variable as a step instead of a task, it works.
When I try to use it within a task, it sees the variable as $variable, not the value.
Maybe there is a better way to extract variables into the Azure pipeline instead of using this approach?
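For context, a minimal sketch of that kind of extraction step (the property name appVersion is a made-up example, not from the original project):

steps:
- bash: |
    # Read one property from gradle.properties and expose it as a pipeline variable.
    value=$(grep '^appVersion=' gradle.properties | cut -d'=' -f2-)
    echo "##vso[task.setvariable variable=myVariable]$value"
  displayName: 'Extract variable from gradle.properties'

The limitation described below still applies: a variable set this way exists only at runtime, so a template expression ${{ }} can never see it.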
Check the error message:
We get the error before the pipeline runs the bash task. Since the variable parampass has not been created at that point, the parameter's value is the literal $(parampass) instead of the variable's value.
Check this doc:
In a pipeline, template expression variables ${{ variables.var }} get processed at compile time, before runtime starts. Macro syntax variables $(var) get processed during runtime before a task runs. Runtime expressions $[variables.var] also get processed during runtime but were designed for use with conditions and expressions.
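To make those three syntaxes concrete, a small illustrative snippet (the variable name var is arbitrary):

variables:
  var: hello
  varAtRuntime: $[ variables.var ]   # runtime expression, evaluated at run time

steps:
- script: |
    echo "${{ variables.var }}"   # template expression, expanded at compile time
    echo "$(var)"                 # macro syntax, expanded just before the task runs
    echo "$(varAtRuntime)"
  displayName: 'Expression syntax demo'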
As a workaround:
pipeline.yml
pool:
  vmImage: ubuntu-20.04

resources:
  repositories:
  - repository: common
    type: git
    name: organisation/repo-name

variables:
- name: parampass
  value: xxx

stages:
- stage: "Build"
  jobs:
  - job: "Build"
    steps:
    - template: templatename.yml#common
      parameters:
        par1: ${{ variables.parampass }}
Result:
Probably you do not provide the variable to your template.
Example execution of a template with a provided parameter:
- template: my/path/to/myTemplate.yml#MyAnotherRepositoryResourceName
  parameters:
    projectsToBeTested: $(someVariable)
And an example template accepting the parameter:
steps:
- task: DotNetCoreCLI@2
  displayName: 'Just testing'
  inputs:
    command: test
    projects: ${{ parameters.projectsToBeTested }}
Please provide more information if it does not help.
Code looks like this:
pipeline.yml
pool:
  vmImage: ubuntu-20.04

resources:
  repositories:
  - repository: common
    type: git
    name: organisation/repo-name

stages:
- stage: "Build"
  jobs:
  - job: "Build"
    steps:
    - bash: |
        echo "##vso[task.setvariable variable=parampass]anything"
    - template: templatename.yml#common
      parameters:
        par1: $(parampass)
templatename.yml
parameters:
- name: par1

steps:
- task: SonarCloudPrepare@1
  displayName: SonarCloud analysis prepare
  inputs:
    SonarCloud: ${{ parameters.par1 }}
    organization: 'orgname'
    scannerMode: 'Other'
    extraProperties: |
      # Additional properties that will be passed to the scanner,
      # Put one key=value per line, example:
      # sonar.exclusions=**/*.bin
      sonar.projectKey= # same param pass case
      sonar.projectName= # same param pass case
Generally, it does not matter whether I pass the parameters or use the template as if it were part of the pipeline code itself; the output is always something like "$(parampass) could not be found".

How to send different parameters to Azure pipeline templates based on the System.PullRequest.TargetBranch variable

I have a pipeline template that should receive a different input based on the Pull Request Target Branch.
Template:
parameters:
- name: BUILD_FOLDER
  type: string

steps:
- script: |
    echo "Build folder: ${{ parameters.BUILD_FOLDER }}"
  displayName: 'Echo Build folder'
Pipeline YAML.
trigger: none

pr:
  branches:
    include:
    - '*'

pool:
  vmImage: ubuntu-latest

steps:
- template: templates/template.yml
  parameters:
    ${{ if contains(variables['System.PullRequest.TargetBranch'], 'master') }}:
      BUILD_FOLDER: base
    ${{ if not(contains(variables['System.PullRequest.TargetBranch'], 'master')) }}:
      BUILD_FOLDER: $(System.PullRequest.TargetBranch)
I tried doing it like this, but it always goes with ${{ if not(contains(variables['System.PullRequest.TargetBranch'], 'master')) }}: even when the target branch of the Pull Request is master. Is there another way to do this?
Also the repository is in GitHub.
Thanks in advance.
I can reproduce the same issue. It is because the predefined variable System.PullRequest.TargetBranch cannot be evaluated at compile time. It can be evaluated at run time (wrapped in $()) without any problem. An expression wrapped in ${{ }} is evaluated at compile time. See Runtime expression syntax.
The variables that are marked as not available in templates in the documentation cannot be parsed at compile time. The documentation causes a little confusion, because it does not clearly state that those variables cannot be parsed at compile time.
Since the template is evaluated at compile time, the variable System.PullRequest.TargetBranch wrapped in ${{ }} is evaluated to an empty string.
I tested with the YAML below, and the PowerShell task got executed:
- ${{ if eq(variables['System.PullRequest.TargetBranch'], '') }}:
  - powershell: echo "i will output empty"
The workaround for this is to set the variable value by script in an additional PowerShell task, as mentioned by Krzysztof Madej. I changed your YAML file a little bit. See below:
trigger: none

pr:
  branches:
    include:
    - '*'

pool:
  vmImage: ubuntu-latest

steps:
- powershell: |
    $targetBranch = "$(System.PullRequest.TargetBranch)"
    if($targetBranch -eq "master"){
      Write-Host "##vso[task.setvariable variable=BUILD_FOLDER;]base"
    }
    if($targetBranch -ne "master"){
      Write-Host "##vso[task.setvariable variable=BUILD_FOLDER;]$targetBranch"
    }
- template: templates/template.yml
  parameters:
    BUILD_FOLDER: $(BUILD_FOLDER)
Some variables can't be used in expressions like this. This is shown in the last column, "Available in templates?", on this page.
What you can do is move the logic into the template itself, like this:
parameters:
- name: BASE_BUILD_FOLDER
  type: string
  default: base

steps:
- pwsh: |
    $targetBranch = '$(System.PullRequest.TargetBranch)'
    $buildFolder = '${{ parameters.BASE_BUILD_FOLDER }}'
    if (!($targetBranch -like "*master")){
      $buildFolder = $targetBranch
    }
    Write-Host "##vso[task.setvariable variable=BUILD_FOLDER;]$buildFolder"
- script: |
    echo "Build folder: $(BUILD_FOLDER)"
  displayName: 'Echo Build folder'
and then you can call it
trigger: none

pr:
  branches:
    include:
    - '*'

pool:
  vmImage: ubuntu-latest

steps:
- template: templates/template.yml

How to pass environment specific values to Azure pipeline?

I am deploying Service Fabric application packages and I have several (~15) dev/test environments, any one of which can be used to test a code fix. I can pass in the Service Connection, so deploying the final package is not the issue. What I can't figure out is how to set the other environment-specific variables based on the target environment.
I tried using the Service Connection name to pick one of several variable template files:
variables:
- name: envTemplateFileTest
  ${{ if eq( variables['DevConnection'], 'Environ01' ) }}:
    value: ../Templates/DEV01-Variables-Template.yml
  ${{ if eq( variables['DevConnection'], 'Environ02' ) }}:
    value: ../Templates/DEV02-Variables-Template.yml
... (snip) ...
variables:
- template: ${{ variables.envTemplateFile }}
But UI variables are not set at compile time, so the template expressions see blank values and fail.
I could use a pipeline variable, but then QA would have to make a file change and check it in each time they want to deploy to a different environment than last time.
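For completeness, a compile-time alternative is a runtime parameter with a fixed list of values: parameters are resolved at compile time, so QA could pick the environment from the run panel without a file change. A rough sketch, reusing the template file names from above (the parameter name targetEnv is an assumption):

parameters:
- name: targetEnv
  displayName: 'Target environment'
  type: string
  default: DEV01
  values:
  - DEV01
  - DEV02

variables:
- template: ../Templates/${{ parameters.targetEnv }}-Variables-Template.yml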
What I currently have is an empty variable template and a PowerShell script that sets the values based on different script names.
- task: PowerShell@2
  inputs:
    targetType: 'filePath'
    filePath: '$(Build.ArtifactStagingDirectory)\drop\Deployment\Code\Scripts\Set-$(DevConnection)Variables.ps1'
    #arguments: # Optional
  displayName: Set environment variables
There has got to be a better way than this. Please.
There is not a direct way to achieve this, as template expressions are parsed at compile time.
However, I have a workaround which needs no additional PowerShell script and avoids making a file change and checking it in to your repo each time.
Since all your dev/test environments have the same deployment steps, you can create a steps template YAML to hold the deployment steps.
Then you can modify your azure-pipelines.yml like the example below:
jobs:
- job: A
  pool:
    vmImage: 'windows-latest'
  steps:
  - powershell: |
      $con = "$(connection)"
      if($con -eq "environ1"){echo "##vso[task.setvariable variable=variablegroup;isOutput=true]environ1"}
      if($con -eq "environ2"){echo "##vso[task.setvariable variable=variablegroup;isOutput=true]environ2"}
    name: setvarStep
  - script: echo '$(setvarStep.variablegroup)'

- job: environ1
  pool:
    vmImage: 'windows-latest'
  dependsOn: A
  condition: eq(dependencies.A.outputs['setvarStep.variablegroup'], 'environ1')
  variables:
  - template: environ1.yaml
  steps:
  - template: deploy-jobs.yaml

- job: environ2
  pool:
    vmImage: 'windows-latest'
  dependsOn: A
  condition: eq(dependencies.A.outputs['setvarStep.variablegroup'], 'environ2')
  variables:
  - template: environ2.yml
  steps:
  - template: deploy-jobs.yaml
The above YAML pipeline uses dependencies and conditions. The first job, A, outputs a variable according to the variable (e.g. $(connection)) you specify when running the pipeline. In the following jobs, conditions evaluate that output variable: if a condition is satisfied the job executes, and the job is skipped if its condition fails.
What we decided to do was add a PowerShell script step that sets the variables based on a string passed in.
- task: PowerShell@2
  inputs:
    targetType: 'filePath'
    filePath: $(Build.ArtifactStagingDirectory)\drop\Deployment\Code\Scripts\Set-DefaultValues.ps1
  displayName: Set default pipeline variables
Then we load the appropriate file and loop through the variables, setting each in turn.
param(
    [string]
    $EnvironmentName
)

$environmentValues = @{}

switch ($EnvironmentName) {
    'DEV98' { . '.\Dev98-Values.ps1' }
    'DEV99' { . '.\Dev99-Values.ps1' }
}

foreach ($keyName in $environmentValues.Keys) {
    Write-Output "##vso[task.setvariable variable=$($keyName)]$($environmentValues[$keyName])"
}
This allows us to put the environment-specific variables in a plain PowerShell hashtable file and dot-source it.
$environmentValues = @{
    currentYear = '2020';
    has_multiple_nodetypes = 'false';
    protocol = 'http';
    endpoint = 'vm-dev98.cloudapp.com';
    ... snip ...
}
So QA has an easier time maintaining the different environment files.
Hope this helps others out there.
