Azure DevOps pipeline: access JSON file inputs and perform a for-each loop

I am using a Linux agent and need to iterate over JSON input objects, one per project.
I have the JSON file below, which may contain more than 200 charts. For each chart I need to run build, lint, and template steps and push to a repository. I could do this with shell/Python, but I wanted to use Azure Pipelines YAML instead.
{
  "helm": {
    "charts": [
      {
        "project1": {
          "secretName": "mysecret1",
          "setValues": ["a", "b", "c"]
        }
      },
      {
        "project2": {
          "secretName": "mysecret2",
          "setValues": ["x", "y", "z"]
        }
      }
    ]
  }
}
azure-pipelines.yaml:
trigger:
- '*'
variables:
  buildConfiguration: 'Release'
  releaseBranchName: 'dev'
stages:
- stage: 'Build'
  pool:
    name: 'linux'
  displayName: 'Build helm Projects'
  jobs:
  - job: 'buildProjects'
    displayName: 'Building all the helm projects'
    steps:
    - task: HelmInstaller@0
      displayName: install helm
      inputs:
        helmVersion: 'latest'
        installKubectl: false
    - script: chmod -R 755 $(Build.SourcesDirectory)/
      displayName: 'Set Directory permissions'
    - task: PythonScript@0
      inputs:
        scriptSource: inline
        script: |
          import argparse, json, sys
          parser = argparse.ArgumentParser()
          parser.add_argument("--filePath", help="Provide the json file path")
          args = parser.parse_args()
          with open(args.filePath, 'r') as f:
              data = json.load(f)
          data = json.dumps(data)
          print('##vso[task.setvariable variable=helmConfigs;]%s' % (data))
        arguments: --filePath $(Build.SourcesDirectory)/build/helm/helmConfig.json
        failOnStderr: true
      displayName: setting up helm configs
    - template: helmBuild.yml
      parameters:
        helmCharts: $(HELMCONFIGS)
The JSON input is saved to the HELMCONFIGS variable in Azure Pipelines. Per the Azure documentation, it is a string and cannot be converted to any other type, such as an array.
helmBuild.yml file:
parameters:
- name: helmCharts
  type: object
  default: {}
steps:
- ${{ each helm in parameters.helmCharts }}:
  - ${{ each charts in helm }}:
    - ${{ each chart in charts }}:
      - task: AzureKeyVault@1
        inputs:
          azureSubscription: 'A-B-C'
          KeyVaultName: chart.secretName
          SecretsFilter: '*'
          RunAsPreJob: true
I am not able to access chart.secretName. How can I access the secretName input?
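For what it's worth, ${{ each }} loops are expanded at compile time, so they can only iterate over parameters, not over a variable such as $(HELMCONFIGS) that is set by a script at runtime. One possible workaround, sketched below and not taken from the original post, is to keep the iteration at runtime in a script step (jq being available on the agent, and the file path, are assumptions):
steps:
- bash: |
    set -euo pipefail
    # Path taken from the question; jq on the agent is an assumption.
    config="$BUILD_SOURCESDIRECTORY/build/helm/helmConfig.json"
    # Each element of .helm.charts is an object with a single project key.
    jq -c '.helm.charts[]' "$config" | while read -r chart; do
      project=$(echo "$chart" | jq -r 'to_entries[0].key')
      secret=$(echo "$chart" | jq -r 'to_entries[0].value.secretName')
      echo "Building chart $project (Key Vault secret name: $secret)"
      # helm lint / package / push for this chart would go here
    done
  displayName: 'Iterate helm charts at runtime (sketch)'
Because each task step's inputs are fixed, a per-chart task such as AzureKeyVault@1 cannot be re-run inside a runtime loop; fetching the secrets inside the script (for example with the Azure CLI) would be one way around that.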

Related

Publish file content to service bus from CI pipeline

In my CI pipeline I am trying to publish a message to Service Bus using the "PublishToAzureServiceBus" task, and it works when the message is just hardcoded text or variables.
The problem is reading a file from the repository and publishing its contents to Service Bus.
I tried reading the file in a script and putting it into a variable, but that did not work because the variable cannot hold the large JSON file.
Is there any way to read the file directly when publishing the message to Service Bus?
Below is a sample code snippet for debugging:
trigger:
- none
pool:
  vmImage: ubuntu-latest
parameters:
- name: ProjectName
  displayName: Project Name
  type: string
  default: DevOpsDemo
- name: repoName
  displayName: repo Name
  type: string
  default: ProjectCode
- name: branchRef
  displayName: Branch Name
  type: string
  default: main
variables:
- name: jobStatus
  value: "Failed"
- name: projectFile
  value: ""
stages:
- stage: Stage1
  displayName: Stage 1
  jobs:
  - job: CheckOutRepo
    displayName: CheckOut-Repo Display
    steps:
    - script: |
        echo "Checkout for " ${{ parameters.ProjectName }} : ${{ parameters.repoName }} : ${{ parameters.branchRef }}
      name: PrintMessage
    - checkout: git://${{ parameters.ProjectName }}/${{ parameters.repoName }}@refs/heads/${{ parameters.branchRef }}
      name: Checkout
    - task: PythonScript@0
      inputs:
        scriptSource: 'inline'
        script: |
          import json
          import requests
          f = open('project-release.json')
          projectFile = json.load(f)
          print(projectFile)
          f.close()
          print("Afterclosing")
          print(projectFile)
    - script: |
        echo "Project release file" $(cat project-release.json)
      name: TestPrint
    - task: CopyFiles@2
      inputs:
        SourceFolder: 'services'
        Contents: '**'
        TargetFolder: $(Build.ArtifactStagingDirectory)
      name: CopyFiles
    - task: PublishBuildArtifacts@1
      inputs:
        PathtoPublish: $(Build.ArtifactStagingDirectory)
        ArtifactName: 'drop'
        publishLocation: 'Container'
      name: PublishArtifacts
    - bash: |
        echo "##vso[task.setvariable variable=jobStatus]Success"
      name: setVar
    - bash: |
        echo "##vso[task.setvariable variable=jobStatus;isOutput=true]$(jobStatus)"
        echo "##vso[task.setvariable variable=projectFile;isOutput=true]$(cat project-release.json)"
      name: SetStatus
      condition: always()
- stage: Stage2
  displayName: Stage 2
  condition: always()
  jobs:
  - job: Publish
    pool: server
    variables:
      jobStatus: $[ stageDependencies.Stage1.CheckOutRepo.outputs['SetStatus.jobStatus'] ]
      projectFile: $[ stageDependencies.Stage1.CheckOutRepo.outputs['SetStatus.projectFile'] ]
    steps:
    - task: PublishToAzureServiceBus@1
      inputs:
        azureSubscription: 'SBConnection'
        messageBody: |
          {
            "Status": "$(jobStatus)",
            "BuildID": "$(build.buildid)",
            "BuildNumber": "$(build.buildnumber)",
            "projectFile": $(cat project-release.json)
          }
        signPayload: false
        waitForCompletion: false
      condition: always()
I was able to solve this by using setvariable in a bash script, as shown below:
pool:
  vmImage: ubuntu-latest
stages:
- stage: Stage1
  displayName: Stage 1
  jobs:
  - job: CheckOutRepo
    displayName: CheckOut-Repo Display
    steps:
    - checkout: git://${{ parameters.ProjectName }}/${{ parameters.repoName }}@refs/heads/${{ parameters.branchRef }}
      name: Checkout
    - bash: |
        data=$(cat project-release.json)
        echo "##vso[task.setvariable variable=jobStatus;isOutput=true]$(jobStatus)"
        echo "##vso[task.setvariable variable=data;isOutput=true]"$data
      name: SetStatus
      condition: always()
- stage: Stage2
  displayName: Stage 2
  condition: always()
  jobs:
  - job: Publish
    pool: server
    variables:
      jobStatus: $[ stageDependencies.Stage1.CheckOutRepo.outputs['SetStatus.jobStatus'] ]
      projectFile: $[ stageDependencies.Stage1.CheckOutRepo.outputs['SetStatus.data'] ]
    steps:
    - task: PublishToAzureServiceBus@1
      inputs:
        azureSubscription: 'SBConnection'
        messageBody: |
          {
            "Status": "$(jobStatus)",
            "BuildID": "$(build.buildid)",
            "BuildNumber": "$(build.buildnumber)",
            "projectFile": $(projectFile)
          }
        signPayload: false
        waitForCompletion: false
      condition: always()
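One caveat with this approach (an addition, not part of the original answer): the ##vso[task.setvariable] logging command only captures a single line of output, so a multi-line JSON file has to be compacted first, for example with jq if it is available on the agent:
    - bash: |
        # Compact the JSON to one line so task.setvariable captures all of it
        # (jq availability on the agent is assumed).
        data=$(jq -c . project-release.json)
        echo "##vso[task.setvariable variable=data;isOutput=true]$data"
      name: SetStatus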

Azure Pipelines: use an output variable in a parametrized multi-stage pipeline

I am facing an issue I could not resolve.
According to this documentation, it seems possible to set a variable as the output of a job in the first stage, then use it as input to set a job variable in a second stage.
The problem I'm facing is that my stages are parametrized, so their names are not constant.
Here is my sample code:
name: mypipeline
parameters:
- name: Application
  displayName: Application to deploy the front end using front door
  type: string
- name: stage_vars
  displayName: stage VARS
  type: object
  default:
  - stage_1
  - stage_2
  - none
trigger: none
stages:
- stage: Build
  displayName: ${{ parameters.Application }} - Build.
  pool:
    name: Azure Pipelines
    vmImage: windows-2019
  jobs:
  - template: ../templates/job_BuildAndPublish.yml
    parameters:
      BuildId: $(Build.BuildId)
      importTerratest: false
      importARM: true
- ${{ each stageregion in parameters.stage_vars }}:
  - ${{ if ne(stageregion, 'none') }}:
    - template: ../templates/isolated_web/tf_plan_stage.yml
      parameters:
        BuildId: $(system.BuildId)
        Predecessors: 'Build'
    - template: ../templates/isolated_web/tf_apply_stage.yml
      parameters:
        Predecessors: '${{stageregion}}_prepare'
The build template is not relevant here, but tf_plan_stage.yml and tf_apply_stage.yml are below.
tf_plan_stage.yml:
parameters:
- name: BuildId
  type: string
  default: $(system.BuildId)
- name: stage
  type: string
  default: Deploy_prepare
- name: Predecessors
  type: string
  default: none
stages:
- stage: ${{ parameters.stage }}
  ? ${{ if and(ne(parameters.Predecessors, 'previous'), ne(parameters.Predecessors, 'none') ) }}
  : dependsOn: ${{ parameters.Predecessors }}
  displayName: ${{ parameters.TF_VAR_STAGE }} Terraform plan & publish artifact for ${{ parameters.TF_VAR_APPLICATION }}
  pool:
    name: Azure Pipelines
    vmImage: windows-2019
  jobs:
  - deployment: PreDeployTerraform
    displayName: ${{ parameters.TF_VAR_STAGE }} Terraform plan & publish artifact
    environment: PLAN_${{ parameters.TF_VAR_ENVIRONMENT }}
    timeoutInMinutes: 480
    strategy:
      runOnce:
        deploy:
          steps:
          - checkout: none
          # set the planAttempt output so it is available for the later Apply job
          - powershell: |
              echo "jobAttempt is $(System.JobAttempt)"
              echo "##vso[task.setvariable variable=planAttempt;isOutput=true;]$(System.JobAttempt)"
            name: setVarJobAttempt
          - powershell: |
              echo "variable setVarJobAttempt.planAttempt value is : $(setVarJobAttempt.planAttempt)"
            name: getplanAttemptVar
tf_apply_stage.yml:
parameters:
- name: stage
  type: string
  default: Deploy_prepare
- name: Predecessors
  type: string
  default: none
stages:
- stage: ${{ parameters.stage }}
  ? ${{ if and(ne(parameters.Predecessors, 'previous'), ne(parameters.Predecessors, 'none') ) }}
  : dependsOn: ${{ parameters.Predecessors }}
  displayName: ${{ parameters.TF_VAR_STAGE }} download artifact & Terraform apply changes to ${{ parameters.TF_VAR_APPLICATION }}
  pool:
    name: Azure Pipelines
    vmImage: windows-2019
  jobs:
  - deployment: DeployTerraform
    displayName: ${{ parameters.TF_VAR_STAGE }} download artifact & Terraform apply changes
    variables:
      # this one fails in the pipeline: "[not recognised"
      planJobAttempt: $[ stageDependencies.[parameters.Predecessors].PreDeployTerraform.outputs['setVarJobAttempt.planAttempt'] ]
      # this one runs, but with no result: planJobAttempt is the literal string "stageDependencies['parameters.Predecessors'].PreDeployTerraform.outputs['setVarJobAttempt.planAttempt']"
      # planJobAttempt: $[ stageDependencies['parameters.Predecessors'].PreDeployTerraform.outputs['setVarJobAttempt.planAttempt'] ]
    environment: ${{ parameters.TF_VAR_ENVIRONMENT }}
    timeoutInMinutes: 480
    strategy:
      runOnce:
        deploy:
          steps:
          - checkout: none
          - powershell: |
              echo "jobAttempts outputs are : $[ stageDependencies.[parameters.Predecessors].PreDeployTerraform.outputs ]"
          - powershell: |
              echo "jobAttempt is $[ stageDependencies['parameters.Predecessors'].PreDeployTerraform.outputs['setVarJobAttempt.planAttempt'] ]"
          - powershell: |
              echo "plan file for terraform is : $(pipeline_artifact_folder_download)/$(planJobAttempt)_${{ parameters.Predecessors }}/$(artefact_terraform_plan)_${{ parameters.TF_VAR_STAGE }}"
I tried different things to get my System.JobAttempt from the tf_plan_stage stage into the tf_apply_stage stage, but without any success: the variable in the last stage (tf_apply) seems unable to find the value from a "parametrized" stage name.
Is there a way to do that?
Thank you for any answers.
Okay, once again I will answer my own question :D
I finally found the way to use the parameters.Predecessors value in my variable declaration.
Second, the Microsoft documentation seems not to be up to date.
tf_apply_stage.yml should look like this:
parameters:
- name: stage
  type: string
  default: Deploy_prepare
- name: Predecessors
  type: string
  default: none
stages:
- stage: ${{ parameters.stage }}
  ? ${{ if and(ne(parameters.Predecessors, 'previous'), ne(parameters.Predecessors, 'none') ) }}
  : dependsOn: ${{ parameters.Predecessors }}
  displayName: ${{ parameters.TF_VAR_STAGE }} download artifact & Terraform apply changes to ${{ parameters.TF_VAR_APPLICATION }}
  pool:
    name: Azure Pipelines
    vmImage: windows-2019
  jobs:
  - deployment: DeployTerraform
    displayName: ${{ parameters.TF_VAR_STAGE }} download artifact & Terraform apply changes
    variables:
      # this one gives the correct value for the variable
      planJobAttempt: $[ stageDependencies.${{ parameters.Predecessors }}.PreDeployTerraform.outputs['PreDeployTerraform.setVarJobAttempt.planAttempt'] ]
    environment: ${{ parameters.TF_VAR_ENVIRONMENT }}
    timeoutInMinutes: 480
    strategy:
      runOnce:
        deploy:
          steps:
          - checkout: none
          - powershell: |
              echo "plan file for terraform is : $(pipeline_artifact_folder_download)/$(planJobAttempt)_${{ parameters.Predecessors }}/$(artefact_terraform_plan)_${{ parameters.TF_VAR_STAGE }}"
The main change is this line:
planJobAttempt: $[ stageDependencies.${{ parameters.Predecessors }}.PreDeployTerraform.outputs['PreDeployTerraform.setVarJobAttempt.planAttempt'] ]
The job name has to appear twice in the variable declaration!
Thanks to those who read my post ;) and many thanks to the post from @Kay07949, which pointed to the right answer ;)
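Stated generically (a sketch with hypothetical names, not taken verbatim from the post): the stage name is expanded with compile-time template syntax inside the runtime expression, and for a deployment job the deployment name is repeated in the output key.
    variables:
      # The predecessor stage name is expanded at compile time via ${{ }}; for a
      # deployment job the output key has the form
      # '<deploymentName>.<stepName>.<variableName>'.
      # MyDeployment, myStep and myVar are placeholder names.
      myVar: $[ stageDependencies.${{ parameters.Predecessors }}.MyDeployment.outputs['MyDeployment.myStep.myVar'] ]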

How can I pass a map variable to an Azure DevOps pipeline job?

I'm learning Azure DevOps pipelines; my first project is to create a simple vnet with a subnet using Terraform. I figured out how to pass simple key-value variables, but the problem is how to pass, for example, a list of strings or, more importantly, a Terraform map variable.
I'm using the map to create subnets in an each-key/each-value loop.
Below are the files I'm using. I'm getting a syntax error in pipeline.yaml for the VirtualNetworkAddressSpace and VirtualNetworkSubnets values.
Can you please help me with this one?
variables.tf
variable RG_Name {
  type = string
  #default = "TESTMS"
}
variable RG_Location {
  type = string
  #default = "West Europe"
}
variable VirtualNetworkName {
  type = string
  #default = "TESTSS"
}
variable VirtualNetworkAddressSpace {
  type = list(string)
  #default = ["10.0.0.0/16"]
}
variable VirtualNetworkSubnets {
  type = map
  #default = {
  #  "GatewaySubnet" = "10.0.255.0/27"
  #}
}
dev.tfvars
RG_Name = __rgNAME__
RG_Location = __rgLOCATION__
VirtualNetworkName = __VirtualNetworkName__
VirtualNetworkAddressSpace = __VirtualNetworkAddressSpace__
VirtualNetworkSubnets = __VirtualNetworkSubnets__
pipeline.yaml
resources:
  repositories:
  - repository: self
trigger:
- feature/learning
stages:
- stage: DEV
  jobs:
  - deployment: TERRAFORM
    displayName: 'Terraform deployment'
    pool:
      nvmImage: 'ubuntu-latest'
    workspace:
      clean: all
    variables:
    - name: 'rgNAME'
      value: 'skwiera-rg'
    - name: 'rgLOCATION'
      value: 'West Europe'
    - name: 'VirtualNetworkName'
      value: 'SkwieraVNET'
    - name: 'VirtualNetworkAddressSpace'
      value: ['10.0.0.0/16']
    - name: 'VirtualNetworkSubnets'
      value: {'GatewaySubnet' : '10.0.255.0/27'}
    environment: 'DEV'
    strategy:
      runOnce:
        deploy:
          steps:
          - checkout: self
          - task: qetza.replacetokens.replacetokens-task.replacetokens@3
            displayName: 'Replace Terraform variables'
            inputs:
              targetFiles: '**/*.tfvars'
              tokenPrefix: '__'
              tokenSuffix: '__'
          - task: TerraformInstaller@0
            displayName: "Install Terraform"
            inputs:
              terraformVersion: '1.0.8'
          - task: TerraformTaskV2@2
            displayName: 'Terraform Init'
            inputs:
              provider: 'azurerm'
              command: 'init'
              backendServiceArm: 'skwieralearning'
              backendAzureRmResourceGroupName: 'skwiera-learning-rg'
              backendAzureRmStorageAccountName: 'skwieralearningtfstate'
              backendAzureRmContainerName: 'tfstate'
              backendAzureRmKey: 'dev.tfstate'
          - task: TerraformTaskV2@2
            displayName: 'Terraform Validate'
            inputs:
              provider: 'azurerm'
              command: 'validate'
          - task: TerraformTaskV2@2
            displayName: "Terraform Plan"
            inputs:
              provider: 'azurerm'
              command: 'plan'
              environmentServiceNameAzureRM: 'skwieralearning'
          - task: TerraformTaskV2@2
            displayName: 'Terraform Apply'
            inputs:
              provider: 'azurerm'
              command: 'apply'
              environmentServiceNameAzureRM: 'skwieralearning'
The Azure DevOps pipeline.yaml file expects the job variable's value to be a string, but if you use:
- name: 'VirtualNetworkSubnets'
  value: {'GatewaySubnet' : '10.0.255.0/27'}
then the YAML parser sees that as a nested mapping under the value key, since YAML supports both key: value and {key: value} syntax for mappings.
You can avoid it being read as a mapping by wrapping it in quotes so that it's read as a string literal:
- name: 'VirtualNetworkSubnets'
  value: "{'GatewaySubnet' : '10.0.255.0/27'}"
Separately you can avoid the qetza.replacetokens.replacetokens-task.replacetokens#3 step and the tokenised values in dev.tfvars by prefixing the environment variables with TF_VAR_:
stages:
- stage: DEV
  jobs:
  - deployment: TERRAFORM
    displayName: 'Terraform deployment'
    pool:
      nvmImage: 'ubuntu-latest'
    workspace:
      clean: all
    variables:
    - name: 'TF_VAR_rgNAME'
      value: 'skwiera-rg'
    - name: 'TF_VAR_rgLOCATION'
      value: 'West Europe'
    - name: 'TF_VAR_VirtualNetworkName'
      value: 'SkwieraVNET'
    - name: 'TF_VAR_VirtualNetworkAddressSpace'
      value: "['10.0.0.0/16']"
    - name: 'TF_VAR_VirtualNetworkSubnets'
      value: "{'GatewaySubnet' : '10.0.255.0/27'}"

Azure DevOps Use a Variable as a Condition between Deployment Stages

I currently build all our Azure infrastructure using Terraform via Azure DevOps pipelines. This has been working well, and we have a standard pipeline which calls two templates:
- Plan stage
  - Deployment job
    - Planning template
    - Runs every time
- Apply stage
  - Deployment job
    - Apply template
    - Runs with a manual approval check
Now this works fine, but what I want is for the apply stage to run only if there are changes to make. I have found other articles on how to set a variable in the plan stage, which I can do, and it works fine.
I can call this same variable in the next stage:
variables:
  varFromPlanStage: $[stageDependencies.Plan.planning_stage.outputs['planning_stage.terraformPlanResult.terraformChanges']]
steps:
- script: echo $(varFromPlanStage)
But the problem comes when I try to use this same variable in a condition.
I found that the way you reference it is different, needing dependencies instead of stageDependencies, but no matter what I try I can't get it to work.
The pipeline looks like this.
stages:
- stage: 'Plan'
  displayName: 'Planning'
  jobs:
  - deployment: planning_stage
    displayName: 'Planning Changes'
    pool:
      vmImage: 'Ubuntu-20.04'
    environment: 'planning'
    strategy:
      runOnce:
        deploy:
          steps:
          - template: /Pipelines/10-TEST-terraform-planning-template.yml # Run the Planning Template
            parameters:
              terraform_version: ${{ parameters.terraform_version }}
              terraform_backend_service_arm: ${{ parameters.terraform_backend_service_arm }}
              terraform_backend_resource_group: ${{ parameters.terraform_backend_resource_group }}
              terraform_backend_storage_account: ${{ parameters.terraform_backend_storage_account }}
              terraform_backend_storage_container: ${{ parameters.terraform_backend_storage_container }}
              terraform_state_key: ${{ parameters.terraform_state_key }}
              git_working_directory: ${{ parameters.git_working_directory }}
# This is the Build Stage - Only do this when on the master branch (which is via a PR)
- stage: 'Apply'
  condition: and(succeeded(), eq(dependencies.Plan.planning_stage.outputs['planning_stage.terraformPlanResult.terraformChanges'], 'true'))
  variables:
    varFromPlanStage: $[stageDependencies.Plan.planning_stage.outputs['planning_stage.terraformPlanResult.terraformChanges']]
  displayName: 'Applying Changes'
  jobs:
  - deployment: applying_stage
    displayName: 'Lets Build'
    pool:
      vmImage: 'Ubuntu-20.04'
    environment: 'building'
    strategy:
      runOnce:
        deploy:
          steps:
          - script: echo $(varFromPlanStage) # Just a test
          - template: /Pipelines/20-TEST-terraform-apply-template.yml # Run the Apply Template
            parameters:
              terraform_version: ${{ parameters.terraform_version }}
              terraform_backend_service_arm: ${{ parameters.terraform_backend_service_arm }}
              terraform_backend_resource_group: ${{ parameters.terraform_backend_resource_group }}
              terraform_backend_storage_account: ${{ parameters.terraform_backend_storage_account }}
              terraform_backend_storage_container: ${{ parameters.terraform_backend_storage_container }}
              terraform_state_key: ${{ parameters.terraform_state_key }}
              git_working_directory: ${{ parameters.git_working_directory }}
The step in the planning template that exports the variable is named terraformPlanResult, and the variable is terraformChanges.
Any idea what I am doing wrong here, and why I can't use the variable in a condition but can in the steps?
Thanks!
This appears to work differently depending on whether you're setting the variable in a 'deployment' job or a regular 'job'. After some trial, error and googling I managed to get it working for both. Examples below :-)
Example passing variables from a 'deployment' job.
# Example passing variables from a 'deployment' job.
stages:
# Create some variables to pass to next stage.
- stage: 'A'
  jobs:
  - deployment: 'A1'
    pool:
      vmImage: 'windows-2019'
    environment: 'test'
    strategy:
      runOnce:
        deploy:
          steps:
          # Create a variable.
          - task: PowerShell@2
            name: foo
            displayName: 'Create a variable.'
            inputs:
              targetType: 'inline'
              script: |
                echo "##vso[task.setvariable variable=bar;isOutput=true]apple"
          # Check variable.
          - task: PowerShell@2
            displayName: 'Check a variable.'
            inputs:
              targetType: 'inline'
              script: |
                Write-Host "$env:MY_MAPPED_ENV_VAR"
            env:
              MY_MAPPED_ENV_VAR: $(foo.bar)
# Confirm condition works & variables are available for use.
- stage: 'B'
  dependsOn:
  - 'A'
  variables:
  - name: varFromStageA
    # stageDependencies.stageName.deploymentName.outputs['deploymentName.stepName.variableName']
    value: $[ stageDependencies.A.A1.outputs['A1.foo.bar'] ]
  # dependencies.stageName.outputs['deploymentName.deploymentName.stepName.variableName']
  condition: and(succeeded(), eq(dependencies.A.outputs['A1.A1.foo.bar'], 'apple'))
  jobs:
  - job: 'B1'
    pool:
      vmImage: 'windows-2019'
    steps:
    # Confirm variable has been passed between stages.
    - task: PowerShell@2
      displayName: 'Confirm var passed between stages'
      inputs:
        targetType: 'inline'
        script: |
          Write-Host "$env:MY_MAPPED_ENV_VAR"
      env:
        MY_MAPPED_ENV_VAR: $(varFromStageA)
Example passing variables from a 'job' job.
# Example passing variables from a 'job' job.
stages:
# Create some variables to pass to next stage.
- stage: 'A'
  jobs:
  - job: 'A1'
    pool:
      vmImage: 'windows-2019'
    steps:
    # Create a variable.
    - task: PowerShell@2
      name: foo
      displayName: 'Create a variable.'
      inputs:
        targetType: 'inline'
        script: |
          echo "##vso[task.setvariable variable=bar;isOutput=true]apple"
    # Check variable.
    - task: PowerShell@2
      displayName: 'Check a variable.'
      inputs:
        targetType: 'inline'
        script: |
          Write-Host "$env:MY_MAPPED_ENV_VAR"
      env:
        MY_MAPPED_ENV_VAR: $(foo.bar)
# Confirm condition works & variables are available for use.
- stage: 'B'
  dependsOn:
  - 'A'
  variables:
  - name: varFromStageA
    # stageDependencies.stageName.jobName.outputs['stepName.variableName']
    value: $[ stageDependencies.A.A1.outputs['foo.bar'] ]
  # dependencies.stageName.outputs['jobName.stepName.variableName']
  condition: and(succeeded(), eq(dependencies.A.outputs['A1.foo.bar'], 'apple'))
  jobs:
  - job: 'B1'
    pool:
      vmImage: 'windows-2019'
    steps:
    # Confirm variable has been passed between stages.
    - task: PowerShell@2
      displayName: 'Confirm var passed between stages'
      inputs:
        targetType: 'inline'
        script: |
          Write-Host "$env:MY_MAPPED_ENV_VAR"
      env:
        MY_MAPPED_ENV_VAR: $(varFromStageA)
What's working for me (after many trials and errors) is the format:
dependencies.stage_name.outputs['job_name.step_name.variable_name']
In your case, this would be:
dependencies.Plan.outputs['planning_stage.terraformPlanResult.terraformChanges']
If that doesn't work, the only unusual thing is that your planning_stage is a deployment. You might try changing it to a regular job.
Here's a pipeline that tests various ways of referencing outputs in conditions.
For me it looks like an issue in Azure DevOps (later I will prepare a minimal working example and create a bug for this), because the same syntax works correctly for regular jobs but not for deployments. Also, you can use a condition like condition: in(dependencies.A.result, 'Succeeded', 'SucceededWithIssues', 'Skipped'), but not outputs. And according to this:
"dependencies": {
"<STAGE_NAME>" : {
"result": "Succeeded|SucceededWithIssues|Skipped|Failed|Canceled",
"outputs": {
"jobName.stepName.variableName": "value"
}
},
"...": {
// another stage
}
}
that is the correct syntax.
I also checked this, but using the lifecycle hook instead of the job name didn't help me.
What is strange is that you can use output variables in a job condition if they are in the same stage.
Here is a link to the issue.
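Applying the 'deployment' example above back to the original pipeline, the condition would presumably need the deployment name repeated in the output key (an untested sketch, not a confirmed fix):
- stage: 'Apply'
  # planning_stage is a deployment, so its name appears twice in the key.
  condition: and(succeeded(), eq(dependencies.Plan.outputs['planning_stage.planning_stage.terraformPlanResult.terraformChanges'], 'true'))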

Unexpected Behavior With Azure Pipelines Variables Using Variable Groups and Templates

I have an Azure DevOps YAML pipeline that runs a Terraform deployment using the "Terraform by MS DevLabs" extension and an Azure Resource Manager service connection.
The last working state used a pipeline template YAML file; however, I had to configure a parameter within the template and pass the variable using template expression syntax.
...
...
stages:
- stage: Plan
  displayName: Terrafom Plan
  jobs:
  - job: DEV PLAN
    displayName: Plan (DEV)
    pool:
      vmImage: "ubuntu-latest"
    variables:
      az_service_connection: "MyServiceConnection"
      tf_environment: "DEV"
      tf_state_rg: "DEV"
      tz_state_location: "canadacentral"
      tf_state_stgacct_name: "mystorageaccuontname1231231"
      tf_state_container_name: "tfstate"
    steps:
    - template: templates/terraform-plan.yml
      parameters:
        az_service_connection: ${{ variables.az_service_connection }}
...
...
steps:
- task: terraformInstaller@0
  displayName: "Install Terraform $(tf_version)"
  inputs:
    terraformVersion: $(tf_version)
- task: TerraformTaskV1@0
  displayName: "Run > terraform init"
  inputs:
    command: "init"
    commandOptions: "-input=false"
    backendServiceArm: ${{ parameters.az_service_connection }}
...
...
I believe the reason this works is that the template expression syntax ${{ variables.varname }} evaluates at compile time rather than runtime. If I didn't do it this way, I'd either get the literal $(az_service_connection) passed into the backendServiceArm input, or an empty value.
With the introduction of variable groups, I'm now facing similar behavior. I suspect the variable group is evaluated after the template expression variable, which causes ${{ variables.az_service_connection }} to have an empty value. I am unsure how to get this working.
How can I use variable groups with a pipeline template that uses a service connection?
I used the $() macro syntax to pass the ARM connection to the template:
Template file:
parameters:
- name: 'instances'
  type: object
  default: {}
- name: 'server'
  type: string
  default: ''
- name: 'armConnection'
  type: string
  default: ''
steps:
- task: TerraformTaskV1@0
  inputs:
    provider: 'azurerm'
    command: 'init'
    backendServiceArm: '${{ parameters.armConnection }}'
    backendAzureRmResourceGroupName: 'TheCodeManual'
    backendAzureRmStorageAccountName: 'thecodemanual'
    backendAzureRmContainerName: 'infra'
    backendAzureRmKey: 'some-terrform'
- ${{ each instance in parameters.instances }}:
  - script: echo ${{ parameters.server }}:${{ instance }}
Main file:
trigger:
  branches:
    include:
    - master
  paths:
    include:
    - stackoverflow/09-array-parameter-for-template/*
# no PR triggers
pr: none
pool:
  vmImage: 'ubuntu-latest'
variables:
- group: my-variable-group
- name: my-passed-variable
  value: $[variables.myhello] # uses runtime expression
steps:
- template: template.yaml
  parameters:
    instances:
    - test1
    - test2
    server: $(myhello)
    armConnection: $(armConnection)
Note: the group my-variable-group contains the armConnection variable.
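For reference, the difference this question hinges on is when each syntax is evaluated; a minimal illustration (the variable name is hypothetical):
variables:
  myVar: 'hello'
steps:
- script: echo "${{ variables.myVar }}"   # template expression, expanded at pipeline compile time
- script: echo "$(myVar)"                 # macro syntax, substituted just before the task runs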
