Is there a way to loop over all outputs from stageDependencies in Azure DevOps?

I want to loop over all the outputs of dependent stages and check if any of them are set to a specific value which would indicate there was a problem. Is this possible, and if so, how?
I'm having trouble finding any relevant documentation on stageDependencies outside of using them in job conditions. I know the code below doesn't work, but I've tried a few variations on this and I'm not even seeing the stageDependencies object.
- task: PowerShell@2
  displayName: Check for Failures
  inputs:
    targetType: inline
    script: |
      foreach ($dependency in $[stageDependencies]) {
        foreach ($job in $dependency) {
          foreach ($output in $job.outputs) {
            if ("$output" -eq "Started") {
              write-host "##vso[task.logissue type=error]Task failed in a previous step"
            }
          }
        }
      }

The pipeline can output variables through the logging command, but the variables can only be strings:
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/variables?view=azure-devops&tabs=yaml%2Cbatch#set-variables-in-scripts
"All variables set by this method are treated as strings."
It is not difficult to parse such a string back into an array: just use the Split() method that comes with the string type.
Here is an example:
trigger:
- none

# 1
stages:
- stage: s1
  displayName: setvars
  jobs:
  - job: testJob
    steps:
    - task: PowerShell@2
      name: setvar
      inputs:
        targetType: 'inline'
        script: |
          # Logic here. For example, you collect the vars and put them into this format:
          $testvars = "testvar1,testvar2,testvar3"
          Write-Host "##vso[task.setvariable variable=outputvars;isOutput=true]$testvars"

# 2
- stage: s2
  displayName: getvars
  dependsOn: s1
  variables:
    vars: $[ stageDependencies.s1.testJob.outputs['setvar.outputvars'] ]
  jobs:
  - job:
    steps:
    - task: PowerShell@2
      inputs:
        targetType: 'inline'
        script: |
          $varsArr = "$(vars)".Split(',')
          foreach ($var in $varsArr)
          {
            Write-Host "$var`r`n"
            if ($var -eq 'testvar1')
            {
              Write-Host 'The value is correct.'
            }
          }
Result: (screenshot omitted)
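Coming back to the original question, a sketch of how this pattern could extend across several jobs (the job names job1 and job2 and the output variable result are illustrative): the stageDependencies object itself can't be enumerated from inside a script, so you map each dependent job's output to its own stage-level variable and check them all in one script:

- stage: check
  dependsOn: s1
  variables:
    job1Result: $[ stageDependencies.s1.job1.outputs['setvar.result'] ]
    job2Result: $[ stageDependencies.s1.job2.outputs['setvar.result'] ]
  jobs:
  - job: checkFailures
    steps:
    - task: PowerShell@2
      inputs:
        targetType: 'inline'
        script: |
          # Each mapped output is a plain string by the time it reaches the script.
          foreach ($output in @("$(job1Result)", "$(job2Result)")) {
            if ($output -eq "Started") {
              Write-Host "##vso[task.logissue type=error]Task failed in a previous step"
            }
          }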

Related

Is there a way to handle a single parameter as a build number OR source branch in an Azure pipeline?

I'm trying to allow for a single variable that can specify the build id or branch to deploy from. This means I need to type-coerce, and it's failing if I try to call ge (greater than or equal) or lt (less than) on a string. It looks like I might have to use bash commands to do the type conversion for me, so I was wondering if anyone had this handy.
Here's the steps I'm using:
- ${{ each deployment in parameters.deployments }}:
  - deployment: Deploy_${{ deployment.serviceName }}_${{ parameters.region }}
    pool:
      vmImage: ${{ parameters.vmImage }}
    displayName: 'Deploy ${{ deployment.serviceName }} ${{ parameters.region }}'
    ${{ if not(eq(parameters.kubernetesServiceEndpoint, '')) }}:
      environment: ${{ parameters.kubernetesServiceEndpoint }}
    ${{ elseif not(and(eq(parameters.azureResourceGroup, ''), eq(parameters.kubernetesCluster, ''))) }}:
      environment: ${{ parameters.environment }}
    strategy:
      runOnce:
        deploy:
          steps:
          # Disable the automatic downloading of artifacts, because we will be specifying exactly what we need
          - download: none
          - task: DownloadPipelineArtifact@2
            condition: ${{ ge(0, deployment.branchBuildId) }}
            inputs:
              source: 'specific'
              project: ${{ deployment.project }}
              pipeline: ${{ deployment.pipeline }}
              runVersion: 'latestFromBranch'
              runBranch: ${{ deployment.branchBuildId }}
              patterns: 'DeploymentData/*'
            displayName: 'Download Latest Artifacts'
          - task: DownloadPipelineArtifact@2
            condition: ${{ lt(0, deployment.branchBuildId) }}
            inputs:
              source: 'specific'
              project: ${{ deployment.project }}
              pipeline: ${{ deployment.pipeline }}
              runVersion: 'specific'
              runId: ${{ deployment.branchBuildId }}
              #runBranch: ${{ deployment.branch }}
              patterns: 'DeploymentData/*'
            displayName: 'Download Specific Artifacts (${{ deployment.buildId }})'
I want these to be mutually exclusive (run the first if pulling the latest from a specific branch, and run the second if downloading from a specific build id). By using a single parameter I can avoid another control parameter to decide which version should be used.
Two things:
1. If you handle this in a script, the compile-time syntax cannot be used there; you need the runtime syntax instead:
variables['xxx']
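For example, a minimal side-by-side sketch (the variable name someVar is illustrative):

variables:
  compileTime: ${{ variables.someVar }}  # expanded when the YAML is compiled, before any task runs
  runTime: $[ variables['someVar'] ]     # evaluated at runtime, so it can see values set during the run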
2. Variables only support passing strings, so you can compare them in a script and then use a logging command to output the result:
trigger:
- none

pool:
  vmImage: ubuntu-latest

parameters:
- name: str1
  default: 110
- name: str2
  default: 130

steps:
- task: PowerShell@2
  name: outputresult
  inputs:
    targetType: 'inline'
    script: |
      $str1 = "${{ parameters.str1 }}"
      $str2 = "${{ parameters.str2 }}"
      Write-Host $str1
      Write-Host $str2
      # PowerShell compares the strings (lexically, since they are strings)
      if ($str1 -gt $str2) {
        $result = "greater than"
      } elseif ($str1 -lt $str2) {
        $result = "less than"
      } else {
        $result = "equal"
      }
      Write-Host $result
      Write-Host "##vso[task.setvariable variable=result;isOutput=true]$result"

# Below is the handling logic
- task: PowerShell@2
  condition: eq('greater than', variables['outputresult.result'])
  inputs:
    targetType: 'inline'
    script: |
      Write-Host "greater than"
- task: PowerShell@2
  condition: eq('less than', variables['outputresult.result'])
  inputs:
    targetType: 'inline'
    script: |
      Write-Host "less than"
Result: (screenshot omitted)

Publish file content to service bus from CI pipeline

In my CI pipeline I am trying to publish a message to Service Bus using the "PublishToAzureServiceBus" task, and it works when the message is just some hardcoded text or variables.
The problem is when I try to read a file from the repository and then publish its contents to the Service Bus.
I have tried reading the file in a script and putting it into a variable, but that does not work because the variable cannot store the big JSON file.
Is there any way to read the file directly when publishing the message to the Service Bus?
Below is a sample code snippet for debugging:
trigger:
- none

pool:
  vmImage: ubuntu-latest

parameters:
- name: ProjectName
  displayName: Project Name
  type: string
  default: DevOpsDemo
- name: repoName
  displayName: repo Name
  type: string
  default: ProjectCode
- name: branchRef
  displayName: Branch Name
  type: string
  default: main

variables:
- name: jobStatus
  value: "Failed"
- name: projectFile
  value: ""

stages:
- stage: Stage1
  displayName: Stage 1
  jobs:
  - job: CheckOutRepo
    displayName: CheckOut-Repo Display
    steps:
    - script: |
        echo "Checkout for " ${{ parameters.ProjectName }} : ${{ parameters.repoName }} : ${{ parameters.branchRef }}
      name: PrintMessage
    - checkout: git://${{ parameters.ProjectName }}/${{ parameters.repoName }}@refs/heads/${{ parameters.branchRef }}
      name: Checkout
    - task: PythonScript@0
      inputs:
        scriptSource: 'inline'
        script: |
          import json
          import requests
          f = open('project-release.json')
          projectFile = json.load(f)
          print(projectFile)
          f.close()
          print("Afterclosing")
          print(projectFile)
    - script: |
        echo "Project release file" $(cat project-release.json)
      name: TestPrint
    - task: CopyFiles@2
      inputs:
        SourceFolder: 'services'
        Contents: '**'
        TargetFolder: $(Build.ArtifactStagingDirectory)
      name: CopyFiles
    - task: PublishBuildArtifacts@1
      inputs:
        PathtoPublish: $(Build.ArtifactStagingDirectory)
        ArtifactName: 'drop'
        publishLocation: 'Container'
      name: PublishArtifacts
    - bash: |
        echo "##vso[task.setvariable variable=jobStatus]Success"
      name: setVar
    - bash: |
        echo "##vso[task.setvariable variable=jobStatus;isOutput=true]$(jobStatus)"
        echo "##vso[task.setvariable variable=projectFile;isOutput=true]$(cat project-release.json)"
      name: SetStatus
      condition: always()

- stage: Stage2
  displayName: Stage 2
  condition: always()
  jobs:
  - job: Publish
    pool: server
    variables:
      jobStatus: $[ stageDependencies.Stage1.CheckOutRepo.outputs['SetStatus.jobStatus'] ]
      projectFile: $[ stageDependencies.Stage1.CheckOutRepo.outputs['SetStatus.projectFile'] ]
    steps:
    - task: PublishToAzureServiceBus@1
      inputs:
        azureSubscription: 'SBConnection'
        messageBody: |
          {
            "Status": "$(jobStatus)",
            "BuildID": "$(build.buildid)",
            "BuildNumber": "$(build.buildnumber)",
            "projectFile": $(cat project-release.json)
          }
        signPayload: false
        waitForCompletion: false
      condition: always()
I was able to solve this by using setvariable in a bash script, as below:
pool:
  vmImage: ubuntu-latest

stages:
- stage: Stage1
  displayName: Stage 1
  jobs:
  - job: CheckOutRepo
    displayName: CheckOut-Repo Display
    steps:
    - checkout: git://${{ parameters.ProjectName }}/${{ parameters.repoName }}@refs/heads/${{ parameters.branchRef }}
      name: Checkout
    - bash: |
        data=$(cat project-release.json)
        echo "##vso[task.setvariable variable=jobStatus;isOutput=true]$(jobStatus)"
        echo "##vso[task.setvariable variable=data;isOutput=true]"$data
      name: SetStatus
      condition: always()

- stage: Stage2
  displayName: Stage 2
  condition: always()
  jobs:
  - job: Publish
    pool: server
    variables:
      jobStatus: $[ stageDependencies.Stage1.CheckOutRepo.outputs['SetStatus.jobStatus'] ]
      projectFile: $[ stageDependencies.Stage1.CheckOutRepo.outputs['SetStatus.data'] ]
    steps:
    - task: PublishToAzureServiceBus@1
      inputs:
        azureSubscription: 'SBConnection'
        messageBody: |
          {
            "Status": "$(jobStatus)",
            "BuildID": "$(build.buildid)",
            "BuildNumber": "$(build.buildnumber)",
            "projectFile": $(projectFile)
          }
        signPayload: false
        waitForCompletion: false
      condition: always()
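One caveat worth adding: a ##vso logging command only reads to the end of the current line, so a multi-line JSON file can get truncated. Compacting it first, for example with jq (which I believe is preinstalled on the hosted Ubuntu images), avoids this. A sketch of that refinement:

- bash: |
    # Compact the JSON to a single line before exporting it; the
    # logging command below only captures up to the first newline.
    data=$(jq -c . project-release.json)
    echo "##vso[task.setvariable variable=data;isOutput=true]$data"
  name: SetStatus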

Azure DevOps Use a Variable as a Condition between Deployment Stages

I currently build all our Azure infrastructure using Terraform via Azure DevOps Pipelines. This has been working well, and we have a standard pipeline which calls two templates:

Plan Stage
  Deployment Job
    Planning Template
  Runs every time

Apply Stage
  Deployment Job
    Apply Template
  Runs with a manual approval check
Now this works fine, but what I want to do is only have the apply step run if there are changes to make. I have found other articles on how to set a variable in the plan stage, which I can do, and it works fine.
I can reference this same variable in the next stage:
variables:
  varFromPlanStage: $[stageDependencies.Plan.planning_stage.outputs['planning_stage.terraformPlanResult.terraformChanges']]
steps:
- script: echo $(varFromPlanStage)
But the problem comes when I try to use this same variable in a condition.
I found that the way you reference it there is different, needing dependencies instead of stageDependencies, but no matter what I try I can't get it to work.
The pipeline looks like this.
stages:
- stage: 'Plan'
  displayName: 'Planning'
  jobs:
  - deployment: planning_stage
    displayName: 'Planning Changes'
    pool:
      vmImage: 'Ubuntu-20.04'
    environment: 'planning'
    strategy:
      runOnce:
        deploy:
          steps:
          - template: /Pipelines/10-TEST-terraform-planning-template.yml # Run the Planning Template
            parameters:
              terraform_version: ${{ parameters.terraform_version }}
              terraform_backend_service_arm: ${{ parameters.terraform_backend_service_arm }}
              terraform_backend_resource_group: ${{ parameters.terraform_backend_resource_group }}
              terraform_backend_storage_account: ${{ parameters.terraform_backend_storage_account }}
              terraform_backend_storage_container: ${{ parameters.terraform_backend_storage_container }}
              terraform_state_key: ${{ parameters.terraform_state_key }}
              git_working_directory: ${{ parameters.git_working_directory }}

# This is the Build Stage - Only do this when on the master branch (which is via a PR)
- stage: 'Apply'
  condition: and(succeeded(), eq(dependencies.Plan.planning_stage.outputs['planning_stage.terraformPlanResult.terraformChanges'], 'true'))
  variables:
    varFromPlanStage: $[stageDependencies.Plan.planning_stage.outputs['planning_stage.terraformPlanResult.terraformChanges']]
  displayName: 'Applying Changes'
  jobs:
  - deployment: applying_stage
    displayName: 'Lets Build'
    pool:
      vmImage: 'Ubuntu-20.04'
    environment: 'building'
    strategy:
      runOnce:
        deploy:
          steps:
          - script: echo $(varFromPlanStage) # Just a test
          - template: /Pipelines/20-TEST-terraform-apply-template.yml # Run the Apply Template
            parameters:
              terraform_version: ${{ parameters.terraform_version }}
              terraform_backend_service_arm: ${{ parameters.terraform_backend_service_arm }}
              terraform_backend_resource_group: ${{ parameters.terraform_backend_resource_group }}
              terraform_backend_storage_account: ${{ parameters.terraform_backend_storage_account }}
              terraform_backend_storage_container: ${{ parameters.terraform_backend_storage_container }}
              terraform_state_key: ${{ parameters.terraform_state_key }}
              git_working_directory: ${{ parameters.git_working_directory }}
The part of the Planning Template that exports the variable is the step named terraformPlanResult, with the variable being terraformChanges.
Any idea what I am doing wrong here, and why I can't use the variable in a condition but I can in the steps?
Thx!
This appears to work differently depending on whether you're setting the variable in a 'deployment' job or a 'job' job. After some trial, error, and googling I managed to get it working for both. Examples below :-)
Example passing variables from a 'deployment' job.
# Example passing variables from a 'deployment' job.
stages:
# Create some variables to pass to the next stage.
- stage: 'A'
  jobs:
  - deployment: 'A1'
    pool:
      vmImage: 'windows-2019'
    environment: 'test'
    strategy:
      runOnce:
        deploy:
          steps:
          # Create a variable.
          - task: PowerShell@2
            name: foo
            displayName: 'Create a variable.'
            inputs:
              targetType: 'inline'
              script: |
                echo "##vso[task.setvariable variable=bar;isOutput=true]apple"
          # Check variable.
          - task: PowerShell@2
            displayName: 'Check a variable.'
            inputs:
              targetType: 'inline'
              script: |
                Write-Host "$env:MY_MAPPED_ENV_VAR"
            env:
              MY_MAPPED_ENV_VAR: $(foo.bar)

# Confirm condition works & variables are available for use.
- stage: 'B'
  dependsOn:
  - 'A'
  variables:
  - name: varFromStageA
    # stageDependencies.stageName.deploymentName.outputs['deploymentName.stepName.variableName']
    value: $[ stageDependencies.A.A1.outputs['A1.foo.bar'] ]
  # dependencies.stageName.outputs['deploymentName.deploymentName.stepName.variableName']
  condition: and(succeeded(), eq(dependencies.A.outputs['A1.A1.foo.bar'], 'apple'))
  jobs:
  - job: 'B1'
    pool:
      vmImage: 'windows-2019'
    steps:
    # Confirm variable has been passed between stages.
    - task: PowerShell@2
      displayName: 'Confirm var passed between stages'
      inputs:
        targetType: 'inline'
        script: |
          Write-Host "$env:MY_MAPPED_ENV_VAR"
      env:
        MY_MAPPED_ENV_VAR: $(varFromStageA)
Example passing variables from a 'job' job.
# Example passing variables from a 'job' job.
stages:
# Create some variables to pass to the next stage.
- stage: 'A'
  jobs:
  - job: 'A1'
    pool:
      vmImage: 'windows-2019'
    steps:
    # Create a variable.
    - task: PowerShell@2
      name: foo
      displayName: 'Create a variable.'
      inputs:
        targetType: 'inline'
        script: |
          echo "##vso[task.setvariable variable=bar;isOutput=true]apple"
    # Check variable.
    - task: PowerShell@2
      displayName: 'Check a variable.'
      inputs:
        targetType: 'inline'
        script: |
          Write-Host "$env:MY_MAPPED_ENV_VAR"
      env:
        MY_MAPPED_ENV_VAR: $(foo.bar)

# Confirm condition works & variables are available for use.
- stage: 'B'
  dependsOn:
  - 'A'
  variables:
  - name: varFromStageA
    # stageDependencies.stageName.jobName.outputs['stepName.variableName']
    value: $[ stageDependencies.A.A1.outputs['foo.bar'] ]
  # dependencies.stageName.outputs['jobName.stepName.variableName']
  condition: and(succeeded(), eq(dependencies.A.outputs['A1.foo.bar'], 'apple'))
  jobs:
  - job: 'B1'
    pool:
      vmImage: 'windows-2019'
    steps:
    # Confirm variable has been passed between stages.
    - task: PowerShell@2
      displayName: 'Confirm var passed between stages'
      inputs:
        targetType: 'inline'
        script: |
          Write-Host "$env:MY_MAPPED_ENV_VAR"
      env:
        MY_MAPPED_ENV_VAR: $(varFromStageA)
What's working for me (after much trial and error) is the format:
dependencies.stage_name.outputs['job_name.step_name.variable_name']
In your case, this would be:
dependencies.Plan.outputs['planning_stage.terraformPlanResult.terraformChanges']
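Applied to the stage condition from the question, that would look something like this (a sketch, assuming the format above holds for your deployment job):

- stage: 'Apply'
  condition: and(succeeded(), eq(dependencies.Plan.outputs['planning_stage.terraformPlanResult.terraformChanges'], 'true'))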
If that doesn't work, the only unusual thing is that your planning_stage is a deployment. You might try and change it to a regular job.
Here's a pipeline that tests various ways of referencing outputs in conditions.
For me it looks like an issue on Azure DevOps (later I will prepare a minimal working example and create a bug for it), because the same syntax works correctly for regular jobs but not for deployments. You can also use a condition like condition: in(dependencies.A.result, 'Succeeded', 'SucceededWithIssues', 'Skipped'), but not outputs. And according to this
"dependencies": {
"<STAGE_NAME>" : {
"result": "Succeeded|SucceededWithIssues|Skipped|Failed|Canceled",
"outputs": {
"jobName.stepName.variableName": "value"
}
},
"...": {
// another stage
}
}
it is the correct syntax.
I also checked this, but using the lifecycle hook instead of the job name didn't help me.
What is strange is that you can use output variables in a job condition if they are in the same stage.
Here is a link to the issue.
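To illustrate that last point, a minimal sketch (the job, step, and variable names are illustrative) of using an output variable in a job condition within the same stage:

jobs:
- job: A
  steps:
  - script: echo "##vso[task.setvariable variable=flag;isOutput=true]yes"
    name: setFlag
- job: B
  dependsOn: A
  # Sibling-job outputs within a stage are addressed via dependencies.<jobName>
  condition: eq(dependencies.A.outputs['setFlag.flag'], 'yes')
  steps:
  - script: echo "flag was set"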

IF statement condition within an Azure DevOps pipeline

EDIT
I have the pipeline below and want it to run an inline script based on the time of day.
The question should have been about pipelines rather than an ARM template.
schedules:
- cron: "0 10 * * *"
  displayName: Test 1
  branches:
    include:
    - master
  always: true
- cron: "0 21 * * *"
  displayName: Test 2
  branches:
    include:
    - master
  always: true

steps:
- ${{ if eq(schedules.cron, '0 10 * * *') }}:
  - task: AzureCLI@2
    name: RunProcess
    displayName: Run test 1
    inputs:
      azureSubscription: serviceConnection
      scriptLocation: 'inlineScript'
      scriptType: bash
      failOnStandardError: true
      inlineScript: |
        echo 'starting process 1'
I was able to do this by changing the script type to PowerShell. The way I did it can be found below:
steps:
- task: PowerShell@2
  name: taskname
  displayName: task display name
  inputs:
    azureSubscription: $(subnamee)
    scriptLocation: 'inlineScript'
    failOnStandardError: true
    targetType: 'inline'
    script: |
      $h = (Get-Date).Hour
      if ($h -eq 10)
      {
        echo 'command 1'
      }
      if ($h -eq 21)
      {
        echo 'command 2'
      }

Return an object variable from a script - Azure YAML pipelines

Consider the following simplified pipeline:
### template.yml ###
parameters:
- name: "tables"
  type: object
  default: {}

steps:
- ${{ each table in parameters.tables }}:
  - task: BackupTask@0
    displayName: "Backup ${{ table }}"

### pipeline.yml ###
- template: template.yml
  parameters:
    tables:
    - "table1"
    - "table2"
    - "table3"
    - "table4"
    - "table5"
What I would like is for the list of tables to be generated with a bash script instead of having to write them by hand, so that every time a new table is created it automatically gets backed up by the pipeline.
As a workaround, we can create another pipeline with two PowerShell tasks. In the first task, we set a variable with the tables as its value.
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: 'Write-Host "##vso[task.setvariable variable=list]table1,table2,table3"'
In the second task, we use the REST API to trigger the pipeline.yml pipeline. In the request body, we use the variable set in the first task as the value of the template parameter.
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      $token = "PAT"
      $url = "https://dev.azure.com/{org}/{pro}/_apis/pipelines/{pipelineId}/runs?api-version=5.1-preview"
      $token = [System.Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes(":$($token)"))
      $JSON = @'
      {
        "templateParameters": {
          "tab": "[$(list)]"
        }
      }
      '@
      $response = Invoke-RestMethod -Uri $url -Headers @{Authorization = "Basic $token"} -Method Post -Body $JSON -ContentType application/json
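Note, as an assumption about why this works (the sample does succeed): $(list) is macro-expanded by the agent before the script runs, so the request body actually posts "tab": "[table1,table2,table3]", and the Runs API then appears to coerce that string into the object parameter consumed by the template.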
Below is my test sample:
### pipeline.yml ###
parameters:
- name: tab
  type: object
  default: {}

pool:
  vmImage: 'ubuntu-latest'

steps:
- template: template1.yml
  parameters:
    tables: ${{ parameters.tab }}
template1.yml:
### template1.yml ###
parameters:
- name: "tables"
  type: object
  default: {}

steps:
- ${{ each table in parameters.tables }}:
  - task: PowerShell@2
    inputs:
      targetType: 'inline'
      script: echo "${{ table }}"
Then we run the newly created pipeline to trigger the pipeline.yml pipeline and get the result (screenshot omitted).
