I found a lot of examples of using an each loop in an Azure pipeline, but all the ones I found use a parameter as the array.
What about using an array that was created in the code?
I mean:
- script: COMMAND=$(npx nx affected:apps --base=$(BASE_SHA) --head=$(HEAD_SHA) --plain) && echo "##vso[task.setvariable variable=APPLICATIONS;]$COMMAND"
- task: Bash@3
  inputs:
    targetType: 'inline'
    script: |
      echo 'APPLICATIONS + $(APPLICATIONS)'
      readarray -d ' ' -t ARRAYAPPS <<<'$(APPLICATIONS)'
      echo ${ARRAYAPPS[0]}
      echo ${ARRAYAPPS[1]}
- ${{each APPLICATION in $APPLICATIONS }}:
- task: ...
The pipeline ${{ each }} expression does not support runtime pipeline variables. It only supports parameters, because each is evaluated at compile time, and at that point only parameters (and variables derived from them) are available.
"If you're blocked in a way, try to change this way"
I modified the approach and included the loop inside the bash script:
readarray -d ' ' -t ARRAYAPPS <<<'$(APPLICATIONS)'
if (( ${#ARRAYAPPS[@]} > 0 )); then
  for (( i=0; i<${#ARRAYAPPS[@]}; i++ ));
...
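For completeness, here is a hedged sketch of that inline-loop workaround as a full step; the echo body and the per-app command are placeholders, not from the original post:
- task: Bash@3
  inputs:
    targetType: 'inline'
    script: |
      # APPLICATIONS was set earlier via ##vso[task.setvariable] as a space-separated list
      readarray -d ' ' -t ARRAYAPPS <<<'$(APPLICATIONS)'
      for APP in "${ARRAYAPPS[@]}"; do
        APP="${APP//[$'\n\r ']/}"   # strip the trailing newline readarray leaves on the last element
        [ -z "$APP" ] && continue
        echo "Processing $APP"
        # replace the echo with the real per-app command, e.g. npx nx build "$APP"
      done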
Related
I have the following CI parent pipeline, and I use artifacts to pass variables to a child pipeline:
before-script:
  script:
    - terraform --version
    - export AWS_ACCESS_KEY_ID=${AWS_ACCESS_KEY_ID}
    - export AWS_SECRET_ACCESS_KEY=${AWS_SECRET_ACCESS_KEY}
    - export TF_VAR_API_KEY=${TF_VAR_API_KEY}
    - export GIT_TOKEN=${GIT_TOKEN}

prepare-infra-code:
  stage: build
  script:
    - cd iac/
    - terraform init -force-copy -backend-config="address=https://gitlab.com/api/v4/projects/XXX/terraform/state/main" -backend-config="lock_address=https://gitlab.com/api/v4/projects/XX/terraform/state/main/lock" -backend-config="unlock_address=https://gitlab.com/api/v4/projects/XXX/terraform/state/main/lock" -backend-config="username=XXX" -backend-config="password=$GIT_TOKEN" -backend-config="lock_method=POST" -backend-config="unlock_method=DELETE" -backend-config="retry_wait_min=5"
    - terraform plan
...

deploy-prod:
  stage: deploy
  script:
    - cd iac/
    - terraform init -force-copy -backend-config="address=https://gitlab.com/api/v4/projects/XXX/terraform/state/main" -backend-config="lock_address=https://gitlab.com/api/v4/projects/XXX/terraform/state/main/lock" -backend-config="unlock_address=https://gitlab.com/api/v4/projects/XXX/terraform/state/main/lock" -backend-config="username=XXX" -backend-config="password=$GIT_TOKEN" -backend-config="lock_method=POST" -backend-config="unlock_method=DELETE" -backend-config="retry_wait_min=5"
    - terraform apply --auto-approve
    - URL=$(terraform output api_url | cut -c2- | sed 's/.$//')
    - S3=$(terraform output bucket_name | cut -c2- | sed 's/.$//')
    - echo "REACT_APP_PORTFOLIO_API=$URL/portfolio" >> deploy-prod.env
    - echo "REACT_APP_TRANSACTION_API=$URL/transaction" >> deploy-prod.env
    - echo "FRONTEND_BUCKET=$S3" >> deploy-prod.env
  artifacts:
    reports:
      - dotenv: infra.env
However, I keep getting this error:
jobs:deploy-prod:artifacts reports should be a hash
Can someone please help me identify the issue?
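(The error message itself hints at the shape GitLab expects: reports must be a hash rather than a list, and the dotenv path would also need to match the file the script actually writes, deploy-prod.env. A hedged sketch of that form:)
artifacts:
  reports:
    dotenv: deploy-prod.env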
The first stage in my pipeline checks which services have actually changed, in an effort to speed up the pipeline by avoiding rebuilding, retesting, and redeploying services when there have been no changes.
This is the changed.yaml for that stage:
parameters:
  - name: comparedTo
    default: ''

stages:
  - stage: Changed
    displayName: Check for changes in services and configs...
    jobs:
      - job: Changes
        displayName: Checking for changes in services and configs...
        steps:
          - bash: |
              mapfile -t changed < <(git diff HEAD ${{ parameters.comparedTo }} --name-only | awk -F'/' 'NF!=1{print $1}' | sort -u)
              servicesChanged=()
              configsChanged=()
              echo ""
              echo "Total Changed: ${#changed[@]}"
              for i in "${changed[@]}"
              do
                echo "$i"
                if [[ $i == 'admin' ]]; then
                  echo "##vso[task.setvariable variable=adminChanged;isOutput=true]true"
                  servicesChanged+=("admin")
                elif [[ $i == 'admin-v2' ]]; then
                  echo "##vso[task.setvariable variable=adminV2Changed;isOutput=true]true"
                  servicesChanged+=("admin-v2")
                elif [[ $i == 'api' ]]; then
                  echo "##vso[task.setvariable variable=apiChanged;isOutput=true]true"
                  servicesChanged+=("api")
                elif [[ $i == 'client' ]]; then
                  echo "##vso[task.setvariable variable=clientChanged;isOutput=true]true"
                  servicesChanged+=("client")
                elif [[ $i == 'k8s' ]]; then
                  echo "##vso[task.setvariable variable=k8sChanged;isOutput=true]true"
                  configsChanged+=("k8s")
                elif [[ $i == 'pipelines' ]]; then
                  echo "##vso[task.setvariable variable=pipelineChanged;isOutput=true]true"
                  configsChanged+=("pipelines")
                fi
              done
              echo ""
              echo "Services Changed: ${#servicesChanged[@]}"
              for i in "${servicesChanged[@]}"
              do
                echo "$i"
              done
              echo ""
              echo "Configs Changed: ${#configsChanged[@]}"
              for i in "${configsChanged[@]}"
              do
                echo "$i"
              done
              if [[ ${#servicesChanged[@]} -gt 0 ]]; then
                echo ""
                echo "Any services changed: True"
                echo "##vso[task.setvariable variable=anyServicesChanged;isOutput=true]true"
                echo "##vso[task.setvariable variable=servicesChanged;isOutput=true]${servicesChanged[@]}"
              fi
              if [[ ${#configsChanged[@]} -gt 0 ]]; then
                echo ""
                echo "Any configs changed: True"
                echo "##vso[task.setvariable variable=anyConfigsChanged;isOutput=true]true"
                echo "##vso[task.setvariable variable=configsChanged;isOutput=true]${configsChanged[@]}"
              fi
              echo ""
            name: detectChanges
As you can see, it creates a number of task output variables:
# This just indicates if the service has changed: true/false
echo "##vso[task.setvariable variable=<service-name>;isOutput=True]true"
# This should create an output variable holding the services that have changed
# (output variables are strings, so the array is flattened into a space-separated list)
echo "##vso[task.setvariable variable=servicesChanged;isOutput=true]${servicesChanged[@]}"
So I gave myself two options: just a true/false for each service, or iterating (somehow) through an array of the services that have changed.
Each stage basically has the following form:
# pr.yaml
...
- template: templates/unitTests.yaml
  parameters:
    services:
      - api
      - admin
      - admin-v2
      - client
...

# templates/unitTests.yaml
parameters:
  - name: services
    type: object
    default: []

stages:
  - stage: UnitTests
    displayName: Run unit tests on service...
    dependsOn: Changed
    condition: succeeded()
    jobs:
      - job: UnitTests
        condition: or(eq(stageDependencies.Changed.Changes.outputs['detectChanges.anyServicesChanged'], true), eq(variables['Build.Reason'], 'Manual'))
        displayName: Running unit tests...
        steps:
          - ${{ each service in parameters.services }}:
              - bash: |
                  echo "Now running ${{ service }} unit tests..."
Here is what I've tried so far and the errors I get:
Adding each service conditionally to the services array, or passing the array of changed services:
- template: templates/changed.yaml
  parameters:
    comparedTo: origin/production

- template: templates/unitTests.yaml
  dependsOn: Changed
  parameters:
    services:
      - ${{ if eq(stageDependencies.Changed.Changes.outputs['detectChanges.apiChanged'], true) }}:
          - api
      - ${{ if eq(stageDependencies.Changed.Changes.outputs['detectChanges.adminChanged'], true) }}:
          - admin
      - ${{ if eq(stageDependencies.Changed.Changes.outputs['detectChanges.adminV2Changed'], true) }}:
          - admin-v2
      - ${{ if eq(stageDependencies.Changed.Changes.outputs['detectChanges.clientChanged'], true) }}:
          - client
Or...
- template: templates/changed.yaml
  parameters:
    comparedTo: origin/production

- template: templates/unitTests.yaml
  dependsOn: Changed
  parameters:
    services:
      - ${{ if eq(dependencies.Changed.outputs['Changes.detectChanges.apiChanged'], true) }}:
          - api
      - ${{ if eq(dependencies.Changed.outputs['Changes.detectChanges.adminChanged'], true) }}:
          - admin
      - ${{ if eq(dependencies.Changed.outputs['Changes.detectChanges.adminV2Changed'], true) }}:
          - admin-v2
      - ${{ if eq(dependencies.Changed.outputs['Changes.detectChanges.clientChanged'], true) }}:
          - client
Or...
- template: templates/changed.yaml
  parameters:
    comparedTo: origin/production

- template: templates/unitTests.yaml
  dependsOn: Changed
  parameters:
    services:
      - $[ stageDependencies.Changed.Changes.outputs['detectChanges.servicesChanged'] ]
This results in:
An error occurred while loading the YAML build pipeline. Object reference not set to an instance of an object.
I know variables: will only take strings, not arrays.
One solution would be to map each true/false output into a variables: entry and then condition each step on both parameters.services and whether the corresponding output variable is true.
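A hedged sketch of what that could look like inside templates/unitTests.yaml (the variable names mirror the detectChanges outputs above; only two services are shown, and the echo steps are placeholders):
parameters:
  - name: services
    type: object
    default: []

stages:
  - stage: UnitTests
    dependsOn: Changed
    jobs:
      - job: UnitTests
        variables:
          apiChanged: $[ stageDependencies.Changed.Changes.outputs['detectChanges.apiChanged'] ]
          adminChanged: $[ stageDependencies.Changed.Changes.outputs['detectChanges.adminChanged'] ]
        steps:
          - ${{ if containsValue(parameters.services, 'api') }}:
              - bash: echo "Now running api unit tests..."
                condition: eq(variables['apiChanged'], 'true')
          - ${{ if containsValue(parameters.services, 'admin') }}:
              - bash: echo "Now running admin unit tests..."
                condition: eq(variables['adminChanged'], 'true')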
Any suggestions?
Ref:
Task Output Variables: https://learn.microsoft.com/en-us/azure/devops/pipelines/process/variables?view=azure-devops&tabs=yaml%2Cbatch#set-variables-in-scripts
Parameters: https://learn.microsoft.com/en-us/azure/devops/pipelines/process/templates?view=azure-devops#parameters
Expressions: https://learn.microsoft.com/en-us/azure/devops/pipelines/process/expressions?view=azure-devops
The template expression ${{ }} is evaluated at compile time (before the jobs run), which means it cannot access variables that are set dynamically at runtime (after the jobs start). So you cannot use the template expression ${{ }} in the above scenario. See the description below from here.
Within a template expression, you have access to the parameters context that contains the values of parameters passed in. Additionally, you have access to the variables context that contains all the variables specified in the YAML file plus many of the predefined variables (noted on each variable in that topic). Importantly, it doesn't have runtime variables such as those stored on the pipeline or given when you start a run. Template expansion happens very early in the run, so those variables aren't available
You can use conditions as a workaround: add multiple tasks and guard each one with a condition. See the example below:
- template: templates/changed.yaml
  parameters:
    comparedTo: origin/production

- template: templates/unitTests.yaml
  dependsOn: Changed

# unitTests.yaml
stages:
  - stage: UnitTests
    displayName: Run unit tests on service...
    dependsOn: Changed
    condition: succeeded()
    jobs:
      - job: UnitTests
        condition: or(eq(stageDependencies.Changed.Changes.outputs['detectChanges.anyServicesChanged'], true), eq(variables['Build.Reason'], 'Manual'))
        displayName: Running unit tests...
        variables:
          changedServices: $[ stageDependencies.Changed.Changes.outputs['detectChanges.servicesChanged'] ]
        steps:
          - bash: |
              echo "Now running api unit tests..."
            name: Api_unit_test
            condition: contains(variables['changedServices'], 'api')
          - bash: |
              echo "Now running admin unit tests..."
            name: Admin_unit_test
            condition: contains(variables['changedServices'], 'admin')
          - bash: |
              echo "Now running client unit tests..."
            name: Client_unit_test
            condition: contains(variables['changedServices'], 'client')
Another workaround is to split your pipeline into two pipelines. The first pipeline runs the Changed stage, then calls the REST API from a script task to trigger the second pipeline, passing the variables in the request body. See this similar thread.
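A hedged sketch of that second workaround, assuming the second pipeline has definition ID 123 (a placeholder) and declares a matching runtime parameter, and that this step runs in the same job as the detectChanges step so its output is available as $(detectChanges.servicesChanged):
- bash: |
    # Queue the second pipeline via the Runs REST API and pass the changed services
    curl -sS -X POST \
      -u ":$(System.AccessToken)" \
      -H "Content-Type: application/json" \
      -d '{ "templateParameters": { "services": "$(detectChanges.servicesChanged)" } }' \
      "$(System.CollectionUri)$(System.TeamProject)/_apis/pipelines/123/runs?api-version=6.0-preview.1"
  displayName: Trigger the second pipeline with the changed services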
I have enabled pipeline resource triggers between two pipelines and would like to replace the alias value dynamically based on the triggering pipeline resource. Below is the pipeline code:
resources:
  pipelines:
    - pipeline: pipeline1
      project: onecom
      source: pipeline1-api
      trigger:
        branches:
          - develop
          - feat/*
    - pipeline: pipeline2
      project: onecom
      source: pipeline2-api
      trigger:
        branches:
          - develop
          - feat

variables:
  - name: apiname
    value: $(resources.pipeline.<Alias>.pipelineName)
  - name: dockertag
    value: $(resources.pipeline.<Alias>.sourceCommit)
  - name: runname
    value: $(resources.pipeline.<Alias>.runName)

stages:
  - stage: ScanImage
    jobs:
      - job: ScanImage
        pool:
          vmImage: 'ubuntu-16.04'
        steps:
          - script: echo $(apiname)
          - script: echo $(runname)
I would like to replace the <Alias> value in $(resources.pipeline.<Alias>.pipelineName) with pipeline1 if the build comes from source: pipeline1-api, and with pipeline2 if it comes from source: pipeline2-api, dynamically.
Nested variables (like $(resources.pipeline.$(alias).pipelineName)) are not yet supported in build/release pipelines, so we cannot use them in a variable directly:
variables:
  - name: apiname
    value: $(resources.pipeline.$(alias).pipelineName)
To work around this, we need to add an inline PowerShell task that sets the variable based on the value of $(resources.triggeringAlias):
variables:
  - name: alias
    value: $(resources.triggeringAlias)

- task: InlinePowershell@1
  inputs:
    script: |
      if ("$(alias)" -eq "PipelineA")
      {
        $commit = "$(resources.pipeline.PipelineA.sourceCommit)".Substring(0, 7)
        Write-Host "##vso[task.setvariable variable=dockertag]$commit"
      }
      elseif ("$(alias)" -eq "PipelineB")
      {
        $commit = "$(resources.pipeline.PipelineB.sourceCommit)".Substring(0, 7)
        Write-Host "##vso[task.setvariable variable=dockertag]$commit"
      }
Update:
Could you please help me with the same config in bash, as we are using these
tasks on Linux machines?
- task: PowerShell@2
  displayName: 'Inline Powershell'
  inputs:
    TargetType: inline
    Script: |
      if ("$(alias)" -eq "PipelineA")
      {
        $commit = "$(resources.pipeline.PipelineA.sourceCommit)".Substring(0, 7)
        Write-Host "##vso[task.setvariable variable=dockertag]$commit"
      }
      elseif ("$(alias)" -eq "PipelineB")
      {
        $commit = "$(resources.pipeline.PipelineB.sourceCommit)".Substring(0, 7)
        Write-Host "##vso[task.setvariable variable=dockertag]$commit"
      }
    pwsh: true
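Since the comment asks for a bash version, a hedged bash sketch of the same branching (reusing the PipelineA/PipelineB placeholder aliases from above) could be:
- task: Bash@3
  inputs:
    targetType: 'inline'
    script: |
      # Pick the source commit of whichever pipeline resource triggered this run
      if [ "$(alias)" = "PipelineA" ]; then
        COMMIT="$(resources.pipeline.PipelineA.sourceCommit)"
      elif [ "$(alias)" = "PipelineB" ]; then
        COMMIT="$(resources.pipeline.PipelineB.sourceCommit)"
      fi
      # Keep only the first 7 characters, matching the cut -c -7 intent above
      echo "##vso[task.setvariable variable=dockertag]${COMMIT:0:7}"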
Please try this:
variables:
  - name: alias
    value: $(resources.triggeringAlias)
Then you can try replacing the alias as below:
$(resources.pipeline.$(alias).pipelineName)
I have a multistage YAML Azure pipeline. I tag my images based on the build ID. This works well until a job fails and I need to rerun it.
On the rerun I get a newly incremented build ID, so it no longer references the same Docker image used in the original run.
$(Build.BuildId)
Is there any way around this?
Re-running a job or stage doesn't change Build.BuildId. I checked this using the pipeline below. However, if you want to run the whole pipeline again with the same tag, you may try runtime parameters and provide a tagName of your own (like below):
parameters:
  - name: tagName
    type: string
    default: ' '

trigger: none
pr: none

pool:
  vmImage: 'ubuntu-latest'

stages:
  - stage: A
    jobs:
      - job: A
        steps:
          - pwsh: |
              $tagName = '$(Build.BuildId)'
              if('${{ parameters.tagName }}' -ne ' ') {
                $tagName = '${{ parameters.tagName }}'
              }
              echo $tagName
      - job: B
        steps:
          - bash: echo "B"
  - stage: B
    jobs:
      - job: A
        steps:
          - pwsh: |
              $tagName = '$(Build.BuildId)'
              if('${{ parameters.tagName }}' -ne ' ') {
                $tagName = '${{ parameters.tagName }}'
              }
              echo $tagName
          - bash: exit 1
      - job: B
        steps:
          - bash: echo "B"
You can expand the stage and rerun it; this won't change the BuildId.
- task: PowerShell@2
  displayName: Save Storage account Secrets to Build Variables
  inputs:
    azureSubscription:
    targetType: 'inline'
    script: '$outputs = ConvertFrom-Json $($env:STORAGE); foreach ($output in $outputs.PSObject.Properties) { echo $output.Name; echo $output.Value.value; Write-Host ("##vso[task.setvariable variable=$($output.Name);]$($output.Value.value)");}'

- phase: DEVRelease
  dependsOn: Build
  queue: Hosted Ubuntu 1604
  steps:
    - task: Kubernetes@1
      displayName: Apply Kubernetes Deployment
      inputs:
        kubernetesServiceEndpoint:
        arguments: "-f conf/deploy_local.yaml"
        command: apply
        azureSubscription:
        azureContainerRegistry:
        configMapName: myconfig
        forceUpdateConfigMap: true
        configMapArguments: --from-literal=myname=$($env:STORAGEACCOUNTNAME1)
It never reads the $env:STORAGEACCOUNTNAME variable.
Since the PowerShell task that sets the variables runs in the build phase, you need to add isOutput=true to the setvariable statement. Please check Set a multi-job output variable:
"##vso[task.setvariable variable=$($output.name);isOutput=true]$($output.Value)"
I made a few changes to your YAML for testing. Please check it out. I have the environment variable STORAGE = {'tags':[{'name':'A', 'Value':'1' }, { 'name':'B', 'Value':'2'}]}
phases:
  - phase: build
    queue: Hosted Ubuntu 1604
    steps:
      - powershell: |
          $outputs = ConvertFrom-Json $($env:STORAGE)
          foreach ($output in $outputs.tags) { echo $output.name; echo $output.Value; Write-Host ("##vso[task.setvariable variable=$($output.name);isOutput=true]$($output.Value)");}
        name: myvariables
      - powershell: |
          echo "$(myvariables.A)"
          echo "$(myvariables.B)"
  - phase: DEVRelease
    dependsOn: Build
    queue: Hosted Ubuntu 1604
    variables:
      Da: $[ dependencies.build.outputs['myvariables.A'] ]
      Db: $[ dependencies.build.outputs['myvariables.B'] ]
    steps:
      - powershell: |
          echo $(Da)
          echo $(Db)
In the above script I output the variables in the build phase by adding isOutput=true to the statement and giving my PowerShell task a name (name: myvariables).
I then refer to the output variables in the next phase, DEVRelease, using the expression $[ dependencies.{dependent phase name}.outputs['{task name}.{variable name}'] ] and assign them to variables.
Then I can successfully read the values in the PowerShell task in the DEVRelease phase.