Azure DevOps Parameters not recognised by AZ CLI

I am having issues passing parameters defined in an Azure Pipeline YAML to AZ Cli located in a bash script. I found the following solution on Stackoverflow but it doesn't seem to work:
Pass parameter to Azure CLI Task in DevOps
Here is my YAML file:
# Starter pipeline
# Start with a minimal pipeline that you can customize to build and deploy your code.
# Add steps that build, run tests, deploy, and more:
# https://aka.ms/yaml
trigger:
- master
pool:
  vmImage: 'ubuntu-latest'

steps:
- task: AzureCLI@2
  displayName: Azure CLI
  inputs:
    azureSubscription: templateresourceconnection
    scriptType: bash
    scriptPath: ./deployment.sh
    arguments:
      -resourceGrp 'TEST-RG'
      -location 'westeurope'
I would expect to be able to access the arguments in my deployment.sh, which fails:
#!/bin/bash
# Create Resource Group
az group create -n $(resourceGrp) -l $(location)
If I don't pass any arguments and hardcode the values in deployment.sh it all works fine.
Any ideas what could cause this issue? I also tried with UPPERCASE and just without brackets.
The pipeline run fails with an error message.
Do you have any idea what else I could try to make it work? The documentation doesn't seem to contain any example for az cli, just how to define parameters but not how to pass them afterwards.
Thank you.

Do you have any idea what else I could try to make it work
You could try using environment variables, which can be accessed from bash script files.
First of all, you need to define pipeline variables instead of task arguments.
variables:
- name: one
  value: initialValue
Here is my example (run on a Linux agent):
az group create -n $RESOURCEGRP -l $LOCATION
Note: variable names are upper-cased when they are mapped to environment variables.
e.g.: resourceGrp -> $RESOURCEGRP
YAML sample:
variables:
- name: resourceGrp
  value: TEST-RG
- name: location
  value: westeurope

pool:
  vmImage: 'ubuntu-latest'

steps:
- task: AzureCLI@2
  displayName: 'Azure CLI deploy.sh'
  inputs:
    azureSubscription: kevin0209
    scriptType: bash
    scriptPath: ./deployment.sh
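For completeness, a minimal deployment.sh that would pair with this sample could look like the sketch below, assuming the agent maps the non-secret pipeline variables to upper-cased environment variables as described above:

#!/bin/bash
# Sketch of deployment.sh: read the pipeline variables that the agent exposes
# as upper-cased environment variables (resourceGrp -> RESOURCEGRP, location -> LOCATION).
set -euo pipefail

echo "Creating resource group ${RESOURCEGRP} in ${LOCATION}"
az group create -n "${RESOURCEGRP}" -l "${LOCATION}"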

Related

AzurePowerShell task cannot find filePath

I'm trying to run a PowerShell script in Azure Yaml Pipelines and I'm getting this error:
##[error]The term 'D:\a\1\s\myPowershellFile.ps1' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
Code:
jobs:
- deployment: Deploy
  displayName: Deploy
  environment: $(myEnvironment)
  pool:
    vmImage: 'windows-latest'
  strategy:
    runOnce:
      deploy:
        steps:
        - task: AzurePowerShell@5
          displayName: 'Run Powershell script'
          inputs:
            azureSubscription: $(azureConnectionName)
            scriptType: filePath
            scriptPath: './myPowershellFile.ps1'
            azurePowerShellVersion: latestVersion
The file is pushed out to the repo for the branch that is triggering the build. I've also tried referencing the path explicitly with $(Pipeline.Workspace) and $(Build.SourcesDirectory). Version 4 also does not work. According to the docs this should be working!
After some more research I discovered this article which says that files are not automatically downloaded for deployment jobs. I added this step, and that fixed the issue!
- checkout: self
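For context, here is a sketch of where that step would sit in the deployment job from the question (the surrounding task is assumed to match the original pipeline):

strategy:
  runOnce:
    deploy:
      steps:
      # Deployment jobs do not check out the repository by default,
      # so fetch the sources explicitly before running the script.
      - checkout: self
      - task: AzurePowerShell@5
        displayName: 'Run Powershell script'
        inputs:
          azureSubscription: $(azureConnectionName)
          scriptType: filePath
          scriptPath: './myPowershellFile.ps1'
          azurePowerShellVersion: latestVersion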
Couldn't help noticing, but you have used the forward slash "/" as the separator for the file path instead of a backslash "\" like in the example you are pointing to in the Azure docs:
- task: AzurePowerShell@5
  inputs:
    azureSubscription: my-arm-service-connection
    scriptType: filePath
    scriptPath: $(Build.SourcesDirectory)\myscript.ps1
This looks like the reason why.
There is a similar issue here; it looks like it depends on whether you are using a Windows or Linux agent: cannot locate file in azure devops pipeline

How to loop through user-defined variables in a YAML pipeline?

I am trying to loop through user-defined variables in an Azure DevOps YAML pipeline.
The variables have been created through the UI.
Below the YAML pipeline code that I'm using:
trigger:
- dev
- main

pr:
- dev

pool:
  vmImage: ubuntu-latest

stages:
- stage:
  jobs:
  - job: TestVars
    steps:
    - ${{ each var in variables }}:
      - script: |
          echo ${{ var.key }}
          echo ${{ var.value }}
        displayName: ${{ var.key }}
When running the above pipeline only system and build variables are listed (e.g. system, system.hostType, build.queuedBy, etc.).
Any help to loop through user-defined variables would be much appreciated.
Unfortunately, there is no luck fetching the variables defined in the UI. However, if your variables are non-secret, you can bring them over into the YAML, and they will show up in the loop.
- stage:
  variables:
    myyamlvar: 1000 # this will show up in the loop
  jobs:
  - job: TestVars
    steps:
    - ${{ each var in variables }}:
      - script: |
          echo ${{ var.key }}
          echo ${{ var.value }}
        displayName: ${{ var.key }}
Alternatively, instead of using a compile-time expression, you can list variables using a runtime construct, for example:
- job: TestRuntimeVars
  steps:
  - script: |
      for var in $(compgen -e); do
        echo $var ${!var};
      done
This will list all variables including ones defined in the UI.
From the Microsoft docs link you provided, it specifies that:
"Unlike a normal variable, they are not automatically decrypted into
environment variables for scripts. You need to explicitly map secret
variables."
However, one workaround could potentially be to run an Azure CLI task and get the pipeline variables using az pipelines variable list.
Assuming your intention is to get the actual values, that approach may not suffice. Having said that, you should consider a variable group even if you're not using one in other pipelines, since a group can be linked to an Azure Key Vault and map its secrets as variables. You can store your sensitive values in a Key Vault and link it to the variable group, which can then be used like regular variables in your pipeline.
Or you can access KeyVault secrets right from the AzureKeyVault pipeline task.
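A rough sketch of both options follows; the group name, service connection, and vault name are hypothetical placeholders:

variables:
# Option 1: a variable group linked to an Azure Key Vault (configured under Pipelines > Library).
- group: my-keyvault-backed-group

steps:
# Option 2: pull secrets directly with the AzureKeyVault task; each secret becomes a pipeline variable.
- task: AzureKeyVault@1
  inputs:
    azureSubscription: 'my-service-connection'
    KeyVaultName: 'my-vault'
    SecretsFilter: '*'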
To expand on the answer below: it is a bit roundabout, but you can use the Azure DevOps CLI. This may be a bit overkill, but it does do the job.
trigger:
- main

pool:
  vmImage: ubuntu-latest

steps:
- bash: az --version
  displayName: 'Show Azure CLI version'

- bash: az devops configure --defaults organization=$(System.TeamFoundationCollectionUri) project=$(System.TeamProject) --use-git-aliases true
  displayName: 'Set default Azure DevOps organization and project'

- bash: |
    az pipelines variable list --pipeline-id $(System.DefinitionId)
  displayName: 'Show build list variables'
  env:
    AZURE_DEVOPS_EXT_PAT: $(System.AccessToken)
This approach was taken from:
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/variables?view=azure-devops&tabs=yaml%2Cbatch#list-variables
If the agent is self-hosted you may need to install the Azure DevOps CLI.

Azure DevOps accessing two Key Vaults with duplicate secret names

I currently have an Azure build pipeline that needs to access two different Key Vaults. Unfortunately, both of the secrets I am trying to access have the name SQLUserName. I am trying to pass these as arguments to a Python script, and I am looking for a way to qualify or differentiate between the secrets when passing the arguments.
Ideally I would like to access the variable qualified, something like $(ServiceConnection1.SQLUserName), but I can't find any information on this.
I have been researching a way to rename a variable, so I could possibly run the first Key Vault task, rename $(SQLUserName) to $(SQLUserNamefoo), then run the second Key Vault task and rename $(SQLUserName) to $(SQLUserNamebar). I can't seem to find any way to rename a variable in YAML.
trigger:
- main

pool:
  vmImage: 'ubuntu-latest'
strategy:
  matrix:
    Python37:
      python.version: '3.7'

steps:
- task: AzureKeyVault@1
  inputs:
    azureSubscription: 'ServiceConnection1'
    KeyVaultName: 'Vault1'
    SecretsFilter: '*'
    RunAsPreJob: true

- task: AzureKeyVault@1
  inputs:
    azureSubscription: 'ServiceConnection2'
    KeyVaultName: 'Vault2'
    SecretsFilter: '*'
    RunAsPreJob: true

- task: UsePythonVersion@0
  inputs:
    versionSpec: '$(python.version)'
  displayName: 'Use Python $(python.version)'

- script: |
    python -m pip install --upgrade pip
    pip install -r requirements.txt
  displayName: 'Install dependencies'

- task: PythonScript@0
  inputs:
    scriptSource: 'filePath'
    scriptPath: 'keyVaultTest.py'
    arguments: '$(SQLUserName))'
    #ideal way to work
    arguments: '$(SQLUserName1) $(SQLUserName2))'
Azure DevOps accessing two Key Vaults with duplicate secret names
We could add an inline PowerShell task with a logging command to set the variable SQLUserNamefoo to the value of $(SQLUserName) after the first AzureKeyVault task:
Write-Host ("##vso[task.setvariable variable=SQLUserNamefoo]$(SQLUserName)")
Then we can use $(SQLUserNamefoo) in the following tasks.
And we can add another inline PowerShell task to set the variable SQLUserNamebar to the value of $(SQLUserName) after the second AzureKeyVault task:
Write-Host ("##vso[task.setvariable variable=SQLUserNamebar]$(SQLUserName)")
As a test, I created a Key Vault secret SQLUserName with the value Leotest. To verify that SQLUserNamefoo gets set to $(SQLUserName), I defined SQLUserNamefoo in the pipeline Variables with the value 123, and added another PowerShell task to output the value of SQLUserNamefoo to a txt file to verify it:
cd $(System.DefaultWorkingDirectory)
New-Item $(System.DefaultWorkingDirectory)\temp -type directory
cd temp
New-Item a.txt -type file
write-output $(SQLUserNamefoo)| out-file -filepath $(System.DefaultWorkingDirectory)\temp\a.txt
The txt file contained Leotest rather than 123, confirming that the logging command overwrote the variable defined in Variables.

Referencing release variables in azurepipelines.yml

Currently I am in the process of converting my pipelines from classic over to azurepipelines.yml, and I'm having an issue finding the correct syntax to reference release variables in a bash step.
The existing code in a bash task
namebuilder=$(RELEASE.ENVIRONMENTNAME)-$(RELEASE.RELEASEID)
will output the following
dev-2049
However, when converted over to my new pipeline file, the above code produces the following error:
/home/vsts/work/_temp/ac39e1d7-11bd-4c32-9b1b-1520dae11c5a.sh: line 1: RELEASE.ENVIRONMENTNAME: command not found
/home/vsts/work/_temp/ac39e1d7-11bd-4c32-9b1b-1520dae11c5a.sh: line 1: RELEASE.RELEASEID: command not found
[extracted from pipeline.yml]
- bash: |
    namebuilder=$(RELEASE.ENVIRONMENTNAME)-$(RELEASE.RELEASEID)
I have even created a step trying a few different approaches, without much luck:
steps:
- bash: |
    echo This multiline script always runs in Bash.
    echo Even on Windows machines!
    echo '$(release.environmentname)'
    echo $(release.environmentname)
    echo '$(RELEASE.ENVIRONMENTNAME)'
    echo $(RELEASE.ENVIRONMENTNAME)
produces
This multiline script always runs in Bash.
Even on Windows machines!
$(release.environmentname)
$(RELEASE.ENVIRONMENTNAME)
/home/vsts/work/_temp/260dd504-a42d-45d6-bb1b-bf1f4b015cf8.sh: line 4: release.environmentname: command not found
/home/vsts/work/_temp/260dd504-a42d-45d6-bb1b-bf1f4b015cf8.sh: line 6: RELEASE.ENVIRONMENTNAME: command not found
Is it also possible (as a much cleaner approach) to define this as a pipeline variable and reference it at global scope, like below?
variables:
  namebuilder: '$(release.environmentname)-$(release.releaseid)'

stages:
- stage: Deploy
  displayName: deploy infra
  jobs:
  - job: deploy_infra
    displayName: deploy infra
    continueOnError: true
    workspace:
      clean: outputs
    steps:
    - bash: |
        echo This multiline script always runs in Bash.
        echo Even on Windows machines!
        echo '$(namebuilder)'
tia
It doesn't look like release.environment or any release variables are available for multi-stage pipelines. You could use the new environment concept and at that point environment.name would be available. I think you would likely go with $(environment.name)-$(build.buildid) for what you are after.
So I am not sure if the release pipelines you are converting are deploying to, say, an app service, or to a VM, or just using a hosted agent to publish something else. Disclaimer: I have not used the Environment concept extensively yet, just some reading and limited testing. It's all new!
So for deploying to VMs, you can configure a Virtual Machine resource in an environment. This concept has a bunch of parallels with classic deployment group agents: you register an agent on a target machine, and from there your pipeline steps can execute in that machine's context and you get a further set of environment variables.
The example pipeline below outputs any environment variables from the context the steps are running in, and also outputs $(environment.name)-$(build.buildid), for three cases:
- A normal job in a hosted pipeline
- A deployment to an environment
- A deployment to an environment with a VM resource
trigger:
- master

pool:
  vmImage: 'ubuntu-latest'

variables:
  namebuilder: '$(environment.name)-$(build.buildid)'

jobs:
- job: NormalJobInHostedPipeline
  steps:
  - task: PowerShell@2
    name: EnvironmentVariables
    inputs:
      targetType: 'inline'
      script: 'gci env:* | sort-object name'
  - bash: |
      echo This multiline script always runs in Bash.
      echo Even on Windows machines!
      echo '$(namebuilder)'

# track deployments on the environment
- deployment: DeploymentHostedContext
  displayName: Runs in Hosted Pool
  pool:
    vmImage: 'Ubuntu-16.04'
  # auto creates an environment if it doesn't exist
  environment: 'Dev'
  strategy:
    runOnce:
      deploy:
        steps:
        - task: PowerShell@2
          name: EnvironmentVariables
          inputs:
            targetType: 'inline'
            script: 'gci env:* | sort-object name'
        - bash: |
            echo This multiline script always runs in Bash.
            echo Even on Windows machines!
            echo '$(namebuilder)'

# Similar to Deployment Group Agent, you need to register the VM resources - the stage will fail if the resource does not exist
# https://learn.microsoft.com/en-us/azure/devops/pipelines/process/environments-virtual-machines?view=azure-devops
- deployment: DeploymentVirtualMachineContext
  displayName: Run On Virtual Machine Agent
  environment:
    name: DevVM
    resourceType: VirtualMachine
  strategy:
    runOnce:
      deploy:
        steps:
        - task: PowerShell@2
          name: EnvironmentVariables
          inputs:
            targetType: 'inline'
            script: 'gci env:* | sort-object name'
        - task: PowerShell@2
          name: VariableName
          inputs:
            targetType: 'inline'
            script: 'echo $(namebuilder)'
Use $(System.StageName) in place of $(Release.EnvironmentName); as for the release ID, you'd need to use $(Build.BuildId).
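For example, the namebuilder variable from the question could be rebuilt with those predefined variables (a sketch, assuming a multi-stage YAML pipeline):

variables:
  # System.StageName replaces Release.EnvironmentName; Build.BuildId replaces Release.ReleaseId.
  namebuilder: '$(System.StageName)-$(Build.BuildId)'

steps:
- bash: |
    echo '$(namebuilder)'   # prints something like Deploy-2049 for a stage named Deploy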
I found that $(Environment.Name) doesn't work unless you're using environments. I'm not since it's still quite limited.

AzurePowerShell@4 - powershell commands fail to execute on azure pipeline

As part of my YAML pipeline definition I have an AzurePowerShell@4 task; the following is an extract from my pipeline definition:
stages:
- stage: DeployDemoCluster
  jobs:
  - job: 'DeployAKSAndAll'
    pool:
      vmImage: 'windows-latest'
    steps:
    - task: AzurePowerShell@4
      displayName: Store AI instrumentation key for Inbound Processor in central KeyVault
      inputs:
        azureSubscription: 'service-connection'
        azurePowerShellVersion: LatestVersion
        pwsh: true
        ScriptType: 'FilePath'
        ScriptPath: 'AKS/ps/update_kv_firewall.ps1'
The issue is that, within my update_kv_firewall.ps1, all the PowerShell commands fail with errors like the following:
[error]Login-AzureRmAccount : The term 'Login-AzureRmAccount' is not recognized as the name of a cmdlet, function, script file, or operable program.
The script when executed individually / standalone, works perfectly fine.
what am I missing here?
As per your comment, the command Get-AzKeyVault runs without any errors, while Get-AzureRmVirtualNetwork leads to errors.
In that case I'm fairly sure you have installed the new Az module of Azure PowerShell, which is why a command like Get-AzKeyVault works.
Since you're using the Az module, please use the commands from the Az module throughout. Almost every AzureRM command has an equivalent Az command; you can find it in the Az command list.
Note: commands like Get-AzureRmVirtualNetwork / Login-AzureRmAccount are from the AzureRM module, which is due to be retired.
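As an illustration, here are the AzureRM cmdlets mentioned above next to their Az module equivalents (the resource group name below is just a placeholder):

# AzureRM (being retired)      ->  Az module equivalent
# Login-AzureRmAccount         ->  Connect-AzAccount
# Get-AzureRmVirtualNetwork    ->  Get-AzVirtualNetwork

# Inside an AzurePowerShell task the service connection signs you in already,
# so an explicit Connect-AzAccount is usually unnecessary.
Get-AzVirtualNetwork -ResourceGroupName 'my-resource-group'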
