parameters:
- name: AzureSubscription
  default: 'abc'
- name: BlobName
  type: string
  default: ""
stages:
- stage: MyStage
  displayName: 'My Stage'
  variables:
  - name: sas
  jobs:
  - job: ABC
    displayName: ABC
    steps:
    - task: AzureCLI@2
      displayName: 'XYZ'
      inputs:
        azureSubscription: ${{ parameters.AzureSubscription }}
        scriptType: pscore
        arguments:
        scriptLocation: inlineScript
        inlineScript: |
          $sas=az storage account generate-sas --account-key "mykey" --account-name "abc" --expiry (Get-Date).AddHours(100).ToString("yyyy-MM-dTH:mZ") --https-only --permissions rw --resource-types sco --services b
          Write-Host "My Token: " $sas
    - task: PowerShell@2
      inputs:
        targetType: 'filepath'
        filePath: $(System.DefaultWorkingDirectory)/psscript.ps1
        arguments: >
          -Token "????"
          -BlobName "${{parameters.BlobName}}"
      displayName: 'some work'
In this Azure DevOps YAML, I have created two tasks: AzureCLI@2 and PowerShell@2.
In AzureCLI@2 I get a value in the $sas variable. Write-Host confirms that, but $sas does not get passed as a parameter to the PowerShell@2 script file.
"${{parameters.BlobName}}" works fine; in PowerShell I am able to read that value.
How do I pass the sas variable's value?
Tried:
-Token $sas # did not work
-Token "${{sas}}" # did not work
Different tasks in an Azure Pipeline don't share a common runspace that would allow them to preserve or pass on variables.
For this reason, Azure Pipelines offers special logging commands that let you take string output from a task and update a pipeline variable that can be used in subsequent tasks: Set variables in scripts (Microsoft Docs).
In your case, you would use a logging command like this to make your SAS token available to the next task:
Write-Host "##vso[task.setvariable variable=sas]$sas"
In the arguments of your subsequent task (within the same job), use the macro variable syntax of Azure Pipelines:
-Token '$(sas)'
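Put together, a minimal sketch of the two tasks might look like this (the generate-sas command is elided; only the wiring between the tasks is shown):

```yaml
steps:
- task: AzureCLI@2
  displayName: 'Generate SAS token'
  inputs:
    azureSubscription: ${{ parameters.AzureSubscription }}
    scriptType: pscore
    scriptLocation: inlineScript
    inlineScript: |
      $sas = az storage account generate-sas ... # your existing command
      # Promote the PowerShell variable to a pipeline variable:
      Write-Host "##vso[task.setvariable variable=sas]$sas"
- task: PowerShell@2
  displayName: 'some work'
  inputs:
    targetType: 'filePath'
    filePath: $(System.DefaultWorkingDirectory)/psscript.ps1
    arguments: >
      -Token '$(sas)'
      -BlobName "${{ parameters.BlobName }}"
```

Note that $(sas) is resolved by the agent before the second task runs, which is why it works where $sas (a PowerShell variable local to the first task) does not.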
In Azure DevOps we have the following YAML pipeline, which applies a Terraform configuration from a CmdLine task.
The Output task should return the ObjectId of a Data Factory after it is deployed by Terraform.
I would like to use that ObjectId and pass it to the next Azure Powershell Task as a parameter so I can add that Id as member to an AzureADGroup.
How can I use the output from the step called 'Terraform output' in the next Powershell Task?
- task: CmdLine@2
  displayName: Terraform Apply
  enabled: False
  inputs:
    script: terraform apply -auto-approve -input=false tfplan
    workingDirectory: infrastructure/tf_scripts/dev
- task: CmdLine@2
  displayName: Terraform output
  enabled: False
  inputs:
    script: |
      terraform output adf_objectid
    workingDirectory: infrastructure/tf_scripts/dev
- task: AzurePowerShell@4
  displayName: 'Azure PowerShell script: InlineScript'
  inputs:
    azureSubscription: 'a6cb1cd3-8d5e-4db6-8af5-bcb66492d5cc'
    ScriptType: 'InlineScript'
    Inline: |
      $spn=(terraform output adf_objectid)
      Connect-AzureAD -AadAccessToken $aadToken -AccountId $context.Account.Id -TenantId $context.tenant.id -MsAccessToken $graphToken
      Add-AzureADGroupMember -ObjectId xxxxx-xxxxx-xxxxx -RefObjectId $spn
    workingDirectory: wd/scripts/dev
    azurePowerShellVersion: 'LatestVersion'
Passing Terraform output to a PowerShell task in Azure DevOps
You could try using the Logging Command to set adf_objectid as an Azure DevOps pipeline variable:
echo "##vso[task.setvariable variable=spn]$(terraform output adf_objectid)"
Check this similar thread for some details.
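Applied to the pipeline above, the two tasks could be wired together roughly like this (a sketch; the Connect-AzureAD line is omitted for brevity, and the group ObjectId placeholder is kept from the question):

```yaml
- task: CmdLine@2
  displayName: Terraform output
  inputs:
    script: |
      echo "##vso[task.setvariable variable=spn]$(terraform output adf_objectid)"
    workingDirectory: infrastructure/tf_scripts/dev
- task: AzurePowerShell@4
  displayName: 'Add Data Factory to AAD group'
  inputs:
    azureSubscription: 'a6cb1cd3-8d5e-4db6-8af5-bcb66492d5cc'
    ScriptType: 'InlineScript'
    Inline: |
      Add-AzureADGroupMember -ObjectId xxxxx-xxxxx-xxxxx -RefObjectId '$(spn)'
    azurePowerShellVersion: 'LatestVersion'
```

Because no pipeline variable named spn exists when the CmdLine task starts, the agent leaves $(terraform output adf_objectid) untouched and bash evaluates it as a command substitution, which is what feeds the logging command.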
I am having issues passing parameters defined in an Azure Pipeline YAML to the Azure CLI invoked in a bash script. I found the following solution on Stack Overflow, but it doesn't seem to work:
Pass parameter to Azure CLI Task in DevOps
Here is my YAML file:
# Starter pipeline
# Start with a minimal pipeline that you can customize to build and deploy your code.
# Add steps that build, run tests, deploy, and more:
# https://aka.ms/yaml
trigger:
- master

pool:
  vmImage: 'ubuntu-latest'

steps:
- task: AzureCLI@2
  displayName: Azure CLI
  inputs:
    azureSubscription: templateresourceconnection
    scriptType: bash
    scriptPath: ./deployment.sh
    arguments: >
      -resourceGrp 'TEST-RG'
      -location 'westeurope'
I would expect to be able to access the arguments in my deployment.sh, which fails:
#!/bin/bash
# Create Resource Group
az group create -n $(resourceGrp) -l $(location)
If I don't pass any arguments and hardcode the values in deployment.sh it all works fine.
Any ideas what could cause this issue? I also tried with UPPERCASE and just without brackets.
I get the following error message
Do you have any other ideas for how to make it work? The documentation doesn't seem to contain any example for the Azure CLI; it explains how to define parameters, but not how to pass them afterwards.
Thank you.
Do you have any idea what else I could try to make it work
You could try using environment variables, which can be accessed by bash script files.
First of all, define a pipeline variable instead of a task argument.
variables:
- name: one
  value: initialValue
Here is my example, used on a Linux system:
az group create -n $RESOURCEGRP -l $LOCATION
Note: environment variable names are upper-cased versions of the pipeline variable names.
e.g.: resourceGrp -> $RESOURCEGRP
YAML sample:
variables:
- name: resourceGrp
  value: TEST-RG
- name: location
  value: westeurope

pool:
  vmImage: 'ubuntu-latest'

steps:
- task: AzureCLI@2
  displayName: 'Azure CLI deploy.sh'
  inputs:
    azureSubscription: kevin0209
    scriptType: bash
    scriptPath: ./deployment.sh
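The matching deployment.sh then reads the pipeline variables as upper-cased environment variables rather than as arguments (a sketch of the script from the question, adjusted accordingly):

```shell
#!/bin/bash
# Pipeline variables are exposed to the script as upper-cased
# environment variables: resourceGrp -> RESOURCEGRP, location -> LOCATION
az group create -n "$RESOURCEGRP" -l "$LOCATION"
```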
I currently have an Azure build pipeline that needs to access two different Key Vaults. Unfortunately, both of the secrets I am trying to access are named SQLUserName. I am trying to pass these as arguments to a Python script, and I am looking for a way to qualify or differentiate between the secrets when passing the arguments.
Ideally I would like to access the variable qualified, something like $(ServiceConnection1.SQLUserName), but I can't find any information on this.
I have been researching a way to rename a variable, so I could run the first Key Vault task, rename $(SQLUserName) to $(SQLUserNamefoo), then run the second Key Vault task and rename $(SQLUserName) to $(SQLUserNamebar). I can't seem to find any way to rename a variable in YAML.
trigger:
- main

pool:
  vmImage: 'ubuntu-latest'

strategy:
  matrix:
    Python37:
      python.version: '3.7'

steps:
- task: AzureKeyVault@1
  inputs:
    azureSubscription: 'ServiceConnection1'
    KeyVaultName: 'Vault1'
    SecretsFilter: '*'
    RunAsPreJob: true
- task: AzureKeyVault@1
  inputs:
    azureSubscription: 'ServiceConnection2'
    KeyVaultName: 'Vault2'
    SecretsFilter: '*'
    RunAsPreJob: true
- task: UsePythonVersion@0
  inputs:
    versionSpec: '$(python.version)'
  displayName: 'Use Python $(python.version)'
- script: |
    python -m pip install --upgrade pip
    pip install -r requirements.txt
  displayName: 'Install dependencies'
- task: PythonScript@0
  inputs:
    scriptSource: 'filePath'
    scriptPath: 'keyVaultTest.py'
    arguments: '$(SQLUserName)'
    # ideal way to work:
    # arguments: '$(SQLUserName1) $(SQLUserName2)'
Azure DevOps accessing two Key Vaults with duplicate secret names
We could add an inline PowerShell task with a Logging Command to set the variable SQLUserNamefoo to the value of $(SQLUserName) after the first AzureKeyVault task:
Write-Host ("##vso[task.setvariable variable=SQLUserNamefoo]$(SQLUserName)")
Then we can use $(SQLUserNamefoo) in subsequent tasks.
And we can add another inline PowerShell task to set the variable SQLUserNamebar to the value of $(SQLUserName) after the second AzureKeyVault task:
Write-Host ("##vso[task.setvariable variable=SQLUserNamebar]$(SQLUserName)")
As a test, I created a Key Vault secret SQLUserName with the value Leotest. In order to verify that SQLUserNamefoo is set to $(SQLUserName), I defined SQLUserNamefoo in the pipeline Variables with the value 123:
And I added another PowerShell task to output the value of SQLUserNamefoo to a txt file to verify it:
cd $(System.DefaultWorkingDirectory)
New-Item $(System.DefaultWorkingDirectory)\temp -type directory
cd temp
New-Item a.txt -type file
write-output $(SQLUserNamefoo)| out-file -filepath $(System.DefaultWorkingDirectory)\temp\a.txt
The result of txt file:
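Putting the whole sequence together, the steps could look like this (a sketch based on the tasks above; note that RunAsPreJob must be left off so the Key Vault tasks run in order, interleaved with the rename steps, rather than both running before any other step):

```yaml
steps:
- task: AzureKeyVault@1
  inputs:
    azureSubscription: 'ServiceConnection1'
    KeyVaultName: 'Vault1'
    SecretsFilter: '*'
- powershell: |
    Write-Host ("##vso[task.setvariable variable=SQLUserNamefoo]$(SQLUserName)")
  displayName: 'Rename secret from Vault1'
- task: AzureKeyVault@1
  inputs:
    azureSubscription: 'ServiceConnection2'
    KeyVaultName: 'Vault2'
    SecretsFilter: '*'
- powershell: |
    Write-Host ("##vso[task.setvariable variable=SQLUserNamebar]$(SQLUserName)")
  displayName: 'Rename secret from Vault2'
- task: PythonScript@0
  inputs:
    scriptSource: 'filePath'
    scriptPath: 'keyVaultTest.py'
    arguments: '$(SQLUserNamefoo) $(SQLUserNamebar)'
```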
According to https://learn.microsoft.com/en-us/azure/devops/pipelines/library/service-endpoints there's a rich array of Service Connection types. I can easily manage a set of service connections at the project level and set permissions to limit which users are able to view/edit them -- this is all good.
But I can't figure out how to access a Service Connection with a script step in my build pipeline. For example, let's say I have a Service Connection representing credentials for an Azure Service Principal. I'd like to access those credentials in a script step.
How can I write a script step that makes use of them?
Because a Service Connection involves data shaped specifically to the connected service (the Generic Service Connection being the exception that proves the rule...), you won't be able to make use of strongly typed properties in your Bash task. Instead, you may want to examine environment variables and process the service connection data manually.
Based on a survey of some of the tasks in the Azure DevOps repos, it appears that service connections and their data are populated as environment variables on the agent running the build task. The service connections are retrieved via a method that runs a given name string through the following regex before retrieving the resultant environment key's value:
process.env[name.replace(/\./g, '_').toUpperCase()];
The retrieval of various Service Endpoint data is wrapped in the vsts-task-lib/task module, allowing consuming tasks to write code like so:
taskLib.getEndpointAuthorization('SYSTEMVSSCONNECTION', false);
taskLib.getEndpointDataParameter('MYSERVICECONNECTION', 'SOME_PARAMETER_NAME', false);
taskLib.getEndpointUrl('MYSERVICECONNECTION', false) // <-- last param indicates required or not
Therefore, if you wanted to access service connections in a bash script without any additional customization, I would recommend that you:
a) Validate the availability of service connection information in the build script task by iterating over and writing out environment variables, with the system.debug variable set. There is some indication that build tasks aren't "seeded" with connections they aren't specifically requesting, so you may need to create a custom build task which takes the name of the service connection you want to use as one of its inputs.
b) Read the desired values from variables, as outlined above, in your bash script. Service connection variable names may be computed similarly to this:
var dataParam = getVariable('ENDPOINT_DATA_' + id + '_' + key.toUpperCase());
You may need to iterate against this to determine the data schema/structure.
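As a concrete starting point for (a), a bash step that dumps anything endpoint-related from the environment might look like this (a sketch; whether any variables are populated depends on the caveats above):

```yaml
- task: Bash@3
  displayName: 'Inspect service connection environment'
  inputs:
    targetType: 'inline'
    script: |
      # List every environment variable that looks service-connection related
      env | sort | grep -i -E 'ENDPOINT|CONNECTION' || echo "no endpoint variables found"
```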
I've been wondering about this too. The solution I've settled on is to use the 'Azure CLI' task rather than the basic 'Script' (or 'Bash') task. This is ostensibly for running Az CLI commands, but there's nothing to stop you running only standard Bash scripts (or PSCore if that's your thing).
If you examine the environment variables present when you run this task, you'll see a bunch of information about the Service Connection in variables prefixed with 'ENDPOINT_DATA_'. This tallies with what Josh E was saying. It includes the Azure Subscription ID and name, Service Principal Object ID, etc.
Optionally, you can enable the Service Principal details to be added to the environment too. This will then include the SPN key, Tenant ID, etc. as secret environment variables.
Here's what the tasks look like:
- task: AzureCLI@2
  displayName: 'Azure CLI'
  inputs:
    scriptType: bash
    scriptLocation: inlineScript
    azureSubscription: '<Service Connection Name>'
    inlineScript: |
      env | sort
- task: AzureCLI@2
  displayName: 'Azure CLI, with SPN info'
  inputs:
    scriptType: bash
    scriptLocation: inlineScript
    azureSubscription: '<Service Connection Name>'
    addSpnToEnvironment: true
    inlineScript: |
      env | sort
Of course this is all only applicable to Azure Cloud Service Connections. There might be similar techniques you could use for other Service Connections, but I haven't investigated them.
I found that if I use the Kubernetes task with the login command right before I run my Bash task, I do not need to authenticate or use a hostname.
KUBERNETESNODE and SERVICEPROTOCOL are Pipeline variables that I set a priori.
- task: Kubernetes@1
  displayName: 'Kubernetes Login'
  # This is needed to run kubectl commands from bash.
  inputs:
    connectionType: 'Kubernetes Service Connection'
    kubernetesServiceEndpoint: '<Service Connection Name>'
    command: 'login'
- task: Bash@3
  displayName: 'Run Component Test'
  inputs:
    targetType: 'inline'
    script: |
      # Get the node port
      nodePort=`kubectl get --namespace $(Build.BuildId) svc <service name> -o=jsonpath='{.spec.ports[0].nodePort}'`
      # Run Newman test
      newman run postman/Service.postman_collection.json --global-var host=$KUBERNETESNODE --global-var protocol=$SERVICEPROTOCOL --global-var port=$nodePort -r junit
I am using the same service connection in my scripts/tools as for the ARM deployments.
In order to export the variables, I created the following template.
parameters:
- name: azureSubscription
  type: string
- name: exportAsOutput
  type: boolean
  default: false

steps:
- task: AzureCLI@2
  name: exported_azure_credentials
  displayName: 'Export Azure Credentials'
  inputs:
    azureSubscription: '${{ parameters.azureSubscription }}'
    scriptType: pscore
    scriptLocation: inlineScript
    addSpnToEnvironment: true
    ${{ if eq(parameters.exportAsOutput, true) }}:
      inlineScript: |
        Write-Host "##vso[task.setvariable variable=AZURE_TENANT_ID]$env:tenantId"
        Write-Host "##vso[task.setvariable variable=AZURE_TENANT_ID;isOutput=true]$env:tenantId"
        Write-Host "##vso[task.setvariable variable=AZURE_CLIENT_ID]$env:servicePrincipalId"
        Write-Host "##vso[task.setvariable variable=AZURE_CLIENT_ID;isOutput=true]$env:servicePrincipalId"
        Write-Host "##vso[task.setvariable variable=AZURE_CLIENT_SECRET]$env:servicePrincipalKey"
        Write-Host "##vso[task.setvariable variable=AZURE_CLIENT_SECRET;isOutput=true]$env:servicePrincipalKey"
    ${{ if eq(parameters.exportAsOutput, false) }}:
      inlineScript: |
        Write-Host "##vso[task.setvariable variable=AZURE_TENANT_ID]$env:tenantId"
        Write-Host "##vso[task.setvariable variable=AZURE_CLIENT_ID]$env:servicePrincipalId"
        Write-Host "##vso[task.setvariable variable=AZURE_CLIENT_SECRET]$env:servicePrincipalKey"
DevOps is really clever about secrets, so they do not show up in the pipeline logs.
As others have stated, there isn't a great built-in way to access Service Connections with a script. As I do not like the workaround of exposing credentials via long-lived environment variables (for security and laziness purposes), I wrote an extension that allows you to utilize a Generic Service Connection with a custom script: https://marketplace.visualstudio.com/items?itemName=cloudpup.authenticated-scripts
It does this by exposing the service connection as environment variables that only last for the lifetime of the single script task:
Service Connection Variable -> Environment Variable
url                         -> AS_SC_URL
username                    -> AS_SC_USERNAME
password                    -> AS_SC_PASSWORD
Example
The tasks included in this extension allow you to write succinct pipelines like the following:
steps:
- task: AuthenticatedPowerShell@1
  inputs:
    serviceConnection: 'Testing Authenticated Shell'
    targetType: inline
    script: 'Write-Host "url: $env:AS_SC_URL | username: $env:AS_SC_USERNAME | password: $env:AS_SC_PASSWORD"'
Yes, this can be achieved; I use this method all the time. You first need a task which outputs the credentials into environment variables, and you then create your own variables from the environment variables the task outputs. I usually use the Azure CLI task:
# Set variables.
- task: AzureCLI@2
  name: setVariables
  displayName: Set Output Variables
  continueOnError: false
  inputs:
    azureSubscription: nameOfAzureServiceConnection
    scriptType: ps
    scriptLocation: inlineScript
    addSpnToEnvironment: true # this must be set to true
    inlineScript: |
      Write-Host "##vso[task.setvariable variable=azureClientId;isOutput=true]$($env:servicePrincipalId)"
      Write-Host "##vso[task.setvariable variable=azureClientSecret;isOutput=true]$($env:servicePrincipalKey)"
      Write-Host "##vso[task.setvariable variable=azureTenantId;isOutput=true]$($env:tenantId)"
Then you can use these newly set variables in a later step. Make sure you reference them with the task name, i.e. $(taskName.variableName). The example below uses the new variables to set environment variables in a later PowerShell task for Terraform to use for authentication:
- powershell: |
    terraform plan -input=false -out=tfplan
  displayName: 'Terraform Plan'
  env:
    ARM_CLIENT_ID: $(setVariables.azureClientId)
    ARM_CLIENT_SECRET: $(setVariables.azureClientSecret)
    ARM_TENANT_ID: $(setVariables.azureTenantId)
ref: https://jimferrari.com/2021/11/15/access-azure-service-connection-via-script/
If you need the service connection to get authorized to different services or resources, you can also obtain the required tokens with the service connection and pass them on to scripts that can't use the service connection directly, for example:
- task: AzurePowerShell@5
  inputs:
    azureSubscription: 'AzureServiceConnection'
    ScriptType: 'InlineScript'
    Inline: |
      $token = Get-AzAccessToken
      echo "##vso[task.setvariable variable=accesstoken;]$($token.Token)"
    azurePowerShellVersion: 'LatestVersion'
- script: 'echo This is the token: $(accesstoken)'