Azure DevOps accessing two Key Vaults with duplicate secret names

I currently have an Azure build pipeline that needs to access two different Key Vaults. Unfortunately, both of the secrets I am trying to access are named SQLUserName. I am trying to pass these as arguments to a Python script, and I am looking for a way to qualify or differentiate between the secrets when passing the arguments.
Ideally I would like to access the variable with a qualifier, something like $(ServiceConnection1.SQLUserName), but I can't find any information on this.
I have also been researching a way to rename a variable, so I could run the first Key Vault task, rename $(SQLUserName) to $(SQLUserNamefoo), then run the second Key Vault task and rename $(SQLUserName) to $(SQLUserNamebar). However, I can't seem to find any way to rename a variable in YAML.
trigger:
- main

pool:
  vmImage: 'ubuntu-latest'

strategy:
  matrix:
    Python37:
      python.version: '3.7'

steps:
- task: AzureKeyVault@1
  inputs:
    azureSubscription: 'ServiceConnection1'
    KeyVaultName: 'Vault1'
    SecretsFilter: '*'
    RunAsPreJob: true
- task: AzureKeyVault@1
  inputs:
    azureSubscription: 'ServiceConnection2'
    KeyVaultName: 'Vault2'
    SecretsFilter: '*'
    RunAsPreJob: true
- task: UsePythonVersion@0
  inputs:
    versionSpec: '$(python.version)'
  displayName: 'Use Python $(python.version)'
- script: |
    python -m pip install --upgrade pip
    pip install -r requirements.txt
  displayName: 'Install dependencies'
- task: PythonScript@0
  inputs:
    scriptSource: 'filePath'
    scriptPath: 'keyVaultTest.py'
    arguments: '$(SQLUserName)'
    # ideal way this would work:
    # arguments: '$(SQLUserName1) $(SQLUserName2)'

We could add an inline PowerShell task with a logging command to set the variable SQLUserNamefoo to the value of $(SQLUserName) after the first AzureKeyVault task:
Write-Host ("##vso[task.setvariable variable=SQLUserNamefoo]$(SQLUserName)")
Then we could use $(SQLUserNamefoo) in the following tasks.
And we could add another inline PowerShell task to set the variable SQLUserNamebar to the value of $(SQLUserName) after the second AzureKeyVault task:
Write-Host ("##vso[task.setvariable variable=SQLUserNamebar]$(SQLUserName)")
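Taken together, the two rename steps only need to print these logging commands to stdout; a minimal sketch in Bash (the vault values below are placeholders standing in for whatever $(SQLUserName) resolves to after each AzureKeyVault task):

```shell
# Stand-in for the value $(SQLUserName) holds after the first Key Vault task.
SQLUSERNAME="user-from-vault1"
echo "##vso[task.setvariable variable=SQLUserNamefoo]$SQLUSERNAME"

# Stand-in for the value after the second Key Vault task overwrites it.
SQLUSERNAME="user-from-vault2"
echo "##vso[task.setvariable variable=SQLUserNamebar]$SQLUSERNAME"
```

The agent parses these `##vso[task.setvariable …]` lines and makes $(SQLUserNamefoo) and $(SQLUserNamebar) available to all subsequent tasks in the job.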
As a test, I created a Key Vault secret SQLUserName with the value Leotest. In order to verify that SQLUserNamefoo is set to $(SQLUserName), I defined SQLUserNamefoo in the Variables with the value 123:
And added another PowerShell task to output the value of SQLUserNamefoo to a txt file to verify it:
cd $(System.DefaultWorkingDirectory)
New-Item $(System.DefaultWorkingDirectory)\temp -type directory
cd temp
New-Item a.txt -type file
write-output $(SQLUserNamefoo) | out-file -filepath $(System.DefaultWorkingDirectory)\temp\a.txt
The resulting txt file contained the Key Vault value rather than 123, confirming that the logging command set the variable.

Related

How to set variables, defined in Azure variable group, as env variables in Azure pipeline?

We run integration tests, written in Python, in an Azure pipeline.
The run is triggered in this way:
- script: |
    pdm run pytest \
      --variables "$VARIABLE_FILE" \
      --napoleon-docstrings \
      --doctest-modules \
      --color=yes \
      --junitxml=junit/test-results.xml \
      integration
  env:
    <different_environment_variables>
Some integration tests make a connection to a database, and the properties needed for the connection are stored in a variable group in Azure. I can list them via:
- powershell: |
    az pipelines variable-group variable list --group-id <some_group_id> --output table
but I do not know how to set them as environment variables afterwards and use them in the Python code.
You can reference the variables from your variable group directly in your Python YAML pipeline like below:
# Python package
# Create and test a Python package on multiple Python versions.
# Add steps that analyze code, save the dist with the build record, publish to a PyPI-compatible index, and more:
# https://docs.microsoft.com/azure/devops/pipelines/languages/python
trigger:
- master

pool:
  vmImage: ubuntu-latest

strategy:
  matrix:
    Python27:
      python.version: '2.7'
    Python35:
      python.version: '3.5'
    Python36:
      python.version: '3.6'
    Python37:
      python.version: '3.7'

variables:
- group: SharedVariables

steps:
- task: UsePythonVersion@0
  inputs:
    versionSpec: '$(python.version)'
  displayName: 'Use Python $(python.version)'
- script: |
    echo $(databaseservername)
- script: |
    python -m pip install --upgrade pip
    pip install -r requirements.txt
  displayName: 'Install dependencies'
- script: |
    pip install pytest pytest-azurepipelines
    pytest
  displayName: 'pytest'

The relevant additions are the variable group reference and the script that reads it:

variables:
- group: SharedVariables

- script: |
    echo $(databaseservername)
The pipeline ran successfully, like below:
Authorize and allow the permission to run the pipeline:
Permit access:
The pipeline ran successfully with the variable.
You can also pass the variables in the pipeline as arguments and in an inline script with the Python task, like below:
variables:
- group: variableGroup

steps:
- task: PythonScript@0
  displayName: 'Run a Python script'
  inputs:
    scriptPath: 'Test.py'
    arguments: --variableInScript $(variableInVariableGroup)
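On the script side, Test.py can pick the value up with argparse. A minimal sketch (the argument name matches the YAML above; the demo value stands in for whatever the variable group holds):

```python
import argparse

# Parse the argument passed in by the PythonScript task's `arguments:` input.
parser = argparse.ArgumentParser()
parser.add_argument("--variableInScript", required=True)

# In the pipeline this comes from sys.argv; a demo value is supplied here
# so the snippet is self-contained.
args = parser.parse_args(["--variableInScript", "my-group-value"])
print(args.variableInScript)  # → my-group-value
```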
Inline Python task:
variables:
- group: SharedVariables

steps:
- task: UsePythonVersion@0
  inputs:
    versionSpec: '$(python.version)'
  displayName: 'Use Python $(python.version)'
- script: |
    echo $(databaseservername)
- task: PythonScript@0
  inputs:
    scriptSource: 'inline'
    script: 'print(''databaseservername: $(databaseservername)'')'
Output:
The databaseservername variable was masked like below after running as an argument in the inline Python script.
You can link the variable group in the release pipeline like below and call it in multiple pipelines across different stages too:
Similarly, you can add an Azure CLI task for your shell script to call az pipelines variable-group variable list in your YAML pipeline and reference it like below:
- task: AzureCLI@2
  inputs:
    azureSubscription: '<Subscription-name>(sub-id)'
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      az pipelines variable-group variable list \
        --org 'https://dev.azure.com/org/' \
        --project project --group-id id \
        --only-show-errors --output json
References:
Access variables from Variable Groups inside Python script task in Azure DevOps YAML pipeline, by Bo Soborg Perterson
Pass Variable Group as Dictionary To Python Script, by Joy Wang

Key Vault secret values are displayed as plain text in Bash script

The AzureKeyVault@1 task retrieves all the secrets; some of the secrets are displayed as *** whereas some newly created ones are shown in plain text.
A part of my pipeline:
steps:
- task: AzureKeyVault@1
  displayName: Download secrets from KeyVault
  inputs:
    azureSubscription: azure_sub
    KeyVaultName: key_vault
    SecretsFilter: '*'
    RunAsPreJob: true
- task: PipAuthenticate@1
  displayName: Authentication step
  inputs:
    artifactFeeds: organization
    onlyAddExtraIndex: true
- script: |
    echo "##vso[task.setvariable variable=keyvault_variable;isOutput=true]$(keyvault_variable)"
  displayName: Set environment variables
  name: SetVariables

- stage: Stage2
  jobs:
  - job: check_if_encrypted
    steps:
    - task: CmdLine@2
      displayName: Write secrets
      inputs:
        script: |
          echo keyvault_variable
Has anything changed in Azure Key Vault, or is something wrong with the pipeline?
Thanks.
You're creating an unencrypted copy of the secret value with echo "##vso[task.setvariable variable=keyvault_variable;isOutput=true]$(keyvault_variable)". You should specify isSecret=true if you want it to continue to be a secret.
Refer to the documentation for more details.
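A minimal sketch of the corrected logging command (the secret value below is a placeholder; in the pipeline the agent substitutes the real secret before the script runs):

```shell
# Placeholder standing in for $(keyvault_variable).
SECRET_VALUE="placeholder-secret"

# issecret=true keeps the copied variable masked in logs;
# isOutput=true keeps it consumable by later jobs as an output variable.
echo "##vso[task.setvariable variable=keyvault_variable;issecret=true;isOutput=true]$SECRET_VALUE"
```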
It seems we have to explicitly specify issecret=true in the
echo "##vso[task.setvariable variable=keyvault_variable;isOutput=true]$(keyvault_variable)" script; only then is the value masked.
What is not clear is why this has to be set for certain secrets, whereas for others masking worked without it being mentioned explicitly.

Azure Pipeline python script environment variables

I have a Python script which has 2 lines of code containing my username and password, and I don't want to push them to GitHub for obvious reasons, but they are important for my Azure DevOps pipeline to run successfully. Following some documentation, I used `os.environ.get` in my Python script to be able to retrieve the values from environment variables.
My code looks like this.
import os
usernameot = os.environ.get('USERNAMEOT')
passwordot = os.environ.get('PASSWORDOT')
wait.until(EC.element_to_be_clickable((By.CSS_SELECTOR, "div[class$='visible-lg'] input#signInFormUsername"))).send_keys(usernameot)
wait.until(EC.element_to_be_clickable((By.CSS_SELECTOR, "div[class$='visible-lg'] input#signInFormPassword"))).send_keys(passwordot)
This was the first step, and after this the problems started. I have an Azure pipeline that looks like this.
trigger: none

pool:
  vmImage: 'ubuntu-latest'

stages:
- stage:
  jobs:
  - job: Configuration
    steps:
    - task: UsePythonVersion@0
      inputs:
        versionSpec: '3.8'
        addToPath: true
    - script: |
        python -m pip install --upgrade pip
        pip install selenium
        printenv
    - task: PythonScript@0
      inputs:
        scriptSource: 'filePath'
        scriptPath: './script1.py'
      env:
        USERNAMEOT: $(usernameot)
        PASSWORDOT: $(passwordot)
  - job: Mailinator
    dependsOn: Configuration
    steps:
    - task: UsePythonVersion@0
      inputs:
        versionSpec: '3.x'
        addToPath: true
    - script: |
        python -m pip install --upgrade pip
        pip install selenium
    - task: PythonScript@0
      inputs:
        scriptSource: 'filePath'
        scriptPath: './script2.py'
I tried to pass the environment variables in all the ways that I know.
The pipeline above is the latest one. I tried to store the variables in the Azure DevOps pipeline, but it fails as my Python script doesn't find the username value in the environment variables.
I tried to use GitHub secrets and environments, but that fails because it doesn't recognize the key secret.USERNAME.
Can any of you please help me understand how I can set an environment variable on my pipeline VM during the run?
EDIT:
I tried all the solutions advised by bazetto (thank you so much for your help) but am still facing the same issue.
As you can see, the pipeline returns the correct values for my variables, but those are not passed to the Python script.
Even though the error points to the CSS selector not being found, I am pretty sure that the web driver is working, because before getting to the username and password there are different buttons to click, etc. So my guess is that when it comes to entering the username and password, since the variables do not get read from my configuration, the Selenium script times out.
Any advice about this, please?
UPDATE:
I made some updates to the pipeline and my Python script.
In Python I am now getting the username and password from a JSON file that gets populated during the run, and I am feeding those values to my script.
Following another piece of advice, I set the variables on each step of my pipeline, but I am still getting an error, as follows:
Traceback (most recent call last):
File "D:\a\1\s\script1.py", line 49, in <module>
wait.until(EC.element_to_be_clickable((By.CSS_SELECTOR, "div[class$='visible-lg'] input#signInFormUsername"))).send_keys(usernameot)
File "C:\hostedtoolcache\windows\Python\3.8.10\x64\lib\site-packages\selenium\webdriver\support\wait.py", line 80, in until
raise TimeoutException(message, screen, stacktrace)
selenium.common.exceptions.TimeoutException: Message
##[error]The process '/opt/hostedtoolcache/Python/3.8.11/x64/bin/python' failed with exit code 1
I really don't understand what I am doing wrong. I think I understand all the steps and what everyone is suggesting, but I might have missed something in my scripts and configuration.
When you define variables in Azure DevOps you have an option to mark them as secrets.
Above, username is not a secret but password is. In this case only non-secret variables are mapped to environment variables automatically.
What you can check with this:
steps:
- script: env | sort
  displayName: 'Display env variables'
PATH=/home/linuxbrew/.linuxbrew/bin:/home/linuxbrew/.linuxbrew/sbin:/home/vsts/.local/bin:/opt/pipx_bin:/usr/share/rust/.cargo/bin:/home/vsts/.config/composer/vendor/bin:/usr/local/.ghcup/bin:/home/vsts/.dotnet/tools:/snap/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin
PIPELINE_WORKSPACE=/home/vsts/work/1
PIPX_BIN_DIR=/opt/pipx_bin
PIPX_HOME=/opt/pipx
...
TASK_DISPLAYNAME=Display env variables
TF_BUILD=True
USER=vsts
USERNAME=someusername
VCPKG_INSTALLATION_ROOT=/usr/local/share/vcpkg
However, there is still a way to map them to environment variables: explicit mapping, like below:
- bash: |
    echo "Using the mapped env var for this task works and is recommended: $MY_MAPPED_PASSWORD"
  env:
    MY_MAPPED_PASSWORD: $(password)
But this is limited to task scope, so you can use this environment variable only in that task. If you need it in more tasks, you need to repeat the env mapping across every task.
So if you have both username and password defined as secrets, you need to map both values:
- task: PythonScript@0
  inputs:
    scriptSource: 'filePath'
    scriptPath: './script1.py'
  env:
    PASSWORD: $(password)
    USERNAME: $(username)
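On the Python side of script1.py, the mapped values can then be read back via os.environ. A defensive sketch (demo values are assigned here so the snippet is self-contained; in the pipeline they come from the task's env: mapping):

```python
import os

# In the real pipeline these are injected by the task's env: mapping;
# demo values are assigned here so the snippet runs standalone.
os.environ["USERNAME"] = "demo-user"
os.environ["PASSWORD"] = "demo-password"

username = os.environ.get("USERNAME")
password = os.environ.get("PASSWORD")

# Fail fast with a clear message if the mapping was forgotten for this task.
if not username or not password:
    raise SystemExit("USERNAME/PASSWORD were not mapped into this task's environment")
print(username)  # → demo-user
```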
And if you see *** in the logs, this is the masking Azure DevOps applies to hide your secret wherever it appears in the logs.
You might face issues related to environment variables due to admin privileges.
My suggestion is to replace them with a file, e.g. config.json:
{
  "username": "",
  "password": ""
}
In your pipeline, add a step to populate it:
- stage: temp
  jobs:
  - job: temp
    displayName: temp
    variables:
    - name: USERNAME
      value: "USER1"
    - name: PASSWORD
      value: "MYPWD1"
    steps:
    - checkout: self
    - task: PowerShell@2
      displayName: "Set configuration"
      inputs:
        targetType: "inline"
        script: |
          # read config
          $config = Get-Content (Get-Item .\config.json) -Raw -Encoding UTF8 | ConvertFrom-Json
          # set the credentials
          $config.username = "$(USERNAME)" # assuming that you already have it
          $config.password = "$(PASSWORD)" # assuming that you already have it
          # update the file
          $config | ConvertTo-Json | Set-Content .\config.json
    - task: PowerShell@2
      displayName: "Show configuration"
      inputs:
        targetType: "inline"
        script: |
          # debug
          cat config.json
The configuration step should then show the populated file.
In your Python app, read it back with something like:
import json

# Read the credentials populated by the pipeline step above.
with open('config.json') as f:
    data = json.load(f)

username = data['username']
password = data['password']

# debug
print(data)

Azure DevOps Parameters not recognised by AZ CLI

I am having issues passing parameters defined in an Azure Pipeline YAML to the Azure CLI in a Bash script. I found the following solution on Stack Overflow, but it doesn't seem to work:
Pass parameter to Azure CLI Task in DevOps
Here is my YAML file:
# Starter pipeline
# Start with a minimal pipeline that you can customize to build and deploy your code.
# Add steps that build, run tests, deploy, and more:
# https://aka.ms/yaml
trigger:
- master

pool:
  vmImage: 'ubuntu-latest'

steps:
- task: AzureCLI@2
  displayName: Azure CLI
  inputs:
    azureSubscription: templateresourceconnection
    scriptType: bash
    scriptPath: ./deployment.sh
    arguments: -resourceGrp 'TEST-RG' -location 'westeurope'
I would expect to be able to access the arguments in my deployment.sh, but it fails:
#!/bin/bash
# Create Resource Group
az group create -n $(resourceGrp) -l $(location)
If I don't pass any arguments and hardcode the values in deployment.sh, it all works fine.
Any ideas what could cause this issue? I also tried UPPERCASE and just without brackets.
I get the following error message.
Do you have any idea what else I could try to make it work? The documentation doesn't seem to contain any example for the az CLI, just how to define parameters, not how to pass them afterwards.
Thank you.
Do you have any idea what else I could try to make it work
You could try to use environment variables. Environment variables can be accessed by Bash script files.
First of all, you need to define a pipeline variable instead of a task argument.
variables:
- name: one
  value: initialValue
Here is my example, used on a Linux system:
az group create -n $RESOURCEGRP -l $LOCATION
Note: all characters in the environment variable names are capitalized, e.g. resourceGrp -> $RESOURCEGRP.
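The naming rule can be sketched in Bash; this mirrors how the agent derives environment variable names from pipeline variables (dots become underscores, letters are upper-cased):

```shell
# Derive the env var name the agent would expose for a pipeline variable.
name="resourceGrp"
envName=$(printf '%s' "$name" | tr '.' '_' | tr '[:lower:]' '[:upper:]')
echo "$envName"   # RESOURCEGRP
```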
YAML sample:
variables:
- name: resourceGrp
  value: TEST-RG
- name: location
  value: westeurope

pool:
  vmImage: 'ubuntu-latest'

steps:
- task: AzureCLI@2
  displayName: 'Azure CLI deploy.sh'
  inputs:
    azureSubscription: kevin0209
    scriptType: bash
    scriptPath: ./deployment.sh

How can a script access Service Connections? (Azure Devops Pipelines)

According to https://learn.microsoft.com/en-us/azure/devops/pipelines/library/service-endpoints there's a rich array of Service Connection types. I can easily manage a set of service connections at the project level and set permissions to limit which users are able to view/edit them -- this is all good.
But I can't figure out how to access a Service Connection with a script step in my build pipeline. For example, let's say I have a Service Connection representing credentials for an Azure Service Principal. I'd like to access those credentials in a script step.
How can I write a script step that makes use of them?
Because a Service Connection involves data shaped specifically to the connected service (the Generic Service Connection being the exception that proves the rule...), you won't be able to make use of strongly typed properties in your Bash task. Instead, you may want to examine environment variables and process the service connection data manually.
Based on a survey of some of the tasks in the Azure DevOps repos, it appears that service connections and their data are populated as environment variables on the agent running the build task. The service connections are retrieved via a method that runs a given name string through the following regex before retrieving the resultant environment key's value:
process.env[name.replace(/\./g, '_').toUpperCase()];
The retrieval of various Service Endpoint data is wrapped in the vsts-task-lib/task module, allowing consuming tasks to write code like so:
taskLib.getEndpointAuthorization('SYSTEMVSSCONNECTION', false);
taskLib.getEndpointDataParameter('MYSERVICECONNECTION', 'SOME_PARAMETER_NAME', false);
taskLib.getEndpointUrl('MYSERVICECONNECTION', false) // <-- last param indicates required or not
Therefore, if you want to access service connections in a Bash script without any additional customization, I would recommend that you:
a) validate the availability of service connection information in the build script task by iterating over and writing environment variables, with the system.debug variable set. There's some indication that build tasks aren't "seeded" with connections they aren't requesting specifically, so you may need to create a custom build task which has, as one of its inputs, the service connection name you want to use;
b) read the desired values from variables as outlined above in your Bash script. Service connection variable names may be computed similarly to this:
var dataParam = getVariable('ENDPOINT_DATA_' + id + '_' + key.toUpperCase());
You may need to iterate against this to determine the data schema/structure.
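That name computation can be sketched in Python (the endpoint id and key below are illustrative, not taken from a real pipeline):

```python
def endpoint_data_var(endpoint_id: str, key: str) -> str:
    # Mirrors the task-lib lookup: 'ENDPOINT_DATA_' + id + '_' + key.toUpperCase()
    return "ENDPOINT_DATA_" + endpoint_id + "_" + key.upper()

# Hypothetical endpoint id "1234" with a data parameter "subscriptionId":
print(endpoint_data_var("1234", "subscriptionId"))  # → ENDPOINT_DATA_1234_SUBSCRIPTIONID
```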
I've been wondering about this too. The solution I've settled on is to use the 'Azure CLI' task rather than the basic 'Script' (or 'Bash') task. This is ostensibly for running Az CLI commands, but there's nothing to stop you running only standard Bash scripts (or PSCore if that's your thing).
If you examine the environment variables present when you run this task, you'll see a bunch of information about the Service Connection in variables prefixed with 'ENDPOINT_DATA_'. This tallies up with what Josh E was saying. It includes Azure Subscription ID, name, Service Principle Object ID, etc.
Optionally you can enable the Service Principle details to be added to the environment too. This will then include SPN key, TenantID, etc. as secret environment variables.
Here's what the tasks look like:
- task: AzureCLI@2
  displayName: 'Azure CLI'
  inputs:
    scriptType: bash
    scriptLocation: inlineScript
    azureSubscription: '<Service Connection Name>'
    inlineScript: |
      env | sort

- task: AzureCLI@2
  displayName: 'Azure CLI, with SPN info'
  inputs:
    scriptType: bash
    scriptLocation: inlineScript
    azureSubscription: '<Service Connection Name>'
    addSpnToEnvironment: true
    inlineScript: |
      env | sort
Of course this is all only applicable to Azure Cloud Service Connections. There might be similar techniques you could use for other Service Connections, but I haven't investigated them.
I found that if I use the Kubectl task with the login command right before I run my Bash task, I do not need to authenticate or use a hostname.
KUBERNETESNODE and SERVICEPROTOCOL are pipeline variables that I set beforehand.
- task: Kubernetes@1
  displayName: 'Kubernetes Login'
  # This is needed to run kubectl commands from bash.
  inputs:
    connectionType: 'Kubernetes Service Connection'
    kubernetesServiceEndpoint: '<Service Connection Name>'
    command: 'login'

- task: Bash@3
  displayName: 'Run Component Test'
  inputs:
    targetType: 'inline'
    script: |
      # Get the node port
      nodePort=`kubectl get --namespace $(Build.BuildId) svc <service name> -o=jsonpath='{.spec.ports[0].nodePort}'`
      # Run Newman test
      newman run postman/Service.postman_collection.json --global-var host=$KUBERNETESNODE --global-var protocol=$SERVICEPROTOCOL --global-var port=$nodePort -r junit
I am using the same service connection in my scripts/tools as for the ARM deployments.
In order to export the variables, I created the following template.
parameters:
- name: azureSubscription
  type: string
- name: exportAsOutput
  type: boolean
  default: false

steps:
- task: AzureCLI@2
  name: exported_azure_credentials
  displayName: 'Export Azure Credentials'
  inputs:
    azureSubscription: '${{ parameters.azureSubscription }}'
    scriptType: pscore
    scriptLocation: inlineScript
    addSpnToEnvironment: true
    ${{ if eq(parameters.exportAsOutput, true) }}:
      inlineScript: |
        Write-Host "##vso[task.setvariable variable=AZURE_TENANT_ID]$env:tenantId"
        Write-Host "##vso[task.setvariable variable=AZURE_TENANT_ID;isOutput=true]$env:tenantId"
        Write-Host "##vso[task.setvariable variable=AZURE_CLIENT_ID]$env:servicePrincipalId"
        Write-Host "##vso[task.setvariable variable=AZURE_CLIENT_ID;isOutput=true]$env:servicePrincipalId"
        Write-Host "##vso[task.setvariable variable=AZURE_CLIENT_SECRET]$env:servicePrincipalKey"
        Write-Host "##vso[task.setvariable variable=AZURE_CLIENT_SECRET;isOutput=true]$env:servicePrincipalKey"
    ${{ if eq(parameters.exportAsOutput, false) }}:
      inlineScript: |
        Write-Host "##vso[task.setvariable variable=AZURE_TENANT_ID]$env:tenantId"
        Write-Host "##vso[task.setvariable variable=AZURE_CLIENT_ID]$env:servicePrincipalId"
        Write-Host "##vso[task.setvariable variable=AZURE_CLIENT_SECRET]$env:servicePrincipalKey"
DevOps is really clever about secrets, so they do not show up in the pipeline logs.
As others have stated, there isn't a great built-in way to access Service Connections with a script. As I do not like the workaround of exposing credentials via long-lived environment variables (for security and laziness purposes), I wrote an extension that allows you to utilize a Generic Service Connection with a custom script: https://marketplace.visualstudio.com/items?itemName=cloudpup.authenticated-scripts
It does this by exposing the service connection as environment variables that only last for the lifetime of the single script task:
Service Connection Variable    Environment Variable
url                            AS_SC_URL
username                       AS_SC_USERNAME
password                       AS_SC_PASSWORD
Example
The tasks included in this extension allow you to write succinct pipelines like the following:
steps:
- task: AuthenticatedPowerShell@1
  inputs:
    serviceConnection: 'Testing Authenticated Shell'
    targetType: inline
    script: 'Write-Host "url: $env:AS_SC_URL | username: $env:AS_SC_USERNAME | password: $env:AS_SC_PASSWORD"'
Yes, this can be achieved; I use this method all the time. You first need a task that outputs the credentials into environment variables; you then create your own variables from the environment variables that task outputs. I usually use the Azure CLI task:
# Set variables.
- task: AzureCLI@2
  name: setVariables
  displayName: Set Output Variables
  continueOnError: false
  inputs:
    azureSubscription: nameOfAzureServiceConnection
    scriptType: ps
    scriptLocation: inlineScript
    addSpnToEnvironment: true # this must be set to true
    inlineScript: |
      Write-Host "##vso[task.setvariable variable=azureClientId;isOutput=true]$($env:servicePrincipalId)"
      Write-Host "##vso[task.setvariable variable=azureClientSecret;isOutput=true]$($env:servicePrincipalKey)"
      Write-Host "##vso[task.setvariable variable=azureTenantId;isOutput=true]$($env:tenantId)"
Then you can use these new variables in a step below; make sure you reference them with the task name, i.e. $(taskName.variableName). The example below uses the new variables to set environment variables in a later PowerShell task for Terraform to use for authentication:
- powershell: |
    terraform plan -input=false -out=tfplan
  displayName: 'Terraform Plan'
  env:
    ARM_CLIENT_ID: $(setVariables.azureClientId)
    ARM_CLIENT_SECRET: $(setVariables.azureClientSecret)
    ARM_TENANT_ID: $(setVariables.azureTenantId)
ref: https://jimferrari.com/2021/11/15/access-azure-service-connection-via-script/
If you need the service connection to get authorized to different services/resources, you can also use it to obtain the required tokens and pass them to scripts that can't use the service connection directly, like:
- task: AzurePowerShell@5
  inputs:
    azureSubscription: 'AzureServiceConnection'
    ScriptType: 'InlineScript'
    Inline: |
      $token = Get-AzAccessToken
      echo "##vso[task.setvariable variable=accesstoken]$($token.Token)"
    azurePowerShellVersion: 'LatestVersion'

- script: 'echo This is the token: $(accesstoken)'
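A downstream script can then use the exported token as a standard Bearer token. A minimal sketch that only builds the request header (the token value is a placeholder standing in for $(accesstoken)):

```python
# Placeholder for the token exported by the AzurePowerShell step above.
access_token = "placeholder-access-token"  # stand-in for $(accesstoken)

# Azure REST APIs accept the token in a standard Authorization header.
headers = {"Authorization": f"Bearer {access_token}"}
print(headers["Authorization"])  # → Bearer placeholder-access-token
```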
