Azure Pipeline python script environment variables - python-3.x

I have a Python script with two lines of code where I have my username and password, and I don't want to push them to GitHub for obvious reasons, but they are important for my Azure DevOps pipeline to run successfully. Following some documentation, I used `os.environ.get` in my Python script to be able to retrieve the values from environment variables.
My code looks like this:
import os

from selenium.webdriver.common.by import By
from selenium.webdriver.support import expected_conditions as EC

usernameot = os.environ.get('USERNAMEOT')
passwordot = os.environ.get('PASSWORDOT')
# `wait` is a WebDriverWait created earlier in the script
wait.until(EC.element_to_be_clickable((By.CSS_SELECTOR, "div[class$='visible-lg'] input#signInFormUsername"))).send_keys(usernameot)
wait.until(EC.element_to_be_clickable((By.CSS_SELECTOR, "div[class$='visible-lg'] input#signInFormPassword"))).send_keys(passwordot)
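(Side note: os.environ.get returns None when the variable is missing, and send_keys(None) only fails much later. A quick guard, sketched below and not part of my original script, separates a missing variable from a genuine selector timeout:)

import os

usernameot = os.environ.get('USERNAMEOT')
passwordot = os.environ.get('PASSWORDOT')
# Fail fast with a clear message instead of a later, misleading timeout.
if not usernameot or not passwordot:
    raise RuntimeError('USERNAMEOT/PASSWORDOT not set in this task environment')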
This was the first step, and after this the problems started. I have an Azure pipeline that looks like this:
trigger: none

pool:
  vmImage: 'ubuntu-latest'

stages:
- stage:
  jobs:
  - job: Configuration
    steps:
    - task: UsePythonVersion@0
      inputs:
        versionSpec: '3.8'
        addToPath: true
    - script: |
        python -m pip install --upgrade pip
        pip install selenium
        printenv
    - task: PythonScript@0
      inputs:
        scriptSource: 'filePath'
        scriptPath: './script1.py'
      env:
        USERNAMEOT: $(usernameot)
        PASSWORDOT: $(passwordot)
  - job: Mailinator
    dependsOn: Configuration
    steps:
    - task: UsePythonVersion@0
      inputs:
        versionSpec: '3.x'
        addToPath: true
    - script: |
        python -m pip install --upgrade pip
        pip install selenium
    - task: PythonScript@0
      inputs:
        scriptSource: 'filePath'
        scriptPath: './script2.py'
I tried to pass the environment variables in all the ways I know.
The pipeline above is the latest one; I tried to store the variables in the Azure DevOps pipeline, but it fails as my Python script doesn't find the username value in the environment variables.
I tried to use a GitHub secret and environment, but it fails because it doesn't recognize the key secret.USERNAME.
Can any of you please help me understand how I can set an environment variable on my pipeline VM at runtime?
EDIT:
I tried all the solutions advised by bazetto (thank you so much for your help) but am still facing the same issue.
As you can see, the pipeline returns the correct values for my variables, but those are not passed to the Python script.
Even if the error points to the CSS selector not being found, I am pretty sure that the web driver is working, because before getting to the username and password there are different buttons to click, etc. So my guess is that when it comes to passing the username and password, as the variable does not get read from my configuration, the Selenium script times out.
Any advice about this, please?
UPDATE:
I made some updates to the pipeline and my Python script.
In Python I am now getting the username and password from a JSON config file that gets populated at runtime, and I am feeding those values to my script.
Following another piece of advice, I set the variables on each step of my pipeline, but I am still getting an error, as follows:
Traceback (most recent call last):
  File "D:\a\1\s\script1.py", line 49, in <module>
    wait.until(EC.element_to_be_clickable((By.CSS_SELECTOR, "div[class$='visible-lg'] input#signInFormUsername"))).send_keys(usernameot)
  File "C:\hostedtoolcache\windows\Python\3.8.10\x64\lib\site-packages\selenium\webdriver\support\wait.py", line 80, in until
    raise TimeoutException(message, screen, stacktrace)
selenium.common.exceptions.TimeoutException: Message
##[error]The process '/opt/hostedtoolcache/Python/3.8.11/x64/bin/python' failed with exit code 1
I really don't understand what I am doing wrong. I think I understand all the steps and what everyone is suggesting, but I might have missed something in my scripts and configuration for sure.

When you define variables in Azure DevOps you have the option to mark them as secrets. Above, username is not a secret and password is. In this case only the non-secret variables are mapped to environment variables.
You can check this with:
steps:
- script: env | sort
  displayName: 'Display env variables'
PATH=/home/linuxbrew/.linuxbrew/bin:/home/linuxbrew/.linuxbrew/sbin:/home/vsts/.local/bin:/opt/pipx_bin:/usr/share/rust/.cargo/bin:/home/vsts/.config/composer/vendor/bin:/usr/local/.ghcup/bin:/home/vsts/.dotnet/tools:/snap/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin
PIPELINE_WORKSPACE=/home/vsts/work/1
PIPX_BIN_DIR=/opt/pipx_bin
PIPX_HOME=/opt/pipx
...
TASK_DISPLAYNAME=Display env variables
TF_BUILD=True
USER=vsts
USERNAME=someusername
VCPKG_INSTALLATION_ROOT=/usr/local/share/vcpkg
However, there is still a way to map secrets to env variables: explicit mapping, like below.
- bash: |
    echo "Using the mapped env var for this task works and is recommended: $MY_MAPPED_PASSWORD"
  env:
    MY_MAPPED_PASSWORD: $(password)
But this mapping is limited to task scope, so you can use the env variable only in that task. If you need it in more tasks, you need to repeat the env mapping across every task.
So if you have both username and password defined as secrets, you need to map both values:
- task: PythonScript@0
  inputs:
    scriptSource: 'filePath'
    scriptPath: './script1.py'
  env:
    PASSWORD: $(password)
    USERNAME: $(username)
And if you see *** in logs, this is the masking Azure DevOps applies to hide your secret wherever it appears in logs.
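A quick way to confirm from the Python side which variables actually arrived (a sketch; it prints only names, so no secret values end up in the log):

import os

# Check presence without printing values, so secrets are never logged.
for name in ('USERNAME', 'PASSWORD'):
    print(name, 'is set' if os.environ.get(name) else 'is NOT set')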

You might face issues related to environment variables due to admin privileges.
My suggestion is to replace them with a file, e.g. config.json:

{
  "username": "",
  "password": ""
}
In your pipeline, add a step to populate it:
- stage: temp
  jobs:
  - job: temp
    displayName: temp
    variables:
    - name: USERNAME
      value: "USER1"
    - name: PASSWORD
      value: "MYPWD1"
    steps:
    - checkout: self
    - task: PowerShell@2
      displayName: "Set configuration"
      inputs:
        targetType: "inline"
        script: |
          # read config
          $config = Get-Content (Get-Item .\config.json) -Raw -Encoding UTF8 | ConvertFrom-Json
          # set the credentials
          $config.username = "$(USERNAME)" # assuming that you already have it
          $config.password = "$(PASSWORD)" # assuming that you already have it
          # update the file
          $config | ConvertTo-Json | Set-Content .\config.json
    - task: PowerShell@2
      displayName: "show configuration"
      inputs:
        targetType: "inline"
        script: |
          # debug
          cat config.json
The "show configuration" step should print the populated config.json.
In your Python app, read it back with something like:
import json

with open('config.json') as f:
    data = json.load(f)

username = data['username']
password = data['password']

# debug
print(data)

Related

How to set variables, defined in Azure variable group, as env variables in Azure pipeline?

We run integration tests, written in Python, in an Azure pipeline.
The run is triggered in this way:
- script: |
    pdm run pytest \
      --variables "$VARIABLE_FILE" \
      --napoleon-docstrings \
      --doctest-modules \
      --color=yes \
      --junitxml=junit/test-results.xml \
      integration
  env:
    <different_environment_variables>
Some integration tests make a connection to a database, and the properties needed for the connection are stored in a variable group in Azure. I can get them via:
- powershell: |
    az pipelines variable-group variable list --group-id <some_group_id> --output table
but I do not know how to set them as environment variables later and use them in the Python code.
You can reference the variables from your variable group directly in your Python YAML pipeline like below:
# Python package
# Create and test a Python package on multiple Python versions.
# Add steps that analyze code, save the dist with the build record, publish to a PyPI-compatible index, and more:
# https://docs.microsoft.com/azure/devops/pipelines/languages/python

trigger:
- master

pool:
  vmImage: ubuntu-latest

strategy:
  matrix:
    Python27:
      python.version: '2.7'
    Python35:
      python.version: '3.5'
    Python36:
      python.version: '3.6'
    Python37:
      python.version: '3.7'

variables:
- group: SharedVariables

steps:
- task: UsePythonVersion@0
  inputs:
    versionSpec: '$(python.version)'
  displayName: 'Use Python $(python.version)'

- script: |
    echo $(databaseservername)

- script: |
    python -m pip install --upgrade pip
    pip install -r requirements.txt
  displayName: 'Install dependencies'

- script: |
    pip install pytest pytest-azurepipelines
    pytest
  displayName: 'pytest'
variables:
- group: SharedVariables

- script: |
    echo $(databaseservername)
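On the Python side, a non-secret variable-group value like databaseservername can also be read straight from the environment, since non-secret variables are mapped automatically (a sketch; names are upper-cased, with dots becoming underscores):

import os

# databaseservername arrives as DATABASESERVERNAME in script steps.
server = os.environ.get('DATABASESERVERNAME')
print('database server:', server)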
The pipeline ran successfully. On the first run you are asked to authorize the pipeline's access to the variable group (Permit access); after permitting, the pipeline runs successfully and prints the variable.
You can also pass the variables to the Python task as arguments, or use an inline script, like below:
variables:
- group: variableGroup

steps:
- task: PythonScript@0
  displayName: 'Run a Python script'
  inputs:
    scriptPath: 'Test.py'
    arguments: --variableInScript $(variableInVariableGroup)
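A sketch of how Test.py might consume that argument; the flag name --variableInScript comes from the YAML above, the rest is illustrative:

import argparse

parser = argparse.ArgumentParser()
parser.add_argument('--variableInScript')  # value supplied by the pipeline
args = parser.parse_args()
print('received:', args.variableInScript)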
Inline Python task:
variables:
- group: SharedVariables

steps:
- task: UsePythonVersion@0
  inputs:
    versionSpec: '$(python.version)'
  displayName: 'Use Python $(python.version)'

- script: |
    echo $(databaseservername)

- task: PythonScript@0
  inputs:
    scriptSource: 'inline'
    script: 'print(''databaseservername: $(databaseservername)'')'
Output: the databaseservername value was masked in the log after being passed as an argument to the inline Python script.
You can also link the variable group in a release pipeline and use it in multiple pipelines across different stages.
Similarly, you can add an Azure CLI task to call az pipelines variable-group variable list from a shell script in your YAML pipeline and reference it like below:
- task: AzureCLI@2
  inputs:
    azureSubscription: '<Subscription-name>(sub-id)'
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      az pipelines variable-group variable list \
        --org 'https://dev.azure.com/org/' \
        --project project --group-id id \
        --only-show-errors --output json
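If you prefer to consume that list from Python, one option (a sketch, assuming the az CLI is installed and logged in on the agent) is to parse the JSON output, which maps variable names to objects with a value field:

import json
import subprocess

# Run the same CLI call as above and parse its JSON output.
raw = subprocess.run(
    ['az', 'pipelines', 'variable-group', 'variable', 'list',
     '--org', 'https://dev.azure.com/org/', '--project', 'project',
     '--group-id', 'id', '--only-show-errors', '--output', 'json'],
    capture_output=True, text=True, check=True,
).stdout

for name, props in json.loads(raw).items():
    print(name, props.get('value'))  # secret values come back as null/None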
References:
Access variables from Variable Groups inside Python script task in Azure DevOps Yaml pipeline, by Bo Soborg Perterson
Pass Variable Group as Dictionary To Python Script, by Joy Wang

Azure DevOps Parameters not recognised by AZ CLI

I am having issues passing parameters defined in an Azure Pipeline YAML to AZ Cli located in a bash script. I found the following solution on Stackoverflow but it doesn't seem to work:
Pass parameter to Azure CLI Task in DevOps
Here is my YAML file:
# Starter pipeline
# Start with a minimal pipeline that you can customize to build and deploy your code.
# Add steps that build, run tests, deploy, and more:
# https://aka.ms/yaml

trigger:
- master

pool:
  vmImage: 'ubuntu-latest'

steps:
- task: AzureCLI@2
  displayName: Azure CLI
  inputs:
    azureSubscription: templateresourceconnection
    scriptType: bash
    scriptPath: ./deployment.sh
    arguments:
      -resourceGrp 'TEST-RG'
      -location 'westeurope'
I would expect to be able to access the arguments in my deployment.sh, which fails:
#!/bin/bash

# Create Resource Group
az group create -n $(resourceGrp) -l $(location)
If I don't pass any arguments and hardcode the values in deployment.sh it all works fine.
Any ideas what could cause this issue? I also tried with UPPERCASE and just without brackets.
I get the following error message
Do you have any idea what else I could try to make it work? It seems like the documentation doesn't contain any example for the az CLI; just how to define parameters, but not how to pass them afterwards.
Thank you.
Do you have any idea what else I could try to make it work
You could try using environment variables, which bash script files can access directly.
First of all, you need to define a pipeline variable instead of a task argument:
variables:
- name: one
  value: initialValue
Here is my example (on a Linux agent):

az group create -n $RESOURCEGRP -l $LOCATION
Note: pipeline variable names are upper-cased when exposed as environment variables (and any '.' becomes '_'),
e.g. resourceGrp -> $RESOURCEGRP
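The renaming rule is easy to express (a sketch for illustration):

def env_name(pipeline_var: str) -> str:
    # Pipeline variable names are upper-cased and '.' becomes '_'
    # when they are exposed as environment variables.
    return pipeline_var.upper().replace('.', '_')

assert env_name('resourceGrp') == 'RESOURCEGRP'
assert env_name('python.version') == 'PYTHON_VERSION'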
Yaml sample:

variables:
- name: resourceGrp
  value: TEST-RG
- name: location
  value: westeurope

pool:
  vmImage: 'ubuntu-latest'

steps:
- task: AzureCLI@2
  displayName: 'Azure CLI deploy.sh'
  inputs:
    azureSubscription: kevin0209
    scriptType: bash
    scriptPath: ./deployment.sh

Azure DevOps accessing two Key Vaults with duplicate secret names

I currently have an azure build pipeline that needs to access two different Key Vaults. Unfortunately both of the secrets I am trying to access have a name of SQLUserName. I am trying to pass these as arguments to a python script. I am looking for a way that I could qualify or differentiate between the secrets when passing the arguments.
Ideally I would like to access the variable qualified something like $(ServiceConnection1.SQLUserName) But I can't find any information on this.
I have been researching a way to rename a variable, so I could possibly run the first Key Vault task, then rename $(SQLUserName) to $(SQLUserNamefoo), then run the second Key Vault task and rename $(SQLUserName) to $(SQLUserNamebar). I can't seem to find any way to rename a variable in YAML.
trigger:
- main

pool:
  vmImage: 'ubuntu-latest'

strategy:
  matrix:
    Python37:
      python.version: '3.7'

steps:
- task: AzureKeyVault@1
  inputs:
    azureSubscription: 'ServiceConnection1'
    KeyVaultName: 'Vault1'
    SecretsFilter: '*'
    RunAsPreJob: true

- task: AzureKeyVault@1
  inputs:
    azureSubscription: 'ServiceConnection2'
    KeyVaultName: 'Vault2'
    SecretsFilter: '*'
    RunAsPreJob: true

- task: UsePythonVersion@0
  inputs:
    versionSpec: '$(python.version)'
  displayName: 'Use Python $(python.version)'

- script: |
    python -m pip install --upgrade pip
    pip install -r requirements.txt
  displayName: 'Install dependencies'

- task: PythonScript@0
  inputs:
    scriptSource: 'filePath'
    scriptPath: 'keyVaultTest.py'
    arguments: '$(SQLUserName)'
    # ideal way to work:
    # arguments: '$(SQLUserName1) $(SQLUserName2)'
Azure DevOps accessing two Key Vaults with duplicate secret names
We could add an inline PowerShell task with a logging command to set the variable SQLUserNamefoo with value $(SQLUserName) after the first AzureKeyVault task:
Write-Host ("##vso[task.setvariable variable=SQLUserNamefoo]$(SQLUserName)")
Then we could use $(SQLUserNamefoo) in the next tasks.
And we could add another inline PowerShell task to set the variable SQLUserNamebar with value $(SQLUserName) after the second AzureKeyVault task:
Write-Host ("##vso[task.setvariable variable=SQLUserNamebar]$(SQLUserName)")
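The ##vso logging command is just a line written to stdout, so the rename could equally come from a Python step (a sketch; it assumes the secret has first been mapped into this step's env, since secret variables are not mapped automatically):

import os

# Re-publish the mapped secret under a new pipeline variable name.
print('##vso[task.setvariable variable=SQLUserNamefoo]' + os.environ['SQLUSERNAME'])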
As a test, I created a Key Vault secret SQLUserName with value Leotest. In order to verify that SQLUserNamefoo is set to $(SQLUserName), I defined SQLUserNamefoo in the Variables with value 123.
And I added another PowerShell task to output the value of SQLUserNamefoo to a txt file to verify it:
cd $(System.DefaultWorkingDirectory)
New-Item $(System.DefaultWorkingDirectory)\temp -type directory
cd temp
New-Item a.txt -type file
write-output $(SQLUserNamefoo) | out-file -filepath $(System.DefaultWorkingDirectory)\temp\a.txt
The txt file then contains the Key Vault value (Leotest), not the 123 default, confirming the rename.

Referencing release variables in azurepipelines.yml

Currently I am in the process of converting my pipelines from classic over to azurepipelines.yml, and I'm having an issue trying to find the correct syntax to reference release variables in a bash step.
The existing code in a bash task
namebuilder=$(RELEASE.ENVIRONMENTNAME)-$(RELEASE.RELEASEID)
will output the following
dev-2049
however, when converted over to my new pipeline file, the above code produces the following error
/home/vsts/work/_temp/ac39e1d7-11bd-4c32-9b1b-1520dae11c5a.sh: line 1: RELEASE.ENVIRONMENTNAME: command not found
/home/vsts/work/_temp/ac39e1d7-11bd-4c32-9b1b-1520dae11c5a.sh: line 1: RELEASE.RELEASEID: command not found
[extracted from pipeline.yml]
- bash: |
    namebuilder=$(RELEASE.ENVIRONMENTNAME)-$(RELEASE.RELEASEID)
I have even created a step trying a few different approaches, without much luck:
steps:
- bash: |
    echo This multiline script always runs in Bash.
    echo Even on Windows machines!
    echo '$(release.environmentname)'
    echo $(release.environmentname)
    echo '$(RELEASE.ENVIRONMENTNAME)'
    echo $(RELEASE.ENVIRONMENTNAME)
produces
This multiline script always runs in Bash.
Even on Windows machines!
$(release.environmentname)
$(RELEASE.ENVIRONMENTNAME)
/home/vsts/work/_temp/260dd504-a42d-45d6-bb1b-bf1f4b015cf8.sh: line 4: release.environmentname: command not found
/home/vsts/work/_temp/260dd504-a42d-45d6-bb1b-bf1f4b015cf8.sh: line 6: RELEASE.ENVIRONMENTNAME: command not found
Is it also possible (as a much cleaner approach) to define this as a pipeline variable and reference it at a global scope, like below?
variables:
  namebuilder: '$(release.environmentname)-$(release.releaseid)'

stages:
- stage: Deploy
  displayName: deploy infra
  jobs:
  - job: deploy_infra
    displayName: deploy infra
    continueOnError: true
    workspace:
      clean: outputs
    steps:
    - bash: |
        echo This multiline script always runs in Bash.
        echo Even on Windows machines!
        echo '$(namebuilder)'
tia
It doesn't look like release.environment or any release variables are available for multi-stage pipelines. You could use the new environment concept and at that point environment.name would be available. I think you would likely go with $(environment.name)-$(build.buildid) for what you are after.
So I am not sure if the release pipelines you are converting are deploying to, say, an app service, or to a VM, or just using a hosted agent to publish something else? Disclaimer: I have not used the Environment concept extensively yet, just some reading and limited testing. It's all new!
So for deploying to VMs You can configure a Virtual Machine resource in an environment. This concept has a bunch of parallels with classic deployment group agents. You register an agent on a target machine. From there your pipeline steps can execute in that machine's context and you get a further set of environment variables.
The example pipeline below outputs any environment variables from the context the steps are running in and also outputs $(environment.name)-$(build.buildid)
A normal job in a hosted pipeline
A Deployment to an Environment
A Deployment to an Environment with a VM resource
trigger:
- master

pool:
  vmImage: 'ubuntu-latest'

variables:
  namebuilder: '$(environment.name)-$(build.buildid)'

jobs:
- job: NormalJobInHostedPipeline
  steps:
  - task: PowerShell@2
    name: EnvironmentVariables
    inputs:
      targetType: 'inline'
      script: 'gci env:* | sort-object name'
  - bash: |
      echo This multiline script always runs in Bash.
      echo Even on Windows machines!
      echo '$(namebuilder)'

# track deployments on the environment
- deployment: DeploymentHostedContext
  displayName: Runs in Hosted Pool
  pool:
    vmImage: 'Ubuntu-16.04'
  # auto creates an environment if it doesn't exist
  environment: 'Dev'
  strategy:
    runOnce:
      deploy:
        steps:
        - task: PowerShell@2
          name: EnvironmentVariables
          inputs:
            targetType: 'inline'
            script: 'gci env:* | sort-object name'
        - bash: |
            echo This multiline script always runs in Bash.
            echo Even on Windows machines!
            echo '$(namebuilder)'

# Similar to Deployment Group Agent, need to register them - stage will fail if resource does not exist
# https://learn.microsoft.com/en-us/azure/devops/pipelines/process/environments-virtual-machines?view=azure-devops
- deployment: DeploymentVirtualMachineContext
  displayName: Run On Virtual Machine Agent
  environment:
    name: DevVM
    resourceType: VirtualMachine
  strategy:
    runOnce:
      deploy:
        steps:
        - task: PowerShell@2
          name: EnvironmentVariables
          inputs:
            targetType: 'inline'
            script: 'gci env:* | sort-object name'
        - task: PowerShell@2
          name: VariableName
          inputs:
            targetType: 'inline'
            script: 'echo $(namebuilder)'
Use $(System.StageName) in place of $(Release.EnvironmentName), as for release Id, you'd need to use $(Build.BuildId)
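In a script step those predefined variables are also available through the environment, following the usual name mapping (a sketch):

import os

# System.StageName -> SYSTEM_STAGENAME, Build.BuildId -> BUILD_BUILDID
print(os.environ.get('SYSTEM_STAGENAME', '') + '-' + os.environ.get('BUILD_BUILDID', ''))  # e.g. Deploy-2049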
I found that $(Environment.Name) doesn't work unless you're using environments. I'm not since it's still quite limited.

How to pass environment specific values to Azure pipeline?

I am deploying Service Fabric Application packages and I have several (~15) devtest environments, any one of which can be used to test a code fix. I can pass in the Service Connection so deploying the final package is not the issue. What I can't figure out is how to set the other environment specific variables based on the target environment.
I tried using the Service Connection name to pick one of several variable template files:
variables:
- name: envTemplateFileTest
  ${{ if eq( variables['DevConnection'], 'Environ01' ) }}:
    value: ../Templates/DEV01-Variables-Template.yml
  ${{ if eq( variables['DevConnection'], 'Environ02' ) }}:
    value: ../Templates/DEV02-Variables-Template.yml
... (snip) ...

variables:
- template: ${{ variables.envTemplateFile }}
But UI variables are not set at compile time. So the template expressions see blank values and fail.
I could use a pipeline variable but then QA would have to make a file change and check it in each time they want to deploy to a different environment than last time.
What I currently have is an empty variable template and a powershell script that sets the values based on different script names.
- task: PowerShell@2
  inputs:
    targetType: 'filePath'
    filePath: '$(Build.ArtifactStagingDirectory)\drop\Deployment\Code\Scripts\Set-$(DevConnection)Variables.ps1'
    #arguments: # Optional
  displayName: Set environment variables
There has got to be a better way than this. Please.
There is not a direct way to achieve this, as template expressions are parsed at compile time.
However, I have a workaround that needs no additional PS script and avoids making a file change and checking it in to your repo each time.
Since all your devtest environments have the same deployment steps, you can create a steps template YAML to hold the deployment steps.
Then you can modify your azure-pipelines.yml like the example below:
jobs:
- job: A
  pool:
    vmImage: 'windows-latest'
  steps:
  - powershell: |
      $con = "$(connection)"
      if($con -eq "environ1"){echo "##vso[task.setvariable variable=variablegroup;isOutput=true]environ1"}
      if($con -eq "environ2"){echo "##vso[task.setvariable variable=variablegroup;isOutput=true]environ2"}
    name: setvarStep
  - script: echo '$(setvarStep.variablegroup)'

- job: environ1
  pool:
    vmImage: 'windows-latest'
  dependsOn: A
  condition: eq(dependencies.A.outputs['setvarStep.variablegroup'], 'environ1')
  variables:
  - template: environ1.yaml
  steps:
  - template: deploy-jobs.yaml

- job: environ2
  pool:
    vmImage: 'windows-latest'
  dependsOn: A
  condition: eq(dependencies.A.outputs['setvarStep.variablegroup'], 'environ2')
  variables:
  - template: environ2.yml
  steps:
  - template: deploy-jobs.yaml
The YAML pipeline above uses dependencies and conditions. The first job, A, outputs a variable according to the variable (e.g. $(connection)) you specify when running the pipeline. The following jobs have conditions that evaluate that output variable: a job runs if its condition is satisfied and is skipped otherwise.
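The selector job's logic doesn't have to be PowerShell; the ;isOutput=true suffix is what makes the value addressable from other jobs, and the line can come from any stdout, e.g. Python (a sketch using the same $(connection) variable, assumed to be mapped into the step's environment):

import os

con = os.environ.get('CONNECTION', '')
if con in ('environ1', 'environ2'):
    # isOutput=true exposes this as dependencies.A.outputs['setvarStep.variablegroup']
    print('##vso[task.setvariable variable=variablegroup;isOutput=true]' + con)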
What we decided to do was add a PowerShell script step that sets the variables based on a string passed in.
- task: PowerShell@2
  inputs:
    targetType: 'filePath'
    filePath: $(Build.ArtifactStagingDirectory)\drop\Deployment\Code\Scripts\Set-DefaultValues.ps1
  displayName: Set default pipeline variables
Then we load the appropriate file and loop through the variables, setting each in turn.
param(
    [string]
    $EnvironmentName
)

$environmentValues = @{}

switch ($EnvironmentName) {
    'DEV98' { . '.\Dev98-Values.ps1' }
    'DEV99' { . '.\Dev99-Values.ps1' }
}

foreach ($keyName in $environmentValues.Keys) {
    Write-Output "##vso[task.setvariable variable=$($keyName)]$($environmentValues[$keyName])"
}
This allows us to put the environment-specific variables in a plain PowerShell hashtable file and dot-source it:
$environmentValues = @{
    currentYear = '2020';
    has_multiple_nodetypes = 'false';
    protocol = 'http';
    endpoint = 'vm-dev98.cloudapp.com';
    ... snip ...
}
So QA has an easier time maintaining the different environment files.
Hope this helps others out there.
