Key Vault secret values are displayed as plain text in Bash script - azure

The AzureKeyVault@1 task retrieves all the secrets; some of the secrets are displayed as *** whereas some newly created ones are shown as plain text.
A part of my pipeline:
steps:
- task: AzureKeyVault@1
  displayName: Download secrets from KeyVault
  inputs:
    azureSubscription: azure_sub
    KeyVaultName: key_vault
    SecretsFilter: '*'
    RunAsPreJob: true
- task: PipAuthenticate@1
  displayName: Authentication step
  inputs:
    artifactFeeds: organization
    onlyAddExtraIndex: true
- script: |
    echo "##vso[task.setvariable variable=keyvault_variable;isOutput=true]$(keyvault_variable)"
  displayName: Set environment variables
  name: SetVariables
- stage: Stage2
  jobs:
  - job: check_if_encrypted
    steps:
    - task: CmdLine@2
      displayName: Write secrets
      inputs:
        script: |
          echo $(keyvault_variable)
Are there any changes to Azure Key Vault, or is something wrong with the pipeline?
Thanks

You're creating an unencrypted copy of the secret value with echo "##vso[task.setvariable variable=keyvault_variable;isOutput=true]$(keyvault_variable)". You should specify isSecret=true if you want it to continue to be a secret.
Refer to the documentation for more details.

It seems we have to explicitly add issecret=true to the
echo "##vso[task.setvariable variable=keyvault_variable;isOutput=true]$(keyvault_variable)" script; only then is the value masked.
What is not clear is why this has to be set explicitly for certain secrets, whereas for others masking worked without it.
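For reference, the step from the question with both properties set would look like this (isOutput kept for cross-stage use, issecret added so the value stays masked in logs):

```yaml
- script: |
    echo "##vso[task.setvariable variable=keyvault_variable;isOutput=true;issecret=true]$(keyvault_variable)"
  displayName: Set environment variables
  name: SetVariables
```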

Related

Make Azure Keyvault secrets available in entire pipeline

In order to access my secret from the keyvault, I run
- task: AzureKeyVault@2
  inputs:
    azureSubscription: $(KEYVAULT_SC_DEV)
    KeyVaultName: $(KEYVAULT_NAME_DEV)
    SecretsFilter: APICREDENTIALS
    RunAsPreJob: true
which works fine.
However, I have multiple jobs and am now facing the trouble of having to repeat these lines too many times.
So, is there a way to tell Azure DevOps that this secret should be set globally for each job/stage/step, etc.?
If you want to make Azure Key Vault secrets available across multiple jobs or stages with the AzureKeyVault@2 task, you can map the task's output variables into the other jobs and stages.
For example, I've created a secret named password in my Key Vault.
Across multiple jobs:
variables:
  # map the output variable from A into this job
  password-job-b: $[ dependencies.A.outputs['ouputvariable.mypassword'] ]
Across multiple stages:
variables:
  # map the output variable from stage One, job A into this stage
  password-stage-two: $[ stageDependencies.One.A.outputs['ouputvariable.mypassword'] ]
Across the whole job:
- task: AzureKeyVault@2
  inputs:
    RunAsPreJob: true ## Make the secret(s) available to the whole job
Full YAML sample:
trigger: none

pool:
  vmImage: ubuntu-latest

stages:
- stage: One
  jobs:
  - job: A
    steps:
    - task: AzureKeyVault@2
      inputs:
        azureSubscription: 'your subscription'
        KeyVaultName: 'your keyvault name'
        SecretsFilter: '*'
        RunAsPreJob: true
    - task: Bash@3
      inputs:
        targetType: 'inline'
        script: 'echo "##vso[task.setvariable variable=mypassword;isOutput=true]$(password)"'
      name: ouputvariable
  - job: B
    dependsOn: A
    variables:
      # map the output variable from A into this job
      password-job-b: $[ dependencies.A.outputs['ouputvariable.mypassword'] ]
    steps:
    - script: echo this is password :$(password-job-b) # this step uses the mapped-in variable
- stage: Two
  variables:
    # map the output variable from A into this stage
    password-stage-two: $[ stageDependencies.One.A.outputs['ouputvariable.mypassword'] ]
  jobs:
  - job: C
    steps:
    - script: echo this is password :$(password-stage-two) # this step uses the mapped-in variable
Result across multiple jobs:
Result across multiple stages:
UPDATE
When issecret is set to true, the value of the variable will be saved as a secret.
script: 'echo "##vso[task.setvariable variable=mypassword;isOutput=true;issecret=true]$(password)"'
If you want these secrets available to multiple pipelines, one way would be to use library variable groups
and reference these in your pipeline:
https://learn.microsoft.com/en-us/azure/devops/pipelines/library/variable-groups?view=azure-devops&tabs=yaml#use-a-variable-group
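For example, a pipeline can pull such a group in with a group reference; 'keyvault-secrets' is a hypothetical group name (a variable group can also be linked directly to the Key Vault in the Library):

```yaml
variables:
- group: 'keyvault-secrets'  # hypothetical variable group name from Pipelines > Library

steps:
- script: echo "$(APICREDENTIALS)"  # secret group variables stay masked as *** in logs
```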
If you want these secrets available to multiple stages/jobs/steps within the same pipeline, one way would be to create a pipeline variable:
variables:
  secretValue: ''

jobs:
- job: RetrieveSecret
  steps:
  - task: AzureKeyVault@2
    inputs:
      azureSubscription: $(KEYVAULT_SC_DEV)
      KeyVaultName: $(KEYVAULT_NAME_DEV)
      SecretsFilter: APICREDENTIALS
      OutputVariable: secretValue
Here the RetrieveSecret job retrieves the secret from the Key Vault and stores it in the secretValue pipeline variable. Once the secret has been stored in the pipeline variable, you can reference it from any task in your pipeline by using the $(pipelineVariableName) syntax.
The caveat here is that pipeline variables are scoped to a specific job; if you want to use the same value across different jobs, you need to pass it on to the next job, sort of like below:
jobs:
- job: Job1
  steps:
  - task: AzureKeyVault@2
    inputs:
      azureSubscription: $(KEYVAULT_SC_DEV)
      KeyVaultName: $(KEYVAULT_NAME_DEV)
      SecretsFilter: APICREDENTIALS
      OutputVariable: secretValue
- job: Job2
  dependsOn: Job1
  variables:
    secretInput: $(secretValue)
  steps:
  - task: SomeTask
    inputs:
      secret: $(secretInput)
We can use variable groups to pass values into a YAML pipeline and make them available across all pipelines.
Step 1:
Store the Key Vault secret values in a variable group.
Step 2:
Use that variable group in any pipeline.
Here is the reference: tutorial from Thomas Thornton

unable to pass parameter from azure Devops yaml to PowerShell

parameters:
- name: AzureSubscription
  default: 'abc'
- name: BlobName
  type: string
  default: ""

stages:
- stage: MyStage
  displayName: 'My Stage'
  variables:
  - name: sas
  jobs:
  - job: ABC
    displayName: ABC
    steps:
    - task: AzureCLI@2
      displayName: 'XYZ'
      inputs:
        azureSubscription: ${{ parameters.AzureSubscription }}
        scriptType: pscore
        scriptLocation: inlineScript
        inlineScript: |
          $sas=az storage account generate-sas --account-key "mykey" --account-name "abc" --expiry (Get-Date).AddHours(100).ToString("yyyy-MM-dTH:mZ") --https-only --permissions rw --resource-types sco --services b
          Write-Host "My Token: " $sas
    - task: PowerShell@2
      inputs:
        targetType: 'filePath'
        filePath: $(System.DefaultWorkingDirectory)/psscript.ps1
        arguments: >
          -Token "????"
          -BlobName "${{ parameters.BlobName }}"
      displayName: 'some work'
In this Azure DevOps YAML, I have created two tasks, AzureCLI@2 and PowerShell@2.
In AzureCLI@2 I get a value in the $sas variable. Write-Host confirms that, but $sas does not get passed as a parameter to the PowerShell@2 script file.
"${{ parameters.BlobName }}" is working fine; in PowerShell I am able to read that value.
How do I pass the sas variable's value?
Tried:
-Token $sas # did not work
-Token "${{sas}}" # did not work
Different tasks in an Azure Pipeline don't share a common runspace that would allow them to preserve or pass on variables.
For this reason Azure Pipelines offers special logging commands that allow to take string output from a task to update an Azure Pipeline environment variable that can be used in subsequent tasks: Set variables in scripts (Microsoft Docs).
In your case you would use a logging command like this to make your sas token available to the next task:
Write-Host "##vso[task.setvariable variable=sas]$sas"
In the arguments of your subsequent task (within the same job), use the variable syntax of Azure Pipelines:
-Token '$(sas)'
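Put together, this is roughly how the two tasks from the question would look (the az command is abbreviated with '...'; adding issecret=true to the logging command would additionally keep the token masked in logs):

```yaml
- task: AzureCLI@2
  inputs:
    azureSubscription: ${{ parameters.AzureSubscription }}
    scriptType: pscore
    scriptLocation: inlineScript
    inlineScript: |
      $sas = az storage account generate-sas ...  # as in the question
      # hand the value to the pipeline so later tasks can read $(sas)
      Write-Host "##vso[task.setvariable variable=sas]$sas"
- task: PowerShell@2
  inputs:
    targetType: 'filePath'
    filePath: $(System.DefaultWorkingDirectory)/psscript.ps1
    arguments: >
      -Token '$(sas)'
      -BlobName "${{ parameters.BlobName }}"
```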

Azure DevOps accessing two Key Vaults with duplicate secret names

I currently have an Azure build pipeline that needs to access two different Key Vaults. Unfortunately, both of the secrets I am trying to access are named SQLUserName. I am trying to pass these as arguments to a Python script, and I am looking for a way to qualify or differentiate between the secrets when passing the arguments.
Ideally I would like to access the variable qualified, something like $(ServiceConnection1.SQLUserName), but I can't find any information on this.
I have been researching a way to rename a variable, so I could possibly run the first Key Vault task, rename $(SQLUserName) to $(SQLUserNamefoo), then run the second Key Vault task and rename $(SQLUserName) to $(SQLUserNamebar). I can't seem to find any way to rename a variable in YAML.
trigger:
- main

pool:
  vmImage: 'ubuntu-latest'
strategy:
  matrix:
    Python37:
      python.version: '3.7'

steps:
- task: AzureKeyVault@1
  inputs:
    azureSubscription: 'ServiceConnection1'
    KeyVaultName: 'Vault1'
    SecretsFilter: '*'
    RunAsPreJob: true
- task: AzureKeyVault@1
  inputs:
    azureSubscription: 'ServiceConnection2'
    KeyVaultName: 'Vault2'
    SecretsFilter: '*'
    RunAsPreJob: true
- task: UsePythonVersion@0
  inputs:
    versionSpec: '$(python.version)'
  displayName: 'Use Python $(python.version)'
- script: |
    python -m pip install --upgrade pip
    pip install -r requirements.txt
  displayName: 'Install dependencies'
- task: PythonScript@0
  inputs:
    scriptSource: 'filePath'
    scriptPath: 'keyVaultTest.py'
    arguments: '$(SQLUserName)'
    # ideal way to work:
    # arguments: '$(SQLUserName1) $(SQLUserName2)'
We could add an inline PowerShell task with a logging command to set the variable SQLUserNamefoo with the value of $(SQLUserName) after the first AzureKeyVault task:
Write-Host ("##vso[task.setvariable variable=SQLUserNamefoo]$(SQLUserName)")
Then we could use $(SQLUserNamefoo) in the next tasks.
And we could add another inline PowerShell task to set the variable SQLUserNamebar with the value of $(SQLUserName) after the second AzureKeyVault task:
Write-Host ("##vso[task.setvariable variable=SQLUserNamebar]$(SQLUserName)")
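As a sketch, the full ordering would look as below (connection and vault names taken from the question; RunAsPreJob is deliberately omitted so each vault task runs in order as a regular step, after the previous rename step and before the next vault overwrites $(SQLUserName)):

```yaml
steps:
- task: AzureKeyVault@1
  inputs:
    azureSubscription: 'ServiceConnection1'
    KeyVaultName: 'Vault1'
    SecretsFilter: '*'
- powershell: Write-Host ("##vso[task.setvariable variable=SQLUserNamefoo]$(SQLUserName)")
  displayName: 'Rename first SQLUserName'
- task: AzureKeyVault@1
  inputs:
    azureSubscription: 'ServiceConnection2'
    KeyVaultName: 'Vault2'
    SecretsFilter: '*'
- powershell: Write-Host ("##vso[task.setvariable variable=SQLUserNamebar]$(SQLUserName)")
  displayName: 'Rename second SQLUserName'
```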
As a test, I created a Key Vault secret SQLUserName with the value Leotest. In order to verify that SQLUserNamefoo is set to $(SQLUserName), I defined SQLUserNamefoo in the pipeline Variables with the value 123:
And add another powershell task to output the value of SQLUserNamefoo to a txt file to verify it:
cd $(System.DefaultWorkingDirectory)
New-Item $(System.DefaultWorkingDirectory)\temp -type directory
cd temp
New-Item a.txt -type file
write-output $(SQLUserNamefoo) | out-file -filepath $(System.DefaultWorkingDirectory)\temp\a.txt
The result of txt file:

Referencing Azure Key Vault secrets from CI/CD YAML

We have a multi-stage YAML pipeline that does CI/CD to an existing set of Azure Resources
The stages are
Build
Deploy to Development and Run Tests
If Previous succeeded - Deploy to Production and Run Tests
We use the AzureRmWebAppDeployment task during the deployment stages and we use the AppSettings argument to that task to specify environment-specific settings. For example
- task: AzureRmWebAppDeployment@4
  displayName: 'Deploy Azure App Service'
  inputs:
    azureSubscription: '$(azureSubscriptionEndpoint)'
    appType: '$(webAppKind)'
    WebAppName: 'EXISTING__AZURE_RESOURCENAME-DEV'
    Package: '$(Pipeline.Workspace)/**/*.zip'
    AppSettings: >
      -AzureAd:CallbackPath /signin-oidc
      -AzureAd:ClientId [GUID was here]
      -AzureAd:Domain [domain was here]
      -AzureAd:Instance https://login.microsoftonline.com/
      -AzureAd:TenantId [Id was here]
      -EmailServer:SMTPPassword SECRETPASSWORD
      -EmailServer:SMTPUsername SECRETUSERNAME
There are two settings in that set, EmailServer:SMTPUsername and EmailServer:SMTPPassword, that I want to pull from an Azure Key Vault. I know how to reference the KV secret from the Azure Portal using the syntax
@Microsoft.KeyVault(SecretUri=https://our.vault.azure.net/secrets/SendGridPassword/ReferenceGuidHere)
but how do I reference the value from the YAML pipeline so it is set in Azure?
As pointed out by Thomas in a comment on "Referencing Azure Key Vault secrets from CI/CD YAML",
I can explicitly set the value in the YAML file like this:
-EmailServer:SMTPPassword @Microsoft.KeyVault(SecretUri=https://our.vault.azure.net/secrets/SendGridPassword/ReferenceGuidHere)
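In context, the AppSettings block from the question would carry the Key Vault references like this (the username secret name SendGridUsername is assumed here for illustration; leaving off the version GUID after the secret name resolves to the latest version):

```yaml
AppSettings: >
  -EmailServer:SMTPUsername @Microsoft.KeyVault(SecretUri=https://our.vault.azure.net/secrets/SendGridUsername/)
  -EmailServer:SMTPPassword @Microsoft.KeyVault(SecretUri=https://our.vault.azure.net/secrets/SendGridPassword/ReferenceGuidHere)
```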
You need to add an AzureKeyVault@1 task with RunAsPreJob set to true; this will make your Key Vault values available as environment variables for the job, so you can use them as $(KEY-OF-SECRET-VALUE) in the rest of the steps of the job.
The following piece of YAML is a working example.
We set, for Python unittest, a set of environment variables provided from Azure Key Vault:
trigger:
  batch: true # disable concurrent builds for this pipeline
  branches:
    include:
    - '*' # CI starts for all branches

pool:
  vmImage: ubuntu-16.04

stages:
- stage: Test
  jobs:
  - job: sample_test_stage
    steps:
    - task: AzureKeyVault@1
      inputs:
        azureSubscription: 'YOUR SUBSCRIPTION HERE'
        KeyVaultName: 'THE-KEY-VAULT-NAME'
        SecretsFilter: '*'
        RunAsPreJob: true
    - task: UsePythonVersion@0
      inputs:
        versionSpec: '3.7'
    - script: python -m unittest discover -v -s tests
      displayName: 'Execute python unittest'
      env: { MY-ENV-VAL-1: $(SECRET-VALUE-1), MY-ENV-VAL-2: $(SECRET-VALUE-2) }
Note that sometimes you need to approve the connection between Azure DevOps and another Azure service, like Key Vault.

Changing Azure YAML Pipeline causes authorization to be lost to Resources

I've had this happen to me a number of times: I have a working Azure pipeline written in YAML, then I change the pipeline and get the error that there was a resource authorization issue. Typically I delete the pipeline, re-create it, and then it works. However, now that is not working and I continuously get the following error:
So, I click the little button, and it pops up saying that resources have been authorized. I attempt to run the pipeline again, and I get the same error.
I am an Account/Collection/Organization Administrator, and I created the library group originally, where it is set to have Access to all pipelines enabled. I've tried renaming the pipeline and re-creating it a few times, to the same error. Short of reverting the pipeline back to its original state, what should I do?
--EDIT--
Simply resetting the branch to an earlier version of the pipeline worked. I still have no clue why moving the steps into stages and jobs failed, though.
--EDIT--
Below is the YAML I used originally and the updated version. When the updated version gave resource authorization issues, I performed a git log, took the commit id of the previous commit that worked, and did a git reset $commitId. I pushed the reset branch back up to Azure DevOps, and then it just magically worked.
Original Azure Pipeline YAML:
---
trigger: none

variables:
- name: ProjectFolder
  value: tf-datafactory
- group: 'Deploy_Terraform_Library_Group'

pool:
  vmImage: 'ubuntu-latest'

steps:
- task: replacetokens@3
  displayName: Replace tokens
  inputs:
    targetFiles: '$(System.DefaultWorkingDirectory)/$(ProjectFolder)/variables.tf'
    encoding: 'auto'
    writeBOM: true
    verbosity: 'detailed'
    actionOnMissing: 'warn'
    keepToken: false
    tokenPrefix: '#{{'
    tokenSuffix: '}}#'
- task: AzureCLI@2
  displayName: Get the storage account key
  inputs:
    azureSubscription: '$(ARM.SubscriptionEndpoint)'
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      export ContainerAccessKeyExport=$(az storage account keys list \
        --resource-group $(StorageResourceGroupName) \
        --account-name $(StorageAccountName) \
        --query "[0].value")
      echo "##vso[task.setvariable variable=ContainerAccessKey]$ContainerAccessKeyExport"
...
I then moved these steps into stages and jobs.
---
parameters:
- name: TerraformAction
  displayName: 'Will Terraform Create or Destroy?'
  type: 'string'
  default: 'create'
  values:
  - 'create'
  - 'destroy'

trigger: none

pool:
  vmImage: 'ubuntu-latest'

stages:
- stage: 'Terraform'
  displayName: 'Terraform Stage'
  variables:
  - name: 'TerraformAction'
    value: '${{ parameters.TerraformAction }}'
  - name: ProjectFolder
    value: tf-datafactory
  jobs:
  - job: 'DeployTerraform'
    displayName: 'Terraform Deploy Data Factory'
    condition: eq(variables['TerraformAction'], 'create')
    variables:
    - group: 'Deploy_Terraform_Library_Group'
    steps:
    - task: replacetokens@3
      displayName: Replace tokens
      inputs:
        targetFiles: '$(System.DefaultWorkingDirectory)/$(ProjectFolder)/variables.tf'
        encoding: 'auto'
        writeBOM: true
        verbosity: 'detailed'
        actionOnMissing: 'warn'
        keepToken: false
        tokenPrefix: '#{{'
        tokenSuffix: '}}#'
    - task: AzureCLI@2
      displayName: Get the storage account key
      inputs:
        azureSubscription: '$(ARM.SubscriptionEndpoint)'
        scriptType: 'bash'
        scriptLocation: 'inlineScript'
        inlineScript: |
          export ContainerAccessKeyExport=$(az storage account keys list \
            --resource-group $(StorageResourceGroupName) \
            --account-name $(StorageAccountName) \
            --query "[0].value")
          echo "##vso[task.setvariable variable=ContainerAccessKey]$ContainerAccessKeyExport"
...
--EDIT--
What I get from this (https://aka.ms/yamlauthz) is that you either need to start with stages and jobs from the get-go, or you have to stick with the original pipeline that was created. Most people who let Azure DevOps create their initial pipeline won't know to use stages and jobs, because the pipeline generator doesn't do that for them and only starts with steps.
