I have an Azure Pipelines pipeline with a secret variable that triggers on pull requests. When triggered by a pull request, the secret variable is not available to the pipeline.
The secret variable works when the pipeline is triggered by commits to a branch.
pipeline
pr:
  branches:
    include:
    - '*'
trigger:
  branches:
    exclude:
    - '*'
jobs:
- job:
  pool:
    vmImage: 'ubuntu-latest'
  timeoutInMinutes: 360
  displayName: 'Running test'
  steps:
  - bash: |
      if [ -z "$(system.pullRequest.sourceRepositoryUri)" ]
      then
        python3 runTest.py \
          --config "blessedImageConfig-temp.json" \
          --code $(SecretCode)
      else
        python3 runTest.py \
          --config "blessedImageConfig-temp.json" \
          --pullRepo $(system.pullRequest.sourceRepositoryUri) \
          --pullId $(system.pullRequest.pullRequestNumber) \
          --code $(SecretCode)
      fi
The secret variable was added via the web UI.
Output and error:
Generating script.
========================== Starting Command Output ===========================
[command]/bin/bash --noprofile --norc /home/vsts/work/_temp/95f6ae7c-d2e1-4ebd-891c-2d998eb4b1d9.sh
/home/vsts/work/_temp/95f6ae7c-d2e1-4ebd-891c-2d998eb4b1d9.sh: line 7: SecretCode: command not found
usage: runTest.py [-h] [--config CONFIG] [--code CODE] [--pullId PULLID]
[--pullRepo PULLREPO]
runTest.py: error: argument --code: expected one argument
##[error]Bash exited with code '2'.
SecretCode: command not found
This error is caused by the fact that it's a secret variable being passed on the command line in the wrong way.
You may be confused by this, but Microsoft warns about it in the docs: never pass secrets on the command line. That's by design.
I've met a similar issue in my Docker builds. I solved it by mapping the secret variable's value into an environment variable, which is also mentioned in the Variables doc.
For your Bash task there is also a documented solution for secret variables: "Use the environment variables input to pass secret variables to this script", and setting targetType to inline is necessary.
So you can change your Bash task as below, to map the secret variable into an environment variable:
- task: Bash@3
  inputs:
    targetType: 'inline'
    script: |
      echo $code
  env:
    code: $(SecretCode)
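Applied to the original conditional script, a sketch of the same env-mapping fix (reusing the names from the question; the runTest.py arguments are unchanged) could look like:

```yaml
steps:
- bash: |
    if [ -z "$(system.pullRequest.sourceRepositoryUri)" ]
    then
      python3 runTest.py \
        --config "blessedImageConfig-temp.json" \
        --code "$SECRET_CODE"
    else
      python3 runTest.py \
        --config "blessedImageConfig-temp.json" \
        --pullRepo $(system.pullRequest.sourceRepositoryUri) \
        --pullId $(system.pullRequest.pullRequestNumber) \
        --code "$SECRET_CODE"
    fi
  env:
    SECRET_CODE: $(SecretCode)  # maps the secret into the script's environment
```

The secret is referenced in the script only via the plain environment variable $SECRET_CODE, so the agent never has to substitute the secret's text into the script body.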
Related
I am trying to build a pipeline capable of deploying environments for multiple customers. I have the following structure:
Azure Key Vault with secret names in the form customerName-customerEnvironmentType-secretName
Release pipeline with these steps; it is a multi-configuration job that creates a job for each customer in a list:
Get secrets from the vault using the filter customerName-customerEnvironmentType-secretName
Bash step that prints the environment (env | sort) and prints the secret (which should show as *** in the logs); the bash script is below
The customerName-customerEnvironmentType part of the secret name is put into a variable for reuse. What I am trying to do is get the vault secret based on the customer name and environment type. The bash script is as follows.
#!/bin/bash
env | sort
echo "FULL_NAME: $(FULL_NAME)" # This prints customerName-customerEnvironmentType
echo "normal usage: $(customerName-customerEnvironmentType-secretName)" # This works and prints ***, but this wouldn't be dynamic and would only work for one customer
# Some options I tried, all of them do not resolve. Some of them don't even resolve the FULL_NAME variable
echo "variables['customerName-customerEnvironmentType-secretName']"
echo "${{ variables['FULL_NAME'] }}"
echo "${{ variables.FULL_NAME }}"
echo "$($(FULL_NAME)-secretName)"
echo "$(${{ variables.FULL_NAME }}-secretName)"
echo "variables['$(FULL_NAME)-secretName']"
echo "$(variables['$(FULL_NAME)-secretName'])"
echo "$[variables['$(FULL_NAME)-secretName']]"
Is there a better way of doing this, or maybe another way of variable substitution that would work?
logs:
Azure vault build step
2022-12-27T13:38:38.0903674Z ##[section]Starting: Azure Key Vault: customer-environments
2022-12-27T13:38:38.0909613Z ==============================================================================
2022-12-27T13:38:38.0909898Z Task : Azure Key Vault
2022-12-27T13:38:38.0910115Z Description : Download Azure Key Vault secrets
2022-12-27T13:38:38.0910336Z Version : 2.211.1
2022-12-27T13:38:38.0910524Z Author : Microsoft Corporation
2022-12-27T13:38:38.0910836Z Help : https://docs.microsoft.com/azure/devops/pipelines/tasks/deploy/azure-key-vault
2022-12-27T13:38:38.0911185Z ==============================================================================
2022-12-27T13:38:38.2801948Z SubscriptionId: hidden-for-security.
2022-12-27T13:38:38.2804271Z Key vault name: customer-environments.
2022-12-27T13:38:38.2810602Z Downloading secrets using: hidden-for-security.
2022-12-27T13:38:38.8860681Z Number of secrets found in customer-environments: 8
2022-12-27T13:38:38.8900028Z Number of enabled and unexpired secrets found in customer-environments: 8
2022-12-27T13:38:38.8909999Z Downloading secret value for: customerName-customerEnvironmentType-secretName.
.... there were more here, but I have hidden them
2022-12-27T13:38:39.0434461Z ##[section]Finishing: Azure Key Vault: customer-environments
Bash build step
2022-12-27T13:38:39.7977754Z ##[section]Starting: Bash Script
2022-12-27T13:38:39.7990665Z ==============================================================================
2022-12-27T13:38:39.7991040Z Task : Bash
2022-12-27T13:38:39.7991348Z Description : Run a Bash script on macOS, Linux, or Windows
2022-12-27T13:38:39.7991674Z Version : 3.211.0
2022-12-27T13:38:39.7991955Z Author : Microsoft Corporation
2022-12-27T13:38:39.7992334Z Help : https://docs.microsoft.com/azure/devops/pipelines/tasks/utility/bash
2022-12-27T13:38:39.7992749Z ==============================================================================
2022-12-27T13:38:40.0194291Z Generating script.
2022-12-27T13:38:40.0198418Z ========================== Starting Command Output ===========================
2022-12-27T13:38:40.0202702Z [command]/usr/bin/bash /home/vsts/work/_temp/6ad51bf7-2673-449a-9e74-66b2bb6abb19.sh
... removed the output of env | sort for security reasons
2022-12-27T13:38:40.0288018Z FULL_NAME: customerName-customerEnvironmentType
2022-12-27T13:38:40.0298702Z normal usage: ***
2022-12-27T13:38:40.0299290Z variables['customerName-customerEnvironmentType-secretName']
2022-12-27T13:38:40.0299794Z
2022-12-27T13:38:40.0300226Z
2022-12-27T13:38:40.0301018Z /home/vsts/work/_temp/6ad51bf7-2673-449a-9e74-66b2bb6abb19.sh: line 11: ${{ variables['FULL_NAME'] }}: bad substitution
2022-12-27T13:38:40.0302092Z /home/vsts/work/_temp/6ad51bf7-2673-449a-9e74-66b2bb6abb19.sh: line 12: ${{ variables.FULL_NAME }}: bad substitution
2022-12-27T13:38:40.0304197Z /home/vsts/work/_temp/6ad51bf7-2673-449a-9e74-66b2bb6abb19.sh: line 13: customerName-customerEnvironmentType-secretName: command not found
2022-12-27T13:38:40.0305052Z /home/vsts/work/_temp/6ad51bf7-2673-449a-9e74-66b2bb6abb19.sh: line 14: ${{ variables.FULL_NAME }}-secretName: bad substitution
2022-12-27T13:38:40.0305862Z variables['customerName-customerEnvironmentType-secretName']
2022-12-27T13:38:40.0306661Z /home/vsts/work/_temp/6ad51bf7-2673-449a-9e74-66b2bb6abb19.sh: line 15: variables[customerName-customerEnvironmentType-secretName]: command not found
2022-12-27T13:38:40.0307284Z
2022-12-27T13:38:40.0308101Z /home/vsts/work/_temp/6ad51bf7-2673-449a-9e74-66b2bb6abb19.sh: line 16: 'customerName-customerEnvironmentType-secretName': syntax error: operand expected (error token is "'customerName-customerEnvironmentType-secretName'")
2022-12-27T13:38:40.0337412Z ##[error]Bash exited with code '1'.
2022-12-27T13:38:40.0352993Z ##[section]Finishing: Bash Script
I tested in my pipeline and it worked fine. Here's a YAML sample:
variables:
- name: FULL_NAME
  value: customerName-customerEnvironmentType
steps:
- task: AzureKeyVault@2
  inputs:
    azureSubscription: 'your subscription'
    KeyVaultName: 'your key vault name'
    SecretsFilter: '*'
    RunAsPreJob: true
- script: |
    echo "FULL_NAME: $(FULL_NAME)"
    echo "normal usage: $(customerName-customerEnvironmentType-secretName)"
    # get the vault secret based on the customer name and environment type
    echo $(${{ variables.FULL_NAME }}-secretName)
  displayName: 'Run a multi-line script'
RESULT
BTW, it seems the logs in the WeTransfer link are private; I have no access to them. You can provide plain text or make the link public.
Azure Release pipelines do not seem to support nested variables (variables inside variables) at the moment.
Found an alternative: use the Azure CLI to fetch the secrets and store them in secret output variables. This behaves much like when the Azure Key Vault step fetches the variables; the difference is that I can now control what they are called.
# Azure CLI bash step
# Save password secrets base64 encoded
NAME_OF_SECRET_WITHOUT_CUSTOMER_SPECIFIC_NAMING=$(az keyvault secret show --vault-name some-vault --name some-secret | jq -r '.value' | base64)
echo "##vso[task.setvariable variable=NAME_OF_SECRET_WITHOUT_CUSTOMER_SPECIFIC_NAMING;isOutput=true;issecret=true]$NAME_OF_SECRET_WITHOUT_CUSTOMER_SPECIFIC_NAMING"
# Other bash task
# Note: under "Output variables" in the pipeline builder for the CLI step, I added the VAULT_ prefix, which is prepended to the variable name. If you don't add a prefix, I'm not sure what, if anything, goes in front of it.
echo "$(VAULT_NAME_OF_SECRET_WITHOUT_CUSTOMER_SPECIFIC_NAMING)" <- logs *** because it is a secret
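Since the CLI step stores the secret base64-encoded, any consuming step has to decode it before use. A minimal sketch of the round trip (the secret value here is a made-up placeholder):

```shell
#!/usr/bin/env bash
# Encode the way the Azure CLI step above does, then decode the way a
# consuming step would.
encoded=$(printf '%s' 'my-secret-value' | base64)
decoded=$(printf '%s' "$encoded" | base64 -d)
echo "$decoded"   # prints my-secret-value
```

Base64 encoding also sidesteps problems with newlines or special characters when the value passes through the ##vso[task.setvariable] logging command.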
Variable big_var_01, defined with the value '3q4w#V$X3q4w#V$X' by the following Azure pipeline YAML file, gets corrupted to the value '3q4w#V#V' when read back in an Azure pipeline template.
cat parent_scott.yaml
variables:
- name: big_var_01
  value: ${{ parameters.big_var_01 }}
parameters:
- name: big_var_01
  displayName: "this var wants to get written to then read by templates"
  type: string
  default: '3q4w#V$X3q4w#V$X'
# CI Triggers
trigger:
  branches:
    exclude:
    - '*'
pool:
  vmImage: 'ubuntu-latest'
# Release Stages
stages:
- template: child_scott_one.yaml
In the following Azure pipeline template, big_var_01 is read back; however, its value is corrupted and does not match the assignment above.
cat child_scott_one.yaml
# Release Stages
stages:
- stage: A
  jobs:
  - job: JA
    steps:
    - script: |
        echo "here is big_var_01 -->$(big_var_01)<-- "
        local_var_01=$(big_var_01)
        echo
        echo "here is local_var_01 -->$local_var_01<-- "
        echo
        echo "length of local_var_01 is ${#local_var_01}"
        echo
      name: DetermineResult
See a run of the above pipeline:
https://dev.azure.com/sekhemrekhutawysobekhotep/public_project/_build/results?buildId=525&view=logs&j=54e3124b-25ae-54f7-b0df-b26e1988012b&t=52fad91f-d6ac-51fb-b63d-00fda7898bb6&l=13
see code at https://github.com/sekhemrekhutawysobekhotep/shared_variables_across_templates
How can I make the string variable big_var_01 get treated as a literal? Evidently it is somehow getting evaluated and thus corrupted. The code above is a simplification of my actual Azure pipeline, where I get the same corruption even when setting up a Key Vault secret with value 3q4w#V$X3q4w#V$X, which gets corrupted when read back in a pipeline template.
Here is another pipeline run which explicitly shows the issue: https://dev.azure.com/sekhemrekhutawysobekhotep/public_project/_build/results?buildId=530&view=logs&j=ed5db508-d8c1-5154-7d4e-a21cef45e99c&t=a9f49566-82d0-5c0a-2e98-46af3de0d6e9&l=38. On this run I checked the pipeline run option "Enable system diagnostics". Next I will try single-quoting my shell assignment from the Azure variable.
At some step, Azure DevOps or Ubuntu replaced part of your string. You have:
3q4w#V$X3q4w#V$X = 3q4w#V + $X3q4w + #V + $X
and the parts $X3q4w and $X were replaced with empty strings, giving you 3q4w#V + #V.
If you run this with \ before each $, like 3q4w#V\$X3q4w#V\$X, you get:
This is job Foo.
here is big_var_01 -->3q4w#V$X3q4w#V$X<--
here is local_var_01 -->3q4w#V$X3q4w#V$X<--
length of local_var_01 is 16
I got an error running this on windows-latest; however, I got the correct string:
"This is job Foo."
"here is big_var_01 -->3q4w#V$X3q4w#V$X<-- "
'local_var_01' is not recognized as an internal or external command,
operable program or batch file.
ECHO is off.
"here is local_var_01 -->$local_var_01<-- "
ECHO is off.
"length of local_var_01 is ${#local_var_01}"
ECHO is off.
##[error]Cmd.exe exited with code '9009'.
So it looks like Ubuntu (bash) treats these as environment variable references; since there are no variables named $X3q4w and $X, it replaces them with empty strings.
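This is easy to reproduce in plain bash. Azure's macro substitution pastes the literal text into the generated script before bash runs, so bash then expands anything in it that looks like $NAME. A sketch of what the generated script effectively contains:

```shell
#!/usr/bin/env bash
# Make sure the lookalike variables really are unset for the demo.
unset X X3q4w

# What the agent script contains after Azure substitutes $(big_var_01):
double_quoted="3q4w#V$X3q4w#V$X"   # bash expands the undefined $X3q4w and $X to ""
single_quoted='3q4w#V$X3q4w#V$X'   # single quotes block bash expansion

echo "$double_quoted"   # prints 3q4w#V#V
echo "$single_quoted"   # prints 3q4w#V$X3q4w#V$X
```

The corruption therefore happens in bash, after Azure has already substituted the correct value.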
Found a solution: it works if I single-quote the Azure variable in the bash shell assignment.
local_var_01=$(big_var_01) # bad results with value 3q4w#V#V
local_var_02="$(big_var_01)" # bad results with value 3q4w#V#V
local_var_03='$(big_var_01)' # good this gives value 3q4w#V$X3q4w#V$X
My instincts said not to use single quotes, knowing that's not the bash way. But once I accepted that Azure performs an interstitial preprocessing pass between the pipeline source code and the bash script that finally executes, I took a chance and tried it. It turns out this is exactly how the bash shell blocks variable expansion.
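An alternative to single-quoting, assuming the same variable name, is to map the value through the step's env section, so the macro substitution happens in the environment map rather than inside executable script text:

```yaml
- script: |
    local_var_01="$BIG_VAR_01"
    echo "length of local_var_01 is ${#local_var_01}"
  env:
    BIG_VAR_01: $(big_var_01)  # substituted into the env, never into the script body
```

Because bash reads the value from an environment variable, characters like $ in the value are never re-parsed as expansions.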
In a YAML pipeline I'm attempting to set an OS environment variable on a Linux agent so Cypress can look it up:
- script: export CYPRESS_key=ala
displayName: "Set key"
- script: echo $(CYPRESS_key)
displayName: "Print key"
unfortunately the OS variable is never set.
The output is:
/home/vsts/work/_temp/321aacd-cadd-4a16-a4d1-db7927deacde.sh: line 1: CYPRESS_key: command not found
In bash, $(command) is command substitution and ${variable} is variable expansion; you are using the wrong brackets.
- script: export CYPRESS_key=ala
displayName: "Set key"
- script: echo ${CYPRESS_key}
displayName: "Print key"
- script: echo $(cat /etc/os-release)
displayName: "Print file content"
Environment variables in Linux are accessed as $ENVIRONMENT_VARIABLE_NAME, not $(ENVIRONMENT_VARIABLE_NAME).
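Note also that each script step runs in its own shell, so an export in one step is gone by the next step. A sketch of two alternatives (the variable names are taken from the question; npx cypress run is a hypothetical consuming command):

```yaml
steps:
# Option 1: persist a variable for later steps with a logging command.
# Azure surfaces it to later steps as an env var with an uppercased name
# (CYPRESS_KEY), or via the $(CYPRESS_key) macro.
- script: echo "##vso[task.setvariable variable=CYPRESS_key]ala"
  displayName: "Set key"
- script: echo "$CYPRESS_KEY"
  displayName: "Print key"
# Option 2: map the env var directly on the step that needs it,
# which preserves the exact case Cypress expects.
- script: npx cypress run
  env:
    CYPRESS_key: ala
```

Since Cypress matches its CYPRESS_-prefixed variables case-sensitively, the per-step env mapping is usually the safer choice.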
Is there any way to pass an exported variable, defined in the generic before_script:
before_script:
  - export UPPERHASH=$(echo $CI_COMMIT_REF_SLUG | md5sum | tr [a-z] [A-Z])
into another job as a variable? I am going to use trigger, but trigger does not allow any script, e.g.:
test variables:
  stage: test-variables
  variables:
    UPPERHASH_TEST1: $UPPERHASH
  trigger:
    project: "...\..."
I have tried multiple options, but none of them works.
It will not work this way, because the variables section of "test variables" is processed before before_script runs.
You can only refer to this variable in a script:
test variables:
  stage: test-variables
  script:
    - UPPERHASH_TEST1=$UPPERHASH
    - ... trigger the other project from the command line ...
Read here how to trigger another project from the command line:
https://docs.gitlab.com/ee/ci/triggers/README.html
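For reference, a trigger call from a script can also pass the computed value along to the downstream pipeline; a sketch (the host, project ID, ref, and the TRIGGER_TOKEN variable are placeholders for your setup):

```shell
# Assumes $UPPERHASH was exported in before_script.
curl --request POST \
  --form "token=$TRIGGER_TOKEN" \
  --form "ref=main" \
  --form "variables[UPPERHASH_TEST1]=$UPPERHASH" \
  "https://gitlab.example.com/api/v4/projects/1234/trigger/pipeline"
```

The variables[KEY]=value form fields become CI/CD variables in the triggered pipeline.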
Is it possible to make a build pipeline with a file-based trigger?
Let's say I have the following directory structure.
Microservices/
|_Service A
  |_Test_Stage
    |_Testing_Config
  |_QA_Stage
    |_QA_Config
  |_Prod_stage
    |_Prod_Config
|_Service B
  |_Test_Stage
    |_Testing_Config
  |_QA_Stage
    |_QA_Config
  |_Prod_stage
    |_Prod_Config
I want to have just one single YAML build pipeline file.
Based on the variables $(Project) and $(Stage), different builds are created.
Is it possible to check which directory/file initiated the trigger and set the variables accordingly?
Additionally, it would be great if it were possible to use those variables to set tags on the artifact after the run.
Thanks
KR
Is it possible to check what directory/file initiated the Trigger and
set the variables accordingly?
Of course, yes. But there's no direct way, since no predefined variables store this information, so you need an additional, somewhat complex workaround to get it.
#1:
Although no variable directly stores which folder and file were modified, you can get this by looking up the commit (Build.SourceVersion) via the REST API:
GET https://dev.azure.com/{organization}/{project}/_apis/git/repositories/{repositoryId}/commits/{commitId}/changes?api-version=5.1
From the response body you can directly see the path and file.
Since the response body is JSON, you can use a JSON function to parse the path value; see this similar script as a reference.
Then use a PowerShell script to set these values as pipeline variables that subsequent jobs/tasks can use.
Also, in your scenario, all of this should finish before any subsequent job starts. So you could consider creating a simple extension with a pipeline decorator: define all of the above steps in the decorator, so they run in the pre-job of every pipeline.
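The path extraction described above can be sketched with jq against an illustrative response shape (the JSON literal here is a hypothetical fragment; the real commit-changes response has more fields):

```shell
#!/usr/bin/env bash
# Hypothetical fragment of the commit-changes API response.
response='{"changes":[{"item":{"path":"/Microservices/ServiceA/Test_Stage/Testing_Config"},"changeType":"edit"}]}'

# Extract the changed path, then the project (2nd path segment)
# and the stage folder (3rd path segment).
path=$(echo "$response" | jq -r '.changes[0].item.path')
project=$(echo "$path" | cut -d'/' -f3)
stage=$(echo "$path" | cut -d'/' -f4)

echo "project=$project stage=$stage"   # prints project=ServiceA stage=Test_Stage
```

In a real pipeline, $response would come from a curl call to the API above, authenticated with a PAT or $(System.AccessToken).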
#2
If the above method feels a little complex, I'd instead suggest making use of the commit message. For example, specify the project name and file name in the commit message, then read them using the variable Build.SourceVersionMessage.
Then use the PowerShell script (mentioned above) to set them as variables.
This is more convenient than calling the API to parse the commit body.
Hope one of these helps.
Thanks for your reply.
I tried a different approach with a bash script, since I only use Ubuntu images.
I run git log filtered to the last commit that touched the Microservices directory. With some awk (not a satisfying solution) I extract the Project and Stage and write them into pipeline variables.
The pipeline only gets triggered when there is a change under the Microservices/* path.
trigger:
  batch: true
  branches:
    include:
    - master
  paths:
    include:
    - Microservices/*
The first job when the trigger fires is the Dynamic_Variables job.
I use this job only to set the variables $(Project) and $(Stage). The build tags are also set with those variables, so I'm able to differentiate the artifacts in the releases.
jobs:
- job: Dynamic_Variables
  pool:
    vmImage: 'ubuntu-latest'
  steps:
  - checkout: self
  - task: Bash@3
    name: Dynamic_Var
    inputs:
      filePath: './scripts/multi-usage.sh'
      arguments: '$(Build.SourcesDirectory)'
    displayName: "Set Dynamic Variables Project"
  - task: Bash@3
    inputs:
      targetType: 'inline'
      script: |
        set +e
        if [ -z $(Dynamic_Var.Dynamic_Project) ]; then
          echo "target Project not specified";
          exit 1;
        fi
        echo "Project is:" $(Dynamic_Var.Dynamic_Project)
    displayName: 'Verify that the Project parameter has been supplied to pipeline'
  - task: Bash@3
    inputs:
      targetType: 'inline'
      script: |
        set +e
        if [ -z $(Dynamic_Var.Dynamic_Stage) ]; then
          echo "target Stage not specified";
          exit 1;
        fi
        echo "Stage is:" $(Dynamic_Var.Dynamic_Stage)
    displayName: 'Verify that the Stage parameter has been supplied to pipeline'
The bash script I run in this job looks like this:
#!/usr/bin/env bash
set -euo pipefail
WORKING_DIRECTORY=${1}
cd ${WORKING_DIRECTORY}
CHANGEPATH="$(git log -1 --name-only --pretty='format:' -- Microservices/)"
Project=$(echo $CHANGEPATH | awk -F[/] '{print $2}')
CHANGEFILE=$(echo $CHANGEPATH | awk -F[/] '{print $4}')
Stage=$(echo $CHANGEFILE | awk -F[-] '{print $1}')
echo "##vso[task.setvariable variable=Dynamic_Project;isOutput=true]${Project}"
echo "##vso[task.setvariable variable=Dynamic_Stage;isOutput=true]${Stage}"
echo "##vso[build.addbuildtag]${Project}"
echo "##vso[build.addbuildtag]${Stage}"
If someone has a better solution than the awk commands, please let me know.
Thanks a lot.
KR
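As a possible alternative to the awk calls above, bash's own IFS splitting and parameter expansion can do the same extraction without spawning extra processes; a sketch with a hypothetical sample path in the same shape git log prints:

```shell
#!/usr/bin/env bash
# Hypothetical changed path, as printed by the git log call above.
CHANGEPATH="Microservices/ServiceA/Test_Stage/Testing-Config.json"

# Split on '/': field 2 is the project, field 4 is the changed file.
IFS='/' read -r _ Project _ CHANGEFILE <<< "$CHANGEPATH"

# Strip everything from the first '-' to get the stage prefix.
Stage=${CHANGEFILE%%-*}

echo "$Project"   # prints ServiceA
echo "$Stage"     # prints Testing
```

This matches the awk behavior (field 2 for the project, field 4 split on '-' for the stage) while staying in pure bash.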