Setting OS variable in Azure DevOps pipeline doesn't work - linux

In a YAML pipeline I'm attempting to set an OS variable on a Linux agent so that Cypress can look it up:
- script: export CYPRESS_key=ala
  displayName: "Set key"
- script: echo $(CYPRESS_key)
  displayName: "Print key"
Unfortunately, the OS variable is never set.
The output is:
/home/vsts/work/_temp/321aacd-cadd-4a16-a4d1-db7927deacde.sh: line 1: CYPRESS_key: command not found

You are using the wrong brackets: in bash, $(command) is command substitution and ${variable} is variable expansion. CYPRESS_key is not a pipeline variable, so Azure DevOps leaves $(CYPRESS_key) untouched and bash then tries to execute CYPRESS_key as a command:
- script: export CYPRESS_key=ala
  displayName: "Set key"
- script: echo ${CYPRESS_key}
  displayName: "Print key"
- script: echo $(cat /etc/os-release)
  displayName: "Print file content"

Environment variables in Linux are accessed as $ENVIRONMENT_VARIABLE_NAME, not $(ENVIRONMENT_VARIABLE_NAME). Note also that each script step runs in its own shell, so an export in one step does not persist into the next; to pass a value between steps, use the ##vso[task.setvariable] logging command (covered in the questions below).
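If the goal is for Cypress to see CYPRESS_key as a process environment variable, a minimal sketch (assuming the value can live in a pipeline variable; cypressKey is a hypothetical name) is to map it in via env: on the step that runs Cypress:

variables:
  cypressKey: ala   # hypothetical pipeline variable holding the key

steps:
- script: |
    # env: below makes CYPRESS_key a real OS environment variable in this step's shell
    echo "CYPRESS_key is $CYPRESS_key"
    # npx cypress run   # Cypress would pick CYPRESS_key up here
  displayName: "Run with key"
  env:
    CYPRESS_key: $(cypressKey)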


Azure pipeline ##vso[build.addbuildtag] does not work

I have a problem adding a build tag to one specific pipeline.
When I use this code with a plain string, it adds the tag successfully:
- bash: |
    echo "##vso[build.addbuildtag]TEST_TAG"
  displayName: 'Add TAG to Run'
  env:
    AZURE_DEVOPS_EXT_PAT: $(System.AccessToken)
but when I use it with a variable, it throws an error.
The funniest thing is that the same code with a variable works fine in another pipeline:
- bash: |
    echo "##vso[build.addbuildtag]$(ChangeNumber)"
  displayName: 'Add TAG to Run'
  env:
    AZURE_DEVOPS_EXT_PAT: $(System.AccessToken)
Error:
##[error]Unable to process command '##vso[build.addbuildtag]' successfully. Please reference documentation (http://go.microsoft.com/fwlink/?LinkId=817296)
##[error]Build tag is required.
The variable itself is fine, because I successfully echo it earlier.
What might be the issue?
I figured out that the variable was the issue: it was defined but not passed on to other tasks, so the solution is:
- task: Bash@3
  displayName: 'Add TAG to Run'
  name: TAG
  inputs:
    targetType: "inline"
    script: |
      ChangeNumber=$(<$(System.ArtifactsDirectory)/variables/ChangeNumber.var)
      echo "##vso[task.setvariable variable=ChangeNumber;isOutput=true]$ChangeNumber"
      echo "##vso[build.addbuildtag]$ChangeNumber"
    failOnStderr: true
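For completeness, a downstream job in the same stage could then consume the output variable set above; this is a hypothetical sketch that assumes the TAG step runs in a job named TagJob:

- job: UseTag
  dependsOn: TagJob   # hypothetical name of the job containing the TAG step
  variables:
    changeNumber: $[ dependencies.TagJob.outputs['TAG.ChangeNumber'] ]
  steps:
  - bash: echo "This run was tagged with $(changeNumber)"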

Set environment variable in unix machine using bash script via YAML steps

I need to set the environment variables below on a Unix machine, through a bash script run as an inline script in the YAML file.
My env variables are:
cache=30
delay=10
url={"https.8shd3dad#d/wipeout#doamin.com"}
I have tried the steps below in the YAML to set them, but couldn't see the environment variables and their values at runtime:
- task: Bash@3
  inputs:
    targetType: 'inline'
    script: |
      export cache=30
      export url={"https.8shd3dad#d/wipeout#doamin.com"}
      export Delay=10
  env:
    cache: $(30)
Can anyone help me fix this issue, since I am new to YAML and bash?
After each export command, you also need to register the variable with the pipeline's variable service, so that it is exposed to subsequent tasks as an environment variable:
- task: Bash@3
  inputs:
    targetType: 'inline'
    script: |
      export cache=30
      echo "##vso[task.setvariable variable=cache;]${cache}"
      export url={"https.8shd3dad#d/wipeout#doamin.com"}
      echo "##vso[task.setvariable variable=url;]${url}"
      export Delay=10
      echo "##vso[task.setvariable variable=Delay;]${Delay}"
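A later step in the same job can then read these values both as pipeline variables (macro syntax) and as environment variables; a small verification sketch, assuming it runs after the task above:

- bash: |
    # task.setvariable exposes values to later steps as environment variables,
    # uppercased with "." replaced by "_": cache -> CACHE, Delay -> DELAY
    echo "cache=$CACHE delay=$DELAY"
    echo "url is $(url)"   # macro syntax is substituted before the script runs
  displayName: 'Verify variables'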

In Azure, can I create a mapping at runtime?

I'm trying to loop over an array of group names and I want to dynamically get the IDs of those groups so that I can assign roles to them.
I just learned I can set a variable at runtime in a pipeline like this:
steps:
- script: echo "##vso[task.setvariable variable=myVar;]foo"
  displayName: Set variable
- script: echo "You can use macro syntax for variables: $(myVar)"
  displayName: Echo variable
But to loop over the group names I now want to set a mapping so that I can use that in a subsequent step where I assign the roles. I tried the following:
steps:
- script: echo "##vso[task.setvariable variable=mymapping;]{a: 1}"
  displayName: Set mapping
- script: echo $(mymapping)
  displayName: Echo mapping
- script: echo $(mymapping.a)
  displayName: Echo mapping value
But I get an error saying "Mapping values are not allowed in this context".
Is there any other way of creating some sort of mapping/object/dict from within a pipeline?
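One possible workaround (a sketch, not from the original thread): pipeline variables are always strings, so store the mapping as a JSON string and parse it in the consuming step. A block scalar (|) also sidesteps the parse error, which is triggered by the colon-plus-space sequence inside a plain YAML scalar:

steps:
- script: |
    echo '##vso[task.setvariable variable=mymapping]{"a": 1}'
  displayName: Set mapping as a JSON string
- script: |
    # jq is available on Microsoft-hosted Ubuntu agents
    echo '$(mymapping)' | jq '.a'   # prints 1
  displayName: Read a mapping value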

Sharing variables between deployment job lifecycle hooks in Azure Devops Pipelines

I have been trying to define a variable in a lifecycle hook within a deployment job, and then access that variable in a later lifecycle hook. The documentation on deployment jobs references the ability to do so, but offers no actual example of this case:
Define output variables in a deployment job's lifecycle hooks and consume them in other downstream steps and jobs within the same stage.
The sample pipeline that I've been working with:
stages:
- stage: Pipeline
  jobs:
  - deployment: Deploy
    environment: 'testing'
    strategy:
      runOnce:
        preDeploy:
          steps:
          - bash: |
              echo "##vso[task.setvariable variable=myLocalVar;isOutput=false]local variable"
            name: setvarStep
          - bash: |
              echo "##vso[task.setvariable variable=myOutputVar;isOutput=true]output variable"
            name: outvarStep
          - bash: |
              echo 'Both $(myLocalVar) and $(outvarStep.myOutputVar) are available here'
              echo "Both $(myLocalVar) and $(outvarStep.myOutputVar) are available here"
        deploy:
          steps:
          - bash: |
              echo 'Neither $(myLocalVar) nor $(outvarStep.myOutputVar) are available here'
              echo "Neither $(myLocalVar) nor $(outvarStep.myOutputVar) are available here"
I have tried any number of options, but nothing I've done seems to make this work; the output of the bash task in the deploy lifecycle hook is:
Neither nor are available here
I've tried wiring the variables into the bash task via the env: input, using both the macro syntax (e.g. $(myOutputVar)) and the runtime expression syntax, hoping there might be a hidden dependency I could find (e.g. $[ dependencies.Deploy.outputs['preDeploy.outVarStep.myOutputVar'] ]), and many, many other syntaxes.
I've tried defining the variables at the job level, hoping they'd be updated by the preDeploy lifecycle hook and be available to the deploy lifecycle hook.
I've tried echoing the variables via the runtime expression syntax.
I've tried many other combinations of syntax, hoping I'd stumble onto the actual answer.
But all to no avail. I will likely resort to the hack of uploading the variable as an artifact and downloading it later, but would really like to find a proper solution to this issue. Has anyone been able to accomplish this? Many thanks in advance.
Workaround:
First, here is a corrected sample for the scenario you are looking for:
stages:
- stage: Pipeline
  jobs:
  - deployment: Deploy
    environment: 'testing'
    strategy:
      runOnce:
        preDeploy:
          steps:
          - bash: |
              echo "##vso[task.setvariable variable=myLocalVar;isOutput=false]local variable"
            name: setvarStep
          - bash: |
              echo "##vso[task.setvariable variable=myOutputVar;isOutput=true]output variable"
            name: outvarStep
          - bash: |
              echo 'Both $(myLocalVar) and $(outvarStep.myOutputVar) are available here'
              echo "Both $(myLocalVar) and $(outvarStep.myOutputVar) are available here"
              mkdir -p $(Pipeline.Workspace)/variables
              echo "$(myLocalVar)" > $(Pipeline.Workspace)/variables/myLocalVar
              echo "$(outvarStep.myOutputVar)" > $(Pipeline.Workspace)/variables/myOutputVar
          - publish: $(Pipeline.Workspace)/variables
            artifact: variables
        deploy:
          steps:
          - bash: |
              echo 'Neither $(myLocalVar) nor $(outvarStep.myOutputVar) are available here'
              echo "Neither $(myLocalVar) nor $(outvarStep.myOutputVar) are available here"
          - download: current
            artifact: variables
          - bash: |
              myLocalVar=$(cat $(Pipeline.Workspace)/variables/myLocalVar)
              myOutputVar=$(cat $(Pipeline.Workspace)/variables/myOutputVar)
              echo "##vso[task.setvariable variable=myLocalVar;isoutput=true]$myLocalVar"
              echo "##vso[task.setvariable variable=myOutputVar;isoutput=true]$myOutputVar"
            name: output
          - bash: |
              echo "$(output.myLocalVar)"
              echo "$(output.myOutputVar)"
            name: SucceedToGet
You will see that the output variables are printed successfully in the SucceedToGet task.
An explanation of why your previous attempts kept failing: to the system, the whole strategy block represents a single job before it starts running (at compile time); it is only expanded into lifecycle hooks at run time.
Define output variables in a deployment job's lifecycle hooks and consume them in other downstream steps and jobs within the same stage.
Here, "other downstream steps and jobs" means an independent job, one that is separate at compile time. That's why the documentation provides its sample YAML under that line.
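For reference, a hedged sketch of that documented case: a separate downstream job in the same stage consuming the output variable set by the output step in the deploy hook above, using the reference form dependencies.<job>.outputs['<job>.<step>.<variable>'] for a runOnce deployment named Deploy:

- job: Consume
  dependsOn: Deploy
  variables:
    sharedVar: $[ dependencies.Deploy.outputs['Deploy.output.myOutputVar'] ]
  steps:
  - bash: echo "$(sharedVar)"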

Azure Pipeline File-based Trigger and Tags

Is it possible to make a build pipeline with a file-based trigger?
Let's say I have the following directory structure:
Microservices/
|_Service A
  |_Test_Stage
    |_Testing_Config
  |_QA_Stage
    |_QA_Config
  |_Prod_stage
    |_Prod_Config
|_Service B
  |_Test_Stage
    |_Testing_Config
  |_QA_Stage
    |_QA_Config
  |_Prod_stage
    |_Prod_Config
I want to have just one single YAML build pipeline file.
Based on the variables $(Project) and $(Stage), different builds are created.
Is it possible to check what directory/file initiated the trigger and set the variables accordingly?
Additionally, it would be great if it were possible to use those variables to set tags on the artifact after the run.
Thanks
KR
Is it possible to check what directory/file initiated the trigger and set the variables accordingly?
Of course, yes. But there is no direct way, since no predefined variables store that information, so you need an additional workaround to get it.
#1:
Although no variable directly stores which folder or file was modified, you can find out by looking up the triggering commit, Build.SourceVersion, via the REST API:
GET https://dev.azure.com/{organization}/{project}/_apis/git/repositories/{repositoryId}/commits/{commitId}/changes?api-version=5.1
From its response body you can read the changed path and file directly.
Since the response body is JSON, you can use a JSON-parsing function to extract the path value; see this similar script as a reference.
Then use a PowerShell script to set these values as pipeline variables that the following jobs/tasks can use.
Also, in your scenario, all of this should be finished before any subsequent job starts, so you could consider creating a simple extension with a pipeline decorator. Define all of the above steps in the decorator, so that they run in the pre-job of every pipeline. A sketch of this method follows below.
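A hedged PowerShell sketch of method #1 (it assumes the job's System.AccessToken is allowed to read the repository, and that the first changed item is the one that should drive Project and Stage):

- powershell: |
    # Ask the REST API which files the triggering commit changed
    $url = "$(System.CollectionUri)$(System.TeamProject)/_apis/git/repositories/$(Build.Repository.ID)/commits/$(Build.SourceVersion)/changes?api-version=5.1"
    $changes = Invoke-RestMethod -Uri $url -Headers @{ Authorization = "Bearer $(System.AccessToken)" }
    # e.g. path = "/Microservices/Service A/Test_Stage/Testing_Config"
    $parts = $changes.changes[0].item.path.Trim('/') -split '/'
    echo "##vso[task.setvariable variable=Project]$($parts[1])"
    echo "##vso[task.setvariable variable=Stage]$($parts[2])"
  displayName: 'Derive Project and Stage from the commit'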
#2:
If the method above feels a little complex, I'd rather suggest making use of the commit message. For example, specify the project name and file name in the commit message, then read them back via the variable Build.SourceVersionMessage.
Then use the PowerShell script (mentioned above) to set them as variables.
This is more convenient than using the API to parse the commit body; a sketch follows below.
Hope one of them could help.
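A sketch of method #2, assuming a hypothetical commit-message convention such as "ServiceA/Test: update config":

- powershell: |
    # Hypothetical convention: the commit message starts with "<Project>/<Stage>:"
    $msg = "$(Build.SourceVersionMessage)"
    $project, $stage = ($msg -split ':')[0] -split '/'
    echo "##vso[task.setvariable variable=Project]$project"
    echo "##vso[task.setvariable variable=Stage]$stage"
  displayName: 'Derive Project and Stage from the commit message'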
Thanks for your reply.
I tried a different approach, with a bash script, because I only use Ubuntu images.
I run git log, filtered to the last commit that touched the Microservices directory.
With some awk (not a very satisfying solution) I extract the Project and Stage and write them into pipeline variables.
The pipeline is only triggered when there is a change under the Microservices/* path:
trigger:
  batch: true
  branches:
    include:
    - master
  paths:
    include:
    - Microservices/*
The first job when the trigger fires is the Dynamic_Variables job.
I use this job only to set the variables $(Project) and $(Stage). The build tags are also set with those variables, so I am able to differentiate the artifacts in the releases.
jobs:
- job: Dynamic_Variables
  pool:
    vmImage: 'ubuntu-latest'
  steps:
  - checkout: self
  - task: Bash@3
    name: Dynamic_Var
    inputs:
      filePath: './scripts/multi-usage.sh'
      arguments: '$(Build.SourcesDirectory)'
    displayName: "Set Dynamic Variables Project"
  - task: Bash@3
    inputs:
      targetType: 'inline'
      script: |
        set +e
        if [ -z $(Dynamic_Var.Dynamic_Project) ]; then
          echo "target Project not specified";
          exit 1;
        fi
        echo "Project is:" $(Dynamic_Var.Dynamic_Project)
    displayName: 'Verify that the Project parameter has been supplied to pipeline'
  - task: Bash@3
    inputs:
      targetType: 'inline'
      script: |
        set +e
        if [ -z $(Dynamic_Var.Dynamic_Stage) ]; then
          echo "target Stage not specified";
          exit 1;
        fi
        echo "Stage is:" $(Dynamic_Var.Dynamic_Stage)
    displayName: 'Verify that the Stage parameter has been supplied to pipeline'
The bash script I run in this job looks like this:
#!/usr/bin/env bash
set -euo pipefail
WORKING_DIRECTORY=${1}
cd "${WORKING_DIRECTORY}"
# Paths touched by the last commit under Microservices/,
# layout: Microservices/<Project>/<Stage_Dir>/<File>
CHANGEPATH="$(git log -1 --name-only --pretty='format:' -- Microservices/)"
# Field 2 of the path is the project, field 4 the changed config file
Project=$(echo $CHANGEPATH | awk -F[/] '{print $2}')
CHANGEFILE=$(echo $CHANGEPATH | awk -F[/] '{print $4}')
# The stage is the file-name prefix before the first "-"
Stage=$(echo $CHANGEFILE | awk -F[-] '{print $1}')
echo "##vso[task.setvariable variable=Dynamic_Project;isOutput=true]${Project}"
echo "##vso[task.setvariable variable=Dynamic_Stage;isOutput=true]${Stage}"
echo "##vso[build.addbuildtag]${Project}"
echo "##vso[build.addbuildtag]${Stage}"
If someone has a better solution than the awk commands, please let me know (one alternative is sketched below).
Thanks a lot.
KR
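One possible alternative to the awk commands, a sketch under the same assumptions about the path layout, is to split the path with bash's built-in IFS and parameter expansion:

# Split "Microservices/<Project>/<Stage_Dir>/<File>" on "/"
IFS='/' read -r _ Project _ CHANGEFILE <<< "${CHANGEPATH}"
Stage=${CHANGEFILE%%-*}   # keep only the part before the first "-"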
