Accessing user-defined pipeline variables within definitions steps in Bitbucket - bitbucket-pipelines

I'm trying to pass a user-defined variable from a custom pipeline to a step defined within the definitions section.
My yml snippet is below:
definitions:
  steps:
    - step: &TagVersion
        trigger: manual
        script:
          - export VERSION=$VERSION
          - echo "VERSION $VERSION"

custom:
  run_custom:
    - variables:
        - name: VERSION
    - step:
        script:
          - echo "starting"
    - parallel:
        - step:
            <<: *TagVersion
            variables:
              VERSION: $VERSION
When I build the pipeline I can see the variable listed as a pipeline variable when running the TagVersion step, and the correct value is shown there, but I don't know how to use it within the script section where I'm trying to echo out the value.
Thanks

Related

Getting an error like unexpected value "name" on azure pipelines

I have the below YAML file. I am trying to solve an issue where two deployment files, set up in two different pipelines, exist in a single folder, and a case where changing one file does not trigger the other pipelines based on file changes.
For that purpose, I have made the change below, but I am getting the error unexpected value "name" in the pipeline.
The code is below:
.....
trigger:
  paths:
    include:
      - '.devops/**'

variables:
  deploymentFile: 'test-hub-deploy-dev.yml'

stages:
  - stage: Get the deployment file name
    jobs:
      - job: Get deployment file
        displayName: 'Get deployment file'
        steps:
          - name: Get the deployment file name // getting error at here
            script: |
              if [[ $Build.SourcesDirectory == *"deployment-folder"* ]]; then
                echo "##[set-output name=deploymentFile;]$(basename $Build.SourcesDirectory)"
              fi
            displayName: Get deployment file name
            env:
              SYSTEM_DEFAULTWORKINGDIRECTORY: $(System.DefaultWorkingDirectory)
              BUILD_SOURCESDIRECTORY: $(Build.SourcesDirectory)
            outputs:
              deploymentFile: '$(deploymentFile)'
          - name: Deploy to test-dev
            condition: and(succeeded(), eq(variables['deploymentFile'], 'test-hub-deploy-dev.yml'))
            script: echo "Deploying using test-hub-deploy-dev.yml"
            displayName: Deploy to test-dev
  - stage: Build
    jobs:
....
I don't know where I am going wrong with the above YAML code. Please point me in the right direction.
1 - In the trigger section, 'include' and - '.devops/**' should be on the same indentation level
https://learn.microsoft.com/en-us/azure/devops/pipelines/yaml-schema/trigger?view=azure-pipelines#examples-2
trigger:
  paths:
    include:
    - '.devops/**'
2 - Same for 'stages' and '- stage: abc'
https://learn.microsoft.com/en-us/azure/devops/pipelines/yaml-schema/stages-stage?view=azure-pipelines
stages:
- stage: abc
3 - The same for 'jobs' and '- job'
https://learn.microsoft.com/en-us/azure/devops/pipelines/yaml-schema/jobs?view=azure-pipelines#examples
4 - And for 'steps'
https://learn.microsoft.com/en-us/azure/devops/pipelines/yaml-schema/steps?view=azure-pipelines
First declare your step (script/pwsh/other), then you can assign a name to it.
'script' needs to have a dash in front of it.
https://learn.microsoft.com/en-us/azure/devops/pipelines/yaml-schema/steps-script?view=azure-pipelines#examples
5 - Outputting a variable is used when you want to read that variable from a following job; here you have a second step using a variable from step one within the same job (a cross-job sketch is included after the corrected steps below).
6 - Where did you find this 'set-output'? Why not use ##vso + task.setvariable? This whole line cannot work.
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/set-variables-scripts?view=azure-devops&tabs=bash#set-an-output-variable-for-use-in-the-same-job
7 - $(basename $Build.SourcesDirectory)
I suppose that 'basename' is a variable, not a constant string, but you did not provide this information.
This will never work; to access a variable you need to use the syntax $(VariableName).
If you want to concatenate variables, put them next to each other:
"$(basename)$(Build.SourcesDirectory)"
8 - I do not guarantee it will fully work, because there were plenty of mistakes and I could have missed something.
The 'steps' section of the YAML should be similar to this one:
steps:
  - script: |
      if [[ $Build.SourcesDirectory == *"deployment-folder"* ]]; then
        echo "##vso[task.setvariable variable=deploymentFile;]$(basename)$(Build.SourcesDirectory)"
      fi
    name: Get the deployment file name
    displayName: Get deployment file name
    env:
      SYSTEM_DEFAULTWORKINGDIRECTORY: $(System.DefaultWorkingDirectory)
      BUILD_SOURCESDIRECTORY: $(Build.SourcesDirectory)
  - script: echo "Deploying using test-hub-deploy-dev.yml"
    displayName: Deploy to test-dev
    name: Deploy to test-dev
    condition: and(succeeded(), eq(variables['deploymentFile'], 'test-hub-deploy-dev.yml'))
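And, for point 5 above, a minimal sketch of how an output variable would be consumed from a different job, in case it is ever needed; the job names JobA/JobB and the step name setOutput are hypothetical, not taken from the question:

jobs:
  - job: JobA
    steps:
      - bash: echo "##vso[task.setvariable variable=deploymentFile;isOutput=true]test-hub-deploy-dev.yml"
        name: setOutput   # the step must have a name so its outputs can be referenced
  - job: JobB
    dependsOn: JobA
    variables:
      # map the output of JobA into this job via a runtime expression
      deploymentFile: $[ dependencies.JobA.outputs['setOutput.deploymentFile'] ]
    steps:
      - script: echo "Deploying using $(deploymentFile)"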

Azure Pipeline: Update variables according to parameter

I have some very simple variables which I would like to change according to the environment.
I have written the code below in very different ways (including indentation), but none was fruitful. Alternatives I see are:
- Use variable groups (trying to avoid having too many of them)
- Write a bash script which updates the variables (will work, but I think it's not a super neat solution)
variables:
  - group: secrets
  - name: hello
    value: world
  ${{ if eq(parameters.environment, 'dev') }}:
    - name: RabbitMQ_replicaCount
      value: 3
  ${{ if eq(parameters.environment, 'test') }}:
    RabbitMQ_replicaCount: '1'
Any other ideas will be appreciated :)
I would rather go with a PowerShell/Bash script for this task. Why? The part of the build where manipulation is needed, like setting or overriding a variable based on branch or environment, can be done in a better way in a script rather than in the build YAML itself. Also, this part unnecessarily elongates the YAML.
Step 1: Define a variable in the build pipeline with the default environment name,
and maybe another variable whose value you want to set based on a condition.
Step 2: Add a yml file (let's name it BuildEnv.yml) in your repo which actually contains your PowerShell/Bash code:
steps:
  - powershell: |
      if($BuildEnv -ne "Test"){
        Write-Host "##vso[task.setvariable variable=BuildEnv]Dev"
        Write-Host "##vso[task.setvariable variable=RabbitMQ_replicaCount]11"
      }
    displayName: 'Override Build Env'
    # MORE CODE HERE
Step 3: Plug your yml into the build pipeline as a template:
trigger:
  branches:
    include:
      - master

name: $(date:yyyy-MM-dd_HH.mm)_$(rev:.r)

stages:
  - stage: Build_Stage
    displayName: Build_Stage
    jobs:
      - job: Build_Job
        pool:
          name: ABC
        steps:
          - template: ..\BuildEnv.yml
          #REST CODE
That's it. You are done.
Reference : Template usage in Azure DevOps build - https://learn.microsoft.com/en-us/azure/devops/pipelines/process/templates?view=azure-devops

How can I use a variable created inside of the pipeline (from a gradle.properties file) within a template as a task

I have a Java project which has a gradle.properties file. I'm extracting variables defined in gradle.properties as
##vso[task.setvariable variable=myVariable;]`my script to extract it from gradle.properties`
Then I'm using a template from another repository that needs that variable, but I can't use it within a task; when I try to use it within - script: echo $variable as a step instead of a task, it works.
When I try to use it within a task, it sees the variable as the literal $variable, not the value.
Maybe there is a better way to extract variables into an Azure pipeline instead of using this approach?
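For context, a minimal sketch of what such an extraction step might look like; the property name version, the grep/cut commands and the step layout are illustrative assumptions, not taken from the question:

steps:
  - bash: |
      # read a property from gradle.properties (assumed line format: version=1.2.3)
      myVariable=$(grep '^version=' gradle.properties | cut -d'=' -f2)
      # expose it to later steps in the same job
      echo "##vso[task.setvariable variable=myVariable]$myVariable"
    displayName: Extract version from gradle.properties
  - script: echo $(myVariable)   # available in a later step via macro syntax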
Check the error message:
We get the error before the pipeline runs the bash task. Since it cannot create the variable parampass, we get the parameter value as the literal $(parampass) instead of the variable value.
Check this doc:
In a pipeline, template expression variables ${{ variables.var }} get processed at compile time, before runtime starts. Macro syntax variables $(var) get processed during runtime before a task runs. Runtime expressions $[variables.var] also get processed during runtime but were designed for use with conditions and expressions.
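To illustrate that quote, a minimal sketch (the variable name myVar is made up for illustration): the template expression keeps the compile-time value, while the macro syntax picks up the value set at runtime.

variables:
  myVar: fromYaml

steps:
  - bash: echo "##vso[task.setvariable variable=myVar]fromScript"
  # resolved at compile time, so this prints "fromYaml"
  - script: echo "${{ variables.myVar }}"
  # resolved just before the task runs, so this prints "fromScript"
  - script: echo "$(myVar)"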
As a workaround:
pipeline.yml
pool:
  vmImage: ubuntu-20.04

resources:
  repositories:
    - repository: common
      type: git
      name: organisation/repo-name

variables:
  - name: parampass
    value: xxx

stages:
  - stage: "Build"
    jobs:
      - job: "Build"
        steps:
          - template: templatename.yml#common
            parameters:
              par1: ${{ variables.parampass }}
Result:
Probably you do not provide the variable to your template.
Example execution of a template with a provided parameter:
- template: my/path/to/myTemplate.yml#MyAnotherRepositoryResourceName
  parameters:
    projectsToBeTested: $(someVariable)
And an example template accepting parameters:
steps:
  - task: DotNetCoreCLI@2
    displayName: 'Just testing'
    inputs:
      command: test
      projects: ${{ parameters.projectsToBeTested }}
Please provide more information if it does not help.
Code looks like this:
pipeline.yml
pool:
  vmImage: ubuntu-20.04

resources:
  repositories:
    - repository: common
      type: git
      name: organisation/repo-name

stages:
  - stage: "Build"
    jobs:
      - job: "Build"
        steps:
          - bash: |
              echo "##vso[task.setvariable variable=parampass]anything"
          - template: templatename.yml#common
            parameters:
              par1: $(parampass)
templatename.yml
parameters:
  - name: par1

steps:
  - task: SonarCloudPrepare@1
    displayName: SonarCloud analysis prepare
    inputs:
      SonarCloud: ${{ parameters.par1 }}
      organization: 'orgname'
      scannerMode: 'Other'
      extraProperties: |
        # Additional properties that will be passed to the scanner,
        # Put one key=value per line, example:
        # sonar.exclusions=**/*.bin
        sonar.projectKey= # same param pass case
        sonar.projectName= # same param pass case
Generally, it does not matter whether I pass parameters or use the template as if it were part of the pipeline code; the output is always that $(parampass) could not be found, or something similar.

Use variables in Azure DevOps Pipeline templates

We have a collection of Azure DevOps pipeline templates that we re-use across multiple repositories. Therefore we wanted to have a file that contains variables for all of our templates.
The repo structure looks like this
template repo
├── template-1.yml
├── template-2.yml
└── variables.yml
project repo
├── ...
└── azure-pipelines.yml
The variables.yml looks like this
...
variables:
  foo: bar
In template-1.yml we are importing the variables.yml as described here
variables:
  - template: variables.yml
In the azure-pipelines.yml we are using the template like this
resources:
  repositories:
    - repository: build-scripts
      type: git
      name: project-name/build-scripts

steps:
  ...
  - template: template-1.yml#build-scripts
When we now try to run the pipeline, we get the following error message:
template-1.yml#build-scripts (Line: 10, Col: 1): Unexpected value 'variables'
The issue is that you used the variable template at steps scope, and variables simply don't exist at that level. This should work for you:
resources:
  repositories:
    - repository: build-scripts
      type: git
      name: project-name/build-scripts

variables:
  - template: template-1.yml#build-scripts

steps:
  ...
This is available to use at any place where variables can be used. So, for instance, you can use it this way:
jobs:
  - job: myJob
    timeoutInMinutes: 10
    variables:
      - template: template-1.yml  # Template reference
    pool:
      vmImage: 'ubuntu-16.04'
    steps:
      - script: echo My favorite vegetable is ${{ variables.favoriteVeggie }}.
If your template file only has variables, you can refer to Krzysztof Madej's answer.
If your template file has both variables and steps as shown below, it can only be used via extends (a minimal sketch of that usage follows the example below).
# File: template-1.yml
variables: ...
steps: ...
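A minimal sketch of consuming such a template via extends, reusing the build-scripts repository resource from the question (see also the fuller example further down):

# azure-pipelines.yml
resources:
  repositories:
    - repository: build-scripts
      type: git
      name: project-name/build-scripts

extends:
  template: template-1.yml#build-scripts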
Or you can write them in a stage, as shown below.
# File: template-1.yml
stages:
  - stage: {stage}
    variables: ...
    jobs:
      - job: {job}
        steps: ...
Then insert it as a separate stage.
# azure-pipelines.yml
stages:
  - stage: ...
  - template: template-1.yml
Replying as I think there was something missed in the original explanations.
First:
If you are referencing variables in a template, then you must ensure that you are extending that template when you call it. Calling a template that defines variables without extending it will result in a failed pipeline execution:
template-1.yml#build-scripts (Line: 10, Col: 1): Unexpected value 'variables'
Second: when referencing variables in multiple templates, do not use:
${{ variables.foo }} # even though this is seen in template examples
but rather use the normal variable syntax:
$(foo) # this works
So for the originally referenced example, the following should work.
Variables.yml:
variables:
  foo: bar
template-1.yml:
variables:
  - template: variables.yml

....

steps:
  ...
  - task: CmdLine@2
    displayName: Display Variable
    inputs:
      script: |
        echo $(foo)
azure-pipelines.yml:
resources:
  repositories:
    - repository: build-scripts
      type: git
      name: project-name/build-scripts

extends:
  template: template-1.yml#build-scripts
template-2.yml
steps:
  ...
  - task: CmdLine@2
    displayName: Display Variable
    inputs:
      script: |
        echo $(foo)
This approach can help someone, so I decided to post it here.
One more kind of "workaround" for that case, maybe a bit "dirty", since you need to specify the parameters explicitly each time you execute the template, which is not a good idea if you have a lot of parameters to pass (actually, there is a way around that, read the improved version below). But if you really want to, or you don't have that many parameters, this should work.
The logic is: you keep all your parameters in a variables template, like templates/vars.yml:
variables:
  - name: myVar1
    value: MyValue1
  - name: myVar2
    value: MyValue2
And since you have everything you need in the variables, there is probably no need to import variables into the template itself, because the template will be executed in the pipeline, which already has your variables imported, and you can substitute them explicitly, like in the example below.
templates/my-template-setup-env.yml's content (without variables inside it):
steps:
  - script: |
      echo "$(myVar3)"
my-azure-pipeline.yml's content (with importing variables template):
name: my-cute-CI-name

trigger:
  - main

pool:
  vmImage: ubuntu-18.04

variables:
  - template: templates/vars.yml # importing your variables from templates

stages:
  - stage: RunTheStage
    displayName: 'Run first stage'
    jobs:
      - job: RunTheJob
        displayName: 'Run your job'
        steps:
          - template: templates/my-template-setup-env.yml # your template
            parameters:
              myVar3: $(myVar1) # substitute value from vars.yml, so myVar1 will be used in the template and printed
Improved version
But if you have unique naming of your parameters and variables across all pipelines and templates, you can safely skip specifying them explicitly when using the template; that will work as well.
Edited and shortened version of my-azure-pipeline.yml (in case the variable and the template parameter share the same name):
variables:
  - template: templates/vars.yml
...
steps:
  - template: templates/my-template-setup-env.yml # env preparation
    # parameters:
    #   myVar2: $(myVar2) # you don't need to pass any parameters explicitly to the template since the variable has the same name
templates/my-template-setup-env.yml should then look like this:
steps:
  - script: |
      echo "$(myVar2)" # not myVar3, since myVar3 is not in the initial variables file templates/vars.yml
Or you need to add the remaining variables (myVar3 in our first case) into the templates/vars.yml file as well.

How to share environment variables in parallel Bitbucket pipeline steps?

So, I am using Bitbucket Pipelines to deploy my application. The app consists of two components: 1 and 2. They are deployed in two parallel steps in the Bitbucket pipeline:
pipelines:
  custom:
    1-deploy-to-test:
      - parallel:
          - step:
              name: Deploying 1
              image: google/cloud-sdk:latest
              script:
                - SERVICE_ENV=test
                - GCLOUD_PROJECT="some-project"
                - MEMORY_LIMIT="256Mi"
                - ./deploy.sh
          - step:
              name: Deploying 2
              image: google/cloud-sdk:latest
              script:
                - SERVICE_ENV=test
                - GCLOUD_PROJECT="some-project"
                - MEMORY_LIMIT="256Mi"
                - ./deploy2.sh
The environment variables SERVICE_ENV, GCLOUD_PROJECT and MEMORY_LIMIT are always the same for deployments 1 and 2.
Is there any way to define these variables once for both parallel steps?
You can use user-defined variables in Pipelines. For example, you can configure your SERVICE_ENV, GCLOUD_PROJECT and MEMORY_LIMIT as repository variables, and they will be available to all steps in your pipeline.
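A rough sketch of that idea, assuming the three variables have already been added under Repository settings > Repository variables; the steps can then drop the inline assignments and rely on the injected environment variables:

pipelines:
  custom:
    1-deploy-to-test:
      - parallel:
          - step:
              name: Deploying 1
              image: google/cloud-sdk:latest
              script:
                # SERVICE_ENV, GCLOUD_PROJECT and MEMORY_LIMIT are injected as
                # environment variables from the repository variables
                - ./deploy.sh
          - step:
              name: Deploying 2
              image: google/cloud-sdk:latest
              script:
                - ./deploy2.sh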
To add to the other answers, here's a glimpse of how we currently handle this, inspired by all the other solutions found in Bitbucket's forums.
To allow parallel tasks to re-use deployment variables (which currently cannot be passed between steps), we first use the bash Docker image to write environment variables into an artifact. The pure bash image is very fast (it usually runs in under 8 seconds).
Then all other tasks can run in parallel, benefiting from the deployment and repository variables that we usually set, all while bypassing the current Bitbucket Pipelines limitations.
definitions:
  steps:
    - step: &set_env
        name: Set multi-steps env variables
        image: bash:5.2.12
        artifacts:
          - set_env.sh
        script:
          ## Pass env
          - echo "Passing all env variables to next steps"
          - >-
            echo "
            export USERNAME=$USERNAME;
            export HOST=$HOST;
            export PORT_LIVE=$PORT_LIVE;
            export THEME=$THEME;
            " >> set_env.sh
    - step: &git_ftp
        name: Git-Ftp
        image: davidwebca/docker-php-deployer:8.0.25
        script:
          # check if env file exists
          - if [ -e set_env.sh ]; then
          -   cat set_env.sh
          -   source set_env.sh
          - fi
          # ...

branches:
  staging:
    - parallel:
        - step:
            <<: *set_env
            deployment: staging
    - parallel:
        - step:
            <<: *git_ftp
        - step:
            <<: *root_composer
        - step:
            <<: *theme_assets
According to the forums, Bitbucket staff is working on a solution to allow more flexibility on this as we speak (2022-12-01), but we shouldn't expect an immediate release, as it seems complicated to implement safely on their end.
As was explained in this link, you can define the environment variables and copy them to a file.
After that, you can share that file between steps as an artifact.
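A minimal sketch of that approach; the file name env.sh and the step names are assumptions, with the variable values taken from the question's example:

definitions:
  steps:
    - step: &export_env
        name: Export environment variables
        artifacts:
          - env.sh   # shared with later steps as an artifact
        script:
          - echo "export SERVICE_ENV=test" >> env.sh
          - echo "export GCLOUD_PROJECT=some-project" >> env.sh
          - echo "export MEMORY_LIMIT=256Mi" >> env.sh
    - step: &use_env
        name: Use exported variables
        script:
          - source env.sh
          - echo "$SERVICE_ENV $GCLOUD_PROJECT $MEMORY_LIMIT"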
