I'm using the azure-pipeline.yml file below to build a Docker image, push it to an Azure container registry, and restart the Azure Docker app service.
This YAML file uses variables set in the Azure pipeline (screenshot attached).
My issue: I need to create 2-3 pipelines every week for different projects, and I have to add every variable manually for each project, copy-pasting from my config. Is there a way I can import a .env file or add multiple variables all at once while creating the pipeline?
The goal is to cut down the time spent copy-pasting single variables and to avoid the errors that might occur.
1. You could use a variable group to reuse variables.
trigger:
- none

pool:
  vmImage: ubuntu-latest

variables:
- group: forTest

steps:
- script: |
    echo $(test1)
    echo $(test2)
  displayName: 'Run a multi-line script'
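To address the bulk-import part of the question: a variable group can also be filled from a .env file with the Azure DevOps CLI instead of adding each variable by hand. A minimal bash sketch, assuming an existing variable group and a KEY=VALUE .env file (the group id and file path are placeholders for your own values):

```shell
#!/usr/bin/env bash
# Emit one "az pipelines variable-group variable create" command per
# KEY=VALUE line of a .env file, skipping blank lines and comments.
emit_var_commands() {
  local env_file="$1" group_id="$2"
  while IFS='=' read -r key value; do
    # Skip blank lines and lines starting with '#'.
    [[ -z "$key" || "$key" == \#* ]] && continue
    echo "az pipelines variable-group variable create --group-id $group_id --name $key --value $value"
  done < "$env_file"
}
```

After `az devops configure --defaults organization=... project=...`, piping the output through `bash` actually creates the variables; printing the commands first gives you a dry run to check for errors.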
2. You could use a variable template.
trigger:
- none

pool:
  vmImage: ubuntu-latest

variables:
- template: vars.yml

steps:
- script: |
    echo $(test1)
    echo $(test2)
  displayName: 'Run a multi-line script'
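The vars.yml referenced above is just a file containing a variables list. A minimal sketch (the values are placeholders; the names test1/test2 match the script above):

```yaml
# vars.yml - reusable variable template
variables:
- name: test1
  value: 'valueForTest1'
- name: test2
  value: 'valueForTest2'
```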
Is there a way within a Release pipeline in Azure to pass variables created in one stage to the next stage?
I see lots of documentation about using echo "##vso[task..... - This however does not seem to work within the release pipeline.
I am mainly using bash scripts and I can reuse it within the same stage in different tasks, but not subsequent stages.
This seems like an essential feature to me, passing variables through stages...
Is there a way to do this?
If you want to pass variables from one stage to another stage in YAML pipelines for release, you are supposed to use echo "##vso[task....." as described in the doc.
For a simple example:
stages:
- stage: BuildStage
  jobs:
  - job: BuildJob
    steps:
    - bash: echo "##vso[task.setvariable variable=TestArtifactName;isoutput=true]testValue"
      name: printvar
- stage: DeployWebsiteStage
  lockBehavior: sequential
  dependsOn: BuildStage
  condition: succeeded()
  variables:
    BuildStageArtifactFolderName: $[stageDependencies.BuildStage.BuildJob.outputs['printvar.TestArtifactName']]
  jobs:
  - deployment: DeployWebsite
    environment:
      name: webapplicationdeploy
    strategy:
      runOnce:
        deploy:
          steps:
          - task: PowerShell@2
            inputs:
              targetType: 'inline'
              script: Write-Host "BuildStageArtifactFolderName:" $(BuildStageArtifactFolderName)
You are supposed to get the value set from stage 'BuildStage'.
If you want to pass variables from one stage to another stage in classic release pipelines:
1. Set the Release Permission 'Manage releases' for the Project Collection Build Service to Allow.
2. Toggle on 'Allow scripts to access the OAuth token' for the first stage.
3. Set a variable like 'StageVar' in release scope.
4. Add the first PowerShell task (inline) in the first stage to create a variable 'myVar' in the first stage.
5. Update the Release Definition and Release Variable (StageVar).
6. Add a PowerShell task in the second stage to retrieve the value of myVar via the Release Variable StageVar.
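Step 5 is done against the Release REST API: GET the current release, change variables.StageVar.value in the returned JSON, and PUT the whole release document back. A bash sketch of just the URL and the edited fragment, with the real call left commented out (the api-version and the exact URL shape are assumptions to verify against your collection; the collection URI is expected to end with a slash):

```shell
# Build the release-update URL from the collection URI (with trailing
# slash), project name, and release id.
build_update_url() {
  echo "$1$2/_apis/release/releases/$3?api-version=7.0"
}

# Build the JSON fragment that sets the release-scoped variable
# 'StageVar' to the value computed in the first stage.
build_update_body() {
  printf '{"variables":{"StageVar":{"value":"%s"}}}' "$1"
}

# The real call needs the OAuth token enabled in step 2, and the body
# must be merged into the full release JSON fetched with GET first:
# curl -s -X PUT "$(build_update_url "$SYSTEM_TEAMFOUNDATIONCOLLECTIONURI" "$SYSTEM_TEAMPROJECT" "$RELEASE_RELEASEID")" \
#   -H "Authorization: Bearer $SYSTEM_ACCESSTOKEN" \
#   -H "Content-Type: application/json" -d "<full release json with updated StageVar>"
```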
You could refer the blog for more details.
It works on my side: the value set in the first stage comes through in the second stage (screenshots of the results in both stages omitted here).
You can use an Azure DevOps variable group to store your variables and call them in your release pipelines across multiple stages and multiple pipelines within a project. You can use the Azure CLI to work with the variable group.
I have stored 2 variables (a database name and password) in the SharedVariables group. I can call this variable group in 2 ways: 1) via the YAML pipeline, and 2) via classic/release pipelines.
Yaml:-
trigger:
- main

pool:
  vmImage: ubuntu-latest

variables:
- group: SharedVariables

steps:
- script: |
    echo $(databaseserverpassword)
When I ran the pipeline, the database server password was masked in the logs, like the below:-
In Release pipeline:-
You can add this variable within multiple pipelines and multiple stages in your release like below:-
But the above method helps you store static values in the variable group, not the output variables of a build, unless you specifically assign those variables manually in the variable group. You can make use of this extension > Variable Tools for Azure DevOps Services - Visual Studio Marketplace
With this extension, you can store your variables from the build in a JSON file and load that JSON file in the next release stage by calling the task from the extension.
Save the build variable in a file:-
Load the build variable in a release:-
Another method is to store the variables in a file as a build artifact and then call the build artifact in the release pipeline with the below yaml code:-
trigger:
- dev

pool:
  vmImage: windows-latest

parameters:
- name: powerenvironment
  displayName: Where to deploy?
  type: string

steps:
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      $variable = '${{ parameters.powerenvironment }}'
      $variable | Out-File $(Build.ArtifactStagingDirectory)\filewithvariable.txt
      Get-Content $(Build.ArtifactStagingDirectory)\filewithvariable.txt
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'
    publishLocation: 'Container'
And download the artifact and run the tasks in your release pipeline.
Reference:-
Pass parameters from build to release pipelines on Azure DevOps - GeralexGR
Another simple method is to run a PowerShell script that stores the build output as JSON in the published artifact and reads the content back in the release pipeline, like below:-
$buildOutput | ConvertTo-Json | Out-File "file.json"
Get-Content "file.json" | ConvertFrom-Json
You can also reference the dependencies from various stages and call them in another stage within a pipeline with below yaml code :-
stages:
- stage: A
  jobs:
  - job: A1
    steps:
    - bash: echo "##vso[task.setvariable variable=shouldrun;isOutput=true]true"
      # or on Windows:
      # - script: echo ##vso[task.setvariable variable=shouldrun;isOutput=true]true
      name: printvar
- stage: B
  condition: and(succeeded(), eq(dependencies.A.outputs['A1.printvar.shouldrun'], 'true'))
  dependsOn: A
  jobs:
  - job: B1
    steps:
    - script: echo hello from Stage B
Reference:-
bash - How to pass a variable from build to release in azure build to release pipeline - Stack Overflow By PatrickLu-MSFT
azure devops - How to get the variable value in TFS/AzureDevOps from Build to Release Pipeline? - Stack Overflow By jessehouwing
VSTS : Can I access the Build variables from Release definition? By Calidus
Expressions - Azure Pipelines | Microsoft Learn
I want to tag my pipeline build during the build process itself. Based on the official document, I tried echo ##vso[build.addbuildtag]testing in the pipeline YAML. There was no error, but the build was not tagged either.
I'm able to add tags from the portal successfully.
My pipeline YAML is below.
pool:
  vmImage: ubuntu-latest

jobs:
- job: addingtag
  displayName: addingtag
  steps:
  - script: |
      echo ##vso[build.addbuildtag]testing
    displayName: addingtag
Below are other combinations I tried and still failed.
echo ##vso[build.addbuildtag] testing
echo ##vso[build.addbuildtag]Tag_testing
You may need to add double quotes. I could successfully add a tag by using a YAML script like below.
- script: |
    echo "##vso[build.addbuildtag]testing"
  displayName: addingtag
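The reason the quotes matter: in bash, an unquoted # starts a comment, so everything from ##vso onward is discarded before echo even runs, and the agent never sees the logging command. A quick local demonstration:

```shell
# run_unquoted: '#' begins a bash comment, so echo receives no
# arguments and prints an empty line.
run_unquoted() {
  echo ##vso[build.addbuildtag]testing
}

# run_quoted: quoting keeps the full logging command intact on stdout,
# where the agent can pick it up.
run_quoted() {
  echo "##vso[build.addbuildtag]testing"
}
```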
I am trying to loop through user-defined variables in an Azure DevOps YAML pipeline.
The variables have been created through the UI:
Below is the YAML pipeline code that I'm using:
trigger:
- dev
- main

pr:
- dev

pool:
  vmImage: ubuntu-latest

stages:
- stage:
  jobs:
  - job: TestVars
    steps:
    - ${{ each var in variables }}:
      - script: |
          echo ${{ var.key }}
          echo ${{ var.value }}
        displayName: ${{ var.key }}
When running the above pipeline only system and build variables are listed (e.g. system, system.hostType, build.queuedBy, etc.).
Any help to loop through user-defined variables would be much appreciated.
Unfortunately, no luck fetching the variables defined in the UI: they are injected at runtime, so they are not visible to the compile-time ${{ each }} expression. However, if your variables are non-secret, you can bring them over into the YAML, and they will show up in the loop.
- stage:
  variables:
    myyamlvar: 1000 # this will show up in the loop
  jobs:
  - job: TestVars
    steps:
    - ${{ each var in variables }}:
      - script: |
          echo ${{ var.key }}
          echo ${{ var.value }}
        displayName: ${{ var.key }}
Alternatively, instead of using a compile time expression, you can list variables using a runtime construct, for example:
- job: TestRuntimeVars
  steps:
  - script: |
      for var in $(compgen -e); do
        echo $var ${!var};
      done
This will list all variables including ones defined in the UI.
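Run locally, the same loop shows the principle: compgen -e lists the names of all exported variables, so anything the pipeline maps into the environment appears. MY_UI_VAR here is a stand-in for a UI-defined variable:

```shell
# Simulate a pipeline variable mapped into the environment.
export MY_UI_VAR="from-the-ui"

# compgen -e prints exported variable names; ${!var} is bash indirect
# expansion, fetching each variable's value by name.
list_env_vars() {
  for var in $(compgen -e); do
    echo "$var=${!var}"
  done
}
```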
From the Microsoft docs link you provided, it specifies that:
"Unlike a normal variable, they are not automatically decrypted into
environment variables for scripts. You need to explicitly map secret
variables."
However, one workaround could potentially be to run an Azure CLI task and get the pipeline variables using az pipelines variable list.
If your intention is to get the actual secret values, that may not suffice. Having said that, you should consider a variable group even if you're not using one in other pipelines, since the group can be linked to an Azure Key Vault and map the secrets as variables. You can store your sensitive values in a Key Vault and link it to the variable group, which can then be used like regular variables in your pipeline.
Or you can access Key Vault secrets right from the AzureKeyVault pipeline task.
To expand on the answer below: it is a bit roundabout, but you can use the Azure DevOps CLI. This may be overkill, but it does the job.
trigger:
- main

pool:
  vmImage: ubuntu-latest

steps:
- bash: az --version
  displayName: 'Show Azure CLI version'
- bash: az devops configure --defaults organization=$(System.TeamFoundationCollectionUri) project=$(System.TeamProject) --use-git-aliases true
  displayName: 'Set default Azure DevOps organization and project'
- bash: |
    az pipelines variable list --pipeline-id $(System.DefinitionId)
  displayName: 'Show pipeline variables'
  env:
    AZURE_DEVOPS_EXT_PAT: $(System.AccessToken)
This approach was taken from:
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/variables?view=azure-devops&tabs=yaml%2Cbatch#list-variables
If the agent is self-hosted, you may need to install the Azure DevOps CLI.
I want to convert a YAML pipeline from GitLab to Azure DevOps. The problem is I have no prior experience with GitLab. This is the YAML.
Is .package_deploy a template for a job? And is image a pool, or do I need to use a Docker task for it?
And does before_script: mean I need to create a task before the Docker task?
variables:
  myVar: "Var"

stages:
  - deploy

.package_deploy:
  image: registry.gitlab.com/docker-images/$myVar:latest
  stage: build
  script:
    - cd src
    - echo "Output file name is set to $OUTPUT_FILE_NAME"
    - echo $OUTPUT_FILE_NAME > version.txt
    - az login --service-principal -u $ARM_CLIENT_ID -p $ARM_CLIENT_SECRET --tenant $ARM_TENANT_ID

dev_package_deploy:
  extends: .package_deploy
  stage: deploy
  before_script:
    - export FOLDER=$FOLDER_DEV
    - timestampSuffix=$(date -u "+%Y%m%dT%H%M%S")
    - export OUTPUT_FILE_NAME=${myVar}-${timestampSuffix}-${CI_COMMIT_REF_SLUG}.tar.gz
  when: manual

demo_package_deploy:
  extends: .package_deploy
  stage: deploy
  before_script:
    - export FOLDER=$FOLDER_DEMO
    - timestampSuffix=$(date -u "+%Y%m%dT%H%M%S")
    - export OUTPUT_FILE_NAME=${myVar}-${timestampSuffix}.tar.gz
  when: manual
  only:
    refs:
      - master
.package_deploy: is a 'hidden job' that you can use with the extends keyword. By itself, it does not create any job; it's a way to avoid repeating yourself in other job definitions.
before_script is really no different from script, except that they're two different keys. The effect is that before_script + script includes all the script steps in the job.
before_script:
  - one
  - two
script:
  - three
  - four
Is the same as:
script:
  - one
  - two
  - three
  - four
image: defines the docker container in which the job runs. In this way, it is very similar to a pool you would define in ADO. But if you want things to run close to the way they do in GitLab, you probably want to define it as container: in ADO.
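Putting those pieces together, one possible ADO translation of dev_package_deploy might look like this sketch. The variable names are carried over from the GitLab file; the container image reference and the use of Build.SourceBranchName as a rough equivalent of CI_COMMIT_REF_SLUG are assumptions to adapt:

```yaml
jobs:
- job: dev_package_deploy
  container: registry.gitlab.com/docker-images/$(myVar):latest  # GitLab 'image:'
  steps:
  - script: |
      # GitLab 'before_script:' lines simply come first in the same step
      export FOLDER=$(FOLDER_DEV)
      timestampSuffix=$(date -u "+%Y%m%dT%H%M%S")
      export OUTPUT_FILE_NAME=$(myVar)-${timestampSuffix}-$(Build.SourceBranchName).tar.gz
      # GitLab 'script:' lines follow
      cd src
      echo "Output file name is set to $OUTPUT_FILE_NAME"
      echo $OUTPUT_FILE_NAME > version.txt
    displayName: 'package deploy'
```

GitLab's when: manual has no direct step-level equivalent; in ADO that role is usually played by a manually triggered pipeline or an environment approval check.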
I am trying to improve the CI/CD process of an old funky project whose code is not open to refactoring at the moment. I just can't get this to work following the Azure documentation, or even tell whether it is possible.
I have been able to improve the current state with an azure pipeline file that runs unit tests before merging into the releases/dev branch. But I want to go further.
Tasks: every PR into releases/dev will:
script: npm run test:unit
script: npm run build:dev
copy/publish the contents of the .div/ folder to an Azure blob store configured for a static site
Any PR or merge into releases/staging will:
script: npm run build:staging
copy/publish the contents of the .div/ folder to an Azure blob store configured for a static site
Any PR or merge into master will:
script: npm run test:unit
script: npm run build:production
copy/publish the contents of the .div/ folder to an Azure blob store configured for a static site
I have 3 questions:
Is this possible within a single yaml file?
How do I run different tasks for different branches? I've defined jobs/stages but can't get them to be conditional.
Is there some magic anyone can direct me to that lets me copy the contents of a directory to a blob store? Or must it be zipped->copied->unzipped?
Thanks in advance from a new sleep-deprived dad
Is this possible within a single yaml file? How do i run different
task for different branches, I've defined jobs/stages but cant get
them to be conditional?
Of course. You could add these stages in a single YAML file. Then you need to define the condition field for each stage or each job.
Here is an example for stages:
trigger:
- '*'

pool:
  vmImage: 'ubuntu-latest'

stages:
- stage: Test1
  condition: or(eq(variables['Build.SourceBranch'], 'refs/heads/master'), eq(variables['System.PullRequest.TargetBranch'], 'refs/heads/master'))
  jobs:
  - job: BuildJob
    steps:
    - script: echo Build Stage1!
- stage: Test2
  condition: or(eq(variables['Build.SourceBranch'], 'refs/heads/dev'), eq(variables['System.PullRequest.TargetBranch'], 'refs/heads/dev'))
  jobs:
  - job: BuildJob
    steps:
    - script: echo Build Stage2!
- stage: Test3
  condition: or(eq(variables['Build.SourceBranch'], 'refs/heads/staging'), eq(variables['System.PullRequest.TargetBranch'], 'refs/heads/staging'))
  jobs:
  - job: BuildJob
    steps:
    - script: echo Build Stage3!
You could set the required branches as triggers. Then you could use Build.SourceBranch and System.PullRequest.TargetBranch to determine whether to run the current stage.
Build.SourceBranch -> for a branch push/merge.
System.PullRequest.TargetBranch -> for a pull request.
Here are the docs about conditions and variables.
Is there some magic anyone can direct me to that lets me copy the content of a directory to a blob store? Or must it be zipped->copied->un zipped?
Since you need to publish file to Azure Blob, you could directly try to use the Azure File Copy task.
Here is an example:
- task: AzureFileCopy@4
  displayName: 'AzureBlob File Copy'
  inputs:
    SourcePath: xxx
    azureSubscription: xxx
    Destination: AzureBlob
    storage: xxx
    ContainerName: '$web'
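If you'd rather avoid the AzureFileCopy task (which runs only on Windows agents), an AzureCLI task with az storage blob upload-batch can push a whole directory from a Linux agent without zipping. A sketch with placeholder names; the dist path stands in for whatever folder your build emits:

```yaml
- task: AzureCLI@2
  displayName: 'Upload build output to blob static site'
  inputs:
    azureSubscription: xxx        # same service connection as above
    scriptType: bash
    scriptLocation: inlineScript
    inlineScript: |
      # upload-batch copies every file under --source into the container
      az storage blob upload-batch \
        --account-name xxx \
        --destination '$web' \
        --source '$(Build.SourcesDirectory)/dist'
```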
Hope this helps.