Replace kubernetes yaml environment variables in Azure pipeline?

I have an azure devops pipeline that I'm using to deploy a kubernetes project to rancher. In my k8s deployment.yaml file I have environment variables defined like this:
containers:
- name: frontend
  env:
  - name: GIT_HASH
    value: dummy_value
I want to be able to replace the GIT_HASH with a value created in the azure yml pipeline. Specifically, I have a script that gets the short git commit hash, e.g.:
- task: Bash@3
  displayName: Set the short git hash as a variable
  inputs:
    targetType: 'inline'
    script: |
      short_hash=$(git rev-parse --short=7 HEAD)
      echo "##vso[task.setvariable variable=git-hash;]$short_hash"
And I want to be able to inject this value into kubernetes as the GIT_HASH. Is there a way to do this? I've tried using the qetza.replacetokens.replacetokens-task.replacetokens@3 but can't get it to work.

Posting comment as the community wiki answer for better visibility.
Have you considered using Helm or Kustomize to template your deployment manifests instead of relying on token replacement?
More information can be found in the official Azure documentation.
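For completeness, the token-replacement route the question mentions can also work. A minimal sketch, assuming the replacetokens@3 task with its default #{...}# token pattern and a hypothetical manifests/deployment.yaml path:

# deployment.yaml: use a token placeholder instead of dummy_value
env:
- name: GIT_HASH
  value: "#{git-hash}#"

# azure-pipelines.yml: run after the bash step that sets the git-hash variable
- task: replacetokens@3
  inputs:
    targetFiles: 'manifests/deployment.yaml'  # hypothetical path to the manifest
    tokenPrefix: '#{'
    tokenSuffix: '}#'

The task replaces #{git-hash}# with the value of the pipeline variable set by the bash step, so the manifest carries the short hash before it is applied to the cluster.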

Related

How to declare variable groups once in template and use in multiple repos in Azure yaml pipeline

How to authorize variable in a yaml template in another repo to be used in a different repo. IOW, how to declare variables in a template once and use in multiple repos in azure devops
I am trying to migrate from classic pipelines to yaml in azure devops. So I am trying to set up a repo to host all yaml templates so they can be referenced and reused by multiple repos for builds, etc.
I wrote this yaml pipeline as a sample to prototype it:
name: FirstPL

trigger:
- my_test_branch

pool: my-agent

resources:
  repositories:
  - repository: blah
    type: git
    name: foo/bar
    ref: refs/heads/poc

variables:
- template: pipeline_vars.yml@blah

steps:
- script: echo $(variable_from_pipeline_vars)
However, when I run this I get the following error:
An error occurred while loading the YAML build pipeline. Variable group was not found or is not authorized for use. For authorization details, refer to https://aka.ms/yamlauthz.
How can I declare my variables and variable groups once, in a template in a repo dedicated to hosting those templates, and then use them over and over again in multiple repos using the resources syntax above? Also, I tried to find a way to authorize the variables template but couldn't find anything to enable this.
You can directly add a variable group in your Azure DevOps project in the Library tab and save all the variables from pipeline_vars.yml in that variable group.
Now you can access this variable group in the YAML pipelines of multiple repos like below:
# Starter pipeline
# Start with a minimal pipeline that you can customize to build and deploy your code.
# Add steps that build, run tests, deploy, and more:
# https://aka.ms/yaml

pool:
  vmImage: ubuntu-latest

workspace:
  clean: all

resources:
  repositories:
  - repository: repo_a
    type: git
    name: InternalProjects/repo_a
    trigger:
    - main
    - release
  - repository: repo_b
    type: git
    name: InternalProjects/repo_b
    trigger:
    - main

variables:
- group: SharedVariables

steps:
- checkout: repo_a
- checkout: repo_b

- script: |
    echo $(databaseServerName)

- task: AzureCLI@2
  inputs:
    azureSubscription: 'xxx subscription(xxxxxxxxx-f598-44d6-b4fd-xxxxxxxxxxxx)'
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: 'az resource list --location uksouth'
When the pipeline runs, it asks for permission to use the variable group before it can continue.
Trying the same with another repo, repo_b, in the project, it likewise asks to approve access to the repositories and the variable group.
If you want this variable group to be accessible in multiple stages/repos/pipelines within the project without an authorization prompt, you can click Security at the top of the variable group and allow access there.
I created one variables template and referenced it in the YAML pipeline to use across multiple repos by checking out another repo like below:
# Starter pipeline
# Start with a minimal pipeline that you can customize to build and deploy your code.
# Add steps that build, run tests, deploy, and more:
# https://aka.ms/yaml

pool:
  vmImage: ubuntu-latest

workspace:
  clean: all

resources:
  repositories:
  - repository: repo_a
    type: git
    name: InternalProjects/repo_a
    trigger:
    - main
    - release
  - repository: repo_b
    type: git
    name: InternalProjects/repo_b
    trigger:
    - main

variables:
- template: pipeline_vars.yml

steps:
- checkout: repo_a
- checkout: repo_b

- script: |
    echo $(environmentName)

- task: AzureCLI@2
  inputs:
    azureSubscription: 'xxx subscription(xxxxxxxx-f598-44d6-b4fd-e2b6e97xxxxxx)'
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: 'az resource list --location uksouth'
When I tried to reference the same template from another repo where it does not exist, the pipeline could not read the pipeline_vars.yml file, as it does not exist in that repo. In that case you can make use of variable groups like the above to reference the variables in the pipeline.
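For reference, a variables template such as pipeline_vars.yml is just a YAML file with a variables block. A minimal sketch (the values are illustrative, not from the original setup):

# pipeline_vars.yml
variables:
  environmentName: 'dev'            # referenced above as $(environmentName)
  databaseServerName: 'sql-dev-01'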
One of the possible reasons for this is that the project that hosts the repository with the variables does not allow access to its repositories from YAML pipelines.
To verify, go to your project's settings -> Pipelines -> Settings and check "Protect access to repositories in YAML pipelines". This setting is enabled by default. You can either turn it off or add a checkout step for the repository to your pipeline yaml. See here for more information.
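A minimal sketch of the checkout approach, reusing the blah repository alias from the question's resources block (the alias is taken from the example above, not prescribed):

steps:
# Checking out the templates repository lets the pipeline be authorized to use it
- checkout: self
- checkout: blah
- script: echo $(variable_from_pipeline_vars)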

Azure DevOps Pipeline yaml file Docker Tag

Hello my expert friends.
I know this question might sound really simple, but I am seeking some advice and best practices to follow/learn.
I have a testing infra in Azure divided into two environments. One is Staging and the other is Production.
Both environments have the same configuration, for hands-on practice, as I want to learn how to deploy specific docker images from staging to production.
At the current state, I have 1 web app in Staging and 1 web app in Production.
For this lab, my build pipeline in Staging should trigger only when I push a tag to GitHub, and I achieved this by setting up my pipeline as follows:
trigger:
  batch: true
  tags:
    include:
    - '*'
  branches:
    exclude:
    - Staging
This runs a docker build of some C# code and pushes the image to a container registry.
But in my current pipeline, under the Docker task:
- task: Docker@2
  name: 'dockerBuildAndPush'
  displayName: 'docker - Build & Push'
  inputs:
    repository: $(imageRepository)
    Dockerfile: $(dockerfilePath)
    containerRegistry: ${{ variables.dockerRegistryServiceConnection }}
    buildContext: ${{ variables.buildContext }}
    tags: |
      $(Build.BuildNumber)
      latest
As you can see, I have tags based on Build.BuildNumber.
During the build process this of course produces the current date + build counter, but what I want to do is target the latest tag coming from GitHub and pass it to the build.
And this is where I got confused and am not really sure about the best practice to follow.
Assuming that on GitHub I push the update with the tag v1.0, is there a way for the pipeline to pick up the tag and pass it automatically to the build? Or do I have to update the tag value manually in the pipeline every time before pushing to GitHub?
So basically what I want to have in my container registry is as follows:
GitHub push of tag v1.0
Azure Container Registry gets a docker image tagged v1.0
This way, it will be easier to quickly detect which docker image is running on Staging and Production later on.
Sorry if I couldn't explain my dilemma clearly; if that is the case, please don't hesitate to ask for more information.
UPDATE:
Thank you so much Dave for your reply and your solution. I will look into it asap.
For now I was looking for something a bit easier to achieve, so I can understand the full process and get confident with it.
At the current state I have arranged my pipeline as follows.
parameters:
- name: tag
  type: string
  default: 'v1.1'

trigger:
  batch: true
  tags:
    include:
    - '*'
  branches:
    exclude:
    - Staging
and in my Docker task I set the tag to the parameter:
- task: Docker@2
  name: 'dockerBuildAndPush'
  displayName: 'docker - Build & Push'
  inputs:
    repository: $(imageRepository)
    Dockerfile: $(dockerfilePath)
    containerRegistry: ${{ variables.dockerRegistryServiceConnection }}
    buildContext: ${{ variables.buildContext }}
    tags: '${{ parameters.tag }}'
This process worked exactly as expected. In my container registry in the portal I have a version named 'v1.1'.
The issue I am facing is during the release phase of this image.
In the tag for the release I have $(Build.BuildNumber), but of course, since I am setting a parameter to build that image, I don't have a build number but v1.1.
I have been reading around that I can override the build number with a specific variable name in the yaml file, mapping Build.BuildNumber to the parameter, so that in the release pipeline I can leave Build.BuildNumber as a reference. But it is not 100% clear to me how this can be done.
You could use GitVersion to generate a version number from your repository tags.
This could be used from your pipeline as follows:
- task: UseDotNet@2
  displayName: Use .NET Core CLI
  inputs:
    version: "6.x"

- task: DotNetCoreCLI@2
  displayName: Install GitVersion
  continueOnError: true
  inputs:
    command: "custom"
    custom: "tool"
    arguments: "install GitVersion.Tool --version 5.* --global"

- task: PowerShell@2
  displayName: Set build version
  inputs:
    targetType: "inline"
    script: |
      Write-Host "##vso[build.updatebuildnumber]$(dotnet-gitversion /showvariable FullSemVer)"
The above example will set the Build.BuildNumber variable to a version string based on the last tag. You can, of course, set some other variable instead.
You can also customise the way GitVersion chooses a version number by adding a GitVersion.yml file to your repository.
This answer should serve as a starting point for you. It doesn't produce exactly the format you want; you will need to look at the configuration options and such if you want a very specific format.
Also, look at the different modes that GitVersion can run in, since they will affect the version numbers generated for any commits in between tagged commits!
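If you only need the raw tag name rather than a computed SemVer, a simpler sketch (an assumption on my part, not part of the GitVersion answer): when a run is triggered by a tag push, Build.SourceBranch is refs/tags/<tag>, so the tag can be stripped out and used both as a pipeline variable and as the build number:

- bash: |
    # refs/tags/v1.0 -> v1.0
    tag="${BUILD_SOURCEBRANCH#refs/tags/}"
    echo "##vso[task.setvariable variable=gitTag]$tag"
    echo "##vso[build.updatebuildnumber]$tag"
  displayName: Use the pushed git tag as the image tag
  condition: startsWith(variables['Build.SourceBranch'], 'refs/tags/')

# Then in the Docker task:
#   tags: $(gitTag)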

Using Azure Devops yaml pipelines to deploy to on-prem servers

When using Azure DevOps pipelines, is it possible to deploy to on-prem servers using a yaml pipeline?
I have deployed to on-premises servers with a Release (classic) pipeline using deployment groups, and I have seen instructions on deploying to Azure infrastructure using yaml pipelines.
However, I can't find any examples of how to deploy to on-prem servers using yaml pipelines. Is this possible yet? If so, are there any examples of how to achieve this?
As already explained in the previous answers, you need to create a new environment and add VMs to the environment (see documentation).
Using a deployment job, you also need to specify the resourceType
- deployment: VMDeploy
  displayName: Deploy to VM
  environment:
    name: ContosoDeploy
    resourceType: VirtualMachine
  ...
If you have multiple VMs in this environment, the job steps will be executed on all the VMs.
To target specific VMs, you can add tags (see documentation).
jobs:
- deployment: VMDeploy
  displayName: Deploy to VM
  environment:
    name: ContosoDeploy
    resourceType: VirtualMachine
    tags: windows,prod # only deploy to virtual machines with both windows and prod tags
  ...
Yes you can. In YAML pipelines you can use "Environments" as a replacement for deployment groups. They work in a similar way, in that you install the agent on the target machine and then specify the environment in your deployment job.
Create a new Environment with an appropriate name (e.g. Dev), then add a resource; you'll be able to add either a VM or a Kubernetes cluster. Assuming that you choose VM, you will be able to download a script which you can run on target machines to install the deployment agent. This script installs and registers the agent into the environment.
Once you have the agent registered in the environment, add a deployment job to your YAML:
- stage: DeployToDev
  displayName: Deploy to Dev
  jobs:
  - deployment: DeployWebSite
    displayName: Deploy web site to Dev
    environment: Dev
    strategy:
      runOnce:
        deploy:
          steps:
          - task: PowerShell@2
            inputs:
              targetType: 'inline'
              script: |
                Write-Host "Deploy my code"

Azure Pipelines. Run script from resource repo

I have a yaml file for the Azure pipeline in a repo, and I need to run a powershell script from a different repo.
As far as I understand, I can add the side repo to the resources section in the yaml and then use the ShellScript@2 task with its scriptPath parameter. But as I understand it, that path is resolved relative to the repo in which the yaml is placed, and I'm not sure how I can access a file from a different repo.
Yes, you have to use a repository resource and check out that repo as follows:
resources:
  repositories:
  - repository: devops
    type: github
    name: kmadof/devops-templates
    endpoint: kmadof

steps:
- checkout: self
- checkout: devops

- task: ShellScript@2
  inputs:
    scriptPath: $(Agent.BuildDirectory)/devops/scripts/some-script.sh

Is it possible to reference files inside Azure DevOps pipeline templates when these templates reside in a standalone repo?

I'm setting up several pipelines in Azure DevOps. To make my team's life easier, I'm using job templates.
These job templates are in a proper repository, just for them.
For every pipeline I define the repository to get the templates from.
Some tasks in these templates run powershell code, and I want this code to be in a script file, so it is reusable and stored in the same repo as the template.
When the pipeline runs, the template is embedded and it tries to locate the powershell script inside the project repo actually being built/deployed.
How can I achieve this?
The workaround is to have inline code, which I really don't want to have.
Any constructive answer will be very appreciated.
Thanks
After some digging I couldn't find any way to specify a script file as the source for a powershell task in a template.
Inside the pipeline definition:
resources:
  repositories:
  - repository: templates
    type: git
    name: deploy-templates

variables:
  artifactName: 'Trade Data ETL - $(Build.SourceBranchName)'

stages:
- stage: Build
  displayName: Build
  variables:
  - group: DEV-Credential-Group
  - group: COMMON-Settings-Group
  jobs:
  - template: ssis/pipelines/stage-build.yml@templates # Template reference
    parameters:
      artifactName: '$(artifactName)'
Inside the template file:
- task: PowerShell@2
  inputs:
    filePath: ssis/pipelines/scripts/build-ssis-project.ps1
    arguments: '-ProjectToBuild "tradedata-ldz-ssis/tradedata-ldz-ssis.dtproj"'
    pwsh: true
Update 2021
According to learn.microsoft.com, you can now also check out multiple repositories without custom scripting.
If you check out more than one repository, a separate folder containing the repository is created below $(Build.SourcesDirectory).
You can define multiple repositories like this:
resources:
  repositories:
  - repository: devops
    type: git
    name: DevOps
    ref: main
  - repository: infrastructure
    type: git
    name: Infrastructure
    ref: main
And in the steps simply check them out as follows:
steps:
- checkout: self
- checkout: devops
- checkout: infrastructure

# List all available repositories
- script: ls
Original Answer
Currently the resources section only supports yml files in other repositories. However, you could simply check out the repository in a task and then run the desired powershell script.
steps:
- task: PowerShell@2
  inputs:
    targetType: inline
    script: |
      git clone -b <your-desired-branch> https://azuredevops:$($env:token)@dev.azure.com/<your-organization>/<your-project>/_git/<your-repository> <target-folder-name>
      ./<target-folder-name>/foo.ps1
  env:
    token: $(System.AccessToken)
This script would checkout an arbitrary branch and execute a script foo.ps1 in the root of the target repository.
Call - checkout: templates inside the template file. This might only work when you insert a template, but it successfully sees the repository resource and pulls it down.
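A minimal sketch of that approach, assuming the templates repository resource from the pipeline above and that the multi-repo checkout places it in a deploy-templates folder under $(Build.SourcesDirectory):

# inside the template file
steps:
- checkout: self
- checkout: templates
- task: PowerShell@2
  inputs:
    filePath: $(Build.SourcesDirectory)/deploy-templates/ssis/pipelines/scripts/build-ssis-project.ps1
    pwsh: true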
You can copy the script files from the source directory. Currently, you have not mentioned the root folder for
ssis/pipelines/scripts/build-ssis-project.ps1
Assuming you are building in a repo where the powershell script resides, try:
- task: PowerShell@1
  inputs:
    scriptName: '$(ScriptsDir)/ssis/pipelines/scripts/build-ssis-project.ps1'
Pass in the value of ScriptsDir, which could be the build sources directory or the build working directory.
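For example, a minimal sketch of how ScriptsDir might be defined (an illustrative assumption, not from the original answer):

variables:
  ScriptsDir: $(Build.SourcesDirectory)  # or $(System.DefaultWorkingDirectory)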
