I'm used to using the classic DevOps "Release" pipelines for deploying code changes to Kubernetes clusters. Recently I've been looking into switching to Azure Pipelines "deployment" jobs together with "Environments". It seems to work really well and I like a lot of the features, like being able to inspect the Kubernetes entities associated with your deployments and track the deployment history.
Something that I'm accustomed to from the classic Release pipelines is rolling back to an old deployment if it is discovered that a bug has been released (to production for example). Since Release pipelines are based on build artifacts, you simply run the deployment on the old artifact in the Releases UI.
Now, using deployments under the Environments tab, I'm not sure how to run a rollback, short of actually making a code change to revert to the old state (and needlessly run through CI builds again). Another option, since the deployment is done relative to the code (or commit) rather than an artifact, is to manually run a new pipeline and target the given commit - but this is quite cumbersome to achieve in the DevOps UI, and seems prone to errors. In my opinion rolling back should be really easy to achieve, and not prone to errors.
Any ideas how to do this? Here is a sample of my YAML file:
trigger:
  batch: true
  branches:
    include:
      - master

pr:
  branches:
    include:
      - master

variables:
  azureContainerRegistry: <registryUrl>
  azureContainerRegistryServiceConnection: <serviceConnection>
  kubernetesConfigPath: kubernetes
  kubernetesNamespace: <my-namespace>
  major: 0
  buildNumber: $(major).$(Build.BuildId)
  imageName: "$(azureContainerRegistry)/<my-app>:$(buildNumber)"

stages:
  - stage: Bake
    displayName: "Build and Push image"
    jobs:
      - job: Validate
        displayName: "Build image"
        pool:
          name: "Docker"
        steps:
          - script: docker build -t $(imageName) .
            displayName: Build App
      - job: Publish
        displayName: "Push image"
        dependsOn: Validate
        condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/master'))
        pool:
          name: "Docker"
        steps:
          - task: Docker@2
            displayName: Login to Container Registry
            inputs:
              command: login
              containerRegistry: $(azureContainerRegistryServiceConnection)
          - script: docker push $(imageName)
            displayName: PUSH $(imageName)
  - stage: DeployTest
    displayName: "Deploy TEST"
    dependsOn: Bake
    condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/master'))
    jobs:
      - deployment: Deploy
        environment: <my-test-env>.$(kubernetesNamespace)
        pool:
          name: "Docker"
        strategy:
          runOnce:
            deploy:
              steps:
                - task: qetza.replacetokens.replacetokens-task.replacetokens@3
                  displayName: "Replace tokens"
                  inputs:
                    targetFiles: $(kubernetesConfigPath)/base/*.yaml
                    escapeType: none
                    tokenPrefix: "{"
                    tokenSuffix: "}"
                - task: Kubernetes@1
                  displayName: "kubectl apply"
                  inputs:
                    namespace: $(kubernetesNamespace)
                    command: apply
                    arguments: -k $(kubernetesConfigPath)/test
                    versionSpec: 1.7.0
                    checkLatest: true
                - task: Kubernetes@1
                  displayName: "kubectl rollout status"
                  inputs:
                    namespace: $(kubernetesNamespace)
                    command: rollout
                    arguments: "status deployments/<my-app>"
                    versionSpec: 1.7.0
                    checkLatest: true
  - stage: DeployProd
    displayName: "Deploy PROD"
    dependsOn: DeployTest
    condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/master'))
    jobs:
      - deployment: Deploy
        environment: <my-prod-env>.$(kubernetesNamespace)
        pool:
          name: "Docker"
        strategy:
          runOnce:
            deploy:
              steps:
                - task: qetza.replacetokens.replacetokens-task.replacetokens@3
                  displayName: "Replace tokens"
                  inputs:
                    targetFiles: $(kubernetesConfigPath)/base/*.yaml
                    escapeType: none
                    tokenPrefix: "{"
                    tokenSuffix: "}"
                - task: Kubernetes@1
                  displayName: "kubectl apply"
                  inputs:
                    namespace: $(kubernetesNamespace)
                    command: apply
                    arguments: -k $(kubernetesConfigPath)/prod
                    versionSpec: 1.7.0
                    checkLatest: true
                - task: Kubernetes@1
                  displayName: "kubectl rollout status"
                  inputs:
                    namespace: $(kubernetesNamespace)
                    command: rollout
                    arguments: "status deployments/<my-app>"
                    versionSpec: 1.7.0
                    checkLatest: true
Redeploying from the older version of the code is the way to do it.
"one could manually run a new pipeline and target the given commit - but this is quite cumbersome to achieve in the DevOps UI, and seems prone to errors"
This is why you need a well-organised source control branching and tagging policy. If you previously deployed from branch "releases/release-20212710.01", then deployed from branch "releases/release-20212710.02", you don't need to make any code changes. Rolling back just means selecting the older branch – which still exists, with the same code as before – and deploying.
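As an illustrative sketch only (the releases/* branch naming is an assumption, not something from the question), the asker's pipeline could accept release branches so that a run queued against an older release branch redeploys exactly that code, with no new commits:
# Sketch: also accept releases/* branches in the CI trigger...
trigger:
  batch: true
  branches:
    include:
      - master
      - releases/*

# ...and let the deploy stages run for release branches as well, so that a
# rollback is just a (manual) run against the previous releases/* branch:
condition: and(succeeded(), or(eq(variables['Build.SourceBranch'], 'refs/heads/master'), startsWith(variables['Build.SourceBranch'], 'refs/heads/releases/')))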
Can we have both build and release in the same YAML script in Azure DevOps? If yes, can someone help me with a sample script? For the release part, we are deploying to multiple environments.
There are multiple ways to initiate builds and releases of our configurations via Azure DevOps.
First, we'll look at building CI and CD as two different YAML files and create two pipelines to automate our tasks. For that we'll be using resources in YAML pipelines.
Now we create the build pipeline in a new DevOps project and repository, adding the YAML below as build-pipeline.yml:
trigger:
  branches:
    include:
      - master
  paths:
    exclude:
      - build-pipeline.yml
      - release-pipeline.yml

variables:
  vmImageName: 'ubuntu-latest'

jobs:
  - job: Build
    pool:
      vmImage: $(vmImageName)
    steps:
      - script: |
          echo 'do some unit test'
        displayName: 'unit test'
      - script: |
          echo 'compile application'
        displayName: 'compile'
      - task: ArchiveFiles@2
        displayName: 'Archive files'
        inputs:
          rootFolderOrFile: '$(System.DefaultWorkingDirectory)'
          includeRootFolder: false
          archiveType: zip
          archiveFile: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
          replaceExistingArchive: true
      - upload: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
        artifact: drop
Similarly, we create a release pipeline; below is its YAML:
# Explicitly disable the repository trigger
trigger: none

resources:
  pipelines:
    - pipeline: myappbuild           # Name of the pipeline resource
      source: myapp-build-pipeline   # Name of the triggering pipeline
      trigger:
        branches:
          - master

variables:
  vmImageName: 'ubuntu-latest'

jobs:
  - deployment: Deploy
    displayName: Deploy
    environment: dev
    pool:
      vmImage: $(vmImageName)
    strategy:
      runOnce:
        deploy:
          steps:
            - download: myappbuild
              artifact: drop
            - task: ExtractFiles@1
              inputs:
                archiveFilePatterns: '$(Pipeline.Workspace)/myappbuild/drop/$(resources.pipeline.myappbuild.runID).zip'
                destinationFolder: '$(Agent.BuildDirectory)'
                cleanDestinationFolder: false
            - script: |
                cat $(Agent.BuildDirectory)/greatcode.txt
Here the build pipeline runs first, then the release pipeline runs; the release part can be linked to multiple deployment jobs for multiple environments, as sketched below.
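For instance, a hedged sketch of how the release pipeline's jobs section could be extended to cover several environments (the DeployDev/DeployProd names and the dev/prod environments are placeholders, not part of the original answer):
jobs:
  - deployment: DeployDev
    displayName: Deploy to dev
    environment: dev
    pool:
      vmImage: $(vmImageName)
    strategy:
      runOnce:
        deploy:
          steps:
            # the build pipeline's artifact is downloaded via the pipeline resource
            - download: myappbuild
              artifact: drop
  - deployment: DeployProd
    displayName: Deploy to prod
    dependsOn: DeployDev
    environment: prod
    pool:
      vmImage: $(vmImageName)
    strategy:
      runOnce:
        deploy:
          steps:
            - download: myappbuild
              artifact: drop
Approvals and checks on each environment can then gate the promotion from dev to prod.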
There is also more information in a few useful blog posts, such as those on c-sharpcorner and adamtheautomater. Thanks to the bloggers.
I'm having real issues with a pipeline: every time someone commits or pushes something to any branch of our repo, the pipeline triggers, even though I'm following the Microsoft doc: https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/azure-repos-git?view=azure-devops&tabs=yaml#ci-triggers
Even with exclude entries for every branch that we have, the pipeline still runs when someone commits to another branch, even though I have wildcarded the branch names.
Has anyone been able to get this to work, so that the pipeline only runs when there is a commit to master and nothing else?
Here is my code:
trigger:
  branches:
    include:
      - master
    exclude:
      - CICV/*
      - An/*
      - Prod/*
      - Test/*
      - Dev/*
      - dev/*
      - IN/*
      - id/*
      - St/*
      - tr/*

pool:
  vmImage: 'windows-latest'
  demands: npm

variables:
  System.Debug: false
  azureSubscription: 'RunPipelinesInProd'
  RG: 'VALUE'
  Location: UK South
  containername: 'private'
  appconnectionname: 'RunPipelinesInProd'

jobs:
  - job: job1
    displayName: Create And Publish Artifact
    pool:
      vmImage: vs2017-win2016
    steps:
      - task: UseDotNet@2
        displayName: Use .Net Core 3.1.x SDK
        inputs:
          packageType: 'sdk'
          version: '3.1.x'
      - task: DotNetCoreCLI@2
        displayName: dotnet restore
        inputs:
          command: restore
          projects: 'Website.csproj'
      - task: Npm@1
        displayName: 'npm install'
        inputs:
          workingDir: ClientApp
          verbose: false
      - task: Npm@1
        displayName: 'npm run build'
        inputs:
          command: 'custom'
          workingDir: ClientApp
          customCommand: 'build'
      - task: DotNetCoreCLI@2
        displayName: dotnet build
        inputs:
          projects: 'Website.csproj'
          arguments: '--configuration Release'
      - task: DotNetCoreCLI@2
        displayName: dotnet Test
        inputs:
          command: test
          projects: 'UnitTests/UnitTests.csproj'
          arguments: '--configuration Release'
      - task: DotNetCoreCLI@2
        displayName: dotnet publish
        inputs:
          command: publish
          projects: 'Website.csproj'
          arguments: '--configuration Release --output $(Build.ArtifactStagingDirectory)'
          zipAfterPublish: true
          modifyOutputPath: false
      - task: PublishPipelineArtifact@1
        displayName: Publish Pipeline Artifact
        inputs:
          targetPath: '$(Build.ArtifactStagingDirectory)'
          artifact: 'Website'
          publishLocation: 'pipeline'
  - job: job2
    displayName: Create Web App
    dependsOn: job1
    steps:
      # Download Artifact File
      - download: none
      - task: DownloadPipelineArtifact@2
        displayName: 'Download Build Artifacts'
        inputs:
          patterns: '**/*.zip'
          path: '$(Build.ArtifactStagingDirectory)'
      # deploy to Azure Web App
      - task: AzureWebApp@1
        displayName: 'Azure Web App Deploy: nsclassroom-dgyn27h2dfoyo'
        inputs:
          package: $(Build.ArtifactStagingDirectory)/**/*.zip
          azureSubscription: $(azureSubscription)
          ConnectedServiceName: $(appconnectionname)
          appName: 'VALUE'
          ResourceGroupName: $(RG)
You don't need a complex trigger like the one you outlined to trigger the pipeline on pushes to master. The following simple trigger configuration should work:
trigger:
- master
If there's anything in the include section, then only pushes to these branches trigger the build. If you specify both include and exclude sections, then it will try to exclude some subset from the include set - just like in the sample from the docs:
# specific branch build
trigger:
  branches:
    include:
      - master
      - releases/*
    exclude:
      - releases/old*
If the pipeline is still triggered by the push to some other branch, then it must be something else that triggers it.
As mentioned by @yan-sklyraneko in this answer, your trigger configuration should be as simple as
trigger:
- master
However, the trigger in your YAML file can be overridden in the GUI. Navigate to your pipeline, click Edit, then click the ellipsis menu and select Triggers.
On that screen, check that the "Override the YAML continuous integration trigger from here" box is unticked.
I solved this in the end; I went down the route of managing the trigger through the Azure DevOps portal.
It seems that if you try to manage this purely in YAML it just doesn't work, but if you do it through the web interface as outlined in the answer above, the behaviour is as expected. I think the YAML handling is broken for this, but I already have three issues open with Microsoft and don't wish to add another one to follow and tag.
So I've finished the backend and frontend parts of my project.
A big aspect of my project is a scraper function, which is implemented on the backend side. Right now I need to open VS Code every day and run a function which triggers the scrapers. I've researched this, and Azure has Function Apps, which support scheduled functions.
What I want is to just run a file inside my Azure repo. My backend and frontend are in different repos, and I want to run the file scraping-service.js inside the scraping folder in order to scrape data and insert it into the db.
Normally I run the pipeline with azure-service.yml, which has its own configuration for running my project. Is there any way to run just scraping-service.js at a certain time of the day?
Thanks!
"Is there any way to run just scraping-service.js at a certain time of the day?"
In Azure Pipelines, you can set schedules for pipelines.
Here is a doc with the details: Configure schedules for pipelines.
For example, in a YAML pipeline you can set a cron schedule:
trigger:
  - main

variables:
  # Azure Resource Manager connection created during pipeline creation
  azureSubscription: 'xxx'
  # Web app name
  webAppName: 'stanbackapp'
  # Environment name
  environmentName: 'stanbackapp'
  # Agent VM image name
  vmImageName: 'ubuntu-latest'

schedules:
  - cron: "0 0 * * *"
    displayName: Daily midnight build
    branches:
      include:
        - main

stages:
  - stage: Build
    displayName: Build stage
    condition: ne(variables['Build.Reason'], 'Schedule')
    jobs:
      - job: Build
        displayName: Build
        pool:
          vmImage: $(vmImageName)
        steps:
          - task: NodeTool@0
            inputs:
              versionSpec: '10.x'
            displayName: 'Install Node.js'
          - script: |
              npm install
              npm run build --if-present
              npm run test --if-present
            displayName: 'npm install, build and test'
          - task: ArchiveFiles@2
            displayName: 'Archive files'
            inputs:
              rootFolderOrFile: '$(System.DefaultWorkingDirectory)'
              includeRootFolder: false
              archiveType: zip
              archiveFile: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
              replaceExistingArchive: true
          - upload: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
            artifact: drop
  - stage: Build1
    displayName: Build1 stage
    condition: eq(variables['Build.Reason'], 'Schedule')
    jobs:
      - job: Build1
        displayName: Build1
        pool:
          vmImage: $(vmImageName)
        steps:
          - task: NodeTool@0
            inputs:
              versionSpec: '10.x'
            displayName: 'Install Node.js'
          - task: CmdLine@2
            inputs:
              script: 'node $(Build.SourcesDirectory)/scraping/scraping-service.js'
  - stage: Deploy
    displayName: Deploy stage
    dependsOn: Build
    condition: succeeded()
    jobs:
      - deployment: Deploy
        displayName: Deploy
        environment: $(environmentName)
        pool:
          vmImage: $(vmImageName)
        strategy:
          runOnce:
            deploy:
              steps:
                - task: AzureWebApp@1
                  displayName: 'Azure Web App Deploy: stanbackapp'
                  inputs:
                    azureSubscription: $(azureSubscription)
                    appType: webAppLinux
                    appName: $(webAppName)
                    runtimeStack: 'NODE|10.10'
                    package: $(Pipeline.Workspace)/drop/$(Build.BuildId).zip
                    startUpCommand: 'npm run dev'
In this case, when the pipeline is triggered by the schedule it runs only the single file (the Build1 stage); otherwise it runs the whole pipeline (the Build and Deploy stages).
You can also add the condition at the task level, as sketched below.
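For example, a minimal sketch of a task-level condition (the script path follows the example above):
steps:
  - task: CmdLine@2
    displayName: Run the scraper only on scheduled runs
    condition: eq(variables['Build.Reason'], 'Schedule')
    inputs:
      script: 'node $(Build.SourcesDirectory)/scraping/scraping-service.js'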
The normal way is to create an Azure DevOps pipeline that deploys the code you want to run to an Azure Function whenever the code changes. So put the code of scraping-service.js in the function, or have the function call the method in scraping-service.js. See the docs.
Although there might be ways to run the code in an Azure DevOps pipeline, I don't think it is meant to run application code. You won't have the monitoring capabilities an Azure Function gives you, nor the scaling, configuration and all the other things Azure Functions provide.
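If you go that route, here is a hedged sketch of the deployment step for a Function App (the service connection, app name and package path are placeholders, not values from the question; the time-of-day scheduling would then live in the function's own timer trigger configuration, not in the pipeline):
steps:
  - task: AzureFunctionApp@1
    displayName: Deploy scraper to a timer-triggered Function App
    inputs:
      azureSubscription: '<service-connection>'   # placeholder service connection
      appType: functionAppLinux
      appName: '<scraper-function-app>'           # placeholder Function App name
      package: '$(Build.ArtifactStagingDirectory)/scraper.zip'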
I have the following pipeline
variables:
  azureSubscription: ...

stages:
  - stage: Deploy
    displayName: Deploy stage
    dependsOn: Build
    condition: succeeded()
    jobs:
      - deployment: Deploy
        displayName: Deploy
        environment: 'development'
        strategy:
          runOnce:
            deploy:
              steps:
                - task: AzureResourceGroupDeployment@2
                  inputs:
                    action: 'Create Or Update Resource Group'
                    resourceGroupName: '...'
                    location: '...'
                    templateLocation: 'Linked artifact'
                    csmFile: '$(Pipeline.Workspace)/azure-deploy.json'
                    deploymentMode: 'Incremental'
The repo has the following files (at the root directory)
azure-pipelines.yaml
azure-deploy.json
and only a master branch.
I have tried:
azure-deploy.json
**azure-deploy.json
**/*azure-deploy.json
$(Build.SourcesDirectory)/azure-deploy.json
$(Pipeline.Workspace)/azure-deploy.json
$(System.DefaultWorkingDirectory)/azure-deploy.json
Having read:
Azure Pipeline Error: Could not find any file matching the template file pattern
VSTS Pipeline Deployment of ARM Error: Could not find any file matching the template file pattern
https://github.com/microsoft/azure-pipelines-tasks/issues/11520
to no avail. Any ideas?
Update: I have added a publish step as suggested by @ShaykiAbramczyk.
Now I get: Template file pattern matches a directory instead of a file: /home/vsts/work/1/azure-deploy.json
- stage: Build
  displayName: Build stage
  jobs:
    - job: Build
      displayName: Build
      pool:
        vmImage: $(vmImageName)
      steps:
        - task: PublishPipelineArtifact@1
          inputs:
            targetPath: '$(Pipeline.Workspace)'
            artifact: 'azure-deploy.json'
            publishLocation: 'pipeline'
"A deployment job doesn't automatically clone the source repo. You can checkout the source repo within your job with checkout: self."
Source: https://learn.microsoft.com/en-us/azure/devops/pipelines/process/deployment-jobs?view=azure-devops
Example from my setup: I put checkout: self as the first step, and now my repository is cloned before executing the Azure PowerShell task:
strategy:
  runOnce:
    deploy:
      steps:
        - checkout: self
        - task: AzurePowerShell@5
          displayName: Setup Network
          inputs:
It is a good strategy to go for multi-stage pipelines for what you are doing.
Build is for composing your artifacts.
Deployment jobs are for the publishing part.
So you are on the right track.
If you need sources during the deployment job, use the checkout step to fetch the sources (ref. the repo checkout docs).
Just my two cents.
Because you use a deployment job, the sources from the master branch are not downloaded onto the agent.
You need to publish the files in the build stage and consume them in the deployment job, with pipeline artifacts; a minimal sketch of that approach follows below.
Or just run the AzureResourceGroupDeployment task in a regular job; then the .json file will be on the agent.
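Here is that minimal sketch, assuming an artifact name of deploy-templates (deployment jobs download pipeline artifacts into $(Pipeline.Workspace) automatically). Publishing only the file, rather than the whole $(Pipeline.Workspace), also avoids the "Template file pattern matches a directory" error from the update above:
stages:
  - stage: Build
    jobs:
      - job: Build
        steps:
          # publish only the ARM template, not the whole workspace
          - publish: azure-deploy.json
            artifact: deploy-templates
  - stage: Deploy
    dependsOn: Build
    condition: succeeded()
    jobs:
      - deployment: Deploy
        environment: 'development'
        strategy:
          runOnce:
            deploy:
              steps:
                - task: AzureResourceGroupDeployment@2
                  inputs:
                    action: 'Create Or Update Resource Group'
                    resourceGroupName: '...'
                    location: '...'
                    templateLocation: 'Linked artifact'
                    csmFile: '$(Pipeline.Workspace)/deploy-templates/azure-deploy.json'
                    deploymentMode: 'Incremental'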
My objective was to create a blank starter resource group in Azure and nothing else, and this was repeatedly giving me the error
Template file pattern matches a directory instead of a file: /home/vsts/work/1/s
and that's what got me here.
I finally sorted that out with Stringfello's
- checkout: self
step.
My full pipeline is as follows.
variables:
  - name: finalBuildArtifactName
    value: 'aspNetCoreDropFromYaml123'
  - name: BuildParameters.RestoreBuildProjects
    value: '**/*.csproj'
  - name: BuildParameters.TestProjects
    value: '**/*[Tt]ests/*.csproj'
  - name: buildConfiguration
    value: 'Release'

trigger:
  - master

name: $(date:yyyyMMdd)$(rev:.r)

stages:
  - stage: Build ## Basically prints out some vars and copies the template files.
    jobs:
      - job: buildWebApp
        displayName: Build Stage for somepurpose
        pool:
          vmImage: ubuntu-latest
        steps:
          - script: |
              echo build.artifactstagingdirectory and build.buildnumber are as follows.
              echo $(build.artifactstagingdirectory) $(build.buildnumber)
              echo $(projects)
              echo $(BuildConfiguration)
              echo Pipeline.Workspace is $(Pipeline.Workspace)
              echo The current branch is - $(Build.SourceBranchName)!!.
              echo $(finalBuildArtifactName)
              echo "This is the build pipe line. This produces the necessary artifacts for subsequent release pipeline."
            displayName: 'Command Line Script to write out some vars'
          - powershell: |
              # Write your PowerShell commands here.
              Write-Host "This is from power shell command task"
              Write-Host "This writes out the env vars"
              get-childitem -path env:*
            displayName: 'PowerShell script to write out env vars'
          # The following task is needed. This copies the arm template files.
          # Created these templates from Visual studio 2019 as follows.
          # Right click your solution and Add -> New Project -> Azure Resource Group and gave the name Vivek_Aks_Rg
          - task: CopyFiles@2
            inputs:
              SourceFolder: 'iac/ArmTemplates/Vivek_Aks_Rg/'
              Contents: 'azuredeploy*.json'
              TargetFolder: '$(build.artifactStagingDirectory)/ArmTemplates'
          - task: PublishBuildArtifacts@1
            displayName: Publish Artifact
            condition: succeededOrFailed()
            inputs:
              PathtoPublish: '$(build.artifactstagingdirectory)'
              ArtifactName: '$(finalBuildArtifactName)'
              PublishLocation: 'Container'
  - stage: DeployToDev
    displayName: Deploy to Dev Env
    jobs:
      - deployment:
        pool:
          vmImage: ubuntu-latest
        environment: Dev
        strategy:
          runOnce:
            deploy:
              steps:
                - checkout: self
                - task: AzureResourceManagerTemplateDeployment@3
                  displayName: 'Create Azure App Service in a Given Resource Group'
                  inputs:
                    deploymentScope: 'Subscription'
                    azureResourceManagerConnection: 'Pay-As-You-Go(123YourSubscriptionId)'
                    subscriptionId: '123YourSubscriptionId'
                    action: 'Create Or Update Resource Group'
                    resourceGroupName: 'YourResourceGroupName'
                    location: 'Central India'
                    csmFile: '$(Pipeline.Workspace)/$(finalBuildArtifactName)/ArmTemplates/azuredeploy.json'
                    csmParametersFile: '$(Pipeline.Workspace)/$(finalBuildArtifactName)/ArmTemplates/azuredeploy.parameters.json'
                    deploymentMode: 'Incremental'
I'm struggling to make my multi-stage pipeline run a .exe file via an agent running on an Azure VM.
My .yaml file is:
trigger:
  - develop

stages:
  - stage: build
    displayName: Build
    jobs:
      - job: buildJob
        pool:
          vmImage: 'ubuntu-16.04'
        variables:
          buildConfiguration: 'Release'
        steps:
          - task: NuGetToolInstaller@1
            inputs:
              versionSpec: '5.5.0'
          - task: DotNetCoreCLI@2
            displayName: 'Dotnet Build $(buildConfiguration)'
            inputs:
              command: 'build'
              arguments: '--configuration $(buildConfiguration)'
              projects: '**/TestProj.csproj'
          - task: DotNetCoreCLI@2
            displayName: "Publish"
            inputs:
              command: 'publish'
              publishWebProjects: false
              projects: '**/TestProj.csproj'
              arguments: '--no-restore --configuration $(BuildConfiguration) --output $(Build.ArtifactStagingDirectory)'
              zipAfterPublish: false
          - task: PublishBuildArtifacts@1
            inputs:
              PathtoPublish: '$(Build.ArtifactStagingDirectory)'
              ArtifactName: 'drop'
              publishLocation: Container
  - stage: Release
    displayName: Release
    dependsOn: build
    jobs:
      - deployment: AzureVMDeploy
        displayName: agentDeploy
        environment:
          name: AzureDeploy
          resourceName: vmName
          resourceType: VirtualMachine
          tags: develop
This VM is registered in an Azure Pipelines environment. After I run this pipeline, the folder is downloaded onto the VM, but I cannot figure out how to automate the execution of the output .exe file in this folder.
I think the way to do it is to create a job with a task for this, but I cannot figure out how to make the agent installed on the VM run that task.
How can I do that?
If I understood you correctly, you want to execute your artifact file which was deployed to the VM.
I think the PowerShell on Target Machines task should do the job for you. You can write a simple inline script to execute your file. However, you need to have PowerShell remoting configured on the VM. This article may help you with this. A hedged sketch is shown below.
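This sketch assumes version 3 of the task; the machine address, credential variables and executable path are placeholders, and WinRM remoting must already be enabled on the target VM:
steps:
  - task: PowerShellOnTargetMachines@3
    displayName: Run the deployed executable on the VM
    inputs:
      Machines: '<vm-dns-or-ip>:5986'     # placeholder target machine
      UserName: '$(vmAdminUser)'          # placeholder secret variables
      UserPassword: '$(vmAdminPassword)'
      ScriptType: 'Inline'
      InlineScript: |
        # placeholder path to the published .exe on the VM
        & 'C:\deploy\TestProj\TestProj.exe'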
You can specify the tasks in the strategy of a deployment job. For example:
stages:
  - stage: build
    jobs:
      - job: buildJob
        pool:
          vmImage: 'Ubuntu-16.04'
        steps:
          - task: PublishPipelineArtifact@1
            inputs:
              targetPath: '$(Pipeline.Workspace)'
              publishLocation: 'pipeline'
  - stage: deploy
    dependsOn: build
    jobs:
      - deployment: DeployWeb
        displayName: deploy Web App
        environment:
          name: vm1
          resourceType: virtualmachine
        strategy:
          runOnce:
            deploy:
              steps:
                - script: echo my first deployment
                - task: CmdLine@2
                  inputs:
                    script: 'more README.md'
                    workingDirectory: '$(Pipeline.Workspace)/build.buildJob/s'
For this YAML pipeline, I publish all files in the pipeline workspace to an artifact in the build stage. This artifact is then downloaded onto the target virtual machine of the vm1 environment in the deploy stage (the folder name will be {stage name}.{job name}), and a command line task prints a file's content. (The script and command line tasks run on that virtual machine.) A sketch of running a published .exe the same way follows below.
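Applied to the original question, here is a hedged sketch of a deploy step that runs the published .exe from the downloaded artifact; the artifact folder and executable names are placeholders, following the {stage name}.{job name} convention described above:
strategy:
  runOnce:
    deploy:
      steps:
        - task: CmdLine@2
          displayName: Run the published executable on the VM
          inputs:
            script: 'TestProj.exe'
            # placeholder path: <stage>.<deployment job>/<artifact name>
            workingDirectory: '$(Pipeline.Workspace)/Release.AzureVMDeploy/drop'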