Adding a Conditional Step in an Azure Pipeline for Rollback of a Kubernetes Deployment

I have an Azure pipeline that deploys 2 Kubernetes deployments (manifest files) to AKS. I am not using Helm and am at an intermediate level with AKS.
The tasks are something like this (for various reasons I am deploying one file at a time, in separate tasks):
- task: Kubernetes@1
  inputs:
    connectionType: 'Azure Resource Manager'
    azureSubscriptionEndpoint: '$(serviceConnection)'
    azureResourceGroup: 'my-aks-rg'
    kubernetesCluster: 'my-aks'
    command: 'apply'
    arguments: '-f $(Pipeline.Workspace)/drop/manifest_1.yaml --record=true'
    secretType: 'dockerRegistry'
    containerRegistryType: 'Azure Container Registry'
- task: Kubernetes@1
  inputs:
    connectionType: 'Azure Resource Manager'
    azureSubscriptionEndpoint: '$(serviceConnection)'
    azureResourceGroup: 'my-aks-rg'
    kubernetesCluster: 'my-aks'
    command: 'apply'
    arguments: '-f $(Pipeline.Workspace)/drop/manifest_2.yaml --record=true'
    secretType: 'dockerRegistry'
    containerRegistryType: 'Azure Container Registry'
Now the issue is: if the first file deploys successfully and the second one fails, I am left with a half-deployed application. In case of any such failure I want to gracefully roll back all the deployments to their previous versions.
I know the rollback commands for K8s:
kubectl rollout undo deployment/app1
kubectl rollout undo deployment/app2
If I add these commands as a CLI task, I am not sure how to make that task execute only when something earlier in the pipeline has failed, and simply skip it when all steps pass.
Thank you in advance.

Since you are deploying one file at a time in separate tasks, this can be achieved through custom conditions. Conditions can be specified on each stage or job; in this scenario we keep custom conditions on each stage. Stage2 verifies the Stage1 status, and if anything fails it will automatically roll back the deployments.
stages:
  - stage: Stage1
    displayName: Stage 1
    dependsOn: []
    condition: and(contains(variables['build.sourceBranch'], 'refs/heads/main'), succeeded())
    jobs:
      - job: ShowVariables
        displayName: KubernetDeployment1
        steps:
          - task: Kubernetes@1
            inputs:
              connectionType: 'Azure Resource Manager'
              azureSubscriptionEndpoint: '$(serviceConnection)'
              azureResourceGroup: 'my-aks-rg'
              kubernetesCluster: 'my-aks'
              command: 'apply'
              arguments: '-f $(Pipeline.Workspace)/drop/manifest_1.yaml --record=true'
              secretType: 'dockerRegistry'
              containerRegistryType: 'Azure Container Registry'
  - stage: Stage2
    displayName: Stage 2
    jobs:
      - job: ShowVariables
        dependsOn: Stage1
        condition: eq(dependencies.Stage1.outputs['DetermineResult.doThing'], 'Yes')
        displayName: KubernetDeployment2
        steps:
          - task: Kubernetes@1
            inputs:
              connectionType: 'Azure Resource Manager'
              azureSubscriptionEndpoint: '$(serviceConnection)'
              azureResourceGroup: 'my-aks-rg'
              kubernetesCluster: 'my-aks'
              command: 'apply'
              arguments: '-f $(Pipeline.Workspace)/drop/manifest_2.yaml --record=true'
              secretType: 'dockerRegistry'
              containerRegistryType: 'Azure Container Registry'
This portion verifies the Stage1 status:
dependsOn: Stage1
condition: eq(dependencies.Stage1.outputs['DetermineResult.doThing'], 'Yes')
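To address the original rollback question directly: Azure Pipelines has built-in condition functions, and a step guarded with failed() runs only when a previous step in the job has failed. A minimal sketch, assuming kubectl on the agent is already authenticated against the AKS cluster (for example by a preceding Kubernetes task in the same job):

```yaml
# Sketch only: the deployment names app1/app2 come from the question;
# kubectl authentication against the cluster is assumed to be set up
# earlier in the job.
- task: Bash@3
  displayName: 'Roll back deployments on failure'
  condition: failed()   # executes only if an earlier step in this job failed
  inputs:
    targetType: 'inline'
    script: |
      kubectl rollout undo deployment/app1
      kubectl rollout undo deployment/app2
```

With the default succeeded() condition on the apply steps and failed() on this one, the rollback step is skipped whenever everything passes.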

Related

Azure DevOps pipelines - how to execute a whole template stage in a docker container by a self-hosted agent

I have a multi-stage Azure DevOps pipeline in YAML. Each stage includes multiple jobs. The pipeline runs properly using Azure-hosted agents; I need to run it on our company's self-hosted agent. The self-hosted agent is a virtual machine scale set (VMSS) defined as an agent pool in DevOps; it has been tested and is working properly. Because the self-hosted agent does not have all the capabilities we need, our company's policy is to run the jobs inside Docker containers. The Docker images reside in our company's private Azure Container Registry.
My question is how can I run a whole template stage in a docker container? Please note that my question is not about executing a job but all the jobs in a stage template. This link explains how to execute jobs inside a container. It doesn't give any examples of how to run all the jobs in a stage in a container, especially when the stage is defined as a template.
My pipeline in simplified form is as follows:
## Name of self-hosted agent
pool: Platform-VMSS
resources:
  containers:
    - container: base
      image: myazurecontainerregistryname.azurecr.io/platform/myimg:1.0.0
      endpoint: 'serviceconnection-acr'
trigger:
  branches:
    include:
      - "feature/ORGUS-4810"
    exclude:
      - "main"
      - "release"
  paths:
    include:
      - "usecase-poc/**"
variables:
  pythonVersion: 3.8
  whlPackageName: py_core
  srcDirectory: usecase-poc/$(whlPackageName)
  ${{ if eq(variables['Build.SourceBranch'], 'refs/heads/main') }}:
    BuildFrom: main
    PackageName: $(whlPackageName)
    versionOption: custom
    deployToDevPipeline: True
    deployToQAPipeline: True
    deployToProdPipeline: True
  ${{ else }}:
    BuildFrom: branch_${{ lower(variables['Build.SourceBranchName'] ) }}
    PackageName: $(whlPackageName)_$(BuildFrom)
    versionOption: patch
    deployToDevPipeline: True
    deployToQAPipeline: False
    deployToProdPipeline: False
  stageName: 'deployToDev'
stages:
  - stage: DownloadArtifact
    displayName: "Download python whl from artifactory"
    jobs:
      - job: DownloadArtifactJob
        steps:
          - checkout: self
          - task: UniversalPackages@0
            displayName: 'Download Artifact with Universal Packages'
            inputs:
              vstsFeed: 'some-value/00000000-0000-0009-0000-d24300000000'
              vstsFeedPackage: '00000000-0000-0000-0000-0000000'
              vstsPackageVersion: 0.8.4
              downloadDirectory: $(Build.ArtifactStagingDirectory)
          - task: Bash@3
            name: GetWheelName_Task
            inputs:
              targetType: 'inline'
              script: |
                echo $(Build.ArtifactStagingDirectory)
                find $(Build.ArtifactStagingDirectory) -name '*.whl'
                ArtifactName=$(find $(Build.ArtifactStagingDirectory) -name '*.whl')
                echo "Artifact name value is " $ArtifactName
                echo "##vso[task.setvariable variable=ArtifactName;isOutput=true]$ArtifactName"
            displayName: 'Get downloaded artifact in source directory'
          - task: PublishBuildArtifacts@1
            displayName: "Publish downloaded artifact to pipeline's output"
            inputs:
              pathToPublish: $(Build.ArtifactStagingDirectory)
              artifactName: whl
  - stage: ConstructSharedVariablesForAllStages
    displayName: Construct Shared Variables For All Stages
    dependsOn: DownloadArtifact
    variables:
      - group: proj-shared-vg
      - name: ArtifactName
        value: $[stageDependencies.DownloadArtifact.DownloadArtifactJob.outputs['GetWheelName_Task.ArtifactName']]
    jobs:
      - job: DownloadArtifact
        container: base
        steps:
          - task: Bash@3
            displayName: "Print variable value"
            inputs:
              targetType: 'inline'
              script: |
                echo $(ArtifactName)
          - task: Bash@3
            name: extractWheelName
            displayName: Extract Wheel Name
            inputs:
              targetType: inline
              script: |
                echo $(ArtifactName) | awk -F"/" '{print $NF}'
                WhlName="py_core-0.8.4-py3-none-any.whl"
                echo "##vso[task.setvariable variable=WhlName;isOutput=true]$WhlName"
          - task: DownloadPipelineArtifact@2
            displayName: "Download artifact from previous stage"
            inputs:
              buildType: 'current'
              project: 'Project Name'
              buildVersionToDownload: 'latest'
              artifactName: whl
              targetPath: '$(System.ArtifactsDirectory)'
          - pwsh: |
              $whlFile = Get-ChildItem -Filter *.whl -Path "$(System.ArtifactsDirectory)" | ForEach-Object { $_.fullname } | Select-Object -First 1
              Write-Host "##vso[task.setvariable variable=whlFile]$whlFile"
            name: SetVars
            displayName: Get wheel name
  ## This is the section where my question is about. How can I make sure each stage runs in the self-hosted agent pool. The stage contains multiple jobs.
  - template: ../templates/azure-pipeline-stage-template.yaml
    parameters:
      deploy: ${{ variables.deployToDevPipeline }}
      variableGroup: databricks-sp-vg-dev
      stageName: DeployToDev
      environment: DBRKS_Dev_WHL
      conditionParameter: deployToDev
      dependsOnStage: ConstructSharedVariablesForAllStages
  - template: ../templates/azure-pipeline-stage-template.yaml
    parameters:
      deploy: ${{ variables.deployToQAPipeline }}
      variableGroup: databricks-sp-vg-qa
      stageName: DeployToQA
      environment: DBRKS_QA_WHL
      conditionParameter: deployToQA
      dependsOnStage: DeployToDev
  - template: ../templates/azure-pipeline-stage-template.yaml
    parameters:
      deploy: ${{ variables.deployToProdPipeline }}
      variableGroup: databricks-sp-vg-prod
      stageName: DeployToProd
      environment: DBRKS_Prod_WHL
      conditionParameter: deployToProd
      dependsOnStage: DeployToQA
In the code above, under resources, the container resides in our Azure Container Registry (ACR); the endpoint is our DevOps service connection of type container registry, used to pull and push images to and from ACR. I have commented in the code where the issue is. In the templates section, where I am referring to stage templates, I would like to run them all inside the container defined as a resource at the beginning of the pipeline. The stage template has multiple jobs. Here is just a sample of the stage template, to emphasize that it has multiple jobs:
The highlighted stage is the one created by template:
- template: ../templates/azure-pipeline-stage-template.yaml
  parameters:
    deploy: ${{ variables.deployToDevPipeline }}
    variableGroup: databricks-sp-vg-dev
    stageName: DeployToDev
    environment: DBRKS_Dev_WHL
    conditionParameter: deployToDev
    dependsOnStage: ConstructSharedVariablesForAllStages
Question is how to enforce all the jobs in the above template run in the docker container defined as resource in our pipeline. Thank you very much for your valuable input.
Add a container field at the job level as shown below. Then all the jobs in the template will run in the specified container.
pool:
  vmImage: ubuntu-latest
resources:
  containers:
    - container: testcontainer
      image: ubuntu
stages:
  - stage: template01
    displayName: template test
    jobs:
      - job: template
        container: testcontainer
        steps:
          - template: templates/template01.yaml
  - stage: template02
    displayName: template test
    jobs:
      - job: template
        container: testcontainer
        steps:
          - template: templates/template02.yaml
Else, add a step-level target field to the required tasks in a template, as described in Build and Release Tasks - Azure Pipelines | Microsoft Learn: https://learn.microsoft.com/en-us/azure/devops/pipelines/process/tasks?view=azure-devops&tabs=yaml#step-target
resources:
  containers:
    - container: pycontainer
      image: python:3.11
steps:
  - task: AnotherTask@1
    target: pycontainer
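Since the question is specifically about stage templates, another option (a sketch, not taken from the answer above; the parameter name jobContainer and the template fragment are hypothetical) is to pass the container name into the template as a parameter, so every job defined in the template references it:

```yaml
# templates/azure-pipeline-stage-template.yaml (hypothetical fragment)
parameters:
  - name: jobContainer
    type: string
    default: base
jobs:
  - job: TemplateJob
    container: ${{ parameters.jobContainer }}   # all steps below run in this container
    steps:
      - script: echo "running inside the container resource"
```

The caller then supplies the container resource name (for example jobContainer: base) in the template parameters, alongside the existing stageName and environment parameters.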

Azure DevOps CI/CD Pipeline failed with exit code 1

I am trying to get my Azure DevOps CI/CD pipeline to push a bunch of images to Azure Container Registry, but it continues to fail with:
The process 'C:\Windows\system32\docker.exe' failed with exit code 1
YAML
# Docker
# Build and push an image to Azure Container Registry
# https://docs.microsoft.com/azure/devops/pipelines/languages/docker
trigger:
  branches:
    include:
      - Initial-K8-Setup
resources:
  - repo: self
    clean: true
# define variables
variables:
  # Container registry service connection established during pipeline creation
  azureSubscription: 'HAWeb2-CFS-NonProd'
  # Container Registry Type
  crType: 'Azure Container Registry'
  # Container Name
  AZURE_CONTAINER_REGISTRY: 'crschacfsnonprod' #.azurecr.io'
  dockerComposeFileArgs: 'REGISTRY=crschacfsnonprod.azurecr.io/'
  dockerComposeFile: '**/docker-compose.yml'
  # Additional Docker File
  addDockerComposeFiles: 'docker-compose.override.yml'
  dockerfilePath: '$(Build.SourcesDirectory)/Dockerfile'
  dockerRegistryServiceConnection: '7739b21f-2572-4f36-ab8b-df15ab821756'
  imageName: aquarium
  imageTag: demo
  # Agent VM image name
  vmImageName: 'windows-2019'
stages:
  - stage: Build
    displayName: Build and push stage
    jobs:
      - job: Build
        displayName: Build
        pool:
          vmImage: $(vmImageName)
        steps:
          # Log into ACR Registry
          - script: |
              docker login crschacfsnonprod.azurecr.io/ -u $(acrUsername) -p $(acrPassword2)
            env:
              acrLoginServer: crschacfsnonprod.azurecr.io
              acrUsername: $(acrUsername)
              acrPassword: $(acrPassword2)
            displayName: 'Docker login to ACR'
          # Build Images
          - task: DockerCompose@0
            inputs:
              containerregistrytype: 'Azure Container Registry'
              azureSubscription: 'HAWeb2-CFS-NonProd'
              azureContainerRegistry: '{"loginServer":"crschacfsnonprod.azurecr.io", "id" : "/subscriptions/5e2acfcb-7aeb-443d-9ea4-bc7553a51276/resourceGroups/rg-ha-cfs-nonprod-001/providers/Microsoft.ContainerRegistry/registries/crschacfsnonprod"}'
              dockerComposeFile: '**/docker-compose.yml'
              additionalDockerComposeFiles: 'docker-compose.override.yml'
              dockerComposeFileArgs: 'REGISTRY=crschacfsnonprod.azurecr.io/'
              action: 'Build services'
              additionalImageTags: '$(Build.BuildNumber)-$(Build.SourceBranchName)'
              includeLatestTag: true
          # Push Images
          - task: DockerCompose@0
            inputs:
              containerregistrytype: 'Azure Container Registry'
              azureSubscription: 'HAWeb2-CFS-NonProd'
              azureContainerRegistry: '{"loginServer":"crschacfsnonprod.azurecr.io", "id" : "/subscriptions/5e2acfcb-7aeb-443d-9ea4-bc7553a51276/resourceGroups/rg-ha-cfs-nonprod-001/providers/Microsoft.ContainerRegistry/registries/crschacfsnonprod"}'
              dockerComposeFile: '**/docker-compose.yml'
              additionalDockerComposeFiles: 'docker-compose.override.yml'
              dockerComposeFileArgs: 'REGISTRY=crschacfsnonprod.azurecr.io/'
              action: 'Push services'
              additionalImageTags: '$(Build.BuildNumber)-$(Build.SourceBranchName)'
              includeLatestTag: true
          # Copy Files
          - task: CopyFiles@2
            inputs:
              SourceFolder: 'k8s'
              Contents: '**'
              TargetFolder: '$(Build.ArtifactStagingDirectory)/k8s'
          # Publish Artifact
          - task: PublishPipelineArtifact@1
            inputs:
              targetPath: '$(Build.ArtifactStagingDirectory)'
              artifact: 'Sitecore-Aquarium-XM1'
              publishLocation: 'pipeline'
Configured a Docker Registry Service Connection
Azure Container Registry
SKU: Premium
Admin User: Enabled
Identity
System assigned: Off
User Assigned: None
Networking:
Public network access: All networks
Now I am unsure how the above pipeline actually adds up... When looking into Repositories within the ACR, something has actually been pushed, so why does the pipeline fail?
What am I doing wrong?
Can anyone clarify what I might have to do, to resolve this?

Run a command in container after deployment in Azure

I'm using Azure for hosting and Azure Pipelines for CI/CD operations
I have an image build and deploy operations defined like that:
- stage: Package
  displayName: 'Package app'
  jobs:
    - job:
      steps:
        - task: Docker@2
          displayName: 'Build image'
          inputs:
            containerRegistry: '$(containerRegistry)'
            repository: '$(containerRepository)'
            command: 'build'
            Dockerfile: './Dockerfile'
            buildContext: '.'
            tags: |
              $(Build.BuildId)
        - task: Docker@2
          displayName: 'Push image'
          inputs:
            command: push
            containerRegistry: '$(containerRegistry)'
            repository: '$(containerRepository)'
            tags: |
              $(Build.BuildId)
- stage: Deploy
  displayName: 'Deploy'
  jobs:
    - job:
      steps:
        - task: AzureWebAppContainer@1
          inputs:
            azureSubscription: $(subscription)
            appName: $(appName)
What should I do to execute some operations in my container after the AzureWebAppContainer task has finished? I have to make some database updates after the deploy operation.
I've tried to find documentation for Azure and searched some SO topics, but haven't found a solution yet, other than using the entrypoint/cmd for database updates, which does not work for me.
I think there should be some Azure Pipelines mechanism to perform such actions.
You can use the startup command in AzureWebAppContainer@1, or the AzureAppServiceSettings task, to manage the post-deployment operations.
By the way, you could also refer to this doc for Azure Web App for Containers to get more details.
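For completeness, a sketch of the startup-command approach the answer mentions, assuming a hypothetical script /app/migrate-and-start.sh baked into the image:

```yaml
- task: AzureWebAppContainer@1
  inputs:
    azureSubscription: $(subscription)
    appName: $(appName)
    # Hypothetical script inside the image; it becomes the container's
    # startup command, so it runs on every container start, not just once.
    containerCommand: '/app/migrate-and-start.sh'
```

Because the startup command runs on every restart, the database-update logic in such a script should be idempotent.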

AzurePipeline error: Could not find any file matching the template file pattern

I have the following pipeline
variables:
  azureSubscription: ...
stages:
  - stage: Deploy
    displayName: Deploy stage
    dependsOn: Build
    condition: succeeded()
    jobs:
      - deployment: Deploy
        displayName: Deploy
        environment: 'development'
        strategy:
          runOnce:
            deploy:
              steps:
                - task: AzureResourceGroupDeployment@2
                  inputs:
                    action: 'Create Or Update Resource Group'
                    resourceGroupName: '...'
                    location: '...'
                    templateLocation: 'Linked artifact'
                    csmFile: '$(Pipeline.Workspace)/azure-deploy.json'
                    deploymentMode: 'Incremental'
The repo has the following files (at the root directory)
azure-pipelines.yaml
azure-deploy.json
and only a master branch.
I have tried:
azure-deploy.json
**azure-deploy.json
**/*azure-deploy.json
$(Build.SourcesDirectory)/azure-deploy.json
$(Pipeline.Workspace)/azure-deploy.json
$(System.DefaultWorkingDirectory)/azure-deploy.json
Having read:
Azure Pipeline Error: Could not find any file matching the template file pattern
VSTS Pipeline Deployment of ARM Error: Could not find any file matching the template file pattern
https://github.com/microsoft/azure-pipelines-tasks/issues/11520
to no avail. Any ideas?
Update: I have added a publish step as suggested by @ShaykiAbramczyk.
Now I get: Template file pattern matches a directory instead of a file: /home/vsts/work/1/azure-deploy.json
- stage: Build
  displayName: Build stage
  jobs:
    - job: Build
      displayName: Build
      pool:
        vmImage: $(vmImageName)
      steps:
        - task: PublishPipelineArtifact@1
          inputs:
            targetPath: '$(Pipeline.Workspace)'
            artifact: 'azure-deploy.json'
            publishLocation: 'pipeline'
"A deployment job doesn't automatically clone the source repo. You can checkout the source repo within your job with checkout: self."
Source: https://learn.microsoft.com/en-us/azure/devops/pipelines/process/deployment-jobs?view=azure-devops
Example from my setup: I put checkout: self as the first step, and now my repository is cloned before executing the Azure PowerShell task:
strategy:
  runOnce:
    deploy:
      steps:
        - checkout: self
        - task: AzurePowerShell@5
          displayName: Setup Network
          inputs:
It is a good strategy to go for the multi-stage pipelines for what you are doing.
The build stage is for composing your artifacts.
Deployment jobs are for the publishing part.
So you are on the right track.
If you need sources during the deployment jobs then use the checkout step to fetch the sources. ref. Repo Checkout docs.
Just my two cents
Because you use a deployment job, the sources from the master branch are not downloaded onto the agent.
You need to publish the files in the build stage and consume them in the deployment, with pipeline artifacts.
Or just run the AzureResourceGroupDeployment in a regular job; then the .json file will be on the agent.
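A sketch of that last option, reusing the task inputs from the question (resource group name and location are placeholders): a regular job checks out the repository automatically, so azure-deploy.json is present under $(Build.SourcesDirectory):

```yaml
- stage: Deploy
  dependsOn: Build
  jobs:
    - job: Deploy   # regular job: sources are checked out by default
      steps:
        - task: AzureResourceGroupDeployment@2
          inputs:
            action: 'Create Or Update Resource Group'
            resourceGroupName: 'my-rg'      # placeholder
            location: 'westeurope'          # placeholder
            templateLocation: 'Linked artifact'
            csmFile: '$(Build.SourcesDirectory)/azure-deploy.json'
            deploymentMode: 'Incremental'
```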
My objective was to create a Blank and Starter Resource Group in Azure and nothing else, and this was repeatedly giving me the error
Template file pattern matches a directory instead of a file: /home/vsts/work/1/s
And that's what got me here.
I finally sorted that out with Stringfello's
- checkout: self
step.
My full pipeline is as follows.
variables:
  - name: finalBuildArtifactName
    value: 'aspNetCoreDropFromYaml123'
  - name: BuildParameters.RestoreBuildProjects
    value: '**/*.csproj'
  - name: BuildParameters.TestProjects
    value: '**/*[Tt]ests/*.csproj'
  - name: buildConfiguration
    value: 'Release'
trigger:
  - master
name: $(date:yyyyMMdd)$(rev:.r)
stages:
  - stage: Build ## Basically prints out some vars and copies the template files.
    jobs:
      - job: buildWebApp
        displayName: Build Stage for somepurpose
        pool:
          vmImage: ubuntu-latest
        steps:
          - script: |
              echo build.artifactstagingdirectory and build.buildnumber are as follows.
              echo $(build.artifactstagingdirectory) $(build.buildnumber)
              echo $(projects)
              echo $(BuildConfiguration)
              echo Pipeline.Workspace is $(Pipeline.Workspace)
              echo The current branch is - $(Build.SourceBranchName)!!.
              echo $(finalBuildArtifactName)
              echo "This is the build pipe line. This produces the necessary artifacts for subsequent release pipeline."
            displayName: 'Command Line Script to write out some vars'
          - powershell: |
              # Write your PowerShell commands here.
              Write-Host "This is from power shell command task"
              Write-Host "This writes out the env vars"
              get-childitem -path env:*
            displayName: 'PowerShell script to write out env vars'
          # The following task is needed. This copies the arm template files.
          # Created these templates from Visual studio 2019 as follows.
          # Right click your solution and Add -> New Project -> Azure Resource Group and gave the name Vivek_Aks_Rg
          - task: CopyFiles@2
            inputs:
              SourceFolder: 'iac/ArmTemplates/Vivek_Aks_Rg/'
              Contents: 'azuredeploy*.json'
              TargetFolder: '$(build.artifactStagingDirectory)/ArmTemplates'
          - task: PublishBuildArtifacts@1
            displayName: Publish Artifact
            condition: succeededOrFailed()
            inputs:
              PathtoPublish: '$(build.artifactstagingdirectory)'
              ArtifactName: '$(finalBuildArtifactName)'
              PublishLocation: 'Container'
  - stage: DeployToDev
    displayName: Deploy to Dev Env
    jobs:
      - deployment:
        pool:
          vmImage: ubuntu-latest
        environment: Dev
        strategy:
          runOnce:
            deploy:
              steps:
                - checkout: self
                - task: AzureResourceManagerTemplateDeployment@3
                  displayName: 'Create Azure App Service in a Given Resource Group'
                  inputs:
                    deploymentScope: 'Subscription'
                    azureResourceManagerConnection: 'Pay-As-You-Go(123YourSubscriptionId)'
                    subscriptionId: '123YourSubscriptionId'
                    action: 'Create Or Update Resource Group'
                    resourceGroupName: 'YourResourceGroupName'
                    location: 'Central India'
                    csmFile: '$(Pipeline.Workspace)/$(finalBuildArtifactName)/ArmTemplates/azuredeploy.json'
                    csmParametersFile: '$(Pipeline.Workspace)/$(finalBuildArtifactName)/ArmTemplates/azuredeploy.parameters.json'
                    deploymentMode: 'Incremental'

Deploying and running .exe files using Azure Pipelines

I'm struggling to make my multi-stage pipeline run a .exe file on a self-hosted agent running in an Azure VM.
My .yaml file is:
trigger:
  - develop
stages:
  - stage: build
    displayName: Build
    jobs:
      - job: buildJob
        pool:
          vmImage: 'ubuntu-16.04'
        variables:
          buildConfiguration: 'Release'
        steps:
          - task: NuGetToolInstaller@1
            inputs:
              versionSpec: '5.5.0'
          - task: DotNetCoreCLI@2
            displayName: 'Dotnet Build $(buildConfiguration)'
            inputs:
              command: 'build'
              arguments: '--configuration $(buildConfiguration)'
              projects: '**/TestProj.csproj'
          - task: DotNetCoreCLI@2
            displayName: "Publish"
            inputs:
              command: 'publish'
              publishWebProjects: false
              projects: '**/TestProj.csproj'
              arguments: '--no-restore --configuration $(BuildConfiguration) --output $(Build.ArtifactStagingDirectory)'
              zipAfterPublish: false
          - task: PublishBuildArtifacts@1
            inputs:
              PathtoPublish: '$(Build.ArtifactStagingDirectory)'
              ArtifactName: 'drop'
              publishLocation: Container
  - stage: Release
    displayName: Release
    dependsOn: build
    jobs:
      - deployment: AzureVMDeploy
        displayName: agentDeploy
        environment:
          name: AzureDeploy
          resourceName: vmName
          resourceType: VirtualMachine
          tags: develop
This VM is in the Azure Pipelines environment. After I run this pipeline, the folder is downloaded onto the VM, but I cannot find out how to automate the execution of the output .exe file in this folder.
I think the way is to create a job with a task to do it, but I cannot figure out how to make the agent installed on the VM run this task.
How can I do that?
If I understood you correctly, you want to execute your artifact file which was deployed to VM.
I think that the PowerShell on Target Machines task should do the job for you. You can write a simple inline script to execute your file. However, you need to have remoting configured on the VM. This article may help you with that.
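A sketch of that task, assuming WinRM remoting is already configured; the host name, credential variables, and .exe path are all placeholders:

```yaml
- task: PowerShellOnTargetMachines@3
  inputs:
    Machines: 'myvm.example.com'       # placeholder VM host name
    UserName: '$(vmAdminUser)'         # placeholder credentials from pipeline variables
    UserPassword: '$(vmAdminPassword)'
    ScriptType: 'Inline'
    InlineScript: |
      # Placeholder path to the deployed binary
      Start-Process -FilePath 'C:\deploy\drop\TestProj.exe'
```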
You can specify tasks in the strategy section of a deployment job. For example:
YAML
stages:
  - stage: build
    jobs:
      - job: buildJob
        pool:
          vmImage: 'Ubuntu-16.04'
        steps:
          - task: PublishPipelineArtifact@1
            inputs:
              targetPath: '$(Pipeline.Workspace)'
              publishLocation: 'pipeline'
  - stage: deploy
    dependsOn: build
    jobs:
      - deployment: DeployWeb
        displayName: deploy Web App
        environment:
          name: vm1
          resourceType: virtualmachine
        strategy:
          runOnce:
            deploy:
              steps:
                - script: echo my first deployment
                - task: CmdLine@2
                  inputs:
                    script: 'more README.md'
                    workingDirectory: '$(Pipeline.Workspace)/build.buildJob/s'
In this YAML pipeline, I publish all files in the pipeline workspace as an artifact in the build stage. That artifact is then downloaded onto the target virtual machine of the vm1 environment in the deploy stage (into a folder named {stage name}.{job name}), after which a command-line task prints a file's content. (The script and command-line tasks run on that virtual machine.)
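Building on that layout, running the deployed .exe could be sketched as one more step in the deploy section; the working directory below is a placeholder following the {stage name}.{job name} artifact-folder pattern:

```yaml
- task: CmdLine@2
  inputs:
    script: 'TestProj.exe'
    # Placeholder: adjust to wherever the publish step placed the binary
    workingDirectory: '$(Pipeline.Workspace)/build.buildJob/drop'
```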
