Project files not available in deploy job in Azure DevOps pipelines

The files in the related repo are available when running the Build stage of the Azure DevOps pipeline, but not when running the Deploy stage. Any ideas as to why this would be the case?
Here is a simplified version of the YAML file:
# Deploy to Azure Kubernetes Service
# Build and push image to Azure Container Registry; Deploy to Azure Kubernetes Service
# https://learn.microsoft.com/azure/devops/pipelines/languages/docker

trigger:
- master

resources:
- repo: self

variables:
  # Agent VM image name
  vmImageName: 'ubuntu-latest'
  # Name of the new namespace being created to deploy the PR changes.
  k8sNamespaceForPR: 'review-app-$(System.PullRequest.PullRequestId)'

stages:
- stage: Build
  displayName: Build stage
  jobs:
  - job: Build
    displayName: Build
    pool:
      vmImage: $(vmImageName)
    steps:
    - task: Bash@3
      inputs:
        targetType: 'inline'
        script: |
          pwd
          ls -la

- stage: Deploy
  displayName: Deploy stage
  dependsOn: Build
  jobs:
  - deployment: Deploy
    condition: and(succeeded(), not(startsWith(variables['Build.SourceBranch'], 'refs/pull/')))
    displayName: Deploy
    pool:
      vmImage: $(vmImageName)
    environment: 'test.development'
    strategy:
      runOnce:
        deploy:
          steps:
          - task: Bash@3
            inputs:
              targetType: 'inline'
              script: |
                pwd
                ls -la
Additional notes:
If the Deploy stage is run on its own (with the Build stage removed), the working directory is also empty.

The job in your Deploy stage is a deployment job rather than a standard job. Deployment jobs don't automatically check out the repo that the pipeline is based on, but they do have access to any published pipeline artifacts.
You can either publish a pipeline artifact in the Build stage or add a task to your Deploy stage to explicitly check out the repo.
To publish a pipeline artifact, add the Publish Pipeline Artifact task to your Build stage. In your Deploy stage you can then reference files in that artifact with the path $(Pipeline.Workspace)/<artifactName>/<rest-of-path>, as shown in the sketch below.
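For illustration, a minimal sketch of the artifact route, assuming an artifact named 'drop' (the artifact name and paths are placeholders, not from the original question):

- stage: Build
  displayName: Build stage
  jobs:
  - job: Build
    steps:
    # Publish the checked-out sources (or your build output) as a pipeline artifact
    - task: PublishPipelineArtifact@1
      inputs:
        targetPath: '$(Build.SourcesDirectory)'
        artifact: 'drop'

- stage: Deploy
  dependsOn: Build
  jobs:
  - deployment: Deploy
    environment: 'test.development'
    strategy:
      runOnce:
        deploy:
          steps:
          # Deployment jobs download pipeline artifacts automatically,
          # placing them under $(Pipeline.Workspace)/<artifactName>
          - task: Bash@3
            inputs:
              targetType: 'inline'
              script: ls -la $(Pipeline.Workspace)/drop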
To check out the whole repo, add this to your Deploy stage:
steps:
- checkout: self
  path: 'myrepo/'
Then reference the files in the repo using $(System.DefaultWorkingDirectory)/<rest-of-path>
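Note that in a deployment job the checkout step goes inside the strategy's deploy hook; a minimal sketch against the pipeline above:

- deployment: Deploy
  environment: 'test.development'
  strategy:
    runOnce:
      deploy:
        steps:
        # Deployment jobs skip the implicit checkout, so request it explicitly
        - checkout: self
        - task: Bash@3
          inputs:
            targetType: 'inline'
            script: |
              pwd
              ls -la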

Related

Azure DevOps: how to configure my pipeline script to trigger another pipeline with a different repo?

I have my own pipeline p-A with repository r-A, and a pipeline p-B with another repo, r-B.
I want to update the pipeline script for p-A only, to actively trigger p-B, without any modification to p-B.
Below is the YAML pipeline script for p-B, which is already set up to run on a schedule:
pool:
  name: 'workflow_test_pool'

schedules:
- cron: "0 19 * * *"
  displayName: run test every day at 8PM CET
  branches:
    include:
    - main
  always: true

trigger: none

jobs:
- job:
  timeoutInMinutes: 30
  steps:
  - script: |
      python -m pytest tests/ -s
    displayName: 'Run the test'
Below is the pipeline script main.yaml for p-A:
pool:
  name: 'workflow_test_pool'

stages:
#########################
- template: pipeline2/p1.yaml
############################
- template: pipeline2/p2.yaml
  parameters:
    dependsOn:
    - FirstPipeline
So the question is: how do I trigger pipeline p-B from pipeline2/p2.yaml (in p-A)?
You can add a PowerShell script task as the last step of the pipeline to trigger pipeline p-B through the REST API. You will have to maintain a Personal Access Token (PAT), ideally as a secret variable. The REST API call to use:
https://learn.microsoft.com/en-us/rest/api/azure/devops/pipelines/runs/run-pipeline?view=azure-devops-rest-7.1
Detailed step-by-step guide:
https://blog.geralexgr.com/cloud/trigger-azure-devops-build-pipelines-using-rest-api
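For illustration, a minimal sketch of such a step, assuming a secret variable adoPat holding the PAT; the organization, project, and pipeline id of p-B are placeholders you must fill in:

- powershell: |
    # Runs - Run Pipeline (REST API version 7.1); all names below are placeholders
    $org        = "my-organization"
    $project    = "my-project"
    $pipelineId = 42   # pipeline id of p-B
    $url = "https://dev.azure.com/$org/$project/_apis/pipelines/$pipelineId/runs?api-version=7.1"
    # The PAT goes in as a basic-auth password; the user name can be empty
    $token = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$(adoPat)"))
    Invoke-RestMethod -Uri $url -Method Post `
      -Headers @{ Authorization = "Basic $token" } `
      -ContentType "application/json" -Body '{}'
  displayName: 'Trigger pipeline p-B'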
Azure DevOps supports multi-repository checkout: you can declare the other repository under resources in your YAML and set a trigger on it, so changes in that repository trigger this pipeline.
YAML code :-
# Starter pipeline
# Start with a minimal pipeline that you can customize to build and deploy your code.
# Add steps that build, run tests, deploy, and more:
# https://aka.ms/yaml
pool:
vmImage: ubuntu-latest
workspace:
clean: all
resources:
repositories:
- repository: repo_a
type: git
name: InternalProjects/repo_a
trigger:
- main
- release
- repository: repo_b
type: git
name: InternalProjects/repo_b
trigger:
- main
steps:
- checkout: repo_a
- checkout: repo_b
- script: dir $(Build.SourcesDirectory)
I am running this pipeline from repo_a, and both repo_a and repo_b were checked out successfully (output screenshot omitted).
You can run any task directly from a pipeline with multiple repositories, like below:
# Starter pipeline
# Start with a minimal pipeline that you can customize to build and deploy your code.
# Add steps that build, run tests, deploy, and more:
# https://aka.ms/yaml

pool:
  vmImage: ubuntu-latest

workspace:
  clean: all

resources:
  repositories:
  - repository: repo_a
    type: git
    name: InternalProjects/repo_a
    trigger:
    - main
    - release
  - repository: repo_b
    type: git
    name: InternalProjects/repo_b
    trigger:
    - main

steps:
- checkout: repo_a
- checkout: repo_b
- task: AzureCLI@2
  inputs:
    azureSubscription: 'Subscription-name(sub-id)'
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: 'az resource list --location uksouth'
References:
Check out multiple repositories in your pipeline - Azure Pipelines | Microsoft Learn
Trigger azure Devops pipeline from another repository - GeralexGR
Multiple Repositories in a Single Azure Pipeline - DEV Community

Deploy selected artifact in Azure DevOps as docker image to ECR

Environment
Azure DevOps (code repo and pipeline trigger)
AWS ECR/ECS (target deployment platform)
Docker
.NET Core web application (v5.0)
Current Situation
We presently build the application using dotnet build (via a PowerShell script) and push the zip file to Azure DevOps Artifacts using azurepipeline.yml. This works out fine. I have added another task for the ECR push, which pushes a generated Docker image to ECR using a Dockerfile in the source code.
Business Problem
We want to be able to choose a specific build (e.g. 0.1.24) in Azure Artifacts (using a variable to provide the version number) and generate a Docker build using the corresponding binaries and the Dockerfile. I am unable to find a way to do so. The specific tasks are as follows:
Deploy user updates the variable "versionNoToDeploy" with the artifact id or name
Deploy user runs a specific pipeline
Pipeline finds the artifact (assuming it's valid, else raises an error) and unzips the package at a temp location (need help on this)
Pipeline runs the Dockerfile to build the image (known and working)
Pipeline pushes this image to ECR (known and working)
The purpose is to keep building the branch until we get a stable build. The build is deployed to a test server manually and tested. Once the build is certified, it needs to be pushed to the production ECR/ECS instances.
Our pipeline (specific code only)
- pwsh: ./build.ps1 --target Clean Protolint Compile --runtime $(runtime)
  displayName: ⚙️ Compile
- task: Docker@2
  displayName: Build
  inputs:
    command: build
    repository: appRepo
    tags: |
      $(Build.BuildId)
      deploy
    addPipelineData: true
    Dockerfile: src\DockerfileNew
- task: ECRPushImage@1
  inputs:
    awsCredentials: 'AWS ECR Connection'
    regionName: 'ap-south-1'
    imageSource: 'imagename'
    sourceImageName: 'myApplication'
    sourceImageTag: 'deploy'
    repositoryName: 'devops-demo'
    outputVariable: 'imageTagOutputVar'
- pwsh: ./build.ps1 --target Test Coverage --skip
  displayName: 🚦 Test
- pwsh: ./build.ps1 --target BuildImage Pack --runtime $(runtime) --skip
  displayName: 📦 Pack
- pwsh: ./build.ps1 --target Publish --runtime $(runtime) --skip
  displayName: 🚚 Publish
Artifact details
Details of any specific aspects can be provided on request.
Finally, after playing a lot with the pipeline and custom-tweaking the individual steps, I came up with the following (excerpted YAML).
It involves having the build version stored in a variable, which is referenced in each of the steps of the pipeline.
The admin has to decide whether they want a general build, producing an artifact, or just to deploy a specific build to AWS. The variable holding the build id is evaluated conditionally, and based on that the steps are executed or bypassed.
- pwsh: ./build.ps1 --target Clean Protolint Compile --runtime $(runtime)
  condition: eq(variables['artifactVersionToPush'], '')
  displayName: ⚙️ Compile
- task: DownloadBuildArtifacts@0
  condition: ne(variables['artifactVersionToPush'], '')
  inputs:
    buildType: 'specific'
    project: 'NitinProj'
    pipeline: 'NitinProj'
    buildVersionToDownload: specific
    buildId: $(artifactVersionToPush)
    downloadType: 'single'
    artifactName: 'app'
    downloadPath: '$(System.ArtifactsDirectory)' # needs to be set explicitly, as the default is the build directory
- task: ExtractFiles@1
  displayName: Extract Artifact to temp location
  condition: ne(variables['artifactVersionToPush'], '')
  inputs:
    archiveFilePatterns: '$(System.ArtifactsDirectory)/app/*.zip' # path may need updating
    cleanDestinationFolder: false
    overwriteExistingFiles: true
    destinationFolder: src
- task: Docker@2
  displayName: Docker Build image with compiled code in artifact
  condition: ne(variables['artifactVersionToPush'], '')
  inputs:
    command: build
    repository: myApp
    tags: |
      $(Build.BuildId)
      deploy
    addPipelineData: true
    Dockerfile: src\DockerfileNew
- task: ECRPushImage@1
  displayName: Push built image to AWS ECR
  condition: ne(variables['artifactVersionToPush'], '')
  inputs:
    awsCredentials: 'AWS ECR Connection'
    regionName: 'ap-south-1'
    imageSource: 'imagename'
    sourceImageName: 'myApp'
    sourceImageTag: 'deploy'
    pushTag: '$(Build.BuildId)'
    repositoryName: 'devops-demo'
    outputVariable: 'imageTagOutputVar'
- pwsh: ./build.ps1 --target Test Coverage --skip
  condition: eq(variables['artifactVersionToPush'], '')
  displayName: 🚦 Test
- pwsh: ./build.ps1 --target BuildImage Pack --runtime $(runtime) --skip
  condition: eq(variables['artifactVersionToPush'], '')
  displayName: 📦 Pack
- pwsh: ./build.ps1 --target Publish --runtime $(runtime) --skip
  condition: eq(variables['artifactVersionToPush'], '')
  displayName: 🚚 Publish
I will update this YAML to have the steps organized into jobs, but that's an optimization story... :)
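As a side note, one way to make that switch settable at queue time is a runtime parameter mapped onto the variable the conditions read; a minimal sketch under that assumption (the parameter name matches the conditions above, the display name is illustrative):

parameters:
- name: artifactVersionToPush
  displayName: 'Build id to deploy (leave empty for a normal build)'
  type: string
  default: ''

variables:
  # The step conditions above read variables['artifactVersionToPush']
  artifactVersionToPush: ${{ parameters.artifactVersionToPush }}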
Since manual intervention is involved here, you may consider splitting the workflow into several jobs, like this:
jobs:
- job: BuildAndDeployToTest
  steps:
  - bash: echo "A"

- job: waitForValidation
  displayName: Wait for external validation
  pool: server
  timeoutInMinutes: 4320 # job times out in 3 days
  steps:
  - task: ManualValidation@0
    timeoutInMinutes: 1440 # task times out in 1 day
    inputs:
      notifyUsers: |
        test@test.com
        example@example.com
      instructions: 'Please validate the build configuration and resume'
      onTimeout: 'resume'

- job: DeployToProd
  steps:
  - bash: echo "B"
This is not exactly what you want in terms of involving variables, but it will let you achieve your goal: wait for validation, and deploy only validated builds to prod.
It relies on the ManualValidation task.
Another approach could be to use a deployment job and approvals:
jobs:
- job: BuildAndDeployToTest
  steps:
  - bash: echo "A"

# Track deployments on the environment.
- deployment: DeployToProd
  displayName: deploy Web App
  pool:
    vmImage: 'Ubuntu-16.04'
  # Creates an environment if it doesn't exist.
  environment: 'PROD'
  strategy:
    # Default deployment strategy, more coming...
    runOnce:
      deploy:
        steps:
        - checkout: self
        - script: echo my first deployment
For this you need to define an environment and configure approvals on it.
Either way, you get a clear picture of what was delivered to prod, plus a record of who approved the PROD deployment.

How to run a template (.yml) belonging to another repository from my current repository in Azure DevOps?

With the setup below, when the pipeline corresponding to my repo runs, it runs the template (template.yml) belonging to 'anotherRepo'. But when it checks out, it checks out my repo instead of 'anotherRepo'.
Is there any issue with my setup?
It looks like checkout: self has no effect and does not work.
My current Repo:
azurepipeline.yml file:
variables:
  acceptanceTestsRepoName: 'anotherRepo'

resources:
  repositories:
  - repository: 'anotherRepo'
    name: ProjectName/anotherRepo
    type: git
    ref: master

stages:
- stage: acceptance_tests
  displayName: 'Run Acceptance Tests in Dev'
  jobs:
  - template: 'azure-pipelines-templates/template.yml@${{ variables.acceptanceTestsRepoName }}'
Repo: anotherRepo
template.yml:
jobs:
- job: AcceptanceTest
  displayName: Run Acceptance Test in Dev
  pool:
    vmImage: 'ubuntu-latest'
  steps:
  - checkout: self
  - task: UsePythonVersion@0
    inputs:
      versionSpec: '$(python.version)'
self always refers to the repo associated with the build pipeline. In your case, you need to check out anotherRepo manually:
# Azure Repos Git repository in the same organization
- checkout: git://anotherRepo
This assumes that anotherRepo is in the same Azure DevOps organization. If it's not, or it's stored somewhere else (GitHub, Bitbucket, etc.), you also need to add it as a resource to the pipeline definition. See Check out multiple repositories in your pipeline for details.
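For illustration, a minimal sketch of that resource route for a GitHub-hosted repo, assuming an existing service connection; the owner/repo and connection names are placeholders:

resources:
  repositories:
  - repository: anotherRepo            # alias used by the checkout step
    type: github
    name: SomeOrg/anotherRepo          # placeholder owner/repo
    endpoint: my-github-connection     # placeholder service connection

steps:
- checkout: anotherRepo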
According to the documentation on checkout, self represents the repo where the initial Azure Pipelines YAML file was found.
So your pipeline checks out the repo where your azurepipeline.yml file is located.
If you want to check out anotherRepo, the checkout step in your template.yml should be - checkout: anotherRepo:
jobs:
- job: AcceptanceTest
  displayName: Run Acceptance Test in Dev
  pool:
    vmImage: 'ubuntu-latest'
  steps:
  - checkout: anotherRepo
  - task: UsePythonVersion@0
    inputs:
      versionSpec: '$(python.version)'
You can also use the inline checkout syntax to directly check out another repo in the azure-pipelines.yml file:
stages:
- stage: acceptance_tests
  displayName: 'Run Acceptance Tests in Dev'
  jobs:
  - job: checkout
    steps:
    - checkout: git://ProjectName/anotherRepo
Thanks all for your responses, especially @beatcracker.
By replacing the checkout: self step, I was able to run successfully.
No change to my current repo; the change below was made to the other repo:
jobs:
- job: AcceptanceTest
  displayName: Run Acceptance Test in Dev
  pool:
    vmImage: 'ubuntu-latest'
  steps:
  - checkout: git://ProjectName/anotherRepo
  - task: UsePythonVersion@0
    inputs:
      versionSpec: '$(python.version)'

Azure Pipeline Multi-Stages in YAML vs Separate Release

Azure Pipelines supports multiple stages in YAML. One typical example would be something like:
trigger:
- master

pool:
  name: Default
  demands:
  - npm
  - msbuild
  - visualstudio

stages:
- stage: build
  jobs:
  - job: Build app
- stage: deploy
  jobs:
  - job: Deploy to dev
I'm not used to working like this. Usually, I would run the pipeline to build my application and drop artifacts to a drop folder. The pipeline would be the same regardless of the environment that would later be targeted by the release.
Then I would choose to run a release to either Integration, UAT, or Production.
However, with a multi-stage pipeline we are mixing the build and the release together. So how would I release to a given environment? Do I have to duplicate this pipeline per environment?
You can use a template structure here. You create separate files for the different jobs and variables, then execute the job templates with the suitable variable template file for each stage.
(Screenshots omitted: directory structure, pipeline stages, and environments.)
Here's the sample pipeline:
trigger:
- master

variables:
- name: vmImage
  value: 'ubuntu-latest'

stages:
- stage: Build
  displayName: Build stage
  jobs:
  - job: BuildJob
    pool:
      vmImage: $(vmImage)
    steps:
    - template: Jobs/build.yml

- stage: NonProd
  displayName: Deploy non prod stage
  jobs:
  - deployment: DeploymentJob1
    pool:
      vmImage: $(vmImage)
    environment: non-prod
    variables:
    - template: Variables/non-prod.yml
    strategy:
      runOnce:
        deploy:
          steps:
          - template: Jobs/deploy.yml

- stage: Prod
  displayName: Deploy prod stage
  jobs:
  - deployment: DeploymentJob2
    pool:
      vmImage: $(vmImage)
    environment: prod
    variables:
    - template: Variables/prod.yml
    strategy:
      runOnce:
        deploy:
          steps:
          - template: Jobs/deploy.yml
Jobs/build.yml
steps:
- script: echo I am building!
  displayName: 'Run Build'

Jobs/deploy.yml
steps:
- script: echo I am deploying to $(Environment)!
  displayName: 'Run Deployment'

Variables/non-prod.yml
variables:
- name: Environment
  value: non-prod

Variables/prod.yml
variables:
- name: Environment
  value: prod

Change repo for 1 stage in multi-stage pipeline

I have an Azure DevOps multi-stage pipeline which requires one stage that copies files from another repository to $(build.artifactstagingdirectory).
For example, my YAML looks like:
trigger:
- master

resources:
- repo: self

variables:
  tag: '$(Build.BuildId)'

stages:
- stage: Build
  displayName: Build image
  jobs:
  - job: Build
    ...
- stage: Build
  ... define other resource/repository ...
    - task: CopyFiles@2
      inputs:
        SourceFolder: 'k8s'
        Contents: '**'
        TargetFolder: '$(build.artifactstagingdirectory)'
This pipeline is connected to a repository, which is presumably what repo: self defines. So the question is: can I change this repository for a specific stage?
You can run a git command in a PowerShell script. Add a step to your stage that executes a git clone, as in the example below. This way the other repo is cloned into the artifact staging directory.
- powershell: |
    cd $(Build.artifactstagingdirectory)
    git clone "https://<<Your PAT>>@dev.azure.com/_organization/_project/_git/_repo"
Note: use your personal access token (PAT) for authentication.
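As a sketch of an alternative that avoids storing a PAT: the job's own access token can often be used instead, provided the project build service identity has read access to the target repo (the URL placeholders are unchanged from above):

- powershell: |
    cd $(Build.artifactstagingdirectory)
    # $(System.AccessToken) is the pipeline's own OAuth token; pass it as an auth header
    git -c http.extraheader="AUTHORIZATION: bearer $(System.AccessToken)" clone "https://dev.azure.com/_organization/_project/_git/_repo"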
