Skip steps from remote Azure pipeline template repo

I want to use a remote repo template in my Azure pipeline, but I want to skip some of the included steps.
Note: I don't have access to configure the remote pipeline steps.
The remote YAML looks like this:
build.yml
steps:
  - download: none
  - checkout: ${{ parameters.checkoutRepo }}
  - task: Cache@2
    displayName: Cache Maven local repo
  - task: Maven@3
    displayName: 'Maven: validate'
  - task: SonarQubePrepare@4
    displayName: 'Prepare analysis on SonarQube'
My YAML, my-build.yml:
resources:
  repositories:
    - repository: my-remote-repo-above
      type: git
      name: my-remote-repo-above
.
.
.
stages:
  - template: build-stage.yml
    parameters:
So my question is: can I somehow specify which steps from the remote template to skip, or is there a way to pick only the ones I want to execute?

Basically, you can define a parameter in the template and use it in a task condition, so that specific steps are skipped based on that condition.
For example, in build-stage.yml:
parameters:
  enableSonarCloud: false
steps:
  - download: none
  - checkout: ${{ parameters.checkoutRepo }}
  - task: Cache@2
    displayName: Cache Maven local repo
  - task: Maven@3
    displayName: 'Maven: validate'
  - task: SonarQubePrepare@4
    condition: and(succeeded(), ${{ parameters.enableSonarCloud }})
    displayName: 'Prepare analysis on SonarQube'
And in your own pipeline YAML (my-build.yml):
resources:
  repositories:
    - repository: my-remote-repo-above
      type: git
      name: my-remote-repo-above
.
.
.
stages:
  - template: build-stage.yml@my-remote-repo-above
    parameters:
      enableSonarCloud: true
But in your scenario, you don't have access to configure the remote template steps. In that case, I don't think you can achieve this unless the parameter and task condition are defined in the template.
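For completeness: if the owner of the remote template is willing to expose a parameter, the steps could also be excluded at compile time with a template expression instead of a runtime condition. A rough sketch of what the template owner would add (the parameter names here are illustrative, not part of the original template):

parameters:
  - name: checkoutRepo
    type: string
    default: self
  - name: enableSonarCloud
    type: boolean
    default: false

steps:
  - download: none
  - checkout: ${{ parameters.checkoutRepo }}
  - task: Maven@3
    displayName: 'Maven: validate'
  # The SonarQube step is only inserted into the compiled pipeline when the parameter is true
  - ${{ if eq(parameters.enableSonarCloud, true) }}:
    - task: SonarQubePrepare@4
      displayName: 'Prepare analysis on SonarQube'

Either way, the change has to live in the remote template itself; a consuming pipeline cannot remove steps it does not own.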

Related

Azure DevOps pipelines - how to execute a whole template stage in a docker container by a self-hosted agent

I have a multi-stage Azure DevOps pipeline in YAML. Each stage includes multiple jobs. The pipeline runs properly using Azure-hosted agents, but I need to run it on our company's self-hosted agent. The self-hosted agent is a virtual machine scale set (VMSS) defined as an agent pool in DevOps; it has been tested and is working properly. Because the self-hosted agent does not have all the capabilities we need, our company's policy is to run the jobs inside Docker containers. The Docker images reside in our company's private Azure Container Registry.
My question is how I can run a whole template stage in a Docker container. Please note that my question is not about executing a single job but all the jobs in a stage template. This link explains how to execute jobs inside a container; it doesn't give any examples of how to run all the jobs in a stage in a container, especially when the stage is defined as a template.
My pipeline in simplified form is as follows:
## Name of self-hosted agent
pool: Platform-VMSS
resources:
  containers:
    - container: base
      image: myazurecontainerregistryname.azurecr.io/platform/myimg:1.0.0
      endpoint: 'serviceconnection-acr'
trigger:
  branches:
    include:
      - "feature/ORGUS-4810"
    exclude:
      - "main"
      - "release"
  paths:
    include:
      - "usecase-poc/**"
variables:
  pythonVersion: 3.8
  whlPackageName: py_core
  srcDirectory: usecase-poc/$(whlPackageName)
  ${{ if eq(variables['Build.SourceBranch'], 'refs/heads/main') }}:
    BuildFrom: main
    PackageName: $(whlPackageName)
    versionOption: custom
    deployToDevPipeline: True
    deployToQAPipeline: True
    deployToProdPipeline: True
  ${{ else }}:
    BuildFrom: branch_${{ lower(variables['Build.SourceBranchName']) }}
    PackageName: $(whlPackageName)_$(BuildFrom)
    versionOption: patch
    deployToDevPipeline: True
    deployToQAPipeline: False
    deployToProdPipeline: False
    stageName: 'deployToDev'
stages:
  - stage: DownloadArtifact
    displayName: "Download python whl from artifactory"
    jobs:
      - job: DownloadArtifactJob
        steps:
          - checkout: self
          - task: UniversalPackages@0
            displayName: 'Download Artifact with Universal Packages'
            inputs:
              vstsFeed: 'some-value/00000000-0000-0009-0000-d24300000000'
              vstsFeedPackage: '00000000-0000-0000-0000-0000000'
              vstsPackageVersion: 0.8.4
              downloadDirectory: $(Build.ArtifactStagingDirectory)
          - task: Bash@3
            name: GetWheelName_Task
            inputs:
              targetType: 'inline'
              script: |
                echo $(Build.ArtifactStagingDirectory)
                find $(Build.ArtifactStagingDirectory) -name '*.whl'
                ArtifactName=$(find $(Build.ArtifactStagingDirectory) -name '*.whl')
                echo "Artifact name value is " $ArtifactName
                echo "##vso[task.setvariable variable=ArtifactName;isOutput=true]$ArtifactName"
            displayName: 'Get downloaded artifact in source directory'
          - task: PublishBuildArtifacts@1
            displayName: "Publish downloaded artifact to pipeline's output"
            inputs:
              pathToPublish: $(Build.ArtifactStagingDirectory)
              artifactName: whl
  - stage: ConstructSharedVariablesForAllStages
    displayName: Construct Shared Variables For All Stages
    dependsOn: DownloadArtifact
    variables:
      - group: proj-shared-vg
      - name: ArtifactName
        value: $[stageDependencies.DownloadArtifact.DownloadArtifactJob.outputs['GetWheelName_Task.ArtifactName']]
    jobs:
      - job: DownloadArtifact
        container: base
        steps:
          - task: Bash@3
            displayName: "Print variable value"
            inputs:
              targetType: 'inline'
              script: |
                echo $(ArtifactName)
          - task: Bash@3
            name: extractWheelName
            displayName: Extract Wheel Name
            inputs:
              targetType: inline
              script: |
                echo $(ArtifactName) | awk -F"/" '{print $NF}'
                WhlName="py_core-0.8.4-py3-none-any.whl"
                echo "##vso[task.setvariable variable=WhlName;isOutput=true]$WhlName"
          - task: DownloadPipelineArtifact@2
            displayName: "Download artifact from previous stage"
            inputs:
              buildType: 'current'
              project: 'Project Name'
              buildVersionToDownload: 'latest'
              artifactName: whl
              targetPath: '$(System.ArtifactsDirectory)'
          - pwsh: |
              $whlFile = Get-ChildItem -Filter *.whl -Path "$(System.ArtifactsDirectory)" | ForEach-Object { $_.fullname } | Select-Object -First 1
              Write-Host "##vso[task.setvariable variable=whlFile]$whlFile"
            name: SetVars
            displayName: Get wheel name
  ## This is the section my question is about. How can I make sure each stage runs in the self-hosted agent pool? The stage contains multiple jobs.
  - template: ../templates/azure-pipeline-stage-template.yaml
    parameters:
      deploy: ${{ variables.deployToDevPipeline }}
      variableGroup: databricks-sp-vg-dev
      stageName: DeployToDev
      environment: DBRKS_Dev_WHL
      conditionParameter: deployToDev
      dependsOnStage: ConstructSharedVariablesForAllStages
  - template: ../templates/azure-pipeline-stage-template.yaml
    parameters:
      deploy: ${{ variables.deployToQAPipeline }}
      variableGroup: databricks-sp-vg-qa
      stageName: DeployToQA
      environment: DBRKS_QA_WHL
      conditionParameter: deployToQA
      dependsOnStage: DeployToDev
  - template: ../templates/azure-pipeline-stage-template.yaml
    parameters:
      deploy: ${{ variables.deployToProdPipeline }}
      variableGroup: databricks-sp-vg-prod
      stageName: DeployToProd
      environment: DBRKS_Prod_WHL
      conditionParameter: deployToProd
      dependsOnStage: DeployToQA
In the code above, under resources, the container resides in our Azure Container Registry (ACR); the endpoint is our DevOps service connection of type container registry, used to pull and push images to and from ACR. I have commented in the code above where the issue is. Where I am referring to stage templates, I would like to run them all inside the container defined as a resource at the beginning of the pipeline. The stage template has multiple jobs. Here is just a sample of the stage template when running, to emphasize that it has multiple jobs:
The highlighted stage is the one created by the template:
- template: ../templates/azure-pipeline-stage-template.yaml
  parameters:
    deploy: ${{ variables.deployToDevPipeline }}
    variableGroup: databricks-sp-vg-dev
    stageName: DeployToDev
    environment: DBRKS_Dev_WHL
    conditionParameter: deployToDev
    dependsOnStage: ConstructSharedVariablesForAllStages
The question is how to enforce that all the jobs in the above template run in the Docker container defined as a resource in our pipeline. Thank you very much for your valuable input.
Add a container field at the job level as shown below. Then all the jobs in the template will run in the specified container.
pool:
  vmImage: ubuntu-latest
resources:
  containers:
    - container: testcontainer
      image: ubuntu
stages:
  - stage: template01
    displayName: template test
    jobs:
      - job: template
        container: testcontainer
        steps:
          - template: templates/template01.yaml
  - stage: template02
    displayName: template test
    jobs:
      - job: template
        container: testcontainer
        steps:
          - template: templates/template02.yaml
Otherwise, add a target field to the required steps in the template, as described in Build and Release Tasks - Azure Pipelines | Microsoft Learn: https://learn.microsoft.com/en-us/azure/devops/pipelines/process/tasks?view=azure-devops&tabs=yaml#step-target
resources:
  containers:
    - container: pycontainer
      image: python:3.11
steps:
  - task: AnotherTask@1
    target: pycontainer

Unable to fetch source and target branches of a Pull Request - Azure Devops

I have my source code repository in GitHub and my pipeline in Azure DevOps. I am trying to execute certain tasks in my pipeline based on the source branch of a pull request. The pipeline gets triggered on PR. However, when I try to get the below attributes of a pull request from my YAML pipeline, I get the message shown in the screenshot below. It basically states "command not found" for all the values. Is there anything obvious that could cause this, or is this not how these values are expected to be fetched? Any help is much appreciated.
trigger:
  branches:
    include:
      - feature/azure-pipeline
      - develop
      - release/*
    exclude:
      - features/*
      - master
pr:
  branches:
    include:
      - develop
      - main
stages:
  - stage: TestStage
    jobs:
      - job: unit_test
        displayName: 'Unit test Job'
        pool:
          vmImage: 'macos-latest'
        variables:
          - name: currentBranch
            ${{ if eq(variables['Build.Reason'], 'PullRequest') }}:
              value: $(System.PullRequest.TargetBranch)
            ${{ if ne(variables['Build.Reason'], 'PullRequest') }}:
              value: $(Build.SourceBranch)
        steps:
          - task: DownloadSecureFile@1
            displayName: 'Download CSSM secrets'
            name: secureKeys
            inputs:
              secureFile: 'cssmkeys.properties'
          - script: |
              echo Target Branch is $(System.PullRequest.TargetBranch)
              echo Source Repository URI is $(System.PullRequest.SourceRepositoryURI)
              echo PullRequest Id is $(System.PullRequest.PullRequestId)
              echo Source Branch is $(System.PullRequest.SourceBranch)
              echo Current Branch is $(value)
Output screenshot
EDIT
Link to the system variables page - https://learn.microsoft.com/en-us/azure/devops/pipelines/build/variables?view=azure-devops&tabs=yaml
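One pattern worth noting: the System.PullRequest.* values are only populated on PR-triggered runs, and scripts often map them into environment variables rather than using macro syntax directly, so the shell never sees an unexpanded $(...) expression. A rough sketch (the step, variable, and display names below are illustrative):

steps:
  - bash: |
      # These env vars are mapped from predefined pipeline variables; they are empty on non-PR runs
      echo "Target branch:  $TARGET_BRANCH"
      echo "Source branch:  $SOURCE_BRANCH"
      echo "PR id:          $PR_ID"
    displayName: 'Print pull request details'
    condition: eq(variables['Build.Reason'], 'PullRequest')
    env:
      TARGET_BRANCH: $(System.PullRequest.TargetBranch)
      SOURCE_BRANCH: $(System.PullRequest.SourceBranch)
      PR_ID: $(System.PullRequest.PullRequestId)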

Azure YAML Scripts - Build and Release

Can we have both build and release in the same YAML script in Azure DevOps? If yes, can someone help me with a sample script? For the release part, we are deploying to multiple environments.
There are multiple ways to set up build and release of our configurations via Azure DevOps.
First, we'll look at building CI and CD in two separate YAML files and create two pipelines to automate our tasks. For that we'll use resources in YAML pipelines.
We create the build pipeline by creating a new DevOps project and a new repository, and adding the YAML below as build-pipeline.yml:
trigger:
  branches:
    include:
      - master
  paths:
    exclude:
      - build-pipeline.yml
      - release-pipeline.yml
variables:
  vmImageName: 'ubuntu-latest'
jobs:
  - job: Build
    pool:
      vmImage: $(vmImageName)
    steps:
      - script: |
          echo 'do some unit test'
        displayName: 'unit test'
      - script: |
          echo 'compile application'
        displayName: 'compile'
      - task: ArchiveFiles@2
        displayName: 'Archive files'
        inputs:
          rootFolderOrFile: '$(System.DefaultWorkingDirectory)'
          includeRootFolder: false
          archiveType: zip
          archiveFile: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
          replaceExistingArchive: true
      - upload: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
        artifact: drop
Similarly, we create a release pipeline; below is its YAML:
# Explicitly set none for the repository trigger
trigger: none
resources:
  pipelines:
    - pipeline: myappbuild # Name of the pipeline resource
      source: myapp-build-pipeline # Name of the triggering pipeline
      trigger:
        branches:
          - master
variables:
  vmImageName: 'ubuntu-latest'
jobs:
  - deployment: Deploy
    displayName: Deploy
    environment: dev
    pool:
      vmImage: $(vmImageName)
    strategy:
      runOnce:
        deploy:
          steps:
            - download: myappbuild
              artifact: drop
            - task: ExtractFiles@1
              inputs:
                archiveFilePatterns: '$(PIPELINE.WORKSPACE)/myappbuild/drop/$(resources.pipeline.myappbuild.runID).zip'
                destinationFolder: '$(agent.builddirectory)'
                cleanDestinationFolder: false
            - script: |
                cat $(agent.builddirectory)/greatcode.txt
Here the build pipeline runs first, then the release pipeline runs, and these pipelines can be linked with multiple deployments.
Also, there is more information in a few useful blog posts, such as those on C# Corner and Adam the Automator. Thanks to the bloggers.
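To answer the original question more directly: build and release can also be combined in a single multi-stage YAML file, with one deployment job per environment. A minimal sketch (stage, environment, and artifact names are illustrative):

trigger:
  - master

stages:
  - stage: Build
    jobs:
      - job: Build
        pool:
          vmImage: 'ubuntu-latest'
        steps:
          - script: |
              echo 'compile application'
              echo 'build output' > $(Build.ArtifactStagingDirectory)/app.txt
            displayName: 'compile'
          # Publish the build output as a pipeline artifact for the deploy stages
          - publish: $(Build.ArtifactStagingDirectory)
            artifact: drop

  - stage: DeployDev
    dependsOn: Build
    jobs:
      - deployment: DeployDev
        environment: dev              # illustrative environment name
        pool:
          vmImage: 'ubuntu-latest'
        strategy:
          runOnce:
            deploy:
              steps:
                - download: current
                  artifact: drop
                - script: echo 'deploy to dev using $(Pipeline.Workspace)/drop'

  - stage: DeployProd
    dependsOn: DeployDev
    jobs:
      - deployment: DeployProd
        environment: prod             # illustrative environment name
        pool:
          vmImage: 'ubuntu-latest'
        strategy:
          runOnce:
            deploy:
              steps:
                - download: current
                  artifact: drop
                - script: echo 'deploy to prod using $(Pipeline.Workspace)/drop'

Each deployment job targets an Azure DevOps environment, so approvals and checks can still be applied per environment, much like a classic release pipeline.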

How to run a template(.yml) belonging to another repository from my current repository in azure devops?

With the below setup, when my pipeline corresponding to my repo runs, it runs the template (template.yml) file belonging to 'anotherRepo'. But when it checks out, it checks out my repo instead of 'anotherRepo'.
Is there any issue in my setup?
It looks like checkout: self does not have any impact and does not work.
My current Repo:
azurepipeline.yml file:
variables:
  acceptanceTestsRepoName: 'anotherRepo'
resources:
  repositories:
    - repository: 'anotherRepo'
      name: ProjectName/anotherRepo
      type: git
      ref: master
stages:
  - stage: acceptance_tests
    displayName: 'Run Acceptance Tests in Dev'
    jobs:
      - template: 'azure-pipelines-templates/template.yml@${{ variables.acceptanceTestsRepoName }}'
Repo:anotherRepo
template.yml
jobs:
  - job: AcceptanceTest
    displayName: Run Acceptance Test in Dev
    pool:
      vmImage: 'ubuntu-latest'
    steps:
      - checkout: self
      - task: UsePythonVersion@0
        inputs:
          versionSpec: '$(python.version)'
self always refers to the repo associated with the build pipeline. In your case, you need to check out anotherRepo manually:
# Azure Repos Git repository in the same organization
- checkout: git://anotherRepo
This assumes that anotherRepo is in the same Azure DevOps organization. If it's not, or it's stored somewhere else (GitHub, Bitbucket, etc.), you also need to add it as a resource to the pipeline definition. See Check out multiple repositories in your pipeline for details.
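For example, a repository hosted on GitHub could be declared as a resource and then checked out explicitly; a rough sketch, where the organization, repository, and service connection names are placeholders:

resources:
  repositories:
    - repository: anotherRepoAlias          # alias used by checkout steps
      type: github
      name: SomeOrg/anotherRepo             # placeholder owner/repo
      endpoint: github-service-connection   # placeholder GitHub service connection
      ref: main

steps:
  - checkout: self              # the pipeline's own repo
  - checkout: anotherRepoAlias  # the external repo declared above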
According to the documentation about checkout, self represents the repo where the initial Azure Pipelines YAML file was found.
So your pipeline checks out the repo where your azurepipeline.yml file is located.
If you want to check out anotherRepo, the checkout step in your template.yml should be - checkout: anotherRepo:
jobs:
  - job: AcceptanceTest
    displayName: Run Acceptance Test in Dev
    pool:
      vmImage: 'ubuntu-latest'
    steps:
      - checkout: anotherRepo
      - task: UsePythonVersion@0
        inputs:
          versionSpec: '$(python.version)'
You can also use the inline checkout syntax to directly check out another repo in the azure-pipelines.yml file:
stages:
  - stage: acceptance_tests
    displayName: 'Run Acceptance Tests in Dev'
    jobs:
      - job: checkout
        steps:
          - checkout: git://ProjectName/anotherRepo
Thanks all for your responses, especially @beatcracker.
By replacing the checkout step (from self), I was able to run it successfully.
No change was needed to my current repo.
The change below was made to the other repo:
jobs:
  - job: AcceptanceTest
    displayName: Run Acceptance Test in Dev
    pool:
      vmImage: 'ubuntu-latest'
    steps:
      - checkout: git://ProjectName/anotherRepo
      - task: UsePythonVersion@0
        inputs:
          versionSpec: '$(python.version)'

AzurePipeline error: Could not find any file matching the template file pattern

I have the following pipeline
variables:
  azureSubscription: ...
stages:
  - stage: Deploy
    displayName: Deploy stage
    dependsOn: Build
    condition: succeeded()
    jobs:
      - deployment: Deploy
        displayName: Deploy
        environment: 'development'
        strategy:
          runOnce:
            deploy:
              steps:
                - task: AzureResourceGroupDeployment@2
                  inputs:
                    action: 'Create Or Update Resource Group'
                    resourceGroupName: '...'
                    location: '...'
                    templateLocation: 'Linked artifact'
                    csmFile: '$(Pipeline.Workspace)/azure-deploy.json'
                    deploymentMode: 'Incremental'
The repo has the following files (at the root directory)
azure-pipelines.yaml
azure-deploy.json
and only a master branch.
I have tried:
azure-deploy.json
**azure-deploy.json
**/*azure-deploy.json
$(Build.SourcesDirectory)/azure-deploy.json
$(Pipeline.Workspace)/azure-deploy.json
$(System.DefaultWorkingDirectory)/azure-deploy.json
Having read:
Azure Pipeline Error: Could not find any file matching the template file pattern
VSTS Pipeline Deployment of ARM Error: Could not find any file matching the template file pattern
https://github.com/microsoft/azure-pipelines-tasks/issues/11520
to no avail. Any ideas?
Update: I have added a publish step as suggested by @ShaykiAbramczyk.
Now I get: Template file pattern matches a directory instead of a file: /home/vsts/work/1/azure-deploy.json
- stage: Build
  displayName: Build stage
  jobs:
    - job: Build
      displayName: Build
      pool:
        vmImage: $(vmImageName)
      steps:
        - task: PublishPipelineArtifact@1
          inputs:
            targetPath: '$(Pipeline.Workspace)'
            artifact: 'azure-deploy.json'
            publishLocation: 'pipeline'
"A deployment job doesn't automatically clone the source repo. You can checkout the source repo within your job with checkout: self."
Source: https://learn.microsoft.com/en-us/azure/devops/pipelines/process/deployment-jobs?view=azure-devops
Example from my setup: I put checkout: self as the first step, and now my repository is cloned before executing the Azure PowerShell task:
strategy:
  runOnce:
    deploy:
      steps:
        - checkout: self
        - task: AzurePowerShell@5
          displayName: Setup Network
          inputs:
It is a good strategy to go for multi-stage pipelines for what you are doing.
Build is for composing your artifacts.
Deployment jobs are for the publishing part.
So you are on the right track.
If you need sources during the deployment jobs, use the checkout step to fetch them (ref. the repo checkout docs).
Just my two cents.
Because you use a deployment job, the sources from the master branch weren't downloaded onto the agent.
You need to publish the files in the build stage and consume them in the deployment, with pipeline artifacts; a sketch follows below.
Or just run the AzureResourceGroupDeployment task in a regular job; then the .json file will be on the agent.
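A rough sketch of that publish-and-consume flow, assuming the ARM template sits at the repository root as azure-deploy.json (as in the question):

stages:
  - stage: Build
    jobs:
      - job: Build
        pool:
          vmImage: ubuntu-latest
        steps:
          # Publish only the ARM template file as a pipeline artifact
          - publish: '$(Build.SourcesDirectory)/azure-deploy.json'
            artifact: arm

  - stage: Deploy
    dependsOn: Build
    jobs:
      - deployment: Deploy
        environment: 'development'
        pool:
          vmImage: ubuntu-latest
        strategy:
          runOnce:
            deploy:
              steps:
                # Pipeline artifacts from the current run are downloaded automatically
                # into $(Pipeline.Workspace)/<artifact name> for deployment jobs
                - task: AzureResourceGroupDeployment@2
                  inputs:
                    azureSubscription: $(azureSubscription)  # as in the question's variables
                    action: 'Create Or Update Resource Group'
                    resourceGroupName: '...'
                    location: '...'
                    templateLocation: 'Linked artifact'
                    csmFile: '$(Pipeline.Workspace)/arm/azure-deploy.json'
                    deploymentMode: 'Incremental'

Publishing just the file (rather than a whole workspace directory) should also avoid the "Template file pattern matches a directory instead of a file" error mentioned in the update above.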
My objective was to create a blank, starter resource group in Azure and nothing else. And this was repeatedly giving me the error
Template file pattern matches a directory instead of a file: /home/vsts/work/1/s
and that's what got me here.
I finally sorted that out with Stringfello's
- checkout: self
step.
My full pipeline is as follows.
variables:
  - name: finalBuildArtifactName
    value: 'aspNetCoreDropFromYaml123'
  - name: BuildParameters.RestoreBuildProjects
    value: '**/*.csproj'
  - name: BuildParameters.TestProjects
    value: '**/*[Tt]ests/*.csproj'
  - name: buildConfiguration
    value: 'Release'
trigger:
  - master
name: $(date:yyyyMMdd)$(rev:.r)
stages:
  - stage: Build ## Basically prints out some vars and copies the template files.
    jobs:
      - job: buildWebApp
        displayName: Build Stage for somepurpose
        pool:
          vmImage: ubuntu-latest
        steps:
          - script: |
              echo build.artifactstagingdirectory and build.buildnumber are as follows.
              echo $(build.artifactstagingdirectory) $(build.buildnumber)
              echo $(projects)
              echo $(BuildConfiguration)
              echo Pipeline.Workspace is $(Pipeline.Workspace)
              echo The current branch is - $(Build.SourceBranchName)!!.
              echo $(finalBuildArtifactName)
              echo "This is the build pipe line. This produces the necessary artifacts for subsequent release pipeline."
            displayName: 'Command Line Script to write out some vars'
          - powershell: |
              # Write your PowerShell commands here.
              Write-Host "This is from power shell command task"
              Write-Host "This writes out the env vars"
              get-childitem -path env:*
            displayName: 'PowerShell script to write out env vars'
          # The following task is needed. This copies the arm template files.
          # Created these templates from Visual Studio 2019 as follows.
          # Right click your solution and Add -> New Project -> Azure Resource Group and gave the name Vivek_Aks_Rg
          - task: CopyFiles@2
            inputs:
              SourceFolder: 'iac/ArmTemplates/Vivek_Aks_Rg/'
              Contents: 'azuredeploy*.json'
              TargetFolder: '$(build.artifactStagingDirectory)/ArmTemplates'
          - task: PublishBuildArtifacts@1
            displayName: Publish Artifact
            condition: succeededOrFailed()
            inputs:
              PathtoPublish: '$(build.artifactstagingdirectory)'
              ArtifactName: '$(finalBuildArtifactName)'
              PublishLocation: 'Container'
  - stage: DeployToDev
    displayName: Deploy to Dev Env
    jobs:
      - deployment:
        pool:
          vmImage: ubuntu-latest
        environment: Dev
        strategy:
          runOnce:
            deploy:
              steps:
                - checkout: self
                - task: AzureResourceManagerTemplateDeployment@3
                  displayName: 'Create Azure App Service in a Given Resource Group'
                  inputs:
                    deploymentScope: 'Subscription'
                    azureResourceManagerConnection: 'Pay-As-You-Go(123YourSubscriptionId)'
                    subscriptionId: '123YourSubscriptionId'
                    action: 'Create Or Update Resource Group'
                    resourceGroupName: 'YourResourceGroupName'
                    location: 'Central India'
                    csmFile: '$(Pipeline.Workspace)/$(finalBuildArtifactName)/ArmTemplates/azuredeploy.json'
                    csmParametersFile: '$(Pipeline.Workspace)/$(finalBuildArtifactName)/ArmTemplates/azuredeploy.parameters.json'
                    deploymentMode: 'Incremental'
