Howto: Allow run-time selection of a build artifact in Azure DevOps YAML pipelines

I'm able to include the latest version of the artifact published by another pipeline (AppCIPipline) into my YAML pipeline using conditional insertion:
name: '$(Build.SourceBranchName)-$(date:yyyyMMdd)$(rev:.r)'
resources:
  pipelines:
  - pipeline: AppBuildToDeploy # Required when source == Specific
    source: App_Master_CI
    branch: master
    # buildToDeploy is a pipeline variable
    ${{ if ne(variables['buildToDeploy'], '') }}:
      version: $(buildToDeploy) # let's leave it blank from the pipeline
    project: NewHorizon
trigger: none
pool: 'Matrix' # Self-hosted agent on a Windows server
steps:
- download: 'AppBuildToDeploy'
  patterns: '*_BuildScripts.zip'
  displayName: 'Download Specified Artifacts'
I'm getting the following error: "A template expression is not allowed in this context".
Is there a way to get the version number from the user at run time and use that version if supplied, otherwise defaulting to the latest?

This user experience is not supported yet; for now, we have to hard-code the version.
Someone has posted this feature request on Developer Community before. You can vote for this open issue and follow it to track the request there. If it gets enough votes, the team will consider it seriously.
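For reference, a minimal sketch of that hard-coded workaround, pinning the resource to a specific run (the run number below is hypothetical):
resources:
  pipelines:
  - pipeline: AppBuildToDeploy
    source: App_Master_CI
    branch: master
    version: '20200401.3' # hypothetical run number to deploy; updated by hand for each deployment
    project: NewHorizon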

Related

NuKeeper - how to add reviewers for pull request?

I am currently using NuKeeper in my Azure DevOps pipeline to automatically update my packages. It works fine and automatically creates a pull request when the pipeline is run. However, the pull requests do not have any required/optional reviewers assigned. I would like to automatically assign optional reviewers with specific names to the PR.
I have looked into the Nukeeper configurations at https://nukeeper.com/basics/configuration/ but could not find any options to achieve the above.
Below is my YAML content:
trigger: none

schedules:
- cron: "0 3 * * 0"
  displayName: Weekly Sunday update
  branches:
    include:
    - master
  always: true

pool: CICDBuildPool-VS2019

steps:
- task: NuKeeper@0
  displayName: NuKeeper Updates
  inputs:
    arguments: --change Minor --branchnameprefix "NewUpdates/" --consolidate
Does anyone know if it is feasible to automatically assign specific optional reviewers via the Nukeeper pipeline?
This can be done through the branch policy "Automatically included reviewers" option.
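If you would rather script that policy than click through the UI, a hedged sketch using the Azure DevOps CLI extension might look like the step below. This assumes the az devops extension is installed and signed in on the agent; the repositoryId variable and reviewer address are placeholders, and --blocking false is what makes the reviewers optional rather than required:
- script: >
    az repos policy required-reviewer create
    --blocking false
    --enabled true
    --branch master
    --message "Auto-added optional reviewers"
    --repository-id $(repositoryId)
    --required-reviewer-ids reviewer@example.com
  displayName: Add optional reviewer branch policy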

Azure DevOps - Pipeline should not trigger build for PR

I have an Azure DevOps Pipeline for a Git repository. I currently have a script to validate the PR comments in the Azure Pipeline.
When the code is merged into the main branch I want to trigger a build. I am not sure how to achieve this with an Azure DevOps pipeline.
# Trigger for Development
trigger:
  branches:
    include:
    - development
    - master

# Trigger checks for PR
pr:
  branches:
    include:
    - development
    - master
    - feature
    - main
  paths:
    exclude:
    - README/*
When the code is merged into the main branch I want to trigger the build.
If you want to verify the comments after the code is merged into the main branch, we need to trigger the build after the PR is completed instead of when the PR is created.
So PR triggers cannot meet the requirement in this case.
To resolve this issue, we could enable CI triggers for the main branch and add the condition eq(variables['Commitcomment'], 'Merge pull request') to the task that runs the script validating the PR comments.
With this condition, the pipeline will execute the task only when Commitcomment is Merge pull request, which filters out modifications not made through a PR.
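A minimal sketch of that trigger setup (the branch name is illustrative):
# run on merges to main; rely on the commit-message condition instead of PR triggers
trigger:
  branches:
    include:
    - main
pr: none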
To get the value of the variable Commitcomment, we can check the commit message via the predefined variable Build.SourceVersionMessage.
If the commit comes from a PR, it is given a default comment starting with Merge pull request xxx, so we can add a bash/PowerShell script to grab the first few fields.
Then use a logging command to set the variable Commitcomment to those first few fields:
- task: CmdLine@2
  displayName: get the first few fields
  inputs:
    script: |
      echo $(Build.SourceVersionMessage)
      set TempVar=$(Build.SourceVersionMessage)
      rem take the first 18 characters, the length of "Merge pull request"
      set Commitcomment=%TempVar:~0,18%
      echo %Commitcomment%
      rem make the value available to later tasks as $(Commitcomment)
      echo ##vso[task.setvariable variable=Commitcomment]%Commitcomment%
Reference link: Is there a short 7-digit version of $(SourceVersion) in Azure Devops?
Then add this variable as the condition and(succeeded(), eq(variables['Commitcomment'], 'Merge pull request')) on the task that verifies the PR comments:
- task: CmdLine@2
  displayName: script to validate the PR comments
  condition: and(succeeded(), eq(variables['Commitcomment'], 'Merge pull request'))
  inputs:
    script: |
      echo To validate the PR comments
In this case, if the commit does not come from a PR, the PR comment validation task is skipped.
If you just want to launch a build when the merge is done (pull request validated) into a specific branch, your code is good.
If you want to run a validation build, it is currently not integrated into the YAML pipeline configuration (https://learn.microsoft.com/en-us/azure/devops/pipelines/yaml-schema?view=azure-devops&tabs=schema%2Cparameter-schema#pr-trigger)
To do this, it must be done via the graphical interface:
Project Settings -> Repositories -> Select your repo -> Policies -> Branch Policies -> Select your branch -> Build Validation -> + -> add build information
(https://learn.microsoft.com/en-us/azure/devops/repos/git/branch-policies?view=azure-devops#build-validation)
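If you prefer to script that same build validation policy, a hedged sketch using the Azure DevOps CLI extension follows; every ID and name below is a placeholder, and the step assumes the az devops extension is installed and authenticated:
# creates a blocking build validation policy on main; 12 is a placeholder build definition ID
- script: >
    az repos policy build create
    --blocking true
    --branch main
    --build-definition-id 12
    --display-name "PR validation build"
    --enabled true
    --manual-queue-only false
    --queue-on-source-update-only true
    --repository-id $(repositoryId)
    --valid-duration 720
  displayName: Add build validation policy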

Pipelines using same yaml but deploying to different subscriptions

I have a YAML-based Azure DevOps pipeline which currently has a service connection and subscription hard-coded.
I now want to deploy to either dev or live, which are different subscriptions.
I also want to control who can execute these pipelines. This means I need two pipelines so I can manage their security independently.
I don't want the subscription and service connection to be parameters of the pipeline that the user must remember to enter correctly.
My current solution:
I'm using YAML templates which contain most of the configuration.
I have a top-level yaml file for each environment (dev.yml and live.yml).
These pass environment-specific values, i.e. the subscription, to the template.
I have two pipelines: the dev pipeline maps to dev.yml and the live pipeline maps to live.yml.
This approach means that for every combination of config I might have in the future (subscription, service connection, etc.) I need a new top-level yml file.
This feels messy - is there a better solution? What am I missing?
Pipelines using same yaml but deploying to different subscriptions
You could try to add the different subscriptions to different variable groups, then reference the variable group in a template:
Variable templates:
# variablesForDev.yml
variables:
- group: variable-group-Dev

# variablesForLive.yml
variables:
- group: variable-group-Live
dev.yml:
stages:
- stage: Dev
  variables:
  - template: variablesForDev.yml
  jobs:
  - job: Dev
    steps:
    - script: echo $(TestVarInDevGroup)
live.yml:
stages:
- stage: live
  variables:
  - template: variablesForLive.yml
  jobs:
  - job: live
    steps:
    - script: echo $(TestVarInLiveGroup)
You could check the document Add & use variable groups for more details.
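To connect this back to deploying to different subscriptions, here is a hedged sketch of how a group variable could supply the service connection to a deployment task; the variable name azureServiceConnection is hypothetical, and the connection it names must still be authorized for the pipeline:
steps:
# azureServiceConnection is assumed to be defined in the environment's variable group
- task: AzureCLI@2
  inputs:
    azureSubscription: $(azureServiceConnection)
    scriptType: bash
    scriptLocation: inlineScript
    inlineScript: az account show # prints the subscription the connection targets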
Update:
This approach means that for every combination of config I might have
in the future (subscription, service connection etc) I need a new
toplevel yml file.
Since you do not want to create a new top-level yml file for every combination of config you might have in the future, you could create a variable group for those values (subscription, service connection, etc.) instead of a new top-level yml file:
dev.yml:
variables:
- group: SubscriptionsForDev

stages:
- stage: Dev
  jobs:
  - job: Dev
    steps:
    - script: echo $(TestVarInDevGroup)
In this case, we do not need to create a new top-level yml file when we add a new pipeline; we just need to add a new variable group.
Besides, we could also set security on each variable group.
Update 2:
I wanted to know if it was possible to avoid the 2 top level files.
If you want to avoid the two top-level files, then the question comes back to my original answer: combine both environments in one pipeline and add a condition to each stage:
stages:
- stage: Dev
  condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/dev'))
  variables:
  - group: variable-group-Dev
  jobs:
  - job: Dev
    steps:
    - script: echo $(TestVarInDevGroup)
- stage: live
  condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/live'))
  variables:
  - group: variable-group-Live
  jobs:
  - job: live
    steps:
    - script: echo $(TestVarInLiveGroup)
But if your two yaml files share nothing that can be used as a condition, then you have to keep them separate.
Hope this helps.

Resources > Repository triggers not firing and default triggers not disabled in Azure DevOps yaml pipeline

I have triggers set up in our azure-pipelines.yml as shown below:
the scriptsconn represents a connection to the default/self repo that contains the deployment pipeline yaml.
the serviceconn represents a microservice repo we are building and deploying using the template and publish tasks.
We have multiple microservices with similar build pipelines, so this approach is an attempt to lessen the amount of work needed to update these steps.
Right now the issue we're running into is twofold:
no matter what branch we specify in the scriptsconn resources -> repositories section, the build triggers for every commit to every branch in the repo.
no matter how we configure the trigger for serviceconn, we cannot get the build to trigger for any commit, PR created, or PR merged.
According to the link below this configuration should be pretty straightforward. Can someone point out what mistake we're making?
https://github.com/microsoft/azure-pipelines-yaml/blob/master/design/pipeline-triggers.md#repositories
resources:
  repositories:
  - repository: scriptsconn
    type: bitbucket
    endpoint: BitbucketAzurePipelines
    name: $(scripts.name)
    ref: $(scripts.branch)
    trigger:
    - develop
  - repository: serviceconn
    type: bitbucket
    endpoint: BitbucketAzurePipelines
    name: $(service.name)
    ref: $(service.branch)
    trigger:
    - develop
    pr:
      branches:
      - develop

variables:
- name: service.path
  value: $(Agent.BuildDirectory)/s/$(service.name)
- name: scripts.path
  value: $(Agent.BuildDirectory)/s/$(scripts.name)
- name: scripts.output
  value: $(scripts.path)/$(release.folder)/$(release.filename)
- group: DeploymentScriptVariables.Dev

stages:
- stage: Build
  displayName: Build and push an image
  jobs:
  - job: Build
    displayName: Build
    pool:
      name: 'Self Hosted 1804'
    steps:
    - checkout: scriptsconn
    - checkout: serviceconn
The document you linked to is actually a design document, so it's possible (even likely) that not everything on that page is implemented. In the design document I also see this line:
However, triggers are not enabled on repository resource today. So, we will keep the current behavior and in the next version of YAML we will enable the triggers by default.
The current docs on the YAML schema seem to indicate that triggers are not yet supported on repository resources.
Just as an FYI, you can see the currently supported YAML schema at this URL:
https://dev.azure.com/{organization}/_apis/distributedtask/yamlschema?api-version=5.1
I am not 100% sure what you are after template-wise. As a general suggestion, if you are going with the reusable content template workflow, you could trigger from an azure-pipelines.yml file in each of your microservice repos, consuming the reusable steps from your template, as sketched below. Hope that helps!
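A hedged sketch of that layout, where the repository and template file names are hypothetical:
# azure-pipelines.yml inside one microservice repo
trigger:
- develop

resources:
  repositories:
  - repository: templates
    type: bitbucket
    endpoint: BitbucketAzurePipelines
    name: myteam/deployment-scripts # hypothetical shared repo holding the step templates

steps:
# build-and-publish.yml is a hypothetical template containing the reusable steps
- template: build-and-publish.yml@templates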

Azure Pipeline to trigger Pipeline using YAML

I am attempting to trigger an Azure pipeline when another pipeline has been completed, using YAML. There's documentation indicating that you can add a pipeline resource with:
resources: # types: pipelines | builds | repositories | containers | packages
  pipelines:
  - pipeline: string # identifier for the pipeline resource
    connection: string # service connection for pipelines from other Azure DevOps organizations
    project: string # project for the source; optional for current project
    source: string # source definition of the pipeline
    version: string # the pipeline run number to pick the artifact from; defaults to the latest run successful across all stages
    branch: string # branch to pick the artifact from; optional, defaults to the master branch
    tags: string # picks the artifact from the pipeline run with the given tag; optional, defaults to no tags
However, I've been unable to figure out what the "source" means. For example, I have a pipeline called myproject.myprogram:
resources:
  pipelines:
  - pipeline: myproject.myprogram
    source: XXXXXXXX
Moreover, it's unclear how you'd build a trigger based on this.
I know that this can be done from the web GUI, but it should be possible to do this from YAML.
To trigger one pipeline from another, the official Azure docs suggest the solution below, i.e. use pipeline triggers:
resources:
  pipelines:
  - pipeline: RELEASE_PIPELINE # any arbitrary name
    source: PIPELINE_NAME # name of the pipeline shown on the Azure UI portal
    trigger:
      branches:
        include:
        - dummy_branch # name of the branch on which the pipeline needs to trigger
But what actually happens is that it triggers two runs. For example, suppose we have two pipelines A and B, and we want to trigger B when A finishes. In this scenario B runs twice: once when you push a commit (in parallel with A), and a second time after A finishes.
To avoid this double run, follow the solution below:
trigger: none # disable the CI trigger

resources:
  pipelines:
  - pipeline: RELEASE_PIPELINE # any arbitrary name
    source: PIPELINE_NAME # name of the pipeline shown on the Azure UI portal
    trigger:
      branches:
        include:
        - dummy_branch # name of the branch on which the pipeline needs to trigger
By adding trigger: none, the second pipeline will not run on the initial commit and will only start when the first pipeline finishes its job.
Hope it will help.
Microsoft documentation says that YAML is the preferred approach. So, instead of going for the build-trigger option, let's understand the slightly confusing YAML trigger. The following tags work from the original question, now with somewhat easier documentation:
resources:
  pipelines:
  - pipeline: aUniqueNameHereForLocalReferenceCanBeAnything
    project: projectNameNOTtheGUID
    source: nameOfTheOtherPipelineNotTheDefinitionId
    trigger:
      branches:
        include:
        - master
        - AnyOtherBranch
The documentation from Microsoft is confusing and the IDs are numerous. Sometimes they want the project GUID, sometimes the project name; sometimes the pipeline name, sometimes the pipeline definition ID. Yet they use the same key names (project and pipeline) for both, and the documentation does not make it easy to guess which one is wanted, so the best way is trial and error.
To avoid confusion elsewhere, here is an example of another place in the pipeline where you refer to the same things with different values. In the DownloadPipelineArtifact task, you need to use the project GUID and the numeric pipeline definition ID, as shown below:
- task: DownloadPipelineArtifact@2
  inputs:
    source: specific # a literal constant value, not the pipeline name
    project: projectGUIDnotTheProjectName
    pipeline: numericDefinitionIdOfPipelineNotPipelineNameOrUniqueRef
    runVersion: 'latest'
Note how the same keys are used in a different way in the two places, even though both refer to a pipeline, and in my case the exact same pipeline. That could create confusion, so I give it here for clarification before you stumble into the next issue.
The resources section is not for the build completion trigger; according to the docs, the build completion trigger is not yet supported in YAML syntax.
After you create the YAML pipeline, you can go to the classic editor (click on settings or variables) and create the trigger there.
Edit: in the classic editor, click on "Triggers" and create the build completion trigger from there (the original answer showed screenshots of these steps).
Second edit: Microsoft has since added this feature to YAML as well :) see here:
# this is being defined in app-ci pipeline
resources:
  pipelines:
  - pipeline: security-lib
    source: security-lib-ci
    trigger:
      branches:
      - releases/*
      - master
In the above example, we have two pipelines - app-ci and security-lib-ci. We want the app-ci pipeline to run automatically every time a new version of the security library is built in master or a release branch.
If you're not publishing an artifact from the triggering pipeline, it won't trigger the triggered pipeline.
Also, if the defaultBranch for manual and scheduled builds in the triggered pipeline is not the same as your working branch, the triggered pipeline won't kick in at the end of the triggering pipeline execution.
I have created a minimum viable product for a pipeline trigger, and I explain better the two issues I just mentioned in this answer.
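For the artifact gotcha, a minimal sketch of a publish step to add to the triggering pipeline (the artifact name is illustrative):
steps:
# publish something so downstream pipeline-completion triggers can fire
- publish: $(Build.ArtifactStagingDirectory) # shorthand for the PublishPipelineArtifact task
  artifact: drop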
