When defining a release in the GUI I can have it triggered by an Azure Artifact. Is there a way to replicate this for pipelines in YAML?
I am building in one Azure DevOps tenant and pushing universal packages to another tenant, where the release definitions will live. I'm hoping these can be defined in YAML, but I don't see an obvious way to do this at the moment.
I see there is a design document that mentions packages, but no further details are provided:
https://github.com/microsoft/azure-pipelines-yaml/blob/master/design/pipeline-resources.md
Cheers
Edit:
Is there a way to trigger a pipeline using Azure Artifacts in YAML? - But I don't see an obvious way to do this at the moment?
Yes, you are right!
That is because the content in that document is speculative: designs and future features.
If you check the upper level of the design document you provided, there is a statement:
Azure Pipelines YAML - Design Docs
The design docs within this repo are created at different times during the development of Azure Pipelines, to support collaborative contributions to the design process. Design docs are for:
- features considered for implementation but never implemented
- already implemented features
- future ideas for features
The design docs in this repo may not represent the current state of an Azure Pipelines feature.
When you check the officially released documentation, YAML schema reference - Resources, it only lists:
resources:
  pipelines: [ pipeline ]
  repositories: [ repository ]
  containers: [ container ]
So, an Azure Artifacts source in YAML should be considered a future feature at this moment. Hopefully Microsoft can ship it sooner rather than later.
Hope this answer clears up your question.
Build completion triggers are not yet supported in YAML syntax. After you create your YAML build pipeline, you can use the classic editor to specify a build completion trigger.
Reference:
https://learn.microsoft.com/en-us/azure/devops/pipelines/build/triggers?view=azure-devops&tabs=yaml#build-completion-triggers
Though I would suggest you use one of the mechanisms below to trigger the release:
Resource triggers
Resource triggers will be helpful in scenarios like the following (see the sketch after this list):
- I would like to trigger my pipeline when an artifact is published by the 'Helm-CI' pipeline that ran on a releases/* branch.
- I would like to trigger my pipeline when an artifact is published and tested as part of the Helm-CI pipeline and tagged as 'Production'.
- I would like to trigger my pipeline when the 'TFS-Update' pipeline has completed its 'Ring2' stage so that I can run some diagnostics.
https://github.com/microsoft/azure-pipelines-yaml/blob/master/design/pipeline-triggers.md
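As a minimal sketch, following the syntax in the design doc linked above (the pipeline name, alias and branch filter are illustrative, taken from the first scenario), the consuming pipeline would declare the CI pipeline as a resource with a trigger:

resources:
  pipelines:
  - pipeline: helm-ci        # alias used inside this pipeline
    source: Helm-CI          # name of the pipeline that publishes the artifact
    trigger:
      branches:
        include:
        - releases/*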
Webhook triggers
Another way is to add a task at the end of the CI pipeline that hits a webhook URL, which can then trigger your CD pipeline.
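A rough sketch of such a final step, assuming a Linux agent with curl available; $(CdWebhookUrl) is a placeholder variable for whatever endpoint your CD system exposes, not a real URL:

steps:
# last CI step: POST build details to the CD system's webhook endpoint
- script: |
    curl -X POST \
         -H "Content-Type: application/json" \
         -d '{"sourcePipeline": "$(Build.DefinitionName)", "buildId": "$(Build.BuildId)"}' \
         "$(CdWebhookUrl)"
  displayName: 'Notify CD via webhook'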
Hope it helps.
You can use a multi-stage pipeline to achieve this.
One stage would include a task that pushes your artifacts to the feed. The next stage would contain the other jobs that you want executed after pushing the artifacts.
e.g.:
stages:
# Stage for preparing the artifact
- stage: prepare
  jobs:
  - job: prepare
    pool:
      vmImage: xx
    steps:
    - task: PublishBuildArtifacts@1
      inputs:
        pathToPublish: xx
        artifactName: xx
# Next stage in your pipeline
- stage: build
  dependsOn: prepare
  jobs:
  - job: build
    steps:
    - task: xx
Note that the second stage build dependsOn the stage prepare.
PS: Multi-stage pipelines are currently in preview. If you enable the preview feature, you will also get a nice visual representation of the stages.
Related
I have a requirement to give multiple teams access to a shared resource in Azure. I therefore want to limit how people can publish changes to the shared resource.
The idea is to limit the use of a service connection to a specific pipeline, as per this documentation. However, if the pipeline is stored in their own repo, the developer could change it, which would not give me enough control. I therefore found that this is possible using a template from a central repo. Using a shared repo would then allow me to have a service connection solely for the template?
The way I imagine doing the above: I grant project X a service connection for my BuildTemplates repo, which is basically just access to the repo and the ability to use the shared templates. Then in the BuildTemplates repo I can have a service connection for my template A.
Now the developer in project X creates the deployments and configurations for her pipeline with her own service connection, scoped to her resources. Then she inherits a template from the BuildTemplates repo and passes the relevant parameters to template A.
She cannot alter template pipeline A, and only template pipeline A can publish to the shared resource, because of the scoped service connection. I can therefore build the relevant guards for the shared Azure resource into template pipeline A, restricting how developer X can publish to my shared Azure resource.
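A minimal sketch of what project X's pipeline could look like under this scheme; the project/repo name, template file name, and parameter names are all hypothetical placeholders:

resources:
  repositories:
  - repository: templates                 # alias for the central BuildTemplates repo
    type: git
    name: SharedProject/BuildTemplates    # hypothetical project/repo

jobs:
- template: templateA.yml@templates       # hypothetical template file for "template A"
  parameters:
    artifactName: my-app                  # hypothetical parameters consumed by template A
    targetResource: shared-resource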
Does this make sense, and is it viable?
Can the pipeline part in A not be edited by a developer in X?
Will the service connection in A not propagate out, so that a developer in X could use it in an inappropriate way?
Update
The above solution does not seem to be viable since the pipeline template is executed in the source branch scope.
Proposed Solution
The benefits I see with the above suggestion do not seem achievable because of these issues. However, one can use pipeline triggers as a viable solution (see the sketch below). This, however, results in a new issue: when a pipeline is triggered by developer Y in Y's repository and it succeeds, a trigger fires in the MAIN repository, and the pipeline in MAIN fails, e.g. because the artifacts from Y introduced an issue. How does developer Y get notified about the issues in the MAIN pipeline?
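A minimal sketch of such a pipeline completion trigger in the MAIN repository's YAML, assuming developer Y's CI pipeline is named Y-CI (the name and alias are illustrative):

resources:
  pipelines:
  - pipeline: y-ci        # alias used inside the MAIN pipeline
    source: Y-CI          # name of developer Y's CI pipeline
    trigger: true         # run MAIN whenever Y-CI completes successfully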
Here is my solution: in the same Azure DevOps organization, we can create a project, then create a repo in it to hold the common pipeline template.
All the repos in the other projects can then access this pipeline template.
UserProject/UserRepo/azure-pipelines.yml
trigger:
  branches:
    include:
    - master
  paths:
    exclude:
    - nuget.config
    - README.md
    - azure-pipelines.yml
    - .gitignore
resources:
  repositories:
  - repository: devops-tools
    type: git
    name: PipelineTemplateProject/CommonPipeline
    ref: 'refs/heads/master'
jobs:
- template: template-pipeline.yml@devops-tools
PipelineTemplateProject/CommonPipeline/template-pipeline.yml
Since an inline pipeline script has a 5000-character limit, you can put your script (not only PowerShell, but other languages too) in PipelineTemplateProject/CommonPipeline/scripts/test.ps1.
# Common Pipeline Template
jobs:
- job: Test_Job
  pool:
    name: AgentPoolName
  steps:
  - script: |
      echo "$(Build.RequestedForEmail)"
      echo "$(Build.RequestedFor)"
      git config user.email "$(Build.RequestedForEmail)"
      git config user.name "$(Build.RequestedFor)"
      git config --global http.sslbackend schannel
      echo '------------------------------------'
      git clone -c http.extraheader="AUTHORIZATION: bearer $(System.AccessToken)" -b $(ToolsRepoBranch) --single-branch --depth=1 "https://PipelineTemplateProject/_git/CommonPipeline" DevOps_Tools
      echo '------------------------------------'
    displayName: 'Clone DevOps_Tools'
  - task: PowerShell@2
    displayName: 'Pipeline Debug'
    inputs:
      targetType: 'inline'
      script: 'Get-ChildItem -Path Env:\ | Format-List'
    condition: always()
  - task: PowerShell@2
    displayName: 'Run Powershell Scripts'
    inputs:
      targetType: filePath
      filePath: 'DevOps_Tools/scripts/test.ps1'
      arguments: "$(System.AccessToken)"
Notes:
- Organization Settings - Settings: disable "Limit job authorization scope to current project for release pipelines"
- Organization Settings - Settings: "Limit job authorization scope to current project for non-release pipelines"
Check the corresponding options in the project settings as well.
So normal users can only access their own repos and cannot access the DevOps project, and only the DevOps project owner can edit the template pipeline.
For the notification issue, I use an email extension task, "rvo.SendEmailTask.send-email-build-task.SendEmail@1".
I have a release pipeline that combines the artifacts of two build pipelines to create the full release. I need to be able to download the result of this task after it is done.
I run the Archive task to zip the results, but I don't know how to save the zip somewhere I can download it from using the Azure Pipelines agent.
Is there a task that can trigger that as a download or can I save it as an Artifact?
Thank you
Yeah, what you have to do is publish the artifact and then consume it in your release pipeline. Here is the documentation.
steps:
- publish: $(System.DefaultWorkingDirectory)/bin/WebApp
  artifact: WebApp
If you use YAML then you should use this:
steps:
- download: current
  artifact: WebApp
If you use classic releases you need to configure it in the designer.
Please use this task in your pipeline:
and then configure your release here:
Just realized that if I do this with my on-prem agent I can get the file from the agent's file structure.
Would be nice though if I could get the file straight from Azure instead.
Thank you.
I've tried a multi-repo trigger using Azure DevOps.
resources:
  repositories:
  - repository: ToolsRepo
    name: ToolsRepo
    type: git
    trigger:
      batch: true
      branches:
        include:
        - master
The trigger does not work. It works only if I remove the batch: true line. How can I make it work together with batching?
According to my test results, if you set batch: true on other repos (not the repo where the current YAML pipeline file is located), then the CI trigger from those other repos will not trigger the pipeline.
The CI trigger from the current repo can trigger the pipeline normally.
So I am afraid that batch: true is currently not supported in multi-repo triggers. In addition, the batch argument is not listed in the multi-repo trigger YAML syntax.
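As a sketch of a layout consistent with the findings above (batch removed from the other repo's resource trigger; batching kept only on the current repo's own CI trigger, which is supported):

resources:
  repositories:
  - repository: ToolsRepo
    name: ToolsRepo
    type: git
    trigger:            # no batch here; CI from ToolsRepo still triggers the pipeline
      branches:
        include:
        - master

trigger:                # CI trigger for the repo containing this YAML file
  batch: true
  branches:
    include:
    - master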
You could add a request for this feature on our UserVoice site, which is our main forum for product suggestions. The product team provides updates there if they review it. Thank you for helping us build a better Azure DevOps.
I have a stage that uses 6 deployment jobs that can either deploy to dev, staging or production depending on specific conditions.
For deploying to production, I'd like to add manual approvals. I am aware that deployment jobs can specify environments on which you can add manual approvals, but I'd like to approve the entire stage and not each individual deployment job. This way, I can approve the stage once and all 6 deployment jobs can run at once, instead of having to approve 6 times.
Is this possible? The documentation says it should be, but it doesn't say how. Besides, in the YAML schema for stages, it doesn't look like you can specify environments inside stages.
Currently there is no built-in feature to approve an entire stage in YAML. We can only define approvals and checks on environments. The documentation you mentioned also indicates that this is commonly used to control deployments to production environments.
However, there's already a suggestion ticket requesting this feature. You could vote and add your comments to the suggestion ticket so it might land in a future release.
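For reference, a minimal sketch of the environment-based approvals that are available today, assuming an environment named production with an approval check configured on it in the UI (each deployment job targeting it waits for that approval individually):

jobs:
- deployment: deploy_web
  environment: production    # approvals/checks are configured on this environment
  strategy:
    runOnce:
      deploy:
        steps:
        - script: echo "deploying..."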
Looks like the ManualValidation task could help you there. Using dependsOn will allow all other jobs to complete before final approval.
Example:
jobs:
- job: waitForValidation
  dependsOn: 'previousJobName'
  displayName: Wait for external validation
  pool: server
  timeoutInMinutes: 4320 # job times out in 3 days
  steps:
  - task: ManualValidation@0
    timeoutInMinutes: 1440 # task times out in 1 day
    inputs:
      notifyUsers: |
        test@test.com
        example@example.com
      instructions: 'Please validate the build configuration and resume'
      onTimeout: 'reject'
We have a collection of Azure Function Apps in C# .NET Core. Each app contains a small number of Azure Functions. All Function Apps reside in a single git repository.
We would like some of our environments to deploy automatically from source (e.g. bitBucket or gitHub).
How do we configure the project so that Azure knows which project in source relates to which created Function App?
I have searched around this problem for a number of days and have not seen any results that sit outside of "it just works", so we can only assume that we are missing something fundamental.
I'd recommend using Azure DevOps (formerly VSTS) to deploy to Azure. You use YAML to define a build pipeline that publishes an artifact from each of your function apps; the artifacts then get picked up by a release pipeline and deployed to Azure.
The basic building blocks of this are, firstly some YAML like this in your build pipeline for each project:
...
steps:
# a script task that lets you use any CLI available on the DevOps build agent, also uses a variable for the build config
- script: dotnet build MyFirstProjectWithinSolution\MyFirstProject.csproj --configuration $(buildConfiguration)
  displayName: 'dotnet build MyFirstProject'
# other steps removed, e.g. run and publish tests
- script: dotnet publish MyFirstProjectWithinSolution\MyFirstProject.csproj --configuration $(buildConfiguration) --output MyFirstArtifact
  displayName: 'dotnet publish MyFirstProject'
# a DevOps named task called CopyFiles (which is version 2 = @2), DevOps supplies lots of standard tasks you can make use of
- task: CopyFiles@2
  inputs:
    contents: 'MyFirstProjectWithinSolution\MyFirstArtifact\**'
    targetFolder: '$(Build.ArtifactStagingDirectory)'
# now publish the artifact which makes it available to the release pipeline, doing so into a sub folder allows multiple artifacts to be dealt with
- task: PublishBuildArtifacts@1
  displayName: 'publish MyFirstArtifact artifact'
  inputs:
    pathtoPublish: '$(Build.ArtifactStagingDirectory)\MyFirstProjectWithinSolution\MyFirstArtifact'
    artifactName: MyFirstArtifact
# now repeat the above for every project you need to deploy, each in their own artifact sub-folder
Next you create a release, which in its simplest form picks up the artifacts and does one or more deployments; here's a simple one which deploys two function app projects:
Within a deployment stage (right-hand side above), you can define your release process. Again, in its simplest form you can just deploy straight to production or to a slot, although until function slots are out of preview you could also spin up another function app and deploy and test there.
This screenshot shows a simple deployment which uses a standard Azure Function App deployment from Azure DevOps:
Within your deployment stage you can define which artifact is deployed and after running your build pipeline for the first time you'll get to see all the available artifacts that it created.
All or parts of the above can be automated from pushing a branch (or other triggers such as on a schedule). Notifications and "gates" can be added as well if you want manual intervention before release or between release stages.
There are also other ways to cut this up, e.g. with multiple build pipelines; it's basically completely flexible, but the above are the elements you can use to deploy one or more function apps at a time.
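If you'd rather keep the deployment in YAML as well (instead of the classic release shown in the screenshots), here's a minimal sketch of a deployment job using the standard Azure Function App task; the service connection name and function app name are placeholders, and the artifact name matches the build example above:

# deployment job sketch; names are illustrative
jobs:
- job: deploy_first_function_app
  steps:
  - download: current                      # fetch the MyFirstArtifact published by the build
    artifact: MyFirstArtifact
  - task: AzureFunctionApp@1
    inputs:
      azureSubscription: 'my-service-connection'   # placeholder Azure service connection
      appType: 'functionApp'
      appName: 'my-first-function-app'             # placeholder function app name
      package: '$(Pipeline.Workspace)/MyFirstArtifact'

You would repeat a job like this per function app, pointing each at its own artifact and app name.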