I need some help with running a selected environment from an Azure DevOps pipeline.
I use a codeway.yml file consisting of 3 environments: dev/qa/prod.
When I run the pipeline, all the environments run one after the other.
But my requirement is to run only the selected environment (the other environments should not run), with a choice parameter like in Jenkins.
I also need a "proceed to QA or Prod?" style input condition.
Could someone help with sample code for that?
Thanks in advance.
You can define variables in the pipeline and then use them in conditions to run only the stages you want. Note that you need a starting point; here I use an Info stage:
variables:
  isDevelop: true
  isQA: false
  isProd: false

stages:
- stage: Info
  displayName: 'Information for deploy'
  jobs:
  - job: info
    steps:
    - script: echo "Collecting deploy information"
- stage: Dev
  displayName: 'Deploy to Dev'
  dependsOn: Info
  condition: and(succeeded(), eq(variables['isDevelop'], true))
  jobs:
  - job: deployDev
    steps:
    - script: echo "Deploying to Dev"
- stage: QA
  displayName: 'Deploy to QA'
  dependsOn: Info
  condition: and(succeeded(), eq(variables['isQA'], true))
  jobs:
  - job: deployQA
    steps:
    - script: echo "Deploying to QA"
- stage: Prod
  displayName: 'Deploy to Prod'
  dependsOn: Info
  condition: and(succeeded(), eq(variables['isProd'], true))
  jobs:
  - job: deployProd
    steps:
    - script: echo "Deploying to Prod"
The variables can be computed, or you can take their values from the pipeline's variable definitions in Azure. In the code below they are computed from the source branch; this is just an example, and you will need to adapt it to your project:
variables:
  isDevelop: $[eq(variables['Build.SourceBranch'], 'refs/heads/develop')]
  isQA: $[eq(variables['Build.SourceBranch'], 'refs/heads/qa')]
  isProd: $[eq(variables['Build.SourceBranch'], 'refs/heads/prod')]
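Since you specifically asked for a Jenkins-style choice parameter: Azure Pipelines also supports runtime parameters, which show up as a dropdown when you queue a run manually. A minimal sketch, with illustrative parameter and stage names:

parameters:
- name: targetEnv
  displayName: 'Environment to deploy'
  type: string
  default: dev
  values:
  - dev
  - qa
  - prod

stages:
- stage: Deploy
  displayName: 'Deploy to ${{ parameters.targetEnv }}'
  jobs:
  - job: deploy
    steps:
    - script: echo "Deploying to ${{ parameters.targetEnv }}"

For the "proceed to QA or Prod?" input, one option is the ManualValidation task in an agentless job, which pauses the run until someone resumes or rejects it (the notify address below is a placeholder):

- stage: WaitForApproval
  jobs:
  - job: waitForValidation
    displayName: Wait for manual approval
    pool: server  # agentless job; ManualValidation only runs in server jobs
    timeoutInMinutes: 1440  # job waits up to a day
    steps:
    - task: ManualValidation@0
      inputs:
        notifyUsers: 'someone@example.com'  # placeholder address
        instructions: 'Proceed to QA or Prod?'
        onTimeout: 'reject'

Alternatively, approval checks configured on the QA and Prod environments (in the Environments UI, not in YAML) give you the same gate without an extra stage.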
Related
Currently, our inherited pipelines (not the best design, but it's what I've got for now) look something like this:
build -> provision + deploy (pipeline per env) -> acceptance test
I only want to be able to do one deployment at a time, so I'm looking at Exclusive Locks.
Unfortunately, these seem to work at the step level, not the pipeline level. So given that my provision + deploy pipeline contains multiple steps, how can I prevent step 1 from pipeline run 1 running, then step 1 from pipeline run 2 running, and so on?
I can't see a way to apply the exclusive lock at the pipeline level.
Our QA provision + deploy pipeline yml looks like this (before adding the locks):
pool:
  vmImage: "ubuntu-latest"

resources:
  pipelines:
  - pipeline: apiBuild
    source: "API/Build"
    trigger:
      enabled: true
      branches:
        include:
        - main

trigger: none
pr: none

variables:
- template: _pipeline/variables/allVariables.yml
  parameters:
    deployEnvironment: Qa

stages:
- stage: provisionInfrastructureTemplates
  displayName: Provision Templates infrastructure
  dependsOn: []
  jobs:
  - template: _pipeline/stages/jobs/provisionTemplates.yml
- stage: templatesAcceptanceTests
  displayName: Templates acceptance tests
  dependsOn: provisionInfrastructureTemplates
  jobs:
  - template: _pipeline/stages/jobs/acceptanceTestsTemplates.yml
- stage: provisionInfrastructureClients
  displayName: Provision Clients infrastructure
  dependsOn: []
  jobs:
  - template: _pipeline/stages/jobs/provisionClients.yml
- stage: clientsAcceptanceTests
  displayName: Clients acceptance tests
  dependsOn: provisionInfrastructureClients
  jobs:
  - template: _pipeline/stages/jobs/acceptanceTestsClients.yml
- stage: provisionInfrastructureReports
  displayName: Provision Reports infrastructure
  dependsOn: []
  jobs:
  - template: _pipeline/stages/jobs/provisionReports.yml
- stage: reportsAcceptanceTests
  displayName: Reports acceptance tests
  dependsOn: provisionInfrastructureReports
  jobs:
  - template: _pipeline/stages/jobs/acceptanceTestsReports.yml
- stage: upgradePreviewImage
  displayName: Upgrade preview image
  dependsOn: []
  jobs:
  - template: _pipeline/stages/jobs/upgradePreviewImage.yml
- stage: provisionInfrastructureDocuments
  displayName: Provision Documents infrastructure
  dependsOn: upgradePreviewImage
  jobs:
  - template: _pipeline/stages/jobs/provisionDocuments.yml
- stage: documentsAcceptanceTests
  displayName: Documents acceptance tests
  dependsOn: provisionInfrastructureDocuments
  jobs:
  - template: _pipeline/stages/jobs/acceptanceTestsDocuments.yml
- stage: provisionInfrastructureNotifications
  displayName: Provision Notifications infrastructure
  dependsOn: []
  jobs:
  - template: _pipeline/stages/jobs/provisionNotifications.yml
- stage: provisionEventGridSubscriptions
  displayName: Provision Event Grid Subscriptions
  dependsOn: [clientsAcceptanceTests, templatesAcceptanceTests, reportsAcceptanceTests, documentsAcceptanceTests, provisionInfrastructureNotifications]
  jobs:
  - template: _pipeline/stages/jobs/provisionEventGridSubscriptions.yml
- stage: workflowTests
  displayName: Workflow tests
  dependsOn: provisionEventGridSubscriptions
  jobs:
  - template: _pipeline/stages/jobs/workflowTests.yml
As an aside, I know our services ought to be independently deployable; they're not currently, and that's part of the tech debt we're dealing with. So, as things stand, they need deploying together.
When you put the jobs in separate stages, the exclusive lock on the environment is evaluated separately for each stage. It is not possible to apply the exclusive lock at the pipeline level.
To meet your requirement, you can define the required jobs in the same stage; the Exclusive Lock check then runs once, before that stage starts.
When the Exclusive Lock check is set on environments used in the same stage, all environments referenced in that stage are locked. Only when the stage finishes in the current pipeline run will the next pipeline run continue. (lockBehavior accepts sequential or runLatest; sequential queues the runs one after another.)
Here is an example:
stages:
- stage: Test
  displayName: Test
  lockBehavior: sequential
  jobs:
  - deployment: Test1
    displayName: 'Test1'
    environment: Test2
    strategy:
      runOnce:
        deploy:
          steps:
          - script: echo "Test"
  - deployment: Test2
    displayName: 'Test2'
    environment: Test2
    strategy:
      runOnce:
        deploy:
          steps:
          - script: echo "Test"
Link for the plugin: https://marketplace.visualstudio.com/items?itemName=touchify.vsts-changed-files
variables:
  isPullRequest: ${{ eq(variables['Build.Reason'], 'PullRequest') }}

jobs:
- job: "check"
  condition: and(succeeded(), eq(variables['Build.Reason'], 'PullRequest'))
  displayName: Check Files Change
  steps:
  - task: ChangedFiles@1
    name: CheckChanges
    inputs:
      rules: |
        [Y]
        X/**
      variable: 'HasChanged'
      isOutput: true
- job: "X"
  dependsOn: check
  condition: or(eq(dependencies.check.outputs['CheckChanges.Y'], 'true'), or(succeeded(), ne(variables.isPullRequest, 'true')))
Hello, I want to make the job run only if a change has been made to a particular folder, or if the run is not from Build Validation.
I need help with the exact conditions.
You can define "policies" on branches to trigger specific pipeline definition(s) when certain file paths are involved.
Here you can find the Azure DevOps documentation (please have a look at the "path filters" part).
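If you would rather express this purely as a job condition, something like the sketch below should be close. It assumes the check job publishes its result through the output variable reference CheckChanges.HasChanged as configured above (check the task's logs for the exact name it actually publishes), and it lets job X run even when the check job was skipped because the run was not a pull request:

- job: "X"
  dependsOn: check
  condition: >-
    and(
      in(dependencies.check.result, 'Succeeded', 'Skipped'),
      or(
        eq(dependencies.check.outputs['CheckChanges.HasChanged'], 'true'),
        ne(variables['Build.Reason'], 'PullRequest')
      )
    )

The in(dependencies.check.result, ...) part matters because a plain succeeded() would not let X run when check is skipped on non-PR runs.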
Even though I set System.Debug=True, I get no information other than "The job was skipped". Literally just these four words.
I created a YAML release pipeline on Azure DevOps which basically runs these jobs:
- job: build_release
- deployment: deploy_test
- deployment: deploy_stage
To test the behavior, I first ran only the first two jobs and deployed to TEST. Now I want to deploy to STAGE, but it seems the pipeline only works when I start from the beginning / create a new release. What I want to do now is deploy the already existing release from TEST to STAGE. When I try to do that by rerunning the pipeline, Azure just skips all steps. Why is this happening? How can I avoid this and rerun the pipeline? I did not set any conditions.
EDIT with additional info:
Structure of the pipeline
trigger:
- release/*

variables:
  ...

resources:
- repo: self

pool:
  vmImage: $(vmImageName)

stages:
- stage: build_release
  displayName: 'awesome build'
  condition: contains(variables['Build.SourceBranchName'], 'release/')
  jobs:
  - job: build_release
    steps:
      ...
- stage: deploy_test
  displayName: 'awesome test deploy'
  jobs:
  - deployment: deploy_test
    environment: 'test'
    strategy:
      runOnce:
        deploy:
          steps:
            ...
- stage: deploy_stage
  displayName: 'awesome stage deploy'
  jobs:
  - deployment: deploy_stage
    environment: 'stage'
    strategy:
      runOnce:
        deploy:
          steps:
            ...
I tried to trigger it in two different ways, both with the same outcome (everything was skipped):
A. I created a new release which was a copy of the previously deployed release.
B. I clicked on "Run pipeline".
The issue is caused by the condition condition: contains(variables['Build.SourceBranchName'], 'release/'), which you specified for the stage build_release.
When the trigger is set to - release/*, the variable variables['Build.SourceBranchName'] evaluates to the part of the branch name after the last /.
For example:
If you triggered your pipeline from the branch release/release1.0, the value of variables['Build.SourceBranchName'] will be release1.0 instead of release/release1.0. So the condition contains(variables['Build.SourceBranchName'], 'release/') will always be false, which causes the stage build_release to be skipped.
And since you did not specify dependencies for the stages deploy_test and deploy_stage, each stage depends on the previous stage by default. So these two stages were also skipped once build_release was skipped. That is why you saw all the steps being skipped.
Solution:
Use the variable Build.SourceBranch in the condition.
Change the condition as below (the YAML file in the release branches should also be changed accordingly):
- stage: build_release
  displayName: 'awesome build'
  condition: contains(variables['Build.SourceBranch'], 'release/')  # use Build.SourceBranch
Note: if you trigger your pipeline manually, please make sure you select a release branch to run from. Otherwise the pipeline will be triggered from the main branch by default.
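As a small, optional refinement (my suggestion, not part of the original answer): startsWith is stricter than contains here, since it only matches branches that actually live under release/:

- stage: build_release
  displayName: 'awesome build'
  condition: startsWith(variables['Build.SourceBranch'], 'refs/heads/release/')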
The issue here is that when the run of the pipeline was created, it sounds like the deploy stage was not selected. As such, at compilation time the stage was marked as skipped within that run.
As for what you are running: are the changes that should run deploy_stage on the main branch? The pipeline runs against the main branch by default unless otherwise specified.
Azure Pipelines supports multiple stages in YAML. A typical example would be something like:
trigger:
- master

pool:
  name: Default
  demands:
  - npm
  - msbuild
  - visualstudio

stages:
- stage: build
  jobs:
  - job: build_app
- stage: deploy
  jobs:
  - job: deploy_to_dev
I'm not used to working like this. Usually, I would run the pipeline to build my application and drop artifacts to a drop folder. The pipeline would be the same regardless of the environment that would later be targeted by the release.
Then I would choose to run a release to either Integration, UAT, or Production.
However, with a multi-stage pipeline we are mixing the build and the release together. So how would I release to a given environment? Do I have to duplicate this pipeline per environment?
You can use a template structure here. You create separate files for the different jobs and variables, and then execute the job templates with the suitable variable template file for each stage.
(Screenshots of the directory structure, the pipeline stages, and the environments omitted.)
Here's the sample pipeline:
trigger:
- master

variables:
- name: vmImage
  value: 'ubuntu-latest'

stages:
- stage: Build
  displayName: Build stage
  jobs:
  - job: BuildJob
    pool:
      vmImage: $(vmImage)
    steps:
    - template: Jobs/build.yml
- stage: NonProd
  displayName: Deploy non prod stage
  jobs:
  - deployment: DeploymentJob1
    pool:
      vmImage: $(vmImage)
    environment: non-prod
    variables:
    - template: Variables/non-prod.yml
    strategy:
      runOnce:
        deploy:
          steps:
          - template: Jobs/deploy.yml
- stage: Prod
  displayName: Deploy prod stage
  jobs:
  - deployment: DeploymentJob2
    pool:
      vmImage: $(vmImage)
    environment: prod
    variables:
    - template: Variables/prod.yml
    strategy:
      runOnce:
        deploy:
          steps:
          - template: Jobs/deploy.yml
Jobs/build.yml:

steps:
- script: echo I am building!
  displayName: 'Run Build'

Jobs/deploy.yml:

steps:
- script: echo I am deploying to $(Environment)!
  displayName: 'Run Deployment'

Variables/non-prod.yml:

variables:
- name: Environment
  value: non-prod

Variables/prod.yml:

variables:
- name: Environment
  value: prod
I have a YAML file that resembles the following:
stages:
- stage: A
  pool:
    vmImage: 'windows-2019'
  jobs:
  - job: a
    steps:
    - task: PowerShell@2
      inputs:
        targetType: 'inline'
        script: |
          # edits file "$(System.DefaultWorkingDirectory)/myfolder/myfile.json"
- stage: B
  dependsOn: A
  pool:
    vmImage: 'windows-2019'
  jobs:
  - job: b
    steps:
    - task: PowerShell@2
      inputs:
        targetType: 'inline'
        script: |
          # uses file "$(System.DefaultWorkingDirectory)/myfolder/myfile.json"
I have split my pipeline into two stages: A edits a file in a repository, and B works with the edited file.
My problem is that the files seem to get reset between stages. Is there any way of keeping the changes throughout the stages rather than resetting them?
I don't want to publish artifacts and so on, because in stage B (although not shown in the YAML above) I am running multiple PowerShell script files that contain hardcoded file paths, and it would be a mess to rewrite those paths to point at the artifacts directory before running the stage.
Based on my test, the cause of this issue is that the two stages run on different agent machines.
For example: Stage A ran on agent machine 'fv-az146', Stage B on agent machine 'fv-az151'.
You can check the agent information in the build log, under "Initialize job".
Is there any way of keeping the changes throughout the stages, rather than resetting them?
Since you don't want to publish artifacts, you could try using self-hosted agents to run the two stages.
You need to add demands to ensure that both stages run on the same self-hosted agent.
According to this doc:
The demands keyword is supported by private pools.
You cannot specify particular agent capabilities with Microsoft-hosted agents, so there is no way to ensure that two stages run on the same hosted agent.
Update:
Since the two stages run on the same agent, the checkout step in Stage B would overwrite the files edited in Stage A.
So you also need to add - checkout: none in Stage B.
Here is the updated YAML template:
stages:
- stage: A
  pool:
    name: Pool name
    demands:
    - Agent.Name -equals agentname1
  jobs:
  - job: a
    steps:
    - task: PowerShell@2
      ...
- stage: B
  dependsOn: A
  pool:
    name: Pool name
    demands:
    - Agent.Name -equals agentname1
  jobs:
  - job: b
    steps:
    - checkout: none
    - task: PowerShell@2
      ...
The overall workflow: Stage A edits the files and saves them to $(System.DefaultWorkingDirectory).
Stage B can then use the files in $(System.DefaultWorkingDirectory) directly.
The files in $(System.DefaultWorkingDirectory) keep the changes made in Stage A throughout both stages.
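For completeness: if publishing artifacts ever becomes acceptable, the standard pattern would be the pipeline artifact tasks, and downloading back into the same folder would even preserve the hardcoded paths. A minimal sketch, with illustrative folder and artifact names:

# In Stage A, after editing the file:
- task: PublishPipelineArtifact@1
  inputs:
    targetPath: '$(System.DefaultWorkingDirectory)/myfolder'
    artifact: 'editedFiles'

# In Stage B, instead of relying on the agent workspace:
- task: DownloadPipelineArtifact@2
  inputs:
    artifact: 'editedFiles'
    path: '$(System.DefaultWorkingDirectory)/myfolder'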