Azure DevOps pipeline condition with the changed-files plugin

Link to the plugin: https://marketplace.visualstudio.com/items?itemName=touchify.vsts-changed-files
variables:
  isPullRequest: ${{ eq(variables['Build.Reason'], 'PullRequest') }}

jobs:
- job: check
  condition: and(succeeded(), eq(variables['Build.Reason'], 'PullRequest'))
  displayName: Check Files Change
  steps:
  - task: ChangedFiles@1
    name: CheckChanges
    inputs:
      rules: |
        [Y]
        X/**
      variable: 'HasChanged'
      isOutput: true

- job: X
  dependsOn: check
  condition: or(eq(dependencies.check.outputs['CheckChanges.Y'], 'true'), or(succeeded(), ne(variables.isPullRequest, 'true')))
Hello, I want the job to run only if a change has been made to a particular folder, or if the run did not come from a Build Validation.
I need help with the exact conditions.

You can define branch "policies" that trigger specific pipeline definition(s) when certain file paths are involved.
The Azure DevOps documentation covers this; please have a look at the "path filters" part.
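For the YAML condition itself, here is a minimal sketch. It assumes the ChangedFiles task exposes the rule result as the CheckChanges.Y output variable, as configured in the snippet above; the intent is "run job X when the watched folder changed, or when the run is not a pull-request build validation":

- job: X
  dependsOn: check
  # runs if the folder changed, or if this is not a PR (build validation) run;
  # if the check job was skipped on non-PR runs, the output is empty and the
  # second clause still lets the job run
  condition: or(eq(dependencies.check.outputs['CheckChanges.Y'], 'true'), ne(variables['Build.Reason'], 'PullRequest'))
  steps:
  - script: echo Running X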


Path Filters and Controlling Execution Order of Triggered Pipelines

Let's say I have three pipelines that do the following:
Pipeline 1:
Task A
Pipeline 2:
Task B
Pipeline 3:
Task A
Task B
Now let's say my repo has two directories:
AStuff
BStuff
Is there any way to set path filters such that:
If AStuff has changes but BStuff doesn't, Pipeline 1 runs (and nothing else)
If BStuff has changes but AStuff doesn't, Pipeline 2 runs (and nothing else)
If both AStuff and BStuff have changes, Pipeline 3 runs (and nothing else)
The root of my problem is that I want Task A to run if AStuff has changes, and I want Task B to run if BStuff has changes. But if they both have changes, I would prefer Task A runs and then Task B runs, instead of ADO selecting whichever one it wants to run first. So, alternatively, maybe there's some way for Pipeline 2 to have a triggers/conditions that cause it to run when Pipeline 1 completes, but only if the changes that triggered Pipeline 1 affected the BStuff directory.
There is no built-in feature to achieve your third requirement.
The third pipeline also cannot run independently: if the third pipeline is triggered, the first two pipelines will be triggered as well, unless the third pipeline is not in the same environment as the first two.
The following pipeline definitions should meet your requirements.
Check_StuffA

trigger:
  paths:
    include:
    - AStuff/*
    exclude:
    - BStuff/*

pool:
  vmImage: ubuntu-latest

steps:
- script: echo Hello, world!
  displayName: 'Run a one-line script'
Check_StuffB

trigger:
  paths:
    include:
    - BStuff/*
    exclude:
    - AStuff/*

pool:
  vmImage: ubuntu-latest

steps:
- script: echo Hello, world!
  displayName: 'Run a one-line script'
Check_StuffA&B

trigger:
  paths:
    include:
    - AStuff/*
    - BStuff/*

pool:
  vmImage: ubuntu-latest

jobs:
- job: check
  displayName: Check changed files
  pool:
    vmImage: ubuntu-latest
  steps:
  - task: ChangedFiles@1
    name: CheckChanges
    inputs:
      rules: |
        [CodeChanged]
        AStuff/*
        [TestsChanged]
        BStuff/*

- job: build
  displayName: Build only when code changes
  dependsOn: check
  condition: and(eq(dependencies.check.outputs['CheckChanges.CodeChanged'], 'true'), eq(dependencies.check.outputs['CheckChanges.TestsChanged'], 'true'))
  steps:
  - task: PowerShell@2
    inputs:
      targetType: 'inline'
      script: |
        # Write your PowerShell commands here.
        Write-Host "Hello World"
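If the goal in the combined pipeline is to run Task A first and then Task B (rather than only building when both folders changed), a variation is sketched below. This is only a sketch: it reuses the CheckChanges outputs from the check job above and relies on dependsOn for ordering.

- job: task_a
  displayName: Task A (only when AStuff changed)
  dependsOn: check
  condition: eq(dependencies.check.outputs['CheckChanges.CodeChanged'], 'true')
  steps:
  - script: echo Running Task A

- job: task_b
  displayName: Task B (only when BStuff changed, after Task A)
  dependsOn:
  - check
  - task_a
  # allow task_b to run even if task_a was skipped, but not if it failed
  condition: and(in(dependencies.task_a.result, 'Succeeded', 'SucceededWithIssues', 'Skipped'), eq(dependencies.check.outputs['CheckChanges.TestsChanged'], 'true'))
  steps:
  - script: echo Running Task B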

Azure DevOps Pipelines: run a selected environment from the codeway.yml file

I need some help with running a selected environment from an Azure DevOps pipeline.
I use a codeway.yml file that contains three environments: dev/qa/pro.
When I run the pipeline, all the environments run one after the other.
My requirement is to run only the selected environment (the other environments should not run), with a choice parameter like in Jenkins.
I also need a "proceed to qa or pro" style condition/input.
Could someone help with sample code for that?
Thanks in advance.
You can define variables in the pipeline and then use them to decide which stages (and their tasks) execute. Note that you need a starting point; here I use the Info stage:
variables:
  isDevelop: true
  isQA: false
  isProd: false

stages:
- stage: Info
  displayName: 'Information for deploy'
  jobs:
  - whatever

- stage: Dev
  displayName: 'Deploy to Dev'
  dependsOn: Info
  condition: and(succeeded(), eq(variables['isDevelop'], true))
  jobs:
  - whatever

- stage: QA
  displayName: 'Deploy to QA'
  dependsOn: Info
  condition: and(succeeded(), eq(variables['isQA'], true))
  jobs:
  - whatever

- stage: Prod
  displayName: 'Deploy to Prod'
  dependsOn: Info
  condition: and(succeeded(), eq(variables['isProd'], true))
  jobs:
  - whatever
The variables can be computed, or you can take their values from the pipeline's variable definitions in Azure. In the code below they are computed from the source branch; consider this an example that you will need to adapt to your project:
variables:
  isDevelop: $[eq(variables['sourceBranch'], 'refs/heads/develop')]
  isQA: $[eq(variables['sourceBranch'], 'refs/heads/qa')]
  isProd: $[eq(variables['sourceBranch'], 'refs/heads/prod')]
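For the "choice parameter like in Jenkins" part of the question, runtime parameters can drive the same flags. The sketch below is only an illustration and assumes a parameter named targetEnv; adapt the names and values to your codeway.yml environments:

parameters:
- name: targetEnv
  displayName: 'Environment to deploy'
  type: string
  default: dev
  values:
  - dev
  - qa
  - pro

variables:
  isDevelop: ${{ eq(parameters.targetEnv, 'dev') }}
  isQA: ${{ eq(parameters.targetEnv, 'qa') }}
  isProd: ${{ eq(parameters.targetEnv, 'pro') }}

For the "proceed to qa or pro" confirmation, approvals configured on the QA/Prod environments are the usual way to pause a deployment stage until someone signs off.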

Azure YAML-Pipeline skips jobs and I have no idea why

Even though I set System.Debug=True I get no other information than "The job was skipped". Literally just these four words.
I created a YAML-Release pipeline on Azure Devops which basically runs the jobs:
job: build_release
jobs: deployment: deploy_test
jobs: deployment: deploy_stage
To test the behavior I first only ran the first two jobs and deployed to TEST. Now I want to deploy to STAGE but it seems that the pipeline is only working when I start from the beginning / create a new release. But what I want to do right now is to deploy the already existing release from TEST to STAGE. When I try to do that by rerunning the pipeline Azure just skips all steps. Why is this happening? How can I avoid this and rerun the pipeline? I did not set any conditions.
EDIT with additional info:
Structure of the pipeline
trigger:
- release/*

variables:
  ...

resources:
- repo: self

pool:
  vmImage: $(vmImageName)

stages:
- stage: build_release
  displayName: 'awesome build'
  condition: contains(variables['Build.SourceBranchName'], 'release/')
  jobs:
  - job: build_release
    steps:
    ...

- stage: deploy_test
  displayName: 'awesome test deploy'
  jobs:
  - deployment: deploy_test
    environment: 'test'
    strategy:
      runOnce:
        deploy:
          steps:
          ...

- stage: deploy_stage
  displayName: 'awesome stage deploy'
  jobs:
  - deployment: deploy_stage
    environment: 'stage'
    strategy:
      runOnce:
        deploy:
          steps:
          ...
I tried to trigger it in two different ways which had the same outcome (everything was skipped):
A. I created a new release which was a copy of the previously deployed release.
B. I clicked on run pipeline.
The issue is caused by the condition condition: contains(variables['Build.SourceBranchName'], 'release/') that you specified for the stage build_release.
When the trigger is set to - release/*, the variable variables['Build.SourceBranchName'] evaluates to the part of the branch name after the /.
For example:
If you triggered your pipeline from the branch release/release1.0, the value of variables['Build.SourceBranchName'] will be release1.0 instead of release/release1.0. So the condition contains(variables['Build.SourceBranchName'], 'release/') will always be false, which causes the stage build_release to be skipped.
And if you do not specify dependencies for the stages deploy_test and deploy_stage, each stage depends on the previous stage by default. So these two stages were also skipped, since the stage build_release was skipped. This is why you saw all the steps were skipped.
Solution:
Use the variable Build.SourceBranch in the condition.
Change the condition as below (the YAML file in the release branches should also be changed accordingly):
- stage: build_release
  displayName: 'awesome build'
  condition: contains(variables['Build.SourceBranch'], 'release/') # use Build.SourceBranch
Note: if you manually trigger your pipeline, make sure you select a release branch to run against. Otherwise the pipeline will be triggered from the main branch by default.
The issue here is that when the run of the pipeline was created, it sounds like the deploy stage was not selected. As such, at compilation time of the pipeline the stage was skipped, because it was defined as skipped within that run.
As for what you are running: are the changes that add deploy_stage in the main branch? The pipeline will run against the main branch by default unless otherwise specified.

Keep changes to defaultWorkingDirectory throughout stages within an Azure build pipeline

I have a YAML file that resembles the following:
stages:
- stage: A
  pool:
    vmImage: 'windows-2019'
  jobs:
  - job: a
    steps:
    - task: PowerShell@2
      inputs:
        targetType: 'inline'
        script: |
          # edits file "$(System.DefaultWorkingDirectory)/myfolder/myfile.json"

- stage: B
  dependsOn: A
  pool:
    vmImage: 'windows-2019'
  jobs:
  - job: b
    steps:
    - task: PowerShell@2
      inputs:
        targetType: 'inline'
        script: |
          # uses file "$(System.DefaultWorkingDirectory)/myfolder/myfile.json"
I have split my pipeline into two stages; A: edits a file in a repository and B: works with the edited file.
My problem is, the files seem to get reset between stages. Is there any way of keeping the changes throughout the stages, rather than resetting them?
I don't want to publish artifacts and the like, because in stage B (although not shown in the YAML above) I run multiple PowerShell script files that contain hardcoded file paths, and it would be a mess to rewrite those paths to point at the artifacts directory before running the stage.
Based on my test, the cause of this issue is that the two stages run on different agent machines.
For example: Stage A -> Agent machine name: 'fv-az146' , Stage B -> Agent machine name: 'fv-az151'
You could check the agent information in Build log -> Initialize job.
Is there any way of keeping the changes throughout the stages, rather than resetting them?
Since you don't want to publish artifacts, you could try using self-hosted agents to run the two stages.
You need to add demands to ensure that both stages run on the same self-hosted agent.
According to this doc:
The demands keyword is supported by private pools.
You cannot specify particular agent capabilities for Microsoft-hosted agents, so there is no way to guarantee that two stages run on the same hosted agent.
Update:
Since the two stages now run on the same agent, the checkout step in Stage B would overwrite the files edited in Stage A.
So you also need to add - checkout: none in Stage B.
Here is the updated YAML template:

stages:
- stage: A
  pool:
    name: Pool name
    demands:
    - Agent.Name -equals agentname1
  jobs:
  - job: a
    steps:
    - task: PowerShell@2
      ...

- stage: B
  dependsOn: A
  pool:
    name: Pool name
    demands:
    - Agent.Name -equals agentname1
  jobs:
  - job: b
    steps:
    - checkout: none
    - task: PowerShell@2
      ...
The overall workflow: Stage A edits the files and saves them to $(System.DefaultWorkingDirectory).
Stage B can then use the files in $(System.DefaultWorkingDirectory) directly.
The files in $(System.DefaultWorkingDirectory) keep the changes across Stage A and Stage B.
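As an illustration only, with hypothetical inline scripts that reuse the myfile.json path from the question, the two PowerShell steps could look like this once both stages share the same agent and Stage B skips checkout:

# Stage A, job a
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      # edit the file in place on the agent
      $path = "$(System.DefaultWorkingDirectory)/myfolder/myfile.json"
      $json = Get-Content $path -Raw | ConvertFrom-Json
      $json.version = "1.2.3"   # hypothetical edit
      $json | ConvertTo-Json -Depth 10 | Set-Content $path

# Stage B, job b (after - checkout: none)
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      # the edited file is still in the same working directory on the same agent
      Get-Content "$(System.DefaultWorkingDirectory)/myfolder/myfile.json"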

What is auto-creating new Build pipeline?

For a given project, I have four build pipelines. Each pipeline's trigger has CI enabled, and each has a branch filter for a single branch - master, Staging, QA, development. These work successfully; any completed pull request to one of those four branches successfully kicks off a build.
This morning, I created a new branch based off the "development" branch. It was a one-liner change, so I decided to make the change online in the browser using the DevOps editor. I saved the change.
Immediately after saving the change online, I saw a new build pipeline was created (I received an email saying my build failed). What caused the new build pipeline to be created?
The new build pipeline looks to be auto-created, it is pure YAML:
pool:
  vmImage: 'Ubuntu 16.04'

variables:
  buildConfiguration: 'Release'
  BuildPlatform: 'Any CPU'
  Parameters.solution: '*.sln'
  Parameters.ArtifactName: 'xxxxxx'

steps:
- task: NuGetToolInstaller@0
  displayName: 'Use NuGet 4.4.1'
  inputs:
    versionSpec: 4.4.1

- task: NuGetCommand@2
  displayName: 'NuGet restore'
  inputs:
    restoreSolution: '$(Parameters.solution)'

- task: VSBuild@1
  displayName: 'Build solution'
  inputs:
    solution: '$(Parameters.solution)'
    platform: '$(BuildPlatform)'
    configuration: '$(BuildConfiguration)'

- task: PublishSymbols@2
  displayName: 'Publish symbols path'
  inputs:
    SearchPattern: '**\bin\**\*.pdb'
    PublishSymbols: false
  continueOnError: true

- task: PublishBuildArtifacts@1
  displayName: 'Publish Artifact'
  inputs:
    PathtoPublish: '$(build.artifactstagingdirectory)'
    ArtifactName: '$(Parameters.ArtifactName)'
In the project there were no pull requests created, and in my private branch I can see my change.
The email I received had this in the title (actual names removed):
[Build failed] MyProjectName CI - MyProjectName:MyBranchName - MyProejctName - bf9524f9
========
EDIT
I just found there is an azure-pipelines.yml file in the root folder of the branch. Its contents match the above. Is this competing with the designer pipelines?
YAML pipelines are better at scale: you can manage them in a central place, easily make mass edits, and/or make them depend on each other for more control. The visual designer is only good when you have a couple of pipelines or you are just getting started with pipelines.
YAML pipelines do not necessarily have to be in an azure-pipelines.yml file. I store them in a separate repo :)
Updated comment:
Also, the fact that there are no triggers in your YAML means that every new branch will queue builds. Read about 'trigger' in the YAML schema for more understanding of this.
You can use something like the below:
trigger:
  branches:
    include:
    - master
    - develop
    exclude:
    - no-build-branch
Given that none is defined, it behaves as below:
trigger:
  branches:
    include:
    - '*'
These two are the same.
The designer picks up azure-pipelines.yml when you click Edit; this is the default file name that is detected automatically to create a pipeline.
E.g. if you add the pipeline source to azure-pipelines.yml and commit/push, it will automatically create a pipeline named 'Repo_Name CI' and queue a build as well.
Any new changes will then run on their own merits as per the YAML definition.
You can always use different names and add as many pipelines as you want.
