Run only one build pipeline task on a schedule - Azure

Is it possible to run only one build pipeline task on a schedule, rather than the whole pipeline? I have a task in a build pipeline that generates a report about the pipeline, and I want that task to run once a month.

Yes, it is possible, in several ways:
You can insert the task conditionally at compile time by prefixing it with a template expression:
${{ if eq(variables['isBuild'], true) }}:
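For instance, here is a minimal sketch of how that compile-time expression might wrap a step (the echo step is a placeholder, not from the original pipeline):

steps:
- ${{ if eq(variables['isBuild'], true) }}:
  - script: echo "This step is only included in the compiled pipeline when isBuild is true"
    displayName: 'Conditionally inserted step'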
You can also configure runtime conditions on the tasks you want to control, for example using the variable isBuild, defined previously:
- task: PublishBuildArtifacts@1
  displayName: 'Publish artifact: drop'
  inputs:
    PathtoPublish: 'whatever'
    ArtifactName: 'drop'
    publishLocation: 'Container'
  condition: eq(variables['isBuild'], true)
You can also group your tasks into stages and set conditions at the stage level, again using the variable isBuild, defined previously:
- stage: Build
  displayName: 'Build'
  condition: eq(variables['isBuild'], true)
In every example, the task or stage is skipped whenever isBuild is not equal to true.
You can find more info at https://github.com/MicrosoftDocs/azure-devops-docs/blob/main/docs/pipelines/process/conditions.md
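For completeness, here is a sketch of how such an isBuild variable might be defined; the name and default value are illustrative assumptions:

variables:
  isBuild: true # set to false, or override at queue time, to skip the gated task or stage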
Also, if you want to schedule the run to execute only on the 1st day of every month at 07:00:
trigger:
- master # This is the trigger for other stages. It is not needed for the scheduled stage.

schedules:
- cron: '0 7 1 * *'
  displayName: 'Deploy every 1st day of every month at 07:00Z'
  branches:
    include:
    - main
  always: true
Then, to ensure the stage runs only as part of the schedule, use the condition:
- stage: 'Test'
  displayName: 'Deploy to the test environment'
  dependsOn: Dev
  condition: eq(variables['Build.Reason'], 'Schedule')
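Putting the pieces together for the original question, here is a minimal sketch of a single pipeline in which only the report task is gated on the monthly schedule; the generate-report.sh script is a hypothetical placeholder:

schedules:
- cron: '0 7 1 * *'
  displayName: 'Monthly report'
  branches:
    include:
    - main
  always: true

steps:
- script: echo "regular build steps run on every trigger"
# The report task only runs when the pipeline was started by the schedule.
- script: ./generate-report.sh # hypothetical report script
  displayName: 'Generate monthly report'
  condition: eq(variables['Build.Reason'], 'Schedule')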
For more detail you can go to:
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/scheduled-triggers?view=azure-devops&tabs=yaml#scheduled-triggers

Put the task in a second pipeline, and run that on a monthly schedule.
If you want to avoid YAML code duplication, you could define a template containing that one task and have that template called from both pipelines, as sketched below.
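A minimal sketch of that layout, assuming a hypothetical template file report-task.yml and report script:

# report-task.yml (the shared template containing the one task)
steps:
- script: ./generate-report.sh # hypothetical report script
  displayName: 'Generate report'

# monthly-report-pipeline.yml (the second pipeline)
trigger: none

schedules:
- cron: '0 7 1 * *'
  displayName: 'Monthly report'
  branches:
    include:
    - main
  always: true

steps:
- template: report-task.yml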


Azure Pipeline YAML dependsOn already defined

I have a pipeline that will clean, convert a VM to an image, and delete all of the resources. I would like to use that same pipeline to delete VMs without capturing an image. I can make all of the stages conditional, but then I end up with no unconditional stages. I keep seeing that the accepted solution to this problem is to make the dependsOn conditional, as shown below. The deleteVM parameter is a boolean.
parameters:
- name: deleteVM
  displayName: "just delete it"
  type: boolean
  default: false

stages:
- stage: aStage
  ${{ if eq(parameters.deleteVM, 'false') }}:
    dependsOn: anActualStage
  ${{ if eq(parameters.deleteVM, 'true') }}:
    dependsOn: []
The issue is that when I do this and try to run the pipeline with the second condition being true, it errors when parsing the pipeline YAML: 'dependsOn' is already defined.
P.S. I've looked at the following, and they imply that the way I'm doing it is the "right" way:
Azure pipeline - Stage condition dependson
Azure Pipeline - Stage with Multiple depends on with if condition
Instead of 2 if statements, use if/else. Or simply leave out the else altogether, since you're setting dependsOn to an empty list.
stages:
- stage: aStage
  ${{ if eq(parameters.deleteVM, 'false') }}:
    dependsOn: anActualStage
  ${{ else }}:
    dependsOn: []

Or, leaving out the else:

stages:
- stage: aStage
  ${{ if eq(parameters.deleteVM, 'false') }}:
    dependsOn: anActualStage
@4c74356b41's answer about using if/else is probably your best bet.
The way the YAML templating works is that the dependsOn in your example is only added to the "compiled" YAML file if the if expression resolves to true. Thus, there must be something wrong with your parameters, as it looks like both of your logical expressions resolve to true. I cannot really see why, as it looks impossible from your example, but that is the only way I can see you getting the error you are seeing.
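To illustrate that mechanism, here is a sketch of what the compiled stage reduces to for each parameter value, assuming the if/else version above:

# With deleteVM = false, only the first branch survives compilation:
stages:
- stage: aStage
  dependsOn: anActualStage

# With deleteVM = true, only the else branch survives:
stages:
- stage: aStage
  dependsOn: []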

How to trigger a job manually in GitLab CI/CD

I want to trigger a single job manually in GitLab CI/CD. What steps do I need to follow?
I have declared an environment variable as well, but when I trigger a pipeline manually and select the branch, it triggers all the other jobs too.
A GitLab pipeline is a stream of jobs; by default you cannot execute a single job in isolation.
To work around that, you can use the rules keyword: https://docs.gitlab.com/ee/ci/jobs/job_control.html#specify-when-jobs-run-with-rules
Example:
job1:
  ...
  rules:
    - if: '$execute_job_1 == "true"'
      when: always
    - when: never

job2:
  ...
  rules:
    - if: '$execute_job_2 == "true"'
      when: always
    - when: never
So if you want to execute only job1, you pass the appropriate variable when you run the pipeline.
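As a sketch of how that might look, you could give the variables explicit defaults in .gitlab-ci.yml so every job is skipped unless you override one of them to "true" in the Run pipeline form; the variable names are the ones from the example above:

variables:
  execute_job_1: "false" # set to "true" when running the pipeline to execute only job1
  execute_job_2: "false" # set to "true" to execute only job2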

Run Azure DevOps CI Pipeline with Scheduler and Variable Group

We have one pipeline and one only (we cannot and do not want to create a second pipeline or do this in a separate pipeline; it has to be done in the same pipeline). That pipeline has a task to either stop or start a function while accepting a required variable group from the library (we specify those store A-E variables in the YAML, but they also exist in the library) and specifying the Azure subscription. Currently, we run this pipeline manually.
What I'm looking for is a way to automate this pipeline to run at 7 PM CST with the start function as the task, accepting a variable group and specifying which Azure subscription I want. Then, at 6 AM CST the next day, I need that SAME pipeline to run a build with the stop function as the task, again accepting a variable group and specifying which Azure subscription I want.
I found a scheduler feature in the CI pipeline, but it does not allow me to specify which variable group I want from the library, offers no option to select either starting or stopping the function, and no option to select the subscription.
If it helps, this is the YAML code that I have (some details have been removed for privacy):
trigger:
- none

pool:
  vmImage: 'windows-latest'

parameters:
- name: variableGroup
  displayName: Variable Group
  type: string
  values:
  - 'variable for store A'
  - 'variable for store B'
  - 'variable for store C'
  - 'variable for store D'
  - 'variable for store E'
- name: artifactVersion
  displayName: ArtifactVersion (* (latest) or 1.{sprintNumber}.{ReleaseNo})
  type: string
  default: '*'
- name: Function
  displayName: Function
  type: string
  default: 'deploy'
  values:
  - deploy
- name: task
  displayName: ExecuteTask
  type: string
  default: ''
  values:
  - start the function
  - stop the function
- name: Subscription
  displayName: Subscription
  type: string
  values:
  - 'sandbox'
  - 'production'
I am afraid there is no method that can meet your requirements for the time being.
Refer to this doc: Scheduled triggers
schedules:
- cron: string # cron syntax defining a schedule
  displayName: string # friendly name given to a specific schedule
  branches:
    include: [ string ] # which branches the schedule applies to
    exclude: [ string ] # which branches to exclude from the schedule
  always: boolean # whether to always run the pipeline or only if there have been source code changes since the last successful scheduled run; the default is false
The schedule trigger does not support setting values for runtime parameters.
On the other hand, the option or drop-down list for selecting a runtime parameter value is only displayed when the pipeline is run manually.
I can fully understand your requirements.
You could add your request for this feature on our UserVoice site (https://developercommunity.visualstudio.com/content/idea/post.html?space=21), which is our main forum for product suggestions.

Azure DevOps YAML dependency

I have a problem with a dependency in YAML. When I cancel the previous stage, I want my next stage not to execute. The problem is with the or() and and() parts. When I add succeeded() to the and() part it works fine, but it also has to apply to the second or() condition. Unfortunately, when I add succeeded() to the or() part it does not work as expected. The code below is not working; when I cancel the previous stage, this one still executes:
dependsOn: 'CI'
condition: or(succeeded(), ne(variables['Build.Reason'], 'PullRequest'), eq('${{ parameters.devEnvironment }}', 'dev'), and(eq('${{ parameters.devEnvironment }}', 'dev'), eq(variables['Build.SourceBranch'], 'refs/heads/master'),succeeded()))
When I cancel the previous stage I want my next stage not to execute.
What you need to do is use an and() condition to connect your original condition with succeeded():
condition: and({original condition}, succeeded())
Here is the example:
condition: and(or(ne(variables['Build.Reason'], 'PullRequest'), eq('${{ parameters.devEnvironment }}', 'dev'), and(eq('${{ parameters.devEnvironment }}', 'dev'), eq(variables['Build.SourceBranch'], 'refs/heads/master'))), succeeded())

Parallelisation of Azure Pipelines

I have a pipeline (in YAML) which upgrades an infrastructure (I have 2 stages, each containing a series of jobs).
I now want to upgrade multiple infrastructures simultaneously, i.e. pass a list of identifiers representing deployments to the pipeline and let it upgrade each one.
What is the best practice here for organising the pipeline? It feels like I need to generate a set of parallel jobs using a loop.
As I understand it, any job failure will result in a total failure, which could leave us in a very confused state.
If you have purchased parallel jobs for your organization, you can use a template to generate multiple jobs from the identities parameter using the expression ${{ each id in parameters.identities.ids }}.
So you can move the job which upgrades the infrastructure into a template and define your YAML pipeline as in the example below.
Template file: upgrade-infrastructure.yml

parameters:
  id: 1

jobs:
- job: upgradeinfra${{ parameters.id }}
  steps:
  - powershell: echo "upgrade-infra-${{ parameters.id }}"
azure-pipelines.yml:

# Define identities as an object holding an array of ids.
parameters:
- name: identities
  type: object
  default:
    ids:
    - 1
    - 2

trigger: none

stages:
- stage: Upgrade
  pool:
    vmImage: windows-latest
  jobs:
  - job: A
    steps:
    - powershell: echo "job A"
  # Loop through the ids array and pass each id to the template parameter
  # to generate a job per id. Indentation is very important; bad
  # indentation may cause a pipeline compile error.
  - ${{ each id in parameters.identities.ids }}:
    - template: upgrade-infrastructure.yml
      parameters:
        id: ${{ id }}
After you set up your YAML pipeline as above, you can enter the identities parameter when executing the pipeline.
You will then see that multiple jobs are generated and run in parallel.
To make your deployments run in parallel, all you need to do is set the dependencies explicitly (a dependency on the previous stage is set automatically otherwise). Here is an example of a stage that depends only on the build stage; every stage declared this way will run in parallel:
stages:
- stage: DeployTo${{ parameters.environment }}
  dependsOn: ["Build"] # The stage that builds the code is called "Build"
Without the dependsOn property, your pipeline stages will run sequentially:
stages:
- stage: DeployTo${{ parameters.environment }}
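To make the contrast concrete, here is a sketch with two hypothetical deployment stages that both depend only on Build; they start side by side as soon as Build succeeds, whereas removing the dependsOn lines would make them run one after the other:

stages:
- stage: Build
  jobs:
  - job: BuildJob
    steps:
    - script: echo "build"
# Both stages below declare only Build as a dependency,
# so they run in parallel once Build completes.
- stage: DeployToDev
  dependsOn: ["Build"]
  jobs:
  - job: DeployDev
    steps:
    - script: echo "deploy to dev"
- stage: DeployToTest
  dependsOn: ["Build"]
  jobs:
  - job: DeployTest
    steps:
    - script: echo "deploy to test"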
