I have 2 stages with multiple jobs, and the jobs in the first stage have rules that tell them whether they need to run. What I am trying to do is tell some of the jobs in the second stage to execute only if the relevant job in the first stage ran.
I don't want to reuse the same rules from the first-stage jobs, to prevent conflicts.
Is there a way to do that?
stages:
  - build
  - deploy

Build0:
  stage: build
  extends:
    - .Build0Rules
    - .Build0Make

Build1:
  stage: build
  extends:
    - .Build1Rules
    - .Build1Make

Deploy0:
  stage: deploy
  dependencies:
    - Build0
  script:
    - bash gitlab-ci/deploy0.sh

Deploy1:
  stage: deploy
  dependencies:
    - Build1
  script:
    - bash gitlab-ci/deploy1.sh
Thank you in advance :)
No, you cannot specify that a job should be added to the pipeline only if another job was added. Each job decides whether it is added to the pipeline using only/except conditions or rules, but these cannot reference other jobs.
It is possible to generate a pipeline YAML file and then trigger it, but I think this would not be ideal because of the amount of work involved.
stages:
  - Build
  - Deploy

build:
  stage: Build
  script:
    - do something...
  artifacts:
    paths:
      - deploy-pipeline-gitlab-ci.yml

deploy:
  stage: Deploy
  trigger:
    include:
      - artifact: deploy-pipeline-gitlab-ci.yml
        job: build
    strategy: depend
I would recommend using similar only/except conditions or rules on each job to build the pipeline that you want.
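For example, a minimal sketch of that approach (the changes: pattern inside .Build0Rules is an assumed illustration): extending the same hidden rules template from both the build job and its deploy job keeps them in sync, so Deploy0 is only added to the pipeline when Build0 is.

.Build0Rules:
  rules:
    - changes:
        - component0/**/*   # hypothetical condition shared by both jobs

Build0:
  stage: build
  extends:
    - .Build0Rules
    - .Build0Make

Deploy0:
  stage: deploy
  extends:
    - .Build0Rules   # same rules, so Deploy0 exists exactly when Build0 does
  dependencies:
    - Build0
  script:
    - bash gitlab-ci/deploy0.sh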
Yes you can. You should check the keyword needs, which allows you to do exactly what you want: execute a job based on the execution of other jobs, ignoring stage order.
The documentation: https://docs.gitlab.com/ee/ci/yaml/#needs
Here is also an example of how to build a DAG (directed acyclic graph) using needs: https://about.gitlab.com/blog/2020/12/10/basics-of-gitlab-ci-updated/#directed-acyclic-graphs-get-faster-and-more-flexible-pipelines
In your case:
Deploy0:
  stage: deploy
  needs: ["Build0"]
  script:
    - bash gitlab-ci/deploy0.sh

Deploy1:
  stage: deploy
  needs: ["Build1"]
  script:
    - bash gitlab-ci/deploy1.sh
Note you can also specify multiple jobs in the needs keyword:
needs: ["build0", "test0", "test1"]
I need some help with running a selected environment from an Azure DevOps pipeline.
I use a codeway.yml file consisting of 3 environments: dev/qa/pro.
When I run the pipeline, all the environments run one after the other.
My requirement is to run only a selected environment (the other environments should not run), with a choice parameter like in Jenkins.
I also need a "proceed to qa or pro" condition input.
Could someone help with sample code for that?
Thanks in advance.
You can define variables in the pipeline and then use them to execute only the stages you want. You first need a starting point; here I use an Info stage:
variables:
  isDevelop: true
  isQA: false
  isProd: false

stages:
- stage: Info
  displayName: 'Information for deploy'
  jobs:
  - whatever # job definition with its steps

- stage: Dev
  displayName: 'Deploy to Dev'
  dependsOn: Info
  condition: and(succeeded(), eq(variables['isDevelop'], true))
  jobs:
  - whatever

- stage: QA
  displayName: 'Deploy to QA'
  dependsOn: Info
  condition: and(succeeded(), eq(variables['isQA'], true))
  jobs:
  - whatever

- stage: Prod
  displayName: 'Deploy to Prod'
  dependsOn: Info
  condition: and(succeeded(), eq(variables['isProd'], true))
  jobs:
  - whatever
The variables can be calculated, or you can take the values from the pipeline's variable definitions in Azure. In the code below they are calculated from the source branch. Note this is an example; you need to adapt it to your project:
variables:
  isDevelop: $[eq(variables['Build.SourceBranch'], 'refs/heads/develop')]
  isQA: $[eq(variables['Build.SourceBranch'], 'refs/heads/qa')]
  isProd: $[eq(variables['Build.SourceBranch'], 'refs/heads/prod')]
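Another option for the Jenkins-style choice is a runtime parameter, which Azure DevOps renders as a dropdown when you queue the run (a sketch; the parameter name targetEnv and the stage body are assumptions):

parameters:
- name: targetEnv
  displayName: 'Environment to deploy'
  type: string
  default: dev
  values:
  - dev
  - qa
  - prod

stages:
- stage: Dev
  condition: eq('${{ parameters.targetEnv }}', 'dev')   # parameter is expanded at compile time
  jobs:
  - job: DeployDev
    steps:
    - script: echo "deploying to dev"

For the "proceed to qa or pro" input, approvals configured on the qa/prod environments used by deployment jobs provide the manual gate.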
I have a series of pipeline stages, with only one job per stage.
What is the best practice for having only one job per stage?
Below I have my example yml setup:
trigger:
- main

resources:
- repo: self

stages:
# Test
##########################
- stage: Run_Tests
  displayName: Run Tests
  jobs:
  - job: Run_Tests
    displayName: Run Tests
    pool:
      vmImage: 'ubuntu-18.04'
    steps:
    # Testing Steps ...

# Build
##########################
- stage: Build
  displayName: Build
  jobs:
  - job: Build
    displayName: Build
    pool:
      vmImage: 'ubuntu-18.04'
    steps:
    # Build Steps ...

# Deploy
##########################
- stage: Deploy
  displayName: Deploy
  jobs:
  - deployment: VMDeploy
    displayName: Deploy
    # Deploy Steps ...
I have the following pattern multiple times throughout the file:

jobs:
- job: ...

It seems so unnecessary and cluttered to me. Am I just being pedantic, or is there a better way to do this?
This can make sense when you use a deployment job, because environment restrictions applied at the job level are evaluated at the stage level. So if you combine the Build, Test, and Deploy stages and have an approval configured on the environment used in the Deploy job, you will be asked for approval before the first job starts.
For me, Build and Test could go together, and in fact they should be part of the same job. Why? Because: is the change valid when the build succeeds but the tests fail? You can also take advantage of the fact that the job has already downloaded the dependencies for your whole project. A separate test job makes sense to me when we are talking about integration tests.
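A sketch of that consolidation (script names and the environment name are assumptions): Build and Test share one job and one workspace, while Deploy remains its own stage so the environment approval gates it:

stages:
- stage: Build_And_Test
  jobs:
  - job: Build_And_Test
    pool:
      vmImage: 'ubuntu-18.04'
    steps:
    - script: ./build.sh       # build; dependencies are downloaded here
    - script: ./run-tests.sh   # unit tests reuse the same workspace and dependencies

- stage: Deploy
  dependsOn: Build_And_Test
  jobs:
  - deployment: VMDeploy
    environment: 'production'  # approval on this environment is requested before this stage's jobs start
    strategy:
      runOnce:
        deploy:
          steps:
          - script: ./deploy.sh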
I have a YAML file that resembles the following:
stages:
- stage: A
  pool:
    vmImage: 'windows-2019'
  jobs:
  - job: a
    steps:
    - task: PowerShell@2
      inputs:
        targetType: 'inline'
        script: |
          # edits file "$(System.DefaultWorkingDirectory)/myfolder/myfile.json"

- stage: B
  dependsOn: A
  pool:
    vmImage: 'windows-2019'
  jobs:
  - job: b
    steps:
    - task: PowerShell@2
      inputs:
        targetType: 'inline'
        script: |
          # uses file "$(System.DefaultWorkingDirectory)/myfolder/myfile.json"
I have split my pipeline into two stages: A edits a file in a repository, and B works with the edited file.
My problem is, the files seem to get reset between stages. Is there any way of keeping the changes throughout the stages, rather than resetting them?
I don't want to publish artifacts, because in stage B (although not shown in the YAML above) I run multiple PowerShell script files that contain hardcoded file paths, and it would be a mess rewriting those paths to point at the artifacts directory before running the stage.
Based on my test, the cause of this issue is that the two stages run on different agent machines.
For example: Stage A -> Agent machine name: 'fv-az146' , Stage B -> Agent machine name: 'fv-az151'
You could check the agent information in Build log -> Initialize job.
Is there any way of keeping the changes throughout the stages, rather than resetting them?
Since you don't want to publish artifacts, you could try using self-hosted agents to run the two stages.
You need to add demands so that both stages run on the same self-hosted agent.
According to this doc:
The demands keyword is supported by private pools.
We can't specify particular agent capabilities on Microsoft-hosted agents, so we can't ensure that two stages run on the same agent.
Update:
Since the two stages run on the same agent, the checkout step in Stage B would overwrite the files edited in Stage A.
So you also need to add - checkout: none in Stage B.
Here is the updated YAML template:
stages:
- stage: A
  pool:
    name: Pool name
    demands:
    - Agent.Name -equals agentname1
  jobs:
  - job: a
    steps:
    - task: PowerShell@2
      ...

- stage: B
  dependsOn: A
  pool:
    name: Pool name
    demands:
    - Agent.Name -equals agentname1
  jobs:
  - job: b
    steps:
    - checkout: none
    - task: PowerShell@2
      ...
The overall workflow: Stage A edits the files and saves them to $(System.DefaultWorkingDirectory).
Stage B can then use the files in $(System.DefaultWorkingDirectory) directly.
The files in $(System.DefaultWorkingDirectory) keep the changes made in Stage A through Stage B.
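For illustration, the elided PowerShell steps could look like this (the JSON property and value are hypothetical):

- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      # Stage A: edit the file in place; it stays on the agent for Stage B
      $f = "$(System.DefaultWorkingDirectory)/myfolder/myfile.json"
      $json = Get-Content $f -Raw | ConvertFrom-Json
      $json.version = '1.2.3'                  # hypothetical edit
      $json | ConvertTo-Json -Depth 10 | Set-Content $f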
I am trying to figure out how to share custom variables across stages in my Azure DevOps pipeline. Below is my script with 2 stages.
I am setting curProjVersion as an output variable and trying to access it from a different stage. Am I doing it right?
stages:
- stage: Build
  displayName: Build stage
  jobs:
  - job: VersionCheck
    pool:
      vmImage: 'ubuntu-latest'
    displayName: Version Check
    continueOnError: false
    steps:
    - script: |
        echo "##vso[task.setvariable variable=curProjVersion;isOutput=true]1.4.5"
      name: setCurProjVersion
      displayName: "Collect Application Version ID"

- stage: Deploy
  displayName: Deploy stage
  dependsOn: Build
  variables:
    curProjVersion1: $[ dependencies.Build.VersionCheck.outputs['setCurProjVersion.curProjVersion'] ]
  jobs:
  - job:
    steps:
    - script: |
        echo $(curProjVersion1)
Updated:
The share-variables-across-stages feature was released in Sprint 168.
Please use the following format to access output variables from a previous stage:
stageDependencies.{stageName}.{jobName}.outputs['{stepName}.{variableName}']
Original:
Share variables across stages in Azure DevOps Pipelines
I'm afraid sharing a variable defined in one stage and passing it into another stage is not supported.
This feature is planned, but it is not supported yet. You can follow this GitHub issue; many people have the same demand, and you can track progress there.
For now, we only support setting a multi-job output variable, and this only supports YAML. For the Classic Editor, there is no plan to add this feature to releases.
As a workaround, you can predefine the variables before the stages. One important thing: if you change a variable's value in one stage, the new value is not passed to the next stage. The new value only lives within that stage.
It is important to mention that stageDependencies is not available in conditions at the stage level. It is available in jobs, but not on a stage directly (at least at the moment).
stages:
- stage: A
  jobs:
  - job: JA
    steps:
    - script: |
        echo "This is job Foo."
        echo "##vso[task.setvariable variable=doThing;isOutput=true]Yes" # the variable doThing is set to Yes
      name: DetermineResult
    - script: echo $(DetermineResult.doThing)
      name: echovar
  - job: JA_2
    dependsOn: JA
    condition: eq(dependencies.JA.outputs['DetermineResult.doThing'], 'Yes')
    steps:
    - script: |
        echo "This is job Bar."

# stage B runs if the DetermineResult task set the doThing variable in stage A
- stage: B
  dependsOn: A
  jobs:
  - job: JB
    condition: eq(stageDependencies.A.JA.outputs['DetermineResult.doThing'], 'Yes') # map doThing and check if it is Yes
    variables:
      varFromStageA: $[ stageDependencies.A.JA.outputs['DetermineResult.doThing'] ]
    steps:
    - bash: echo "Hello world stage B first job"
    - script: echo $(varFromStageA)
Jobs can now access variables from previous stages
Output variables are still produced by steps inside of jobs. Instead of referring to dependencies.jobName.outputs['stepName.variableName'], stages refer to stageDependencies.stageName.jobName.outputs['stepName.variableName'].
https://learn.microsoft.com/en-us/azure/devops/release-notes/2020/sprint-168-update#azure-pipelines-1
This is available as of May 4th 2020
Jobs can access output variables from previous stages:
Output variables may now be used across stages in a YAML-based pipeline. This helps you pass useful information, such as a go/no-go decision or the ID of a generated output, from one stage to the next. The result (status) of a previous stage and its jobs is also available.
Output variables are still produced by steps inside of jobs. Instead of referring to dependencies.jobName.outputs['stepName.variableName'], stages refer to stageDependencies.stageName.jobName.outputs['stepName.variableName'].
Note
By default, each stage in a pipeline depends on the one just before it in the YAML file. Therefore, each stage can use output variables from the prior stage. You can alter the dependency graph, which will also alter which output variables are available. For instance, if stage 3 needs a variable from stage 1, it will need to declare an explicit dependency on stage 1.
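A sketch of that explicit dependency (stage, job, and step names are placeholders):

- stage: Three
  dependsOn:
  - One   # explicit dependency, so stage Three can read stage One's outputs
  - Two
  jobs:
  - job: UseIt
    variables:
      fromOne: $[ stageDependencies.One.JobA.outputs['StepX.someVar'] ]
    steps:
    - script: echo $(fromOne)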
For stage conditions on Azure DevOps Version Dev17.M153.5 with Agent Version 2.153.1 the following works:
stages:
- stage: A
  jobs:
  - job: JA
    steps:
    - script: |
        echo "This is job Foo."
        echo "##vso[task.setvariable variable=doThing;isOutput=true]Yes" # the variable doThing is set to 'Yes'
      name: DetermineResult

# stage B runs if the DetermineResult task set the doThing variable in stage A
- stage: B
  dependsOn: A
  condition: eq(dependencies.A.outputs['JA.DetermineResult.doThing'], 'Yes')
  jobs:
  - job: JB
    steps:
    - bash: echo "Hello world stage B first job"
Note: the layout of properties is different at the stage level compared to the job level:
dependencies.{stage name}.outputs['{job name}.{script name}.{variable name}']
Note: The expression with 'stageDependencies' failed with the following error message:
An error occurred while loading the YAML build pipeline. Unrecognized value: 'stageDependencies'. Located at position XX within expression: and(always(), eq(stageDependencies.A.outputs['JA.DetermineResult.doThing'], 'Yes')). For more help, refer to https://go.microsoft.com/fwlink/?linkid=842996
Bonus:
See the following documentation on how to access the status of the dependent stages: link
Corresponding documentation:
Use output variables from tasks
Stage to stage dependencies
To use the output from a different stage, you must use the syntax depending on whether you're at the stage or job level:
At the stage level, the format for referencing variables from a different stage is dependencies.STAGE.outputs['JOB.TASK.VARIABLE']. You can use these variables in conditions.
At the job level, the format for referencing variables from a different stage is stageDependencies.STAGE.JOB.outputs['TASK.VARIABLE']
Note: By default, each stage in a pipeline depends on the one just before it in the YAML file. If you need to refer to a stage that isn't immediately prior to the current one, you can override this automatic default by adding a dependsOn section to the stage.
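A compact sketch showing both forms side by side (all names are placeholders):

stages:
- stage: Producer
  jobs:
  - job: MakeJob
    steps:
    - script: echo "##vso[task.setvariable variable=flag;isOutput=true]yes"
      name: SetStep

- stage: Consumer
  dependsOn: Producer
  # stage level (usable in conditions): dependencies.STAGE.outputs['JOB.TASK.VARIABLE']
  condition: eq(dependencies.Producer.outputs['MakeJob.SetStep.flag'], 'yes')
  jobs:
  - job: UseIt
    variables:
      # job level: stageDependencies.STAGE.JOB.outputs['TASK.VARIABLE']
      flagVal: $[ stageDependencies.Producer.MakeJob.outputs['SetStep.flag'] ]
    steps:
    - script: echo $(flagVal)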
Yes, this is possible; you need stageDependencies, like below:
stages:
- stage: Build
  displayName: Build stage
  jobs:
  - job: VersionCheck
    pool:
      vmImage: 'ubuntu-latest'
    displayName: Version Check
    continueOnError: false
    steps:
    - script: |
        echo "##vso[task.setvariable variable=curProjVersion;isOutput=true]1.4.5"
      name: setCurProjVersion
      displayName: "Collect Application Version ID"

- stage: Deploy
  displayName: Deploy stage
  dependsOn: Build
  variables:
    curProjVersion1: $[ stageDependencies.Build.VersionCheck.outputs['setCurProjVersion.curProjVersion'] ]
  jobs:
  - job:
    steps:
    - script: |
        echo $(curProjVersion1)
Note that I have changed
$[ dependencies.Build.VersionCheck.outputs['setCurProjVersion.curProjVersion'] ]
to
$[ stageDependencies.Build.VersionCheck.outputs['setCurProjVersion.curProjVersion'] ]
source: https://jimferrari.com/2023/01/05/pass-variables-across-jobs-and-stages-in-azure-devops-pipelines/
As an update for anyone seeing this question: passing variables between stages has now been implemented and should be released in the next couple of weeks.
You can define a global variable and use PowerShell to assign the value of a stage variable to it:
Write-Output ("##vso[task.setvariable variable=globalVar;]$stageVar")
Global variables can be defined either in the YAML itself or in variable groups.
Initialise the variable with an empty value, e.g. in YAML:
variables:
  globalVar: ''
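If you prefer a variable group, the reference looks like this (the group name is hypothetical):

variables:
- group: my-variable-group   # hypothetical group defined under Pipelines > Library
- name: globalVar
  value: ''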
You can define the variables just after you define the trigger and before you define the stages:
trigger:
- master

variables:
  VarA: aaaaa
  VarB: bbbbb

stages:
- stage: Build
  jobs:
  - job: Build
    pool:
      vmImage: 'vs2017-win2016'
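A sketch of consuming such a pipeline-level variable in a later stage (the Deploy stage here is illustrative):

- stage: Deploy
  dependsOn: Build
  jobs:
  - job: Deploy
    steps:
    - script: echo $(VarA)   # resolves to 'aaaaa' here too, since VarA is defined at the pipeline root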