How to conditionally fail an Azure DevOps Pipeline stage?

Say I have a pipeline where I invoke a Lambda and it responds 'true'. In this case, the pipeline should proceed. In the other case where the lambda responds 'false', I want to be able to look at that output variable and if it is false, manually fail the deployment stage (so that I can redeploy an old version of the code).
Seems like a simple enough question, but I can't find any info on this simple task.

Add a script task as below - exit with a non-zero code if the Lambda output is false:
- script: |
    lambdaOutput="[result from call]"
    if [ "$lambdaOutput" = "true" ]; then
      exit 0
    else
      exit 1
    fi
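If the Lambda is invoked from the same step, a fuller sketch might look like the following, assuming the AWS CLI is available on the agent and using MyLambda as a placeholder function name:
- script: |
    # Invoke the Lambda and capture its response payload (MyLambda is a placeholder)
    aws lambda invoke --function-name MyLambda response.json
    lambdaOutput=$(cat response.json)
    echo "Lambda returned: $lambdaOutput"
    # A JSON string payload comes back quoted, hence stripping the quotes;
    # a non-zero exit code fails this step, and with it the stage
    if [ "$(echo "$lambdaOutput" | tr -d '"')" = "true" ]; then
      exit 0
    else
      exit 1
    fi
  displayName: Fail stage when Lambda returns false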

Related

Waiting custom time between stages

I don't want the "verify" stage to start as soon as the "test" stage is completed; I want the "verify" stage to start running X minutes after the "test" stage completes.
stages:
  - configure-site
  - test
  - verify
You could add a new stage in between the "test" and "verify" stage. That stage could simply execute a "sleep" command with the given time in seconds.
Otherwise, you could add the sleep-command to the end of your "test"-job.
Please be aware that a job cannot run longer than 60 minutes by default (docs).
stages:
  - configure-site
  - test
  - wait
  - verify

configure-site:
  stage: configure-site
  ...

test:
  stage: test
  needs:
    - "configure-site"
  ...

wait:
  stage: wait
  needs:
    - "test"
  script:
    - sleep 600

verify:
  stage: verify
  needs:
    - "wait"
  ...
However, as the comments on your question already stated, I would rather amend your jobs to alert the CI when the publishing is finished, instead of just waiting a fixed time. The "test" job should wait until everything is really finished, instead of starting some process and reporting success without knowing whether the process actually succeeded.
Otherwise, maybe you could use external status checks to ensure your publishing job is executed after the tests. Your external job could then call the API when the process is finished (GitLab collective article).
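As a sketch of that last idea, the external publishing process could report completion back through GitLab's commit status API (the host, token, project ID, SHA, and status name below are all placeholders):
# Run by the external publishing process, not by the CI job itself
curl --request POST \
     --header "PRIVATE-TOKEN: <your_access_token>" \
     "https://gitlab.example.com/api/v4/projects/<project_id>/statuses/<commit_sha>?state=success&name=publishing"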

YAML multi-stage run using dependsOn and conditions

I need to run the "Secure" stage if one of the previous INT stages was successfully passed. I tried with dependsOn and conditions but I can't find the solution.
I am afraid there is no out-of-the-box YAML syntax to achieve this at the moment.
Since we need to set multiple dependencies for the stage Secure:
- stage: Secure
  dependsOn:
    - INT_API
    - INT_FuncIntergration
    - INT_Web
  condition: or(succeeded('INT_API'), succeeded('INT_FuncIntergration'), succeeded('INT_Web'))
Restriction:
With this method, the current stage still waits for all the listed stages to finish before its condition is evaluated; it then runs if at least one of them succeeded. If you need the current stage to start as soon as one of the previous stages succeeds, without waiting for the rest, this method is not enough.
That is because there is no "OR" syntax for dependsOn, and we cannot make dependsOn itself conditional, like:
- stage: Secure
  ${{ if eq(result.INT_API, succeeded) }}:
    dependsOn:
      - INT_API
      - INT_FuncIntergration
      - INT_Web
  condition: or(succeeded('INT_API'), succeeded('INT_FuncIntergration'), succeeded('INT_Web'))
That is because template expressions are parsed when the YAML is compiled, and at that point the run-time results of the previous stages are not yet known.
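To make that compile-time vs. run-time distinction concrete, compare the two expression styles (the stage and branch names here are just placeholders):
stages:
  # ${{ }} template expressions are resolved when the YAML is compiled,
  # before any stage has run - they cannot see stage results:
  - ${{ if eq(variables['Build.SourceBranchName'], 'main') }}:
    - stage: OnlyOnMain
  # condition: expressions are evaluated at run time,
  # when the results of depended-on stages are known:
  - stage: Secure
    dependsOn:
      - INT_API
    condition: succeeded('INT_API')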
You could submit this request (an "OR" option for dependsOn) to the Developer Community site (https://developercommunity.visualstudio.com/content/idea/post.html?space=21), which is our main forum for product suggestions. Thank you for helping us build a better Azure DevOps.
Workaround:
The main idea of the workaround: set dependsOn for the stage Secure to [], then add an inline PowerShell task before the other tasks. This task calls the REST API (Definitions - Get) to check whether any stage in the current release pipeline is still in the in-progress or queued state. If so, it waits 30 seconds and loops again, until no other stage in the current release pipeline is in progress or queued. Only then do the other tasks execute.
You could check my previous ticket for detailed info.
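A rough sketch of that polling step, assuming a classic release pipeline, an inline PowerShell task, and access to the job's OAuth token (the URL shape and api-version are assumptions to adjust for your organization):
# Sketch only: loop until no other stage in this release is in progress or queued
$url = "$(System.TeamFoundationCollectionUri)$(System.TeamProject)/_apis/release/releases/$(Release.ReleaseId)?api-version=6.0"
do {
    Start-Sleep -Seconds 30
    $release = Invoke-RestMethod -Uri $url -Headers @{ Authorization = "Bearer $(System.AccessToken)" }
    $busy = $release.environments | Where-Object { $_.status -in @('inProgress', 'queued') }
} while ($busy)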
In similar cases, I use the actual result status variables of all its depended-on stages. Here you can build quite complicated logic if necessary, because there are more options than the default functions offer.
So if you want the Secure stage to run if any (one or more) of its predecessors ran successfully, you could do something like:
- stage: Secure
  dependsOn:
    - stage_A
    - stage_B
  condition: |
    and(
      in(dependencies.stage_A.result, 'Succeeded', 'SucceededWithIssues', 'Skipped'),
      in(dependencies.stage_B.result, 'Succeeded', 'SucceededWithIssues', 'Skipped'),
      not(
        and(
          eq(dependencies.stage_A.result, 'Skipped'),
          eq(dependencies.stage_B.result, 'Skipped')
        )
      )
    )
In the above example, the Secure stage would run if all the depending stages either ran successfully or did not run at all, except if none of them ran.
Hope this helps!

Azure DevOps conditional can't evaluate PowerShell set-variable

I have a PowerShell script that is not in a job, task, or stage; it's on its own, running in my DevOps build YAML like this:
- powershell: |
    if (//something that's irrelevant) {
      Write-Host "##vso[task.setvariable variable=myCustomVar;isOutput=true]true"
    } else {
      Write-Host "##vso[task.setvariable variable=myCustomVar;isOutput=true]false"
    }
After this PowerShell script, I have another script that echoes myCustomVar to see what the value is, like this:
- script: |
    echo "What is my custom variable?"
    echo $(myCustomVar)
When the build runs, the DevOps logs echo the literal string "$(myCustomVar)", not True/False.
After that, I have a task which sends an email, but we only want to send an email if myCustomVar is true. So I use a conditional.
- task: SendEmail@1
  displayName: "Send email"
  condition: and(succeeded(), eq(variables.myCustomVar, true))
However, this is breaking. I've tried a few other ways of doing it; myCustomVar in the task condition always returns NULL. Any help with the syntax?
You can use ##vso[task.setvariable]value to set variables from a script in the pipeline. The first task can set a variable, and the following tasks are able to use it; the variable is exposed to the following tasks as an environment variable.
Check following useful links:
https://learn.microsoft.com/en-us/azure/devops/pipelines/scripts/logging-commands?view=azure-devops&tabs=bash#setvariable-initialize-or-modify-the-value-of-a-variable
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/variables?view=azure-devops&tabs=yaml%2Cbatch#set-variables-in-scripts
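One detail that matters for this exact question: when a variable is set with isOutput=true, later steps have to reference it through the step's name, otherwise $(myCustomVar) stays unresolved. A sketch, using a hypothetical step name setVarStep:
- powershell: |
    Write-Host "##vso[task.setvariable variable=myCustomVar;isOutput=true]true"
  name: setVarStep

- script: |
    echo "What is my custom variable?"
    echo $(setVarStep.myCustomVar)

- task: SendEmail@1
  displayName: "Send email"
  condition: and(succeeded(), eq(variables['setVarStep.myCustomVar'], 'true'))
Alternatively, dropping isOutput=true lets the plain $(myCustomVar) and variables.myCustomVar references work within the same job.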

Run a task B when all the previous tasks have succeeded except a few tasks (run a task if a list of tasks has succeeded)

I want to run a task B when all the previous tasks have succeeded except one. To be clear, task A comes before task B. The output of task A should not matter to task B and it should run if all the tasks other than A have passed. How to set a custom condition for that?
Or in other words, Run a task if a list of task is succeeded?
Let's guess at the structure of your pipeline:
job:
  - Task A
  - Task B
  - Task C
  - Task NoRelated
  - Task D
Now, what you are looking for is that Task D runs only after Task A, Task B, and Task C have all succeeded, while the result of Task NoRelated does not matter to Task D, right?
As of now there is no direct expression that achieves this, so we have to use a workaround: add one additional task and configure the condition settings of the tasks.
To meet your requirement, you can try the suggestion below.
Step 1:
First, ensure the condition setting of Task A, Task B, and Task C is all "Only when all previous tasks have succeeded".
At this point, Task C will run only if Task A and Task B both succeeded; if either of them fails, Task C ends up in the canceled state.
So we only need the status of Task C to confirm whether Task A, Task B, and Task C all succeeded.
Step 2:
Add one PowerShell task into the current agent job, and make sure it executes after Task A, Task B, and Task C. Then use a script to check the status of Task C (a sketch of such a script follows the overview below). Let's name this PowerShell task Signing Status.
Since we need to retrieve the status of Task C even when Task C has failed, set the condition of the Signing Status task to "Even if a previous task has failed, even if the build was canceled".
Step 3:
Set the condition of Task D to eq(variables['SigningStatus'], 'succeeded').
Overview of the pipeline structure and condition settings:
job:
  - Task A              -- "Only when all previous tasks have succeeded"
  - Task B              -- "Only when all previous tasks have succeeded"
  - Task C              -- "Only when all previous tasks have succeeded"
  - Task NoRelated      -- (whatever condition suits it)
  - Task Signing Status -- "Even if a previous task has failed, even if the build was canceled"
  - Task D              -- "eq(variables['SigningStatus'], 'succeeded')"
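As a sketch of what the Signing Status task could do, here is one way to read Task C's result from the build timeline REST API and expose it as the SigningStatus variable (the display name 'Task C' and the api-version are assumptions, and the job needs access to System.AccessToken):
- powershell: |
    # Look up Task C's record in this build's timeline (sketch only)
    $url = "$(System.TeamFoundationCollectionUri)$(System.TeamProject)/_apis/build/builds/$(Build.BuildId)/timeline?api-version=6.0"
    $timeline = Invoke-RestMethod -Uri $url -Headers @{ Authorization = "Bearer $(System.AccessToken)" }
    $taskC = $timeline.records | Where-Object { $_.name -eq 'Task C' }
    # Expose the result ('succeeded', 'failed', 'canceled', ...) to later tasks
    Write-Host "##vso[task.setvariable variable=SigningStatus]$($taskC.result)"
  displayName: Signing Status
  condition: always()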
You can wrap and run your Task A under PowerShell or some other script. Write the script for Task A in such a way that it never fails the task: if the real work fails, have the script set a pipeline variable for later inspection, but let the task itself always pass.
Based on that pipeline variable you can then run or skip the upcoming tasks.
You can use $? after each command to check whether it was successful; it returns true or false. Or you can use try/catch for error handling.
if ($?)
{
    # execute if the last command returned true
}
else
{
    # execute if the last command returned false
}
Or
try
{
    # run the command with -ErrorAction Stop
    # so that an error lands in the catch block,
    # then the commands to execute if successful
}
catch
{
    # inspect the error with $_
    # or start another command
}
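A concrete (hypothetical) instance of that pattern, combined with the variable-setting idea from the previous answer; the file paths and variable name are placeholders:
try
{
    # -ErrorAction Stop turns a non-terminating error into one the catch block sees
    Copy-Item 'artifact.zip' 'C:\deploy\' -ErrorAction Stop
    Write-Host "##vso[task.setvariable variable=TaskAStatus]succeeded"
}
catch
{
    # $_ holds the error record; record the failure without failing the task
    Write-Host "Task A failed: $_"
    Write-Host "##vso[task.setvariable variable=TaskAStatus]failed"
}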

Azure Data Factory v2: Activity execute pipeline output

Is there a way to reference the output of an executed pipeline in the activity "Execute pipeline"?
I.e.: a master pipeline executes 2 pipelines in sequence. The first pipeline generates its own run_id that needs to be forwarded as a parameter to the second pipeline.
I've read the documentation and checked that the master pipeline logs the output of the first pipeline, but it looks like this is not directly accessible?
Until now we've used only the 2 pipelines without a master pipeline, but we want to re-use the logic more. Currently we have 1 pipeline that calls the next pipeline and forwards the run_id.
Execute Pipeline currently cannot pass anything from inside the invoked pipeline to its output. You can only get the run ID or name.
For some weird reason, the output of Execute Pipeline is returned not as a JSON object but as a string. So if you try to select a property of the output like @activity('ExecutePipelineActivityName').output.something then you get this error:
Property selection is not supported on values of type 'String'
I found that I had to use the following to get the run ID:
@json(activity('ExecutePipelineActivityName').output).pipelineRunId
The execute pipeline activity is just another activity with outputs that can be captured by other activities. https://learn.microsoft.com/en-us/azure/data-factory/control-flow-execute-pipeline-activity#type-properties
If you want to use the runId of the pipeline executed previously, it would look like this:
@activity('ExecutePipelineActivityName').output.pipeline.runId
Hope this helped!
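Putting the two answers together, the master pipeline could forward the first run's ID into the second pipeline's parameters. A sketch of the second Execute Pipeline activity's JSON, with hypothetical activity, pipeline, and parameter names (use whichever output expression works in your factory, per the answers above):
{
    "name": "ExecuteSecondPipeline",
    "type": "ExecutePipeline",
    "dependsOn": [
        { "activity": "ExecuteFirstPipeline", "dependencyConditions": [ "Succeeded" ] }
    ],
    "typeProperties": {
        "pipeline": { "referenceName": "SecondPipeline", "type": "PipelineReference" },
        "parameters": {
            "run_id": "@json(activity('ExecuteFirstPipeline').output).pipelineRunId"
        },
        "waitOnCompletion": true
    }
}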
