Argo workflow: schedule once at a specific time - cron

I want to trigger an Argo workflow to start running at a specific time. I am currently using a WorkflowTemplate and the REST API to submit a workflow from this template. Is there a way to tell Argo to start the workflow at a specific time?
I am aware of the existence of cron-workflow and cron-workflow-template, but I have not been able to figure out how to use either a workflow or a cron-workflow to achieve what I want.
To have any scheduling, must I use cron-workflow? Or is there a way to trigger a regular workflow after a delay, by passing the schedule time in submitOptions or in some other way through the REST API?
If I must use a cron workflow, what should I set the schedule value to? I don't want it to run automatically or periodically, but only when I want and at a specific time. Is there a way to achieve that using a cronWorkflowTemplate and the REST API?
I would appreciate any help/pointers.

Likely not the answer you're looking for, but if you are able to alter your WorkflowTemplate, you can make the first step an immediate suspend step whose duration is provided as an input (by you, at the moment you decide you want to submit the workflow, just not run it yet). For instance, your workflow may look something like this:
apiVersion: argoproj.io/v1alpha1
kind: WorkflowTemplate
metadata:
  name: my-workflow-template
spec:
  entrypoint: my-wf
  arguments:
    parameters:
      - name: wait-time-sec
        value: "0" # override me
  templates:
    - name: my-wf
      inputs:
        parameters:
          - name: wait-time-sec
      steps:
        - - name: wait
            template: wait-template
            arguments:
              parameters:
                - name: wait-time-sec
                  value: "{{inputs.parameters.wait-time-sec}}"
        - - name: rest of original workflow steps...
    - name: wait-template
      inputs:
        parameters:
          - name: wait-time-sec
      suspend:
        duration: "{{inputs.parameters.wait-time-sec}}"
...
When you want to submit it, just pass in the parameter wait-time-sec. Granted, if you would rather give a specific time, you would either have to calculate how many seconds away that is before submitting, or add a simple script step before the wait step to do that calculation for you, taking a datetime as input and outputting seconds for the wait step to use.
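A minimal sketch of such a calculation step, assuming an Argo script template and a hypothetical start-at parameter in ISO 8601 form with a UTC offset; whatever the script prints becomes {{steps.compute-wait-seconds.outputs.result}}, which could feed the suspend step's duration:

- name: compute-wait-seconds
  inputs:
    parameters:
      - name: start-at # e.g. "2024-06-01T19:00:00+00:00" (assumed format)
  script:
    image: python:3.11-alpine
    command: [python]
    source: |
      # Print the whole seconds from now until the requested start time, never negative.
      from datetime import datetime, timezone
      target = datetime.fromisoformat("{{inputs.parameters.start-at}}")
      delta = (target - datetime.now(timezone.utc)).total_seconds()
      print(max(int(delta), 0))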

Related

How to get yaml-pipeline demands?

How to get the demands of a pipeline created from a YAML file? The YAML file contains the demands:
...
jobs:
  - job: my_job
    displayName: My Job
    pool:
      name: my_pool
      demands:
        - Agent.Name -equals My-Agent-1
    steps:
...
If the pipeline is created through the user interface, then I can get the demands using a GET request; the response body will contain the demands.
But if I use the same GET request for a YAML pipeline, then the response body will not contain the demands! Do I need to parse the YAML file myself?!
Unfortunately, I did not find an answer to my question.
The REST API Definitions - List doesn't support getting the definitions declared inside the YAML. You need to parse the YAML file yourself.

How to avoid hardcoded picklist values in Azure pipeline yaml?

Can Azure DevOps pipeline YAML drive picklist values from a file, so as to avoid hardcoded values in the YAML? Even better, allow an API call to dynamically populate the list.
Naturally we can roll our own synthesis of pipeline YAML code; short of that, it would be nice if it allowed an include-snippets-file ability ... one means would be to run a pipeline that generates a pipeline which then gets run.
Today's Azure pipeline technique is bad in that it requires hardcoding:
parameters:
  - name: some_parm
    type: string
    default: fav_value
    values:
      - val01
      - val02
      - val03
What is needed: populate the list dynamically, or at a minimum from a file:
parameters:
  - name: some_parm
    type: string
    default: fav_value
    values:
      ${{ some_file_or_api_lookup }}
Possibly this YAML preprocessor could work: https://github.com/birchb1024/goyamp ... dunno yet.
UPDATE 20220906: so far no solution found; suggestions welcome.

Run Azure DevOps CI Pipeline with Scheduler and Variable Group

We have one pipeline and one only (we cannot and do not want to create a 2nd pipeline or do it with a separate pipeline; it has to be done in the same pipeline). That pipeline has a task to either stop or start a function, while accepting a variable group (it's required) from the library (we specify those store A-E variables in the YAML, but they also exist in the library) and specifying the Azure subscription. Currently, we run this pipeline manually.
What I'm looking for is a feature to automate this pipeline to run at 7 PM CST with the start function as the task, accept a variable group, and specify which Azure subscription I want. Then, at 6 AM CST the next day, I need that SAME pipeline to run a build with the stop function as the task, accept a variable group, and specify which Azure subscription I want.
I found a scheduler feature in the CI pipeline, but it doesn't allow me to specify which variable group I want from the library, offers no option to select either starting or stopping the function, and no option to select the subscription.
If it helps at all, this is the YAML code that I have (some stuff has been removed for privacy purposes):
trigger: none

pool:
  vmImage: 'windows-latest'

parameters:
  - name: variableGroup
    displayName: Variable Group
    type: string
    values:
      - 'variable for store A'
      - 'variable for store B'
      - 'variable for store C'
      - 'variable for store D'
      - 'variable for store E'
  - name: artifactVersion
    displayName: ArtifactVersion (* (latest) or 1.{sprintNumber}.{ReleaseNo})
    type: string
    default: '*'
  - name: Function
    displayName: Function
    type: string
    default: 'deploy'
    values:
      - deploy
  - name: task
    displayName: ExecuteTask
    type: string
    default: ''
    values:
      - start the function
      - stop the function
  - name: Subscription
    displayName: Subscription
    type: string
    values:
      - 'sandbox'
      - 'production'
I am afraid there is no method that can meet your requirements for the time being.
Refer to this doc: Scheduled triggers
schedules:
  - cron: string # cron syntax defining a schedule
    displayName: string # friendly name given to a specific schedule
    branches:
      include: [ string ] # which branches the schedule applies to
      exclude: [ string ] # which branches to exclude from the schedule
    always: boolean # whether to always run the pipeline or only if there have been source code changes since the last successful scheduled run. The default is false.
The schedule trigger does not support setting target values for parameters.
On the other hand, when you set runtime parameters, the option or drop-down list to select a value is only displayed when the pipeline is run manually.
I can fully understand your requirements.
You could add your request for this feature on our UserVoice site (https://developercommunity.visualstudio.com/content/idea/post.html?space=21), which is our main forum for product suggestions.
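That said, a rough workaround sketch of my own (not a built-in feature): scheduled runs always use each parameter's default value, so you could define two cron schedules and let an early script step derive start vs. stop from the time of day. The cron times, branch name, and hour threshold below are assumptions to adapt:

schedules:
  - cron: '0 1 * * *' # 01:00 UTC = 7 PM CST -> start
    displayName: Evening start
    branches:
      include: [ main ]
    always: true
  - cron: '0 12 * * *' # 12:00 UTC = 6 AM CST -> stop
    displayName: Morning stop
    branches:
      include: [ main ]
    always: true

steps:
  - powershell: |
      # Scheduled runs cannot pick parameter values, so derive the action
      # from the current UTC hour and expose it as a job-scoped variable.
      if ("$(Build.Reason)" -eq "Schedule") {
        $action = if ((Get-Date).ToUniversalTime().Hour -lt 6) { "start" } else { "stop" }
        Write-Host "##vso[task.setvariable variable=functionAction]$action"
      }
    displayName: Derive start/stop for scheduled runs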

YAML multi-stage run using dependsOn and conditions

I have a need to run the "Secure" stage if one of the previous INT stages passed successfully. I tried with dependsOn and conditions, but I can't find the solution.
I have a need to run the "Secure" stage if one of the previous INT stages passed successfully.
I am afraid there is no such out-of-the-box YAML syntax to achieve this at the moment.
Since we need to set multiple dependsOn entries for the stage Secure:
- stage: Secure
  dependsOn:
    - INT_API
    - INT_FuncIntergration
    - INT_Web
  condition: or(succeeded('INT_API'), succeeded('INT_FuncIntergration'), succeeded('INT_Web'))
Restriction:
This method only ensures that the stage executes if at least one of the previous stages succeeded, and it can only start after all of the previous stages have finished. If you need the current stage to run as soon as just one of the previous stages succeeds, without waiting for the others, this method is still not enough.
That is because there is no "OR" syntax for dependsOn, and we cannot add a condition to the dependsOn itself, like:
- stage: Secure
  ${{ if eq(result.INT_API, successed) }}:
    dependsOn:
      - INT_API
      - INT_FuncIntergration
      - INT_Web
  condition: or(succeeded('INT_API'), succeeded('INT_FuncIntergration'), succeeded('INT_Web'))
That does not work because the ${{ if }} expression is evaluated when the YAML is compiled, and at that time the results of the previous stages are not yet known.
You could submit this request for an "OR" condition to our UserVoice site (https://developercommunity.visualstudio.com/content/idea/post.html?space=21), which is our main forum for product suggestions. Thank you for helping us build a better Azure DevOps.
Workaround:
The main idea of the solution is: set dependsOn for the stage Secure to [], then add an inline PowerShell task before the other tasks. This task calls the REST API Definitions - Get to check whether any stage in the current pipeline is still in an in-progress or queued state; if so, it waits 30 seconds and checks again, looping until no other stage is in progress or queued. Only then will the other tasks execute.
You could check my previous ticket for detailed info.
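A rough sketch of that polling idea, substituting the build Timeline REST API for the Definitions - Get call mentioned above (the stage name, 30-second interval, and api-version are assumptions):

- stage: Secure
  dependsOn: [] # run independently; gate inside the job instead
  jobs:
    - job: wait_for_other_stages
      steps:
        - powershell: |
            # Poll this build's timeline until no other stage is pending or running.
            $url = "$(System.CollectionUri)$(System.TeamProject)/_apis/build/builds/$(Build.BuildId)/timeline?api-version=6.0"
            do {
              Start-Sleep -Seconds 30
              $timeline = Invoke-RestMethod -Uri $url -Headers @{ Authorization = "Bearer $env:SYSTEM_ACCESSTOKEN" }
              $busy = $timeline.records | Where-Object {
                $_.type -eq 'Stage' -and $_.name -ne 'Secure' -and $_.state -ne 'completed'
              }
            } while ($busy)
          displayName: Wait for other stages to finish
          env:
            SYSTEM_ACCESSTOKEN: $(System.AccessToken)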
In similar cases, I use the actual result status variables of all of its depending stages. Here you can build quite complicated logic if necessary, because there are more options than the default functions offer.
So if you want the Secure stage to run if any (one or more) of its predecessors ran successfully, then you could do something like:
- stage: Secure
  dependsOn:
    - stage_A
    - stage_B
  condition: |
    and(
      in(dependencies.stage_A.result, 'Succeeded', 'SucceededWithIssues', 'Skipped'),
      in(dependencies.stage_B.result, 'Succeeded', 'SucceededWithIssues', 'Skipped'),
      not(
        and(
          eq(dependencies.stage_A.result, 'Skipped'),
          eq(dependencies.stage_B.result, 'Skipped')
        )
      )
    )
In the above example, the Secure stage would run if all the depending stages either ran successfully or did not run at all, except if none of them ran.
Hope this helps!

Azure DevOps pipelines for project and deployment specific variables

I am wanting an opinion on the following. I have a JavaScript QnA bot in Azure DevOps, and an Azure pipeline that deploys it to an Azure environment. This works well. However, this is a common-use bot that can be used in multiple scenarios: write once, use many. So I want to variabl-ize the process for multiple environments (DEV vs PROD) and instances (PROD1, PROD2, PROD3, ...).
1st Case: Within the project there is a .env file with name-value pairs stored. I need distinct values for multiple environments and instances. One option could be a distinct file per environment+instance, so:
.env.DEV, .env.PROD1, .env.PROD2, .env.PROD3, etc.
And then, as part of the build process that zips the files, rename only one of the .env files by dropping the suffix based on the case, deleting the other .env files prior to zipping; something like the sketch below is what I have in mind. Is this a good way to do it, or is there a more standardized process that I should use?
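(Sketch only; the environment parameter is hypothetical and a Bash-capable agent is assumed.)

steps:
  - script: |
      # Promote the selected file to .env, then drop all suffixed variants.
      cp .env.${{ parameters.environment }} .env
      rm -f .env.*
    displayName: Keep only the .env for ${{ parameters.environment }}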
2nd Case: As part of the deployment, I want to variabl-ize the azure-pipeline.yml file so that the target web app, resource group, subscription, etc. are dynamic (different for DEV, PROD1, PROD2, ...). I can create multiple YAML files and link them to separate pipelines. Is this the way? Or am I creating one pipeline and somehow toggling these values for 'n' different cases?
I can hack something together, but I wanted to make sure I was using the right approach before starting.
Thanks in advance,
Jake.
1st Case:
Is this a good way to do it, or is there a more standardized process that I should use?
I suggest using the Replace Tokens task to achieve your needs, which could be more convenient. Here is my sample:
1. *.env file:
name1:#{value1}#
name2:#{value2}#
name3:#{value3}#
2. Create variables and set values when running the pipeline:
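For instance, the equivalent declared directly in YAML (variable names assumed to match the tokens above; values chosen to match the result in step 4):

variables:
  - name: value1
    value: a
  - name: value2
    value: b
  - name: value3
    value: c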
3. Replace Tokens task:
- task: replacetokens@3
  inputs:
    targetFiles: '**/*.env'
    encoding: 'auto'
    writeBOM: true
    actionOnMissing: 'warn'
    keepToken: true
    tokenPrefix: '#{'
    tokenSuffix: '}#'
    useLegacyPattern: false
    enableTelemetry: false
4. Result of *.env file:
name1:a
name2:b
name3:c
2nd Case:
I can create multiple YAML files and link them to separate pipelines. Is
this the way? Or am I creating one pipeline and somehow toggling these
values for 'n' different cases?
I suggest using parameters and selecting values when running the pipeline. For example:
parameters:
  - name: subscription
    type: string
    default: test1
    values:
      - test1
      - test2
  - name: WebAppName
    type: string
    default: test1
    values:
      - test1
      - test2
  - name: ResourceGroupName
    type: string
    default: test1
    values:
      - test1
      - test2

steps:
  - task: AzureAppServiceManage@0
    inputs:
      azureSubscription: ${{ parameters.subscription }}
      Action: 'Stop Azure App Service'
      WebAppName: ${{ parameters.WebAppName }}
      SpecifySlotOrASE: true
      ResourceGroupName: ${{ parameters.ResourceGroupName }}
You can choose the resource group name and subscription name when running the pipeline.
