I would like to execute certain REST commands after my ADF pipeline has finished executing.
Does anyone have any ideas on how I can do this?
Thank you!
Have you looked at using a .NET custom activity at the end of your pipeline? The other alternative could be to use PowerShell to look for a completed pipeline and then trigger your REST command.
https://learn.microsoft.com/en-us/azure/data-factory/data-factory-use-custom-activities
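If you go the PowerShell route, a minimal sketch might look like this (it assumes the AzureRM Data Factory module; the resource names, endpoint URL, and polling interval are all placeholders):

# Poll the pipeline's activity windows until they all report "Ready",
# then fire the REST call. All names below are placeholders.
do {
    Start-Sleep -Seconds 60
    $windows = Get-AzureRmDataFactoryActivityWindow `
        -ResourceGroupName "MyResourceGroup" `
        -DataFactoryName "MyDataFactory" `
        -PipelineName "MyPipeline"
} while ($windows | Where-Object { $_.WindowState -ne "Ready" })

# Pipeline finished: call the REST endpoint.
Invoke-RestMethod -Method Post -Uri "https://example.com/api/notify" `
    -Body (@{ status = "done" } | ConvertTo-Json) `
    -ContentType "application/json"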
I have a PowerShell CRUD script that I am trying to run with an Azure Function on a periodic basis, in order to test that the resource provider is working correctly.
I am assuming this is done by using a Timer Trigger.
Has anyone done something similar? If so, can I please see an example?
I have reproduced this in my environment. I opted for PowerShell Core as the runtime stack, then created the PowerShell function as below:
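For reference, a minimal run.ps1 for a Timer Trigger function looks roughly like this; the Invoke-RestMethod call stands in for the CRUD script, and its URL is a placeholder:

# run.ps1 — the timer binding passes a $Timer object into the function.
param($Timer)

if ($Timer.IsPastDue) {
    Write-Host "Timer is running late."
}

# Call your CRUD script / REST checks here (placeholder URL).
Invoke-RestMethod -Method Get -Uri "https://example.com/api/health"

Write-Host "Timer trigger executed at: $(Get-Date)"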
Then I listed the available modules:
Get-Module -ListAvailable
I set the Timer Trigger schedule to every minute (0 */1 * * * * in NCRONTAB format), and the function ran on that schedule as expected.
I'm discovering Rundeck; is there a way to create and run a Terraform process from inside Rundeck?
Thanks in advance
Regards
You can call your Terraform scripts from the command step or the script step in your workflow, in the same way you can call any program. Here you can see a complete example of Rundeck+Terraform integration (and of how to automate your deploys).
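For example, a script step could simply run the standard Terraform CLI (the project path is a placeholder):

cd /path/to/terraform/project
terraform init
terraform apply -auto-approve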
UPDATE: You can also test (and collaborate on) this unofficial plugin.
Is it possible to loop tasks on the agent in build or release pipelines?
For example, we have a list of blocks in a JSON file, and for each block I would like to run the agent's task again.
There is no for-each control for running tasks multiple times in your pipeline.
If you just want to run the tasks multiple times, you could simply add the task multiple times in a Classic UI pipeline, or use a template in a YAML pipeline (see the sketch below).
For how to use a template, see the answer to this question: Azure Devops YAML pipeline - how to repeat a task.
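As a rough sketch of the template approach (the file names and the blocks parameter are invented for illustration):

# steps-template.yml
parameters:
- name: blocks
  type: object
  default: []
steps:
- ${{ each block in parameters.blocks }}:
  - script: echo "Processing ${{ block }}"

# azure-pipelines.yml
steps:
- template: steps-template.yml
  parameters:
    blocks: [one, two, three]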
If you just want to use a loop to re-run/retry a failed task/step, that is also not supported at the moment.
There is a related user voice suggestion:
Rerun failed build task/step
https://developercommunity.visualstudio.com/idea/365697/rerun-failed-build-taskstep.html
Multiple people have commented and echoed the request; you could monitor the status of the user voice above.
How can I get the values of the following in Data Factory:
Last time the pipeline was triggered
Current starting time of the triggered pipeline
There is no easy way. As far as I know, you cannot do that with just Data Factory; I'd run an Azure Function that looks it up using the PowerShell or Python SDK.
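A sketch of that lookup using the ADF v2 PowerShell cmdlets (all resource and pipeline names are placeholders):

# List recent runs and pick the latest one for the pipeline in question.
$runs = Get-AzureRmDataFactoryV2PipelineRun `
    -ResourceGroupName "MyResourceGroup" `
    -DataFactoryName "MyDataFactory" `
    -LastUpdatedAfter (Get-Date).AddDays(-7) `
    -LastUpdatedBefore (Get-Date)

$lastRun = $runs |
    Where-Object { $_.PipelineName -eq "MyPipeline" } |
    Sort-Object RunStart -Descending |
    Select-Object -First 1

$lastRun.RunStart   # last time the pipeline was triggered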
This one is easy; you can get it using:
"@trigger().startTime"
And that will give you the current starting time. Doc here: https://learn.microsoft.com/en-us/azure/data-factory/control-flow-system-variables
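For example, it can be used in any dynamic-content expression, such as building a dated output path (the folder layout here is just an illustration):

@concat('output/', formatDateTime(trigger().startTime, 'yyyy/MM/dd'), '/data.csv')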
Hope this helped!
You can see some of this information on the Data Factory monitor/pipeline runs page:
It includes the last run time (triggered time) and the DURATION.
But for now, we cannot export it.
Hope this helps.
I have a data factory that I would like to publish; however, I want to delay one of the pipelines from running, as it uses a shared resource that isn't quite ready.
If possible I would like to allow the previous pipelines to run and then enable the downstream pipeline when the resource is ready for it.
How can I disable a pipeline so that I can re-enable it at a later time?
Edit your trigger and make sure Activated is set to No. And of course, don't forget to publish your changes!
It's not really possible in ADF directly. However, I think you have a couple of options for dealing with this.
Option 1.
Chain the datasets in the activities to enforce a fake dependency that makes the second activity wait. This is a bit clunky and requires provisioning fake datasets, but it could work.
Option 2.
Manage it at a higher level with something like PowerShell.
For example:
Use the following cmdlet to check the status of the first activity, and wait, perhaps in some sort of looping process.
Get-AzureRmDataFactoryActivityWindow
Next, use the following cmdlet to pause/unpause the downstream pipeline as required.
Suspend-AzureRmDataFactoryPipeline
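Put together, that might look like this (resource names are placeholders; Resume-AzureRmDataFactoryPipeline is the counterpart for unpausing):

# Pause the downstream pipeline until the shared resource is ready.
Suspend-AzureRmDataFactoryPipeline -ResourceGroupName "MyResourceGroup" `
    -DataFactoryName "MyDataFactory" -Name "DownstreamPipeline"

# ...later, once Get-AzureRmDataFactoryActivityWindow reports the
# upstream windows as Ready, resume it:
Resume-AzureRmDataFactoryPipeline -ResourceGroupName "MyResourceGroup" `
    -DataFactoryName "MyDataFactory" -Name "DownstreamPipeline"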
Hope this helps.
You mentioned publishing, so if you are publishing through Visual Studio, it is possible to disable a pipeline by setting its "isPaused" property to true in the pipeline's .json configuration file.
Property for making pipeline paused
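A trimmed sketch of where the property sits in the pipeline JSON (other fields elided):

{
    "name": "MyPipeline",
    "properties": {
        "activities": [ ... ],
        "isPaused": true
    }
}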
You can disable a pipeline by clicking Monitor & Manage in the Data Factory you are using. Then click on the pipeline, and in the upper left corner you have two options:
Pause: Does not terminate the currently running job, but will not start the next one
Terminate: Terminates all job instances (and does not start future ones)
GUI disabling pipeline
(TIP: Paused and Terminated pipelines are colored orange; resumed ones are green)
Right click on the pipeline in the "Monitor and Manage" application and select "Pause Pipeline".
In case you're using ADF V2 and your pipeline is scheduled to run using a trigger, check which trigger your pipeline uses. Then go to the Manage tab and click on Author->Triggers. There you will get an option to stop the trigger. Publish the changes once you've stopped the trigger.
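If you prefer to script it, the trigger can also be stopped and restarted with PowerShell (assuming the AzureRM module; names are placeholders):

Stop-AzureRmDataFactoryV2Trigger -ResourceGroupName "MyResourceGroup" `
    -DataFactoryName "MyDataFactory" -Name "MyTrigger"

# ...and when you are ready to re-enable it:
Start-AzureRmDataFactoryV2Trigger -ResourceGroupName "MyResourceGroup" `
    -DataFactoryName "MyDataFactory" -Name "MyTrigger"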