Pass a value from inside an Azure ADF pipeline to the PowerShell script that invoked the pipeline

I want to perform some steps in my PowerShell script based on a value from an Azure Data Factory (ADF) pipeline. How can I pass a value from the ADF pipeline back to the PowerShell script that invoked it, so that I can take the appropriate steps in PowerShell based on the value I receive from the pipeline?
NOTE: I am not looking for the run status of the pipeline (success, failure, etc.), but for a variable value obtained inside the pipeline, say, a flag value read from a table using a Lookup activity.
Any thoughts?

KPK, what you're describing can definitely be done, though I don't know where your PowerShell scripts run.
You could put your PowerShell script in an HTTP-triggered Azure Function, please refer to this doc. Then you could get the output of the pipeline in PowerShell:
https://learn.microsoft.com/en-us/powershell/module/azurerm.datafactoryv2/invoke-azurermdatafactoryv2pipeline?view=azurermps-4.4.1#outputs
Then pass the value you want to the HTTP-triggered Azure Function as a parameter.
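If the calling script itself is where you need the value, a minimal sketch looks like the following (using the newer Az.DataFactory module; the resource group, factory, pipeline and activity names are placeholders, and it assumes the flag comes from a Lookup activity named LookupFlag with "First row only" enabled):

# Sketch only: invoke the pipeline, wait for it to finish, then read the Lookup activity's output.
$rg      = "my-resource-group"
$factory = "my-data-factory"

$runId = Invoke-AzDataFactoryV2Pipeline -ResourceGroupName $rg -DataFactoryName $factory -PipelineName "MyPipeline"

# Poll until the run reaches a terminal state.
do {
    Start-Sleep -Seconds 30
    $run = Get-AzDataFactoryV2PipelineRun -ResourceGroupName $rg -DataFactoryName $factory -PipelineRunId $runId
} while ($run.Status -in "InProgress", "Queued")

# Pull the activity runs for this pipeline run and pick the Lookup activity.
$lookup = Get-AzDataFactoryV2ActivityRun -ResourceGroupName $rg -DataFactoryName $factory `
    -PipelineRunId $runId -RunStartedAfter (Get-Date).AddHours(-2) -RunStartedBefore (Get-Date).AddHours(1) |
    Where-Object { $_.ActivityName -eq "LookupFlag" }

# The output shape depends on the Lookup settings; with "First row only" the value sits under firstRow.
$flag = ($lookup.Output.ToString() | ConvertFrom-Json).firstRow.flag
if ($flag -eq 1) { Write-Output "Flag is set - run the follow-up steps here" }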

Related

How to pass variable to Azure DevOps Run Pipeline

I am trying to pass the predefined value RELEASE_RELEASENAME to the Azure DevOps Run Pipeline task, but it always fails with the error: "##[error]Build parameters is not a valid json object array. Example valid object: [{"VAR1":"VALUE1","VAR2":"VALUE2"},{"VAR1":"VALUE1","VAR2":"VALUE2"}]"
You could try changing the expression of the variable to:
[{"var1": "$(Release.ReleaseName)"}]
($(Release.ReleaseName) is the macro form of the predefined RELEASE_RELEASENAME variable, so it expands to the release name when the task runs.)

Send parameters in trigger ADF

I am working on a pipeline and we have DEV, QA and UAT environments, so we are trying to use a parameter in the linked service in order to change the connection to the different databases (based on the environment).
We also have different triggers to run the pipeline per environment. So my question is: is there a way to add a parameter to the trigger, execute the pipeline, and have the linked service connect to a specific environment?
You can pass parameters from any type of trigger. Assuming you have a custom event trigger and SQL Server as the source, check out the example below:
While creating the dataset on top of the SQL Server linked service, create a string parameter for the database name field.
Assign the dataset parameter to the linked service parameter; this is what will carry the value coming from the trigger.
Create a new trigger or reuse an existing one. I am using a custom event trigger in this example.
A custom event trigger can parse and send a custom data payload to your pipeline. You create the pipeline parameters and then fill in the values on the Parameters page. Use the format @triggerBody().event.data.keyName (where keyName is a key in your event's data payload) to parse the data payload and pass values to the pipeline parameters.
For a detailed explanation, see the following articles:
Reference trigger metadata in pipelines
System variables in custom event trigger
Inside the pipeline, when an activity uses the dataset as its source, it will prompt for the dataset parameter. Use dynamic content there and select the pipeline parameter holding the trigger data.
Finally, when the pipeline is triggered, the trigger metadata is passed into the pipeline parameter, which the dataset property then uses to switch between databases on the server dynamically. Use multiple parameters in the same way across your triggers and pipeline for the different environments.
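To illustrate where the data payload comes from, here is a minimal sketch of firing the custom event trigger by publishing an event to the Event Grid custom topic it subscribes to (the endpoint, key and the dbName key are placeholders; the trigger would map @triggerBody().event.data.dbName to the pipeline parameter):

# Sketch only: publish one custom event whose data payload carries the target database name.
$topicEndpoint = "https://my-topic.westeurope-1.eventgrid.azure.net/api/events"
$topicKey      = "<topic-access-key>"

$events = @(
    @{
        id          = [guid]::NewGuid().ToString()
        eventType   = "RunPipeline"
        subject     = "adf/environment-switch"
        eventTime   = (Get-Date).ToUniversalTime().ToString("o")
        dataVersion = "1.0"
        data        = @{ dbName = "QA_Database" }   # read in the trigger as @triggerBody().event.data.dbName
    }
)

Invoke-RestMethod -Method Post -Uri $topicEndpoint `
    -Headers @{ "aeg-sas-key" = $topicKey } `
    -ContentType "application/json" `
    -Body (ConvertTo-Json $events -Depth 5)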

How to reuse the Azure Data Factory pipeline for multiple users

I have an Azure Data Factory pipeline that calls a Databricks notebook.
I have parameterized the pipeline, and via this pipeline I am passing the product name to the Databricks notebook.
Based on the parameter, Databricks pushes the processed data into the corresponding ADLS directory.
Now the problem is: how do I make my pipeline aware of which parameter value to pass to Databricks?
Example: if I pass Nike via ADF to Databricks, the data should be pushed into the Nike directory; if I pass Adidas, the data should go into the Adidas directory.
Please note that I am triggering the ADF pipeline from an Automation account.
As I understand it, you are using product_name = dbutils.widgets.get('product_name') in the Databricks notebook to read the parameter, and you process the data based on that value. The question is how to pass different parameters to the notebook. You create one ADF pipeline, and you can pass different parameters through the triggers that execute it.
Create the ADF pipeline.
Create triggers that pass the parameters to the ADF pipeline.
This way you will have one ADF pipeline with multiple triggered instances of it, each with different parameters such as Adidas, Nike, etc.
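Since the pipeline is triggered from an Automation account, here is a minimal runbook sketch that passes the product name directly (the resource group, factory, pipeline and parameter names are placeholders; it assumes the pipeline exposes a product_name parameter and the Automation account's managed identity has access to the factory):

# Runbook sketch: each schedule or webhook can call this with a different product name.
param(
    [string] $ProductName = "Nike"
)

# Sign in with the Automation account's managed identity.
Connect-AzAccount -Identity | Out-Null

Invoke-AzDataFactoryV2Pipeline `
    -ResourceGroupName "my-resource-group" `
    -DataFactoryName "my-data-factory" `
    -PipelineName "ProcessProductData" `
    -Parameter @{ product_name = $ProductName }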

How to pass parameters to pipeline during trigger run in Azure Data Factory?

As far as I know, I can pass parameters in a manual run (Trigger now). But what if I want the pipeline to run automatically every day and still pass a parameter without going to the Trigger now page?
Another question: while designing my pipeline I set up a few parameters and logic tied to them, such as "if the parameter is null then run all tables; if there is a value, then only run that table", so that a user can re-run a specific table.
However, I noticed the message "Parameters that are not provided a value will not be included in the trigger." Does that mean my pipeline logic cannot be set up this way if I want to trigger it automatically every day?
Thanks a lot!
Implementing heavy ADF logic can be difficult. You can set default values for parameters, but I assume those need to be set dynamically?
You could also use pipeline variables and a "Set variable" activity at the beginning of your pipeline, then use expressions on those variables (populated from the parameters) to decide what runs, for example something like @equals(pipeline().parameters.tableName, '') in an If Condition to branch between "run all tables" and "run one table".
In our project we did something even more involved: we deploy and trigger a pipeline once a week from Azure DevOps. So it is not ADF itself that triggers the pipeline, but a scheduled Azure DevOps run.
PowerShell:
$parameters = @{
    "parameterName1" = $parameterValue1
    "parameterName2" = $parameterValue2
}
Invoke-AzDataFactoryV2Pipeline -DataFactoryName $DataFactoryName -ResourceGroupName $ResourceGroupName `
    -PipelineName $pipelineName -Parameter $parameters
With PowerShell you can implement whatever logic you need at this point when passing values to ADF.
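For the daily-run scenario, the values can simply be computed in the script right before that call, for example (a sketch; runDate and tableName are hypothetical pipeline parameters):

# Compute values dynamically; leave tableName empty so the pipeline's "run all tables" branch is taken.
$parameters = @{
    "runDate"   = (Get-Date).ToString("yyyy-MM-dd")
    "tableName" = ""
}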

Pass parameter to Azure Data Factory-ADF activity based on trigger

I have an Azure Data Factory pipeline where I have to pass a parameter to a Databricks activity. I have multiple event-based triggers (on updates to a blob folder) added for that pipeline. When a specific trigger fires, it should pass a parameter to the Databricks activity, and the notebook should run based on that. Is there any way to pass a parameter from an event-based trigger to the Databricks notebook activity?
The trigger exposes two parameters:
@triggerBody().fileName
@triggerBody().folderPath
You will have to add this to the JSON code of the trigger:
"parameters": {
    "FPath": "@triggerBody().folderPath"
}
Then define FPath as a pipeline parameter, which the trigger fills with @triggerBody().folderPath, and reference it in your other activities as @pipeline().parameters.FPath. Please refer to the link below for a detailed explanation:
https://www.mssqltips.com/sqlservertip/6063/create-event-based-trigger-in-azure-data-factory/
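As a quick sanity check that the trigger metadata actually reaches the pipeline parameter, a small sketch (the names are placeholders; it assumes the pipeline exposes an FPath parameter and that the run objects expose a Parameters dictionary):

# List yesterday's runs and show which FPath value each triggered run received.
Get-AzDataFactoryV2PipelineRun -ResourceGroupName "my-resource-group" -DataFactoryName "my-data-factory" `
    -LastUpdatedAfter (Get-Date).AddDays(-1) -LastUpdatedBefore (Get-Date) |
    Where-Object PipelineName -eq "ProcessBlobPipeline" |
    Select-Object RunId, Status, @{ Name = "FPath"; Expression = { $_.Parameters["FPath"] } }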
