Run a Cosmos SCOPE script in ADF V2 (Azure)

I want to run a Cosmos SCOPE script in an ADF pipeline.
The SCOPE script takes two input parameters:
current date
folder path.
My challenge is how to read and pass these parameters in my SCOPE script (I am using the Scope activity).

Add these two parameters as SCOPE script parameters.
Once you load the script in ADF, you should see those parameters under the activity's Advanced section.
You can then pass static or dynamic values to these parameters from the ADF Scope activity.
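For illustration, a minimal sketch of the script side, assuming your SCOPE build follows the usual convention of substituting externally supplied parameters where @@name@@ tokens appear (the parameter names here are hypothetical):

// Values substituted by the Scope activity when the script is submitted
#DECLARE currentDate string = "@@currentDate@@";
#DECLARE folderPath  string = "@@folderPath@@";
// ...the rest of the script can reference @currentDate and @folderPath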

Azure Data Factory Global variables

If I define a global variable at the Manage panel level, is that variable accessible to other concurrently running data factories?
I am looking into using a global as a validation flag, since it will be available as a return value from child pipelines, and I do not want a concurrent data factory invocation to have scope into that variable.
I believe you are referring to Global parameters in Azure Data Factory. If that is the case, then the answer is no. Global parameters are constants across a particular data factory that can be consumed by a pipeline in any expression. You cannot use a global parameter in any data factory other than the one in which it was created, but it can be referenced by other pipelines within the same data factory.
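Within the data factory that owns it, a global parameter is consumed in any pipeline expression like this (the parameter name is illustrative):

@pipeline().globalParameters.validationFlag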
Adding to @KranthiPakala-MSFT's answer: yes, global parameters are constants, and we cannot use a global parameter of one ADF in another ADF.
But if you want to consume the value elsewhere, you can store it in a file in Blob storage using the Copy activity's source "Additional columns" option,
and then read that file's value with a Lookup activity in the other ADF workspace.
Alternatively, store it in a variable with a Set Variable activity,
run that pipeline, and from the other ADF fetch the activity run details with a Web activity calling the REST API below:
https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DataFactory/factories/{factoryName}/pipelineruns/{runId}/queryActivityruns?api-version=2018-06-01
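For reference, the queryActivityRuns call needs a POST body with a lastUpdatedAfter/lastUpdatedBefore window. Below is a minimal PowerShell sketch of the same call shape, assuming hypothetical resource names, an already-acquired bearer token for management.azure.com, and a hypothetical "Set MyFlag" Set Variable activity:

# $subscriptionId, $resourceGroupName, $factoryName, $runId and $token are placeholders
$uri = "https://management.azure.com/subscriptions/$subscriptionId" +
       "/resourceGroups/$resourceGroupName/providers/Microsoft.DataFactory" +
       "/factories/$factoryName/pipelineruns/$runId/queryActivityruns?api-version=2018-06-01"

$body = @{
    lastUpdatedAfter  = (Get-Date).AddDays(-1).ToUniversalTime().ToString("o")  # required filter window
    lastUpdatedBefore = (Get-Date).ToUniversalTime().ToString("o")
} | ConvertTo-Json

$runs = Invoke-RestMethod -Method Post -Uri $uri -Body $body `
        -ContentType "application/json" -Headers @{ Authorization = "Bearer $token" }

# Each entry in 'value' is one activity run; read the Set Variable activity's output
($runs.value | Where-Object { $_.activityName -eq "Set MyFlag" }).output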

Use secret variable sent via REST API

I currently have an Azure DevOps pipeline that I trigger with a call to the REST API, using the "Run pipeline" endpoint: https://learn.microsoft.com/en-us/rest/api/azure/devops/pipelines/runs/run-pipeline?view=azure-devops-rest-6.1
With this API I can trigger a run of my AzDO pipeline and send variables that are NOT secrets. I can then access these variables as environment variables in the AzDO pipeline.
The format of these variables is defined here: https://learn.microsoft.com/en-us/rest/api/azure/devops/pipelines/runs/run-pipeline?view=azure-devops-rest-6.1#variable. Please note the isSecret part.
However, as soon as I set isSecret to true, I'm unable to read these variables as environment variables on the pipeline side. They just appear as if they don't contain anything.
I know I could use the Library and a variable group to pass secrets to the pipeline, but this isn't what I'm trying to do. I'd like to know if I can pass secrets to the pipeline via the REST API.
I've been looking around for a few hours and I haven't found anything.
When isSecret is set to true in the API body, the echoed value in the pipeline output is masked as ***:
"variables": { "Variable1": { "value": "{Some Value}", "isSecret": true } }
If you want to inspect these variables on the pipeline side, you could write the variable's value into a .txt file and publish that file to Artifacts to check.
If you want to use a secret value passed through this API as an environment variable in your pipeline, you must map it to an environment variable explicitly instead of referencing it directly: https://learn.microsoft.com/en-us/azure/devops/pipelines/process/variables?view=azure-devops&tabs=yaml%2Cbatch#secret-variables
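For completeness, a minimal PowerShell sketch of the triggering call with a secret variable (organization, project, pipeline ID, and PAT are placeholders; the api-version shown is the 6.1 preview of the Runs endpoint):

# A PAT with Build (read & execute) scope is assumed in $pat
$uri  = "https://dev.azure.com/myorg/myproject/_apis/pipelines/42/runs?api-version=6.1-preview.1"
$auth = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat"))

$body = @{
    variables = @{
        MySecret = @{ value = "s3cr3t-value"; isSecret = $true }
    }
} | ConvertTo-Json -Depth 5

Invoke-RestMethod -Method Post -Uri $uri -Body $body `
    -ContentType "application/json" -Headers @{ Authorization = "Basic $auth" }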

Send parameters in an ADF trigger

I am working on a pipeline, and we have DEV, QA, and UAT environments, so we are trying to use a parameter in the linked service in order to change the connection to the different databases (based on the environment).
We also have different triggers to run the pipeline based on the environment. So my question is: is there a way to add a parameter in the trigger, execute the pipeline, and tell the linked service to connect to a specific environment?
You can pass parameters through any type of trigger. Assuming you have a custom event trigger and SQL Server as the source, check out the example below:
While creating the SQL Server linked service, create a string parameter for the database name field.
Create a new parameter in the dataset.
Assign the dataset parameter to the linked service parameter; this is what will carry the value coming from the trigger.
Create a new trigger or use an existing one; I am using a custom event trigger in this example.
A custom event trigger can parse and send a custom data payload to your pipeline. You create the pipeline parameters and then fill in the values on the trigger's Parameters page, using the format @triggerBody().event.data._keyName_ to parse the data payload and pass values to the pipeline parameters.
For a detailed explanation, see the following articles:
Reference trigger metadata in pipelines
System variables in custom event trigger
Inside the pipeline, when the dataset is used as a source in an activity, it prompts for the dataset parameter. Here, use dynamic content and select the pipeline parameter that holds the trigger data.
Finally, when the pipeline is triggered, the trigger metadata is passed to the pipeline parameter, which is then used in the dataset property to switch between databases on a server dynamically. Use multiple parameters in the same way across your different triggers and pipelines for each environment, as in the sketch below.
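To make the chain concrete, here is a sketch of the two expressions involved, with hypothetical names (a dbName key in the event payload, a DBName pipeline parameter, and a DatabaseName dataset parameter):

Trigger's Parameters page, value for pipeline parameter DBName:
@triggerBody().event.data.dbName

Activity source, dynamic content for dataset parameter DatabaseName:
@pipeline().parameters.DBName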

Pass parameter to Azure Data Factory (ADF) activity based on trigger

I have an Azure Data Factory pipeline where I have to pass a parameter to a Databricks activity. I have multiple event-based triggers (fired on updates to a blob folder) added for that pipeline. When a specific trigger is activated, it should pass a parameter to the Databricks activity, and the notebook should run based on that. Is there any way to pass a parameter from an event-based trigger to a Databricks Notebook activity?
The trigger exposes two properties:
@triggerBody().fileName
@triggerBody().folderPath
You will have to add this to the trigger's JSON definition:
"parameters": {
    "FPath": "@triggerBody().folderPath"
}
The trigger then fills the pipeline parameter FPath, which you can reference in other activities as @pipeline().parameters.FPath. Please refer to the link below for a detailed explanation:
https://www.mssqltips.com/sqlservertip/6063/create-event-based-trigger-in-azure-data-factory/
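Carrying this through to the Databricks side: the Notebook activity can forward the pipeline parameter through its baseParameters, which the notebook then reads with dbutils.widgets.get. A sketch of the activity's typeProperties (the notebook path and parameter name are illustrative):

"typeProperties": {
    "notebookPath": "/Shared/ProcessFolder",
    "baseParameters": {
        "folderPath": "@pipeline().parameters.FPath"
    }
}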

Pass a value from inside an Azure ADF pipeline to the PowerShell that invoked it

I want to perform some steps in my PowerShell script based on a value from an Azure Data Factory (ADF) pipeline. How can I pass a value from an ADF pipeline back to the PowerShell session that invoked it, so that I can take the appropriate steps in PowerShell based on the value I receive from the pipeline?
NOTE: I am not looking for the run status of the pipeline (success, failure, etc.), but for a variable value obtained inside the pipeline, say, a flag value read from a table with a Lookup activity.
Any thoughts?
KPK, the requirement you're describing can definitely be fulfilled, though I do not know where your PowerShell scripts run.
You could write your PowerShell scripts in an HTTP-trigger Azure Function (please refer to this doc). Then you could get the output of the pipeline in PowerShell:
https://learn.microsoft.com/en-us/powershell/module/azurerm.datafactoryv2/invoke-azurermdatafactoryv2pipeline?view=azurermps-4.4.1#outputs
Then pass the value you want to the HTTP-trigger Azure Function as a parameter.
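The linked cmdlet itself returns only the run ID, so one way to surface the Lookup value to the calling script is to poll the run and then read that activity's output. A sketch using the newer Az.DataFactory cmdlets (factory, pipeline, and activity names are hypothetical; the AzureRM cmdlets in the linked doc behave the same way):

# Kick off the pipeline; the cmdlet returns the pipeline run ID
$runId = Invoke-AzDataFactoryV2Pipeline -ResourceGroupName "my-rg" `
            -DataFactoryName "my-adf" -PipelineName "MyPipeline"

# Poll until the run finishes
do {
    Start-Sleep -Seconds 15
    $run = Get-AzDataFactoryV2PipelineRun -ResourceGroupName "my-rg" `
              -DataFactoryName "my-adf" -PipelineRunId $runId
} while ($run.Status -in "InProgress", "Queued")

# Fetch the activity runs and read the Lookup activity's JSON output
$activities = Get-AzDataFactoryV2ActivityRun -ResourceGroupName "my-rg" `
    -DataFactoryName "my-adf" -PipelineRunId $runId `
    -RunStartedAfter (Get-Date).AddHours(-1) -RunStartedBefore (Get-Date)

$lookup = $activities | Where-Object { $_.ActivityName -eq "LookupFlag" }
$flag   = ($lookup.Output.ToString() | ConvertFrom-Json).firstRow.flag

# Branch your PowerShell steps on the returned flag value
if ($flag -eq 1) { Write-Host "Flag is set; running the extra steps..." }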
