How to use dynamic content in relative URL in Azure Data Factory - azure

I have a REST data source where I need to pass in multiple parameters to build out a dataset in Azure Data Factory V2.
I have about 500 parameters that I need to pass in, so I don't want to pass them in individually. I can manually put them in a list (I don't have to link to another data source to source them). The parameters would be something like [a123, d345, e678]
I'm working in the UI. I cannot figure out how to pass these into the relative URL (where it says Parameter) to then form the dataset. I could do this in Power BI using functions and parameters, but I can't figure it out in Azure Data Factory as I'm completely new to it. I'm using the Copy Data functionality in ADF to do this.
The sink would be a json file in an Azure blob that I can then access via Power BI. I'm fine with this part.
Relative URL with Parameter requirement
How to add dynamic content

I'm afraid your requirement can't be implemented directly. As you know, an ADF REST dataset is used to retrieve data from a REST endpoint with GET or POST HTTP requests. There is no way to configure a list of parameters in the relativeUrl property that ADF would loop through automatically for you.
Two ways to reach your goal:
1. Loop over your parameter array and pass each single item into relativeUrl to execute the copy activity individually. For this you can use a ForEach activity in ADF (see the sketch below).
2. Write a wrapper API that accepts the list parameter in the request body and performs the loop inside the API as part of your business logic.
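For option 1, here is a minimal sketch of what the pipeline JSON could look like. It assumes a REST dataset named RestDataset with a string parameter code that is used inside its relativeUrl, a blob sink dataset named BlobJsonDataset, and a pipeline array parameter codes; all of these names are illustrative, and the source/sink settings are simplified.

    {
      "name": "CopyPerParameterPipeline",
      "properties": {
        "parameters": {
          "codes": { "type": "Array", "defaultValue": [ "a123", "d345", "e678" ] }
        },
        "activities": [
          {
            "name": "ForEachCode",
            "type": "ForEach",
            "typeProperties": {
              "items": { "value": "@pipeline().parameters.codes", "type": "Expression" },
              "activities": [
                {
                  "name": "CopyRestToBlob",
                  "type": "Copy",
                  "inputs": [
                    {
                      "referenceName": "RestDataset",
                      "type": "DatasetReference",
                      "parameters": { "code": { "value": "@item()", "type": "Expression" } }
                    }
                  ],
                  "outputs": [
                    { "referenceName": "BlobJsonDataset", "type": "DatasetReference" }
                  ],
                  "typeProperties": {
                    "source": { "type": "RestSource" },
                    "sink": { "type": "JsonSink" }
                  }
                }
              ]
            }
          }
        ]
      }
    }

Each iteration passes @item() into the dataset's code parameter, so the REST call is made once per value in the array. Note that ForEach runs its iterations with limited parallelism (the batchCount setting), so 500 calls will take a while.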


Send parameters in trigger ADF

I am working on a pipeline, and we have DEV, QA and UAT environments, so we are trying to use a parameter in the linked service in order to change the connection to the different databases (based on the environment).
We also have different triggers to run the pipeline based on the environment. So my question is: is there a way to add a parameter to the trigger, execute the pipeline, and tell the linked service to connect to a specific environment?
You can pass parameters to any type of trigger. Assuming you have a custom event trigger and SQL Server as the source, check out the example below:
When creating the dataset on top of the SQL Server linked service, create a string parameter for the database name field.
Create New parameter in dataset
Assign the dataset parameter to the linked service parameter; this is what will hold the value coming from the trigger.
Create a new trigger or use an existing one; I am using a custom event trigger as an example.
A custom event trigger can parse and send a custom data payload to your pipeline. You create the pipeline parameters, and then fill in the values on the Parameters page. Use the format @triggerBody().event.data._keyName_ to parse the data payload and pass values to the pipeline parameters.
For a detailed explanation, see the following articles:
Reference trigger metadata in pipelines
System variables in custom event trigger
existing pipeline parameter.
Inside the pipeline, when the dataset is used as the source of an activity, the activity will prompt for the dataset parameter. There, use dynamic content and select the pipeline parameter holding the trigger data.
Finally, when the pipeline is triggered, the trigger metadata is passed to the pipeline parameter, which is then used in the dataset property to switch databases dynamically within a server. Create multiple parameters along the same lines for the different triggers and pipelines in each of your environments.
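To make the last step concrete, here is a minimal sketch of what the custom event trigger JSON could look like when it maps the event payload onto a pipeline parameter. The pipeline name CopyPipeline, the parameter databaseName, and the payload key data.databaseName are assumptions for the example; the Event Grid topic scope and event name are placeholders.

    {
      "name": "DevCustomEventTrigger",
      "properties": {
        "type": "CustomEventsTrigger",
        "typeProperties": {
          "scope": "/subscriptions/<subscription-id>/resourceGroups/<rg>/providers/Microsoft.EventGrid/topics/<topic>",
          "events": [ "NewDataArrived" ],
          "subjectBeginsWith": "dev"
        },
        "pipelines": [
          {
            "pipelineReference": { "referenceName": "CopyPipeline", "type": "PipelineReference" },
            "parameters": {
              "databaseName": "@triggerBody().event.data.databaseName"
            }
          }
        ]
      }
    }

You would create a similar trigger per environment (DEV, QA, UAT), each pointing at the same pipeline but supplying a different database name through the parameter.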

Import Schemas in Azure Data Factory with Parameters

I am trying to develop a simple ADF pipeline that copies data from a delimited file to a MySQL database when such a file is uploaded to a Blob Storage Account. I am using parameters to define the name of the Storage Account, the Container that houses the files, and the file name (inputStorageAccount, inputContainer, inputFile). The name of the Storage Account is a global parameter, and the other two are meant to be provided by the trigger. The Linked Service has also been parameterized.
However, I want to define the mappings for this operation. So I am trying to 'import schemas' by providing values for these parameters (I have stored a sample file in the Storage Account). But I keep getting this error when trying to do so,
What am I doing wrong? How can I get this to work?
I would also like to know why I am not being asked to provide a value for the inputContainer parameter when I try to use 'import schema' at the dataset level,
This is where you have to add the values, via Add dynamic content [Alt+P]:
As shown in the snip below, go to the + symbol, which opens a window where you need to fill in the parameter name, type and value:
There you can directly select the parameter from the available options:
Here is another detailed scenario which might help: Using Azure DataFactory Parameterized Linked Service | Docs. After that, you can reset the schema.
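For reference, here is a minimal sketch of how a parameterized blob linked service and delimited-text dataset might be wired together in JSON, matching the parameter names from the question (inputStorageAccount, inputContainer, inputFile). The linked service and dataset names are assumptions, authentication settings are omitted for brevity, and the delimiter settings are just an example.

    {
      "name": "BlobStorageLS",
      "properties": {
        "type": "AzureBlobStorage",
        "parameters": {
          "storageAccountName": { "type": "String" }
        },
        "typeProperties": {
          "serviceEndpoint": "https://@{linkedService().storageAccountName}.blob.core.windows.net"
        }
      }
    }

The dataset then declares its own parameters and forwards the storage account name to the linked service:

    {
      "name": "InputDelimitedDataset",
      "properties": {
        "type": "DelimitedText",
        "linkedServiceName": {
          "referenceName": "BlobStorageLS",
          "type": "LinkedServiceReference",
          "parameters": {
            "storageAccountName": { "value": "@dataset().inputStorageAccount", "type": "Expression" }
          }
        },
        "parameters": {
          "inputStorageAccount": { "type": "String" },
          "inputContainer": { "type": "String" },
          "inputFile": { "type": "String" }
        },
        "typeProperties": {
          "location": {
            "type": "AzureBlobStorageLocation",
            "container": { "value": "@dataset().inputContainer", "type": "Expression" },
            "fileName": { "value": "@dataset().inputFile", "type": "Expression" }
          },
          "columnDelimiter": ",",
          "firstRowAsHeader": true
        }
      }
    }

When you click Import schema on a dataset defined like this, the portal asks for a value for every parameter it needs to resolve the path; supplying the sample file's account, container and file name there is what lets the schema import succeed.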

How to pass variables to an Azure Data Factory REST URL's query string

I am new to Azure Data Factory. I have a source dataset that is linked to a REST API. The URL of this API has a query string. I have an activity that copies data from REST to a database, but I have to pass different values in the query string and run the same activity against each of those values. How can this be done in Azure Data Factory?
I kind of reached this point, but how do I pass the value of this "HospitalCode"?
Please try something like this:
1. Create a pipeline and set a variable (your HospitalCode):
Name: CodeArray, Type: Array, Default value: ["01","02"]
2. Create a ForEach activity:
Items: @variables('CodeArray')
3. Create a dataset parameter named code, of type String.
The relative URL setting of the dataset then looks like this:
Dynamic content: @concat('pjjs/pls?HospitalCode=', dataset().code)
4. Configure the Copy activity's source (see the dataset sketch below).
Hope this can help you.
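Step 3 and the dynamic content above live on the REST dataset. Here is a minimal sketch of that dataset JSON, assuming a REST linked service named RestLS and keeping the example path pjjs/pls from the steps above; the dataset name is illustrative.

    {
      "name": "HospitalRestDataset",
      "properties": {
        "type": "RestResource",
        "linkedServiceName": { "referenceName": "RestLS", "type": "LinkedServiceReference" },
        "parameters": {
          "code": { "type": "String" }
        },
        "typeProperties": {
          "relativeUrl": {
            "value": "@concat('pjjs/pls?HospitalCode=', dataset().code)",
            "type": "Expression"
          }
        }
      }
    }

Inside the ForEach, the Copy activity's source dataset reference then passes @item() as the value of code, so each iteration calls the API with a different HospitalCode.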

Azure Data Factory Copy Data dynamically get last blob

I have an Azure Data Factory pipeline that runs on a Blob Created trigger, and I want it to grab the last blob added and copy it to the desired location.
How do I dynamically generate the file path for this outcome?
System Variables
Expressions and Functions
"#triggerBody().folderPath" and "#triggerBody().fileName" captures the last created blob file path in event trigger. You need to map your pipeline parameter to these two trigger properties. Please follow this link to do the parameter passing and reference. Thanks.

how to pass a folder from a data lake store as a parameter to a pipeline?

In Data Factory, I know you can pass a parameter at the beginning of a pipeline and then access it later using @pipeline(). If I have a folder in a Data Lake Store, how can I pass that as a parameter and have access to it later (let's say I want to loop a ForEach over each file inside it)? Do I pass the path to the folder? Am I passing it as an object?
Here are the steps you can use:
Pass the folder path as a parameter (string) to the pipeline.
Use the path in a "Get Metadata" activity with the "Child Items" field. This will return the list of files in JSON format.
Get Metadata Selection
Loop through it with a "ForEach" activity and perform any action.
Use the output of the metadata activity as the Items of the ForEach activity (example below):
@activity('Get List of Files').output.childItems
Hope this helps.
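Putting those steps together, here is a minimal sketch of the pipeline JSON. The dataset name DataLakeFolderDataset and its folderPath parameter are assumptions, and the inner Wait activity is only a placeholder for whatever per-file work you need to do.

    {
      "name": "ProcessFolderPipeline",
      "properties": {
        "parameters": {
          "folderPath": { "type": "String" }
        },
        "activities": [
          {
            "name": "Get List of Files",
            "type": "GetMetadata",
            "typeProperties": {
              "fieldList": [ "childItems" ],
              "dataset": {
                "referenceName": "DataLakeFolderDataset",
                "type": "DatasetReference",
                "parameters": {
                  "folderPath": { "value": "@pipeline().parameters.folderPath", "type": "Expression" }
                }
              }
            }
          },
          {
            "name": "ForEachFile",
            "type": "ForEach",
            "dependsOn": [
              { "activity": "Get List of Files", "dependencyConditions": [ "Succeeded" ] }
            ],
            "typeProperties": {
              "items": {
                "value": "@activity('Get List of Files').output.childItems",
                "type": "Expression"
              },
              "activities": [
                {
                  "name": "PerFilePlaceholder",
                  "type": "Wait",
                  "typeProperties": { "waitTimeInSeconds": 1 }
                }
              ]
            }
          }
        ]
      }
    }

Inside the loop, @item().name gives the current file's name, which you can pass on to a Copy activity or another dataset parameter.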
First, you need to create a Data Lake Store linked service. It will contain the path of the Azure Data Lake Store. You can use the Azure Data Factory UI to create the linked service (step 1).
Then you need to create a Data Lake Store dataset referencing that linked service (step 2).
Then you create a Get Metadata activity referencing the dataset from step 2.
Then follow the steps provided by summit.
All of this can be done in the UI: https://learn.microsoft.com/en-us/azure/data-factory/quickstart-create-data-factory-portal#create-a-pipeline
