Can we use a Data Flow inside a ForEach loop in ADF? If so, how can we pass the ForEach item into the Data Flow's parameters?
Yes, you can use a Data Flow activity within a ForEach loop, and you can pass the current item into the Data Flow's parameters.
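As a rough sketch (the names `ForEachFolder`, `MyDataFlow`, and `folderName` are illustrative, and the exact JSON field layout may differ slightly by ADF version), the Data Flow activity inside the ForEach maps the current item to a Data Flow parameter; note the single quotes around the string interpolation, which data flow string parameters expect:

```json
{
  "name": "ForEachFolder",
  "type": "ForEach",
  "typeProperties": {
    "items": { "value": "@pipeline().parameters.folderList", "type": "Expression" },
    "activities": [
      {
        "name": "RunDataFlow",
        "type": "ExecuteDataFlow",
        "typeProperties": {
          "dataflow": {
            "referenceName": "MyDataFlow",
            "type": "DataFlowReference",
            "parameters": {
              "folderName": { "value": "'@{item().name}'", "type": "Expression" }
            }
          }
        }
      }
    ]
  }
}
```

Inside the data flow, `folderName` is then available as an ordinary data flow parameter.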
I have a pipeline that reads metadata from a blob container subfolder (raw/subfolder). I then execute a ForEach loop with another Get Metadata activity to get data for each subfolder; it returns paths like /raw/subfolder1/folder1, /raw/subfolder2/folder1, and so on. I need another ForEach loop to access the files inside each folder. The problem is that you cannot run a ForEach loop inside another ForEach loop, so I cannot iterate further down to the files.
I have an Execute Pipeline activity that calls the above pipeline and then uses a ForEach. My issue is that I'm not finding a way to pass the item().name from the above iteration into my new pipeline. It doesn't appear that you can pass objects from the previous pipeline. How can I accomplish this nested ForEach metadata gathering so I can iterate further over my files?
Have you tried using parameters? Here is how it would look:
In your parent pipeline, click on the "Execute Pipeline" activity that triggers the inner pipeline (your new pipeline), go to Settings, and pass the item name as a parameter called "name".
In your inner pipeline, click anywhere on the empty canvas and add a new parameter called "name".
Now you can refer to that parameter like this: @pipeline().parameters.name
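Put together (the activity and pipeline names here are illustrative), the Execute Pipeline activity's settings in the parent pipeline look roughly like this in JSON, with the current ForEach item passed into the inner pipeline's "name" parameter:

```json
{
  "name": "ExecuteInnerPipeline",
  "type": "ExecutePipeline",
  "typeProperties": {
    "pipeline": { "referenceName": "InnerPipeline", "type": "PipelineReference" },
    "parameters": {
      "name": "@item().name"
    }
  }
}
```

Inside InnerPipeline, any expression can then read the value as `@pipeline().parameters.name`.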
Using parameters works in this scenario, as @Andrii mentioned.
For more on passing parameters between activities, refer to this link:
https://azure.microsoft.com/en-in/resources/azure-data-factory-passing-parameters/
I am trying this new functionality, and when I try to use a Set Variable activity inside a ForEach loop, I cannot select a variable that I declared in the pipeline.
The same happens inside an If Condition activity.
Is it supposed to behave like this, i.e. that you can't set a variable inside nested activities, only at the root level of the pipeline?
This is a known bug, where the Set Variable and Append Variable activities do not correctly detect pipeline variables when they're nested in another activity. We are actively working on a fix and hope to resolve this problem soon :)
I'm using an Execute SQL Query action in a Logic App.
The returned result is composed of 1..n tables ("selects").
I want to create a CSV table and send it over tfs.
The issue I'm having is that the tables are elements of the resultset, not part of an array.
Is there some way to perform a ForEach action on the resultset elements (i.e. 'Table1', 'Table2', etc.)?
Given the data format you describe, this does not appear to be supported via the ForEach action in a Logic App.
If an Azure Function is acceptable, I recommend using one to process the data with your own custom logic.
A ForEach over a resultset will return a JSON object for each row.
I couldn't find any option in the Design View to extract the value, but you can achieve the same thing in Code View by assigning the following expression to your variable.
In the example below, MEETINGID is my column name.
"#{items('For_each')?['MEETINGID']}"
"imeetingid": "#{items('For_each')?['MEETINGID']}"
In the Design View, you can use a Parse JSON action after the SQL query step.
Then you can use a For Each to reach each database record, because the SQL resultset is returned as a JSON object in the Logic App.
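As a sketch of what that looks like in the workflow definition (the action names, the `resultsets`/`Table1` path, and the `MEETINGID` column are illustrative and depend on your SQL action's actual output shape), the ForEach loops over one table of the resultset and pulls a column from each row:

```json
"For_each": {
  "type": "Foreach",
  "foreach": "@body('Execute_SQL_query')?['resultsets']?['Table1']",
  "actions": {
    "Set_variable": {
      "type": "SetVariable",
      "inputs": {
        "name": "imeetingid",
        "value": "@{items('For_each')?['MEETINGID']}"
      }
    }
  }
}
```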
I am trying to use two ForEach activities to iterate over subfolders of folders, with parameters, to get metadata of the subfolders. I have ForEach1 and ForEach2, each with its own items array. Within the second loop I need to combine both loops' item() values in a Get Metadata activity to build my dataset path, something like #item1()#item2(). Is this possible?
Nested ForEach activities are not allowed. But you can use an Execute Pipeline activity inside the ForEach, and in the nested pipeline you can have another ForEach.
It is possible, but the second ForEach activity would need to be inside the first one, not a separate activity in the pipeline.
As you have it now, the first ForEach runs to completion before the second one starts, so from the second loop you cannot access the items of the first.
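With the Execute Pipeline pattern, once the outer value reaches the inner pipeline (say as a pipeline parameter named `outerFolder`, an illustrative name), the two values can be combined in the inner ForEach, e.g. in the Get Metadata dataset's folder path expression:

```
@concat(pipeline().parameters.outerFolder, '/', item().name)
```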
Hope this helped!
I am new to Data Factory and PowerShell. I'm looking for a way to provide user input to the sqlReaderQuery as a WHERE clause,
so that a user can select a subset of data from SQL Server and push it to Azure SQL.
I can see the parameters for date and time values, but I'm looking to provide an ID along with the date.
Is there a way to write PowerShell to pass these values to the pipeline?
Any help is highly appreciated!
The sqlReaderQuery in Azure Data Factory is unfortunately not very dynamic; the only variables really available are SliceStart, SliceEnd, WindowStart, and WindowEnd. You can tweak these with functions like AddDays and so on, but I don't think that will do what you want.
One option with PowerShell is to generate a new pipeline JSON file based on your user's input and use New-AzureRmDataFactoryPipeline to add that JSON to your data factory as a new pipeline. Of course, this means you'll accumulate a lot of pipelines unless you also clean up with Remove-AzureRmDataFactoryPipeline.
Another option would be to use a Stored Procedure activity. The user input could be saved in the database, and the stored procedure would then dynamically create the extract into another table.
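A rough sketch of that first approach (the file paths, resource names, and the `__SQL_READER_QUERY__` placeholder token in the template are all hypothetical, and this assumes the older AzureRm module the cmdlets belong to; in real code the user input should be sanitized rather than spliced into SQL directly):

```powershell
# Collect the user's filter values and build the WHERE clause
$id    = Read-Host "Enter the ID to filter on"
$date  = Read-Host "Enter the date (yyyy-MM-dd)"
$query = "SELECT * FROM dbo.SourceTable WHERE Id = $id AND LoadDate = '$date'"

# Splice the query into a pipeline JSON template and save the result
$template     = Get-Content ".\pipeline-template.json" -Raw
$pipelineJson = $template -replace '__SQL_READER_QUERY__', $query
Set-Content ".\pipeline-generated.json" $pipelineJson

# Deploy the generated pipeline to the data factory
New-AzureRmDataFactoryPipeline -ResourceGroupName "myRG" `
    -DataFactoryName "myADF" -File ".\pipeline-generated.json"

# Optionally remove it again once the slice has run
# Remove-AzureRmDataFactoryPipeline -ResourceGroupName "myRG" `
#     -DataFactoryName "myADF" -Name "GeneratedPipeline"
```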