Azure Data Factory "Set_Variable" array and "ForEach" Execute Pipeline activities

As you can see in the following images, the array reaches the Execute Pipeline activity (Exe_pipeline) complete, but I cannot split its values out to the different parameters. I tried a Split, but it doesn't work.
Img 1: the ForEach activity receives the array string correctly. Img 2: shows that it sends the array correctly, but assigning the value to each parameter does not work, e.g. @item().FileJson. Img 3: if I send only @item(), the whole array comes through.
Can someone tell me how to separate the information that comes in @item(), other than passing @item() as a whole?
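Assuming the array set by Set_Variable holds objects with a FileJson property (the variable name below is illustrative), the usual pattern is:
ForEach items: @variables('FilesArray')
Execute Pipeline parameter FileJson: @item().FileJson
If each element is itself a JSON string rather than an object, @item().FileJson will fail; converting the item first, e.g. @json(item()).FileJson, is one way around that.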

Related

In Azure Data Factory, how do I pass the Index of a ForEach as a parameter properly

Sorry if this is a bit vague or rambly; I'm still getting to grips with Data Factory and a lot of it seems a bit obtuse...
What I want to do is query my Cosmos Database for a list of Ids of records that need to be updated. For each of these records, I want to call a REST API using the Id (i.e. /Record/{Id}/Details)
I've created a Data Flow that took a string as a parameter and then called the REST API fine.
I then made a pipeline using a Lookup with a query (select c.RecordId from c where...) and passed that into a ForEach with items set to @activity('Lookup1').output.value
I then set the activity of the ForEach to my Data Flow. From research, I think I'm supposed to set the parameter value to "@item().RecordId", but that gives an error "parameter [name] does not match parameter type 'string'".
I can change the type of the parameter to any (and use toString([parameter]) to cast it), and then when I try to debug it passes the parameter in, but it gives an error of "Job failed due to reason: at (Line 2/Col 14): Datatype any not found".
I'm not sure what the solution is. Is there a way to cast the result of the lookup to an integer or string? Is there a way to narrow an any down? Is there a better way than toString() that would work? Is there a better way than ForEach?
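For context, the configuration being described is roughly:
ForEach items: @activity('Lookup1').output.value
Data Flow parameter (type string): @item().RecordId
One alternative to changing the Data Flow parameter to any is to keep it typed string and cast in the pipeline expression instead, e.g. @string(item().RecordId); whether that resolves the mismatch depends on what the lookup actually returns for RecordId.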
I tried to reproduce a scenario similar to what you are trying.
My sample data in Cosmos:
The goal is to query the Cosmos database for a list of Ids and call a REST API using the Id for each of these records.
First, I took a Lookup activity in Data Factory and selected the ids where the last_name is Bluth.
Its output and settings are as below:
Then I passed the output of the Lookup activity to a ForEach activity.
Then, inside the ForEach activity, I created a Data Flow activity whose source is a REST API. The REST API call for a specific user is https://reqres.in/api/users/2, so I gave the base URL as https://reqres.in/api/users.
Then I created a dataset parameter called demoId of type string, and in the relative URL I gave the dynamic value @dataset().demoId.
After this I set the source parameter value to @item().id, since after https://reqres.in/api/users only the id needs to be provided to get the data. In your case you can try Record/@{item().id}/Details.
For each id, it successfully passes the id to the REST API and fetches the data:
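Pulling the pieces of this walkthrough together (demoId and the reqres URLs are the names used above; Record/{Id}/Details is the asker's API shape):
Dataset relative URL: @dataset().demoId
ForEach items: @activity('Lookup1').output.value
Data Flow parameter demoId: @item().id
Relative URL for the asker's API: Record/@{item().id}/Details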

How to stuff the result of a query into a variable and use it in another query in a logic app

I haven't used Logic Apps a lot; my boss is having trouble stuffing the results of one query into a variable and then using that variable in another query.
Basically, all he wants to do is get a list of Ids returned from the first query and use that list in the second.
Here is a picture of what his logic app looks like:
You can see at the end of the second query he wants to check whether the id is in the list or not. He's out for the day and I'm not sure if that variable is even receiving the list of Ids successfully, but is there anything in the picture that you can tell needs to be corrected? Or any suggestions he could try to achieve what he's after?
According to the image, no data is getting stored in the variable AppId. In the query you can just use c.EntityId directly. The query below checks whether c.id is present in c.EntityId:
SELECT c.Vechicle.GrossVechicleWeight as GVW, c.EntityId as ApplicationId FROM c where c.RiskTypeId = 1 and c.Discriminator = 'RiskEntity' and c.EntityTypeId = 4500 and c.id in (c.EntityId)
If instead you are trying to store c.EntityId in the AppId variable, you can query SELECT c.EntityId FROM c and then store the result in the variable using an Append to array variable action, extracting only c.EntityId with Parse JSON.
Here is my logic app
RESULT:
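A minimal sketch of that append approach, with illustrative action names (the loop is called For_each here; items('For_each') returns the current element of whatever the loop iterates over):
For each result of SELECT c.EntityId FROM c (after Parse JSON):
  Append to array variable
    Name: AppId
    Value: @items('For_each')?['EntityId']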

How to append array data to array variable in logic app?

I'm calling an API using the HTTP connector and getting an array of result data, inside an Until loop, so on every iteration I get some records into the results array.
Now I want to append all the records so that I have all of them together.
For example, the first time I get 2 records like below and the second time 1, and I want to append them so that there are 3 in total.
1st iteration result -
"results":[
{"id":"2","name":"t1"},{"id":"3","name":"t4"}
]
2nd iteration result -
"results":[
{"id":"66","name":"i7"}]
I want to append all data so that final result will be like -
[{"id":"2","name":"t1"},{"id":"3","name":"t4"}, {"id":"66","name":"i7"}]
Instead of a foreach I tried using Append to array variable, but it throws the below error:
its a type of array need to be string to append.
I can achieve it using a foreach, but it doesn't make sense to use a foreach just to add values; if there is a way to add the array directly, that would be great.
You can use inline JavaScript code to implement your requirement. I did a test on my side: I posted two arrays (result1 and result2) to the logic app and composed them using JS:
Result:
Please note that if you want to use this feature, you need to create an integration account and associate it with your logic app in the "Workflow settings" blade.
The above solution works only if you have an integration account.
Another simple option: use the union function inside a Compose action to append two array collections:
union(variables('ResponseArray'),body('Response'))
https://learn.microsoft.com/en-us/azure/logic-apps/workflow-definition-language-functions-reference#union
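A minimal sketch of that pattern inside the Until loop (ResponseArray, Response, and Compose are illustrative names; a Set variable action cannot reference the variable it is updating, which is why the Compose sits in between):
Compose: @union(variables('ResponseArray'), body('Response')?['results'])
Set variable ResponseArray: @outputs('Compose')
Note that union removes duplicate items when merging arrays; if duplicates must be preserved, a different approach (such as the foreach with Append to array variable) is needed.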

How do you iterate within an Azure Logic App Response component

I have an Azure MS SQL Server component that is returning multiple rows and feeding into a Response component.
The Body of the Response component looks like this:
{
"MyID":"@{body('Get_rows')['value'][1]['Id']}"
}
I can make the number in the bracket 0 and get the first result. I can make it 1 and get the second result. But what I am trying to find is the syntax to loop through all the results that are passed, so that it would effectively provide the following (assuming there were 2 results total):
{
"MyID":"@{body('Get_rows')['value'][0]['Id']}"
}
{
"MyID":"@{body('Get_rows')['value'][1]['Id']}"
}
Thanks in advance for advice on where to find the correct syntax, or for examples of it.
It took me a while, but I figured out that I needed to do two things:
First, I had to run a For each after the Get rows, and within it I created a Data Operations - Compose component. In the Compose I was able to create a single JSON object with all the parameters.
Second, I used the @outputs expression, as shown below, in the Body of the Response, and it inserted the array brackets and the commas delimiting the Compose entries automagically.
Here is what the code in the Body of the Response looks like:
@outputs('Compose')
Note that 'Compose' is the default name given to the first Compose component you place in the application.
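Roughly, with the loop named For_each (an illustrative name) and the Compose inside it:
Compose (inside For each): { "MyID": "@{items('For_each')['Id']}" }
Response body (after the loop): @outputs('Compose')
Referencing @outputs('Compose') from outside the loop returns the array of that action's outputs across all iterations, which is why the brackets and commas appear automatically.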

Cucumber multiple assertions in a step

I am trying to validate a block of JSON data that I receive from the server.
The JSON consists of information about a bunch of orders. Each order includes the cost of each part, taxes, and a total. It is a strict requirement that each order contains exactly 4 parts, and each order has three kinds of taxes and a total.
I have a step which looks like this:
And "standardorder" includes parts "1..4", taxes "1..3" and total
and the step implementation is like the following. Here @jsonhelper.json is shared state (the JSON for one order) passed from the previous step.
And /^"([^"]*)" includes parts "([^"]*)", taxes "([^"]*)" and total$/ do |arg1, arg2, arg3|
json = #jsonhelper.json
validkeys = ["total"]
parts = arg2.split('..').map{|d| Integer(d)}
(parts[0]..parts[1]).each do |i|
validkeys.push "p#{i}"
end
taxes = arg3.split('..').map{|d| Integer(d)}
(taxes[0]..taxes[1]).each do |i|
validkeys.push "t#{i}"
end
validkeys.each do |key|
json[arg1].keys.include?(key).should be_true
end
end
Now this script works fine, except that if any one key is missing it doesn't state which one. It either passes or fails as the assertions are iterated for each key.
I would like to know if there is any possibility of sending the keys which are found to be OK to the result stream. My intention is to know which keys are OK, which failed, and which were skipped. No particular order of keys is expected in the JSON.
Thanks in advance.
It's probably best to split the step definitions first:
And "standardorder" should be received
And the order should include parts 1 to 4
And the order should include taxes 1 to 3
And the order should include the total
Then you can re-use the steps elsewhere.
The 'order' check is easy to implement, as you're just checking one element.
For the other two, you are really just checking the presence of items in an array, e.g.:
actual_values.should == expected_values
If that fails, RSpec will give you a report showing how the arrays differ.
