Azure Data Factory Copy XML Activity

I'm copying XML to a database using a Copy Activity.
The XML file has a nested structure, so I have defined a "Collection reference" at the "Cond_Tbl_Data_Record" level in the mapping.
The following is written to the database correctly:
```xml
<Cond_Tbl_Data_Set>
  <Cond_Tbl_Data_Record>
    <Base_Per_Quantity>1</Base_Per_Quantity>
    <Base_UOM_Code>GA</Base_UOM_Code>
    <Condition_Table_ID>A02</Condition_Table_ID>
    <Condition_Type>COCO</Condition_Type>
    <Condition_Value>829</Condition_Value>
    <Currency_Code>USC</Currency_Code>
    <Extraction_Time>20230113 19:41:03</Extraction_Time>
    <Key_Values>US/000001/001</Key_Values>
    <Valid_From_Date>20230113</Valid_From_Date>
    <Valid_To_Date>99991231</Valid_To_Date>
    <Effective_Start_Time>13:22:42</Effective_Start_Time>
    <Condition_Change_Value>300</Condition_Change_Value>
  </Cond_Tbl_Data_Record>
  <Cond_Tbl_Data_Record>
    <Base_Per_Quantity>1</Base_Per_Quantity>
    <Base_UOM_Code>GA</Base_UOM_Code>
    <Condition_Table_ID>A04</Condition_Table_ID>
    <Condition_Type>COCO</Condition_Type>
    <Condition_Value>829</Condition_Value>
    <Currency_Code>USC</Currency_Code>
    <Extraction_Time>20230113 19:41:03</Extraction_Time>
    <Key_Values>US/000001/002</Key_Values>
    <Valid_From_Date>20230113</Valid_From_Date>
    <Valid_To_Date>99991231</Valid_To_Date>
    <Effective_Start_Time>13:22:42</Effective_Start_Time>
    <Condition_Change_Value>300</Condition_Change_Value>
  </Cond_Tbl_Data_Record>
</Cond_Tbl_Data_Set>
```
But the following, which contains only a single record, is not written to the database at all:
```xml
<Cond_Tbl_Data_Set>
  <Cond_Tbl_Data_Record>
    <Base_Per_Quantity>1</Base_Per_Quantity>
    <Base_UOM_Code>GA</Base_UOM_Code>
    <Condition_Table_ID>A02</Condition_Table_ID>
    <Condition_Type>COCO</Condition_Type>
    <Condition_Value>829</Condition_Value>
    <Currency_Code>USC</Currency_Code>
    <Extraction_Time>20230113 19:41:03</Extraction_Time>
    <Key_Values>US/000001/001</Key_Values>
    <Valid_From_Date>20230113</Valid_From_Date>
    <Valid_To_Date>99991231</Valid_To_Date>
    <Effective_Start_Time>13:22:42</Effective_Start_Time>
    <Condition_Change_Value>300</Condition_Change_Value>
  </Cond_Tbl_Data_Record>
</Cond_Tbl_Data_Set>
```
I tried to re-import the schema, but it still did not work.

Based on the information you provided, when your file has a single object in the array, it is not getting copied.
The cause of the issue is that when there is only one object, it is treated as an object, not as an array, so the collection reference does not match it.
To resolve the issue, first clear the mapping and then import the mapping again, so that the single record is mapped as an object rather than an array.
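The shape mismatch can be sketched outside ADF in plain JavaScript (with hypothetical data, not ADF internals): many XML-to-JSON conversions map a repeated element to an array but a single occurrence to a bare object, so a mapping that expects an array silently skips the lone record.

```javascript
// A hypothetical normalization step that forces the array shape either way.
function asArray(node) {
  if (node == null) return [];                 // element absent
  return Array.isArray(node) ? node : [node];  // single object -> one-element array
}

// Two records: the parser already produced an array.
const twoRecords = {
  Cond_Tbl_Data_Record: [
    { Key_Values: "US/000001/001" },
    { Key_Values: "US/000001/002" },
  ],
};

// One record: the parser produced a bare object, which a collection
// reference expecting an array will not iterate.
const oneRecord = {
  Cond_Tbl_Data_Record: { Key_Values: "US/000001/001" },
};

console.log(asArray(twoRecords.Cond_Tbl_Data_Record).length); // 2
console.log(asArray(oneRecord.Cond_Tbl_Data_Record).length);  // 1
```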


In Azure Data Factory, how do I pass the index of a ForEach as a parameter properly?

Sorry if this is a bit vague or rambly, I'm still getting to grips with Data Factory and a lot of it seems a bit obtuse...
What I want to do is query my Cosmos Database for a list of Ids of records that need to be updated. For each of these records, I want to call a REST API using the Id (i.e. /Record/{Id}/Details)
I've created a Data Flow that took a string as a parameter and then called the REST API fine.
I then made a pipeline using a Lookup with a query (`select c.RecordId from c where...`) and passed that into a ForEach with Items set to `@activity('Lookup1').output.value`.
I then set the activity of the ForEach to my Data Flow. From research, I think I'm supposed to set the parameter value to `@item().RecordId`, but that gives the error "parameter [name] does not match parameter type 'string'".
I can change the type of the parameter to any (and use toString([parameter]) to cast it), and then when I try to debug it passes the parameter in, but it gives the error "Job failed due to reason: at (Line 2/Col 14): Datatype any not found".
I'm not sure what the solution is. Is there a way to cast the result of the lookup to an integer or string? Is there a way to narrow an any down? Is there a better way than toString() that would work? Is there a better way than ForEach?
I tried to reproduce a similar scenario to what you are trying: query the Cosmos database for a list of ids and call a REST API using the id for each of those records.
First, I took a Lookup activity in Data Factory and selected the ids where last_name is Bluth.
Then I passed the output of the Lookup activity to a ForEach activity.
Inside the ForEach activity I added a Data Flow activity, and for its data source I used the REST API. My REST API call for a specific user is https://reqres.in/api/users/2, so I gave the base URL as https://reqres.in/api/users.
Then I created a dataset parameter called demoId with datatype string, and in the relative URL I gave the dynamic value `@dataset().demoId`.
After this I set the value of the source parameter to `@item().id`, since after https://reqres.in/api/users only the id needs to be provided to get the data. In your case you can try `Record/@{item().id}/Details`.
For each id, it successfully passes the id to the REST API and fetches the data.
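What the pipeline does can be sketched in plain JavaScript (hypothetical RecordId values; the Lookup output shape `{ value: [...] }` matches what `@activity('Lookup1').output.value` exposes): iterate the lookup rows and build one string-typed relative URL per record.

```javascript
// Stand-in for the Lookup activity's output: an array of rows under "value".
const lookupOutput = { value: [{ RecordId: 101 }, { RecordId: 102 }] };

// ForEach over the rows; String(...) plays the role of the toString() cast
// so the Data Flow receives a plain string parameter, not an "any".
const urls = lookupOutput.value.map(
  (item) => `/Record/${String(item.RecordId)}/Details`
);

console.log(urls); // ["/Record/101/Details", "/Record/102/Details"]
```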

Problem accessing a dictionary value in Dialogflow fulfillment

I am using the fulfillment section in Dialogflow on a fairly basic program I have started, to show myself I can do a bigger project in Dialogflow.
I have an object setup that is a dictionary.
I can make the keys a constant through:
```javascript
const KEYS = Object.keys(overflow);
```
I am going through the values using:
```javascript
if (KEYS.length > 0) {
  var dictionary = overflow[KEYS[i]];
}
```
If I stringify the dictionary using
```javascript
JSON.stringify(item);
```
I get:
```json
{"stack":"overflow","stack2":"overflowtoo","stack3":3}
```
This leads me to believe I am actually facing a dictionary, hence the name of the variable.
I am trying to access a string value such as stack (unlike stack3, which is a number).
Everything I have read online tells me
```javascript
dictionary.stack
```
should work, since
```javascript
JSON.stringify(item);
```
shows me:
```json
{"stack":"overflow","stack2":"overflowtoo","stack3":3}
```
But whenever I try to add the variable to the response string, or append it to a string using
```javascript
output += `${item.tipo}`;
```
I get an error that the function crashed. I can replace it with the stringify line and it works, giving me the JSON above, so the issue isn't there.
Dictionary values are created as follows before being accessed in another function:
```javascript
dictionary[request.body.responseId] = {
  "stack": "overflow",
  "stack2": "overflowtoo",
  "stack3": 3
};
```
Based on the suggestion here, I saw that the properties were being accessed properly, but their type was undefined. Going over things repeatedly, I saw that the properties were defined as lists rather than single values.
Dot notation works when they stop being lists.
Thanks for guiding me towards the problem.
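The list-versus-scalar mix-up can be reproduced in a few lines (the names below are illustrative, not the asker's actual payload): when a property is stored as a one-element list, dot notation returns the array itself, and interpolating a sub-field that only exists on the scalar form yields undefined.

```javascript
// Property stored as a one-element list (what the asker's data looked like).
const asList = { stack: ["overflow"] };
// Property stored as a single value (what the code assumed).
const asScalar = { stack: "overflow" };

console.log(asList.stack[0]); // "overflow" -- must index into the list
console.log(asScalar.stack);  // "overflow" -- dot notation works directly

// Interpolating the list form does not crash, but it is not a plain string:
console.log(`${asList.stack}` === `${asScalar.stack}`); // true only by coercion
```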

ADF: can't build a simple JSON file transformation (one field flattening)

I need help transforming a simple JSON file inside an Azure Data Flow. I need to flatten just one field, date_sk, in this example:
```json
{
  "date_sk": { "string": "2021-09-03" },
  "is_influencer": 0,
  "is_premium": -1,
  "doc_id": "234"
}
```
Desired transformation: `"date_sk": {"string":"2021-09-03"}` should become `"dateToGroupBy": "2021-09-03"`.
I create the source stream; note the strange projection Azure picks: there is no "string" field anymore, but this is how the automatic Azure projection works for some reason.
Here is how it suggests transforming it in a separate Derived Column modifier. I played with the right-hand side, but `date_sk.{}` is the only format I was able to pick that does not display an error.
But then the output dateToGroupBy field happens to be empty.
Any ideas on what could have gone wrong, and how can I build the expected transformation? Thank you.
Alright, it turned out to be a Microsoft bug in ADF.
ADF stumbles on "string" as a JSON field name and can't handle it, even though schema and data validation pass with no errors.
When I replace `"date_sk": {"string":"2021-09-03"}` with `"date_sk": {"s1":"2021-09-03"}`, or anything other than string, everything starts working just fine, and dateToGroupBy is filled with date values taken from `date_sk.s1`.
When I change it back to string, the output values show NULL.
It is supposed to either show an error at the validation stage or handle this field name properly.
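One way to apply the workaround is to rename the problematic key before the file reaches ADF. This is a hypothetical pre-processing sketch in plain JavaScript (the helper name is mine, not part of ADF), producing the `s1` shape the answer found to work:

```javascript
// Rename date_sk.string to date_sk.s1 so a derived column can read date_sk.s1.
function renameStringKey(doc) {
  const { string: value, ...rest } = doc.date_sk; // pull out the "string" field
  return { ...doc, date_sk: { ...rest, s1: value } };
}

const input = {
  date_sk: { string: "2021-09-03" },
  is_influencer: 0,
  is_premium: -1,
  doc_id: "234",
};

const fixed = renameStringKey(input);
console.log(fixed.date_sk.s1); // "2021-09-03"
```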

Unable to insert data through EntityRepository.create or EntityRepository.persist #MikroOrm #NestJS

:)
I am trying to test my entity operations.
I am creating a userRepository object, and when I console.log the result of find({}) from the repository, it fetches the previously stored records.
I create a dummy object using faker and it works fine, but as soon as I try to create it in the DB or persist it, it does not seem to work.
I also tried orm.em.persist. Let me know if more details are required.
Just for future readers, this has been asked and answered on GitHub:
https://github.com/mikro-orm/mikro-orm/discussions/1571
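The usual cause of this symptom (stated here as an assumption; see the linked discussion for the actual resolution) is that repository.create() and em.persist() only register the entity with the unit of work, and nothing reaches the database until flush() is awaited. A minimal mock, not the real MikroORM API, illustrating that contract:

```javascript
// Mock unit of work: persist() only queues; flush() performs the write.
class MockEntityManager {
  constructor() {
    this.pending = []; // entities registered with the unit of work
    this.db = [];      // stand-in for rows actually written
  }
  persist(entity) {
    this.pending.push(entity); // no database write happens here
    return this;
  }
  flush() {
    // In real MikroORM this is `await em.flush()`; the write happens only now.
    this.db.push(...this.pending);
    this.pending = [];
  }
}

const em = new MockEntityManager();
em.persist({ name: "dummy user" });
console.log(em.db.length); // 0 -- persist alone inserts nothing
em.flush();
console.log(em.db.length); // 1 -- the row is written after flush
```

In real code the one-liner `await em.persistAndFlush(entity)` covers both steps.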

BizTalk: Getting error in Promoted Property

I am getting the error below when I run the orchestration and try to assign a value to a promoted property by reading the value of another promoted property.
Error in Suspended Orchestration:
Inner exception: There is no value associated with the property 'BankProcesses.Schemas.Internal_ID' in the message.
Detail:
I have 2 XSD schemas: one for calling a stored procedure and reading its response, and another for writing to a flat file. The internal ID returned in the response from the SP needs to be passed to a node in the other XSD schema to be written in flat-file format.
I have promoted an element from the response schema and also promoted an element from the flat-file schema. I am assigning the value to the promoted properties as below:
```
strInternalId = msgCallHeaderSP_Response(BankProcesses.Schemas.Internal_ID);
msgCallSP(BankProcesses.Schemas.Header_Internal_ID) = strInternalId;
```
But when I run the orchestration I get the error mentioned above. I have checked the response from the stored procedure, and the response XML does contain a value, but I am unable to assign that value to the other schema. Please advise.
Thanks,
Mayur
You can use exists to check for the existence of the property:
```
if (BankProcesses.Schemas.Internal_ID exists msgCallHeaderSP_Response)
{
    strInternalId = msgCallHeaderSP_Response(BankProcesses.Schemas.Internal_ID);
    msgCallSP(BankProcesses.Schemas.Header_Internal_ID) = strInternalId;
}
```
One scenario that might cause this error is that there is no Header_Internal_ID element in the message you are trying to modify. Can you inspect the message before modification to ensure that there is an element whose value should be changed? Drop the message out to a file location, maybe.
If this is the case, then just ensure that you create this element when you instantiate your message for the first time, even if you initially set it to an empty element.
HTH
To check if the property exists, you can use this syntax:
```
BMWFS.LS.BizTalk.CFS.BankProcesses.Schemas.Internal_ID exists msgCallHeaderSP_Response
```
However, if the source field will always be there, you have to work backwards to find out why the property is not appearing in the Context.
If the message is coming from a port, is it passing through an XmlDisassembler component? If it's coming from another orchestration, are you actually setting the property?
The easiest way to look at the Context is to route the message, msgCallHeaderSP_Response, to a stopped Send Port. You can then view the Context in BizTalk Administrator.
