I need help transforming a simple JSON file inside an Azure Data Flow. I need to flatten just one field, date_sk, in the example here:
{
  "date_sk": {"string":"2021-09-03"},
  "is_influencer": 0,
  "is_premium": -1,
  "doc_id": "234"
}
Desired transformation:
"date_sk": {"string":"2021-09-03"}
to become
"dateToGroupBy" : "2021-09-03"
I create the source stream. Note the strange projection Azure picks: there is no "string" field anymore, but this is how the automatic Azure projection works for some reason:
Data preview of the same source stream node:
And here's how it suggests I transform it in a separate "Derived Column" modifier. I played with the right-hand part, but this format (date_sk.{}) is the only one I could pick that does not display any error:
But then the output dateToGroupBy field turns out to be empty:
Any ideas on what could have gone wrong and how I can build the expected transformation? Thank you.
Alright, it turned out to be a Microsoft bug in ADF.
ADF stumbles on "string" as a JSON field name and can't handle it, even though schema and data validation pass OK and show no errors.
When I replace "date_sk": {"string":"2021-09-03"} with "date_sk": {"s1":"2021-09-03"}, or anything other than string, everything starts working just fine
and dateToGroupBy is filled with the date values taken from date_sk.s1.
When I put string back, the output values show NULL.
It is supposed to either show an error at the validation stage or handle this field name properly.
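For anyone hitting the same thing, here is a minimal sketch of the workaround, assuming the Derived Column expression is then simply date_sk.s1 and a Select afterwards drops the original date_sk column. The renamed input:
{
  "date_sk": {"s1":"2021-09-03"},
  "is_influencer": 0,
  "is_premium": -1,
  "doc_id": "234"
}
flows through to:
{
  "dateToGroupBy": "2021-09-03",
  "is_influencer": 0,
  "is_premium": -1,
  "doc_id": "234"
}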
Sorry if this is a bit vague or rambly; I'm still getting to grips with Data Factory, and a lot of it seems a bit obtuse...
What I want to do is query my Cosmos Database for a list of Ids of records that need to be updated. For each of these records, I want to call a REST API using the Id (i.e. /Record/{Id}/Details)
I've created a Data Flow that took a string as a parameter and then called the REST API fine.
I then made a pipeline using a Lookup with a query (select c.RecordId from c where...) and pass that into a ForEach with items set to @activity('Lookup1').output.value
I then set the Activity of the ForEach to my Data Flow. From research, I think I'm supposed to set the parameter value to "@item().RecordId", but that gives an error: "parameter [name] does not match parameter type 'string'".
I can change the type of the parameter to any (and use toString([parameter]) to cast it), and then when I try to debug it passes the parameter in, but it gives an error of "Job failed due to reason: at (Line 2/Col 14): Datatype any not found".
I'm not sure what the solution is. Is there a way to cast the result of the lookup to an integer or string? Is there a way to narrow an any down? Is there a better way than toString() that would work? Is there a better way than ForEach?
I tried to reproduce a scenario similar to what you are trying.
My sample data in Cosmos:
To query the Cosmos database for a list of Ids and call a REST API using the Id for each of these records:
First, I took a Lookup activity in Data Factory and selected the ids where the last_name is Bluth.
Its output and settings are as below:
Then I passed the output of the Lookup activity to a ForEach activity.
Then, inside the ForEach activity, I created a Data Flow activity, and for its data source I gave the REST API as the source. My REST API call for a specific user is https://reqres.in/api/users/2, so I gave the base URL as https://reqres.in/api/users.
Then I created a parameter called demoId with datatype string, and in the relative URL I gave the dynamic value @dataset().demoId.
After this I gave the source parameter value as @item().id, since after https://reqres.in/api/users only the id needs to be provided to get the data. In your case you can try Record/@{item().id}/Details.
For each id, it successfully passes the id to the REST API and fetches the data:
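For reference, here is a minimal sketch of what the parameterized REST dataset JSON could look like in this setup; the dataset and linked-service names are hypothetical, while demoId and the relative URL expression are from the steps above:
{
  "name": "RestResource1",
  "properties": {
    "type": "RestResource",
    "linkedServiceName": { "referenceName": "RestService1", "type": "LinkedServiceReference" },
    "parameters": { "demoId": { "type": "string" } },
    "typeProperties": {
      "relativeUrl": { "value": "@dataset().demoId", "type": "Expression" }
    }
  }
}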
I'm trying to add values from the JSON parser into a dynamic Oracle query, but they always show up as blank/empty.
I get the same results when using formal parameters and declaring them within the Oracle query as well.
Is this possible?
Service Bus trigger. A message comes in and goes into the "For each message" loop. It then gets parsed in "Parse JSON Message", and when I try to use the dynamic content from that in an Oracle query (e.g. @body('Parse_JSON_Message')?['contentData']?['dynamicValue']), it always shows up as nothing/blank.
I can use this same reference later (again @body('Parse_JSON_Message')?['contentData']?['dynamicValue']) with no issues, but it always ends up blank in the query.
Even using formal parameterized key values shows @body('Parse_JSON_Message')?['contentData']?['dynamicValue'] to be NULL.
So my queries are coded as:
SELECT thisValue
WHERE columnName = @body('Parse_JSON_Message')?['contentData']?['dynamicValue'] (have also tried @{body('Parse_JSON_Message')?['contentData']?['dynamicValue']})
But are coming out as:
SELECT thisValue
WHERE columnName =
...with no result obviously.
This query does work if I hard code the value.
If I use dynamic values in my output (where I'm mapping data), it works fine. So how do I correctly use dynamic values in this Oracle query?
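Not necessarily the fix, but for comparing syntaxes, here is the shape the interpolated query would take in the action's code view. The table name and the input property name are illustrative only, since connector schemas differ; the point is the @{...} interpolation plus the single quotes, which Oracle needs around a string literal:
{
  "inputs": {
    "query": "SELECT thisValue FROM myTable WHERE columnName = '@{body('Parse_JSON_Message')?['contentData']?['dynamicValue']}'"
  }
}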
I have an Azure MS SQL Server component that is returning multiple rows and feeding into a Response component.
The Body of the Response component looks like this:
{
"MyID":"#{body('Get_rows')['value'][1]['Id']}"
}
I can make the number in the brackets 0 and get the first result. I can make it 1 and get the second result. But what I am trying to find is the syntax to loop through all the results that are passed, so that it would effectively provide the following (assuming there were 2 results total):
{
"MyID":"#{body('Get_rows')['value'][0]['Id']}"
}
{
"MyID":"#{body('Get_rows')['value'][1]['Id']}"
}
Thanks in advance for advice on where to find the correct syntax or for examples of correct syntax.
It took me a while, but I figured out that I needed to do two things:
I had to run a For each after the Get rows, and within that I created a Data Operations - Compose component. Within that I was able to create a single JSON object with all the parameters.
From there I used the @outputs command, as shown below, in the Body of the Response, and it inserted the array brackets and the commas to delimit the Compose entries automagically.
Here is what the code in the Body of the Response looks like:
@outputs('Compose')
Note that 'Compose' is the default name given to the first Compose component you place in the application.
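Here is a minimal sketch of what that looks like in the workflow's code view, assuming the default action names (Get_rows, For_each, Compose):
{
  "For_each": {
    "type": "Foreach",
    "foreach": "@body('Get_rows')?['value']",
    "actions": {
      "Compose": {
        "type": "Compose",
        "inputs": { "MyID": "@items('For_each')?['Id']" }
      }
    },
    "runAfter": { "Get_rows": [ "Succeeded" ] }
  },
  "Response": {
    "type": "Response",
    "kind": "Http",
    "inputs": { "statusCode": 200, "body": "@outputs('Compose')" },
    "runAfter": { "For_each": [ "Succeeded" ] }
  }
}
Because Compose runs once per loop iteration, referencing @outputs('Compose') from outside the loop aggregates the per-iteration outputs into an array, which is where the brackets and commas come from.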
I burned a couple of hours on a problem today and thought I would share.
I tried to start up a previously-working Azure Stream Analytics job and was greeted by a quick failure:
Failed to start Streaming Job 'shayward10ProcessLogs'.
I looked at the JSON log and found nothing helpful whatsoever. The only description of the problem was:
Stream Analytics job has validation errors: The given key was not present in the dictionary.
Given the error and some changes to our database, I tried the following to no effect:
Deleting and Recreating all Inputs
Deleting and Recreating all Outputs
Running tests against the data (coming from Event Hub) and the output looked good
My query looked as follows:
SELECT
dateTimeUtc,
context.tenantId AS tenantId,
context.userId AS userId,
context.deviceId AS deviceId,
changeType,
dataType,
changeStatus,
failureReason,
ipAddress,
UDF.JsonToString(details) AS details
INTO
[MyOutput]
FROM
[MyInput]
WHERE
logType = 'MyLogType';
Nothing made sense so I started deconstructing my query. I took it down to a single field and it succeeded. I went field by field, trying to figure out which field (if any) was the cause.
See my answer below.
The answer was simple (yet frustrating). When I got to the final field, that's where the failure was:
UDF.JsonToString(details) AS details
This was the only field that used a user-defined function. After futzing around, I noticed that the Function Editor showed the title of the function as:
udf.JsonToString
It was a casing issue. I had UDF in UPPERCASE and Azure Stream Analytics expected it in lowercase. I changed my final field to:
udf.JsonToString(details) AS details
It worked.
The strange thing is, it was previously working. Microsoft may have made a change to Azure Stream Analytics to make it case-sensitive in a place where it seemingly wasn't before.
It makes sense, though. JavaScript is case-sensitive. Every JavaScript object is basically a dictionary of members. Consider the error:
Stream Analytics job has validation errors: The given key was not present in the dictionary.
The "udf" object had a dictionary member with my function in it. The UDF object would be undefined. Undefined doesn't have my function as a member.
I hope my 2-hour head-banging session helps someone else.
This is a multi-faceted question, but any help is appreciated.
Background:
I have an Application Definition with 6 entities using SSO
The database back end is Firebird through ODBC
All the data is coming from stored procedures
Questions:
1. While trying to implement one or any of the entities from the BDC in a Business Data List web part, I get the following error: "An error occurred while retrieving data from . Administrators, see the server log for more information." It only happens when I have fields that are null, in this instance a field that was declared as a string.
2. When I check the logs, it's a System.OverflowException.
3. If I change it so the output from the procedure is a blank string, I suddenly get "The title property of entity is set to an invalid value".
4. The error from the logs after changing to a blank string is "Exception handed to HandleXslException.HandleException System.ArgumentException: '.', hexadecimal value 0x00, is an invalid character".
What gives? It worked last night without issue, until a record appeared that had a null value in one of the string fields. Now, even replacing the null value with something generic is still giving me the title property invalid error.
Most puzzling: If I change the query so that the rows with what would be a null or blank string aren't in the query, the error goes away. But, if I add them back and replace the null string with anything, the error comes back. What the !##$? How does it know I've replaced a null value with something else before the records are returned to the XmlReader?
I've run into this exact scenario and it brought back some angry/confused moments. As you said in your comment:
I set the encoding to be Unicode on all varchar and char outputs and it fixed it. The lack of encoding caused there to be null characters (not a null record, but one null character) for that column, and SharePoint could not parse the field. Changed the encoding, and everything works.
It took me a couple of days of swearing at the computer before we took it down to the metal and discovered the Unicode issue. I don't even know when it changed, but we realized the same thing and all was right with the world again.