Azure Logic Apps - Get Blob Content - Setting Content type

The Azure Logic Apps action "Get Blob Content" doesn't allow us to set the returned content-type.
By default, it returns the blob as binary (application/octet-stream), which is useless in most cases; in general it would be more useful to get text (e.g. JSON, XML, CSV).
I know the action is in beta. Is that on the short-term roadmap?

The workaround I found is to use the Logic Apps expression base64ToString.
For instance, create an action of type "Compose" (Data Operations group) with the following code:
"ComposeToString": {
"inputs": "#base64ToString(body('Get_blob_content').$content)",
"runAfter": {
"Get_blob_content": [
"Succeeded"
]
},
"type": "Compose"
}
The output will be the text representation of the blob.

So I had a blob sitting in Azure Storage with JSON in it.
Fetching the blob got me an octet-stream back that was pretty useless, as I was unable to parse it:
BadRequest. The property 'content' must be of type JSON in the
'ParseJson' action inputs, but was of type 'application/octet-stream'.
So I set up an "Initialize variable" action, content type of String, pointing to Get Blob Content -> File Content. The base64 conversion occurs under the hood and I am now able to access my JSON via the variable.
No code required.
(Screenshots: the JSON output, and the flow with no code.)
Enjoy! Healy in Tampa...

After fiddling with Logic Apps a lot, I finally understood what was going on.
The JSON output from the HTTP request is the JSON representation of an XML payload:
{
    "$content-type": "application/xml",
    "$content": "77u/PD94bWwgdm..."
}
So we can decode it, but that alone is not very useful: to Logic Apps this is an XML object, and we can apply XML functions to it, such as xpath.
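For instance, a Compose action can apply xpath directly to the response (a sketch only; the action name 'HTTP' and the XPath '/feed/title/text()' are assumptions about the actual payload):
"ExtractTitle": {
    "inputs": "@xpath(xml(body('HTTP')), '/feed/title/text()')",
    "runAfter": {
        "HTTP": [
            "Succeeded"
        ]
    },
    "type": "Compose"
}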

You would need to know the content-type.
Use @{body('Get_blob_content')['$content']} to get the content part alone.

It is enough to use "Initialize variable" and take the output of Get Blob Content as type "String". This will automatically parse the content.
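For reference, a minimal sketch of what that step might look like in code view (the variable name BlobText is an assumption):
"Initialize_variable": {
    "inputs": {
        "variables": [
            {
                "name": "BlobText",
                "type": "string",
                "value": "@{body('Get_blob_content')}"
            }
        ]
    },
    "runAfter": {
        "Get_blob_content": [
            "Succeeded"
        ]
    },
    "type": "InitializeVariable"
}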

Related

Azure Data Factory REST API returns invalid JSON file with pagination

I'm building a pipeline which copies a response from an API into a file in my storage account. There is also an element of pagination; however, that works like a charm and I get all my data from all the pages.
My result is something like this:
{"data": {
"id": "Something",
"value": "Some other thing"
}}
The problem is that the copy activity just appends each response to the file, thereby making it invalid JSON, which is a big problem further down the line. The final output looks like:
{"data": {
"id": "22222",
"value": "Some other thing"
}}
{"data": {
"id": "33333",
"value": "Some other thing"
}}
I have tried everything I could think of and Google my way to, but nothing changes how the data is appended to the file, and I'm stuck with an invalid JSON file :(
As a backup plan, I'll just make a loop and create a JSON file for each page. But that seems a bit janky and really slow.
Anyone got an idea or a solution for my problem?
When you copy data from a REST API to Blob Storage, it copies the data as a set of objects by default.
Example:
Sample data:
{ "time": "2015-04-29T07:12:20.9100000Z", "callingimsi": "466920403025604"}
Sink data:
{"time":"2015-04-29T07:12:20.9100000Z","callingimsi":"466920403025604"}
{"time":"2015-04-29T07:13:21.0220000Z","callingimsi":"466922202613463"}
{"time":"2015-04-29T07:13:21.4370000Z","callingimsi":"466923101048691"}
This is invalid JSON.
To work around this, set the file pattern in the sink activity settings to "Array of objects"; this will write a single array containing all the objects.
Output (the sink file is now one valid JSON array):
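Based on the sample above, the sink file would look something like this:
[
    {"time":"2015-04-29T07:12:20.9100000Z","callingimsi":"466920403025604"},
    {"time":"2015-04-29T07:13:21.0220000Z","callingimsi":"466922202613463"},
    {"time":"2015-04-29T07:13:21.4370000Z","callingimsi":"466923101048691"}
]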

Getting the files from Azure Blob and sending them in one email

I have a setup that exports all the JSON files generated from an API and emails them when a request arrives in a shared mailbox. The thing is that the logic app currently sends separate emails, one JSON file per email, so it's 7 emails in my case.
The goal is to send all the JSON files in one email. I have tried to figure out the connector methods, but it seems that I cannot find a way. Tried to Google it, of course, but no luck.
Would really appreciate any help!
Current setup looks like this:
(Screenshots: Azure Logic App 1 and Azure Logic App 2.)
You need to build an array of all of the attachments outside of the loop.
This is the flow I tested with; the two important points are:
Construct an Attachments Array
As you can see, I've declared a variable named Attachments, of type Array, at the top.
Get your blobs and then loop over each one of them.
Within the loop, get the contents of the blob and then append an object to the array with the following JSON structure.
This is the Peek code of the array item; note that I am encoding the content as base64:
{
    "inputs": {
        "name": "Attachments",
        "value": {
            "ContentBytes": "@{base64(body('Get_blob_content_(V2)'))}",
            "Name": "@{items('For_Each_Blob')?['DisplayName']}"
        }
    }
}
Send the Email
Now, when you send the email, refer to the array as the value of the Attachments parameter.
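A minimal sketch of what the send action might contain in code view (the action name, recipient, subject, and connection details are assumptions):
"Send_an_email_(V2)": {
    "inputs": {
        "body": {
            "To": "someone@example.com",
            "Subject": "All blob files in one email",
            "Body": "<p>Please find all files attached.</p>",
            "Attachments": "@variables('Attachments')"
        },
        "host": {
            "connection": {
                "name": "@parameters('$connections')['office365']['connectionId']"
            }
        },
        "method": "post",
        "path": "/v2/Mail"
    },
    "runAfter": {
        "For_Each_Blob": [
            "Succeeded"
        ]
    },
    "type": "ApiConnection"
}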
That should get the job done for you, it worked for me.
Have you tried adding the output of Get Blob Content to an array, or adding it into a string (https://learn.microsoft.com/en-us/azure/logic-apps/logic-apps-create-variables-store-values#initialize-variable), and then using this variable to create the email body?

Get value from JSON in Logic App

Rephrasing the question entirely, as the first attempt was unclear.
In my logic app I am reading a .json file from a blob which contains:
{
    "alpha": {
        "url": "https://linktoalpha.com",
        "meta": "This logic app does job aaaa"
    },
    "beta": {
        "url": "https://linktobeta.com",
        "meta": "This logic app does job beta"
    },
    "theta": {
        "url": "https://linktotheta.com",
        "meta": "This logic app does job theta"
    }
}
I'm triggering the logic app with an HTTP POST whose body contains:
{ "logicappname": "beta" }
But the value for 'logicappname' could be alpha, beta or theta. I now need to set a variable which contains the url value for 'beta'. How can this be achieved without jsonpath support?
I am already parsing the file contents from the blob as JSON, and this IS giving me the tokens... but I cannot see how to select the value I need. Would appreciate any assistance, thank you.
For your requirement, I think the "Parse JSON" action is all you need. Please refer to the steps below:
1. I upload a file testJson.json to my blob storage, then get it and parse it in my logic app.
2. We can see there are three url values in the screenshot below. As you want the url value for beta, which is the second one, we can choose the second one.
If you want to get the url value by the logicappname param from the "When a HTTP request is received" trigger, you can use an expression when you create the result variable.
In my screenshot, the expression is:
body('Parse_JSON')?[triggerBody()?['logicappname']]?['url']
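For reference, a sketch of the variable initialization in code view (the variable name TargetUrl is an assumption):
"Initialize_variable": {
    "inputs": {
        "variables": [
            {
                "name": "TargetUrl",
                "type": "string",
                "value": "@body('Parse_JSON')?[triggerBody()?['logicappname']]?['url']"
            }
        ]
    },
    "runAfter": {
        "Parse_JSON": [
            "Succeeded"
        ]
    },
    "type": "InitializeVariable"
}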
As the description of your question is a little unclear, I'm confused about the meaning of "I am already json parsing the file contents from the blob and this IS giving me the tokens": why are "tokens" involved? Also, the original question seemed to ask for jsonpath, but the latest description says without jsonpath. So if I have misunderstood your question, please let me know. Thanks.
Not sure if I understand your question, but I believe you can use a Parse JSON action after the HTTP trigger.
With this you get control over the incoming JSON message and you can choose the 'url' value as dynamic content in the subsequent actions.
Let me know if my understanding of your question is wrong.

Azure Logic App - How to upload file to Azure Blob Storage from byte array

I am trying to create a Logic App that is triggered by an HttpRequest whose payload is a JSON request. Inside this JSON, one field contains the file:
{
    "EntityId": "45643",
    "SharedGuid": "00000000-0000-0000-0000-000000000000",
    "Document": {
        "DocumentName": "MyFileName.pdf",
        "File": "JVBERi0xLjMKJfv8/f4KMS.....lJUVPRg=="
    }
}
This "file" content is being generated by the customer application by using the following C# function: File.ReadAllBytes("local path here").
I managed to upload the byte array to blob storage. But the file is not valid once it is uploaded in the Blob Storage.
I tried different file contents for the file in the JSON schema definition as: string, binary, application/octet-stream.
Any help will be appreciated.
Did you convert the bytes to a base64 string in your HTTP request code, like the code below?
byte[] b = File.ReadAllBytes(@"filepath");
string s = Convert.ToBase64String(b);
According to the file content you provided, it seems you have already converted it to a base64 string as above, so I provide the solution below:
For this requirement, you can just parse the data as string (no need to use "binary" in the schema) in your "Parse JSON" action and then use the base64ToBinary() method in the "Create blob" action, as shown in the screenshot below.
The expression in "Blob content" is:
base64ToBinary(body('Parse_JSON')?['Document']?['File'])
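A sketch of what the Create blob action might look like in code view (the folder path, blob name source, and connection details are assumptions):
"Create_blob": {
    "inputs": {
        "body": "@base64ToBinary(body('Parse_JSON')?['Document']?['File'])",
        "host": {
            "connection": {
                "name": "@parameters('$connections')['azureblob']['connectionId']"
            }
        },
        "method": "post",
        "path": "/datasets/default/files",
        "queries": {
            "folderPath": "/mycontainer",
            "name": "@body('Parse_JSON')?['Document']?['DocumentName']"
        }
    },
    "runAfter": {
        "Parse_JSON": [
            "Succeeded"
        ]
    },
    "type": "ApiConnection"
}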
Hope it helps~
If you still have any problems, please feel free to let me know.

Azure: How to write a path to get a file from a time-series partitioned folder using Azure Logic Apps

I am trying to retrieve a CSV file from Azure Blob Storage using a logic app.
I set the Azure Storage Explorer path in the parameters, and in the Get blob content action I use that parameter.
In the Parameters I have set the value as:
concat('Directory1/','Year=',string(int(substring(utcNow(),0,4))),'/Month=',string(int(substring(utcnow(),5,2))),'/Day=',string(int(substring(utcnow(),8,2))),'/myfile.csv')
At run time this path should resolve to:
Directory1/Year=2019/Month=12/Day=30/myfile.csv
but during execution the action fails with the following error message:
{
    "status": 400,
    "message": "The specifed resource name contains invalid characters.\r\nclientRequestId: 1e2791be-8efd-413d-831e-7e2cd89278ba",
    "error": {
        "message": "The specifed resource name contains invalid characters."
    },
    "source": "azureblob-we.azconn-we-01.p.azurewebsites.net"
}
So my question is: how do I write a path to get data from a time-series partitioned path?
Joy Wang's answer was partially correct.
Parameters in logic apps are treated as string values only and will not evaluate functions such as concat().
The correct way to use the concat function is in an expression.
My solution to the problem is:
concat('container1/','Directory1/','Year=',string(int(substring(utcNow(),0,4))),'/Month=',string(int(substring(utcnow(),5,2))),'/Day=',string(int(substring(utcnow(),8,2))),'/myfile.csv')
You should not use that in the parameters: when you put the line concat('Directory1/','Year=',string(int(substring(utcNow(),0,4))),'/Month=',string(int(substring(utcnow(),5,2))),'/Day=',string(int(substring(utcnow(),8,2))),'/myfile.csv') in the parameters, its type is String, so it is taken literally by the logic app and the function does not take effect.
You also need to include the container name in the concat(). There is also no need for string(int()), because utcNow() and substring() both return strings.
To fix the issue, use the line below directly in the Blob option (my container name is container1):
concat('container1/','Directory1/','Year=',substring(utcNow(),0,4),'/Month=',substring(utcnow(),5,2),'/Day=',substring(utcnow(),8,2),'/myfile.csv')
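For example, on 30 December 2019 this expression evaluates to:
container1/Directory1/Year=2019/Month=12/Day=30/myfile.csv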
Update:
As mentioned in @Stark's answer, if you want to drop the leading 0 from the left, you can convert the value from string to int and then back to string:
concat('container1/','Directory1/','Year=',string(int(substring(utcNow(),0,4))),'/Month=',string(int(substring(utcnow(),5,2))),'/Day=',string(int(substring(utcnow(),8,2))),'/myfile.csv')
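For example, in May substring(utcNow(),5,2) returns "05"; int('05') gives 5 and string(5) gives "5", so the path segment becomes Month=5 instead of Month=05.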
