I have a logic app that is triggered by a Service Bus queue. The message content is not usable as it is just random characters. I suspect it needs to be parsed, but it is not clear how to do this.
I have the following:
Not enough reputation to add an image, but this is from a screenshot in Azure:
"Insert_Entity": {
  "inputs": {
    "body": {
      "PartitionKey": "deviceID",
      "RowKey": "@variables('curDate')",
      "content": "@triggerBody()?['ContentData']"
    },
When I look at the data I am getting for "content" from "@triggerBody()?['ContentData']", it looks like this:
"W3sidHlwZSI6ImxvZyJ9LF...." I deleted most of it, as it is hundreds of characters long.
I suspect this needs to be decoded or parsed to get at the actual message body. I have checked this out but don't know where to insert code like it: Getting content from service bus in logic apps
Can you please explain how to see the message body.
The string W3sidHlwZSI6ImxvZyJ9LF.... you mentioned is a Base64-encoded string. If we want to see the message body, we need to convert the Base64 string to a plain string.
We can do that with base64ToString(triggerBody()?['ContentData']). For details, please refer to the screenshot.
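To see what that expression does outside of the Logic App, here is a minimal Python sketch. The sample payload is illustrative, not the actual message from the question:

```python
import base64
import json

# Illustrative stand-in for triggerBody()?['ContentData']; the real value
# would be the long "W3si..." string from the question.
content_data = base64.b64encode(b'[{"type":"log"}]').decode("ascii")

# This is what base64ToString(...) does inside the Logic App expression.
decoded = base64.b64decode(content_data).decode("utf-8")
print(decoded)  # [{"type":"log"}]

# The decoded text is JSON, so it can then be parsed.
body = json.loads(decoded)
print(body[0]["type"])  # log
```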
Body info:
After getting the value using Tom Sun's solution, I had to extract the JSON part of the result to be able to parse it. Logic App expression:
substring(
variables('result'),sub(indexOf(variables('result'),'{'),1),
sub(lastIndexOf(variables('result'),'}'),indexOf(variables('result'),'{'))
)
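As a sanity check of what that expression is doing (grabbing everything between the first '{' and the last '}'), here is the equivalent logic in Python, using a made-up result string. Note the Logic App version offsets the start index by one with sub(...), so the exact bounds depend on the characters surrounding the JSON in your actual result:

```python
# Hypothetical decoded result with noise around the JSON part.
result = 'garbage-prefix{"eventType":"log","subject":"device1"}garbage-suffix'

start = result.index("{")   # indexOf(variables('result'), '{')
end = result.rindex("}")    # lastIndexOf(variables('result'), '}')
json_part = result[start:end + 1]
print(json_part)  # {"eventType":"log","subject":"device1"}
```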
Then use the Parse JSON action to parse the result using this schema:
{
  "properties": {
    "data": {
      "type": "string" // change as required
    },
    "dataVersion": {
      "type": "string"
    },
    "eventTime": {
      "type": "string"
    },
    "eventType": {
      "type": "string"
    },
    "id": {
      "type": "string"
    },
    "metadataVersion": {
      "type": "string"
    },
    "subject": {
      "type": "string"
    },
    "topic": {
      "type": "string"
    }
  },
  "type": "object"
}
I am trying to set up an integration between Zendesk and Slack, where Zendesk sends new notifications to Slack using Slack's Incoming Webhooks. The issue is that under certain circumstances some Zendesk fields are empty strings, which Slack flags as an invalid_payload. I would love to be able to send the message with an empty string, or at least have a fallback string if the Zendesk field is empty.
I have confirmed that populating the empty string, without making any other changes to the payload, results in a successful integration, so it's definitely the empty string blocking the message from being delivered. I have also found this SO thread, but it doesn't seem to work for me. I haven't been able to find any documentation from Slack regarding empty strings in the JSON payload either.
So I guess my question is whether it's possible to send empty strings in a JSON payload for Slack webhooks at all, and if not, whether there is a workaround to account for the possibility of an empty string. As I mentioned, this is a Zendesk integration, so I don't have the ability to write a script to check for an empty string, because it's all happening within Zendesk's dashboard (at least as far as I'm aware).
Here's the overall JSON object that I am trying to send from Zendesk to Slack. The Organization field is the one that has the potential to be blank:
{
  "blocks": [
    {
      "type": "divider"
    },
    {
      "type": "header",
      "text": {
        "type": "plain_text",
        "text": "Ticket Title/Subject"
      }
    },
    {
      "type": "context",
      "elements": [
        {
          "type": "mrkdwn",
          "text": "*Ticket #_000_*"
        }
      ]
    },
    {
      "type": "context",
      "elements": [
        {
          "type": "mrkdwn",
          "text": "*Status: _New_*"
        }
      ]
    },
    {
      "type": "section",
      "fields": [
        {
          "type": "mrkdwn",
          "text": "*Requester*"
        },
        {
          "type": "mrkdwn",
          "text": "*Organization*"
        },
        {
          "type": "plain_text",
          "text": "Client Name"
        },
        {
          "type": "plain_text",
          "text": "Organization Name" // possible empty string
        }
      ]
    },
    {
      "type": "section",
      "text": {
        "type": "plain_text",
        "text": "Message Body Text Here"
      }
    },
    {
      "type": "divider"
    }
  ]
}
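Zendesk itself can't run code here, but if any intermediary (a small relay service, a Zapier-style hop, etc.) sits between Zendesk and Slack, the usual workaround is to substitute a fallback before the payload reaches Slack. A minimal Python sketch of that idea — the fallback text and the traversal of `blocks`/`fields` are assumptions, not Slack or Zendesk requirements:

```python
import json

FALLBACK = "(none)"  # assumed placeholder for empty fields

def fill_empty_text(block_payload: dict) -> dict:
    """Replace empty 'text' values in Block Kit fields so Slack does not reject the payload."""
    for block in block_payload.get("blocks", []):
        candidates = list(block.get("fields", []))
        if isinstance(block.get("text"), dict):
            candidates.append(block["text"])
        for field in candidates:
            if field.get("text") == "":
                field["text"] = FALLBACK
    return block_payload

payload = json.loads(
    '{"blocks": [{"type": "section", "fields": [{"type": "plain_text", "text": ""}]}]}'
)
payload = fill_empty_text(payload)
print(payload["blocks"][0]["fields"][0]["text"])  # (none)
```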
I'm using Postman to make REST requests to the Azure API to run a pipeline that is in Synapse. The permissions and the token are already sorted out and working; the problem is that the pipeline receives 3 parameters and I don't know how to pass them. I have this request, for example:
https://hvdhgsad.dev.azuresynapse.net/pipelines/pipeName/createRun?api-version=2020-12-01
and I added the parameters in the body:
{
  "parameters": {
    "p_dir": {
      "type": "string",
      "defaultValue": "val1"
    },
    "container": {
      "type": "string",
      "defaultValue": "val"
    },
    "p_folder": {
      "type": "string",
      "defaultValue": "val3"
    }
  }
}
but when I check the run that was launched by the request, I get this:
{
  "id": "xxxxxxxxxxxxxxx",
  "runId": "xxxxxxxxxxxxxxxxxxxxx",
  "debugRunId": null,
  "runGroupId": "xxxxxxxxxxxxxxxxxxxx",
  "pipelineName": "xxxxxxxxxxxxxxxxx",
  "parameters": {
    "p_dir": "",
    "p_folder": "",
    "container": ""
  },
  "invokedBy": {
    "id": "xxxxxxxxxxxxxxxxx",
    "name": "Manual",
    "invokedByType": "Manual"
  },
  "runStart": "2021-07-20T05:56:04.2468861Z",
  "runEnd": "2021-07-20T05:59:10.1734654Z",
  "durationInMs": 185926,
  "status": "Failed",
  "message": "Operation on target Data flow1 failed: {\"StatusCode\":\"DF-Executor-SourceInvalidPayload\",\"Message\":\"Job failed due to reason: Data preview, debug, and pipeline data flow execution failed because container does not exist\",\"Details\":\"\"}",
  "lastUpdated": "2021-07-20T05:59:10.1734654Z",
  "annotations": [],
  "runDimension": {},
  "isLatest": true
}
The params are empty, so I don't know what's wrong or missing.
What is the correct way to pass them?
ref: https://learn.microsoft.com/en-us/rest/api/synapse/data-plane/pipeline/create-pipeline-run#examples
Just created an account to answer this, as I've had the same issue.
I resolved it by putting just the parameter names and their values directly in the JSON body.
e.g. (using the parameter names from the question):
{"p_dir": "val1", "container": "val", "p_folder": "val3"}
I found this by following the documentation you posted; under "Request body" it passes the name of the parameter and its value directly into the JSON body:
{
  "OutputBlobNameList": [
    "exampleoutput.csv"
  ]
}
This particular example is a list/array, so the brackets [] confused me; if you are passing string parameters they are not needed.
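In other words, the request body is a flat name/value map, not the parameter-definition shape from the pipeline JSON. A sketch of building that call in Python (workspace URL, pipeline name, and values are placeholders taken from the question; the actual POST with a Bearer token is left as a comment):

```python
import json

workspace = "https://myworkspace.dev.azuresynapse.net"  # placeholder workspace URL
pipeline = "pipeName"
url = f"{workspace}/pipelines/{pipeline}/createRun?api-version=2020-12-01"

# The body is a flat map of parameter name -> value, NOT the
# {"type": ..., "defaultValue": ...} definition shape.
body = {
    "p_dir": "val1",
    "container": "val",
    "p_folder": "val3",
}

# POST this with headers:
#   Authorization: Bearer <token>
#   Content-Type: application/json
print(url)
print(json.dumps(body))
```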
I have built an Azure logic app with an HTTP trigger that receives a JSON file, provides a response, then attempts to send the payload to a Service Bus queue. I kept getting a status 400 error, so I tried adding a Parse JSON step; however, I am getting an invalid template error, per the screenshot below.
Azure logic app
parse JSON error
The response is working correctly, with the payload being correctly displayed at the initiating website.
successful response
The JSON schema is as follows. EDIT: Revised the schema to include "null" per Rick's suggestion below.
{
  "items": {
    "properties": {
      "Entry Price": {
        "type": [
          "string",
          "null"
        ]
      },
      "Exit Price": {
        "type": [
          "string",
          "null"
        ]
      },
      "Gain $": {
        "type": [
          "string",
          "null"
        ]
      },
      "New Stop": {
        "type": [
          "string",
          "null"
        ]
      },
      "ReceivedDate": {
        "type": [
          "string",
          "null"
        ]
      },
      "ReceivedTime": {
        "type": [
          "string",
          "null"
        ]
      },
      "Status": {
        "type": [
          "string",
          "null"
        ]
      },
      "Stop": {
        "type": [
          "string",
          "null"
        ]
      },
      "Symbol": {
        "type": [
          "string",
          "null"
        ]
      },
      "Tranche": {
        "type": [
          "string",
          "null"
        ]
      }
    },
    "required": [
      "Symbol",
      "Status",
      "ReceivedDate",
      "ReceivedTime"
    ],
    "type": "object"
  },
  "type": "array"
}
I have spent days looking at forums and trying a number of suggested fixes, but I am unable to get the app to execute without error.
I'd appreciate any ideas or suggestions to fix this. Thanks.
It looks like you are passing a null value for the property content, while the schema has it as a required property.
You can solve this by changing the schema definition for content to the following:
"content": {
  "type": [
    "string",
    "null"
  ]
}
More information (and the source) here: Parsing JSON with null-able properties in Logic Apps
There are also default values that could be considered, but everything depends on the context your logic app operates in.
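To illustrate why the type union matters: a JSON Schema type of ["string", "null"] accepts both a string and JSON null, while ["string"] alone rejects null. A minimal hand-rolled check in Python — this only sketches the rule, it is not the Logic Apps validator:

```python
import json

# Map JSON Schema type names to Python types (subset, for illustration).
TYPE_MAP = {"string": str, "null": type(None), "integer": int}

def matches(value, allowed_types):
    """True if the value's JSON type is one of the allowed type names."""
    return any(isinstance(value, TYPE_MAP[t]) for t in allowed_types)

payload = json.loads('{"content": null}')

print(matches(payload["content"], ["string"]))          # False -> validation error
print(matches(payload["content"], ["string", "null"]))  # True  -> accepted
```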
I'm trying to create a generic CSV dataset with a parameterized filename and schema so I can use it in foreach loops over file lists. I'm having trouble publishing it, and I don't know whether I'm doing something wrong or the framework docs are incorrect.
According to documentation the schema description is:
Columns that define the physical type schema of the dataset. Type: array (or Expression with resultType array), itemType: DatasetSchemaDataElement.
I have a dataset with a parameter named Schema of type Array, and "schema" set to an expression that returns this parameter:
{
  "name": "GenericCSVFile",
  "properties": {
    "linkedServiceName": {
      "referenceName": "LinkedServiceReferenceName",
      "type": "LinkedServiceReference"
    },
    "parameters": {
      "Schema": {
        "type": "array"
      },
      "TableName": {
        "type": "string"
      },
      "TableSchema": {
        "type": "string"
      }
    },
    "folder": {
      "name": "Folder"
    },
    "type": "DelimitedText",
    "typeProperties": {
      "location": {
        "type": "AzureDataLakeStoreLocation",
        "fileName": {
          "value": "@concat(dataset().TableSchema,'.',dataset().TableName,'.csv')",
          "type": "Expression"
        },
        "folderPath": "Path"
      },
      "columnDelimiter": ",",
      "escapeChar": "\\",
      "firstRowAsHeader": true,
      "quoteChar": "\""
    },
    "schema": {
      "value": "@dataset().Schema",
      "type": "Expression"
    }
  },
  "type": "Microsoft.DataFactory/factories/datasets"
}
However, when I publish, I get the following error:
Error code: BadRequest
Inner error code: InvalidPropertyValue
Message: Invalid value for property 'schema'
Am I doing something wrong? Are the docs wrong?
Yes, this is the expected behavior. If you need a dynamic value for column mapping, ignore the schema in the DelimitedText dataset; it is more of a visual display of the physical schema information and does not take effect in copy activity column mapping. Setting it as an expression is also not allowed. Instead, configure the copy activity's mapping as an expression and pass it a proper value when the trigger runs.
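For reference, a dynamic column mapping is set on the Copy activity's translator rather than on the dataset schema. A rough sketch of the shape, where the pipeline parameter name ColumnMapping is an assumption for illustration:

```json
"typeProperties": {
  "translator": {
    "value": "@pipeline().parameters.ColumnMapping",
    "type": "Expression"
  }
}
```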
My Service Bus queue is receiving telemetry from 2 different objects. For Object1 I have to send mail to MailId1, and for Object2 I have to send mail to MailId2. I also have to use some of the content from the JSON telemetry as the body of my mail.
For a single object it works fine: in my logic app I have the Service Bus trigger (its queue receives the telemetry messages), followed by Parse JSON (to parse the content as JSON), and lastly SMTP to send the mail. In case I need to make decisions based on the JSON, what workflow can I use in the Logic App?
I have used the Condition action as shown in the image below.
The JSON parsed in the IF condition is:
{
  "properties": {
    "dbt": {
      "type": "integer"
    },
    "latitude": {
      "type": "number"
    },
    "location": {
      "type": "string"
    },
    "longitude": {
      "type": "number"
    },
    "owner": {
      "type": "string"
    },
    "speed": {
      "type": "integer"
    },
    "stdb": {
      "type": "integer"
    },
    "timeCreated": {
      "type": "integer"
    }
  },
  "type": "object"
}
The JSON parsed in the ELSE condition is:
{
  "properties": {
    "message": {
      "type": "string"
    },
    "owner": {
      "type": "string"
    },
    "timeCreated": {
      "type": "integer"
    }
  },
  "type": "object"
}
For either telemetry, the condition always fails and executes the ELSE part; the IF part is never executed. Where am I going wrong in setting the condition for the IF part?
Any help would be appreciated.
You can use conditional statements.
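The decision itself can key off a field that exists in only one of the two schemas — for example, "dbt" appears only in Object1's telemetry and "message" only in Object2's. In plain Python terms, the equivalent branch looks like this (field names are taken from the two schemas above; the mail addresses are placeholders):

```python
import json

def pick_recipient(message: str) -> str:
    """Route mail based on which telemetry shape the message has."""
    telemetry = json.loads(message)
    # Object1's schema has "dbt"/"speed"/etc.; Object2's has "message".
    if "dbt" in telemetry:
        return "mailid1@example.com"  # placeholder address
    return "mailid2@example.com"      # placeholder address

print(pick_recipient('{"dbt": 5, "owner": "a", "timeCreated": 1}'))          # mailid1@example.com
print(pick_recipient('{"message": "hi", "owner": "b", "timeCreated": 2}'))   # mailid2@example.com
```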