Separate Body/Header in Logic App custom connector - Azure

I have an ARM template that deploys some custom connectors. In a connector called Start, the piece of code below, which sets the message body and header separately, works fine. But when I do the same on a following connector (call the Start connector A, followed by connector B), the input is not shown in separate fields; what appears is not even relevant. Can anyone help?
Here's the first block in the connector parameters:
{
  "name": "Body",
  "in": "body",
  "schema": {
    "type": "object",
    "properties": {
      "MessageBody": {
        "type": "object",
        "description": "Message body passed to the http trigger"
      },
      "MessageHeader": {
        "type": "object",
        "description": "Message header passed to the http trigger"
      }
    },
    "required": [
      "MessageBody",
      "MessageHeader"
    ]
  },
  "description": "Message body to get properties from the message payload.",
  "required": true
},
Here's the input shown in the portal for connector-A:

It seems that after deployment the old connector is cached. Removing it and adding it again worked.

Related

Slack API triggering action event for element in input block

My Slack app sends a radio-button question to the users. The app is built using bolt-js. The radio buttons are inside an input block (ref: Slack Block Kit).
According to bolt-js reference for the method app.action:
... Note that action elements included in an input block do not trigger any events.
However, I still receive events on the app.action listener from elements that are inside the input block.
The code for sending the message is this:
await client.chat.postMessage({
  "channel": channelId,
  "blocks": [
    {
      "type": "input",
      "label": {
        "type": "plain_text",
        "text": "Some question"
      },
      "element": {
        "type": "radio_buttons",
        "options": [
          {
            "text": {
              "type": "plain_text",
              "text": "Option A",
            },
            "value": "value-0"
          },
          {
            "text": {
              "type": "plain_text",
              "text": "Option B",
            },
            "value": "value-1"
          },
        ],
        "action_id": "some_action",
      }
    }
  ],
  "text": "Some Text"
});
I have tried:
Removing the app.action('some_action') listener for this particular action. With this, the app logs show the following error (see the ack-only listener sketch after this question):
[ERROR] An incoming event was not acknowledged within 3 seconds. Ensure that the ack() argument is called in a listener.
Removing the "action_id": "some_action" line from the block JSON. I still get the above-mentioned error in the logs.
Explicitly setting "dispatch_action": false in the block JSON (which is false by default according to the input block reference). The event is still triggered.
I do not want the event to be triggered. What am I doing wrong?
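For reference, the error above refers to Bolt's acknowledgement requirement: every dispatched action must be ack()-ed by some listener. A minimal sketch of an ack-only listener, assuming the same bolt-js app instance used elsewhere in the app code, would look like this:

// Minimal sketch (assumes the existing bolt-js `app` instance):
// acknowledge the radio-button action so Bolt stops logging the 3-second warning,
// without reacting to the selection itself.
app.action('some_action', async ({ ack }) => {
  await ack();
});

This only acknowledges the event; it does not stop Slack from dispatching it, which is the open question above.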

Index Out of Range Error When Creating SnowFlake Linked Service in Azure Data Factory

I am passing the credentials and parameters required but I get the error
The value of the property 'index' is invalid: 'Index was out of range.
Must be non-negative and less than the size of the collection.
Parameter name: index'. Index was out of range. Must be non-negative
and less than the size of the collection. Parameter name: index
Activity ID: 36a4265d-3607-4472-8641-332f5656661d.
I had the same issue; the password contained a ' and that was causing the trouble. Changing the password to one with no symbols made it work like a charm.
It seems the UI doesn't generate the linked service correctly. Using the Microsoft Docs example JSON, I received the same index error when attempting to create the linked service. If I remove the password from the connection string and add it as a separate property, I am able to successfully generate the linked service.
Microsoft Docs Example (Doesn't Work)
{
  "name": "SnowflakeLinkedService",
  "properties": {
    "type": "Snowflake",
    "typeProperties": {
      "connectionString": "jdbc:snowflake://<accountname>.snowflakecomputing.com/?user=<username>&password=<password>&db=<database>&warehouse=<warehouse>&role=<myRole>"
    },
    "connectVia": {
      "referenceName": "<name of Integration Runtime>",
      "type": "IntegrationRuntimeReference"
    }
  }
}
Working Example
{
  "name": "SnowflakeLinkedService",
  "properties": {
    "type": "Snowflake",
    "typeProperties": {
      "connectionString": "jdbc:snowflake://<accountname>.snowflakecomputing.com/?user=<username>&db=<database>&warehouse=<warehouse>",
      "password": {
        "type": "SecureString",
        "value": "<password>"
      }
    },
    "connectVia": {
      "referenceName": "<name of Integration Runtime>",
      "type": "IntegrationRuntimeReference"
    }
  }
}
We hit this same issue today; it was because our password had an ampersand (&) at the end. This seemed to mess up the connection string, as it contained this:
&password=abc123&&role=MyRole
Changing the password so it does not include an ampersand fixed it.
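As a purely illustrative sketch (this assumes the connection string's query part is tokenized on '&', which is a guess consistent with the error rather than documented behaviour), a password ending in '&' loses its final character and leaves an empty segment behind:

// Illustration only: naive '&' splitting of a connection string whose password ends in '&'.
// The account name and password here are made up.
const connectionString =
  "jdbc:snowflake://myaccount.snowflakecomputing.com/?user=me&password=abc123&&role=MyRole";

const query = connectionString.split("?")[1];
console.log(query.split("&"));
// [ 'user=me', 'password=abc123', '', 'role=MyRole' ]
// The trailing '&' of the password is swallowed and an empty entry appears --
// the kind of malformed collection an "index out of range" error points at.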

Is it possible to change verbiage of listAuditEvents?

The current [Envelopes: listAuditEvents] creates the following verbiage for correction:
"eventFields": [
{
"name": "logTime",
"value": "2018-09-18T19:09:01.3603686Z"
},
{
"name": "Source",
"value": "api"
},
{
"name": "UserName",
"value": "Staging"
},
{
"name": "UserId",
"value": "8c57af14-e46a-4965-ae8b-42bb0c29b706"
},
{
"name": "Action",
"value": "Correction Initiated"
},
{
"name": "Message",
"value": "Staging initiated correction"
},
{
"name": "EnvelopeStatus",
"value": "correct"
},
I would like to modify the Message values. I have gone through DocuSign's API documentation but I have not found any indication that this is possible.
Has anyone had the same need? And if so, were you able to add custom message verbiage for certain events/actions?
Thanks.
It's not possible for you to configure the contents of the API response for the listAuditEvents operation. However, you could (in your code) include logic to parse the API response and based on certain values in the response, substitute values (for purposes in your app) with the verbiage you prefer.
For example, let's say that you have a page in your app that displays the various events that have occurred for an Envelope, but you don't want to display the verbiage "[UserName] initiated correction" as the text in your UI when a user initiates an envelope correction -- instead you want to display the text "[UserName] changed envelope settings." The logic in your code could do something like this pseudocode shows (where auditEvent represents an object within the API response body for the listAuditEvents operation):
if (auditEvent.ActionInitiated == "Correction Initiated") {
  displayMessageInUI(auditEvent.UserName + " changed envelope settings.");
}
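Since each audit event comes back as an array of name/value pairs (as in the eventFields snippet above), one way to make that pseudocode concrete is to flatten the pairs into a map first. This is only an illustrative sketch: displayMessageInUI is a hypothetical UI helper, and the field names follow the response shown in the question.

// Illustrative sketch, not DocuSign-provided code.
// auditEvent is one entry from the listAuditEvents response; displayMessageInUI is hypothetical.
function toFieldMap(auditEvent) {
  const map = {};
  for (const field of auditEvent.eventFields) {
    map[field.name] = field.value;   // e.g. map.Action = "Correction Initiated"
  }
  return map;
}

const fields = toFieldMap(auditEvent);
if (fields.Action === "Correction Initiated") {
  // Substitute your own verbiage for the API's "Staging initiated correction".
  displayMessageInUI(fields.UserName + " changed envelope settings.");
} else {
  displayMessageInUI(fields.Message);
}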

Stream Analytics Job deployed as Azure Resource Manager (ARM) template

I am trying to set up an output Event Hub for a Stream Analytics Job defined as a JSON template. Without the output bit the template deploys successfully; however, when I add the output definition it fails with:
Deployment failed. Correlation ID: <SOME_UUID>. {
  "code": "BadRequest",
  "message": "The JSON provided in the request body is invalid. Property 'eventHubName' value 'parameters('eh_name')' is not acceptable.",
  "details": {
    "code": "400",
    "message": "The JSON provided in the request body is invalid. Property 'eventHubName' value 'parameters('eh_name')' is not acceptable.",
    "correlationId": "<SOME_UUID>",
    "requestId": "<SOME_UUID>"
  }
}
I've defined the ARM template as:
{
  "$schema": "http://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "location": {
      "type": "string",
      "defaultValue": "westeurope"
    },
    "hubName": {
      "type": "string",
      "defaultValue": "fooIotHub"
    },
    "eh_name": {
      "defaultValue": "fooEhName",
      "type": "String"
    },
    "eh_namespace": {
      "defaultValue": "fooEhNamespace",
      "type": "String"
    },
    "streamAnalyticsJobName": {
      "type": "string",
      "defaultValue": "fooStreamAnalyticsJobName"
    }
  },
  "resources": [{
    "type": "Microsoft.StreamAnalytics/StreamingJobs",
    "apiVersion": "2016-03-01",
    "name": "[parameters('streamAnalyticsJobName')]",
    "location": "[resourceGroup().location]",
    "properties": {
      "sku": {
        "name": "standard"
      },
      "outputErrorPolicy": "Drop",
      "eventsOutOfOrderPolicy": "adjust",
      "eventsOutOfOrderMaxDelayInSeconds": 0,
      "eventsLateArrivalMaxDelayInSeconds": 86400,
      "inputs": [{
        "Name": "IoTHubInputLable",
        "Properties": {
          "DataSource": {
            "Properties": {
              "iotHubNamespace": "[parameters('hubName')]",
              "sharedAccessPolicyKey": "[listkeys(resourceId('Microsoft.Devices/IotHubs/IotHubKeys',parameters('hubName'), 'iothubowner'),'2016-02-03').primaryKey]",
              "sharedAccessPolicyName": "iothubowner",
              "endpoint": "messages/events"
            },
            "Type": "Microsoft.Devices/IotHubs"
          },
          "Serialization": {
            "Properties": {
              "Encoding": "UTF8"
            },
            "Type": "Json"
          },
          "Type": "Stream"
        }
      }],
      "transformation": {
        "name": "Transformation",
        "properties": {
          "streamingUnits": 1,
          "query": "<THE SQL-LIKE CODE FOR THE JOB QUERY>"
        }
      },
      "outputs": [{
        "name": "EventHubOutputLable",
        "properties": {
          "dataSource": {
            "type": "Microsoft.ServiceBus/EventHub",
            "properties": {
              "eventHubName": "parameters('eh_name')",
              "serviceBusNamespace": "parameters('eh_namespace')",
              "sharedAccessPolicyName": "RootManageSharedAccessKey"
            }
          },
          "serialization": {
            "Properties": {
              "Encoding": "UTF8"
            }
          }
        }
      }]
    }
  }]
}
Checking https://learn.microsoft.com/en-us/azure/templates/microsoft.streamanalytics/streamingjobs,
it looks like the structure of the JSON for the output is the expected one (with the properties field alongside the type).
I figured out those "Event Hub properties" in the Chrome browser using Developer Tools, checking the details of the "GetOutputs" HTTP request; otherwise I am not sure where I could see how to specify those properties. The structure looks quite similar to the one for the input IoT Hub (which is working), just using different labels for the properties related to the IoT Hub details.
Checking this blog post https://blogs.msdn.microsoft.com/david/2017/07/20/building-azure-stream-analytics-resources-via-arm-templates-part-2-the-template/,
the output part there is related to Power BI and it looks like a different structure is used for the properties: outputPowerBISource. So for the Event Hub I tried the field outputEventHubSource (from the checks using Chrome Developer Tools) instead of properties, but then I get this error:
Deployment failed. Correlation ID: <SOME_UUID>. {
  "code": "BadRequest",
  "message": "The JSON provided in the request body is invalid. Required property 'properties' not found in JSON. Path '', line 1, position 1419.",
  "details": {
    "code": "400",
    "message": "The JSON provided in the request body is invalid. Required property 'properties' not found in JSON. Path '', line 1, position 1419.",
    "correlationId": "<SOME_UUID>",
    "requestId": "<SOME_UUID>"
  }
}
I am deploying this template with the Azure CLI (from a Debian Linux machine):
az group deployment create \
--name "deployStreamAnalyticsJobs" \
--resource-group "MyRGName" \
--template-file ./templates/stream-analytics-jobs.json
How do I specify an output in an Azure Resource Manager (ARM) template for a Stream Analytics Job?
Any property that contains a parameter (or any expression that needs to be evaluated) must be wrapped in square brackets, e.g.
  "eventHubName": "[parameters('eh_name')]",
  "serviceBusNamespace": "[parameters('eh_namespace')]",
Otherwise the literal value in the quotes is used.
Does that help?
I found out that all parameters need to be wrapped in square brackets (as pointed out in the other answer to this question).
Also, to dynamically retrieve the shared access policy key (or any other attribute of an existing resource like the Event Hub), a combination of functions like listKeys and resourceId must be used; see below for a full example of an Event Hub described as an output for a Stream Analytics Job.
In detail:
the parameters defined for eventHubName and serviceBusNamespace must be evaluated using square brackets (see how I defined those parameters in the JSON example in the body of the question above),
the shared access policy name (sharedAccessPolicyName) can be a hardcoded string or a parameter as before, while the key can be dynamically retrieved using "sharedAccessPolicyKey": "[listKeys(resourceId('Microsoft.EventHub/namespaces/eventhubs/authorizationRules', parameters('eh_namespace'), parameters('eh_name'), 'RootManageSharedAccessKey'),'2017-04-01').primaryKey]" (this is sensitive data and should be protected by avoiding hardcoding it as a plain string).
The following JSON configuration shows an existing Event Hub defined as an output for the Stream Analytics Job:
"outputs": [{
"Name": "EventHubOutputLable",
"Properties": {
"DataSource": {
"Type": "Microsoft.ServiceBus/EventHub",
"Properties": {
"eventHubName": "[parameters('eh_name')]",
"serviceBusNamespace": "[parameters('eh_namespace')]",
"sharedAccessPolicyKey": "[listKeys(resourceId('Microsoft.EventHub/namespaces/eventhubs/authorizationRules', parameters('eh_namespace'), parameters('eh_name'), 'RootManageSharedAccessKey'),'2017-04-01').primaryKey]",
"sharedAccessPolicyName": "RootManageSharedAccessKey"
}
},
"Serialization": {
"Properties": {
"Encoding": "UTF8"
},
"Type": "Json"
}
}
}]

Azure API App not recognizing Swagger definition

I'm trying out Azure's API App service, and I have a valid Swagger schema exposed for the service to consume, following the documentation here. I can retrieve the Swagger schema at its endpoint both on my local server and once the API App is deployed, and I've updated my web.config file to include the application/json MIME type. My apiapp.json file is as follows:
{
  "$schema": "http://json-schema.org/schema#",
  "id": "apiapp.dlxdev",
  "namespace": "microsoft.com",
  "gateway": "/* gateway, copied from Azure portal */",
  "version": "1.0.0",
  "title": "DLX API App (Dev)",
  "summary": "The developer version of the DLX API App.",
  "author": "Daniel W. Hieber",
  "endpoints": {
    "apiDefinition": "/api.json",
    "status": null
  }
}
Even though my endpoint is defined as /api.json, when I go to the API Definition blade in the Azure Portal, it says Failed to get metadata for 'apiApp.dlxDev' from endpoint '/swagger/docs/v1': Failed status code: 'NotFound'. Response Body: 'Not Found'. It seems as though Azure is still looking for my Swagger file at the default /swagger/docs/v1 endpoint rather than the /api.json endpoint I specified.
I've also tried creating a metadata folder and placing my Swagger schema there (renaming it to apiDefinition.swagger.json, following the documentation), and didn't have any luck with that either.
Any ideas where I'm going wrong? Why isn't Azure detecting the endpoint for my Swagger schema?
UPDATE 1
Now I receive the following error in the API Definition blade: #/definitions/: Cannot determine schema of data definition named ''. This doesn't appear to be a problem with the Swagger schema itself, since all of my schema references are formatted correctly, and the schema itself is valid.
UPDATE 2
One thing that I needed to do was restart the gateway under which my API app was being hosted. Restarting the gateway is what caused the error message to change. So I think the app is recognizing my Swagger schema now. But I'm still not sure why I'm getting the 'Cannot determine schema' error, since my schema is formatted correctly.
> Now I receive the following error in the API Definition blade: #/definitions/: Cannot determine schema of data definition named ''. This doesn't appear to be a problem with the Swagger schema itself, since all of my schema references are formatted correctly, and the schema itself is valid.
I believe Azure requires a default response defined in Swagger for every operation. Perhaps double-check that every operation has a default response and that every response has a schema property which resolves to a valid schema in the definitions section.
Like:
"paths": {
"/Categories": {
"get": {
"tags": [
"Categories"
],
...
],
...
"responses": {
"200": {
"description": "EntitySet Categories",
"schema": {
"$ref": "#/definitions/NorthwindAPI.Models.Category"
}
},
"default": {
"description": "Unexpected error",
"schema": {
"$ref": "#/definitions/_Error"
}
}
},
...
"definitions": {
"NorthwindAPI.Models.Category": {
"properties": {
"CategoryID": {
"format": "int32",
"description": "CategoryID",
"type": "integer"
},
"CategoryName": {
"description": "CategoryName",
"type": "string"
},
"Description": {
"description": "Description",
"type": "string"
},
"Picture": {
"description": "Picture",
"type": "string"
}
}
},
"_Error": {
"properties": {
"error": {
"$ref": "#/definitions/_InError"
}
}
},
HTH,
Josh
