Okta management API within Azure Logic App - retrieve header with duplicate key

I'm using Logic Apps to extract some logs from Okta and have come across an interesting issue.
The Okta management API paginates its responses and provides the link to the next page of results in a 'Link' header; however, it also provides another 'Link' header which gives you the link to the current page, i.e.:
link : <https://dev1-web.okta.com/api/v1/logs?since=2022-07-07T20%3A19%3A.0837307Z&sortOrder=ASCENDING>; rel="self"
link : <https://dev1-web.okta.com/api/v1/logs?since=2022-07-07T20%3A19%3A.0837307Z&sortOrder=ASCENDING&after=1657289356194_1>; rel="next"
When retrieving the headers from the HTTP call, the two values are merged into one:
<https://dev-web.okta.com/api/v1/logs?since=2022-07-07T20%3A19%3A.0837307Z&sortOrder=ASCENDING>; rel="self",<https://dev-web.okta.com/api/v1/logs?since=2022-07-07T20%3A19%3A.0837307Z&sortOrder=ASCENDING&after=1657289356194_1>; rel="next"
I am trying to find a way to pull out the Link header value that has rel="next" without having to add a load of conditional logic to check which side of the split has the rel="next". Any ideas?

Since the response is not in a structured format, one workaround that can meet the requirement is to extract the link using string functions such as lastIndexOf and slice in a Compose action. Here is my Logic App flow.
Below is the expression I'm using in my Compose action:
slice(outputs('Compose'),add(lastIndexOf(outputs('Compose'),'<'),1),lastIndexOf(outputs('Compose'),'>'))
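For what it's worth, the same extraction can also be done order-independently (without relying on the "next" entry coming last) by parsing the merged header with a regular expression. A minimal Python sketch of the idea, e.g. for use in an Azure Function; the function name is mine:

```python
import re
from typing import Optional

def next_link(link_header: str) -> Optional[str]:
    """Return the URL from a merged Link header whose rel is "next",
    regardless of which side of the comma it appears on."""
    # Each entry looks like: <https://...>; rel="self"
    for url, rel in re.findall(r'<([^>]+)>\s*;\s*rel="([^"]+)"', link_header):
        if rel == "next":
            return url
    return None
```

Unlike the lastIndexOf/slice approach, this keeps working even if the entries ever arrive in the other order.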
Below is the code-view of my logic app
{
"definition": {
"$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
"actions": {
"Compose": {
"inputs": "link : <https://dev1-web.okta.com/api/v1/logs?since=2022-07-07T20%3A19%3A.0837307Z&sortOrder=ASCENDING>; rel=\"self\"\nlink : <https://dev1-web.okta.com/api/v1/logs?since=2022-07-07T20%3A19%3A.0837307Z&sortOrder=ASCENDING&after=1657289356194_1>; rel=\"next\"",
"runAfter": {},
"type": "Compose"
},
"Compose_2": {
"inputs": "@slice(outputs('Compose'),add(lastIndexOf(outputs('Compose'),'<'),1),lastIndexOf(outputs('Compose'),'>'))",
"runAfter": {
"Compose": [
"Succeeded"
]
},
"type": "Compose"
}
},
"contentVersion": "1.0.0.0",
"outputs": {},
"parameters": {},
"triggers": {
"manual": {
"inputs": {
"schema": {}
},
"kind": "Http",
"type": "Request"
}
}
},
"parameters": {}
}
If you are receiving a JSON response instead, you can parse the response and extract the link directly.

Related

No Workflow Resource found in the template in Logic App

I am trying to copy the Logic App code from code view and paste it into the Logic App project that I created locally.
Once I have pasted the code and try to right-click and open the file with the Logic App designer, I get the error "No workflow resource found in the template in Logic App".
Requesting some help if someone has faced this problem before.
I would suggest you copy each section separately, such as triggers and actions, and paste them into the local project. I have done the same thing and am able to open the designer. Below are the steps I followed:
Created a standard Logic App in the portal.
Created a workflow as shown below.
Created a local project in VS Code by following the documentation.
{
"definition": {
"$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
"actions": {},
"contentVersion": "1.0.0.0",
"outputs": {},
"triggers": {}
},
"kind": "Stateful"
}
Copied the actions JSON from the portal to VS Code:
"actions": {
"Initialize_variable": {
"inputs": {
"variables": [
{
"name": "str",
"type": "string"
}
]
},
"runAfter": {},
"type": "InitializeVariable"
},
"Lists_blobs_(V2)": {
"inputs": {
"host": {
"connection": {
"referenceName": "azureblob"
}
},
"method": "get",
"path": "/v2/datasets/@{encodeURIComponent(encodeURIComponent('AccountNameFromSettings'))}/foldersV2/@{encodeURIComponent(encodeURIComponent('xxxx'))}",
"queries": {
"nextPageMarker": "",
"useFlatListing": false
}
},
"metadata": {
"xxxx": "/xxxx"
},
"runAfter": {
"Initialize_variable": [
"Succeeded"
]
},
"type": "ApiConnection"
}
}
Then copied the triggers code from the portal to VS Code:
"triggers": {
"manual": {
"inputs": {},
"kind": "Http",
"type": "Request"
}
}
Saved the workflow.json file in VS Code.
I am now able to open the designer from the local project.
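The same copy-paste can be scripted. A minimal Python sketch (the function name is mine, not part of the answer) that copies the actions and triggers sections from a portal code-view export into the local skeleton:

```python
import copy

def merge_portal_code(portal: dict, skeleton: dict) -> dict:
    """Copy the actions and triggers sections of a portal code-view
    export into a local workflow.json skeleton, leaving the
    skeleton's other fields (e.g. "kind") untouched."""
    merged = copy.deepcopy(skeleton)
    merged["definition"]["actions"] = portal["definition"]["actions"]
    merged["definition"]["triggers"] = portal["definition"]["triggers"]
    return merged
```

You would load both JSON documents with `json.load`, merge them, and write the result back to workflow.json.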

Trigger pipeline upon approval in Azure

I have a report which is generated after running an ADF pipeline, and I need to append these records to a history table upon client approval. So I need to use either Power BI or SharePoint to show this report to the client and get approval. I have the following plan; can someone please tell me if this is doable? If yes, how do I achieve it? If not, please suggest changes.
Show the report in either Power BI or SharePoint, with Approve/Reject buttons.
If the client clicks Approve, it should trigger a pipeline via a Logic App, with the necessary parameters passed.
If this is doable, can you please share references? If not, please let me know how I can achieve this functionality another way.
One workaround to achieve your requirement is to use Logic Apps and send the Power BI or SharePoint link by email, using the Send approval email action of the Outlook connector, where the email is sent with Approve/Reject user options.
The flow is kept on hold until a response is received.
(Screenshot: the approval email as it appears in my Outlook.)
(Screenshot: the run continues after the response is received.)
To continue the flow only when you receive an approved response, you can add a Condition action, check whether the response is Approve, and then continue the flow.
Below is the codeview of my Logic App
{
"definition": {
"$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
"actions": {
"Condition": {
"actions": {
"Compose": {
"inputs": "Your response has been accepted.",
"runAfter": {},
"type": "Compose"
}
},
"expression": {
"and": [
{
"equals": [
"@body('Send_approval_email')?['SelectedOption']",
"Approve"
]
}
]
},
"runAfter": {
"Send_approval_email": [
"Succeeded"
]
},
"type": "If"
},
"Send_approval_email": {
"inputs": {
"body": {
"Message": {
"Body": "https://microsoftapc.sharepoint.com/teams/Sample2408/Lists/SampleList/AllItems.aspx",
"HideHTMLMessage": false,
"Importance": "Normal",
"Options": "Approve, Reject",
"ShowHTMLConfirmationDialog": false,
"Subject": "Approval Request",
"To": "<Email_ID>"
},
"NotificationUrl": "@{listCallbackUrl()}"
},
"host": {
"connection": {
"name": "@parameters('$connections')['office365']['connectionId']"
}
},
"path": "/approvalmail/$subscriptions"
},
"runAfter": {},
"type": "ApiConnectionWebhook"
}
},
"contentVersion": "1.0.0.0",
"outputs": {},
"parameters": {
"$connections": {
"defaultValue": {},
"type": "Object"
}
},
"triggers": {
"manual": {
"inputs": {
"schema": {}
},
"kind": "Http",
"type": "Request"
}
}
},
"parameters": {
"$connections": {
"value": {
"office365": {
"connectionId": "/subscriptions/<Sub_Id>/resourceGroups/<RG>/providers/Microsoft.Web/connections/office365",
"connectionName": "office365",
"id": "/subscriptions/<Sub_Id>/providers/Microsoft.Web/locations/centralus/managedApis/office365"
}
}
}
}
}
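To complete the picture, the step inside the Condition's Approve branch would then trigger the ADF pipeline. Outside Logic Apps, the same call goes to the Data Factory createRun REST endpoint; a sketch that builds the URL (the subscription, resource group, factory, and pipeline names are placeholders, and you would POST this with an Azure AD bearer token plus an optional JSON body of pipeline parameters):

```python
def adf_create_run_url(subscription_id: str, resource_group: str,
                       factory: str, pipeline: str) -> str:
    """Build the Data Factory createRun endpoint; POSTing it
    (with a bearer token) starts a pipeline run."""
    return (
        "https://management.azure.com"
        f"/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        "/providers/Microsoft.DataFactory"
        f"/factories/{factory}"
        f"/pipelines/{pipeline}"
        "/createRun?api-version=2018-06-01"
    )
```

In a Logic App you would make the equivalent call with an HTTP action (or the Azure Data Factory connector's "Create a pipeline run" action) in the Approve branch.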

How to add to Azure Blob Storage from Logic App

I have an Azure Stream Analytics job which adds JSON data to an Azure Storage container as an output. The data is organized in a folder structure based on the date, as specified in the Stream Analytics output here:
ASA Output Details
I also have a Logic App from which I want to add data to the same place. I am looking at the Logic App Blob Storage actions and can't figure out how to do this. The Update blob action seems to want to point to a single blob file, rather than having the data integrated by date.
Is there a way to do this with the Logic Apps actions? Or maybe there is a better way to structure my data so that I can add events both from Stream Analytics and from Logic Apps?
Thanks!
To add blobs to the same folder structure you need to specify the path. To achieve your requirement I used the Create blob action and added blobs to the required folder.
If you are trying to automate this process to run every day and add files to that day's date path, then try the flow below.
You can reproduce the same in your environment using the below code view:
{
"definition": {
"$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
"actions": {
"Compose": {
"inputs": {
"Sample": "Sample Json from Logic Apps"
},
"runAfter": {},
"type": "Compose"
},
"Create_blob_(V2)": {
"inputs": {
"body": "@outputs('Compose')",
"headers": {
"ReadFileMetadataFromServer": true
},
"host": {
"connection": {
"name": "@parameters('$connections')['azureblob']['connectionId']"
}
},
"method": "post",
"path": "/v2/datasets/@{encodeURIComponent(encodeURIComponent('AccountNameFromSettings'))}/files",
"queries": {
"folderPath": "/container1/@{formatDateTime(utcNow(),'yyyy')}/@{formatDateTime(utcNow(),'MM')}/@{formatDateTime(utcNow(),'dd')}",
"name": "LAEvent.json",
"queryParametersSingleEncoded": true
}
},
"runAfter": {
"Compose": [
"Succeeded"
]
},
"runtimeConfiguration": {
"contentTransfer": {
"transferMode": "Chunked"
}
},
"type": "ApiConnection"
}
},
"contentVersion": "1.0.0.0",
"outputs": {},
"parameters": {
"$connections": {
"defaultValue": {},
"type": "Object"
}
},
"triggers": {
"manual": {
"inputs": {
"schema": {}
},
"kind": "Http",
"type": "Request"
}
}
},
"parameters": {
"$connections": {
"value": {
"azureblob": {
"connectionId": "/subscriptions/<SUB_ID>/resourceGroups/<RG>/providers/Microsoft.Web/connections/azureblob",
"connectionName": "azureblob",
"id": "/subscriptions/<SUB_ID>/providers/Microsoft.Web/locations/centralus/managedApis/azureblob"
}
}
}
}
}
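The folderPath in the code view above is just date formatting with formatDateTime(utcNow(), ...); the equivalent logic, sketched in Python for anyone reproducing the layout outside Logic Apps (the function name is mine):

```python
from datetime import datetime, timezone
from typing import Optional

def dated_folder_path(container: str, now: Optional[datetime] = None) -> str:
    """Build the same /container/yyyy/MM/dd path that the
    formatDateTime(utcNow(), ...) expressions produce."""
    now = now or datetime.now(timezone.utc)
    return f"/{container}/{now:%Y}/{now:%m}/{now:%d}"
```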
In Storage account:
Updated Answer
If you want to add data to an existing blob, you can concatenate the data already present in the blob with the new data from the Logic App using the concat function:
concat(body('Get_blob_content_(V2)'),outputs('Compose'))
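One caveat: concat is plain string concatenation, so joining two JSON objects this way does not produce a valid JSON document. If the blob holds a JSON array of events, merging the parsed arrays keeps it valid; a Python sketch of that idea (the function name is mine):

```python
import json

def append_events(existing_blob: str, new_events: list) -> str:
    """Parse the blob's JSON array, extend it with the new events,
    and serialize it back for an Update blob action."""
    events = json.loads(existing_blob) if existing_blob.strip() else []
    events.extend(new_events)
    return json.dumps(events)
```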

Azure Logic App and Function App performance difference

We are in the process of setting up a new project. Our requirement is to invoke multiple REST APIs, aggregate the responses, and send the result back to a mobile client.
We are exploring these two options for our experience (integration) layer:
1. Logic Apps
2. Azure Function
We have observed one major difference with respect to performance between these two.
We ran through a simple use case to compare the performance:
invoking a REST API to get some metrics, with the different options available:
Just integrate with APIM as back-end service
Using Azure Function
Using Logic Apps
Below are the metrics.
The Logic App takes longer to execute compared to the other options. Below is the simple Logic App that invokes the REST API:
{
"definition": {
"$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
"actions": {
"GetReferenceData": {
"inputs": {
"headers": {
"Authorization": "@variables('AuthToken')"
},
"method": "GET",
"uri": "url"
},
"runAfter": {
"Initialize_AuthToken": [
"Succeeded"
]
},
"type": "Http"
},
"Initialize_AuthToken": {
"inputs": {
"variables": [
{
"name": "AuthToken",
"type": "string",
"value": "@{triggerOutputs()['headers']?['Access-Token']}"
}
]
},
"runAfter": {},
"type": "InitializeVariable"
},
"Response": {
"inputs": {
"body": "@body('GetReferenceData')",
"statusCode": "@outputs('GetReferenceData')['statusCode']"
},
"kind": "Http",
"runAfter": {
"GetReferenceData": [
"Succeeded"
]
},
"type": "Response"
},
"Response_2": {
"inputs": {
"body": "@body('GetReferenceData')",
"statusCode": "@outputs('GetReferenceData')['statusCode']"
},
"kind": "Http",
"runAfter": {
"GetReferenceData": [
"Failed",
"Skipped",
"TimedOut"
]
},
"type": "Response"
}
},
"contentVersion": "1.0.0.0",
"outputs": {},
"parameters": {
"storageLocation": {
"defaultValue": [],
"type": "Array"
}
},
"triggers": {
"manual": {
"inputs": {
"method": "GET",
"relativePath": "/referenceData",
"schema": {}
},
"kind": "Http",
"type": "Request"
}
}
},
"parameters": {}
}
We have many use cases where we need to invoke multiple REST APIs and aggregate the results. With the above numbers it seems the Function App is doing a far better job than the Logic App. For parallel operations I may rely on Durable Functions over Logic Apps.
So I just want to understand: why is the Logic App taking longer, almost double the time, compared to the Function for a similar operation?
Is a Logic App not meant for these operations?
Refer to another thread where this question is already answered: Is Logic Apps performance slower compared to a direct .NET REST Call?
The following addition may provide more insight:
“Azure Functions is code being triggered by an event, whereas Logic Apps is a separate framework of workflow triggered by an event.”
The Logic App is a logical container for one workflow you can define using triggers and actions.
A Logic App runs on a set of infrastructure in an Azure region (VMs in a data centre), and it consists of several components not visible to you, as they are abstracted away. By provisioning a Logic App, you leverage a bit of that infrastructure (indirectly via the Logic Apps service) once you define a workflow and the flow gets triggered.
Azure Functions are part of the Azure Web + Mobile suite of App Services and are designed to enable the creation of small pieces of meaningful, reusable methods, easily shared across services.
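When comparing the options yourself, it helps to measure medians over repeated calls rather than a single request, since cold starts skew the first sample. A small, generic timing harness (a sketch, not tied to any particular endpoint):

```python
import statistics
import time

def measure_latency(call, runs: int = 20) -> float:
    """Median wall-clock latency, in milliseconds, of invoking
    `call` (e.g. a lambda that issues the HTTP request)."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        call()
        samples.append((time.perf_counter() - start) * 1000.0)
    return statistics.median(samples)
```

For example, `measure_latency(lambda: urllib.request.urlopen(url).read())` run against the Logic App, Function, and APIM endpoints in turn gives comparable numbers.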

Template deployment to Azure API management with swagger fails with 'path' must not be empty

I am trying to create an API and operations in Azure API Management using the swagger import feature, with a template derived from the documentation at https://learn.microsoft.com/en-us/azure/templates/microsoft.apimanagement/2018-01-01/service/apis
Every time I deploy my API to Azure API Management using my Azure Resource Manager template, I get the error 'path' must not be empty. What am I doing wrong? Path is definitely not empty!
For this example you can use any valid swagger file contents, such as https://petstore.swagger.io/v2/swagger.json
{
"$schema": "http://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
"contentVersion": "1.0.0.0",
"parameters": {
"apim_name": {
"type": "string"
},
"api_name": {
"type": "string"
},
"swagger_json": {
"type": "string"
}
},
"variables": {},
"resources": [
{
"type": "Microsoft.ApiManagement/service/apis",
"name": "[concat(parameters('apim_name'), '/' ,parameters('api_name'))]",
"apiVersion": "2018-06-01-preview",
"properties": {
"displayName": "Pet Store",
"description": "Cool api def",
"serviceUrl": "https://petstore.swagger.io/v2",
"path": "petstore",
"protocols": [
"https"
],
"authenticationSettings": {
"oAuth2": null,
"openid": null,
"subscriptionKeyRequired": true
},
"subscriptionKeyParameterNames": {
"header": "Ocp-Apim-Subscription-Key",
"query": "subscription-key"
},
"contentValue": "[parameters('swagger_json')]",
"contentFormat": "swagger-json"
}
}
]
}
It seems the API Management Resource Manager APIs are fussy about parameters when using the swagger import feature, and the docs and error messages are a little lacking.
The secret is that the swagger file definition replaces most of the properties you would normally pass for an API in the template, so you need a much reduced template, as below.
Hope this helps someone else!
{
"$schema": "http://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
"contentVersion": "1.0.0.0",
"parameters": {
"apim_name": {
"type": "string"
},
"api_name": {
"type": "string"
},
"swagger_json": {
"type": "string"
}
},
"resources": [
{
"type": "Microsoft.ApiManagement/service/apis",
"name": "[concat(parameters('apim_name'), '/' ,parameters('api_name'))]",
"apiVersion": "2018-06-01-preview",
"properties": {
"path": "petstore",
"contentValue": "[parameters('swagger_json')]",
"contentFormat": "swagger-json"
}
}
]
}
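One gotcha when deploying this template: swagger_json is a string parameter, so the whole swagger document has to be embedded as a single escaped JSON string in the parameters file. A Python sketch that builds such a parameters file (the function name is mine):

```python
import json

def build_parameters_file(apim_name: str, api_name: str,
                          swagger: dict) -> str:
    """Produce an ARM deployment parameters file in which the
    swagger document is embedded as one escaped JSON string."""
    return json.dumps({
        "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
        "contentVersion": "1.0.0.0",
        "parameters": {
            "apim_name": {"value": apim_name},
            "api_name": {"value": api_name},
            # json.dumps here is the key step: the dict becomes a string
            "swagger_json": {"value": json.dumps(swagger)},
        },
    }, indent=2)
```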
