Custom Logic App connector - Azure

We are creating a multi-tenant application. To allow users to create business logic, we want to use Logic Apps.
Therefore I want to create a web app which will expose the DocumentDB change feed.
When creating a logic app, you can choose between different out of the box connectors. How can we get ours included in the list? Is there any documentation on that?
The idea is to get the logic app running with every document insert.
To achieve this, I have two options: polling triggers and webhook triggers.
I prefer the polling trigger because it will be less work than implementing logic to handle all the subscribed URLs per tenant. Does anyone have concerns or suggestions about this approach?
The Location header should become my continuation token from the DocumentDB change feed, is that correct? The flow I have in mind:
1. The Logic App calls my API for the first time, without a Location header.
2. My API calls DocumentDB without a continuation token, which returns the documents one by one, because the max document count is set to 1.
3. My API returns the first document retrieved and sets Retry-After to 0 and Location to the new continuation token it received. If no documents are found, the API responds as in step 5.
4. The Logic App starts a new instance to handle the document and calls the API again with the continuation token in the header. Steps 3 and 4 repeat until all documents are processed; because each Logic App instance processes only one document, Azure should be able to scale for me automatically, right?
5. When all documents are processed, the API returns a 202 status code with the Location header set to the latest continuation token and Retry-After set to 15. After 15 seconds, the Logic App calls my API with the latest continuation token, which triggers the process again.
Is my solution feasible? And if I need to stop or clone the Logic App configuration for some reason, how can I know what the latest continuation token was, or do I need to save my continuation tokens in some data store?

Yes, what you've described here should be supported. You can use your own connector in a Logic App by clicking the dropdown above the search box and selecting to use an API from API Management or App Services, as detailed here and here.
The continuation token can be preserved in the "trigger state" of the Location header, assuming you are using the 202 polling pattern above. So, for example, the header may be https://mydocdbconnector.azurewebsites.net/api/trigger?triggerstate={thisCouldBeTheContinuationToken} -- that way, on subsequent polls, the last continuation token is sent back to the trigger and can be used in the operation. Trigger state is preserved as long as the trigger remains unchanged in the definition (enabling/disabling/etc. all preserve trigger state).
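For illustration, here is a minimal sketch of that 202 polling contract in Python with Flask. The names are assumptions: get_next_document() is a hypothetical helper wrapping the DocumentDB change feed read with a max item count of 1, returning the next document (or None) and the new continuation token.

```python
# Minimal sketch of the 202 polling-trigger contract described above.
# get_next_document() is hypothetical; only the HTTP contract matters here.
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/api/trigger")
def trigger():
    continuation = request.args.get("triggerstate")  # None on the very first poll
    doc, new_token = get_next_document(continuation)
    location = f"/api/trigger?triggerstate={new_token}"
    if doc is None:
        # Nothing new: 202 plus Retry-After tells Logic Apps when to poll again.
        return "", 202, {"Location": location, "Retry-After": "15"}
    # 200 fires the trigger with this document; Retry-After 0 makes the next
    # poll (carrying the new continuation token) happen immediately.
    return jsonify(doc), 200, {"Location": location, "Retry-After": "0"}
```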
The only part I'm not clear on is the multi-tenant requirements you have. I assume you mean you want each of the users to be able to trigger on their own DocumentDB instance -- the best supported pattern for this today is to have a Logic App per customer, each with its own trigger and trigger state. This could leverage a custom connector as well. This is the pattern used by services like Microsoft Flow, which are built on Logic Apps.
Let me know if that helps.

Related

Azure API Management: How to count the number of requests without a subscription header

My users use an APIM API endpoint as a webhook for a third-party service, and I need to show usage of this endpoint in User Reports.
The third-party API does not support headers, so I can't pass the user's Ocp-Apim-Subscription-Key subscription key to it, and the request will be made anonymously.
As far as I understand, that means the request will not be counted in User Reports.
But I can use a token URI parameter to manually look up the subscription ID and its keys with a send-request policy.
If I do this, is there a way to add the Ocp-Apim-Subscription-Key header to the running request so that it is performed on behalf of the user's subscription?
So far, I can only think of wrapping the required request in another APIM request that uses send-request and set-header policies, like this:
> POST /endpoint/telegram/public/token123
>> <send-request>GET /token123/keys/primary</send-request>
> POST /endpoint/telegram/token123 +H 'Ocp-Apim-Subscription-Key:key123'
I returned to this question after a couple of days and feel extremely embarrassed now.
To perform a request on behalf of the user, I can pass the API key in the query string instead of the header; there is literally a separate setting for that.
That solves everything.
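For example (a hedged sketch: the URL and key are placeholders, and "subscription-key" is APIM's default query parameter name, which can be renamed in the API's settings):

```python
# Calling an APIM-fronted endpoint with the subscription key passed in the
# query string instead of the Ocp-Apim-Subscription-Key header, so the call
# is attributed to the user's subscription in the usage reports.
import requests

resp = requests.post(
    "https://myapim.azure-api.net/endpoint/telegram/token123",  # placeholder URL
    params={"subscription-key": "key123"},  # placeholder key
    json={"update_id": 1},  # whatever payload the third-party service sends
)
print(resp.status_code)
```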
You may try integrating with Application Insights to monitor details: How to integrate Azure API Management with Azure Application Insights
Also, you can leverage the metrics to analyze request patterns.
You can also create alerts on any condition to notify you: https://learn.microsoft.com/en-us/azure/api-management/api-management-howto-use-azure-monitor

Peek and Complete Message using different Receiver Instances - Azure Service Bus

Scenario
When business transactions are performed, we're supposed to make that data available to end clients.
Current Design
Our web app publishes transaction messages to a topic on Azure Service Bus.
We expose APIs to clients through which they can consume the data from those transactions.
Upon calling these APIs, we read the messages from the subscription and return them to the client.
Problem
We want a guaranteed delivery - we want to make sure the client acknowledges the delivery of the data. So we don't want to remove the message from the subscription immediately. We want to keep it until the client acknowledges it.
So we only want to do a "Peek" instead of "Receive".
So the client calls the first API, to get the data, where we do a Peek.
And once the client has received the packets, the client would call a second API, to acknowledge.
At this point, we want to remove the message from the Subscription, making it Complete.
Per the documentation, the current design of the Service Bus message receiver is that a Complete can be performed only by the same receiver instance that performed the Peek, and we observed the same when we tried it out.
The two APIs are separate, so we cannot do the Peek and Complete using the same receiver instance.
We are thinking about options to somehow make the receiver a singleton across the APIs within that App Service.
However, this will be a problem when the App Service scales out.
Is there a different way to achieve what we're trying to do here ?
There is an option in Azure Service Bus to defer messages. Once a message is deferred, it can be received with the help of its sequence number.
The first client should receive the message and, instead of completing it, defer it and return it.
The second client (which has the sequence number) can then receive the deferred message from the subscription. Refer here for more details.
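A minimal sketch with the azure-servicebus Python SDK (v7). The connection string, topic, and subscription names are placeholders; in a real app the sequence number would be returned to the client and persisted between the two API calls.

```python
from azure.servicebus import ServiceBusClient

CONN_STR = "Endpoint=sb://..."  # placeholder connection string

def get_data_api():
    # First API: receive the message, defer it (it stays in the
    # subscription), and return its body plus the sequence number.
    with ServiceBusClient.from_connection_string(CONN_STR) as client:
        with client.get_subscription_receiver(
            topic_name="transactions", subscription_name="client-a"
        ) as receiver:
            for msg in receiver.receive_messages(max_message_count=1, max_wait_time=5):
                seq = msg.sequence_number
                receiver.defer_message(msg)
                return str(msg), seq  # client acknowledges later using seq

def ack_api(seq):
    # Second API (can run on any instance): fetch the deferred message by
    # sequence number and complete it, removing it from the subscription.
    with ServiceBusClient.from_connection_string(CONN_STR) as client:
        with client.get_subscription_receiver(
            topic_name="transactions", subscription_name="client-a"
        ) as receiver:
            for msg in receiver.receive_deferred_messages(sequence_numbers=[seq]):
                receiver.complete_message(msg)
```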
Another option would be to not use a Service Bus client on your backend; instead, your clients could work with Service Bus directly through its REST API (assuming they can't use the AMQP client, if I understand your scenario correctly).
There are APIs to:
Peek-Lock
Renew Lock
Unlock
Delete (Complete)
You could also proxy these requests if you'd like using your backend itself or a service like APIM if you are already using it.
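A rough sketch of that flow over REST, assuming placeholder values for the namespace, entity path, and SAS token. The peek-lock response's BrokerProperties header carries the MessageId and LockToken needed to complete the message later, from any instance:

```python
# Peek-lock in one API call, complete (Delete) in another, correlated via
# MessageId + LockToken from the BrokerProperties response header.
import json
import requests

BASE = "https://mynamespace.servicebus.windows.net/transactions/subscriptions/client-a"
HEADERS = {"Authorization": "SharedAccessSignature sr=..."}  # placeholder SAS token

# API 1: peek-lock the next message (non-destructive read; 201 on success).
resp = requests.post(f"{BASE}/messages/head", params={"timeout": 30}, headers=HEADERS)
resp.raise_for_status()
payload = resp.content  # hand this to the client
props = json.loads(resp.headers["BrokerProperties"])  # MessageId, LockToken, ...

# API 2 (any instance): client acknowledged, so delete (Complete) the message.
requests.delete(
    f"{BASE}/messages/{props['MessageId']}/{props['LockToken']}",
    headers=HEADERS,
)
```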
PS: Cross-posting the answer for the same query on the MSDN forum.

Custom input & output fields for Azure Function

I am not sure if this is an option. I need it badly, and cannot find any references to it.
When integrating an HTTP-triggered Azure Function into your Logic App flow, you are asked to pass a body object for the function to digest:
I am hoping for a way to customize the Request Body inputs, to give it a strict template structure, in and out.
In:
Out:
(The function returns an Object)
Is there any way I can achieve this?
You could create a Logic Apps Custom Connector.
You can create one of these from a Postman collection or using an OpenAPI definition. You can get hold of the OpenAPI definition from the Function App.
With a custom connector, you have a bit more control over the request and response. This will allow the users of your connector to provide inputs when used in a Logic App and also receive tokens from the response. These can then be used in further Logic App steps.
I just hard-coded the fields I require using the Logic App syntax, and it responds to satisfaction:
@{body('function-name')?['property']}
I haven't tried the Custom Connector.

Made2Manage ERP Web service method information

I am using the Made2Manage ERP web service but want to know about its methods.
Which method can I use to get Customer and Sales Order information?
HTTP GET
Use GET requests to retrieve resource representation/information only – and not to modify it in any way. As GET requests do not change the state of the resource, these are said to be safe methods. Additionally, GET APIs should be idempotent, which means that making multiple identical requests must produce the same result every time until another API (POST or PUT) has changed the state of the resource on the server.

How to make an Approval step in an Azure Logic App calling my own APIs, similar to the Office 365 approval connector?

I want to build a small workflow using Azure Logic Apps that contains an "Approval" step, which is simply an API call into my own system, similar to the Office 365 approval connector.
However, from what I found on the internet, the only way to make a long-running task in Azure Logic Apps is to use webhooks.
With webhooks, I could not set a value for the parameter I created, "Bool-Approved". So how can I check it later in a condition step?
Another possible solution might be to use Swagger to expose a "Bool-Approved" parameter. However, that does not support long-running actions!
What's the possible solution for me?
As you mentioned, the way to do it is to use the Webhook action, and for that you need to implement the Subscribe/Unsubscribe pattern described here. The webhook action will allow you to get any payload (via an HTTP Post) from the instance-based webhook you are subscribing to.
The points below are a summary of this blog post:
https://www.mexia.com.au/correlation-identifier-pattern-on-logic-apps/
To implement the Subscribe/Unsubscribe webhook pattern you need to consider the following (a sketch follows the list):
Subscription store: a database to store the unique message ID and the instance-based callback URL provided by the webhook action.
Subscribe and Start Request Processing API: a RESTful API that is in charge of starting the processing of the request and storing the subscription.
Unsubscribe and Stop Request Processing API: another RESTful API that is only called if the webhook action on the main workflow times out. It is in charge of stopping the processing and deleting the subscription from the store.
Instance-based webhook: this webhook is triggered by your own custom approval event. Once triggered, it gets the instance-based callback URL from the store and invokes it. After the call back to the main workflow instance, the subscription is deleted. This is the webhook that sends the payload you require to the waiting webhook action in your Logic App.
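A minimal sketch of those pieces, assuming Flask, an in-memory dict as the subscription store (a real one must be durable), and hypothetical property names in the subscribe payload:

```python
from flask import Flask, request
import requests

app = Flask(__name__)
subscriptions = {}  # message id -> instance-based callback URL

@app.route("/subscribe", methods=["POST"])
def subscribe():
    body = request.get_json()
    # Logic Apps sends its instance-based callback URL in the webhook
    # action's subscribe call; the property names here are assumptions.
    subscriptions[body["messageId"]] = body["callbackUrl"]
    # ... start processing the approval request here ...
    return "", 200

@app.route("/unsubscribe", methods=["POST"])
def unsubscribe():
    # Called only if the webhook action on the main workflow times out.
    subscriptions.pop(request.get_json()["messageId"], None)
    return "", 200

def on_approval(message_id, approved):
    # Your custom approval event: invoke the stored callback URL to resume
    # the waiting workflow, sending the payload the condition step will use.
    url = subscriptions.pop(message_id, None)
    if url:
        requests.post(url, json={"Bool-Approved": approved})
```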
The subsequent actions will be able to use that response body, so you can implement your conditions, etc.
You can follow the blog post mentioned above to see a detailed example and get more details on how to implement it.
Make your API return HTTP code 200 if the response is "ok" and 400 if the response is "not ok". This way you can force the Logic App to behave the way you need it to behave.
