Check the response of a 'Send Event' Logic App action - azure

I have a Logic App that sends an event to a specified Event Hub using the Send Event action.
It seems that the Logic App continues on regardless of whether or not the event is accepted by the specified Event Hub. Unlike the Azure Functions action, there appears to be no automatically generated StatusCode property available for the Send Event action.
Is it possible to check the response from Event Hubs so that I may determine whether or not to halt execution?
Update
After a completed run, it seems that there is a status code returned by Event Hubs, although unusually it seems to be 200, whereas typically when sending events it's 201.
However, when editing the Logic App, there doesn't seem to be any way of accessing that status code in order to check the success/failure of the Send Event action.

You should be able to use @outputs('Send_event')?['statusCode'] to access the status code.
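A minimal sketch of how that check could look in the workflow's code view, assuming the action is named Send_event and that 200 is the success code observed above; the Condition and Terminate actions here are illustrative:

```json
"Check_send_event": {
  "type": "If",
  "expression": {
    "and": [
      { "equals": [ "@outputs('Send_event')?['statusCode']", 200 ] }
    ]
  },
  "actions": {},
  "else": {
    "actions": {
      "Terminate_on_failure": {
        "type": "Terminate",
        "inputs": {
          "runStatus": "Failed",
          "runError": {
            "code": "SendEventFailed",
            "message": "Event Hubs did not accept the event."
          }
        },
        "runAfter": {}
      }
    }
  },
  "runAfter": { "Send_event": [ "Succeeded", "Failed" ] }
}
```

In the designer this is simply a Condition comparing that expression to 200, with a Terminate action in the false branch.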

Related

How to secure reliable publication when sending an event about a successful DB insertion to Event Hub?

Context:
1. In an Azure Function with an EventHubTrigger, I save data mapped from the handled event to a database (through Entity Framework). This action performs synchronously.
2. Trigger a new event about the successful data insertion using an Event Hubs producer. This action is async.
3. Handle that triggered event somewhere else.
I guess something might fail while saving the data, so I am wondering how to prevent inconsistency and ensure that the event is not sent when it should not be.
As far as I know, Azure Event Hubs has no outbox pattern implemented yet, so I guess I would need to mimic it somehow.
I am also considering an alternative and somewhat smelly solution: make the publish method in step 2 synchronous (even though the nature of event-driven design is to be async) and add an additional check between step 1 and step 2 to make sure that everything is saved in the DB. Only if that condition is fulfilled is the event triggered (step 3).
Any advice?
There's nothing in the SDK that would manage distributed transactions on your behalf. The simplest approach would likely be having a column in your database that allows you to mark when the event was published, and then have your function flow:
1. Write to the database with the "event published" flag unset; on failure, abort.
2. Publish the event; on failure, abort. (The data stays written.)
3. Write to the database again to set the "event published" flag.
You'd need a second Function running on a timer that scans your database for rows older than XX minutes that still need an event, and then performs steps 2 and 3 from your initial flow. In failure scenarios, you will have some latency between the data being written and the event being published, or you may see duplicate events. (Event Hubs has an at-least-once guarantee, so you'll need to be able to handle duplicates regardless.)
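A rough sketch of that flow, assuming a JavaScript/TypeScript Function with the @azure/event-hubs SDK; saveRecord and markEventPublished stand in for whatever data-access layer you use (the question uses Entity Framework), and the hub name and settings are made up:

```typescript
import { EventHubProducerClient } from "@azure/event-hubs";

// Stand-ins for your data-access layer (hypothetical):
declare function saveRecord(payload: unknown, flags: { eventPublished: boolean }): Promise<string>;
declare function markEventPublished(recordId: string): Promise<void>;

const producer = new EventHubProducerClient(
  process.env.EVENTHUB_CONNECTION!, // assumed app setting
  "insertion-events"                // hypothetical hub name
);

export async function handleIncomingEvent(payload: unknown): Promise<void> {
  // 1. Write to the database with the "event published" flag unset.
  //    If this throws, abort: nothing was written and nothing was published.
  const recordId = await saveRecord(payload, { eventPublished: false });

  // 2. Publish the follow-up event. If this throws, the row stays in the
  //    database with the flag unset, and the timer-triggered sweeper
  //    described above can retry publication later.
  await producer.sendBatch([{ body: { recordId, kind: "db-insertion-succeeded" } }]);

  // 3. Set the "event published" flag. If this last step fails, the sweeper
  //    will republish, so consumers must tolerate duplicates (Event Hubs is
  //    at-least-once anyway).
  await markEventPublished(recordId);
}
```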

How to retry sending events from Azure Event Grid to Logic Apps

I have an event grid which publishes a lot of events, and a logic app which needs to consume some of them. These events aren't guaranteed to be in order, and events that require another event to be processed first might end up in the logic app prematurely, causing them to fail.
From the documentation, I can see that event grid supports a retry policy, with an increasing time interval. This would solve my problem.
However, it seems like the logic app in question always acknowledges events from the event grid, even though the process is stopped early with the Terminate action in the failure state and with an error code.
From the logic app overview, the runs are shown as failed. But the event grid never attempts a retry, and seems to consider the events successful. What can I do to make the event grid retry failed logic app runs?
It seems that once the Azure Logic App is triggered, the event in Azure Event Grid is considered to be processed.
I think you can configure a retry policy at the step where your Azure Logic App failed; please refer to Retry policies.
Take the HTTP action as an example:
You can click ··· in the upper right corner of the HTTP action, then click Settings, and select the type you want under Retry Policy.
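For reference, the same setting appears in the workflow's code view as a retryPolicy object on the action's inputs; the action name and values below are just an illustration:

```json
"HTTP": {
  "type": "Http",
  "inputs": {
    "method": "POST",
    "uri": "https://example.com/process",
    "retryPolicy": {
      "type": "exponential",
      "count": 4,
      "interval": "PT15S",
      "maximumInterval": "PT1H"
    }
  },
  "runAfter": {}
}
```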
Event Grid will retry depending on how you terminate your Logic App. If you terminate using an HTTP Response action (status code 500), then Event Grid will attempt retries.
So, depending on what is going on in your Logic App, handle failures in a way that ends the run with an HTTP Response action returning status code 500.
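A hedged sketch of that wiring in code view, assuming the workflow is invoked by Event Grid over HTTP and that Process_event is the step that can fail (both names are placeholders); the Response action returns 500 so Event Grid treats the delivery as failed and retries:

```json
"Respond_with_failure": {
  "type": "Response",
  "kind": "Http",
  "inputs": {
    "statusCode": 500,
    "body": { "error": "Processing failed, please redeliver." }
  },
  "runAfter": {
    "Process_event": [ "Failed", "TimedOut" ]
  }
}
```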

Autodesk Forge Webhooks API publish completed webhook

Is there a webhook that notifies that a model publish operation has been completed (succeeded/failed)?
I found a webhook that notifies that the operation has started. However, it is not useful for me because I need to copy the model to another directory after the operation is completed and the model is updated.
As per the documentation of the model.publish event, the payload.state property in the webhook callback represents:
Reason why the notification was triggered. Possible values are PUBLISHING_PENDING and PUBLISHING_IN_PROGRESS.
Note: There is no PUBLISHING_COMPLETE state. Use the dm.version.added event to receive notifications when publishing is finished.
So, consider using the dm.version.added event.
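A hedged sketch of registering that webhook through the Webhooks API, assuming a folder-scoped hook and an already-acquired access token; the endpoint and payload shape follow the Webhooks API docs as I recall them, so verify against the current reference:

```typescript
// Registers a dm.version.added webhook so you are notified once a new
// version shows up (i.e. after publishing has actually finished).
// callbackUrl and folderUrn are yours; this is a sketch, not a drop-in.
async function registerVersionAddedHook(
  accessToken: string,
  folderUrn: string,
  callbackUrl: string
): Promise<void> {
  const res = await fetch(
    "https://developer.api.autodesk.com/webhooks/v1/systems/data/events/dm.version.added/hooks",
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${accessToken}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        callbackUrl,                  // your HTTPS endpoint receiving the callback
        scope: { folder: folderUrn }, // the folder whose versions you care about
      }),
    }
  );
  if (!res.ok) {
    throw new Error(`Webhook registration failed: ${res.status} ${await res.text()}`);
  }
}
```

In the dm.version.added callback you can then kick off the copy of the model to the other directory.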

Pass HTTP request from Azure Function through Event Grid

I've started thinking through a prototype architecture for a system I want to build based on Azure Functions and Event Grid.
What I would like to achieve is to have a single point of entry (Function) which a variety of external vendors will send Webhook (GET) HTTP requests to. The purpose of the Function is to add some metadata to the payload, and publish the package (metadata + original payload from vendor) to an Event Grid. The Event Grid will then trigger another Function, whose purpose is to respond to the original Webhook HTTP request with e.g. a status 204 HTTP code.
The diagram below is a simplified version of the architecture; the Event Grid will of course publish events to other Functions as well, but for the sake of simplicity…
The challenge I'm facing at the moment is that the context of the original Webhook HTTP request from external vendor is lost after the first Function is triggered. Trying to send the context as part of the event payload to Event Grid feels like an anti-pattern, and regardless I cannot get it working (the .done() function is lost somewhere in the event). Trying to just use context.res = {} and context.done() in the last Function won't respond to the vendor's original HTTP request.
Any ideas here? Is the whole architecture just one big anti-pattern -- will it even work? Or do I have to immediately send the HTTP response in the first Function triggered by the vendor's request?
Thank you!
You are mixing two different patterns: message-driven and event-driven.
Azure Event Grid is a distributed Pub/Sub eventing push model, where the subscriber registers an interest in the source in a loosely decoupled manner.
In your scenario, you want to use an eventing model inside a synchronous request-response message exchange. The request's message exchange context cannot flow through the Pub/Sub eventing model and back to an anonymous endpoint that would serve as the point for the response message.
However, there are several options for "logically" integrating these two different patterns; the following are some of them:
using a request/replyTo message exchange pattern, i.e. full-duplex communication, with one channel for the request and another for the reply.
using a request-response message exchange pattern with a polled state. Basically, your first Function waits for the subscriber's state and then returns to the caller. In a distributed internet architecture, an Azure lease blob can be used to share that state between the sync part and the async eventing part.
In your scenario, the first Azure Function creates this lease blob, fires an event to Azure Event Grid, and then periodically polls the state in the lease blob until the aggregation process finishes (multiple subscribers, etc.).
Also, for this kind of pattern, you can use Azure Durable Functions to simplify integration with the event-driven Event Grid model.
The following screen snippet shows a sequence diagram using an Azure lease blob for sharing a "Request State" in the distributed model. Note that this pseudo sync/async pattern is suitable for cases where the request-response completes within a short time, less than 60 seconds.
For more details about using a Lease Blob within the Azure Function, see my answer here.
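A rough sketch of the second option above (request-response with a polled state), assuming a JavaScript/TypeScript HTTP-triggered Function; the container name, metadata key, and event shape are made up, and the original answer uses a lease blob, so this only illustrates the polling idea:

```typescript
import { BlobServiceClient } from "@azure/storage-blob";
import { EventGridPublisherClient, AzureKeyCredential } from "@azure/eventgrid";

const blobService = BlobServiceClient.fromConnectionString(process.env.STORAGE_CONNECTION!);
const topic = new EventGridPublisherClient(
  process.env.TOPIC_ENDPOINT!, "EventGrid", new AzureKeyCredential(process.env.TOPIC_KEY!));

export async function handleVendorWebhook(requestId: string, payload: unknown) {
  // 1. Create a state blob that the subscriber(s) will update when done.
  const blob = blobService.getContainerClient("request-state").getBlockBlobClient(requestId);
  await blob.upload("", 0, { metadata: { state: "pending" } });

  // 2. Publish the event to Event Grid, carrying the request id so the
  //    subscribers can find and update the state blob.
  await topic.send([{ eventType: "Vendor.WebhookReceived", subject: requestId, dataVersion: "1.0", data: payload }]);

  // 3. Poll the blob until a subscriber marks it complete (keep the total
  //    wait well under the HTTP timeout, ~60 seconds as noted above).
  for (let i = 0; i < 25; i++) {
    const props = await blob.getProperties();
    if (props.metadata?.state === "done") {
      return { status: 204 };            // answer the vendor's original request
    }
    await new Promise((resolve) => setTimeout(resolve, 2000));
  }
  return { status: 504 };                // gave up waiting
}
```

If you use Durable Functions instead, the polling loop is typically replaced by waiting for an external event raised by the subscriber.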

How to control the concurrency of Azure Event Grid trigger

I found host.json, which describes how to control the behavior of my Function app, but it doesn't show any entries for the Event Grid trigger.
Since the publisher (in my case, events related to Blob Storage) sends HTTP requests to my function, I was wondering whether that means I can control the Event Grid trigger with the HTTP configuration. By the way, I'd prefer not to implement a custom HTTP trigger to handle the events, but if that's the only way, I may have to accept it.
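For reference, the HTTP configuration the question refers to lives under extensions.http in host.json (Functions runtime v2+); whether those settings apply to Event Grid trigger deliveries is exactly what is being asked here, so treat this purely as an illustration of that configuration surface, with placeholder values:

```json
{
  "version": "2.0",
  "extensions": {
    "http": {
      "maxOutstandingRequests": 200,
      "maxConcurrentRequests": 16
    }
  }
}
```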
