I know that there is an Event Grid trigger for Azure Functions, and I have successfully used it in the past.
What I need now is to trigger another Azure Function only after TWO different events have occurred (the order of the events may vary).
In other words what I need is:
Event A occurred -> After some time Event B occurred -> Immediately execute Azure Function X
or
Event B occurred -> After some time Event A occurred -> Immediately execute Azure Function X
Is there an Azure service that allows merging or combining events from two different sources into a single event? Or should I somehow persist the information about which events have occurred? Any suggestions?
This can be achieved via various services, e.g.:
Azure Logic Apps
You can execute your workflow based on conditions. More Reference
Using Service Bus
Treat each message as an event; based on the value of a property inside the message, you can determine what to do with the event.
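For example, here is a minimal sketch with the @azure/service-bus SDK; the connection string, queue name and the eventType application property are placeholder assumptions:

import { ServiceBusClient } from "@azure/service-bus";

// Placeholder connection string and queue name (assumptions for illustration)
const sbClient = new ServiceBusClient(process.env.SERVICEBUS_CONNECTION!);
const receiver = sbClient.createReceiver("events-queue");

receiver.subscribe({
  processMessage: async (message) => {
    // Route on a custom application property stamped on each message
    switch (message.applicationProperties?.eventType) {
      case "A":
        // handle event A
        break;
      case "B":
        // handle event B
        break;
    }
  },
  processError: async (args) => {
    console.error("Receive error:", args.error);
  },
});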
Another way to implement an aggregation pattern over the events, with a generic and flexible solution (more events, etc.), is to ingest the AEG events into a stream pipeline (an Event Hub event handler) and use an Azure Stream Analytics job to generate the event of interest and output it to the Azure Function.
The following screen snippet shows this full declarative solution.
Related
I am designing an HTTP-triggered function. This function will be used to authenticate and to call an external API, then get the response and pass it on to the caller (in this case, a web application).
This function can be called thousands of times per minute during peak load.
I am going through Event Grid and having a hard time deciding why to use it.
Microsoft says that to safeguard my events so they are not lost for any reason, I should use Event Grid along with my HTTP-triggered Azure Function.
I could just as well design a queue-triggered function which processes these requests from a queue.
I am referring to this Microsoft article, which says to use Event Grid to gain more control over your serverless functions:
MS Event Grid
I am going through Event Grid and having a hard time deciding why to use it.
I could just as well design a queue-triggered function which processes these requests from a queue.
I think this depends on your needs. Event Grid is discrete and event-driven, and it offers higher scalability. A queue trigger, by contrast, is not triggered by an event but by messages arriving in a queue. All in all, it is not strictly necessary to use Event Grid; decide based on your specific use case.
If you want a comparative study of the messaging services, you may refer to: https://tsuyoshiushio.medium.com/azure-messaging-service-in-a-picture-f8113cec54cd
https://hackernoon.com/azure-functions-choosing-between-queues-and-event-hubs-dac4157eee1c
As for Event Grid: Azure Event Grid is an eventing service for the cloud, and Azure Functions is one of the supported event handlers.
Azure Event Grid allows you to easily build applications with event-based architectures. First, select the Azure resource you would like to subscribe to, and then give the event handler or WebHook endpoint to send the event to. Event Grid has built-in support for events coming from Azure services, like storage blobs and resource groups. Event Grid also has support for your own events, using custom topics. reference
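As a rough sketch of the custom-topic side, using the @azure/eventgrid SDK (the endpoint, key and event fields below are assumptions, not values from the question):

import { EventGridPublisherClient, AzureKeyCredential } from "@azure/eventgrid";

// Placeholder topic endpoint and access key
const client = new EventGridPublisherClient(
  "https://<your-topic>.<region>-1.eventgrid.azure.net/api/events",
  "EventGrid", // publish using the Event Grid event schema
  new AzureKeyCredential(process.env.TOPIC_KEY!)
);

// Publish a hypothetical custom event
await client.send([
  {
    eventType: "MyApp.ItemShipped",
    subject: "orders/12345",
    dataVersion: "1.0",
    data: { orderId: 12345 },
  },
]);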
I have an Azure Event Hub over which I would like to send various types of messages. Each message should be handled by a separate Azure Function, based on its message type. What is the best way to accomplish this?
Actually, I could create some JSON container with a type and a payload property and let one parent Azure Function dispatch all the message payloads, based on their type, to other functions, but that feels a bit hacky.
This question basically asks the same; however, it is answered with the IoT Hub and message routing. In the Event Hub configuration, though, I cannot find any setting to configure message routing.
Or should I switch to an Azure Message Queue to get this functionality?
I would use Azure Stream Analytics to route the messages to the different Azure Functions. ASA lets you specify Event Hubs as a source and several sinks (which can include multiple Azure Functions). You can read more about setting up an Azure Stream Analytics service through the Azure Portal here. You'll need to set up the Event Hub as your source (docs) and set up your sinks (docs). You then write some T-SQL-like code to route the messages to the various sinks. Note, however, that ASA is costly relative to other services, since you're paying for a fixed amount of compute.
I put some pseudo-code below. You'll have to adapt it based on how you configure your ASA, using the information from the attached MS documentation.
-- Route one message type to the first output; replace [CONDITION]
-- with a filter on your type property (e.g. WHERE MessageType = 'A')
SELECT *
INTO [YourOutputAlias]
FROM [YourInputAlias]
WHERE [CONDITION]

-- Route another message type to the second output
SELECT *
INTO [YourAlternateOutputAlias]
FROM [YourInputAlias]
WHERE [CONDITION]
Based on your additional info about the business requirements, and assuming that the event size is < 64 KB (1 MB in preview), the following screen snippet shows an example of your solution:
The concept of the above solution is based on pushing a batch of events to the Event Domain endpoint of the AEG. The EventHub-triggered function is responsible for mapping each event message type in the batch to its domain topic before publishing to the AEG.
Note that when using Azure IoT Hub for ingesting the events, the AEG can be integrated directly with the IoT Hub, and each event message can be distributed in a loosely decoupled Pub/Sub manner. Besides that, these business requirements can be met with the B1 scale tier for IoT Hub ($10/month), compared to Basic Event Hubs ($11.16).
The IoT Hub has a built-in message routing mechanism (with some limitations), but the recently added IoT/AEG integration, such as publishing device telemetry messages, provides good support in a serverless architecture.
I ended up using Azure Durable Functions with the Fan-Out/Fan-In pattern.
In this approach, all events are handled by a single orchestrator function (F1), which is in fact a Durable Azure Function. It deserializes the incoming JSON to the correct DTO. Based on the content of the DTO, the corresponding activity function (F2) is invoked to process it.
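A minimal sketch of that dispatch in TypeScript with the durable-functions package; the DTO shape, the messageType discriminator and the activity names are assumptions:

import * as df from "durable-functions";

// Orchestrator (F1): deserializes the input and dispatches by type
const orchestrator = df.orchestrator(function* (context) {
  const dto = context.df.getInput() as { messageType: string; payload: unknown };

  // Map each message type to its activity function (F2)
  const activityByType: Record<string, string> = {
    orderCreated: "HandleOrderCreated",
    orderShipped: "HandleOrderShipped",
  };

  const activityName = activityByType[dto.messageType];
  if (activityName) {
    yield context.df.callActivity(activityName, dto.payload);
  }
});

export default orchestrator;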
Basically, what I understood from a few Azure topics is as below:
Azure Event Hub - where data is received initially and converted into events
Service Bus - acting as a queue
Azure Event Grid - where the events produced in the hub are transferred
so the connection is like below:
Hub -> Service Bus -> Event Grid -> Pub Sub -> Storage
I understood this concept. My problem is that I want data to be pushed from Event Grid to GCP (subscriptions/topics). My questions are:
How can I establish this using the PUSH method?
What do I need to develop exactly?
How can I push things from grid to pubsub/subscriptions?
I found this link where data is published into Event Grid, but I want to push data from Event Grid to GCP. Can anybody explain where I am going wrong, or what exactly I should start with? I am new to this and it's very confusing, so I just need a little bit of guidance here.
I have below doubts:
Is there any direct subscriber option available with the Event Grid listener? I mean, can I directly link my Google storage account to this listener so that whenever an event is triggered it is pushed straight to my GCP account? (I don't have an Azure account right now since an access issue is being resolved, so I can't check this myself; that's why I am asking here.)
Suppose I have 20 columns in my data but I want only 16 columns to be pushed to GCP; is there any customization possible while sending data from Event Grid / Event Hub to Pub/Sub?
If I write custom connector code as per the links provided in the answers below, how do I run it? I mean, where can I deploy those scripts in the cloud so that they are triggered automatically whenever an event occurs?
Can I implement webhooks in this scenario (as an alternative to connectors)? If yes, how can I do it, and on which side do I need to create them?
Also, I read some articles and learned that a few people experienced data loss in this entire process. So what is the likelihood of that here, and how can it be avoided?
Can anybody explain where I am going wrong, or what exactly I should start with?
It's right here:
so the connection is like below:
Hub -> Service Bus -> Event Grid -> Pub Sub -> Storage
Although this might be the case, it sounds very much as if you're looking at one (very) specific scenario where data flows in this exact way.
Azure Event Hub, Azure Service Bus and Azure Event Grid can work together, but can also be used completely separate from each other.
Event Grid
The purpose of Event Grid is to enable reactive programming. Use it when you want to react to (status) changes.
Event Hubs
Event Hubs facilitate a big data pipeline. Use this when you need telemetry and distributed data streaming.
Service Bus
The purpose of Service Bus is to enable high-value enterprise messaging. Use it when you want to handle things like order processing and financial transactions.
In some cases, you use the services side by side to fulfill distinct roles. For example, an ecommerce site can use Service Bus to process the order, Event Hubs to capture site telemetry, and Event Grid to respond to events like an item was shipped.
In other cases, you link them together to form an event and data pipeline. You use Event Grid to respond to events in the other services. For an example of using Event Grid with Event Hubs to migrate data to a data warehouse, see Stream big data into a data warehouse.
Taken from the very interesting and important documentation article Choose between Azure messaging services - Event Grid, Event Hubs, and Service Bus
EDIT
My problem is that I want data to be pushed from Event Grid to GCP (subscriptions/topics). So how can I establish this using the PUSH method?
Possibly the simplest solution is to have an Event Grid Event trigger a webhook (which might run an Azure Function or a Google Cloud Function) which in turn puts the event/message on the GCP Topic.
Publishing messages is quite well documented. There are examples on how to do so with a REST call, command-line, C#, Go, JAVA, NodeJS, PHP, Python and Ruby.
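A minimal sketch of such a webhook as an HTTP-triggered Azure Function in TypeScript; the GCP topic name and event shape are assumptions, and note that an Event Grid webhook must answer the subscription validation handshake:

import { AzureFunction, Context, HttpRequest } from "@azure/functions";
import { PubSub } from "@google-cloud/pubsub";

const topic = new PubSub().topic("azure-events"); // hypothetical GCP topic

const httpTrigger: AzureFunction = async (context: Context, req: HttpRequest) => {
  for (const event of req.body ?? []) {
    // Event Grid sends a one-time validation event when the subscription is created
    if (event.eventType === "Microsoft.EventGrid.SubscriptionValidationEvent") {
      context.res = { body: { validationResponse: event.data.validationCode } };
      return;
    }
    // Forward the event payload to Cloud Pub/Sub
    await topic.publishMessage({ data: Buffer.from(JSON.stringify(event.data)) });
  }
  context.res = { status: 200 };
};

export default httpTrigger;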
EDIT 2
What you need to do is create an Event Grid Subscription to listen to and handle Event Grid Events.
Here's an example screenshot on how to listen for events for a specific Storage Account and call a WebHook whenever such an event occurs:
Pay attention to the "Endpoint Details": that's where you can specify to, for instance, call a webhook every time an event is triggered.
The easiest way to transfer the EventHub-generated events would probably be to create an Event Hub event receiver in Node.js (which you mentioned in your comments) as described here, which receives events and publishes them to Cloud Pub/Sub directly, as described in the Cloud Pub/Sub publisher documentation for Node.js.
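Roughly, such a receiver could look like this (connection details and topic name are placeholders):

import { EventHubConsumerClient } from "@azure/event-hubs";
import { PubSub } from "@google-cloud/pubsub";

const consumer = new EventHubConsumerClient(
  "$Default", // consumer group
  process.env.EVENTHUB_CONNECTION!,
  "my-event-hub"
);
const topic = new PubSub().topic("azure-events");

consumer.subscribe({
  processEvents: async (events) => {
    for (const event of events) {
      // Publish each Event Hub event body to Cloud Pub/Sub
      await topic.publishMessage({ data: Buffer.from(JSON.stringify(event.body)) });
    }
  },
  processError: async (err) => {
    console.error("Event Hub receive error:", err);
  },
});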
I have a requirement to do file merging based on an event-driven architecture. I have two blob containers, and I need to merge files as soon as they are available in their respective containers. Correlation will happen based on file name.
That means, given two containers A and B: when a file arrives in container A, the process should wait for the matching file to arrive in container B, and only then should an event be raised that ADF or a Logic App can subscribe to for further processing. Please suggest a way to achieve this.
The Event Grid Microsoft.Storage.BlobCreated event is raised per container and will not wait for the other container to raise an event.
One option I can think of is to handle your events using a Durable Function, where you use the correlation value (the file name) as the Durable Function instance ID to identify an existing instance or start a new one. If a function instance with a given ID is already running, you can either perform the merge or raise a new custom event and handle it separately.
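A rough sketch of that idea in TypeScript with the durable-functions package; the instance-ID naming convention, orchestrator name and event name are assumptions:

import * as df from "durable-functions";
import { AzureFunction, Context } from "@azure/functions";

// Event Grid-triggered client function: one orchestration per correlated file name
const eventGridTrigger: AzureFunction = async (context: Context, eventGridEvent: any) => {
  const client = df.getClient(context);
  const fileName = eventGridEvent.subject.split("/").pop();
  const instanceId = `merge-${fileName}`; // hypothetical naming convention

  const status = await client.getStatus(instanceId);
  const terminal = [
    df.OrchestrationRuntimeStatus.Completed,
    df.OrchestrationRuntimeStatus.Failed,
    df.OrchestrationRuntimeStatus.Terminated,
  ];
  if (!status || terminal.includes(status.runtimeStatus)) {
    // First blob of the pair: start an orchestration that waits for the second
    await client.startNew("MergeOrchestrator", instanceId, eventGridEvent.data);
  } else {
    // Second blob of the pair: signal the waiting orchestration
    await client.raiseEvent(instanceId, "SecondBlobArrived", eventGridEvent.data);
  }
};

export default eventGridTrigger;

The corresponding MergeOrchestrator would then yield context.df.waitForExternalEvent("SecondBlobArrived") before calling an activity that performs the merge.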
Another option is creating a simple Distributed Event Aggregator, as shown in the following screen snippet:
The concept is based on a Lease Blob that stores the EventAggregator state with all received event messages. The HttpTrigger function is responsible for handling and managing received event messages, creating and updating the state, and handling retried deliveries so that the state is updated reliably. In this case, dead-lettering is used as a watchdog timer.
Creating or updating the Lease Blob generates an event for the subscriber, whose logic can then see the EventAggregator state with its array of received event messages.
In the case where eventing is not used for the Lease Blob (a separate storage account), the final event message with the EventAggregator state can be sent to a Storage queue by the HttpTrigger function (EventAggregator).
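As a very rough sketch of the lease-protected state update with @azure/storage-blob (the blob names and state shape are assumptions, and the retry/watchdog logic is omitted):

import { BlobServiceClient } from "@azure/storage-blob";

// Placeholder connection string, container and blob names
const container = BlobServiceClient.fromConnectionString(process.env.STORAGE_CONNECTION!)
  .getContainerClient("aggregator");
const stateBlob = container.getBlockBlobClient("event-aggregator-state.json");

export async function appendEvent(eventMessage: unknown): Promise<void> {
  // Take a short lease so concurrent deliveries cannot clobber the state
  const lease = stateBlob.getBlobLeaseClient();
  const { leaseId } = await lease.acquireLease(15); // seconds

  try {
    const buffer = await stateBlob.downloadToBuffer(0, undefined, {
      conditions: { leaseId },
    });
    const state = JSON.parse(buffer.toString());
    state.events.push(eventMessage);

    const body = JSON.stringify(state);
    // This update itself raises a BlobCreated event for downstream subscribers
    await stateBlob.upload(body, Buffer.byteLength(body), {
      conditions: { leaseId },
    });
  } finally {
    await lease.releaseLease();
  }
}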
Is there a way to make an Azure Function triggerable by multiple Service Bus queues? For example, if there is a function whose logic is valid for multiple cases (event start, event end, each inserted into a different Service Bus queue) and I want to reuse it for these events, can I subscribe to both of them in the Service Bus from the same function?
I was looking for an answer to this question, but so far everywhere I checked it seems to be impossible.
Azure Functions can be triggered by a single source queue or subscription.
If you'd like to consolidate multiple sources to serve as a trigger for a single function, you could forward messages to a single entity (let's assume a queue) and configure the function to be triggered by messages in that queue. Azure Service Bus supports auto-forwarding natively.
Note that there cannot be more than 3 hops, and you cannot necessarily know what the source was if a message was forwarded from a queue. For subscriptions, there's a possible workaround: stamp the messages.
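For illustration, the forwarding topology can be set up with the ServiceBusAdministrationClient; the entity names here are placeholders:

import { ServiceBusAdministrationClient } from "@azure/service-bus";

const admin = new ServiceBusAdministrationClient(process.env.SERVICEBUS_CONNECTION!);

// One consolidated queue that the single Function is triggered by
await admin.createQueue("consolidated");

// Source queues auto-forward their messages to the consolidated queue
await admin.createQueue("event-start", { forwardTo: "consolidated" });
await admin.createQueue("event-end", { forwardTo: "consolidated" });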
If your goal is simply to reuse code, consider refactoring that function's logic into a class which is then used by multiple functions.
If your goal is to implement event aggregation, you could create an Azure Durable Functions workflow that does a fan-in on multiple events; a sketch follows the excerpt below.
Excerpt from https://github.com/Azure/azure-functions-durable-extension/issues/166:
Processing Azure blobs in hourly batches.
New blob notifications are sent to a trigger function using Event Grid trigger.
The event grid trigger uses the singleton pattern to create a single orchestration instance of a well-known name and raises an event to the instance containing the blob payload(s).
To protect against race conditions in instance creation, the event grid trigger is configured as a singleton using SingletonAttribute.
Blob payloads are aggregated into a List and sent to another function for processing - in this case, aggregated into a single per-batch output blob.
A Durable Timer is used to determine the one-hour time boundaries.
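A minimal sketch of such an orchestrator in TypeScript; the "NewBlob" event name and "ProcessBatch" activity are assumptions:

import * as df from "durable-functions";

const orchestrator = df.orchestrator(function* (context) {
  const blobs: unknown[] = [];
  const deadline = new Date(context.df.currentUtcDateTime.getTime() + 60 * 60 * 1000);
  const timerTask = context.df.createTimer(deadline); // Durable Timer: the hour boundary

  // Collect blob notifications until the timer fires
  while (true) {
    const eventTask = context.df.waitForExternalEvent("NewBlob");
    const winner = yield context.df.Task.any([eventTask, timerTask]);
    if (winner === eventTask) {
      blobs.push(eventTask.result);
    } else {
      break; // one-hour boundary reached
    }
  }

  if (blobs.length > 0) {
    // Aggregate the collected payloads into a single per-batch output
    yield context.df.callActivity("ProcessBatch", blobs);
  }

  // Restart with a fresh history for the next hourly batch
  context.df.continueAsNew(undefined);
});

export default orchestrator;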
You might want to consider switching the pattern around by using multiple topics/subscriptions for the clients instead of multiple queues.
Then the Function in question can be triggered by the Start-End Topic.
Some other Function can be triggered by the Working Topic, etc.
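For illustration, that topology could be provisioned like this with the Service Bus administration client (topic, subscription and rule names are placeholders):

import { ServiceBusAdministrationClient } from "@azure/service-bus";

const admin = new ServiceBusAdministrationClient(process.env.SERVICEBUS_CONNECTION!);

// One topic, several subscriptions: each Function listens to its own subscription
await admin.createTopic("events");
await admin.createSubscription("events", "start-end");
await admin.createSubscription("events", "working");

// Filter so each subscription only sees its message types;
// drop the catch-all $Default rule first so the filter applies exclusively
await admin.deleteRule("events", "start-end", "$Default");
await admin.createRule("events", "start-end", "startOrEnd", {
  sqlExpression: "eventType IN ('start', 'end')",
});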