What is the subscription validation event message schema in Azure Event Grid?

Created new azure event grid domain with cloud event schema using portal.
Created new web hook endpoint using azure function that can receive both subscription validation event as well as event notifications.
Created new azure event grid topic for the above domain (as part of following subscription) using portal.
Created new azure event grid subscription with cloud event schema with the above web hook endpoint.
When the subscription was created, the endpoint was invoked by the Event Grid infrastructure with a subscription validation event to verify the webhook endpoint.
To my surprise, the validation event structure (shown below) seemed to conform to the native Event Grid schema and not the CloudEvents schema:
[{
"id": "6309ef83-117f-47aa-a07c-50f6e71a8ca5",
"topic": "/subscriptions/13ad1203-e6d5-4076-bf2b-73465865f9f0/resourceGroups/xxxx-sandbox-rg/providers/Microsoft.EventGrid/domains/eg-xxx-test-cloud-domain/topics/eg-xxx-test-cloud-topic",
"subject": "",
"data": {
"validationCode": "391889BB-FCC3-4269-A2BD-0918B5BAB0AE",
"validationUrl": "https://rp-westus.eventgrid.azure.net/eventsubscriptions/xxxx-subscription-3/validate?id=391889BB-FCC3-4269-A2BD-0918B5BAB0AE&t=2019-01-30T15:45:37.0521594Z&apiVersion=2018-09-15-preview&[Hidden Credential]"
},
"eventType": "Microsoft.EventGrid.SubscriptionValidationEvent",
"eventTime": "2019-01-30T15:45:37.0521594Z",
"metadataVersion": "1",
"dataVersion": "2"
}]
I expected the following subscription validation event conforming to the CloudEvents schema (based on version 0.1 of the CloudEvents schema at https://learn.microsoft.com/en-us/azure/event-grid/cloudevents-schema#cloudevent-schema):
{
"eventID" : "6309ef83-117f-47aa-a07c-50f6e71a8ca5",
"source" : "/subscriptions/13ad1203-e6d5-4076-bf2b-73465865f9f0/resourceGroups/xxxx-sandbox-rg/providers/Microsoft.EventGrid/domains/eg-xxx-test-cloud-domain/topics/eg-xxx-test-cloud-topic",
"data": {
"validationCode": "391889BB-FCC3-4269-A2BD-0918B5BAB0AE",
"validationUrl": "https://rp-westus.eventgrid.azure.net/eventsubscriptions/xxxx-subscription-3/validate?id=391889BB-FCC3-4269-A2BD-0918B5BAB0AE&t=2019-01-30T15:45:37.0521594Z&apiVersion=2018-09-15-preview&[Hidden Credential]"
},
"eventType" : "Microsoft.EventGrid.SubscriptionValidationEvent",
"eventTime" : "2019-01-30T15:45:37.0521594Z",
"cloudEventsVersion" : "0.1",
"eventTypeVersion" : "2",
}
What am I missing?

Basically, the webhook subscriber handles the following two groups of events. The specific event type is carried in the HTTP header 'aeg-event-type'.
Internal events of the Event Grid model, such as the event types SubscriptionValidation and SubscriptionDeletion. The schema for these event types is always the default schema (EventGridSchema); in other words, it does not depend on the EventDeliverySchema. IMO, having a fixed default schema for internal events forces strongly typed handling of those event types, especially when we have a CustomInputSchema.
Interest source events (topics) are events defined by the input schema; presently the Event Grid model supports three input schema types: EventGridSchema (the default), CloudEventSchema and CustomInputSchema.
The AEG supports the following input-to-delivery schema mappings:
EventGridSchema to delivery schemas EventGridSchema and CloudEventSchema
CloudEventSchema to delivery schema CloudEventSchema only
CustomInputSchema to delivery schemas EventGridSchema, CloudEventSchema and CustomInputSchema
The event type in the header is aeg-event-type=Notification, and the schema is based on the subscribed EventDeliverySchema (see the mappings above).
Based on the above, for your scenario you should have separate strongly typed objects for internal events (always in the default EventGridSchema) and for notification events based on the subscribed EventDeliverySchema.
The following is an example of the http headers:
aeg-subscription-name=EVENTGRIDSCHEMA
aeg-delivery-count=0
aeg-data-version=
aeg-metadata-version=0
aeg-event-type=SubscriptionValidation
Note that only the subscription name is available to figure out which EventDeliverySchema has been subscribed. It would be nice to have an additional aeg header, for example aeg-subscription-labels, to pass some subscription metadata to the subscriber handler.
As a workaround, we can pass some values to the subscriber webhook handler via URL query parameters, for instance: &eds=CustomInputSchema
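The routing described above can be sketched as a plain, framework-agnostic Python function (the function name and the notification response shape are illustrative, not an official API):

```python
import json

def handle_event_grid_request(headers, body):
    """Route an incoming Event Grid POST by the aeg-event-type header."""
    event_type = headers.get("aeg-event-type")
    events = json.loads(body)
    if event_type == "SubscriptionValidation":
        # Internal event, always delivered in the default EventGridSchema:
        # echo the validation code back (synchronous handshake).
        return {"validationResponse": events[0]["data"]["validationCode"]}
    if event_type == "SubscriptionDeletion":
        # Internal event: acknowledge, nothing else to do.
        return {}
    # Notification: payload shape depends on the subscribed EventDeliverySchema.
    return {"received": len(events)}
```

The key point is that the two internal event types are parsed with the EventGridSchema shape regardless of what delivery schema the subscription declares.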

This is a known issue / expected behavior in the Azure Event Grid implementation of the CloudEvents v0.1 spec. At the time the CloudEvents v0.1 spec was implemented in Azure Event Grid, the standard did not yet define a validation handshake / abuse protection model, so Event Grid's existing validation handshake model/schema was used for CloudEvents subscribers as well.

Related

How to setup Azure Event Grid for Azure Data Factory triggers?

I am checking how Azure Data Factory (ADF) can be triggered by Event Grid. I have created an Event Grid in the same resource group as my data factory. From ADF it is easy to connect to the Event Grid topic. However, from the Event Grid side, I don't know which "Endpoint type" I need to choose; ADF is not one of the available options, as shown below:
When I choose Web Hook, it requires an endpoint. If I use the ADF URL, it fails to create the event subscription with the error: "Deployment has failed with the following error: {"code":"Url validation","message":"Webhook validation handshake failed for https://adf.azure.com/en/authoring/pipeline/pipeline1.". This is kind of expected, but I am still confused about how to set up my Event Grid subscription. Which of the above options should I choose?
From the ADF side, I can choose Event Grid for a custom trigger. (I also created one Event Grid Topic from the portal, parallel to the Event Grid service; however, I am not sure whether these two are different services!) The ADF trigger is shown below:
As you can see, I can make a custom trigger, but the problem is on the Event Grid side: how do I create a subscription that sends events to ADF? Also, in the ADF trigger, what should the "Event type" be? Is just a name enough?
One other thing: after I create the trigger on the ADF side, when I open it again it goes back to the "enter manually" option and the event grid disappears; I am not sure why.
You must be able to do the Microsoft.EventGrid/eventSubscriptions/ action. This action is part of the EventGrid EventSubscription Contributor built-in role.
Prerequisite -
Data Factory expects events to follow the Event Grid event schema. Make sure event payloads have the following fields:
[
{
"topic": string,
"subject": string,
"id": string,
"eventType": string,
"eventTime": string,
"data":{
object-unique-to-each-publisher
},
"dataVersion": string,
"metadataVersion": string
}
]
Follow the steps below:
Go to Azure Data Factory and sign in.
Switch to the Edit tab. Look for the pencil icon.
Select Trigger on the menu and then select New/Edit.
On the Add Triggers page, select Choose trigger, and then select +New.
Select Custom events for Type.
Select your custom topic from the Azure subscription dropdown or manually enter the event topic scope.
The Subject begins with and Subject ends with properties allow you to filter for trigger events. Both properties are optional.
Use + New to add Event Types to filter on. The list of custom event triggers uses an OR relationship. When a custom event arrives whose eventType property matches one on the list, a pipeline run is triggered. The event type is case-insensitive. For example, in the following screenshot, the trigger matches all copycompleted or copysucceeded events that have a subject beginning with factories.
A custom event trigger can parse and send a custom data payload to your pipeline. You create the pipeline parameters and then fill in the values on the Parameters page. Use the format @triggerBody().event.data.keyName to parse the data payload and pass values to the pipeline parameters.
After you've entered the parameters, select OK.
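As a sketch of the publishing side (the endpoint, key, subject, and data values below are placeholders, not real resources), an event matching such a trigger could be posted to the custom topic like this:

```python
import json
import urllib.request

# Placeholder values: use your own custom topic endpoint and access key.
TOPIC_ENDPOINT = "https://mytopic.westus-1.eventgrid.azure.net/api/events"
TOPIC_KEY = "xxxxxxxx"

def build_publish_request():
    # EventGridSchema payload; eventType and subject are chosen to match the
    # trigger filters described above (eventType match is case-insensitive).
    events = [{
        "topic": "",
        "subject": "factories/myfactory/runs/1",
        "id": "1",
        "eventType": "copycompleted",
        "eventTime": "2023-01-01T00:00:00Z",
        "data": {"keyName": "value for the pipeline parameter"},
        "dataVersion": "1.0",
        "metadataVersion": "1"
    }]
    return urllib.request.Request(
        TOPIC_ENDPOINT,
        data=json.dumps(events).encode("utf-8"),
        headers={"aeg-sas-key": TOPIC_KEY, "Content-Type": "application/json"},
        method="POST",
    )

# Sending is a one-liner once the request is built (needs a live topic):
# urllib.request.urlopen(build_publish_request())
```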
For more information, refer to the official documentation.

Not able to register Event Grid subscription with webhook delivery properties in Azure

I have a REST service hosted in an Azure Web App. I registered a webhook on Azure Event Grid pointing to the REST service endpoint. I followed the link below and added endpoint validation with Event Grid events in the REST service. I am able to register the webhook successfully.
https://learn.microsoft.com/en-us/azure/event-grid/webhook-event-delivery
But I am facing an issue (not able to subscribe the webhook) if I configure any delivery properties in Event Grid, like Authorization or Content-Type headers, as shown below. Please refer to the attachment for error details (shown on the right side of the pic).
Event Grid subscription with webhook delivery properties failure
Could someone please help me on this.
Thanks in advance,
Ashok
First, we need to check how event delivery is authenticated with the event handler.
Also, make sure that the validation call with Event Grid succeeds. Event Grid supports two ways of validation:
Synchronous validation
Asynchronous validation
A subscription validation event example is shown below:
[
{
"id": "2d1781af-3a4c-4d7c-bd0c-e34b19da4e66",
"topic": "/subscriptions/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
"subject": "",
"data": {
"validationCode": "512d38b6-c7b8-40c8-89fe-f46f9e9622b6",
"validationUrl": "https://rp-eastus2.eventgrid.azure.net:553/eventsubscriptions/myeventsub/validate?id=0000000000-0000-0000-0000-00000000000000&t=2021-09-01T20:30:54.4538837Z&apiVersion=2018-05-01-preview&token=1A1A1A1A"
},
"eventType": "Microsoft.EventGrid.SubscriptionValidationEvent",
"eventTime": "2021-00-01T22:12:19.4556811Z",
"metadataVersion": "1",
"dataVersion": "1"
}
]
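As a sketch (not an official SDK helper; the function name is mine), the validation event above can be parsed to support either handshake style:

```python
import json

def parse_validation_event(body):
    """Extract validationCode (synchronous handshake) and validationUrl
    (asynchronous/manual handshake) from a SubscriptionValidation delivery."""
    event = json.loads(body)[0]
    if event.get("eventType") != "Microsoft.EventGrid.SubscriptionValidationEvent":
        return None
    data = event["data"]
    # Synchronous: respond to the POST with {"validationResponse": validationCode}.
    # Asynchronous: issue a GET to validationUrl (manual handshake).
    return data["validationCode"], data.get("validationUrl")
```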
Refer to Webhook event delivery from MS Docs
Also check this for troubleshooting validation issues

Streaming data consumed from third-party WebHooks to application using an event broker in Azure

I'm currently working on a project where I need to consume a third-party webhook into my application. The issue is that this third-party service doesn't let me pick which events to push to my application through the webhook, so I'll have to respond to all of them even if they are mostly useless. And if I need to expand my application or divide it into microservices, I would be streaming the same data to all services even if they have different needs. Also, I would face data loss in case of issues with my application server.
The solution would be to use an Event Broker, which would collect all events from the Webhook, respond to the provider with a 200 OK status code, push the event to a specific topic which will be stored until all the concerned subscribed services receive that data.
I'm looking for a fully managed service in Azure, so far I've come across Azure Event Grid, Azure Event Hub and Azure Service Bus.
I wanted to know about the feasibility of this scenario, and if I can stream directly from a Webhook to one of these Azure services.
No, AFAIK you cannot stream directly into those services. You will need to set up something that accepts the webhook and sends it to one of those listed services.
However, what I would do is create an HTTP-triggered Azure Function. You should be able to configure the webhook to post to the function.
Once you have your function set up, you can add some logic there to route the message to the proper channels based on its content. That could be an application of yours, a Service Bus queue, an Azure Storage queue, or Event Grid. I would not recommend Event Hubs, as it is less suited for this specific purpose.
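A minimal sketch of that routing logic, framework-agnostic and with made-up channel names and event fields:

```python
import json

# Hypothetical routing table mapping an event category to a destination
# channel (a Service Bus queue, a Storage queue, an Event Grid topic, ...).
ROUTES = {
    "order": "orders-queue",
    "customer": "customers-queue",
}

def route_webhook_payload(body):
    """Pick a destination for an incoming webhook event based on its content.

    In a real HTTP-triggered function, the returned name would select the
    output binding (or client) used to forward the message.
    """
    event = json.loads(body)
    return ROUTES.get(event.get("category"), "dead-letter-queue")
```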
In the case of consuming third-party events without a guarantee of in-order processing, and when the webhook payload is an array, Azure Event Grid can consume your third-party webhook directly.
The following screen snippet shows this example:
The above integration is based on the Custom Topic Endpoint with a CustomInputSchema.
The Custom Topic Endpoint sends a response back to the webhook with the following HTTP response codes:
Success: 200 OK
Event data has incorrect format: 400 Bad Request
Invalid access key: 401 Unauthorized
Incorrect endpoint: 404 Not Found
Array or event exceeds size limits: 413 Payload Too Large
The AEG model distributes an event in a loosely decoupled Pub/Sub manner, with reliable, retried delivery to the subscriber based on its subscriptions. The AEG subscription represents a logical connection between the source of interest and the consumer; it is a set of metadata in which the consumer describes what, where and how.
Basically there are two delivery patterns:
Push-PushAck, where the event is pushed to the subscriber handler for its business processing and the result is returned to AEG, e.g. WebHook (Azure Function) and Hybrid Connection.
Push-PullAck, where the event is reliably delivered to the subscriber and the delivery response is returned to AEG. The event must then be pulled out of this delivery target for its business post-processing, e.g. Service Bus queue, Storage queue and Event Hubs.
UPDATE:
To create a custom topic endpoint with a CustomInputSchema, you can use, for example, the REST API.
The following is an example of the PUT request payload:
{
"location": "westus",
"properties": {
"inputSchema": "CustomEventSchema",
"inputSchemaMapping": {
"properties": {
"id": {
"sourceField": null
},
"topic": {
"sourceField": null
},
"eventTime": {
"sourceField": null
},
"eventType": {
"sourceField": null,
"defaultValue": "notification"
},
"subject": {
"sourceField": null,
"defaultValue": "/webhook/events"
},
"dataVersion": {
"sourceField": null,
"defaultValue": "1.0"
}
},
"inputSchemaMappingType": "Json"
}
}
}
The above CustomInputSchema makes it possible to use any input event schema for this custom topic endpoint, which is a very nice feature of AEG. The "bad news" is that the events must be wrapped in an array, even a single event. I hope the AEG team will improve custom and domain topics so that a single event can also be published as a plain JObject (not inside an array).
To pass the input event schema through the AEG eventing model unchanged, the subscriber (consumer of the source event interest) must declare DeliverySchema = CustomInputSchema. The default output event schema is the EventGridSchema.
The following examples show an event message published to the custom topic with the above CustomInputSchema and delivered to one subscriber using a CustomInputSchema and to another using the EventGridSchema.
Fire Event to the Custom Topic Endpoint (array of event(s)):
[
{
"abcd": 12345
}
]
Subscriber with DeliverySchema = CustomInputSchema:
{
"abcd": 12345
}
Subscriber with DeliverySchema = EventGridSchema (default schema):
{
"id": "f92a5dbf-d206-4e61-ac1e-7498c543039a",
"eventTime": "2019-07-14T07:19:00.3969337Z",
"eventType": "notification",
"dataVersion": "1.0",
"metadataVersion": "1",
"topic": "/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/rk2012/providers/Microsoft.EventGrid/topics/testerTopic8",
"subject": "/webhook/events",
"data":
{
"abcd": 12345
}
}
Note that events can be filtered (selected) for each subscriber based on its subscription properties in the loosely decoupled Pub/Sub manner. In other words, a subscriber can subscribe to the AEG eventing model at any time via its subscription, which declares a specific source interest (e.g. topic, subject, event data, etc.) and the mechanics of delivery, retrying, dead-lettering, etc.

Handle blob events from storage accounts in multiple Azure Subscriptions in different AD Tenants?

Is it possible to get notified about blobCreated events happening in multiple storage accounts who live in multiple Azure Subscriptions?
I would like to handle blob created events happening in arbitrary storage accounts in a central Azure Function which lives in my subscription but i would like to give customers the possibility to store the data in their own subscription.
I was thinking about using Event Grid Webhook endpoints to route the events to my central Azure Function. Would this be a solid approach to enable multi-subscription scenarios?
Edit: To be more precise, i need this to work over different tenants (as our customers would bring their own subscriptions and we need to integrate them without assigning them to our AD tenant)
Based on our discussion, the following screen snippets show your multi-tenant-fan-in-scenarios.
Subscribing to a distributed interest source across Azure subscriptions (multi-tenant) is done by mapping the topic to the webhook endpoint. Note that the topic represents the full resource path (id) of the place where the event is posted (published) to the AEG service. This path is in the scope of the current tenant; see the following example:
"topic": "/subscriptions/myID/resourceGroups/myRG/providers/microsoft.storage/storageaccounts/mySA"
"endpointBaseUrl": "https://myFnc.azurewebsites.net/runtime/webhooks/EventGrid?functionName=myEventGridTrigger&code=xxxx"
This mapping is declared in the subscription metadata, stored in the same scope as the topic. The webhook endpoint, on the other hand, can be located outside of this scope.
Another, more complex solution, with full isolation from the tenants and event distribution in a FAN-OUT Pub/Sub manner, is shown in the following screen snippet:
In the above solution, the fan-in subscriber can mediate the original event message into a properly business-oriented event message, including a short-lived sasToken for accessing blob metadata and/or body, etc.
To create an event subscription in your tenant with an event handler for your EventGridTrigger function, you can use, for instance, a REST API call; see the following example:
PUT https://management.azure.com/subscriptions/myId/resourceGroups/myRG/providers/Microsoft.Storage/storageaccounts/mySA/providers/Microsoft.EventGrid/eventSubscriptions/mySubscription?api-version=2019-01-01
Headers:
Authorization:Bearer eyJ0eXAiOiJKV1QiLCJhb....
Body (minimum payload):
{
"properties": {
"destination": {
"endpointType": "WebHook",
"properties": {
"endpointUrl": "https://myFnc.azurewebsites.net/runtime/webhooks/EventGrid?functionName=myEventGridTrigger&code=xxxxxxxx..."
}
}
}
}
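A sketch of issuing that call from Python's standard library (all parameter values below are placeholders):

```python
import json
import urllib.request

def build_subscription_request(subscription_id, rg, storage_account,
                               sub_name, endpoint_url, bearer_token):
    """Build the PUT request that creates an Event Grid subscription on a
    storage account, mirroring the REST call shown above."""
    url = (
        "https://management.azure.com"
        f"/subscriptions/{subscription_id}/resourceGroups/{rg}"
        f"/providers/Microsoft.Storage/storageaccounts/{storage_account}"
        f"/providers/Microsoft.EventGrid/eventSubscriptions/{sub_name}"
        "?api-version=2019-01-01"
    )
    body = {
        "properties": {
            "destination": {
                "endpointType": "WebHook",
                "properties": {"endpointUrl": endpoint_url},
            }
        }
    }
    return urllib.request.Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {bearer_token}",
            "Content-Type": "application/json",
        },
        method="PUT",
    )

# Sending it (needs a real bearer token and resources):
# urllib.request.urlopen(build_subscription_request(...))
```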
UPDATE:
Another way of using the Azure Event Grid Pub/Sub model in an isolated multi-tenant distributed eventing architecture is cascading.
A logical event pipeline can be constructed by cascading Azure Event Grids, i.e. subscribing one Azure Event Grid to another via a custom topic.
The following screen snippet shows an example of the Azure Event Grid cascading:
The cascading concept, which is based on a Fan-In to Fan-Out pattern, is enabled by subscribing a custom topic endpoint to the WebHook event handler of another event grid model in the standard Pub/Sub manner.
Note that Azure Event Grid doesn't have a built-in endpoint for cascading to another Event Grid, including a validation event loopback. However, the following steps allow cascading Azure Event Grids to each other.
Create a custom topic endpoint with a CustomInputSchema, for example:
{
"properties": {
"inputSchema": "CustomEventSchema",
"inputSchemaMapping": {
"properties": {
"id": {
"sourceField": null
},
"topic": {
"sourceField": null
},
"eventTime": {
"sourceField": null
},
"eventType": {
"sourceField": "myEventType",
"defaultValue": "recordInserted"
},
"subject": {
"sourceField": "subject",
"defaultValue": "/myapp/vehicles/motorcycles"
},
"dataVersion": {
"sourceField": null,
"defaultValue": "1.0"
}
},
"inputSchemaMappingType": "Json"
}
}
}
Note that the topic property must have "sourceField": null, which is OK for a custom topic (but not for event domains).
For the webhook event handler endpoint, use the aeg-sas-key in the URL query string, for example:
https://myTopic.westus-1.eventgrid.azure.net/api/events?aeg-sas-key=xxxxxxxxxx
Note that the aeg-sas-key value must be a URL-encoded string.
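For example, Python's standard library can produce the encoded form (the key below is a made-up placeholder):

```python
from urllib.parse import quote

# Generated SAS keys often contain '+', '/' and '=' characters,
# which must be percent-encoded before use in a query string.
raw_key = "abc+def/ghi=="
encoded = quote(raw_key, safe="")   # abc%2Bdef%2Fghi%3D%3D
endpoint = f"https://myTopic.westus-1.eventgrid.azure.net/api/events?aeg-sas-key={encoded}"
```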
Subscription validation uses a validationUrl handshake in a fire-and-forget manner. It can be implemented in the EventGridTrigger function that is subscribed to the custom topic for cascading purposes.
The following code snippet shows an example of this implementation:
#r "Newtonsoft.Json"
using System;
using System.Threading.Tasks;
using System.Text;
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Web;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;
public static async Task Run(JObject eventGridEvent, ILogger log)
{
log.LogInformation(eventGridEvent.ToString());
string eventType = $"{eventGridEvent["data"]?["eventType"]?.Value<string>()}";
if(!string.IsNullOrEmpty(eventType) && eventType == "Microsoft.EventGrid.SubscriptionValidationEvent")
{
// manual validation
string validationUrl = $"{eventGridEvent["data"]?["data"]?["validationUrl"]?.Value<string>()}";
using (var client = new HttpClient())
{
var response = await client.GetAsync(validationUrl);
log.LogInformation(response.ToString());
}
}
else
{
// notifications
}
await Task.CompletedTask;
}
Note that the original event message (the original source interest) is nested in the event data object each time it is cascaded (published).
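As an illustrative sketch (all field values are made up), a cascaded delivery in the EventGridSchema would nest the original event under data like this:

```json
{
  "id": "b1f2c3d4-0000-0000-0000-000000000000",
  "eventType": "recordInserted",
  "subject": "/myapp/vehicles/motorcycles",
  "eventTime": "2019-07-14T07:20:00Z",
  "topic": "/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/rk2012/providers/Microsoft.EventGrid/topics/cascadeTopic",
  "dataVersion": "1.0",
  "metadataVersion": "1",
  "data": {
    "id": "f92a5dbf-d206-4e61-ac1e-7498c543039a",
    "eventType": "notification",
    "subject": "/webhook/events",
    "data": { "abcd": 12345 }
  }
}
```

This nesting is why the validation code shown above reads eventGridEvent["data"]["eventType"] and eventGridEvent["data"]["data"]["validationUrl"].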

Querying events in Azure event hub/blob storage

I am investigating technologies to capture, and store, system events (with a view to maybe future implement some "event sourcing" systems).
I'm interested in Azure Event Hubs as I like the idea of building processing services in Azure Functions & Logic Apps and having them triggered by the event being raised.
I've created my Customer event hub and enabled "capture" so my events and payloads are being stored in Azure Blob storage (.avro files)
I'm wondering how, or indeed if, I would be able to query the events. Say I have a stream capturing all my "Customer" interactions, such as Register / Update_Contact_Address etc., and I wanted to search for all the events for a specific customer ID: how is this achieved? I've seen Stream Analytics jobs, but these seem to be for "real-time data analysis" rather than letting me query with a parameter from an application, say my customer GUID.
I was hoping to create a small admin application that would allow me to select a customer and gather all the customer events captured for this ID.
Below is a sample event I have stored (lifted out of a .avro file):
{
"EventId": "51e3610f-8520-406d-8736-45f382bc5110",
"EventName": "ReceiveCustomerReview",
"ReceivedAt": "0001-01-01T00:00:00",
"Client": 1,
"customerGuid": "x45y57x2-5dcc-45c4-86c5-78942db363w1"
"Payload": {
"stars": 5,
"comment": "OMG..... Beautiful product",
"ClientId": 1
}
}
Stream Analytics has a new feature that lets you partition output to Blob storage by any attribute or field of your choice. That, combined with a simple SQL query, will make this very simple.
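A sketch of such a Stream Analytics query (the input/output names and the blob path pattern are assumptions, not part of the question):

```sql
-- Read the captured Customer events and write them to a blob output whose
-- path pattern includes {customerGuid}, so all events for one customer
-- land under the same blob folder and can be fetched by ID.
SELECT
    EventId,
    EventName,
    customerGuid,
    Payload
INTO
    [partitioned-blob-output]   -- blob output with path pattern like {customerGuid}/{date}
FROM
    [customer-events-input]
```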
