Querying events in Azure Event Hub / Blob Storage

I am investigating technologies to capture and store system events (with a view to possibly implementing some "event sourcing" systems in the future).
I'm interested in Azure Event Hubs as I like the idea of building processing services in Azure Functions & Logic Apps and having them triggered by the event being raised.
I've created my Customer event hub and enabled "capture", so my events and payloads are being stored in Azure Blob storage (.avro files).
I'm wondering how, or indeed whether, I would be able to query the events. Say I have a stream capturing all my "Customer" interactions, such as Register/Update_Contact_Address etc., and I want to search for all the events for a specific customer ID; how is this achieved? I've seen Stream Analytics jobs, but these seem to be for "real time data analysis" rather than letting me query with a parameter from an application, say my customer Guid.
I was hoping to create a small admin application that would allow me to select a customer and gather all the customer events captured for this ID.
Below is a sample event I have stored (lifted out of a .avro file):
{
    "EventId": "51e3610f-8520-406d-8736-45f382bc5110",
    "EventName": "ReceiveCustomerReview",
    "ReceivedAt": "0001-01-01T00:00:00",
    "Client": 1,
    "customerGuid": "x45y57x2-5dcc-45c4-86c5-78942db363w1",
    "Payload": {
        "stars": 5,
        "comment": "OMG..... Beautiful product",
        "ClientId": 1
    }
}

Stream Analytics has a new feature that lets you partition output to Blob storage by any attribute or field of your choice. That, combined with a simple SQL query, makes this very straightforward.
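As a minimal sketch (the [eventhub-input] and [blob-output] aliases and the path pattern are assumptions to adapt, based on the sample event above): configure the job's blob output with a path pattern such as customers/{customerGuid}/{date}, then pass the events through with a query like:

-- Copy every captured event to the blob output; the {customerGuid}
-- token in the output's path pattern creates one folder per customer.
SELECT
    EventId,
    EventName,
    ReceivedAt,
    customerGuid,
    Payload
INTO
    [blob-output]
FROM
    [eventhub-input]

Your admin application can then list the blobs under customers/<customerGuid>/ to gather every event captured for that customer.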

Related

How to setup Azure Event Grid for Azure Data Factory triggers?

I am checking how Azure Data Factory (ADF) can be triggered by Event Grid. I have created an Event Grid in the same resource group as my data factory. From ADF it is easy to connect to the Event Grid topic. However, from the Event Grid side I don't know which "Endpoint type" I need to choose, since ADF is not one of the available options.
When I choose web hook, it requires an endpoint. If I use the ADF URL, it fails to create the event subscription with the error: "Deployment has failed with the following error: {"code":"Url validation","message":"Webhook validation handshake failed for https://adf.azure.com/en/authoring/pipeline/pipeline1.". This is kind of expected, but I am still confused about how I should set up my Event Grid subscription. Which of the endpoint types should I choose?
From the ADF side, I can choose Event Grid for a custom trigger. (I also created an Event Grid Topic from the portal alongside the Event Grid service; however, I am not sure whether these two are different services!)
I can make a custom trigger, but the problem is on the Event Grid side: how do I create a subscription that sends events to ADF? Also, in the trigger on the ADF side, what should the "Event type" be? Is just a name enough?
One other thing: after I create the trigger on the ADF side, when I open it again it goes back to the "enter manually" option and the Event Grid topic disappears; I am not sure why.
You must be able to perform the Microsoft.EventGrid/eventSubscriptions/* action. This action is part of the EventGrid EventSubscription Contributor built-in role.
Prerequisite -
Data Factory expects events to follow the Event Grid event schema. Make sure event payloads have the following fields:
[
    {
        "topic": string,
        "subject": string,
        "id": string,
        "eventType": string,
        "eventTime": string,
        "data": {
            object-unique-to-each-publisher
        },
        "dataVersion": string,
        "metadataVersion": string
    }
]
Follow these steps:
Go to Azure Data Factory and sign in.
Switch to the Edit tab. Look for the pencil icon.
Select Trigger on the menu and then select New/Edit.
On the Add Triggers page, select Choose trigger, and then select +New.
Select Custom events for Type.
Select your custom topic from the Azure subscription dropdown or manually enter the event topic scope.
The Subject begins with and Subject ends with properties allow you to filter for trigger events. Both properties are optional.
Use + New to add event types to filter on. The list of custom event triggers uses an OR relationship: when a custom event arrives whose eventType property matches one on the list, a pipeline run is triggered. The event type is case insensitive; for example, a trigger can match all copycompleted or copysucceeded events that have a subject beginning with factories.
A custom event trigger can parse and send a custom data payload to your pipeline. You create the pipeline parameters and then fill in the values on the Parameters page. Use the format @triggerBody().event.data.keyName to parse the data payload and pass values to the pipeline parameters (see the sketch after these steps).
After you've entered the parameters, select OK.
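As a sketch of that parameter mapping (the event payload and the fileName parameter are hypothetical, not part of the steps above): suppose the publisher raises the following custom event:

{
    "subject": "factories/copyjob",
    "eventType": "copycompleted",
    "data": {
        "fileName": "sales.csv"
    }
}

On the trigger's Parameters page, a pipeline parameter named fileName would then be given the value @triggerBody().event.data.fileName.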
For more information, refer to the official documentation.

Azure Consumption Usage Details API response has "cost":0 and "effectivePrice":0 for every record

I am using the Azure Consumption Usage API to calculate the cost for a resource over a certain span of time; in the response received, the cost and effectivePrice values are 0 for every record.
Here is my full URL:
GET "https://management.azure.com/subscriptions/{subscriptionsId}/providers/Microsoft.Consumption/usageDetails?$filter=properties/usageStart ge '2019-08-07T00:00:00Z' and properties/usageEnd le '2019-08-09T01:00:00Z'&api-version=2019-05-01"
Here is the response format:
{
    "value": [
        {
            "id": "/subscriptions/{subscriptionsId}/providers/Microsoft.Billing/billingPeriods/20190901/providers/Microsoft.Consumption/usageDetails/######-####-####-####-##########",
            "name": "######-####-####-####-##########",
            "type": "Microsoft.Consumption/usageDetails",
            "tags": null,
            "properties": {
                "billingAccountId": "*******",
                "billingAccountName": "*****************",
                "billingPeriodStartDate": "2019-09-01T00:00:00.0000000Z",
                "billingPeriodEndDate": "2019-09-30T00:00:00.0000000Z",
                "billingProfileId": "******",
                "billingProfileName": "************************",
                "accountOwnerId": "**********",
                "accountName": "*************",
                "subscriptionId": "subscriptionId",
                "subscriptionName": "subscriptionName",
                "date": "2019-09-06T00:00:00.0000000Z",
                "product": "Product Name",
                "partNumber": "******",
                "meterId": "meterId",
                "quantity": 0.004032,
                "effectivePrice": 0,
                "cost": 0,
                "unitPrice": 0.045,
                "billingCurrency": "USD",
                "resourceLocation": "EastUS",
                "consumedService": "microsoft.web",
                "resourceId": "/subscriptions/......",
                "resourceName": "resourceName",
                "invoiceSection": "Unassigned",
                "resourceGroup": "resourceGroupName",
                "offerId": "MS-AZR-0017P",
                "isAzureCreditEligible": true,
                "publisherType": "Azure",
                "chargeType": "Usage",
                "frequency": "UsageBased",
                "meterDetails": null
            }
        }
    ]
}
Cost analysis is disabled by the admin of the subscription; is that why every record in the response has "cost":0 and "effectivePrice":0?
How can I get the cost for a resource using the Consumption API?
Can I use the quantity and unitPrice to get the cost for that particular response record?
I suppose you can't. When you click Cost analysis in the portal, it also calls the Azure Management REST API (which has the prefix https://management.azure.com/).
So if cost analysis is disabled in your subscription, you will not be able to access it by calling the REST API directly, nor through other routes like Azure PowerShell or the CLI, which essentially call the same REST API.
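On the last question: multiplying quantity by unitPrice only gives a rough list-price estimate, since it ignores any negotiated discounts, Azure credits, and amortization (presumably why effectivePrice exists as a separate field). A minimal C# sketch, with a hypothetical record type mirroring the two fields:

// Hypothetical type holding the two fields taken from a usageDetails record.
public record UsageRecord(decimal Quantity, decimal UnitPrice);

public static class CostEstimator
{
    // List-price approximation only; the billed cost can differ once
    // discounts, credits, and amortization are applied.
    public static decimal Estimate(UsageRecord r) => r.Quantity * r.UnitPrice;
}

For the record above, this gives 0.004032 * 0.045 ≈ 0.00018 USD.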

Streaming data consumed from third-party WebHooks to application using an event broker in Azure

I'm currently working on a project where I need to consume a third-party webhook into my application. The issue is that this third-party service doesn't allow me to pick which events to push to my application through the webhook, so I'll have to respond to all of them even though they are mostly useless. If I need to expand my application or divide it into microservices, I would be streaming the same data to all services even if they have different needs. I would also face data loss in case of an issue with my application server.
The solution would be to use an event broker, which would collect all events from the webhook, respond to the provider with a 200 OK status code, and push each event to a specific topic, where it would be stored until all the concerned subscribed services have received the data.
I'm looking for a fully managed service in Azure, so far I've come across Azure Event Grid, Azure Event Hub and Azure Service Bus.
I wanted to know about the feasibility of this scenario, and if I can stream directly from a Webhook to one of these Azure services.
No, as far as I know you cannot stream directly into those services. You will need to set up something that accepts the webhook and sends it on to one of the services listed.
What I would do is create an HTTP-triggered Azure Function; you should be able to configure the webhook to post to the function.
Once you have your function set up, you can add some logic there to route the message to the proper channel based on its content. That could be an application of yours, a Service Bus queue, an Azure Storage queue, or Event Grid. I would not recommend an Event Hub, as it is less suited for this specific purpose. A sketch of such a function follows.
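Here is a minimal sketch of such a function (the webhook-events queue name and the ServiceBusConnection app setting are assumptions to adapt):

using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;

public static class WebhookRelay
{
    // Accepts the third-party webhook POST, acknowledges it with 200 OK,
    // and forwards the raw payload to a Service Bus queue for the
    // subscribed services to consume at their own pace.
    [FunctionName("WebhookRelay")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req,
        [ServiceBus("webhook-events", Connection = "ServiceBusConnection")] IAsyncCollector<string> queue)
    {
        string body = await new StreamReader(req.Body).ReadToEndAsync();
        await queue.AddAsync(body);   // durable hand-off to downstream services
        return new OkResult();        // immediate 200 OK back to the provider
    }
}

Any filtering or routing logic (for example, dropping the event types you don't care about) can go between reading the body and enqueueing it.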
If you can consume the third-party events without guaranteeing their in-order processing, and the webhook payload is an array, Azure Event Grid can consume your third-party webhook directly.
This integration is based on a Custom Topic Endpoint with a CustomInputSchema.
The Custom Topic Endpoint sends a response back to the webhook with one of the following HTTP response codes:
Success: 200 OK
Event data has an incorrect format: 400 Bad Request
Invalid access key: 401 Unauthorized
Incorrect endpoint: 404 Not Found
Array or event exceeds size limits: 413 Payload Too Large
The AEG model distributes an event in a loosely decoupled Pub/Sub manner, with reliable, retried delivery to the subscriber based on its subscriptions. An AEG subscription represents a logical connection between the source of interest and the consumer; it is a set of metadata in which the consumer describes what, where, and how.
Basically, there are two delivery patterns:
Push-PushAck, where the event is pushed to the subscriber's handler for business processing and the result is returned to AEG, e.g. WebHook (Azure Function) and Hybrid Connection.
Push-PullAck, where the event is reliably delivered to the subscriber and the delivery response is returned to AEG; the event must then be pulled from this delivery target for business post-processing, e.g. Service Bus queue, Storage queue, and Event Hubs.
UPDATE:
A custom topic endpoint with a CustomInputSchema can be created, for example, via the REST API.
The following is an example of the PUT request payload:
{
    "location": "westus",
    "properties": {
        "inputSchema": "CustomEventSchema",
        "inputSchemaMapping": {
            "properties": {
                "id": {
                    "sourceField": null
                },
                "topic": {
                    "sourceField": null
                },
                "eventTime": {
                    "sourceField": null
                },
                "eventType": {
                    "sourceField": null,
                    "defaultValue": "notification"
                },
                "subject": {
                    "sourceField": null,
                    "defaultValue": "/webhook/events"
                },
                "dataVersion": {
                    "sourceField": null,
                    "defaultValue": "1.0"
                }
            },
            "inputSchemaMappingType": "Json"
        }
    }
}
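For reference, a payload of this shape is sent with a PUT to the topic's management endpoint; the resource names here are placeholders and the api-version may differ for your environment:

PUT https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroup}/providers/Microsoft.EventGrid/topics/{topicName}?api-version=2019-06-01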
The above CustomInputSchema makes it possible to use any input event schema with this custom topic endpoint, which is a very nice feature of AEG. The "bad news" is that events must be wrapped in an array, even a single event. I hope the AEG team will improve custom and domain topics so that a single event can also be published as a plain JSON object (not inside an array).
To pass the input event schema through the AEG eventing model unchanged, the subscriber (the consumer of the source event interest) must declare DeliverySchema = CustomInputSchema. The default output event schema is the EventGridSchema.
The following examples show an event message published to the custom topic with the above CustomInputSchema, delivered to one subscriber using the CustomInputSchema and to another using the EventGridSchema.
Fire an event to the Custom Topic Endpoint (an array of events, even for a single event):
[
    {
        "abcd": 12345
    }
]
Subscriber with DeliverySchema = CustomInputSchema:
{
    "abcd": 12345
}
Subscriber with DeliverySchema = EventGridSchema (default schema):
{
    "id": "f92a5dbf-d206-4e61-ac1e-7498c543039a",
    "eventTime": "2019-07-14T07:19:00.3969337Z",
    "eventType": "notification",
    "dataVersion": "1.0",
    "metadataVersion": "1",
    "topic": "/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/rk2012/providers/Microsoft.EventGrid/topics/testerTopic8",
    "subject": "/webhook/events",
    "data": {
        "abcd": 12345
    }
}
Note that events can be filtered (selected) for each subscriber based on its subscription properties, in the loosely decoupled Pub/Sub manner. In other words, a subscriber can subscribe to the AEG eventing model at any time via a subscription that declares a specific source of interest (e.g. topic, subject, event data) along with the delivery mechanism, retrying, dead-lettering, and so on.

What is the subscription validation event message schema in azure event grid?

Created a new Azure Event Grid domain with the cloud event schema using the portal.
Created a new webhook endpoint using an Azure Function that can receive both the subscription validation event and event notifications.
Created a new Azure Event Grid topic for the above domain (as part of the following subscription) using the portal.
Created a new Azure Event Grid subscription with the cloud event schema and the above webhook endpoint.
When the subscription was created, the endpoint was invoked by the grid infrastructure with a subscription validation event to verify the webhook endpoint.
To my surprise, the validation event structure (shown below) seemed to conform to the native Event Grid schema and not the cloud event schema:
[
    {
        "id": "6309ef83-117f-47aa-a07c-50f6e71a8ca5",
        "topic": "/subscriptions/13ad1203-e6d5-4076-bf2b-73465865f9f0/resourceGroups/xxxx-sandbox-rg/providers/Microsoft.EventGrid/domains/eg-xxx-test-cloud-domain/topics/eg-xxx-test-cloud-topic",
        "subject": "",
        "data": {
            "validationCode": "391889BB-FCC3-4269-A2BD-0918B5BAB0AE",
            "validationUrl": "https://rp-westus.eventgrid.azure.net/eventsubscriptions/xxxx-subscription-3/validate?id=391889BB-FCC3-4269-A2BD-0918B5BAB0AE&t=2019-01-30T15:45:37.0521594Z&apiVersion=2018-09-15-preview&[Hidden Credential]"
        },
        "eventType": "Microsoft.EventGrid.SubscriptionValidationEvent",
        "eventTime": "2019-01-30T15:45:37.0521594Z",
        "metadataVersion": "1",
        "dataVersion": "2"
    }
]
I expected the following subscription validation event, conforming to the cloud event schema (based on the 0.1 version of the cloud event schema at https://learn.microsoft.com/en-us/azure/event-grid/cloudevents-schema#cloudevent-schema):
{
    "eventID": "6309ef83-117f-47aa-a07c-50f6e71a8ca5",
    "source": "/subscriptions/13ad1203-e6d5-4076-bf2b-73465865f9f0/resourceGroups/xxxx-sandbox-rg/providers/Microsoft.EventGrid/domains/eg-xxx-test-cloud-domain/topics/eg-xxx-test-cloud-topic",
    "data": {
        "validationCode": "391889BB-FCC3-4269-A2BD-0918B5BAB0AE",
        "validationUrl": "https://rp-westus.eventgrid.azure.net/eventsubscriptions/xxxx-subscription-3/validate?id=391889BB-FCC3-4269-A2BD-0918B5BAB0AE&t=2019-01-30T15:45:37.0521594Z&apiVersion=2018-09-15-preview&[Hidden Credential]"
    },
    "eventType": "Microsoft.EventGrid.SubscriptionValidationEvent",
    "eventTime": "2019-01-30T15:45:37.0521594Z",
    "cloudEventsVersion": "0.1",
    "eventTypeVersion": "2"
}
What am I missing?
Basically, the webhook subscriber handles the following two groups of events. The specific event type is stored in the HTTP header aeg-event-type.
Internal events of the Event Grid model, such as the event types SubscriptionValidation and SubscriptionDeletion. The schema for these event types is always the default schema, i.e. the EventGridSchema; in other words, it does not depend on the EventDeliverySchema. IMO, keeping the default schema for internal events makes for strongly typed events, especially when we have a CustomInputSchema.
Source events of interest (topics), which are events defined by an input schema; presently the Event Grid model supports three types: EventGridSchema (default), CloudEventSchema, and CustomInputSchema.
The AEG supports the following input-to-delivery schema mappings:
EventGridSchema to the delivery schemas EventGridSchema and CloudEventSchema
CloudEventSchema to the delivery schema CloudEventSchema only
CustomInputSchema to the delivery schemas EventGridSchema, CloudEventSchema, and CustomInputSchema
For these events, the event type in the header is aeg-event-type=Notification and the schema is based on the subscribed EventDeliverySchema (see the above mappings).
Based on the above, for your scenario you should have separate strongly typed objects for internal events (always in the default EventGridSchema) and for notification events based on the subscribed EventDeliverySchema. A sketch of a handler that makes this split follows.
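A minimal C# sketch of that split (the function name is illustrative): for internal SubscriptionValidation events it completes the handshake by echoing the validationCode back; notifications are left for schema-dependent handling:

using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Newtonsoft.Json.Linq;

public static class GridWebhook
{
    [FunctionName("GridWebhook")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req)
    {
        string eventType = req.Headers["aeg-event-type"];
        string body = await new StreamReader(req.Body).ReadToEndAsync();

        if (eventType == "SubscriptionValidation")
        {
            // Internal event: always EventGridSchema, wrapped in an array.
            var code = JArray.Parse(body)[0]["data"]["validationCode"];
            return new OkObjectResult(new { validationResponse = code });
        }

        // Notification: deserialize according to the subscribed
        // EventDeliverySchema (EventGridSchema, CloudEventSchema,
        // or CustomInputSchema).
        return new OkResult();
    }
}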
The following is an example of the http headers:
aeg-subscription-name=EVENTGRIDSCHEMA
aeg-delivery-count=0
aeg-data-version=
aeg-metadata-version=0
aeg-event-type=SubscriptionValidation
Note that there is only the subscription name available to figure out which EventDeliverySchema has been subscribed. It would be nice to have an additional aeg header, for example aeg-subscription-labels, to pass some subscription metadata to the subscriber handler.
As a workaround, we can pass some values to the subscriber webhook handler via URL query parameters, for instance: &eds=CustomInputSchema
This is a known issue / expected behavior in the Azure Event Grid implementation of the CloudEvents v0.1 spec. At the time the CloudEvents v0.1 spec was implemented in Azure Event Grid, no validation handshake / abuse protection model was defined in the CloudEvents standard, and hence Event Grid's existing validation handshake model/schema was used for CloudEvents subscribers as well.

Building SOAP Listener with Azure Functions

I am using Azure Functions to build some integrations between various systems. A new requirement is to respond to record updates in Salesforce. Some quick research yielded what seems like a good solution from the Salesforce side: use Outbound Messaging, which can send SOAP requests on record modifications.
How to create Salesforce application that will send record to external web service when record created/changed(https://salesforce.stackexchange.com/questions/73425/how-to-create-salesforce-application-that-will-send-record-to-external-web-servi)
The challenge now is to create a SOAP listener in an Azure Function. I have created basic HTTP triggers for my other listeners. Is there anything "built in" to Azure Functions that would allow me to easily consume the incoming SOAP request?
Salesforce has the basics of a solution based on a more traditional web service and an ASMX file, but I am not sure if or how that can be applied to Azure Functions. (https://developer.salesforce.com/docs/atlas.en-us.api.meta/api/sforce_api_om_outboundmessaging_listener.htm)
That notification is just a SOAP request made over HTTP, so it's really not too different from a regular HTTP trigger request.
Although you could just treat that as a plain request and parse the contents yourself, Azure Functions does expose the great WebHook support we get from ASP.NET WebHooks, and luckily, there is a Salesforce receiver that significantly simplifies this task.
DISCLAIMER: It's worth noting that although the receiver is technically enabled in Azure Functions, there's no official support for it yet, so you won't find a lot of documentation and help will be limited to what you get on SO and Forums. Official support to this and other receivers will hopefully be coming soon, which means documentation, templates and UI support will become available.
To get started, you need the following:
Create a new function, selecting the GenericWebHook - CSharp template (this works for Node as well, but I'll focus on C# here).
Follow the steps outlined in the ASP.NET WebHooks integration with Salesforce post in order to create the outbound message. Here you want to use the function URL given to you by the portal WITHOUT THE CODE QUERY STRING (having the code there wouldn't hurt, but the receiver does not use that information).
IMPORTANT: Get your Salesforce Organization ID, which will be used for authentication; it is located under Administer > Company Profile > Company Information > Salesforce.com Organization ID. Back in the Azure Functions portal, open the Keys panel, delete your default function key (not the host key), and create a new key named default (this name is important) using the Organization ID value you got from Salesforce.
Go to Integrate
On the integration page, select Advanced Editor on the upper right (as mentioned, there's no official support, so the UI does not expose this. We're putting our explorer hats on and venturing into a more advanced workflow here :) )
Change the webHookType property value to sfsoap and save the configuration. Your function.json config should look like the following:
function.json:
{
    "bindings": [
        {
            "type": "httpTrigger",
            "direction": "in",
            "webHookType": "sfsoap",
            "name": "req"
        },
        {
            "type": "http",
            "direction": "out",
            "name": "res"
        }
    ],
    "disabled": false
}
Switch to the Develop tab. We're ready to write our code.
This is where the ASP.NET WebHooks receiver shines! It will parse the notification for you, exposing strongly typed objects you can work with. All you need to do is modify the method/function signature you get with the template to use the SalesforceNotifications type, making sure you're referencing the required assembly (Microsoft.AspNet.WebHooks.Receivers.Salesforce, which is made available to you, so there is no need for a package reference) and the namespace (Microsoft.AspNet.WebHooks).
Here is a full sample of a function that receives the request, logs the Organization ID and Action ID, grabs the first notification, and logs all of its properties:
#r "Microsoft.AspNet.WebHooks.Receivers.Salesforce"
using Microsoft.AspNet.WebHooks;
public static void Run(SalesforceNotifications req, TraceWriter log)
{
log.Info($"Salesforce webhook was triggered!");
log.Info(req.OrganizationId);
log.Info(req.ActionId);
var notification = req.Notifications.First();
foreach (var p in notification.Keys)
{
log.Info($"{p} = {notification[p]}");
}
}
This process will be a lot smoother when the receiver is officially supported, but even with the added steps, this still beats having to parse the SOAP messages yourself :)
I hope this helps!
