Azure Event Grid Custom Topic Triggering on Insert to Azure SQL table

I am trying to use Event Grid to kick off an Azure Data Factory pipeline when a new record is inserted into an Azure SQL Database table, but I'm lost at the very start of things.
When creating the new event subscription, I think I would choose Custom Input Schema, but I'm not sure where the "Event Type" value is supposed to come from. Is there a list of types somewhere? Is this in the documentation for Azure SQL or for Event Grid?
What is the right event type? Any help would be appreciated.
Reference: https://learn.microsoft.com/en-us/azure/event-grid/event-sources
NOTE: I cannot use Logic Apps for this, as that has not been approved by our Azure architecture team. I mention this because the Logic Apps SQL connector now allows a trigger based on SQL table inserts. No matter though, because I cannot use Logic Apps :(

At the moment, Azure SQL Database doesn't publish events to Event Grid, so you can't use this approach directly.
You can change your code to publish a custom event to Event Grid right after the insert into SQL, or switch to Cosmos DB, which offers a Change Feed you can subscribe to and react to.
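With the Azure.Messaging.EventGrid client library, publishing a custom event right after a successful insert could look roughly like the sketch below. The topic endpoint, key, subject, and event type are all placeholders; for a custom topic you define the event type yourself, which is why there is no list to pick from.

using System;
using Azure;
using Azure.Messaging.EventGrid;

// Endpoint and access key of your custom topic (placeholders, from the portal).
var client = new EventGridPublisherClient(
    new Uri("https://<your-topic>.<region>-1.eventgrid.azure.net/api/events"),
    new AzureKeyCredential("<topic-key>"));

// After the SQL INSERT succeeds, publish a custom event describing the new row.
// For custom topics the event type is your own naming convention.
// (This call must run inside an async method.)
await client.SendEventAsync(new EventGridEvent(
    subject: "myapp/orders",
    eventType: "MyApp.Orders.RowInserted",
    dataVersion: "1.0",
    data: new { OrderId = 42 }));

An Event Grid subscription on the custom topic can then filter on that event type and hand the event to whatever kicks off the Data Factory pipeline.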

Yes, at the moment there is no event integration from Azure SQL into Event Grid. However, if you want to explore other avenues, Debezium can sync most data sources into Kafka or other streams with little custom code.
Reference: SQL Server Debezium Connector
Note: a few of the connectors are still in a testing phase, and writing or customizing a connector takes some time, but it can be done.
I have found this technology most useful for integrating or migrating complex distributed systems with legacy components working together.
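As a rough illustration, registering Debezium's SQL Server connector with Kafka Connect takes a JSON configuration along these lines. All hostnames, credentials, and table names here are placeholders, and exact property names vary between Debezium versions:

{
  "name": "sqlserver-orders-connector",
  "config": {
    "connector.class": "io.debezium.connector.sqlserver.SqlServerConnector",
    "database.hostname": "<sql-host>",
    "database.port": "1433",
    "database.user": "<user>",
    "database.password": "<password>",
    "database.dbname": "MyDatabase",
    "database.server.name": "myserver",
    "table.include.list": "dbo.Orders",
    "database.history.kafka.bootstrap.servers": "<kafka-broker>:9092",
    "database.history.kafka.topic": "schema-changes.mydatabase"
  }
}

Note that this connector relies on SQL Server's Change Data Capture feature, so check that your flavor of SQL supports CDC before going down this road.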

Related

Modifying Azure Analysis Services roles using a Logic App

With Azure Data Factory I have built a pipeline to orchestrate the processing of my Azure Analysis Services model through a dedicated Logic App, as explained in this article, and it works properly.
Now, still using Azure Data Factory (through the Logic App), I would also like to update the list of users in a specific role.
In the article mentioned above, to process the Azure Analysis Services models, the Logic App calls a specific API with the following format:
https://<rollout>.asazure.windows.net/servers/<serverName>/models/<resource>/refreshes
but this API doesn't seem to work for updating the model's roles.
Does anyone know the correct method for updating model roles from a Logic App?
Thanks for any suggestions.
If you don't necessarily need to use the Logic App for this, it might be possible using Azure Automation and the PowerShell cmdlets for managing Azure Analysis Services:
https://learn.microsoft.com/en-us/azure/analysis-services/analysis-services-refresh-azure-automation
https://learn.microsoft.com/en-us/azure/analysis-services/analysis-services-powershell
https://learn.microsoft.com/en-us/powershell/module/sqlserver/Add-RoleMember?view=sqlserver-ps
One alternative approach might be to have fixed AD groups as members of the tabular model roles and add or remove members from those AD groups. The tabular model roles would then not need to be refreshed; it would simply be a matter of adding or removing members from the AD groups as part of your governance process.
A second approach would be to use dynamic row-level security. Adding records to an Azure SQL DB table is perfectly possible with Logic Apps and could be used to drive security, depending on your requirements. You can then refresh your security dimension with the Logic App. See here for more details:
https://learn.microsoft.com/en-us/power-bi/desktop-tutorial-row-level-security-onprem-ssas-tabular
To answer your question, however: the Azure Analysis Services REST API is useful but not fully featured, i.e. it does not cover all possible operations for tabular models or for the service. Another missing example I found was backups: although it is possible to trigger a pause or resume of the service, it is not possible to trigger a backup of a tabular model via the REST API. I do not believe it is possible to alter role members, or at least the operation is not listed in the REST API, although I am happy to be corrected if I am wrong. To be more specific, Roles is not mentioned in the list of available objects which can be passed into the Objects array of POST /refreshes; table and partition are the only object types I'm aware of.
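For reference, the documented refresh request body only takes table and partition entries in the Objects array, roughly like this (names are placeholders):

{
  "Type": "Full",
  "CommitMode": "transactional",
  "MaxParallelism": 2,
  "RetryCount": 2,
  "Objects": [
    {
      "table": "DimCustomer",
      "partition": "DimCustomer2019"
    }
  ]
}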
There are also no examples on the MS github site:
https://github.com/microsoft/Analysis-Services
Finally, consider calling TMSL via PowerShell in an Azure Function, which you can call from Azure Data Factory.
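For example, a TMSL script that overwrites a role's membership could look roughly like the following (database, role, and member names are placeholders), sent with the Invoke-ASCmd cmdlet from the SqlServer PowerShell module:

{
  "createOrReplace": {
    "object": {
      "database": "MyModel",
      "role": "Readers"
    },
    "role": {
      "name": "Readers",
      "modelPermission": "read",
      "members": [
        {
          "memberName": "user@contoso.com",
          "identityProvider": "AzureAD"
        }
      ]
    }
  }
}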
HTH

Azure Data Explorer Data Connection Automation

Is there any way to automate the creation of an Azure Data Explorer data connection?
I want to create it as part of an automated deployment, so either via ARM or through C#. The data connection source is an Event Hub, and it needs to include the properties specifying the table, consumer group, mapping name and data format.
I have tried creating the resource manually and exporting the template, but that doesn't work. I have also looked through the Microsoft online documentation and cannot find a working example.
This is all I have found: example
Please take a look at this good example, which shows how to create the control plane resources (cluster, database, data connection) using ARM templates, and how to use the data plane Python API for the data plane resources (table, mapping).
In addition, for C# please see the docs here and the following C# example of how to create an Event Hub data connection:
var dataConnection = managementClient.DataConnections.CreateOrUpdate(
    resourceGroup, clusterName, databaseName, dataConnectionName,
    new EventHubDataConnection(eventHubResourceId, consumerGroup, location: location));
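A slightly fuller sketch, assuming the Microsoft.Azure.Management.Kusto package and already-obtained ServiceClientCredentials; all names and IDs are placeholders, and the exact parameter names may differ between SDK versions:

using Microsoft.Azure.Management.Kusto;
using Microsoft.Azure.Management.Kusto.Models;

var managementClient = new KustoManagementClient(credentials)
{
    SubscriptionId = "<subscription-id>"
};

// Table, consumer group, mapping name and data format all go on the connection.
var dataConnection = managementClient.DataConnections.CreateOrUpdate(
    "<resource-group>", "<cluster-name>", "<database-name>", "<connection-name>",
    new EventHubDataConnection(
        eventHubResourceId: "/subscriptions/<sub>/resourceGroups/<rg>/providers/Microsoft.EventHub/namespaces/<ns>/eventhubs/<hub>",
        consumerGroup: "$Default",
        tableName: "MyTable",
        mappingRuleName: "MyTableMapping",
        dataFormat: "JSON",
        location: "West Europe"));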
I've actually just finished building and pushing an Azure Sample that does this (see the deployment template and script in the repo).
Unfortunately, as I elected not to use the Azure CLI (and to stick with pure Azure PowerShell), I wasn't able to fully automate this, but you can at least see the approach I took.
I've filed feedback with the product group here on UserVoice

Is there a way to trigger a Logic App on a deletion of a record in an Azure SQL table?

I've checked the SQL connector and there are only "When an item is created" and "When an item is modified" triggers, which give me the C and U in CRUD, but sadly there isn't an out-of-the-box trigger for the D.
I can think of some awful polling-based ways to detect record deletions, but I'm hoping some bright person has come up with a cleaner solution; so far I've had no joy with my Google searching.
I would look at the Azure Event Grid. Azure Event Grid allows you to easily build applications with event-based architectures. First, select the Azure resource you would like to subscribe to, and then give the event handler or WebHook endpoint to send the event to. Event Grid has built-in support for events coming from Azure services, like storage blobs and resource groups. Event Grid also has support for your own events, using custom topics.
I would suggest monitoring the resource group and triggering off deletions within it. There is a tutorial that shows the same concept with a VM, but you should be able to modify it to meet your needs for an Azure SQL DB:
https://learn.microsoft.com/en-us/azure/event-grid/monitor-virtual-machine-changes-event-grid-logic-app
I added an ON DELETE trigger which adds the id of the deleted record to a secondary table, and I have the Logic App look for modifications on the secondary table. A minimal sketch of the trigger follows.
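A minimal sketch, assuming a hypothetical dbo.Orders table (the identity column on the secondary table is there because the connector's "When an item is created" trigger needs one):

-- Secondary table that the Logic App watches.
CREATE TABLE dbo.DeletedOrders (
    Id        INT IDENTITY(1,1) PRIMARY KEY,
    OrderId   INT NOT NULL,
    DeletedAt DATETIME2 NOT NULL DEFAULT SYSUTCDATETIME()
);
GO

CREATE TRIGGER dbo.trg_Orders_AfterDelete
ON dbo.Orders
AFTER DELETE
AS
BEGIN
    SET NOCOUNT ON;
    -- "deleted" is the pseudo-table holding the rows that were removed.
    INSERT INTO dbo.DeletedOrders (OrderId)
    SELECT Id FROM deleted;
END;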

How to guarantee at least once delivery with Azure Function with Cosmos DB trigger

I have a Cosmos DB trigger for an Azure function. I want to flatten and write some data from the incoming Document(s) to an (Azure) SQL Server.
What is a way to guarantee at least once delivery?
I looked at https://hackernoon.com/reliable-event-processing-in-azure-functions-37054dc2d0fc which gives some options in the case of an Azure Function triggered by an Event Hub event, but I am not sure if the same applies for the CosmosDB changefeed that causes the trigger to fire.
On the Cosmos DB Change Feed site https://learn.microsoft.com/en-us/azure/cosmos-db/change-feed it states:
Each change to a document appears exactly once in the change feed, and clients manage their checkpointing logic. The change feed processor library provides automatic checkpointing and "at least once" semantics.
Does that mean it implements the same checkpointing system as Event Hub (or something similar to it)?
Does the circuit breaker pattern described at the end of https://hackernoon.com/reliable-event-processing-in-azure-functions-37054dc2d0fc work the same way when applied to this flow from a Cosmos DB trigger to an Azure Function?
The Azure Functions Cosmos DB trigger is based on the Change Feed Processor library, so you get at-least-once delivery out of the box.
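A minimal sketch of such a function (C#, Functions v2-style; database, collection, and connection setting names are placeholders). Because delivery is at-least-once, the SQL write should be idempotent, e.g. an upsert keyed on the document id:

using System.Collections.Generic;
using Microsoft.Azure.Documents;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class CosmosToSql
{
    [FunctionName("CosmosToSql")]
    public static void Run(
        [CosmosDBTrigger(
            databaseName: "MyDatabase",
            collectionName: "MyCollection",
            ConnectionStringSetting = "CosmosConnection",
            LeaseCollectionName = "leases",
            CreateLeaseCollectionIfNotExists = true)]
        IReadOnlyList<Document> documents,
        ILogger log)
    {
        foreach (var doc in documents)
        {
            // Flatten the document and UPSERT (e.g. a T-SQL MERGE keyed on doc.Id)
            // into SQL so a redelivered batch doesn't produce duplicate rows.
            log.LogInformation("Processing document {Id}", doc.Id);
        }
    }
}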

Which Azure service to use for processing data from Event Hub?

I would appreciate some help picking out the best suited Azure services for my scenario - I am just beginning with Azure services and my knowledge is pretty limited.
I have data from multiple sources, and of different shapes, coming into an Event Hub. I need to subscribe to the events from the Event Hub and, based on their format, process them and ultimately save them into an SQL Database. All components - events consumers, the SQL Database - need to be hosted in the cloud.
How would I implement this in an "Azure Orientated Architecture"?
In an off cloud application, I would have competing consumers subscribing to the Event Hub. They would be some console applications or Windows services, and each would be processing the events asynchronously (this is further simplified by the event processing being idempotent).
Ideally, the Azure equivalent of the above consumers would scale up and down automatically, so I would like to avoid hosting console applications on VMs (where I would need to keep an eye on the VMs' resources myself). Scaling- and deployment-wise they would have to behave like App Services; however, I'm under the impression that those are just for web applications. I've also briefly looked at WebJobs, but those seem to poll data at various intervals, whereas I need a proper event subscriber that the Event Hub pushes data into.
Any help will be greatly appreciated!
Thank you.
Later Edit:
I've looked into WebJobs and they do allow continuous processing, so it looks like they can be used as automatically scaling subscribers.
Ideally I would like to write the code for the subscribers in F#. C# is the other option if that is not available.
You can see my post regarding IoT Hub; it's basically the same for Event Hub (each of the examples in the post can be used with Event Hubs).
https://stackoverflow.com/a/38682324/6659347
In addition, for Event Hub you can also use an Azure Function with an Event Hub trigger: a function that runs whenever the event hub receives a new event. That also answers your scaling requirement.
Make sure that, if you are working with multiple consumers, you use Event Hub consumer groups so each consumer can read the stream independently. A minimal sketch follows.
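A small sketch of such a function (C#; the hub name, connection setting, and consumer group are placeholders):

using System.Text;
using Microsoft.Azure.EventHubs;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class ProcessEvents
{
    [FunctionName("ProcessEvents")]
    public static void Run(
        [EventHubTrigger(
            "my-event-hub",
            Connection = "EventHubConnection",
            ConsumerGroup = "sql-writer")]
        EventData[] events,
        ILogger log)
    {
        foreach (var e in events)
        {
            // Inspect the payload shape, transform it, then save to SQL Database.
            var body = Encoding.UTF8.GetString(e.Body.Array, e.Body.Offset, e.Body.Count);
            log.LogInformation("Received: {Body}", body);
        }
    }
}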
I'd say use a WebJob in combination with an EventProcessor. I wrote some demo code that can easily be transferred to a WebJob: https://github.com/DeHeerSoftware/SemanticLogging.EventHub/tree/master/SemanticLogging.EventHub.Processor
See https://azure.microsoft.com/en-us/documentation/articles/event-hubs-csharp-ephcs-getstarted/#receive-messages-with-eventprocessorhost for official documentation.
I've created a WebJob myself using this approach. Works like a charm.
