Azure Data Explorer Data Connection Automation

Is there any way to automate the creation of an Azure Data Explorer Data Connection?
I want to create it as part of an automated deployment so either ARM or through C#. The Data Connection source is an EventHub and needs to include the properties specifying the table, consumer group, mapping name and data format.
I have tried creating a resource manually and exporting the template, but it doesn't work. I have also looked through the Microsoft online documentation and cannot find a working example.
This is all I have found: example

Please take a look at this good example, which shows how to create the control plane resources (cluster, database, data connection) using ARM templates, and how to use the data plane Python API for the data plane resources (table, mapping).
In addition, for C# please see the docs here and the following C# example of how to create an Event Hub data connection:
var dataConnection = managementClient.DataConnections.CreateOrUpdate(
    resourceGroup, clusterName, databaseName, dataConnectionName,
    new EventHubDataConnection(eventHubResourceId, consumerGroup, location: location));
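Since the question also asks for the table, mapping name and data format, a slightly fuller sketch might look like this (assuming the Microsoft.Azure.Management.Kusto models, where EventHubDataConnection exposes TableName, MappingRuleName and DataFormat; all values below are placeholders):

var connection = new EventHubDataConnection(eventHubResourceId, consumerGroup, location: location)
{
    TableName = "MyTable",              // target table in the Kusto database (placeholder)
    MappingRuleName = "MyTableMapping", // ingestion mapping created on that table (placeholder)
    DataFormat = "JSON"                 // format of the events on the hub (placeholder)
};

var dataConnection = managementClient.DataConnections.CreateOrUpdate(
    resourceGroup, clusterName, databaseName, dataConnectionName, connection);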

I've actually just finished building and pushing an Azure Sample that does this (see the deployment template and script in the repo).
Unfortunately, as I elected not to use the Azure CLI (and stick with pure Azure PowerShell), I wasn't able to fully automate this, but you can at least see the approach I took.
I've filed feedback with the product group here on UserVoice.

Related

Azure Event Grid Custom Topic Triggering on Insert to Azure SQL table

I am trying to use Event Grid to kick off an Azure Data Factory pipeline when a new record is inserted into an Azure SQL Database table, but I'm lost right at the start.
When creating the new subscription, I think I would choose Custom Input Schema, but I'm not sure where to even start or where to get the "Event Type" from. Is there a list of types somewhere? Is this in the documentation for Azure SQL or Event Grid?
What is the right event type? Any help would be appreciated.
Reference: https://learn.microsoft.com/en-us/azure/event-grid/event-sources
NOTE: I cannot use Logic Apps for this as that has not been approved by our Azure architecture team. I say this because Logic Apps SQL connector now allows for a trigger based on SQL table insert -- no matter though, because I cannot use Logic Apps :(
At this moment, SQL Database doesn't publish events to Event Grid, so you can't use this approach.
You can change your code to publish a custom event to Event Grid right after the insert on SQL, or switch to Cosmos DB, which offers the Change Feed (which you can subscribe to and react on).
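For the first option, a minimal sketch of publishing a custom event right after the insert, assuming the Azure.Messaging.EventGrid package (the topic endpoint, key and event type are placeholders for your own custom topic):

using System;
using Azure;
using Azure.Messaging.EventGrid;

// Placeholders: use your custom topic's endpoint and access key.
var client = new EventGridPublisherClient(
    new Uri("https://my-topic.westeurope-1.eventgrid.azure.net/api/events"),
    new AzureKeyCredential("<topic-key>"));

// Publish right after the SQL insert succeeds; the event type is any string
// you define for your custom topic, e.g. "MyApp.Orders.RowInserted".
await client.SendEventAsync(new EventGridEvent(
    subject: "orders/12345",
    eventType: "MyApp.Orders.RowInserted",
    dataVersion: "1.0",
    data: new { OrderId = 12345 }));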
Yes, at the moment there is no event sync integration with Azure Event Grid; however, for the purpose of exploring other avenues, you might find Debezium a good option for syncing most data sources into Kafka or other streams with a little custom code.
Reference: SQL Server Debezium Connector
Note: a few of the connectors are still in a testing phase, and writing or customizing a connector takes some time, but it can be done.
I have found this technology most useful for integrating or migrating complex distributed systems with legacy components working together.
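For reference, registering the Debezium SQL Server connector comes down to posting a small JSON configuration to the Kafka Connect REST API. A rough, abbreviated sketch from C# (the exact property set depends on the Debezium version, and all host names, credentials and table names below are placeholders):

using System.Net.Http;
using System.Text;

// Illustrative only: Debezium property names differ between versions, and the
// connector typically needs a few more settings (e.g. a schema history topic).
var connectorConfig = @"{
  ""name"": ""sqlserver-orders-connector"",
  ""config"": {
    ""connector.class"": ""io.debezium.connector.sqlserver.SqlServerConnector"",
    ""database.hostname"": ""sqlserver.internal"",
    ""database.port"": ""1433"",
    ""database.user"": ""debezium"",
    ""database.password"": ""<secret>"",
    ""table.include.list"": ""dbo.Orders"",
    ""topic.prefix"": ""orders""
  }
}";

using var http = new HttpClient();
// Kafka Connect exposes a REST endpoint (default port 8083) for managing connectors.
await http.PostAsync(
    "http://kafka-connect.internal:8083/connectors",
    new StringContent(connectorConfig, Encoding.UTF8, "application/json"));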

Proper way to maintain azure resource manager template

I have an ARM template to recreate a resource group with resources and their settings. This works fine.
Use case:
A developer goes to the Azure portal and updates some settings for some resource. Is there a way to get the exact changes so they can be applied to my template (i.e. update the template in source control)?
If I go to the Automation script blade in the resource group I can see all resources, but my template in source control is structured differently (parameters, conditions, variables, multiple templates linked together ...). I can't see at a glance what changes were made and I can't use any diff.
Maybe I missed something completely, but how are you solving this issue?
Thanks.
It is not easy to see changes to resources by comparing templates from within the portal. Best practice is to always use ARM templates (and CI/CD pipelines) to provision resources, and to keep those templates under source control so changes are tracked there.
Further than that, I think you have two main options to track these changes:
1) You can use the Azure Activity Log to track the changes. The Azure Activity Log is a subscription log that provides insight into subscription-level events that have occurred in Azure. This includes a range of data, from Azure Resource Manager operational data to updates on Service Health events.
2) Write a little intelligent code against the Management Plane API. A good starting point is https://resources.azure.com/subscriptions. You could write a small extract that pulls all your resources out daily and commits them to a git repo. Since a commit only appears when something changed, you can then analyse the delta as and when you need (see the sketch below).
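A minimal sketch of that second option, assuming the Azure.ResourceManager SDK (the output file name is a placeholder), could be a small scheduled job whose output you commit to git:

using System.IO;
using Azure.Identity;
using Azure.ResourceManager;

// Sketch: enumerate every resource in the subscription and dump a simple
// inventory to disk; a scheduled job can then commit this file to a git repo.
var armClient = new ArmClient(new DefaultAzureCredential());
var subscription = await armClient.GetDefaultSubscriptionAsync();

using var writer = new StreamWriter("resource-inventory.txt");
foreach (var resource in subscription.GetGenericResources())
{
    // Id, name and type are enough to spot additions and removals; for full
    // settings you would export each resource's template instead.
    await writer.WriteLineAsync($"{resource.Data.ResourceType}\t{resource.Data.Name}\t{resource.Data.Id}");
}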
Conceptually, a developer should never "go to the Azure portal and update some settings for some resource", except for his own development / unit-testing work. He should then produce an updated ARM template for deployment to the TST etc. environments, and document his unit-tested changes with the new template. If his update collides with your resources in TST he will probably come to you to explain his changes and discuss the resolution.

Use parameters in place of linked service name in Azure Data Factory V2

My question is slightly similar to this question however adequately different to merit a new thread.
I have a requirement to extract data from a number of different on-premises SQL Server instances over the internet. I am using Azure Data Factory v2 and the Integration Runtime to access data from these servers.
The problem is that I will have many pipelines to manage and update. I want to have a single Data Factory process which uses parameters for linked service names.
Is it possible to have 1 pipeline which uses a parameter to reference a linked service name which is updated before re-executing the pipeline?
I am struggling to find a useful article on how this can be achieved.
The linked service reference name can't be parameterized. But making linked services support parameters is on the roadmap, as the post you mentioned says.

Disable / suspend Azure Time Series Insight

Since the pricing does not offer much flexibility, my developer MSDN account is quickly running out of credits using Azure Time Series Insights for a Proof of Concept. Is it somehow possible to suspend the service so no costs are incurred? I would hate to have to delete the whole thing and set it up again when we resume work on the PoC.
Currently, Azure still does not provide a way to suspend a TSI environment.
Maybe you can use scripted template deployment for creating/deleting the TSI environment.
With this approach, however, you are going to constantly lose your data.
On the link below there are guidelines, provided by Microsoft, on how to implement template deployment:
https://learn.microsoft.com/en-us/azure/time-series-insights/time-series-insights-manage-resources-using-azure-resource-manager-template
The general steps provided by MSFT are:
Install PowerShell
Create the template and a parameter file.
In PowerShell, log in to your Azure account.
Create a new resource group if one does not exist.
Test the deployment.
Deploy the template.
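If you'd rather drive the same create/delete cycle from C# instead of PowerShell, a minimal sketch using the Azure.ResourceManager SDK might look like this (resource group, file name, parameter names and deployment name are placeholders for the values from your own template):

using System;
using System.IO;
using Azure;
using Azure.Identity;
using Azure.ResourceManager;
using Azure.ResourceManager.Resources.Models;

// Sketch only: all names below are placeholders.
var armClient = new ArmClient(new DefaultAzureCredential());
var subscription = await armClient.GetDefaultSubscriptionAsync();
var resourceGroup = (await subscription.GetResourceGroupAsync("tsi-poc-rg")).Value;

var deploymentContent = new ArmDeploymentContent(
    new ArmDeploymentProperties(ArmDeploymentMode.Incremental)
    {
        Template = BinaryData.FromString(File.ReadAllText("tsi-template.json")),
        // Parameters take the bare parameters object, not the full parameter file.
        Parameters = BinaryData.FromObjectAsJson(new
        {
            environmentName = new { value = "tsi-poc" }
        })
    });

// Deleting the resource group (or the TSI resource inside it) removes the environment again.
await resourceGroup.GetArmDeployments().CreateOrUpdateAsync(
    WaitUntil.Completed, "tsi-environment-deployment", deploymentContent);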

Deploy ServiceBus with multiple Topics and Queues at once

I'm fairly new to using Service Bus and other Azure features. After creating a Service Bus manually in the Azure portal, I'm trying to figure out how this can be achieved automatically. After a while of reading, I thought that using Azure Resource Manager should be the way to go. Deploying just one topic is no big deal, but I can't find an example that shows how to deploy multiple topics and queues at once. Or am I on the wrong approach?
Thanks for your answers!
Helmut
What we do (and I saw other teams doing the same) is simple: when your producer/consumer application starts, it checks whether the required queues/topics/subscriptions exist, and creates them if they don't.
So we create all Service Bus entities from C# code, which also gives the full flexibility for options.
Code sample:
// NamespaceManager comes from the Microsoft.ServiceBus package.
var namespaceManager = NamespaceManager.CreateFromConnectionString(connectionString);

// Create the topic and its subscription only if they don't exist yet.
if (!namespaceManager.TopicExists(topicName))
{
    namespaceManager.CreateTopic(new TopicDescription(topicName));
    namespaceManager.CreateSubscription(
        new SubscriptionDescription(topicName, subscriptionName));
}
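Queues follow the same idempotent pattern (queueName here is a placeholder):

// Same existence check, this time for a queue.
if (!namespaceManager.QueueExists(queueName))
{
    namespaceManager.CreateQueue(new QueueDescription(queueName));
}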
That's not to say your ARM approach is wrong or bad, just to give a simple alternative.
In the new Azure portal (here), there is an Automation script feature.
I've created a new resource group with a service bus namespace that contains 2 topics and 1 queue:
You can see in the left panel that there is an Automation script option.
In this section, you can find a template that represents the resources you've created manually. You can then use this template to automate your deployment to other environments.
See also Deploy resources with Resource Manager templates and Azure PowerShell
