Is it possible to create a custom JSON payload for metric alerts, the way we can for log alerts and Log Analytics alerts? If so, is there any documentation I can refer to for creating alerts with a custom JSON payload?
Yes, it is possible to take a pre-written JSON definition and adapt it to your context for metric alerts as well. You can then deploy the JSON as an ARM template using any deployment method.
There are sample ARM templates for both static and dynamic thresholds in the Microsoft docs for reference:
https://learn.microsoft.com/en-us/azure/azure-monitor/platform/alerts-metric-logs#configuring-metric-alert-for-logs
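To make this concrete, below is a minimal C# sketch that pushes such a custom JSON payload with a plain PUT against the ARM REST API for Microsoft.Insights/metricAlerts (api-version 2018-03-01). The subscription, resource group, target resource id, metric name, and the bearer token are all placeholders you would replace with your own values; the payload shape follows the documented static-threshold schema.

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class MetricAlertDeployer
{
    // Placeholders: substitute your own subscription, resource group, and target resource id.
    const string SubscriptionId = "<subscription-id>";
    const string ResourceGroup  = "<resource-group>";
    const string TargetResource = "<full-resource-id-of-the-monitored-resource>";

    static async Task Main()
    {
        // Custom JSON payload for a static-threshold metric alert.
        string payload = @"{
          ""location"": ""global"",
          ""properties"": {
            ""description"": ""Alert when average response time exceeds 5 seconds"",
            ""severity"": 3,
            ""enabled"": true,
            ""scopes"": [""" + TargetResource + @"""],
            ""evaluationFrequency"": ""PT1M"",
            ""windowSize"": ""PT5M"",
            ""criteria"": {
              ""odata.type"": ""Microsoft.Azure.Monitor.SingleResourceMultipleMetricCriteria"",
              ""allOf"": [{
                ""criterionType"": ""StaticThresholdCriterion"",
                ""name"": ""HighResponseTime"",
                ""metricName"": ""HttpResponseTime"",
                ""operator"": ""GreaterThan"",
                ""threshold"": 5,
                ""timeAggregation"": ""Average""
              }]
            },
            ""actions"": []
          }
        }";

        string url = $"https://management.azure.com/subscriptions/{SubscriptionId}" +
                     $"/resourceGroups/{ResourceGroup}/providers/Microsoft.Insights" +
                     "/metricAlerts/my-metric-alert?api-version=2018-03-01";

        using var client = new HttpClient();
        // Acquire an ARM access token however you normally do (e.g. DefaultAzureCredential).
        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", "<arm-access-token>");
        var response = await client.PutAsync(url, new StringContent(payload, Encoding.UTF8, "application/json"));
        Console.WriteLine(response.StatusCode);
    }
}

The same JSON body also works as the properties block of an ARM template resource, which is what the linked samples show.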
Let's say I have a custom metric in Azure Application Insights. The custom metric does not seem to exist until it receives log/data containing the metric. So when I try to deploy alert rules based on these custom metrics, the Bicep deployment fails, saying the metric does not exist.
Is there a workaround to this issue? I would prefer not to send dummy data just to create the custom metric first. Or is there something I'm missing or doing wrong?
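Not a confirmed fix, but one documented knob worth checking: each metric alert criterion supports a skipMetricValidation property, intended exactly for creating a rule on a custom metric that is not emitted yet. A hypothetical C# fragment of such a criterion (the metric name and namespace are placeholders; in a Bicep criteria object the property name is the same):

// skipMetricValidation lets the alert rule be created before the custom metric exists.
var criterion = new
{
    criterionType = "StaticThresholdCriterion",
    name = "CustomMetricHigh",
    metricName = "MyCustomMetric",                  // placeholder custom metric name
    metricNamespace = "azure.applicationinsights",  // App Insights custom metrics namespace
    @operator = "GreaterThan",
    threshold = 100,
    timeAggregation = "Average",
    skipMetricValidation = true
};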
We need to log custom messages into Azure Monitor or AppInsights from Data Factory, including from within pipelines. For example, a pipeline runs validation and needs to log a validation error for a file whose file name exceeds the maximum length, then carry on running the rest of the pipeline.
The logged messages should be viewable from Azure Monitor.
Any ideas?
To extract data from AppInsights and run an AppInsights query from Azure Data Factory, you could use a Web Activity in ADF to invoke the Application Insights REST API after your main activities have executed.
Or
You can log directly to Azure Monitor (Log Analytics) via the HTTP Data Collector API instead of going through Application Insights.
For more details, please refer to these documents:
https://www.ben-morris.com/using-azure-data-factory-with-the-application-insights-rest-api/
https://learn.microsoft.com/en-gb/azure/azure-monitor/logs/data-collector-api
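For the Data Collector API route, here is a minimal C# sketch of the documented request signing: an HMAC-SHA256 over the request metadata, sent as a SharedKey Authorization header. The workspace id, shared key, record fields, and the PipelineValidation log type are placeholders; from Data Factory you would typically wrap the same call in a Web Activity or an Azure Function activity.

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Security.Cryptography;
using System.Text;
using System.Threading.Tasks;

class DataCollectorClient
{
    // Placeholders: your Log Analytics workspace id and primary shared key.
    const string WorkspaceId = "<workspace-id>";
    const string SharedKey   = "<shared-key>";
    const string LogType     = "PipelineValidation"; // surfaces as PipelineValidation_CL

    static async Task Main()
    {
        string json = @"[{""FileName"":""some_very_long_name.csv"",""Error"":""File name exceeds maximum length""}]";
        byte[] body = Encoding.UTF8.GetBytes(json);
        string rfcDate = DateTime.UtcNow.ToString("r");

        // String to sign, per the Data Collector API docs:
        // POST\n{content length}\napplication/json\nx-ms-date:{date}\n/api/logs
        string stringToSign = $"POST\n{body.Length}\napplication/json\nx-ms-date:{rfcDate}\n/api/logs";
        using var hmac = new HMACSHA256(Convert.FromBase64String(SharedKey));
        string signature = Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(stringToSign)));

        var request = new HttpRequestMessage(HttpMethod.Post,
            $"https://{WorkspaceId}.ods.opinsights.azure.com/api/logs?api-version=2016-04-01");
        request.Headers.TryAddWithoutValidation("Authorization", $"SharedKey {WorkspaceId}:{signature}");
        request.Headers.Add("Log-Type", LogType);
        request.Headers.Add("x-ms-date", rfcDate);
        request.Content = new ByteArrayContent(body);
        request.Content.Headers.ContentType = new MediaTypeHeaderValue("application/json");

        using var client = new HttpClient();
        var response = await client.SendAsync(request);
        Console.WriteLine(response.StatusCode); // 200 on success
    }
}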
Is there any way to automate the creation of an Azure Data Explorer data connection?
I want to create it as part of an automated deployment, either via ARM or through C#. The data connection source is an Event Hub, and the connection needs to include the properties specifying the table, consumer group, mapping name, and data format.
I have tried creating a resource manually and exporting the template, but it doesn't work. I have also looked through the Microsoft online documentation and cannot find a working example.
This is all I have found.
Please take a look at this example, which shows how to create the control plane resources (cluster, database, data connection) using ARM templates, and how to use the data plane Python API for the data plane resources (table, mapping).
In addition, for C#, please see the docs and the following example of how to create an Event Hub data connection:
// Microsoft.Azure.Management.Kusto: tableName, mappingRuleName and dataFormat are optional constructor parameters.
var dataConnection = managementClient.DataConnections.CreateOrUpdate(resourceGroup, clusterName, databaseName, dataConnectionName,
    new EventHubDataConnection(eventHubResourceId, consumerGroup, tableName: tableName,
        mappingRuleName: mappingName, dataFormat: "JSON", location: location));
I've actually just finished building and pushing an Azure Sample that does this (see the deployment template and script in the repo).
Unfortunately, as I elected not to use the Azure CLI (and stick with pure Azure PowerShell), I wasn't able to fully automate this, but you can at least see the approach I took.
I've filed feedback with the product group here on UserVoice.
The Application Insights connector in OMS is not pulling the custom properties that were logged in Application Insights.
I have an application where I write additional request-specific context data into custom properties when calling the trace, error, and warning methods.
In OMS, I want to filter based on these custom properties.
E.g., all my microservice APIs emit a correlation ID in custom properties, and this correlation ID is maintained as a call travels from one API to another. It lets me correlate a user request across a workflow.
But when the AppInsights data from all my microservices comes into OMS, those custom properties are lost.
Please advise.
One approach I noted is exporting AppInsights logs to Blob Storage and then configuring OMS to pull the logs from Blob Storage.
I have not tried this approach, but it looks like one option. I wanted to check whether anybody has faced this situation and what helped.
Another approach I found after some more reading is to use the Log Analytics HTTP Data Collector API to log all data directly to Log Analytics. This involves a lot of work, since the application logger has to be changed to use the HTTP Data Collector API. It sounds possible, but it doesn't feel right to make so many changes.
Proposed: app -> HTTP Data Collector API -> Log Analytics
Current: app -> AppInsights -> connector -> Log Analytics
https://learn.microsoft.com/en-us/azure/log-analytics/log-analytics-data-collector-api
I need to get all custom events whose name contains a certain string, using a scheduled Azure Function.
Ideally, I would like to send an email if a specific custom event has some wrong data.
The built-in alerting features don't let you do this (run an arbitrary query and alert on a specific condition).
However, a lot of people are using Microsoft Flow, and there are some example scenarios there.
Other people have created Azure Functions to do something similar.
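For reference, a minimal sketch of that Azure Functions approach: a timer-triggered function that runs a Kusto query through the Application Insights REST API and flags matching events. The application id, API key, event-name filter, and the email step are all assumptions to be replaced with your own values.

using System;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class CustomEventChecker
{
    // Placeholders: App Insights application id and an API key with read permission.
    const string AppId  = "<app-id>";
    const string ApiKey = "<api-key>";

    static readonly HttpClient Http = new HttpClient();

    [FunctionName("CustomEventChecker")]
    public static async Task Run([TimerTrigger("0 */5 * * * *")] TimerInfo timer, ILogger log)
    {
        // Custom events from the last 5 minutes whose name contains the target string.
        string query = Uri.EscapeDataString(
            "customEvents | where timestamp > ago(5m) | where name contains 'my-event-name'");
        var request = new HttpRequestMessage(HttpMethod.Get,
            $"https://api.applicationinsights.io/v1/apps/{AppId}/query?query={query}");
        request.Headers.Add("x-api-key", ApiKey);

        var response = await Http.SendAsync(request);
        string resultJson = await response.Content.ReadAsStringAsync();

        // Crude check; in practice, parse the JSON rows and validate the event payloads.
        if (resultJson.Contains("my-event-name"))
        {
            log.LogWarning("Matching custom events found: {Result}", resultJson);
            // Send the email here, e.g. via a SendGrid output binding or an SMTP client.
        }
    }
}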