Hi, I'm using Azure SQL Database and I need to create a notification/alert once the daily growth of the database exceeds a pre-defined number. As an example, I need to send an email to the DB admins once the database has grown by more than 1 GB within the last 24 hours. I searched for solutions but couldn't find a straightforward one to implement in Azure. Any help will be appreciated.
You can create alerts for an Azure SQL database using Azure Monitor alerts and action groups. Below are the steps to create an alert on SQL database usage over a period of time.
First, create a logic app that will send the notification email.
In the Send email action, configure the recipient email addresses to be notified.
Next, create an action group and add the logic app under its Actions tab.
Once the logic app is selected, click Review + create.
Now you can create an alert rule for the SQL database and select the action group you created.
In the Condition tab, select Data space used as the signal.
Configure the threshold and evaluation period to match your requirement.
In the Actions tab, select the action group created earlier.
Once that is done, click Review + create.
This alert will fire whenever the data space used exceeds 1 GB within the selected time period.
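Note that a Data space used metric alert fires on the absolute value of the metric rather than on growth. If you specifically need to alert on 24-hour growth, one option is a log alert instead: assuming the database's metrics are exported to a Log Analytics workspace via diagnostic settings, a Kusto query along these lines (a sketch; "storage" is the underlying metric name for Data space used, and the threshold is your 1 GB limit) could serve as the alert signal:
// Sketch: fire when data space used grew by more than 1 GB in the last 24 hours.
// Approximates growth as max - min, assuming the size grows roughly monotonically.
AzureMetrics
| where ResourceProvider == "MICROSOFT.SQL" and MetricName == "storage"
| where TimeGenerated > ago(24h)
| summarize GrowthBytes = max(Maximum) - min(Maximum) by ResourceId
| where GrowthBytes > 1024.0 * 1024 * 1024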
I need to create an alert that sends an email every time a new container is created in a specific Azure storage account.
There are no built-in signals for creating an Azure Monitor alert on container creation in a storage account.
To accomplish this, you'll need to use Azure Monitor and write a Kusto query that detects, and emails you about, every new container created in the storage account.
Follow the steps below to implement a custom solution using Azure Monitor:
Create a Log Analytics workspace.
Enable diagnostic settings on the storage account and send the logs to the above Log Analytics workspace.
Here is a Kusto query to pull the CreateContainer operations on the storage account:
StorageBlobLogs
| where OperationName =~ "CreateContainer" and AccountName =~ '<StorageAccountName>'
| project Uri
Click on the New alert rule option in the Log Analytics workspace to create a custom alert using the above query as the signal.
In the signal condition, set the Aggregation granularity (the interval over which data points are grouped by the aggregation type) and the Frequency of evaluation (how often the alert rule runs) to 5 minutes.
Using action groups, you can send an email or SMS notification when the alert criteria are met. You can use an existing action group or create a new one while configuring the alert.
Note: Here we are using a custom log search as the condition signal. If you want to know which container was created when this alert fired, click on Search Results in your alert notification email.
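If you also want the query results themselves to show when the container was created and from where, a slightly extended projection works (a sketch; the columns available depend on which diagnostic categories you enabled):
StorageBlobLogs
| where OperationName =~ "CreateContainer" and AccountName =~ '<StorageAccountName>'
| project TimeGenerated, Uri, CallerIpAddress, AuthenticationType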
I have a Logic App running every minute that checks the time that data was last received in a table. If it has been enough time since the data was updated, I want to receive an alert. I really like the Action Groups used by the Alerts in Azure. They are clean and have lots of options like email, SMS, and Phone. How can I trigger an Action Group from my Logic App?
I know I can recreate the email, SMS, and Phone connections in the Logic App, but then it's harder to maintain. I'm already using the same Action Group for other Alerts. It would be easier to maintain if I could reuse this Action Group.
There is a ton online about triggering a Logic App from an Action Group. This is NOT what I'm trying to do. I want the reverse: to trigger an Action Group from a Logic App.
How can I trigger an Action Group from my Logic App?
Currently, as per the documentation, you can trigger a particular logic app using an action group, but there is no way to trigger a particular action group using a logic app.
It would be easier to maintain if I could reuse this Action Group.
Yes, you can use the same action group in multiple alert mechanisms.
I would suggest you raise a feature request using this Azure support link.
You should be able to send data to a custom log in Log Analytics from your Logic App using the Azure Log Analytics Data Collector connector.
Then you can use a Log Analytics query to evaluate resource logs at a set frequency and fire an alert based on the results. Alert rules can trigger one or more actions using action groups; see Create, view, and manage log alerts using Azure Monitor.
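As a sketch of the query side: suppose the Logic App posts a heartbeat row to a custom table each time data is received (the table name LogicAppHeartbeat_CL below is hypothetical). The log alert query could then be:
// Hypothetical custom table written by the Logic App via the Data Collector API.
LogicAppHeartbeat_CL
| summarize LastReceived = max(TimeGenerated)
| where LastReceived < ago(15m)  // fire when no data has arrived for 15 minutes; adjust as needed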
I want to have control in Azure over newly created and deleted items.
I need a query to know "who" and "when" a resource is created or deleted in Azure
Is this possible? How can I do this query?
Whenever a resource is created or deleted, information about that operation is stored in the Azure Activity Log. You should be able to find "who" and "when" by querying it.
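For example, once the Activity Log is routed to a Log Analytics workspace (see the export notes below), a query along these lines (a sketch; the status value may differ between the legacy and current Activity Log schemas) returns the caller and timestamp:
AzureActivity
| where OperationNameValue endswith "/WRITE" or OperationNameValue endswith "/DELETE"
| where ActivityStatusValue == "Success"
| project TimeGenerated, Caller, OperationNameValue, _ResourceId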
Another alternative would be to make use of Azure Event Grid and subscribe to subscription-level events. You can subscribe to the Microsoft.Resources.ResourceWriteSuccess (for creation/update of resources) and Microsoft.Resources.ResourceDeleteSuccess (for resource deletion) events and act on them in near real time.
Within the Azure Portal, you can view these types of events from the past 90 days in the Activity Log blade.
For access to events occurring more than 90 days in the past, you need to pre-emptively set up log archival as detailed in the Export the Azure Activity Log article.
If you are planning to use the Activity Log export feature, make sure you use the new diagnostic settings feature on the Azure subscription to export Activity Logs. It offers multiple improvements over older mechanisms such as log profiles or the Activity Log solution (Log Analytics).
https://learn.microsoft.com/en-us/azure/azure-monitor/platform/activity-log-collect
https://learn.microsoft.com/en-us/azure/azure-monitor/platform/diagnostic-settings-template
Is there a way to trigger a Logic App on a deletion of a record in an Azure SQL table?
I've checked the SQL Connector and there is only When an item is created and When an item is modified, which gives me the C and U in CRUD, but sadly there isn't an out-of-the-box trigger for the D.
I can think of some awful way of polling to get record deletions, but I'm hoping that there is a cleaner solution that some bright person has come up with, however I've had no joy with the Google searching.
I would look at the Azure Event Grid. Azure Event Grid allows you to easily build applications with event-based architectures. First, select the Azure resource you would like to subscribe to, and then give the event handler or WebHook endpoint to send the event to. Event Grid has built-in support for events coming from Azure services, like storage blobs and resource groups. Event Grid also has support for your own events, using custom topics.
I would suggest monitoring the resource group and triggering it off the deletion from the RG. There is a tutorial that shows this same concept with a VM but you should be able to modify it to meet your needs with an Azure SQL DB.
https://learn.microsoft.com/en-us/azure/event-grid/monitor-virtual-machine-changes-event-grid-logic-app
I added an on-delete trigger that adds the id of the deleted record to a secondary table, and I have the Logic App look for modifications on that secondary table.
I'm trying to create jobs in Azure SQL Database but I don't know how to do that. Is it possible to do them inside SQL Server Management Studio?
You need to use Azure Automation to schedule the execution of a stored procedure. For instance, you can use Azure Automation to schedule index maintenance tasks.
Below are the steps:
Provision an Automation Account if you don't have one, by going to https://portal.azure.com and selecting New > Management > Automation Account.
After creating the Automation Account, open its details and click on Runbooks > Browse Gallery.
Type the word "indexes" in the search box and the runbook "Indexes tables in an Azure database if they have a high fragmentation" appears.
Note that the author of the runbook is the SC Automation Product Team at Microsoft. Click on Import.
After importing the runbook, let's add the database credentials to the assets. Click on Assets > Credentials and then on the "Add a credential…" button.
Set a credential name (it will be used later in the runbook), along with the database user name and password.
Now click again on Runbooks, select "Update-SQLIndexRunbook" from the list, and click on the "Edit…" button. You will be able to see the PowerShell script that will be executed.
If you want to test the script, just click on the "Test Pane" button and the test window opens. Enter the required parameters and click on Start to execute the index rebuild. If any error occurs, it is logged in the results window. Note that depending on the database and the other parameters, this can take a long time to complete.
Now go back to the editor and click on the "Publish" button to enable the runbook. If you click on "Start", a window appears asking for the parameters; but since we want to schedule this task, click on the "Schedule" button instead.
Click on the Schedule link to create a new schedule for the runbook. I have specified once a week, but that will depend on your workload and how quickly your indexes fragment over time. You will need to tweak the schedule based on your needs and on the fragmentation you observe between executions.
Now enter the parameters and run settings.
NOTE: you can experiment with different schedules and different settings, e.g. a specific schedule for a specific table.
With that, you have finished. Remember to adjust the logging settings as desired.
You can use Microsoft Flow (https://flow.microsoft.com) to create a scheduled flow with a SQL Server connector. In the connector, you set the Azure SQL server, database name, username, and password.
There are many options, but the ones you can use to run a T-SQL query daily are these:
Execute a SQL Query
Execute stored procedure
You can also edit your connection info in the Data --> Connections menu.