How can I create an alert on the subscription level for every new resource created? - azure

I want to get alerts whenever a new resource is created. I tried to do it through Azure Monitor using the activity log, but could not find any default configuration that provides this. Any help will be appreciated.

Create an Event Grid subscription at the subscription level. Narrow the events down to e.g. successful writes only (your choice), send the event to some external processor (e.g. an Azure Function) when it's triggered, and do whatever you need with it.
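A minimal Azure CLI sketch of that setup, filtered to successful writes; the subscription ID, resource group, function app and function name are placeholders:

# Subscribe at subscription scope to successful write events only
az eventgrid event-subscription create \
  --name new-resource-events \
  --source-resource-id "/subscriptions/<subscription-id>" \
  --included-event-types Microsoft.Resources.ResourceWriteSuccess \
  --endpoint-type azurefunction \
  --endpoint "/subscriptions/<subscription-id>/resourceGroups/<rg>/providers/Microsoft.Web/sites/<function-app>/functions/<function-name>"

The function then receives one event per successful write operation and can filter or forward it as needed.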

You can create a log alert for any create events.
You will first need to stream the activity logs to Log Analytics:
https://learn.microsoft.com/azure/azure-monitor/essentials/activity-log#send-to-log-analytics-workspace
You can then set a log alert on the subscription to detect any resource creation.
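A sketch of the streaming step with the Azure CLI; the setting name and workspace resource ID are placeholders:

# Route the subscription's activity log into a Log Analytics workspace
az monitor diagnostic-settings subscription create \
  --name activity-to-workspace \
  --location eastus \
  --workspace "/subscriptions/<subscription-id>/resourceGroups/<rg>/providers/Microsoft.OperationalInsights/workspaces/<workspace>" \
  --logs '[{"category": "Administrative", "enabled": true}]'

Once events land in the AzureActivity table, a log alert can match rows whose OperationNameValue ends with "write".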

Related

Azure az copy command complete event grid notification

Is there a way for Azure Event Grid to trigger when an azcopy command completes?
We have clients who use azcopy to transfer hundreds of files and subfolders into our Azure storage. The number of files is variable, and the azcopy command copies a single root folder on their local machine containing those files and subfolders.
We want to raise an Event Grid notification when the azcopy run is complete and successful.
An alternative would be to have a second azcopy command in a batch file that transfers a single flag file once the initial command has fully executed successfully. We would then monitor for this single file as the flag to proceed with further processing.
Perhaps if azcopy cannot raise the event, it could add a verification file signaling the end of the transfer?
You can have Event Grid notifications on an individual blob (or directory, when using ADLS). azcopy is essentially creating individual blobs, so you'd get individual notifications. Azure storage doesn't provide a transactional batch of uploads, so you can't get a single notification.
If you wanted a single notification, you'd have to manage this yourself. You mentioned a "flag" file, but you can also create custom topics, use an Azure Function, a Service Bus message, etc. How you ultimately implement this is up to you (and your clients that are uploading content), but tl;dr no, you can't get a single completion event for a batch of uploads.
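To illustrate the flag-file idea, a CLI sketch that only subscribes to the creation of a sentinel blob (the name done.flag is a hypothetical convention, and the resource IDs are placeholders):

# Fire only when the sentinel blob is created, not for every upload
az eventgrid event-subscription create \
  --name batch-complete \
  --source-resource-id "/subscriptions/<subscription-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>" \
  --included-event-types Microsoft.Storage.BlobCreated \
  --subject-ends-with "done.flag" \
  --endpoint <function-or-webhook-endpoint>

The client uploads done.flag as the last step after azcopy succeeds, so exactly one event fires per batch.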
I have tried to reproduce this in my environment and received the notification successfully.
The Event Grid notification needs an endpoint that will receive it, such as a Function App or Logic App; I created the endpoint in a Function App.
In your Function App, go to Functions -> Create -> select the Azure Event Grid trigger -> Create.
Once the function is created, create an event subscription on the storage account.
When you select the endpoint, the function is shown by default; confirm the selection and create the subscription.
Once azcopy is complete and the files are uploaded to the container, you will receive a notification.
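For reference, the upload side might look like this (account, container and SAS token are placeholders):

# Recursive upload; each blob created raises its own BlobCreated event
azcopy copy "./localfolder" "https://<account>.blob.core.windows.net/<container>?<SAS>" --recursive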

How can I be notified if someone creates a new database in Azure?

I would like to set up an Azure alert for when someone on our team sets up an Azure database. Once alerted, I want to have an additional alert created if that resource is running for more than a certain amount of time.
My solution is to create an Alert Rule on the storage account and have it send an email. Where I'm running into trouble is how to monitor the database, since it just got created and I don't know the name yet for the second Alert rule that will monitor its uptime.
Is there some programmatic way to determine the database resource name?
If you don't want to invest time in a programmatic approach, there is an option to configure an alert at the resource group level based on resource type. In the alert rule, use the following configuration:
Scope: select the right subscription, filter by resource type (e.g. SQL Database) and, if required, filter by location.
Condition: in Select condition, choose the "Create/Update Azure Sql Database" signal, and add any additional filtering logic in the alert logic.
Actions: choose an existing action group or create a new one based on your requirements.
Add the alert rule details: rule name, description, etc.
Finally, create the alert rule.
Once the alert rule is created, you will be notified whenever a new Azure SQL Database is created.
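The same rule can be created from the CLI; a sketch, assuming the alert is scoped to the whole subscription and an action group already exists (all names are placeholders):

# Activity log alert on the create/update operation for SQL databases
az monitor activity-log alert create \
  --name sql-db-created \
  --resource-group rg-alerts \
  --scope "/subscriptions/<subscription-id>" \
  --condition category=Administrative and operationName=Microsoft.Sql/servers/databases/write \
  --action-group "/subscriptions/<subscription-id>/resourceGroups/rg-alerts/providers/microsoft.insights/actionGroups/<action-group>"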
According to the official doc, you can use Event Grid to notify Azure Automation when a SQL database is created.
https://learn.microsoft.com/en-au/azure/event-grid/overview#ops-automation
Once you subscribe, you can use Logic Apps to send you an email, for example.
For the second part, you'll need to query the database's metrics and figure out whether it's running (performing compute) or not.
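As a starting point for that check, a sketch pulling recent CPU usage for the database (the resource ID is a placeholder; cpu_percent is one of the standard Azure SQL Database metrics):

# Recent CPU samples; sustained non-zero values suggest active compute
az monitor metrics list \
  --resource "/subscriptions/<subscription-id>/resourceGroups/<rg>/providers/Microsoft.Sql/servers/<server>/databases/<database>" \
  --metric cpu_percent \
  --interval PT5M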

How to create an Azure alert to notify about any resource deletion

Looking for some help around Azure alerts.
I need to get notified whenever any Azure resource is deleted. From what I have read so far, I know an alert can be created at the resource level, but it would be too cumbersome to set up an alert for each resource individually.
Ideally I would like to have a rule set up at the subscription or resource group level which notifies when any resource in the subscription/resource group is deleted.
I will highly appreciate any help I can get with this.
You could actually just stream the activity logs to Azure Log Analytics and then use a simple query like:
AzureActivity
| where OperationNameValue endswith "DELETE"
Then just click the + New alert rule button and you're all set.
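The alert rule can also be scripted; a sketch, assuming the activity log already streams into the workspace (names and IDs are placeholders):

# Log alert that fires whenever the query returns any deletion rows
az monitor scheduled-query create \
  --name resource-deleted \
  --resource-group rg-alerts \
  --scopes "/subscriptions/<subscription-id>/resourceGroups/<rg>/providers/Microsoft.OperationalInsights/workspaces/<workspace>" \
  --condition "count 'Deletes' > 0" \
  --condition-query Deletes="AzureActivity | where OperationNameValue endswith 'delete'" \
  --action-groups "/subscriptions/<subscription-id>/resourceGroups/rg-alerts/providers/microsoft.insights/actionGroups/<action-group>"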
Currently, alerts at the subscription level are NOT supported, although you can upvote the feature request.
The only way to do this is to create the rule for each resource, as you mentioned in the question.

How to group Azure deployment events into a single alert e-mail

I want to send a single alert on a successful deployment in Azure. Alerting on individual events is a problem because that could get too noisy.
Grouping events by their correlation id and sending that in a single e-mail would be great. Sort of like what you might find in the deployment overview page in the Azure portal:
Your deployment is complete
Deployment name: mesh_rp.linux
Subscription: AcmeDevTest
Resource group: rg-mesh-demo
Start time: 11/29/2018 9:00:00 AM
Duration: 2 minutes 56 seconds
Correlation ID: 11111111-1111-1111-1111-111111111111
RESOURCE             TYPE                                        STATUS
HelloWorldApp        Microsoft.ServiceFabricMesh/applications    OK
HellowWorldNetwork   Microsoft.ServiceFabricMesh/networks        OK
How would I go about grouping those events by their correlation id and then firing that off in an e-mail?
Is this possible with Azure Monitor or is something like logic apps and event grid the way to go here?
Azure Monitor would be best for you.
From Azure portal, select Monitor > Alerts
Click New Alert Rule at the top of the Alerts window.
Configure Alert target and Target criteria
Configure Action group to send an email.
For a successful deployment, you should choose "create new deployment" as the target criteria.
For more details, please refer to the following:
https://learn.microsoft.com/en-us/azure/monitoring-and-diagnostics/alert-activity-log
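For the grouping itself, one option is to run a Log Analytics query keyed on the correlation ID and let a scheduled Logic App mail the result as a digest. A sketch, assuming activity logs stream into a workspace (the workspace GUID is a placeholder; column names follow the AzureActivity schema):

# One row per deployment, with all touched resources and the last status
az monitor log-analytics query \
  --workspace "<workspace-guid>" \
  --analytics-query "AzureActivity
    | where CategoryValue == 'Administrative'
    | where OperationNameValue endswith 'deployments/write'
    | summarize Resources = make_set(_ResourceId), LastStatus = max(ActivityStatusValue) by CorrelationId"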

Azure Data Factory: event not starting pipeline

I've set up an Azure Data Factory pipeline containing a copy activity. For testing purposes, both source and sink are Azure Blob Storage.
I want to execute the pipeline as soon as a new file is created in the source Azure Blob Storage.
I've created a trigger of type BlobEventsTrigger. "Blob path begins with" has been set to //.
I use Cloud Storage Explorer to upload files, but it doesn't trigger my pipeline. To get an idea of what is wrong, how can I check whether the event is fired? Any idea what could be wrong?
Thanks
Reiterating what others have stated:
Must be using a V2 Storage Account.
Trigger name must only contain letters, numbers and the '-' character (this restriction will soon be removed).
Must have registered the subscription with the Event Grid resource provider (this will be done for you via the UX soon).
The trigger makes the properties @triggerBody().folderPath and @triggerBody().fileName available. To use these in your pipeline you must map them to pipeline parameters and reference them as @pipeline().parameters.parameterName.
Finally, based on your configuration, setting "blob path begins with" to // will not match any blob event. The UX will actually show you an error message saying that the value is not valid. Please refer to the Event Based Trigger documentation for examples of valid configuration.
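Two of those prerequisites can be checked and fixed from the CLI (account and resource group names are placeholders):

# Kind must be StorageV2 for blob event triggers
az storage account show --name <account> --resource-group <rg> --query kind
# Register the Event Grid resource provider on the subscription
az provider register --namespace Microsoft.EventGrid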
Please reference this thread: first, it needs to be a v2 storage account; second, you need to register it with Event Grid.
https://social.msdn.microsoft.com/Forums/azure/en-US/db332ac9-2753-4a14-be5f-d23d60ff2164/azure-data-factorys-event-trigger-for-pipeline-not-working-for-blob-creation-deletion-most-of-the?forum=AzureDataFactory
There seems to be a bug with the Blob storage trigger: if more than one trigger is allocated to the same blob container, none of the triggers will fire.
For some reason (another bug, but this time in Data Factory?), if you edit your trigger several times in the Data Factory window, Data Factory seems to lose track of the triggers it creates, and your single trigger may end up creating multiple duplicate triggers on the blob storage. This condition activates the first bug discussed above: the blob storage trigger doesn't fire anymore.
To fix this, delete the duplicate triggers. Navigate to your blob storage resource in the Azure portal and go to the Events blade. From there you'll see all the triggers that the data factories added to your blob storage. Delete the duplicates.
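To spot the duplicates before deleting them, you can also list the event subscriptions on the storage account from the CLI (the resource ID is a placeholder):

# Each ADF blob event trigger appears here as an event subscription
az eventgrid event-subscription list \
  --source-resource-id "/subscriptions/<subscription-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>" \
  --output table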
As of 20.06.2021, the same happens for me: the event trigger is not working, even though when editing its definition in Data Factory it shows all the matching files in the folder. But when I add a new file to that folder, nothing happens!
If you're creating your trigger via ARM template, make sure you're aware of this bug: the "runtimeState" (aka "Activated") property of the trigger can only be set to "Stopped" via ARM template. The trigger will need to be activated via PowerShell or the ADF portal.
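A sketch of starting it afterwards, assuming the az datafactory CLI extension is installed (PowerShell's Start-AzDataFactoryV2Trigger works as well; all names are placeholders):

# Activate the trigger that the ARM deployment left in the Stopped state
az datafactory trigger start \
  --factory-name <data-factory> \
  --resource-group <rg> \
  --name <trigger-name>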
The Event Grid resource provider needs to have been registered within the specific Azure subscription.
Also, if you use Synapse Studio pipelines instead of Data Factory (like me), make sure the Data Factory resource provider is also registered.
Finally, the user should have both the 'Owner' and 'Storage Blob Data Contributor' roles on the storage account.
