Difference between Auditing and Diagnostic settings in Azure SQL Database

In Azure SQL Database, there are two options in the Azure portal for capturing database events: Auditing and Diagnostic settings.
In which scenarios should each of them be used?

Azure SQL Database auditing is typically used to:
Retain an audit trail of selected events. You can define categories of database actions to be audited.
Report on database activity. You can use pre-configured reports and a dashboard to get started quickly with activity and event reporting.
Analyze reports. You can find suspicious events, unusual activity, and trends.
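For example, if the audit logs are sent to a Log Analytics workspace, a sketch like the following could surface failed login attempts (the field names follow the SQLSecurityAuditEvents schema and should be treated as assumptions to verify against your own workspace):
AzureDiagnostics
| where Category == "SQLSecurityAuditEvents"
| where succeeded_s == "false"
| summarize count() by server_principal_name_s, client_ip_s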
Diagnostic settings:
You can use the Diagnostic settings menu for each single, pooled, or instance database in the Azure portal to configure the streaming of diagnostics telemetry. Diagnostic telemetry can also be configured separately for the database containers: elastic pools and managed instances. You can stream the telemetry to the following destinations: Azure Storage, Azure Event Hubs, and Azure Monitor Logs.
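For instance, if the telemetry is streamed to Azure Monitor Logs (a Log Analytics workspace), the basic metrics land in the AzureMetrics table. A minimal query sketch, assuming the Basic metrics category is enabled (the metric name and bin size here are illustrative):
AzureMetrics
| where ResourceProvider == "MICROSOFT.SQL"
| where MetricName == "cpu_percent"
| summarize avg(Average) by bin(TimeGenerated, 5m)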
As 4c74356b41 said, they are different things with different uses.
Please reference:
Azure SQL Database metrics and diagnostics logging
Get started with SQL database auditing
Hope this helps.

Just found this: "When auditing is configured to a Log Analytics workspace or to an Event Hub destination via the Azure portal or PowerShell cmdlet, a Diagnostic Setting is created with the 'SQLSecurityAuditEvents' category enabled."
For me, this means that auditing is an easier way to enable one of the features of Diagnostic Settings.
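To illustrate, once that Diagnostic Setting exists, audit events show up in the same AzureDiagnostics table as any other diagnostic category. A minimal sketch, assuming a Log Analytics destination:
AzureDiagnostics
| where Category == "SQLSecurityAuditEvents"
| take 10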

Related

How to get Users Logging Information Using Log Analytics in Azure SQL Database

I am trying to get user login information for Azure SQL Database using Log Analytics. Is this possible? If so, can you please help me with it?
Below are the options available under Diagnostic settings for Azure SQL Database.
Click 'Add Diagnostic setting' above to configure the collection of the following data:
DmsWorkers
ExecRequests
RequestSteps
SqlRequests
Waits
Basic
InstanceAndAppAdvanced
WorkloadManagement
I want to achieve this without using sys schema objects related to Azure SQL Databases.
Thanks,
Brahma
You need to enable Auditing on the Azure SQL server, and then you can check the logs in Azure Log Analytics.
The easiest way to enable auditing is through the Azure portal. However, it can also be set up through ARM templates, Azure PowerShell, or the Azure CLI.
Auditing can be enabled either at the individual database level or at the logical server level. If enabled at the server level, it automatically applies to existing databases and any new databases that are created.
However, enabling auditing at both the server and database level leads to duplicate logs.
On the home page of the desired Azure SQL server, there is an "Auditing" option in the left pane.
By default, auditing is off. Enable it, choose the Log Analytics workspace where you want to store the logs, and click Save.
Click Add diagnostic setting and enable diagnostics for Errors and InstanceAndAppAdvanced. Send this data to the Log Analytics workspace in your subscription, then click Save to apply the configuration.
To view the logs, open the Log Analytics workspace that was configured as the sink, choose Logs, and select the scope.
Summarizing the connection attempts by caller IP addresses
AzureDiagnostics
| summarize count() by client_ip_s
Source: https://www.mssqltips.com/sqlservertip/6782/kusto-query-language-query-audit-data-azure-sql-database/
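To get closer to the login information asked about, a hedged refinement of the same query might filter on authentication events and project the login name. Here action_name_s, server_principal_name_s, and event_time_t are assumed field names from the SQLSecurityAuditEvents schema; verify them in your workspace:
AzureDiagnostics
| where Category == "SQLSecurityAuditEvents"
| where action_name_s == "DATABASE AUTHENTICATION SUCCEEDED"
| project event_time_t, server_principal_name_s, client_ip_s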

Is there access log or metadata for Azure Data Catalog

Is there a way to access the metadata of Azure Data Catalog? I looked through the documentation and the Azure Activity log for Azure Data Catalog. However, there seems to be no access-activity log (i.e., who accessed Azure Data Catalog and at what point in time) that I can use. Is such a log available anywhere in Azure at the moment?
Unfortunately, there is no way to check such activity logs. I would recommend having a look at Azure Purview, which has updated Data Catalog features.
You can refer to this document, which describes how to configure metrics, alerts, and diagnostic settings for Azure Purview using Azure Monitor: Azure Purview metrics in Azure Monitor

Azure Data Factory Pipeline Logs

Where do the logs of Azure Pipeline v2 get stored? I would like to retrieve data for failed pipelines on a specific date. (I don't want to use the Azure portal to view this data.) Is there any database table/view that holds such logs?
To my knowledge, to obtain diagnostic logs you can use Azure Monitor, Operations Management Suite (OMS), or monitor those pipelines visually.
By Azure Pipeline v2, I assume you mean Azure Data Factory v2. See: Alert and monitor data factories using Azure Monitor
Diagnostic logs:
Save them to a Storage Account for auditing or manual inspection. You can specify the retention time (in days) using the diagnostic settings.
Stream them to Event Hubs for ingestion by a third-party service or a custom analytics solution such as Power BI.
Analyze them with Log Analytics
The logs are stored on Azure Data Factory web server for 45 days. If you want to get the pipeline run and activity run metadata, you can use Azure Data Factory SDK to extract the information you need and save it somewhere you want.
The recommended approach for long-term analysis, as well as for limiting access to a production data factory, is to configure logs to be sent to Log Analytics. Be sure to enable the dedicated logging tables, as this helps on the backend in terms of organizing your logs (see the sketch below).
From there you can also set up alerts and access groups driven by Log Analytics queries for better monitoring.
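As a sketch, assuming the dedicated (resource-specific) tables mentioned above are enabled, failed pipeline runs for a specific date could be pulled with a query like this; the table and column names follow the ADFPipelineRun schema, and the date range is a placeholder:
ADFPipelineRun
| where Status == "Failed"
| where TimeGenerated between (datetime(2021-06-01) .. datetime(2021-06-02))
| project TimeGenerated, PipelineName, RunId, Status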

Event alerts in Azure SQL using auditing

We are looking to get alerts on specific tables in an Azure DB to initiate actions. These could be:
a) calling an Azure Search indexer API so that changes get indexed in near-real time every time the data source changes
b) pushing updates to users via SignalR when there is an update
I understand Azure SQL Database has functional limitations and that triggers cannot invoke an Azure Function directly due to the lack of CLR support.
Azure SQL Database trigger to insert audit info into Azure Table
Given that triggers in Azure cannot invoke APIs, I was told that we could get information on database updates via auditing. However, auditing seems to be at the blob level, not at the table level.
Given the multiple changes underway, it would be quite a task, and introduce a time delay, to pick out the event of interest from the blob.
While there is a line about blob auditing being configurable (it supports higher-granularity object-level auditing), I couldn't find an approach to limit blob auditing to certain tables alone. Any pointers would be appreciated.
Also, given that blob auditing seems to be built for threat detection and regulatory purposes, are there any issues with using it for event alerts?
Additional info on granular Blob Auditing in Azure SQL DB (including limiting the audit policy to specific tables/views) can be found here:
https://msdn.microsoft.com/library/azure/mt695939.aspx
Please note that we have also created the following OMS integration app for advanced analysis of Audit logs - you can use it to push the Blob audit logs into OMS, and then create customized alerts in OMS on top of the audit logs (this is a temporary solution, until our fully integrated OMS/Log Analytics solution is available):
https://github.com/Microsoft/Azure-SQL-DB-auditing-OMS-integration
Best Regards,
Gilad Mittelman
SQL Security, Microsoft
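As a present-day sketch of the same idea (audit logs pushed into Log Analytics, with an alert query scoped to one table), something like the following could be used; 'MyTable' is a placeholder and statement_s is an assumed field from the SQLSecurityAuditEvents schema:
AzureDiagnostics
| where Category == "SQLSecurityAuditEvents"
| where statement_s contains "MyTable"
| summarize count() by bin(TimeGenerated, 5m)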

Redirecting Azure logs to a particular log service

I have some VMs running on Azure. I'd like to redirect logs from them (Windows Event Logs and MS SQL Server logs) to a specific log concentrator (like Graylog). For Windows logs, I'm using Nxlog (https://nxlog.co/docs/nxlog-ce/nxlog-reference-manual.html#quickstart_windows). However, for PaaS applications such as SQL Server, Nxlog does not apply.
Is there a way to redirect logs (VMs and PaaS) just using Azure (web) tools?
Most services keep their logs in a Storage Account so you can tap into that source and forward logs to your own centralized log database. You generally define the storage account at the place you enable diagnostics for the service.
I don't know what kind of logs you are looking for in SQL DB, but the audit logs, for example, are saved in a storage account.
Azure Operations Management Suite (OMS) can ingest from dozens of services as well as custom logs. As itaysk mentioned, most services in Azure write service related diagnostic information to a storage account. It's really easy to ingest these from within OMS.
https://azure.microsoft.com/en-us/services/log-analytics/
For Azure Web Sites, you can use Application Insights and store custom metrics as well. There's also an option to continuously write these metrics to a storage account.
Here's a similar option for Azure SQL:
https://azure.microsoft.com/en-us/documentation/articles/sql-database-auditing-get-started/
