How would you go about collecting all the audit failure data from the Security log of a virtual machine into an Azure Storage account?
I assume you are mainly interested in logon failures to your virtual machines. If so, you can enable the Standard tier of Security Center and create a Log Analytics workspace.
After you have deployed the Log Analytics workspace, you can connect it to your VM and collect event log data. Go to Logs under the Monitoring section of your VM and assign the VM to your Log Analytics workspace.
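Once the VM is connected, audit failures surface as Windows event ID 4625 in the workspace's SecurityEvent table. A minimal sketch of querying them programmatically, assuming the azure-monitor-query and azure-identity packages are installed (the workspace ID is a placeholder):

```python
# Sketch: query failed logons (event ID 4625) from the SecurityEvent table
# of a Log Analytics workspace.
from datetime import timedelta

# Audit-failure logons land in SecurityEvent with EventID 4625.
FAILED_LOGON_QUERY = """
SecurityEvent
| where EventID == 4625
| project TimeGenerated, Computer, Account, IpAddress
| order by TimeGenerated desc
"""

def query_failed_logons(workspace_id: str, days: int = 1):
    # Lazy imports so the query string above is usable without the SDK.
    from azure.identity import DefaultAzureCredential
    from azure.monitor.query import LogsQueryClient

    client = LogsQueryClient(DefaultAzureCredential())
    return client.query_workspace(
        workspace_id, FAILED_LOGON_QUERY, timespan=timedelta(days=days)
    )
```

The same KQL can be pasted directly into the Logs blade of the workspace if you prefer the portal.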
Related
For storing Azure SQL database logs, is it necessary to explicitly create a blob container for log storage, or is it created implicitly? I read this and this post but am still not sure.
Also, for a Java/Spring application deployed in App Services that uses App Insights, Log Analytics, and Azure Monitor for application logs, HTTP/access logs, and DB logs, do I need to explicitly set up a blob container for storing logs?
No, you do not need to create blob storage for Azure SQL database logs: they are stored in the Azure SQL database transaction logs and can be viewed using Azure Monitor or audited using Azure SQL Auditing.
Approach 1
Check the logs in the SQL database under the Monitor section.
After creating the Azure SQL database and server, go to the Monitoring tab, where the logs can be viewed.
Approach 2
Using Log Analytics
Create a Log analytics workspace in Azure.
Go to the SQL database and choose Diagnostic settings from the left pane under Monitoring.
Add a diagnostic setting, select the Log Analytics workspace you created, and choose the log categories you want to collect.
You can then find the logs in the workspace.
To store the Azure SQL logs explicitly
Create a storage account for storing the logs.
Enable Azure Monitor logs from your SQL server: select 'Diagnostic logs' from the Azure Monitor menu, turn on the logs, and select the storage account you created.
Configure log retention by selecting the Logs tab in the Azure Monitor menu and choosing 'Retention policy' to set how long logs are retained.
To verify logs in the storage account, go to the storage account and select 'Containers.' You should see a container named 'insights-logs-sqlserverauditlog.' You can then browse the logs stored in this container.
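The same verification can be scripted by listing the blobs in that container. A sketch assuming the azure-storage-blob package is installed, with a placeholder connection string:

```python
# Sketch: verify that SQL audit logs landed in the storage account by listing
# blobs in the 'insights-logs-sqlserverauditlog' container.
AUDIT_CONTAINER = "insights-logs-sqlserverauditlog"

def audit_blob_names(connection_string: str, prefix: str = ""):
    # Lazy import so the constant above is usable without the SDK installed.
    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient.from_connection_string(connection_string)
    container = service.get_container_client(AUDIT_CONTAINER)
    # Diagnostic log blobs are organized under resource-ID and date segments,
    # so a prefix can narrow the listing to one resource or day.
    return [blob.name for blob in container.list_blobs(name_starts_with=prefix)]
```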
I have on-prem Linux machines, and we store their logs in our Azure storage account as blobs.
Can we use Azure Log Analytics to collect these logs stored in blob storage?
Yes, you can use Azure Log Analytics to collect the logs. There are two ways:
WAY-1
Try following the steps below:
1. Download the logs from Azure Storage using a PowerShell script; fill in the required parameters and execute the script locally or in Azure Cloud Shell.
2. Convert the diagnostic logs into JSON format, as that is what the API expects.
3. Load the custom data into Log Analytics using the HTTP Data Collector API.
For more information, you can refer to this Document
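The upload step can be sketched with only the Python standard library; build_signature follows the SharedKey scheme documented for the HTTP Data Collector API, and the workspace ID, key, and log type are placeholders:

```python
# Sketch: push JSON records into Log Analytics via the HTTP Data Collector API.
import base64
import hashlib
import hmac
import json
import urllib.request
from datetime import datetime, timezone

def build_signature(workspace_id, shared_key, date, content_length):
    # String-to-sign format required by the Data Collector API.
    string_to_sign = (
        f"POST\n{content_length}\napplication/json\n"
        f"x-ms-date:{date}\n/api/logs"
    )
    digest = hmac.new(
        base64.b64decode(shared_key),
        string_to_sign.encode("utf-8"),
        hashlib.sha256,
    ).digest()
    return f"SharedKey {workspace_id}:{base64.b64encode(digest).decode()}"

def post_records(workspace_id, shared_key, log_type, records):
    body = json.dumps(records).encode("utf-8")
    date = datetime.now(timezone.utc).strftime("%a, %d %b %Y %H:%M:%S GMT")
    req = urllib.request.Request(
        f"https://{workspace_id}.ods.opinsights.azure.com"
        "/api/logs?api-version=2016-04-01",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": build_signature(
                workspace_id, shared_key, date, len(body)
            ),
            "Log-Type": log_type,  # records appear in the <Log-Type>_CL table
            "x-ms-date": date,
        },
        method="POST",
    )
    return urllib.request.urlopen(req).status
```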
WAY-2
A more direct approach is to install the Log Analytics agent on the virtual machines through Azure Monitor.
Using the Log Analytics agent, Azure Monitor collects data directly from the physical or virtual Linux computers in your environment into a Log Analytics workspace for detailed analysis and correlation.
Steps to collect data from your data center using Azure Monitor:
STEP - 1: Install the log analytics agent for Linux
1. Enter the Log Analytics workspace ID and primary key in the following command.
wget https://raw.githubusercontent.com/Microsoft/OMS-Agent-for-Linux/master/installer/scripts/onboard_agent.sh && sh onboard_agent.sh -w <YOUR WORKSPACE ID> -s <YOUR WORKSPACE PRIMARY KEY>
2. Enter the Log Analytics workspace ID in the following command and restart the agent.
sudo /opt/microsoft/omsagent/bin/service_control restart [<workspace id>]
STEP - 2: Collect the event and performance data
Azure portal --> Log Analytics --> Log Analytics workspace --> Advanced settings --> Data --> Syslog --> '+' sign.
Uncheck the Info, Notice, and Debug severities, then click Apply.
Now go to Linux performance counters, click Add Recommended Counters, and click Apply.
STEP - 3: View Collected Data
Log Analytics Workspace --> Logs (from the left pane)
On the Logs query page, type Perf in the query editor and select Run.
For more information, you can refer to this Blog.
In Azure SQL, there are two options for getting database events in the Azure portal: Auditing and Diagnostic settings.
In which scenarios should each of them be used?
Azure SQL database auditing is usually used to:
- Retain an audit trail of selected events. You can define categories of database actions to be audited.
- Report on database activity. You can use pre-configured reports and a dashboard to get started quickly with activity and event reporting.
- Analyze reports. You can find suspicious events, unusual activity, and trends.
Diagnostics settings:
You can use the Diagnostics settings menu for each single, pooled, or instance database in Azure portal to configure streaming of diagnostics telemetry. In addition, diagnostic telemetry can also be configured separately for database containers: elastic pools and managed instances. You can set the following destinations to stream the diagnostics telemetry: Azure Storage, Azure Event Hubs, and Azure Monitor logs.
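Such a diagnostic setting can also be created programmatically. A sketch assuming the azure-mgmt-monitor and azure-identity packages, with placeholder resource IDs; 'SQLSecurityAuditEvents' is used here as the example category, and the available categories depend on the resource:

```python
# Sketch: create a diagnostic setting on an Azure SQL database that streams
# a log category to a storage account.
def diagnostic_setting_body(storage_account_id, categories):
    # Destination plus the log categories to stream.
    return {
        "storage_account_id": storage_account_id,
        "logs": [{"category": c, "enabled": True} for c in categories],
    }

def create_setting(subscription_id, database_resource_id, storage_account_id):
    # Lazy imports so the body builder above stays usable without the SDK.
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.monitor import MonitorManagementClient

    client = MonitorManagementClient(DefaultAzureCredential(), subscription_id)
    return client.diagnostic_settings.create_or_update(
        database_resource_id,
        "stream-to-storage",  # hypothetical setting name
        diagnostic_setting_body(storage_account_id, ["SQLSecurityAuditEvents"]),
    )
```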
As 4c74356b41 said, they are different things with different uses.
Please reference:
Azure SQL Database metrics and diagnostics logging
Get started with SQL database auditing
Hope this helps.
Just found this: "When auditing is configured to a Log Analytics workspace or to an Event Hub destination via the Azure portal or PowerShell cmdlet, a Diagnostic Setting is created with the 'SQLSecurityAuditEvents' category enabled."
link
For me, this means that auditing is an easier way to enable one of the features of Diagnostic Settings.
Just getting started with Log Analytics. What will happen if I enable diagnostics logging on a VM, sending the data to a storage account that I then connect to Log Analytics, and also connect the VM directly to Log Analytics?
I have some VMs running on Azure. I'd like to redirect logs from them (Windows event logs and MS SQL Server logs) to a specific log concentrator (like Graylog). For Windows logs, I'm using NXLog (https://nxlog.co/docs/nxlog-ce/nxlog-reference-manual.html#quickstart_windows). However, for PaaS services such as SQL Server, NXLog does not apply.
Is there a way to redirect logs (VMs and PaaS) just using Azure (web) tools?
Most services keep their logs in a storage account, so you can tap into that source and forward the logs to your own centralized log database. You generally define the storage account in the place where you enable diagnostics for the service.
I don't know what kind of logs you are looking for in SQL DB, but the audit logs, for example, are saved in a storage account.
Azure Operations Management Suite (OMS) can ingest from dozens of services as well as custom logs. As itaysk mentioned, most services in Azure write service related diagnostic information to a storage account. It's really easy to ingest these from within OMS.
https://azure.microsoft.com/en-us/services/log-analytics/
For Azure Web Sites, you can use Application Insights and store custom metrics as well. There's also an option to continuously write these metrics to a storage account.
Here's a similar option for Azure SQL:
https://azure.microsoft.com/en-us/documentation/articles/sql-database-auditing-get-started/