Can Synapse Spark connect to a "Log Analytics workspace"?

I need to export data from a Log Analytics workspace to a storage account in Parquet/Delta format. How can I achieve this?
Using the ADX Spark connector in a notebook, it says that the URL is invalid. I'm using the URL of the Log Analytics workspace instead of an ADX cluster.

As a different approach, apart from ADX and notebooks, we can use a Logic App and then look for the logs in the storage account.
To learn more about using a Logic App to archive data from a Log Analytics workspace to Azure Storage, consult the Microsoft documentation.
To do this, we only need access to the Log Analytics workspace and the storage account.
Additionally, we can create a trigger in the Logic App that runs once daily, or as frequently as needed, to upload all the new data.
After the trigger has been configured, click + New step to add an action that runs after the recurrence action. Type "azure monitor" into the "Choose an action" box and select "Azure Monitor Logs". After setting up the full workflow, create a blob action and attach it to the workflow.
Later, we can run the Logic App and check the storage account for the logs.
Reference: Export data from a Log Analytics workspace to a storage account by using Logic Apps (Microsoft documentation).
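Coming back to the original question: the ADX Spark connector expects an ADX cluster URI, which is most likely why the Log Analytics workspace URL is rejected. If you still want to land the data as Parquet from a Synapse notebook, one alternative is to query the workspace with the azure-monitor-query SDK and write the result out with Spark. This is a minimal sketch, assuming azure-identity and azure-monitor-query are installed in the Spark pool, the notebook identity has read access to the workspace, and spark is the session Synapse provides; the workspace ID, query, and output path are placeholders:

from datetime import timedelta

import pandas as pd
from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient, LogsQueryStatus

client = LogsQueryClient(DefaultAzureCredential())

# Placeholder workspace ID: the workspace GUID, not its URL or resource ID.
response = client.query_workspace(
    workspace_id="<log-analytics-workspace-guid>",
    query="AppRequests | take 1000",  # any KQL query against the workspace
    timespan=timedelta(days=1),
)

if response.status == LogsQueryStatus.SUCCESS:
    table = response.tables[0]
    # In recent SDK versions, table.columns is a list of column names.
    pdf = pd.DataFrame(data=table.rows, columns=table.columns)
    # Convert to a Spark DataFrame and write Parquet (or Delta) to storage.
    spark.createDataFrame(pdf).write.mode("overwrite").parquet(
        "abfss://<container>@<account>.dfs.core.windows.net/law-export/"
    )

Note this pulls data through the query API, so it is subject to query limits; for large or continuous exports, the Logic App approach above scales better.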

Related

Azure SQL storing database logs

For storing Azure SQL database logs, is it necessary to explicitly create a blob container for log storage, or is it created implicitly? I read this and this post but I'm still not sure.
Also, for a Java/Spring application deployed in App Services that uses App Insights, Log Analytics, and Azure Monitor for application logs, HTTP/access logs, and DB logs, do I need to explicitly set up a blob for storing logs?
No, you do not need to create blob storage for Azure SQL database logs, as they are stored in the Azure SQL database transaction log and can be viewed using Azure Monitor or audited using Azure SQL Auditing.
Steps to check the logs for a SQL database under the Monitor section:
After creating the Azure SQL database and server, go to the Monitoring tab, where the logs can be viewed.
Approach 2: Using Log Analytics
Create a Log Analytics workspace in Azure.
Then go to the SQL database and choose Diagnostic settings from the left pane, under the Monitoring section.
Add a diagnostic setting, choose the Log Analytics workspace you created, and select the log categories you want to collect.
You can then find the logs in the workspace.
To store the Azure SQL logs explicitly:
You need to create a storage account for storing the logs.
Then enable Azure Monitor logs for your SQL server: select Diagnostic settings from the Azure Monitor menu, turn on the logs, and select the storage account you created.
Configure log retention by selecting the Logs tab in the Azure Monitor menu and choosing a retention policy for how long logs should be kept.
To verify the logs, go to the storage account and select Containers. You should see a container named insights-logs-sqlserverauditlog, and you can browse the logs stored in it.
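If you would rather script this diagnostic setting than click through the portal, below is a hedged sketch using the azure-mgmt-monitor SDK. The resource IDs and the log category are placeholders; check which categories your database actually exposes before relying on them:

from azure.identity import DefaultAzureCredential
from azure.mgmt.monitor import MonitorManagementClient

SUBSCRIPTION_ID = "<subscription-id>"  # placeholder
DATABASE_ID = (
    "/subscriptions/<subscription-id>/resourceGroups/<rg>"
    "/providers/Microsoft.Sql/servers/<server>/databases/<db>"
)
STORAGE_ACCOUNT_ID = (
    "/subscriptions/<subscription-id>/resourceGroups/<rg>"
    "/providers/Microsoft.Storage/storageAccounts/<account>"
)

client = MonitorManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Create (or update) a diagnostic setting that archives one log category
# to the storage account; add further categories to the list as needed.
client.diagnostic_settings.create_or_update(
    resource_uri=DATABASE_ID,
    name="sqldb-logs-to-storage",
    parameters={
        "storage_account_id": STORAGE_ACCOUNT_ID,
        "logs": [
            {"category": "SQLSecurityAuditEvents", "enabled": True},
        ],
    },
)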

Azure Monitor Export to a SQL Server

I need the near real-time front-end data from a web app for use in Power BI, and I need to keep this data forever.
I would like to automatically export the app's customEvents and pageViews tables for this purpose.
It seems like I need to go from Azure Logs -> Azure Storage account -> Azure SQL Server -> Power BI.
The steps I'm having trouble with are getting the logs into storage, and then getting that data from storage into a SQL server.
To send logs to storage accounts, Event Hubs, or Log Analytics, go to the App Service, select Diagnostic settings in the left panel, and click + Add diagnostic setting.
Select the log categories you want, choose the storage account as the destination, and click Save.
You can now use the Azure Data Factory service to copy the logs from the Azure storage account to an Azure SQL Database.
Please refer to the Microsoft tutorial "Copy data from Azure Blob storage to a SQL Database by using the Copy Data tool" to implement this, or see the code-based alternative sketched below.
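If you prefer code over the ADF Copy Data tool, here is a rough PySpark sketch of the same copy: it reads the JSON diagnostic log blobs from the storage container and appends them to an Azure SQL table over JDBC. The container name, server, database, table, and credentials are all placeholders, and it assumes a Spark environment (such as Synapse) where the SQL Server JDBC driver is available:

# Read the JSON log blobs written by the diagnostic setting; diagnostic
# containers are typically named insights-logs-<category>.
logs_df = spark.read.json(
    "wasbs://insights-logs-appserviceconsolelogs@<account>.blob.core.windows.net/"
)

# Nested columns may need flattening before a JDBC write will accept them.
(logs_df.write
    .format("jdbc")
    .option("url", "jdbc:sqlserver://<server>.database.windows.net:1433;database=<db>")
    .option("dbtable", "dbo.AppServiceLogs")
    .option("user", "<sql-user>")
    .option("password", "<sql-password>")
    .mode("append")
    .save())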
Once the data is available in the database, we can use Power BI to read it.
Open the Power BI dashboard and click Get data from another source.
Select Azure -> Azure SQL Database and click Connect.
Enter the server name.
In the next step, enter the username and password for your account to get access.
Now you can select data from any table and present it in the Power BI dashboard as required.

Log Analytics data export to storage account - All tables

I want to use Azure Log Analytics with the data export feature to export all log tables to a storage account. There used to be an '--export-all-tables' option, but annoyingly this has been removed.
Is there a way I can export all tables? Not just the ones that exist at the moment, but any future ones that may be created?
Azure Policy?
Azure Functions?
Azure Logic App?
We can archive the data with the help of a Logic App: we run a query from the Logic App and use its output in other actions in the workflow. The Azure Blob Storage connector is then used to send the query output to blob storage.
Here we just need access to the Log Analytics workspace and the storage account.
To pick up all the new data, we can create a trigger in the Logic App that runs once a day, or as often as required.
After setting up the trigger, click + New step to add an action that runs after the recurrence action. Under Choose an action, type "azure monitor" and then select Azure Monitor Logs.
After configuring the whole workflow, create a blob action and attach it to the workflow.
We can then run the Logic App and check the storage account for the logs.
See the Microsoft documentation to learn more: Archive data from a Log Analytics workspace to Azure storage using Logic Apps.
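Alternatively, if you want to stay on the native data export feature despite the removed '--export-all-tables' flag, one hedged workaround is to enumerate the workspace's current tables and recreate the export rule on a schedule (for example, from an Azure Function on a timer), so tables created since the last run get picked up. A sketch with the azure-mgmt-loganalytics SDK, where every name is a placeholder; note that not all table types are exportable and export rules have service limits, so check the current documentation:

from azure.identity import DefaultAzureCredential
from azure.mgmt.loganalytics import LogAnalyticsManagementClient

SUBSCRIPTION_ID = "<subscription-id>"  # placeholder
RESOURCE_GROUP = "<resource-group>"
WORKSPACE = "<workspace-name>"
STORAGE_ACCOUNT_ID = (
    "/subscriptions/<subscription-id>/resourceGroups/<resource-group>"
    "/providers/Microsoft.Storage/storageAccounts/<account>"
)

client = LogAnalyticsManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Enumerate every table currently present in the workspace ...
table_names = [
    t.name for t in client.tables.list_by_workspace(RESOURCE_GROUP, WORKSPACE)
]

# ... and (re)create the export rule with the full list. Re-running this
# periodically picks up tables created since the last run.
client.data_exports.create_or_update(
    resource_group_name=RESOURCE_GROUP,
    workspace_name=WORKSPACE,
    data_export_name="export-all-tables",
    parameters={
        "table_names": table_names,
        "resource_id": STORAGE_ACCOUNT_ID,
        "enable": True,
    },
)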

How to get users' login information using Log Analytics in Azure SQL Database

I am trying to get information about users logged in to an Azure SQL database using Log Analytics. Is this possible? If so, can you please help me with this?
Below are the options available in Diagnostic settings for an Azure SQL database.
Click 'Add Diagnostic setting' above to configure the collection of the following data:
DmsWorkers
ExecRequests
RequestSteps
SqlRequests
Waits
Basic
InstanceAndAppAdvanced
WorkloadManagement
I want to achieve this without using sys schema objects related to Azure SQL databases.
Thanks,
Brahma
You need to enable auditing on the Azure SQL server, and then you can check the logs in Azure Log Analytics.
The easiest way to enable auditing is through the Azure portal. However, it can also be set up through ARM templates, Azure PowerShell, or the Azure CLI.
Auditing can be enabled either at the individual database level or at the logical server level. If enabled at the server level, it automatically applies to existing databases and any new databases that are created.
However, enabling it at both the server and database level leads to duplicate logs.
On the overview page of the desired Azure SQL server, there is an option for "Auditing" in the left pane.
By default, auditing is off. Enable it, choose the Log Analytics workspace where you want to store the logs, and click Save.
Click Add diagnostic setting. Let us enable diagnostics for Errors and InstanceAndAppAdvanced, send this data to the Log Analytics workspace in your subscription, and click Save for the configuration.
To view the logs, open the Log Analytics workspace that was configured as a sink, choose Logs, and select the scope.
Summarizing the connection attempts by caller IP address:
AzureDiagnostics
| summarize count() by client_ip_s
Source: https://www.mssqltips.com/sqlservertip/6782/kusto-query-language-query-audit-data-azure-sql-database/
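If you want to run the same audit query programmatically rather than in the portal, here is a small hedged sketch with the azure-monitor-query SDK; the workspace ID is a placeholder:

from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

# Same summarization as above: connection attempts per caller IP.
response = client.query_workspace(
    workspace_id="<log-analytics-workspace-guid>",
    query="AzureDiagnostics | summarize count() by client_ip_s",
    timespan=timedelta(days=7),
)

for table in response.tables:
    for row in table.rows:
        print(row)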

Connecting a Data Lake Gen2 storage account to a Log Analytics workspace

I have a Data Lake Gen2 storage account.
I need to connect my storage account logs to a Log Analytics workspace.
But there is no Diagnostic settings menu, so I don't know how to do it.
I think this was supported by Data Lake Gen1; is there a workaround for Data Lake Gen2?
Thank you.
There is a Diagnostic settings option at the end of the left sidebar, but you have to scroll quite a bit to find it.
Sadly, I believe there is currently no option to automatically send these diagnostic logs to a Log Analytics workspace. The classic logs are generated inside a folder named "$logs" at the root of your storage account, and it is only visible through Azure Storage Explorer.
Microsoft provides a PowerShell script in the Azure GitHub repository that uploads the generated log files to a Log Analytics workspace of your choice.
You can refer to this official guide from Microsoft to build the workflow for sending your logs to Log Analytics: Querying Azure Storage logs in Azure Monitor Log Analytics.
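If PowerShell is not an option, below is a rough Python sketch of the first half of that workflow: enumerating the classic $logs container with the azure-storage-blob SDK. The account URL is a placeholder, and the identity needs read access to the blobs:

from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient(
    account_url="https://<account>.blob.core.windows.net",  # placeholder
    credential=DefaultAzureCredential(),
)

# The classic storage analytics logs live in a hidden container named $logs.
container = service.get_container_client("$logs")

for blob in container.list_blobs():
    print(blob.name, blob.size)
    # From here, each log file could be downloaded and pushed to Log
    # Analytics, which is what the linked PowerShell script automates.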
