Log Analytics data export to storage account - all tables - Azure

I want to use Azure Log Analytics with the data export feature to export all log tables to a storage account. There used to be an '--export-all-tables' option, but annoyingly this has been removed.
Is there a way I can export all tables? Not just the ones that exist at the moment, but any future ones that may be created?
Azure Policy?
Azure Functions?
Azure Logic App?

We can archive the data with the help of a Logic App: the logic app runs a query against the workspace and uses its output in other actions in the workflow. The Azure Blob Storage connector is then used to send the query output to blob storage.
To achieve this we only need access to the Log Analytics workspace and the storage account.
To pick up any new data, we can add a recurrence trigger to the logic app so that it runs once a day, or as often as required.
After setting up the trigger, click + New step to add an action that runs after the recurrence action. Under Choose an action, type "azure monitor" and then select Azure Monitor Logs.
After configuring the rest of the workflow, add a Create blob action and attach it to the workflow.
We can then run the logic app and check the storage account for the logs.
See the Microsoft documentation to learn more: Archive data from a Log Analytics workspace to Azure storage using Logic App.
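If you would rather keep using the built-in data export feature instead of a Logic App, another option is to re-sync an export rule with the workspace's current list of tables on a schedule, so that newly created tables are picked up on the next run. Below is only a rough sketch assuming the azure-mgmt-loganalytics SDK; the rule name is illustrative and the exact model field names may differ between SDK versions.

```python
# pip install azure-identity azure-mgmt-loganalytics
# Sketch: enumerate the workspace tables and keep one export rule in sync,
# so that tables created later are included the next time this runs.
from azure.identity import DefaultAzureCredential
from azure.mgmt.loganalytics import LogAnalyticsManagementClient
from azure.mgmt.loganalytics.models import DataExport

SUBSCRIPTION_ID = "<subscription-id>"     # placeholder
RESOURCE_GROUP = "<resource-group>"       # placeholder
WORKSPACE_NAME = "<workspace-name>"       # placeholder
STORAGE_ACCOUNT_ID = (                    # placeholder destination resource ID
    "/subscriptions/<subscription-id>/resourceGroups/<resource-group>"
    "/providers/Microsoft.Storage/storageAccounts/<storage-account>"
)

client = LogAnalyticsManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# List every table currently present in the workspace.
table_names = [t.name for t in client.tables.list_by_workspace(RESOURCE_GROUP, WORKSPACE_NAME)]

# Create or update a single export rule covering all of those tables.
# NOTE: the field names (table_names, resource_id, enable) are assumptions based
# on the DataExport model; adjust them if your SDK version differs.
client.data_exports.create_or_update(
    RESOURCE_GROUP,
    WORKSPACE_NAME,
    "export-all-tables",                  # illustrative rule name
    DataExport(table_names=table_names, resource_id=STORAGE_ACCOUNT_ID, enable=True),
)
```

Running this on a schedule (for example from an Azure Function or Automation runbook) approximates the removed '--export-all-tables' behaviour.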

Related

Can Synapse Spark connect to a "Log Analytics workspace"?

I need to export from "Log Analytics workspace" to storage account in parquet / delta format. How can I achieve this?
Using the ADX Spark connector in a notebook, I get an error saying that the URL is invalid; I'm using the URL of the Log Analytics workspace instead of an ADX cluster.
As a different approach, apart from ADX and notebooks, we can use a logic app and then look for the exported logs in the storage account.
To learn more about using a Logic App to archive data from a Log Analytics workspace to Azure storage, consult the Microsoft documentation.
To do this, we only need access to the Log Analytics workspace and the storage account.
Additionally, we can add a recurrence trigger to the logic app that runs once daily, or as frequently as needed, to upload all the new data.
After the trigger has been set up, click + New step to add an action that runs after the recurrence action. Type "azure monitor" into the Choose an action box and select Azure Monitor Logs. After configuring the full workflow, add a Create blob action and attach it to the workflow.
We can then run the logic app and check the storage account for the logs.
Reference: Microsoft documentation, Export data from a Log Analytics workspace to a storage account by using Logic Apps.
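Since the requirement here is parquet output rather than the raw JSON a Logic App writes, a rough alternative sketch is to query the workspace with the azure-monitor-query SDK and land the result as parquet via pandas/pyarrow; the workspace ID, table, connection string and container below are placeholders.

```python
# pip install azure-identity azure-monitor-query azure-storage-blob pandas pyarrow
import io
from datetime import timedelta

import pandas as pd
from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient
from azure.storage.blob import BlobClient

WORKSPACE_ID = "<log-analytics-workspace-id>"   # placeholder
STORAGE_CONN = "<storage-connection-string>"    # placeholder

logs_client = LogsQueryClient(DefaultAzureCredential())

# Query one table for the last day (AzureActivity is just an example table).
response = logs_client.query_workspace(
    WORKSPACE_ID,
    "AzureActivity | where TimeGenerated > ago(1d)",
    timespan=timedelta(days=1),
)
table = response.tables[0]
df = pd.DataFrame(table.rows, columns=table.columns)

# Serialize to parquet in memory and upload it as a blob.
buffer = io.BytesIO()
df.to_parquet(buffer, index=False)
buffer.seek(0)

blob = BlobClient.from_connection_string(
    STORAGE_CONN, container_name="law-export", blob_name="azureactivity.parquet"
)
blob.upload_blob(buffer, overwrite=True)
```

If delta format is required, the same data could instead be written out with Spark in the Synapse notebook once it is available as a DataFrame.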

Searching Storage Account with Azure Log Analytics

Using Log Analytics, is it possible to search through data stored in a container inside an Azure storage account? We have an Azure Function that reaches out to an O365 API for log data and then pushes that data into a storage account. We would like to be able to query this data.
You can push the content of your container into a Log Analytics workspace by using the HTTP Data Collector API.
You would need to build your own integration that sends the container content to Log Analytics by leveraging the HTTP Data Collector API.
You may refer to the suggestion mentioned in this article:
https://learn.microsoft.com/en-us/azure/azure-monitor/platform/data-collector-api
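For reference, here is a condensed sketch of such an integration, following the signing scheme described in that article; the workspace ID, shared key and custom log type are placeholders.

```python
# pip install requests
import base64, datetime, hashlib, hmac, json
import requests

WORKSPACE_ID = "<log-analytics-workspace-id>"   # placeholder
SHARED_KEY = "<workspace-primary-key>"          # placeholder
LOG_TYPE = "ContainerContent"                   # custom log type; data lands in ContainerContent_CL

def build_signature(date, content_length):
    # Signature string defined by the Data Collector API: method, length,
    # content type, x-ms-date header and resource path, signed with HMAC-SHA256.
    string_to_hash = f"POST\n{content_length}\napplication/json\nx-ms-date:{date}\n/api/logs"
    decoded_key = base64.b64decode(SHARED_KEY)
    digest = hmac.new(decoded_key, string_to_hash.encode("utf-8"), hashlib.sha256).digest()
    return f"SharedKey {WORKSPACE_ID}:{base64.b64encode(digest).decode()}"

def post_data(records):
    body = json.dumps(records)
    rfc1123date = datetime.datetime.utcnow().strftime("%a, %d %b %Y %H:%M:%S GMT")
    headers = {
        "Content-Type": "application/json",
        "Authorization": build_signature(rfc1123date, len(body)),
        "Log-Type": LOG_TYPE,
        "x-ms-date": rfc1123date,
    }
    uri = f"https://{WORKSPACE_ID}.ods.opinsights.azure.com/api/logs?api-version=2016-04-01"
    return requests.post(uri, data=body, headers=headers).status_code

# Example: push one record read from the storage container.
print(post_data([{"FileName": "audit-2024-01-15.json", "Content": "..."}]))
```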
Additional information:
- Azure Functions
- Azure Automation
- Logic App
With any of these, you set up a schedule that runs at a certain interval. When it runs, you execute a query against Log Analytics to get the data and transfer the results to Azure Storage, for example as a blob. You might have to do some transformation on the data, depending on your scenario. The most important thing is to make sure that you do not miss data and do not upload the same data twice; the Log Analytics query language lets you specify a time frame for the results, which helps with this. I hope this helps.
Kindly let us know if the above helps or if you need further assistance on this issue.

Can Azure Event Hub ingest JSON events from Azure Blob Storage without writing any code?

Is it possible to use some ready-made construct in the Azure cloud environment to ingest the events (in JSON format) that are currently stored in Azure Blob Storage and have it submit those events directly to Azure Event Hub, without writing any (however small) custom code? In other words, I would like to use a configuration-driven approach only.
Sure. You can use Azure Logic Apps to realize your needs without any code, or with just some function expressions; please refer to the official documentation for Azure Logic Apps for more details.
The logic flow is described below; you can refer to my sample to make it work.
Here is my sample, which receives an event from an Event Hub and transfers it to Azure Blob Storage, creating a new blob to store the event data:
Create an Azure Logic App instance in the Azure portal; this should be straightforward.
Move to the Logic app designer tab to configure the logic flow.
Click the Save and Run buttons. Then use ServiceBusExplorer (downloaded from https://github.com/paolosalvatori/ServiceBusExplorer/releases) to send an event message, and use Azure Storage Explorer to check whether a new blob was created. It works fine after a few minutes.
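For comparison, this is roughly what the same flow looks like if a small script ever becomes acceptable; it is only a sketch assuming the azure-eventhub and azure-storage-blob SDKs, with the connection strings, hub name and container name as placeholders.

```python
# pip install azure-eventhub azure-storage-blob
# Sketch of the flow above in code: receive events from an Event Hub and write
# each event body to a new blob.
import uuid

from azure.eventhub import EventHubConsumerClient
from azure.storage.blob import ContainerClient

EVENTHUB_CONN = "<event-hub-connection-string>"   # placeholder
STORAGE_CONN = "<storage-connection-string>"      # placeholder

container = ContainerClient.from_connection_string(STORAGE_CONN, "events")

def on_event(partition_context, event):
    # Store the raw JSON event body as a new blob with a unique name.
    container.upload_blob(f"{uuid.uuid4()}.json", event.body_as_str())

consumer = EventHubConsumerClient.from_connection_string(
    EVENTHUB_CONN, consumer_group="$Default", eventhub_name="<event-hub-name>"
)
with consumer:
    consumer.receive(on_event=on_event, starting_position="-1")
```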

Azure Data Factory Pipeline Logs

Where do the data logs of Azure Pipeline v2 get stored? I would like to retrieve data about failed pipelines for a specific date (I don't want to use the Azure portal to view this data). Is there any table/view in a database that holds such logs?
To my knowledge, to obtain diagnostic logs you can use Azure Monitor, Operations Management Suite (OMS), or monitor those pipelines visually.
By Azure Pipeline v2, I assume you mean Azure Data Factory v2; see Alert and monitor data factories by using Azure Monitor.
Diagnostic logs:
- Save them to a storage account for auditing or manual inspection. You can specify the retention time (in days) using the diagnostic settings.
- Stream them to Event Hubs for ingestion by a third-party service or a custom analytics solution such as Power BI.
- Analyze them with Log Analytics.
The run logs are retained by the Azure Data Factory service for 45 days. If you want to keep the pipeline run and activity run metadata, you can use the Azure Data Factory SDK to extract the information you need and save it wherever you want.
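For example, a rough sketch using the azure-mgmt-datafactory SDK to pull the failed pipeline runs for a specific date; the subscription, resource group and factory names are placeholders.

```python
# pip install azure-identity azure-mgmt-datafactory
from datetime import datetime, timedelta

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import RunFilterParameters, RunQueryFilter

SUBSCRIPTION_ID = "<subscription-id>"   # placeholder
RESOURCE_GROUP = "<resource-group>"     # placeholder
FACTORY_NAME = "<data-factory-name>"    # placeholder

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Query pipeline runs updated on one day, filtered down to Status == Failed.
day = datetime(2024, 1, 15)
filters = RunFilterParameters(
    last_updated_after=day,
    last_updated_before=day + timedelta(days=1),
    filters=[RunQueryFilter(operand="Status", operator="Equals", values=["Failed"])],
)

runs = client.pipeline_runs.query_by_factory(RESOURCE_GROUP, FACTORY_NAME, filters)
for run in runs.value:
    print(run.pipeline_name, run.run_id, run.status, run.message)
```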
The recommended approach for long-term analysis, as well as for limiting access to a production data factory, is to configure the logs to be sent to Log Analytics. Be sure to enable the dedicated (resource-specific) logging tables, as this helps with organizing your logs on the back end.
From there you can also set up alerts and access groups driven by Log Analytics queries for better monitoring.
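And once the diagnostics are flowing into Log Analytics with the resource-specific tables enabled, the same failed runs can be pulled back out with a query against the ADFPipelineRun table, for example (the workspace ID is a placeholder):

```python
# pip install azure-identity azure-monitor-query
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

WORKSPACE_ID = "<log-analytics-workspace-id>"   # placeholder

client = LogsQueryClient(DefaultAzureCredential())
response = client.query_workspace(
    WORKSPACE_ID,
    'ADFPipelineRun | where Status == "Failed" '
    '| project TimeGenerated, PipelineName, RunId, Status',
    timespan=timedelta(days=1),
)
for row in response.tables[0].rows:
    print(row)
```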

Data export from SQL Azure Database

We have a requirement to provide data from our database to different vendors in the form of a text file. The file should be generated on a daily basis. Is there any resource or application in Azure that we can leverage in order to accomplish this?
Regards,
Lolek
You can use Azure Functions to read from a SQL Azure database, as explained in the following resource:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-scenario-database-table-cleanup
The following resource shows how you can write from an Azure Function to a BLOB storage account.
https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/azure-functions/functions-reference-csharp.md#binding-at-runtime-via-imperative-bindings
The following article shows you how to schedule the Azure Function or automate its execution.
https://learn.microsoft.com/en-us/azure/azure-functions/functions-create-scheduled-function
Hope this helps.
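Putting those pieces together, here is a rough sketch of the core export logic such a function could run on its schedule; the connection strings, query, delimiter and container name are all placeholders.

```python
# pip install pyodbc azure-storage-blob
# Sketch: read rows from the Azure SQL database, render them as a delimited text
# file and upload it to blob storage. This logic could be wrapped in a
# timer-triggered Azure Function as described in the links above.
import csv
import io
from datetime import date

import pyodbc
from azure.storage.blob import BlobClient

SQL_CONN = "<azure-sql-odbc-connection-string>"   # placeholder
STORAGE_CONN = "<storage-connection-string>"      # placeholder

# Pull the data for today's extract.
with pyodbc.connect(SQL_CONN) as conn:
    cursor = conn.cursor()
    cursor.execute("SELECT * FROM dbo.VendorExport")  # placeholder query
    columns = [c[0] for c in cursor.description]
    rows = cursor.fetchall()

# Render the rows as a pipe-delimited text file in memory.
buffer = io.StringIO()
writer = csv.writer(buffer, delimiter="|")
writer.writerow(columns)
writer.writerows(rows)

# Upload one file per day, e.g. vendor-export-2024-01-15.txt.
blob = BlobClient.from_connection_string(
    STORAGE_CONN,
    container_name="vendor-exports",
    blob_name=f"vendor-export-{date.today().isoformat()}.txt",
)
blob.upload_blob(buffer.getvalue(), overwrite=True)
```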
Use Azure Data Factory to export the desired data from SQL Azure and schedule the job to put the file in Blob storage.
