How to ingest Azure web app/web job custom logs into an Azure Monitor Log Analytics workspace

We have websites and webjobs hosted in Azure App Service that log custom application log data to Azure Blob storage (using the Monitoring > App Service logs > Application Logging (Blob) option in the app service). We would like to send these log files to an Azure Monitor Log Analytics workspace as and when they are written to blob storage, so we can aggregate the logs, send alerts, etc. It looks easy to send custom log data from an Azure VM to a Log Analytics workspace by installing the Microsoft Monitoring Agent on the VM, but there appears to be no direct support for sending log data from blob storage. Does anybody have a solution for this?
I've explored using Logic Apps to send data from Blob storage to a Log Analytics workspace but didn't have much luck.

AFAIK, the current best approach to accomplish your requirement is to make use of the Azure Log Analytics HTTP Data Collector API, which lets you send custom log data to a Log Analytics workspace repository. For illustration, you can also see the sample code in the article.
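For reference, here is a minimal Python sketch of that pattern: it signs a request with the workspace key and posts one custom record, following the signing scheme documented for the Data Collector API. The workspace ID, shared key, Log-Type name, and the sample record are placeholders you would replace with your own.

```python
import base64
import hashlib
import hmac
import json
from datetime import datetime, timezone

import requests

WORKSPACE_ID = "<your-workspace-id>"   # placeholder
SHARED_KEY = "<your-primary-key>"      # placeholder: workspace primary key
LOG_TYPE = "AppServiceBlobLogs"        # custom log type; records land in AppServiceBlobLogs_CL

def build_signature(date, content_length):
    # String-to-sign format required by the Data Collector API
    string_to_sign = f"POST\n{content_length}\napplication/json\nx-ms-date:{date}\n/api/logs"
    decoded_key = base64.b64decode(SHARED_KEY)
    digest = hmac.new(decoded_key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    return f"SharedKey {WORKSPACE_ID}:{base64.b64encode(digest).decode()}"

def post_records(records):
    body = json.dumps(records)
    rfc1123_date = datetime.now(timezone.utc).strftime("%a, %d %b %Y %H:%M:%S GMT")
    headers = {
        "Content-Type": "application/json",
        "Log-Type": LOG_TYPE,
        "x-ms-date": rfc1123_date,
        "Authorization": build_signature(rfc1123_date, len(body)),
    }
    url = f"https://{WORKSPACE_ID}.ods.opinsights.azure.com/api/logs?api-version=2016-04-01"
    response = requests.post(url, data=body, headers=headers)
    response.raise_for_status()

# Example: one custom log record read from a blob-stored log file (hypothetical fields)
post_records([{"Level": "Error", "Message": "Payment webjob failed", "Site": "mysite"}])
```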
Hope this helps!! Cheers!!

One thing to watch out for is the data limits of the Azure Log Analytics HTTP Data Collector API, especially if you're logging potentially large blobs from blob storage.
Quoting from the Data limits section of the Send log data to Azure Monitor by using the HTTP Data Collector API (preview) document:
The data posted to the Azure Monitor Data collection API is subject to certain constraints:
- Maximum of 30 MB per post to the Azure Monitor Data Collector API. This is a size limit for a single post. If the data from a single post exceeds 30 MB, you should split the data into smaller-sized chunks and send them concurrently.
- Maximum of 32 KB for field values. If the field value is greater than 32 KB, the data will be truncated.
- Recommended maximum of 50 fields for a given type. This is a practical limit from a usability and search experience perspective.
- Tables in Log Analytics workspaces support only up to 500 columns (referred to as fields in this article).
- Maximum of 45 characters for column names.
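If your blob-derived payloads can approach that 30 MB ceiling, you will need to split them before posting. A hypothetical helper along these lines could do the chunking (the helper name and record format are assumptions, not part of the API):

```python
import json

MAX_POST_BYTES = 30 * 1024 * 1024  # documented 30 MB limit per post

def chunk_records(records, max_bytes=MAX_POST_BYTES):
    """Yield lists of records whose serialized JSON stays under max_bytes."""
    batch, batch_size = [], 2  # 2 bytes for the surrounding "[]"
    for record in records:
        record_size = len(json.dumps(record).encode("utf-8")) + 1  # +1 for the comma
        if batch and batch_size + record_size > max_bytes:
            yield batch
            batch, batch_size = [], 2
        batch.append(record)
        batch_size += record_size
    if batch:
        yield batch

# Usage: post each chunk separately (see the post_records sketch above)
# for chunk in chunk_records(all_records):
#     post_records(chunk)
```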

Related

Where can I find Azure logs for tracking API responses

I want to view logs in Azure. I mean the logs that I see in the console on localhost; where can I find them for a web site deployed in Azure? I am consuming an external API and I want to see what I send and what I receive from the prod environment.
Thank you
There are a couple of ways to track logs in Azure:
- Azure API Management
- Azure Monitor
Azure API Management helps you track all kinds of requests, including:
- View activity logs
- View resource logs
- View metrics of your API
- Set up an alert rule when your API gets unauthorized calls
Azure Monitor, on the other hand, makes it possible to programmatically retrieve the available default metric definitions, granularity, and metric values.
The data can be saved in a separate data store such as Azure SQL Database, Azure Cosmos DB, or Azure Data Lake. From there additional analysis can be performed as needed.
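As a hedged illustration of that programmatic route, the azure-monitor-query Python package can retrieve metric values for a resource; the resource ID and metric name below are placeholders for your own environment:

```python
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import MetricsQueryClient

# Placeholder resource ID of the App Service / API Management instance to inspect
resource_id = (
    "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Web/sites/<app-name>"
)

client = MetricsQueryClient(DefaultAzureCredential())

# Pull request-count metrics for the last day at the default granularity
result = client.query_resource(
    resource_id,
    metric_names=["Requests"],
    timespan=timedelta(days=1),
)

for metric in result.metrics:
    for series in metric.timeseries:
        for point in series.data:
            print(point.timestamp, point.total)
```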

Searching Storage Account with Azure Log Analytics

Using Log Analytics, is it possible to search through data stored in a container inside an Azure storage account? We have an Azure Function that reaches out to an API in O365 for log data and then pushes that data into a storage account. We would like to be able to query this data.
You can push content inside your container to a Log Analytics workspace repository using the Log Analytics HTTP Data Collector API.
You need to build your own integration that sends the container content to Log Analytics by leveraging the HTTP Data Collector API.
You may refer to the suggestion mentioned in this article:
https://learn.microsoft.com/en-us/azure/azure-monitor/platform/data-collector-api
Additional information:
- Azure Functions
- Azure Automation
- Logic App
With any of these, what you will do is have a schedule that runs at a certain interval. When it runs, you will execute a query against Log Analytics to get the data, and transfer the results of the query to Azure Storage, for example as a blob. You might have to do some transformation of the data depending on your scenario. The most important thing is to make sure you do not miss data or upload the same data twice to storage. The Log Analytics query language allows you to specify a time frame for the results. I hope this will help you.
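A rough sketch of such a scheduled export, assuming the azure-monitor-query and azure-storage-blob Python packages, with placeholder names for the workspace, query, and container:

```python
from datetime import datetime, timedelta, timezone

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient
from azure.storage.blob import BlobServiceClient

WORKSPACE_ID = "<your-workspace-id>"            # placeholder
STORAGE_CONN_STR = "<storage-connection-str>"   # placeholder
QUERY = "O365Logs_CL | project TimeGenerated, Message"  # hypothetical custom table

credential = DefaultAzureCredential()
logs_client = LogsQueryClient(credential)
blob_service = BlobServiceClient.from_connection_string(STORAGE_CONN_STR)

# Query a fixed, non-overlapping window so the same data is never exported twice
end = datetime.now(timezone.utc).replace(minute=0, second=0, microsecond=0)
start = end - timedelta(hours=1)

response = logs_client.query_workspace(WORKSPACE_ID, QUERY, timespan=(start, end))

# Flatten rows to comma-separated lines and upload them as one blob per window
lines = []
for table in response.tables:
    for row in table.rows:
        lines.append(",".join(str(col) for col in row))

blob_name = f"log-export/{start:%Y%m%d%H}.csv"
blob_service.get_blob_client("exports", blob_name).upload_blob("\n".join(lines), overwrite=True)
```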
Kindly let us know if the above helps or you need further assistance on this issue.

Azure Data Factory Pipeline Logs

Where do the data logs of Azure Pipeline v2 get stored? I would like to retrieve data about failed pipelines for a specific date. (I don't want to use the Azure portal to view this data.) Is there any table/view in a database that holds such data logs?
To my knowledge, to obtain diagnostic logs you can use Azure Monitor, Operations Management Suite (OMS), or monitor those pipelines visually.
By Azure Pipeline v2, you presumably mean Azure Data Factory v2. See Alert and monitor data factories by using Azure Monitor.
Diagnostic logs:
- Save them to a storage account for auditing or manual inspection. You can specify the retention time (in days) using the diagnostic settings.
- Stream them to Event Hubs for ingestion by a third-party service or a custom analytics solution such as Power BI.
- Analyze them with Log Analytics.
The logs are stored on the Azure Data Factory web server for 45 days. If you want to get the pipeline run and activity run metadata, you can use the Azure Data Factory SDK to extract the information you need and save it somewhere you want.
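As an illustration, a hedged sketch using the azure-mgmt-datafactory Python SDK to pull failed pipeline runs for a specific date; the subscription, resource group, factory name, and date are placeholders:

```python
from datetime import datetime, timezone

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import RunFilterParameters, RunQueryFilter

# Placeholders for your environment
subscription_id = "<subscription-id>"
resource_group = "<resource-group>"
factory_name = "<data-factory-name>"

client = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)

# Query pipeline runs that failed on a specific (example) date, in UTC
filters = RunFilterParameters(
    last_updated_after=datetime(2023, 1, 15, tzinfo=timezone.utc),
    last_updated_before=datetime(2023, 1, 16, tzinfo=timezone.utc),
    filters=[RunQueryFilter(operand="Status", operator="Equals", values=["Failed"])],
)

runs = client.pipeline_runs.query_by_factory(resource_group, factory_name, filters)
for run in runs.value:
    print(run.pipeline_name, run.run_start, run.message)
```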
The recommended approach for long-term analysis, as well as for limiting access to a production data factory, is to configure the logs to be sent to Log Analytics. Be sure to enable dedicated logging tables, as this will help on the backend in terms of organizing your logs.
From there you can also set up alerts and access groups running off of log analytics queries for better monitoring.

Azure activity logs not displaying any write data

I'm trying to set up logging for a storage resource (a table specifically, though it seems the activity log doesn't distinguish and just logs the entire storage account).
The logging seems to capture my ListKeys operations and occasional access from Application Insights, but isn't logging any writes/reads I'm making to the tables themselves through either my app or Microsoft Azure Storage Explorer. This table has been written to multiple times over the past few weeks, yet none of that activity shows up.
Am I misinterpreting this page, which states that this activity log should track posts/deletes? Do I need any additional setup to track these operations?
Per my understanding, you could leverage Storage Analytics logging to log the operations on your storage. For the detailed operations that are logged for each storage service, you can refer to this official document.
Based on your description, I tested operations against Table storage by using the REST API and the Storage Explorer tool. Here are my test results, which you can refer to:
(Screenshots: Table Storage Analytics logging and Table Storage Metrics results)
As noted in this document:
As requests are logged, Storage Analytics will upload intermediate results as blocks. Periodically, Storage Analytics will commit these blocks and make them available as a blob.
In summary, please follow this tutorial to enable and configure Storage Analytics, then wait for some time and check your table storage logging.
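Once Storage Analytics logging is enabled, the log blobs land in the special $logs container of the same storage account. A small sketch for peeking at the table-service logs with the azure-storage-blob Python package (the connection string is a placeholder):

```python
from azure.storage.blob import BlobServiceClient

# Placeholder connection string for the storage account being logged
service = BlobServiceClient.from_connection_string("<storage-connection-string>")

# Storage Analytics writes its logs to the special $logs container,
# under paths like table/YYYY/MM/DD/hhmm/000000.log
logs_container = service.get_container_client("$logs")

for blob in logs_container.list_blobs(name_starts_with="table/"):
    print(blob.name)
    content = logs_container.download_blob(blob.name).readall()
    # Each line is one semicolon-delimited log entry (operation, status, timestamps, ...)
    print(content.decode("utf-8").splitlines()[:5])
```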
If you are leveraging the Azure Activity log, remember that it is meant for control-plane operations, so ListKeys would show up there.
If you are looking for data-plane operations (such as entity writes into a table), then make sure Diagnostics are turned on inside the storage account that you are writing to.
Azure Activity Log is only for management plane records through Azure Resource Manager (ARM), specifically PUT/DELETE/POST which includes ListKeys which is an HTTP POST.
For storage analytics logging, you can use this article to see the types of data logged.

Azure storage metrics data

I am trying to implement Azure storage metrics code in my role, but I am checking whether there is an easy way to get Azure storage metrics data about my file usage. My code is stable and I do not want to change the code again.
Actually, if you already have a Windows Azure role running, then you don't need to make any changes to your code and you can still get Windows Azure Blob storage metrics data.
I have written a blog post about this: Collecting Windows Azure Storage REST API level metrics data without a single line of programming, just by using tools.
Please try the above and see if it works for you.
Storage Analytics is disabled by default, so any operations against your storage up until now have not been logged for analysis.
You may choose to enable analytics at any time, for both logging (detailed access information for every single object) and metrics (hourly rollups). Further, you may choose which specific storage service to track (blobs, tables, queues) and which operations to track (read, write, delete). Once analytics are enabled, you may access the resulting analytics data from any app (as long as you have the storage account name + key).
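As a minimal sketch of enabling analytics programmatically, assuming the azure-storage-blob Python package and a placeholder connection string (the blob service is shown; the table and queue services follow the same pattern through their own SDKs):

```python
from azure.storage.blob import (
    BlobAnalyticsLogging,
    BlobServiceClient,
    Metrics,
    RetentionPolicy,
)

# Placeholder connection string for the storage account
service = BlobServiceClient.from_connection_string("<storage-connection-string>")

retention = RetentionPolicy(enabled=True, days=7)

# Log every read, write, and delete against the blob service
logging = BlobAnalyticsLogging(read=True, write=True, delete=True, retention_policy=retention)

# Hourly and per-minute rollups of request counts, latency, errors, etc.
hour_metrics = Metrics(enabled=True, include_apis=True, retention_policy=retention)
minute_metrics = Metrics(enabled=True, include_apis=True, retention_policy=retention)

service.set_service_properties(
    analytics_logging=logging,
    hour_metrics=hour_metrics,
    minute_metrics=minute_metrics,
)
```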
Persistent Systems just published a blog post on enabling Storage Analytics for Java apps. The same principles may be applied to a .NET app (and the SDKs are very similar).
Additionally, Full Scale 180 published a sample app encapsulating Storage Analytics (based on the REST API, as it was written before SDK v1.6 came out).
