Azure Blob Storage logging not working at all

I was trying to set up logging for a container and followed the documentation by enabling Diagnostic settings (classic). However, while the $logs folder is created, it simply stayed empty (for hours) while I was uploading and downloading files.
Am I missing something obvious?

Sometimes the logs may not appear in the portal. Try checking $logs from Azure Storage Explorer.
They can also be viewed through PowerShell or programmatically.
Ensure that the Delete data check box is selected. Then set the number of days that you would like log data to be retained.
Please check whether the retention policy is unset or set to 0, which may cause logs not to be retained or shown.
Reference: Enable and manage Azure Storage Analytics logs (classic) | Microsoft Docs
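As a minimal sketch of checking the $logs container programmatically (assuming the Azure.Storage.Blobs .NET SDK; the connection string is a placeholder):

using System;
using Azure.Storage.Blobs;

class CheckLogsContainer
{
    static void Main()
    {
        // Placeholder; replace with your storage account's connection string.
        string connectionString = "<storage-account-connection-string>";

        // $logs is the hidden container that Storage Analytics (classic) writes to.
        var container = new BlobContainerClient(connectionString, "$logs");

        // An empty listing means logging has not produced any data yet
        // (latency, retention, or configuration issue).
        foreach (var blob in container.GetBlobs())
        {
            Console.WriteLine($"{blob.Name} ({blob.Properties.ContentLength} bytes)");
        }
    }
}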
It's also good to note that for some app types the application logs cannot be written to blob storage and are only available on the file system, which is why Node.js apps use console.log('message') and console.error('message').
If you want to write application logs to Azure Blob storage, you first need to enable Application Logging and configure blob storage for it in the Azure portal, and in some cases set the level to Verbose.
Other references:
c# - ASP.NET Core Application Logs Not Written To Blob in Azure App Service - Stack Overflow
logging - Azure WebApp Not Sending Application Logs to Blob Storage - Stack Overflow
.net core - Azure WebJob not logging to blob storage - Stack Overflow

In case someone else stumbles across this: after some back-and-forth with technical support, the problem turned out to be related to certain storage account settings, in particular account-wide immutability settings.
What solved the problem for us was to disable immutability at the storage account level and instead set it at the container level.

Related

How to find who has deleted the file share folder from the storage account in Azure

Someone has deleted the file share folder from the storage account in Azure. It can be recovered since soft delete is enabled, but how do I find out who deleted the file?
It is possible to view operations within an Azure resource using resource logs, which is part of monitoring Azure Blob Storage with Azure Monitor.
You would first start by creating a diagnostic setting: https://learn.microsoft.com/en-us/azure/storage/blobs/monitor-blob-storage?tabs=azure-portal#creating-a-diagnostic-setting
Then view the logged activity using a Log Analytics query, or go to the destination you are forwarding the logs to (as set up in the diagnostic setting) and look for the respective API operation, for example "DeleteBlob" or "DeleteContainer".
However, if you had not already set up a diagnostic setting forwarding data to a destination, it may not be possible to retrieve this information retroactively. Hope this helps!
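As a rough sketch of running such a query from code (assuming the diagnostic setting forwards to a Log Analytics workspace and the Azure.Monitor.Query .NET SDK; the workspace ID is a placeholder, and the StorageBlobLogs table and column names should be verified against your workspace):

using System;
using Azure;
using Azure.Identity;
using Azure.Monitor.Query;

class FindDeleteOperations
{
    static void Main()
    {
        // Placeholder; the workspace your diagnostic setting forwards blob logs to.
        string workspaceId = "<log-analytics-workspace-id>";

        var client = new LogsQueryClient(new DefaultAzureCredential());

        // Look for delete operations in the StorageBlobLogs resource log table.
        string query = @"StorageBlobLogs
            | where OperationName in ('DeleteBlob', 'DeleteContainer')
            | project TimeGenerated, OperationName, AuthenticationType, RequesterObjectId, Uri";

        Response<LogsQueryResult> result = client.QueryWorkspace(
            workspaceId, query, new QueryTimeRange(TimeSpan.FromDays(7)));

        foreach (var row in result.Value.Table.Rows)
        {
            Console.WriteLine(string.Join(" | ", row));
        }
    }
}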

How to get request details coming to Azure Storage for monitoring?

For the past few days, I have been charged for LRS read and write operations on Azure container storage. I have checked all the metrics for API calls to that container and everything looks normal, but the storage container metrics show lots of requests every minute.
The request count is around 35k over the last few days, and I don't know where all these requests are coming from.
To get request details coming to Azure Storage for monitoring, you can make use of Storage Analytics.
Azure Storage Analytics performs logging and provides metrics data for a storage account. You can use this data to trace requests, analyze usage trends, and diagnose issues with your storage account.
Please check whether you have enabled Azure Storage Analytics logging; if not, enable it as below:
Go to Azure portal -> Storage Accounts -> Your storage account -> Monitoring -> Diagnostic settings
The diagnostic logs will be saved in a container called $logs, which appears after you enable Storage Analytics logging.
Also check whether you have any Azure App Services or Azure Functions using the storage account in the background by enabling their diagnostic settings.
If the issue still persists, please raise an Azure support request to find out where exactly it is going wrong.
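To get an idea of where the requests come from, here is a rough sketch that tallies operation types from the classic $logs entries (assuming the Azure.Storage.Blobs .NET SDK; the connection string is a placeholder, and the semicolon-delimited field positions are an assumption based on the Storage Analytics 1.0 log format):

using System;
using System.Collections.Generic;
using Azure.Storage.Blobs;

class TallyRequests
{
    static void Main()
    {
        // Placeholder; replace with your storage account's connection string.
        string connectionString = "<storage-account-connection-string>";
        var container = new BlobContainerClient(connectionString, "$logs");
        var countsByOperation = new Dictionary<string, int>();

        foreach (var blobItem in container.GetBlobs())
        {
            string content = container.GetBlobClient(blobItem.Name)
                                      .DownloadContent().Value.Content.ToString();

            foreach (var line in content.Split('\n', StringSplitOptions.RemoveEmptyEntries))
            {
                // Entries are semicolon-delimited; the operation type is assumed to be
                // the third field (version;request-start-time;operation-type;...).
                var fields = line.Split(';');
                if (fields.Length < 3) continue;
                countsByOperation[fields[2]] = countsByOperation.GetValueOrDefault(fields[2]) + 1;
            }
        }

        foreach (var pair in countsByOperation)
        {
            Console.WriteLine($"{pair.Key}: {pair.Value}");
        }
    }
}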
References:
How can I find the source of my Hot LRS Write Operations on Azure Storage Account? - Stack Overflow
Azure Storage Analytics metrics (classic) | Microsoft Docs

Diagnostic Logs (Storage) not showing up in Azure Storage Account

I'm using services such as the on-premises data gateway, Logic Apps, etc., and I've configured diagnostics on these instances to send logs to both Log Analytics and Storage.
I go to the storage account, and I see zero log files.
I go to containers --> it's empty.
I go to tables --> it's empty.
I really need to see the logs. What do I do?
Here are some things you need to check.
1. There is some latency before the logs appear. Please wait a few minutes, then check whether the logs are stored in blob storage.
2. Please check your diagnostic settings and confirm that you have configured everything correctly, such as which log types and metrics you selected.
3. Make sure you check the logs in the blob storage account that you configured in your diagnostic settings.
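A quick way to confirm whether anything has landed in the storage destination is to list its containers. As a sketch (assuming the Azure.Storage.Blobs .NET SDK; the connection string is a placeholder, and the "insights-" container prefix used by diagnostic settings is an assumption worth verifying for your resources):

using System;
using Azure.Storage.Blobs;

class ListDiagnosticContainers
{
    static void Main()
    {
        // Placeholder; the storage account chosen as the diagnostics destination.
        string connectionString = "<storage-account-connection-string>";
        var service = new BlobServiceClient(connectionString);

        // Resource logs from diagnostic settings typically land in containers
        // named like "insights-logs-<category>".
        foreach (var container in service.GetBlobContainers())
        {
            if (container.Name.StartsWith("insights-", StringComparison.OrdinalIgnoreCase))
            {
                Console.WriteLine(container.Name);
            }
        }
    }
}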

Azure Cloud Service (Classic) - Any way to log Diagnostic.Trace logs to BLOB storage

I've been asked to change an old Azure Cloud Service worker's logging to the System.Diagnostics.Trace style of logging. I've done that, and now I'm about ready to deploy it to Azure.
The client requirement is that these logs should appear in blob storage, similar to how the more modern App Service logs can be configured to write their diagnostics to blob storage. There is an expectation that logs can be batched up and uploaded periodically (perhaps based on time or number of lines).
Is there a NuGet package, other library, or configuration I should enable to connect the application to blob storage? I've spent about 20 minutes searching here and online for a solution, but the information mainly seems to talk about writing logs to Table Storage.
Edit: More detail:
This is an existing app (C# .Net Framework 4.5) that used to use an external logging service.
I assumed (incorrectly, I think) that the logging to blob storage was something I could configure in the Azure Portal.
As things are right now, NO log file of any kind is generated, but when I run the code in Visual Studio, I can see some output from the logging statements.
I have updated the code to use a standard (custom) logging system that eventually boils down to statements like the one below:
Trace.TraceInformation($"DEBUG: {message}");
Here are some links I found with related information:
Streaming from command line
Trace listener question
Adding Trace to existing website
Performance Impact of Logging
Smarx Library
The logging is configured by the diagnostics.wadcfgx file, which you can see in your solution.
This holds all of the diagnostic information that you want to collect. It can be controlled via the "Properties" of the Web\Worker role (right-click -> Properties).
From there, there is also the option to specify the storage account.
This isn't always ideal if you are deploying to multiple environments, so you should also be able to alter the configuration from the Azure portal by downloading and uploading a new configuration, following these instructions.
Think of logging to blob storage as uploading existing files to blob storage. If your current app creates files, you should use the Put Blob or Append Block operations to add those files to blob storage, which means interacting with the storage SDK to make these transactions. You could also leverage Logic Apps, which use connectors to blob storage and can perform actions based on specific triggers (timestamps and other conditions).
If you would like to see diagnostic logs in Azure Storage, you'll have to enable Azure diagnostics, but those logs pertain to the storage account itself, not your app.
Since you mentioned that you can see the output, you have to capture that output as an object (for example a text file) and then upload it to the storage account. You can find SDK information for C# here. I hope this helps.
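As a rough sketch of that approach (assuming the Azure.Storage.Blobs .NET SDK on a framework version recent enough to support it; BlobTraceListener and the container/blob names are hypothetical, and a production version would buffer trace lines and append them in batches rather than per message):

using System;
using System.Diagnostics;
using System.IO;
using System.Text;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Specialized;

// Hypothetical listener that forwards System.Diagnostics.Trace output to an append blob.
class BlobTraceListener : TraceListener
{
    private readonly AppendBlobClient _blob;

    public BlobTraceListener(string connectionString, string containerName, string blobName)
    {
        var container = new BlobContainerClient(connectionString, containerName);
        container.CreateIfNotExists();
        _blob = container.GetAppendBlobClient(blobName);
        _blob.CreateIfNotExists();
    }

    public override void Write(string message) => Append(message);

    public override void WriteLine(string message) => Append(message + Environment.NewLine);

    private void Append(string message)
    {
        if (string.IsNullOrEmpty(message)) return;

        // Real code should batch writes and handle transient failures.
        using (var stream = new MemoryStream(Encoding.UTF8.GetBytes(message)))
        {
            _blob.AppendBlock(stream);
        }
    }
}

class Worker
{
    static void Main()
    {
        Trace.Listeners.Add(new BlobTraceListener(
            "<storage-account-connection-string>",      // placeholder
            "worker-logs",                               // hypothetical container name
            $"trace-{DateTime.UtcNow:yyyy-MM-dd}.log")); // hypothetical blob name

        Trace.TraceInformation("DEBUG: worker started");
        Trace.Flush();
    }
}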

Azure WebApp Not Sending Application Logs to Blob Storage

My team has multiple Azure WebApps (Windows) running Node.js applications. We are using the Winston library to log service activity (e.g., requests). We have configured our Diagnostic Logging in each to store logs in Blob storage resources.
Using the Microsoft Azure Storage Explorer, we can see that there are multiple containers within Blob storage. It seems to be collecting information by the hour, but only 'snapshot' entries as CSV files and .log files with virtually no information. The files are small, which shouldn't be the case because traffic is consistent and we are logging a fair amount.
Our logging works in the filesystem format, but it's clearly not working in blob storage. We cannot seem to find a reason why our logs are not getting stored in our storage accounts.
Is there additional configuration necessary?
Based on your description, I checked your issue and found that I could only get the logging via console.log and console.error from the Kudu path D:\home\LogFiles\Application\. Then I found a blog post about application logs for Node.js on Azure Web Apps, which says the following:
Setting application logs in the Azure portal
For Node.js websites, the way to write application logs is by writing to the console using console.log('message') and console.error('message'), which produce Information and Error level log entries respectively. Currently the only supported target for Node.js log files is the file system.
Other website types such as PHP and Python are not supported by the application logs feature.
Here is an Azure Blob storage adapter for the popular Node.js logger winston: winston-azure-blob-transport. You could leverage it as a workaround to collect the application logs from your Node.js website into Azure Blob storage.
