Azure Cloud Service (Classic) - Any way to log Diagnostic.Trace logs to BLOB storage - azure

I've been asked to change an old Azure Cloud Service worker's logging to the System.Diagnostics.Trace style of logging. I've done that, and now I'm about ready to deploy it to Azure.
The client requirement is that these logs should appear in blob storage, similar to how the more modern App Service logs can be configured to write their diagnostics to blob storage. There is an expectation that logs can be batched up and uploaded periodically (perhaps based on time or number of lines).
Is there a NuGet package, other library, or configuration I should enable to connect the application to blob storage? I've spent about 20 minutes searching here and online for a solution, but the information I've found mainly talks about writing logs to Table Storage.
Edit: More detail:
This is an existing app (C# .Net Framework 4.5) that used to use an external logging service.
I assumed (incorrectly, I think) that the logging to blob storage was something I could configure in the Azure Portal.
As things stand right now, no log file of any kind is generated, but when I run the code in Visual Studio I can see some output from the logging statements.
I have updated the code to use a standard (custom) logging system that eventually boils down to statements like the below:
Trace.TraceInformation($"DEBUG: {message}");
Here are some links I found with related information:
Streaming from command line
Trace listener question
Adding Trace to existing website
Performance Impact of Logging
Smarx Library
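For what it's worth, one way to get Trace output into blob storage without Azure Diagnostics is a custom TraceListener that batches lines and periodically appends them to an append blob. This is only a sketch, assuming the WindowsAzure.Storage NuGet package; BlobTraceListener and the 30-second flush interval are hypothetical choices, not an official component:

```csharp
using System;
using System.Collections.Concurrent;
using System.Diagnostics;
using System.Text;
using System.Threading;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

// Hypothetical listener: queues Trace output and flushes it to an
// append blob every 30 seconds.
public class BlobTraceListener : TraceListener
{
    private readonly ConcurrentQueue<string> _lines = new ConcurrentQueue<string>();
    private readonly CloudAppendBlob _blob;
    private readonly Timer _flushTimer;

    public BlobTraceListener(string connectionString, string containerName, string blobName)
    {
        CloudBlobContainer container = CloudStorageAccount.Parse(connectionString)
            .CreateCloudBlobClient()
            .GetContainerReference(containerName);
        container.CreateIfNotExists();

        _blob = container.GetAppendBlobReference(blobName);
        if (!_blob.Exists())
        {
            _blob.CreateOrReplace();
        }

        _flushTimer = new Timer(state => Flush(), null,
            TimeSpan.FromSeconds(30), TimeSpan.FromSeconds(30));
    }

    public override void Write(string message)
    {
        _lines.Enqueue(message);
    }

    public override void WriteLine(string message)
    {
        _lines.Enqueue(message + Environment.NewLine);
    }

    public override void Flush()
    {
        // Drain the queue and append the whole batch in one call.
        var batch = new StringBuilder();
        string line;
        while (_lines.TryDequeue(out line))
        {
            batch.Append(line);
        }
        if (batch.Length > 0)
        {
            _blob.AppendText(batch.ToString());
        }
    }
}
```

Registered via Trace.Listeners.Add(...) or the system.diagnostics section of app.config, something like this would batch Trace.TraceInformation output the way the question describes. (Append blobs assume a single writer; multiple role instances would each need their own blob name.)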

The logging is configured by the diagnostics.wadcfgx file which you can see in your solution.
This holds all of the diagnostic information that you want to collect. This can be controlled via the "Properties" of the Web\Worker role (right-click -> Properties).
From there, there is also the option to specify the storage account.
This isn't always ideal if you are deploying to multiple environments, so you can also alter the configuration from the Azure Portal by downloading and uploading a new configuration, following these instructions.
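For illustration, the relevant part of a diagnostics.wadcfgx might look like the fragment below (schema details vary by SDK version; the quota and transfer period shown are just example values). The Logs element controls how often buffered Trace output is transferred; note that Azure Diagnostics ships these basic logs to the WADLogsTable table rather than to blobs, which is why most guides talk about Table Storage:

```xml
<DiagnosticsConfiguration xmlns="http://schemas.microsoft.com/ServiceHosting/2010/10/DiagnosticsConfiguration">
  <PublicConfig>
    <WadCfg>
      <DiagnosticMonitorConfiguration overallQuotaInMB="4096">
        <!-- Transfer buffered Trace output every minute, Information level and up -->
        <Logs scheduledTransferPeriod="PT1M"
              scheduledTransferLogLevelFilter="Information" />
      </DiagnosticMonitorConfiguration>
    </WadCfg>
  </PublicConfig>
</DiagnosticsConfiguration>
```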

For logging to blob storage, think of it as uploading existing files to blob storage. If your current app creates log files, you can use the Put Blob or Append Block operations to add those files to blob storage, which means interacting with the storage SDK to make these transactions. You could also leverage Logic Apps, which has connectors to blob storage and can perform actions based on specific triggers (timestamps and other conditions).
If you would like to see generated logs in Azure Storage itself, you'll have to enable Azure Storage diagnostics, but those logs pertain to the storage account, not your app.
Since you mentioned that you can see the output, you have to persist that output to a file (e.g., a text file) and then upload it to the storage account. You can find SDK information for C# here. I hope this helps.
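A minimal sketch of that last step — uploading a generated log file to a container — assuming the WindowsAzure.Storage NuGet package (the container and file names are illustrative):

```csharp
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

// Upload an existing log file as a block blob.
CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
CloudBlobContainer container = account.CreateCloudBlobClient()
    .GetContainerReference("logs");
container.CreateIfNotExists();

CloudBlockBlob blob = container.GetBlockBlobReference("worker.log");
blob.UploadFromFile(@"C:\logs\worker.log");
```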

Related

Azure Blob Storage logging not working at all

I was trying to set up logging for a container and followed the documentation by enabling Diagnostic settings (classic). However, while the $logs container is created, it simply stayed empty (for hours), even while uploading and downloading files.
Am I missing something obvious?
Sometimes the logs may not appear in the portal. Try checking the $logs container from Azure Storage Explorer; the logs can also be viewed through PowerShell or programmatically.
When setting up retention, ensure that the "Delete data" check box is selected, then set the number of days that you would like log data to be retained.
Also check whether the retention policy is unset or set to 0, which may cause logs not to be retained or shown.
Reference: Enable and manage Azure Storage Analytics logs (classic) | Microsoft Docs
It's also worth noting that for some app types, application logs cannot be written to blob storage and are only available on the file system, which is why Node.js apps log via console.log('message') and console.error('message').
If you want to write application logs to Azure blob storage, you first need to enable application logging and configure blob storage for it in the Azure portal, and in some cases keep its level at Verbose.
Other references:
c# - ASP.NET Core Application Logs Not Written To Blob in Azure App Service - Stack Overflow
logging - Azure WebApp Not Sending Application Logs to Blob Storage - Stack Overflow
.net core - Azure WebJob not logging to blob storage - Stack Overflow
In case someone else stumbles across this: after some back-and-forth with technical support, the problem turned out to be related to certain storage account settings, in particular storage-account-wide immutability settings.
What solved the problem for us was to disable immutability on the storage account level and to instead set it on the container level.

Azure App Service - cannot store logs in azure storage

I would like to configure my diagnostic logs to be redirected to Blob or Table Storage. However the only option I see is Filesystem:
My goal is to collect these logs in Splunk.
Currently only .NET application logs can be written to the blob storage. Java, PHP, Node.js, Python application logs can only be stored on the file system (without code modifications to write logs to external storage). I'd recommend checking the documentation here
According to this, redirecting to blob storage is only supported for Windows App Service plans. Yours appears to be Linux.

Logging stdout/err from a nodejs azure web-app: Blobs still not supported?

Node.js apps on Azure can log stdout and stderr to the file system (D:\home\LogFiles\Application).
The file system logs are automatically disabled after ~12 hours.
Logging to blob storage would be a good alternative.
According to the docs, it is still not supported for nodejs apps
Just wanted to know if that still holds true. I tried to turn on blob logging in the portal: app => monitoring => diagnostics logs.
Azure correctly creates folders (named after my app) in my blob container, but they contain only nearly empty CSV files.
Thanks a lot
Yes, the doc is right: logging to blob storage for Node.js is still not supported as of now.
As a workaround, you can take a look at winston-azure-blob-transport.
Hope it helps.

Azure WebApp Not Sending Application Logs to Blob Storage

My team has multiple Azure WebApps (Windows) running Node.js applications. We are using the Winston library to log service activity (e.g., requests). We have configured our Diagnostic Logging in each to store logs in Blob storage resources.
Using the Microsoft Azure Storage Explorer, we can see that there are multiple containers within Blob storage. It seems to be collecting information by the hour, but only 'snapshot' entries as CSV files and .log files with virtually no information. The files are small, which shouldn't be the case because traffic is consistent and we are logging a fair amount.
Our logging works in the filesystem format, but it's clearly not working in blob storage. We cannot seem to find a reason why our logs are not getting stored in our storage accounts.
Is there additional configuration necessary?
Based on your description, I checked this issue and found that I could only get the logging written via console.log and console.error from the Kudu path D:\home\LogFiles\Application\. I then found a blog post about application logs for Node.js on Azure Web Apps, which says the following:
Setting application logs in the Azure portal
For node.js websites the way to write application logs is by writing to the console using console.log('message') and console.error('message') which goes to Information/Error level log entries. Currently the only supported target for the log files for node.js is the file system.
Other web site types like PHP and Python are not supported by the application logs feature.
Here is an Azure blob storage adapter for the popular Node.js logger winston: winston-azure-blob-transport. You could leverage it as a workaround to collect the application logs from your Node.js website into Azure blob storage.
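As a rough sketch of that workaround (the option names follow the winston-azure-blob-transport README of the time and may differ in newer versions, so treat the shape as illustrative):

```javascript
var winston = require("winston");
require("winston-azure-blob-transport");

// Illustrative wiring: send winston output to a blob in the given container.
var logger = new (winston.Logger)({
  transports: [
    new (winston.transports.AzureBlob)({
      account: {
        name: "<storage account name>",
        key: "<storage account key>"
      },
      containerName: "applogs",
      blobName: "service.log",
      level: "info"
    })
  ]
});

logger.info("request handled");
```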

Azure storage metrics data

I am trying to implement Azure storage metrics code in my role, but I'm wondering whether there is an easy way to get Azure storage metrics data about my file usage. My code is stable and I do not want to change it again.
Actually, if you already have a Windows Azure role running, you don't need to make any changes to your code and you can still get Windows Azure Blob storage metrics data.
I wrote a blog post about this a while back: Collecting Windows Azure Storage REST API level metrics data without a single line of programming, just by using tools.
Please try the above and see if it works for you.
Storage analytics is disabled by default, so no operations against your storage until now have been logged for analysis.
You may choose to enable analytics at any time, for both logging (detailed access information for every single object) and metrics (hourly rollups). Further, you may choose which specific storage service to track (blobs, tables, queues) and which operations to track (read, write, delete). Once analytics are enabled, you may access the resulting analytics data from any app (as long as you have the storage account name + key).
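Enabling this programmatically from .NET might look like the following sketch (assuming the WindowsAzure.Storage NuGet package; the 7-day retention is just an example value):

```csharp
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;
using Microsoft.WindowsAzure.Storage.Shared.Protocol;

// Turn on Storage Analytics logging and hourly metrics for the Blob service.
CloudBlobClient client = CloudStorageAccount.Parse(connectionString)
    .CreateCloudBlobClient();
ServiceProperties props = client.GetServiceProperties();

props.Logging.LoggingOperations = LoggingOperations.All; // read, write, delete
props.Logging.RetentionDays = 7;
props.HourMetrics.MetricsLevel = MetricsLevel.ServiceAndApi;
props.HourMetrics.RetentionDays = 7;

client.SetServiceProperties(props);
```

The same ServiceProperties call pattern applies to the table and queue clients if you want analytics on those services too.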
Persistent Systems just published a blog post on enabling storage analytics for Java apps. The same principles may be applied to a .NET app (and the SDKs are very similar).
Additionally, Full Scale 180 published a sample app encapsulating storage analytics (based on the REST API, as it was written before SDK v1.6 came out).