Diagnostic Logs (Storage) not showing up in Azure Storage Account

I'm using services such as the on-premises data gateway, Logic Apps, etc., and I've configured diagnostics on these instances to send logs to both Log Analytics and Storage.
I go to the storage account and I see zero log files.
I go to Containers --> it's empty.
I go to Tables --> it's empty.
I really need to see the logs. What do I do?

Here are some things to check.
1. There is some latency before the logs take effect. Wait a few minutes, then check whether the logs have been stored in blob storage.
2. Check your diagnostic settings and confirm everything is configured correctly, for example that you selected the log categories and metrics you want.
3. Make sure you are looking in the blob storage account that you configured in your diagnostic settings (a quick way to verify this from code is sketched below).
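If you want to double-check point 3 from code rather than the portal, a minimal sketch using the Azure.Storage.Blobs SDK is below; the connection string variable is a placeholder, and the "insights-logs-" prefix assumes the default container naming Azure Monitor uses for resource logs.
using System;
using Azure.Storage.Blobs;

class VerifyDiagnosticLogs
{
    static void Main()
    {
        // Assumption: connection string supplied via environment variable.
        string connectionString = Environment.GetEnvironmentVariable("STORAGE_CONNECTION_STRING");
        var serviceClient = new BlobServiceClient(connectionString);

        // Azure Monitor typically writes resource logs to containers named "insights-logs-<category>".
        foreach (var container in serviceClient.GetBlobContainers(prefix: "insights-logs-"))
        {
            Console.WriteLine($"Container: {container.Name}");
            var containerClient = serviceClient.GetBlobContainerClient(container.Name);
            foreach (var blob in containerClient.GetBlobs())
            {
                Console.WriteLine($"  {blob.Name} ({blob.Properties.ContentLength} bytes)");
            }
        }
    }
}
If this lists no "insights-logs-" containers at all, the diagnostic setting is most likely pointing at a different storage account or the logs have not arrived yet.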

Related

Azure SQL storing database logs

For storing Azure SQL database logs, is it necessary to explicitly create a blob container for log storage, or is it created implicitly? I read this and this post but I'm still not sure about it.
Also, for a Java/Spring application deployed in App Services, when using App Insights, Log Analytics and Azure Monitor for application logs, HTTP and access logs, and DB logs, do I need to explicitly set up a blob for storing logs?
No, you do not need to create blob storage for Azure SQL database logs, as they are stored in the Azure SQL database transaction log and can be viewed using Azure Monitor or audited using Azure SQL Auditing.
Steps to check the logs for a SQL database under the Monitor section:
After creating the Azure SQL database and server, go to the Monitoring tab, where the logs can be viewed.
Approach 2: Using Log Analytics
Create a Log Analytics workspace in Azure.
Go to the SQL database and choose Diagnostic settings from the Monitoring section of the left pane.
Add a diagnostic setting, choose the Log Analytics workspace you created, and select the log categories you want.
The logs then show up in the workspace and can be queried there.
To store the Azure SQL logs explicitly:
Create a storage account for storing the logs.
Then enable Azure Monitor logs from your SQL server: select 'Diagnostic logs' from the Azure Monitor menu, turn on the logs, and select the storage account you created.
Configure log retention by selecting the Logs tab in the Azure Monitor menu and choosing 'Retention policy' to set how long logs are retained.
To verify the logs in the storage account, go to the storage account and select 'Containers.' You should see a container named 'insights-logs-sqlserverauditlog', and you can browse the logs stored in it.
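If you prefer to verify this from code instead of the portal, here is a minimal C# sketch (Azure.Storage.Blobs package) that lists and downloads blobs from that container; the connection string variable and the hourly PT1H.json path layout are assumptions based on how Azure Monitor typically writes resource logs.
using System;
using System.Linq;
using Azure.Storage.Blobs;

class SqlAuditLogReader
{
    static void Main()
    {
        // Assumptions: connection string in an environment variable, and audit logs
        // routed to this storage account via the database's diagnostic settings.
        string connectionString = Environment.GetEnvironmentVariable("STORAGE_CONNECTION_STRING");
        var container = new BlobServiceClient(connectionString)
            .GetBlobContainerClient("insights-logs-sqlserverauditlog");

        // Resource logs are usually written as hourly JSON blobs under a path that
        // encodes the resource ID and date, e.g. .../y=2024/m=05/d=01/h=13/m=00/PT1H.json.
        var blobs = container.GetBlobs().ToList();
        foreach (var blob in blobs)
            Console.WriteLine(blob.Name);

        // Download the last blob in the listing (paths sort by date for a single resource)
        // and print its raw JSON lines.
        if (blobs.Count > 0)
        {
            var content = container.GetBlobClient(blobs.Last().Name).DownloadContent();
            Console.WriteLine(content.Value.Content.ToString());
        }
    }
}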

Azure Blob Storage logging not working at all

I was trying to set up logging for a container and followed the documentation by enabling Diagnostic settings (classic). However, while the $logs folder is created, it simply stays empty (for hours) while I upload and download files.
Am I missing something obvious?
Sometimes the logs may not appear in the portal. Try checking $logs from Azure Storage Explorer; the logs can also be viewed through PowerShell or programmatically.
Ensure that the 'Delete data' check box is selected, and then set the number of days that you would like log data to be retained.
Also check whether the retention policy is unset or set to 0, which may cause logs not to be retained or shown.
Reference: Enable and manage Azure Storage Analytics logs (classic) | Microsoft Docs
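You can also inspect and adjust the classic Storage Analytics logging configuration programmatically. A minimal sketch with the Azure.Storage.Blobs SDK, assuming the connection string comes from an environment variable and that a seven-day retention window is acceptable:
using System;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;

class EnableClassicBlobLogging
{
    static void Main()
    {
        // Assumption: connection string supplied via environment variable.
        string connectionString = Environment.GetEnvironmentVariable("STORAGE_CONNECTION_STRING");
        var serviceClient = new BlobServiceClient(connectionString);

        // Inspect the current (classic) Storage Analytics logging configuration.
        BlobServiceProperties properties = serviceClient.GetProperties().Value;
        Console.WriteLine($"Read: {properties.Logging.Read}, Write: {properties.Logging.Write}, " +
                          $"Delete: {properties.Logging.Delete}, " +
                          $"Retention enabled: {properties.Logging.RetentionPolicy.Enabled}");

        // Turn logging on for all operations and enable a 7-day retention policy,
        // so there is an explicit retention setting as suggested above.
        properties.Logging.Read = true;
        properties.Logging.Write = true;
        properties.Logging.Delete = true;
        properties.Logging.Version = "1.0";
        properties.Logging.RetentionPolicy = new BlobRetentionPolicy { Enabled = true, Days = 7 };
        serviceClient.SetProperties(properties);
    }
}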
It's also worth noting that for some apps, application logs cannot be sent to blob storage and are only available on the file system, which is why they use console.log('message') and console.error('message').
If you want to write application logs to Azure blob storage, you first need to enable application logging and configure blob storage for it in the Azure portal, and in some cases set the level to Verbose.
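For an ASP.NET Core app specifically, a minimal sketch of wiring application logs to App Service blob storage with the Microsoft.Extensions.Logging.AzureAppServices package might look like this; the blob container itself is still chosen when you enable 'Application Logging (Blob)' in the portal, and the blob name shown is just an example.
using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Logging.AzureAppServices;

var builder = WebApplication.CreateBuilder(args);

// Only active when the app runs inside Azure App Service and
// "Application Logging (Blob)" is enabled in the portal.
builder.Logging.AddAzureWebAppDiagnostics();
builder.Services.Configure<AzureBlobLoggerOptions>(options =>
{
    options.BlobName = "applog.txt";   // file name used inside the configured container
});

var app = builder.Build();
app.MapGet("/", (ILogger<Program> logger) =>
{
    logger.LogInformation("Hello from blob-backed application logging");
    return "ok";
});
app.Run();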
Other references:
c# - ASP.NET Core Application Logs Not Written To Blob in Azure App Service - Stack Overflow
logging - Azure WebApp Not Sending Application Logs to Blob Storage - Stack Overflow
.net core - Azure WebJob not logging to blob storage - Stack Overflow
In case someone else stumbles across this: after some back-and-forth with technical support, the problem turned out to be related to storage account settings, in particular storage-account-wide immutability settings.
What solved the problem for us was to disable immutability at the storage account level and instead set it at the container level.

Querying IIS logs from blob using Azure Log Analytics (formerly OMS)

I am attempting to use Azure Log Analytics to query IIS logs that sit in four separate storage accounts (based on region) and are generated by our Web Apps. Is this possible? I'm only seeing Azure activity logs in my queries. I'm very new to Log Analytics, so any help would be greatly appreciated.
There is no direct way to do this; you can take a look at this issue, which mentions:
Log Analytics is only supported at the IaaS (VM) level, not at the PaaS (App Service) level.
If you want to do that, you can manually set up a pipeline that sends the logs from blob storage to Log Analytics, following this tutorial (it is a little involved).
You can also upvote this feedback item.

Azure Cloud Service (Classic) - Any way to log Diagnostic.Trace logs to BLOB storage

I've been asked to change an old Azure Cloud Service worker's logging to the System.Diagnostics.Trace style of logging. I've done that, and now I'm about ready to deploy it to Azure.
The client requirement is that these logs should appear in blob storage, similar to how the more modern app service logs can be configured to write their diagnostics to blob storage. There is an expectation that logs can be batched up and uploaded periodically (perhaps time or number of lines based).
Is there a NuGet package, library, or configuration I should enable to connect the application to blob storage? I've spent about 20 minutes searching here and online for a solution, but the information seems to mainly talk about writing logs to Table Storage.
Edit: More detail:
This is an existing app (C# .NET Framework 4.5) that used to use an external logging service.
I assumed (incorrectly, I think) that logging to blob storage was something I could configure in the Azure portal.
As things are right now, NO log file of any kind is generated, but when I run the code in Visual Studio, I can see some output from the logging statements.
I have updated the code to use a standard (custom) logging system that eventually boils down to using statements like the one below:
Trace.TraceInformation($"DEBUG: {message}");
Here are some links I found with related information:
Streaming from command line
Trace listener question
Adding Trace to existing website
Performance Impact of Logging
Smarx Library
The logging is configured by the diagnostics.wadcfgx file which you can see in your solution.
This holds all of the diagnostic information that you want to collect. This can be controlled via the "Properties" of the Web\Worker role (right-click -> Properties).
From there, there is also an option to specify the storage account.
This isn't always ideal if you are deploying to multiple environments, so you can also alter the configuration from the Azure portal by downloading and uploading a new configuration, following these instructions.
As for logging to blob storage, think of it as uploading existing files to blob storage. If your current app creates log files, you can use the Put Blob or Append Blob operations to add those files to blob storage, which means interacting with the Storage SDK. You could also leverage Logic Apps, which has connectors for blob storage and can perform actions based on specific triggers (a schedule or other conditions).
If you would like to see diagnostics for Azure Storage itself, you can enable Azure Storage diagnostics, but those logs pertain to the storage account, not your app.
Since you mentioned that you can see the output, you need to capture that output (for example into a text file) and then upload it to the storage account. You can find SDK information for C# here. I hope this helps.
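As one possible way to batch Trace output into blob storage, here is a sketch of a custom TraceListener built on the Azure.Storage.Blobs append-blob client. This is not an official pattern: the container and blob names are placeholders, it flushes every N lines rather than on a timer, and it assumes you can target a framework version the modern SDK supports (otherwise the older WindowsAzure.Storage client offers an equivalent append-blob API).
using System;
using System.Diagnostics;
using System.IO;
using System.Text;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Specialized;

// Hypothetical listener: buffers Trace output and appends it to an append blob in batches.
public class AppendBlobTraceListener : TraceListener
{
    private readonly AppendBlobClient _blob;
    private readonly StringBuilder _buffer = new StringBuilder();
    private readonly int _linesPerFlush;
    private int _pendingLines;

    public AppendBlobTraceListener(string connectionString, string containerName,
                                   string blobName, int linesPerFlush = 50)
    {
        var container = new BlobContainerClient(connectionString, containerName);
        container.CreateIfNotExists();
        _blob = container.GetAppendBlobClient(blobName);
        _blob.CreateIfNotExists();
        _linesPerFlush = linesPerFlush;
    }

    public override void Write(string message) => _buffer.Append(message);

    public override void WriteLine(string message)
    {
        _buffer.AppendLine($"{DateTime.UtcNow:o} {message}");
        if (++_pendingLines >= _linesPerFlush) Flush();
    }

    public override void Flush()
    {
        if (_pendingLines == 0) return;
        using (var stream = new MemoryStream(Encoding.UTF8.GetBytes(_buffer.ToString())))
        {
            _blob.AppendBlock(stream);   // one service call per batch of buffered lines
        }
        _buffer.Clear();
        _pendingLines = 0;
    }
}

// Registration, e.g. in the worker role's OnStart:
//   Trace.Listeners.Add(new AppendBlobTraceListener(connectionString, "worker-logs", "worker.log"));
//   Trace.TraceInformation("worker started");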

Can we fetch custom logs in Azure OMS

Our project is a Java Spring Boot application. We have a logging system using log4j, and we push the logs into Azure Storage accounts.
Question:
I want to query these custom logs in OMS. Is that possible?
If yes, how?
What I have tried so far:
1. Pushed the logs to blob storage using Logback.
2. Pushed the logs to table storage.
3. Configured the storage accounts under Log Analytics in the Azure workspace.
But I am unable to see any analytics data to query in OMS.
Please help.
If you can't use Application Insights, you can read the log files from Storage and use the HTTP Data Collector API to push the logs into a Log Analytics workspace. Samples and reference: https://learn.microsoft.com/en-us/azure/log-analytics/log-analytics-data-collector-api
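For reference, a minimal C# sketch of the Data Collector API call described in that link is below; the environment variable names and the 'MyCustomLog' type are placeholders, and the string-to-sign follows the format in the linked documentation. Records sent this way land in a custom table named after the Log-Type header with a _CL suffix.
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Security.Cryptography;
using System.Text;
using System.Threading.Tasks;

class DataCollectorSample
{
    // Assumptions: workspace ID and primary key are supplied via environment variables.
    static readonly string WorkspaceId = Environment.GetEnvironmentVariable("LOG_ANALYTICS_WORKSPACE_ID");
    static readonly string SharedKey = Environment.GetEnvironmentVariable("LOG_ANALYTICS_SHARED_KEY");

    static async Task Main()
    {
        // One or more JSON records read from your blobs; "MyCustomLog" becomes the MyCustomLog_CL table.
        string json = "[{\"Level\":\"INFO\",\"Message\":\"custom log line read from blob storage\"}]";
        await PostAsync(json, "MyCustomLog");
    }

    static async Task PostAsync(string json, string logType)
    {
        byte[] body = Encoding.UTF8.GetBytes(json);
        string rfcDate = DateTime.UtcNow.ToString("r");

        // String-to-sign defined by the Data Collector API:
        // POST \n content-length \n application/json \n x-ms-date:<date> \n /api/logs
        string stringToSign = $"POST\n{body.Length}\napplication/json\nx-ms-date:{rfcDate}\n/api/logs";
        string signature;
        using (var hmac = new HMACSHA256(Convert.FromBase64String(SharedKey)))
        {
            signature = Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(stringToSign)));
        }

        using (var client = new HttpClient())
        using (var content = new ByteArrayContent(body))
        {
            content.Headers.ContentType = new MediaTypeHeaderValue("application/json");
            client.DefaultRequestHeaders.Add("Authorization", $"SharedKey {WorkspaceId}:{signature}");
            client.DefaultRequestHeaders.Add("Log-Type", logType);
            client.DefaultRequestHeaders.Add("x-ms-date", rfcDate);

            string url = $"https://{WorkspaceId}.ods.opinsights.azure.com/api/logs?api-version=2016-04-01";
            HttpResponseMessage response = await client.PostAsync(url, content);
            Console.WriteLine($"{(int)response.StatusCode} {response.ReasonPhrase}");
        }
    }
}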
