I store logs from an Azure Web App and Redis Cache in Storage Accounts, but I wonder: what is the best way to analyze them?
Redis seems to store its diagnostics information in WADMetrics* tables, while the web app writes .csv and .log files into storage, but I don't see any of those as an option under Log Analytics > Workspace data sources > Storage account logs.
Is there a standard (Azure) way to consume, analyze and (preferably) automatically act upon the content of those logs?
Answering my own question based on the investigation I've done so far; maybe it will help someone :)
Log Analytics doesn't digest the log data from web apps (I have no idea why, since they seem to be rather standard IIS logs)
The only reasonable way I found to consume and analyze the log data is with Power BI. You can easily set up the Storage account as a data source and then massage the data and get the reports you need.
So far I haven't come up with a way to generate alerts based on the content of the logs without using tools like Splunk or Sumo Logic.
I know this might be a duplicate question, but I am asking it again because I am not able to find a specific answer.
I am new to Azure Functions. I have written an Azure Function in Java. I have a requirement to save the logs into daily rolling log files (i.e. a new log file should be created each day with the name %fileName%_ddmmyy).
When I use context.getLogger(), I am able to see the logs under Application Insights and Azure Monitor, but I can't find any option to save them into a log file. If I use Log4j etc., I cannot see the logs under Application Insights and Azure Monitor.
I want to be able to see the logs under Application Insights and Azure Monitor as well as save them to a log file that rotates daily.
Is there any way I can achieve this? Any help would be appreciated.
PS: I need this in Java only. I am using Java 8.
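To make the requirement concrete, here is a minimal sketch of what I am after: each message is forwarded both to context.getLogger() (so it shows up in Application Insights / Azure Monitor) and to a Log4j2 logger whose RollingFile appender would be configured in log4j2.xml with a TimeBasedTriggeringPolicy and a date-based filePattern. The class and logger names are purely illustrative, and note that files written to the Function App's local file system may not persist, depending on the hosting plan.

```java
// Illustrative sketch: forward each message to both the Functions logger
// (Application Insights / Azure Monitor) and a Log4j2 logger backed by a
// daily-rolling file appender. Class and logger names are placeholders.
import com.microsoft.azure.functions.ExecutionContext;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

public final class DualLogger {
    // Log4j2 logger; its RollingFile appender is configured in log4j2.xml
    // with a date-based filePattern so a new file is created each day.
    private static final Logger FILE_LOGGER = LogManager.getLogger("daily-file");

    private DualLogger() { }

    public static void info(ExecutionContext context, String message) {
        context.getLogger().info(message);   // visible in Application Insights / Azure Monitor
        FILE_LOGGER.info(message);           // written to the daily rolling file
    }
}
```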
AFAIK, you can manually export the logs each day using the export option in the Logs section of the Function App or of Application Insights.
Alternatively, you can use the diagnostic settings of the Function App and send the logs to your Storage Account or to a Log Analytics workspace. This is an example workaround:
https://stackoverflow.com/a/73383532/17623802
Go to Function App Resource -> Find Diagnostic settings under Monitoring
Click on Add diagnostic setting
Give your diagnostic setting a name
You can choose to export all logs and metrics or you can select from specific categories.
and then select Archive to a storage account
Select the subscription and storage account
DONE.
For more info -> https://learn.microsoft.com/en-us/azure/azure-functions/functions-monitor-log-analytics?tabs=csharp
ALTERNATIVE
If the above doesn't work, you can also create a Stream Analytics job and dump the log data into a storage account.
https://learn.microsoft.com/en-us/azure/stream-analytics/app-insights-export-stream-analytics
Using Log Analytics, is it possible to search through data stored in a container inside an Azure storage account? We have an Azure Function that reaches out to an API in O365 for log data and then pushes that data into a storage account. We would like to be able to query this data.
You can push the content of your container to a Log Analytics workspace using the Log Analytics HTTP Data Collector API.
You need to build your own integration that sends the container content to Log Analytics by leveraging the HTTP Data Collector API.
You may refer to the approach described in this article:
https://learn.microsoft.com/en-us/azure/azure-monitor/platform/data-collector-api
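For illustration, here is a minimal sketch of such a Data Collector API call in Java (using java.net.http, so Java 11+). The workspace ID, shared key, log type and JSON payload are placeholders; the request shape and the HMAC-SHA256 signature follow the article linked above.

```java
// Hedged sketch of an HTTP Data Collector API call. Workspace ID, shared key,
// log type and payload are placeholders to replace with your own values.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
import java.time.ZoneOffset;
import java.time.ZonedDateTime;
import java.time.format.DateTimeFormatter;
import java.util.Base64;
import java.util.Locale;
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;

public class DataCollectorSample {
    public static void main(String[] args) throws Exception {
        String workspaceId = "<workspace-id>";      // Log Analytics workspace (customer) ID
        String sharedKey   = "<primary-key>";       // workspace primary key (Base64)
        String logType     = "ContainerLogs";       // shows up as ContainerLogs_CL in the workspace
        String jsonBody    = "[{\"FileName\":\"audit.json\",\"Message\":\"example record\"}]";

        String rfc1123Date = DateTimeFormatter
                .ofPattern("EEE, dd MMM yyyy HH:mm:ss 'GMT'", Locale.US)
                .format(ZonedDateTime.now(ZoneOffset.UTC));
        byte[] body = jsonBody.getBytes(StandardCharsets.UTF_8);

        // String-to-sign per the Data Collector API: verb, content length,
        // content type, the x-ms-date header and the fixed /api/logs resource.
        String stringToSign = "POST\n" + body.length + "\napplication/json\n"
                + "x-ms-date:" + rfc1123Date + "\n/api/logs";

        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(new SecretKeySpec(Base64.getDecoder().decode(sharedKey), "HmacSHA256"));
        String signature = Base64.getEncoder()
                .encodeToString(mac.doFinal(stringToSign.getBytes(StandardCharsets.UTF_8)));

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://" + workspaceId
                        + ".ods.opinsights.azure.com/api/logs?api-version=2016-04-01"))
                .header("Content-Type", "application/json")
                .header("Log-Type", logType)
                .header("x-ms-date", rfc1123Date)
                .header("Authorization", "SharedKey " + workspaceId + ":" + signature)
                .POST(HttpRequest.BodyPublishers.ofByteArray(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println("Data Collector API returned HTTP " + response.statusCode());
    }
}
```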
Additional information:
- Azure Functions
- Azure Automation
- Logic App
With any of these, you set up a schedule that runs at a certain interval. Each time it runs, you execute a query against Log Analytics to get the data, and transfer the query results to Azure Storage, for example as a blob. You might have to do some transformation on the data depending on your scenario. The most important thing to make sure of is that you do not miss data or upload the same data twice to storage; the Log Analytics query language allows you to specify a time frame for the results, which helps here. I hope this helps you.
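As a rough sketch of that pattern (assuming the azure-monitor-query and azure-storage-blob client libraries; the workspace ID, query, container name and connection string are placeholders, and the scheduling itself is left to the Function, runbook or Logic App):

```java
// Rough sketch only: query Log Analytics for a fixed time window and write the
// result to a blob. All identifiers below are placeholders.
import com.azure.identity.DefaultAzureCredentialBuilder;
import com.azure.monitor.query.LogsQueryClient;
import com.azure.monitor.query.LogsQueryClientBuilder;
import com.azure.monitor.query.models.LogsQueryResult;
import com.azure.monitor.query.models.LogsTableCell;
import com.azure.monitor.query.models.QueryTimeInterval;
import com.azure.storage.blob.BlobClient;
import com.azure.storage.blob.BlobClientBuilder;

import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import java.time.Duration;
import java.time.Instant;
import java.util.stream.Collectors;

public class LogExportJob {
    public static void main(String[] args) {
        LogsQueryClient logsClient = new LogsQueryClientBuilder()
                .credential(new DefaultAzureCredentialBuilder().build())
                .buildClient();

        // The time interval is the key to not missing or duplicating data:
        // query exactly the window since the previous scheduled run.
        LogsQueryResult result = logsClient.queryWorkspace(
                "<workspace-id>",
                "AzureDiagnostics | project TimeGenerated, Resource, Category", // placeholder query
                new QueryTimeInterval(Duration.ofHours(1)));

        String csv = result.getTable().getRows().stream()
                .map(row -> row.getRow().stream()
                        .map(LogsTableCell::getValueAsString)
                        .collect(Collectors.joining(",")))
                .collect(Collectors.joining("\n"));

        // One blob per run, named by timestamp, so a rerun doesn't overwrite earlier exports.
        BlobClient blob = new BlobClientBuilder()
                .connectionString("<storage-connection-string>")
                .containerName("log-exports")
                .blobName("export-" + Instant.now().toEpochMilli() + ".csv")
                .buildClient();
        byte[] bytes = csv.getBytes(StandardCharsets.UTF_8);
        blob.upload(new ByteArrayInputStream(bytes), bytes.length);
    }
}
```

Writing one blob per run is one simple way to avoid clobbering earlier exports; avoiding gaps or duplicates across runs still depends on keeping the query window aligned with the schedule.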
Kindly let us know if the above helps or if you need further assistance on this issue.
I am attempting to use Azure Log Analytics to query IIS logs that sit in four separate storage accounts (based on region) and are generated by our Web Apps. Is this possible? I'm only seeing Azure activity logs in my queries. I'm very new to Log Analytics, so any help would be greatly appreciated.
There is no direct way to do this. You can take a look at this issue, which mentions:
Log Analytics is only supported at the IaaS (VM) level, not at PaaS (App Service) level.
If you want to do that, you can manually set up a pipeline that sends the logs from blob storage to Log Analytics by following this tutorial (it's a little involved).
You can also upvote this feedback item.
Is there a way to download the audit log at the subscription level, as well as diagnostics of virtual machines, virtual networks, storage accounts, etc.?
Edit: for more context, I'm thinking of a PowerShell script that will be run by Splunk. The script will download the audit log and diagnostics, then save them to a directory that Splunk will monitor.
There is no direct way to download the diagnostics data in Azure. By default no data is stored; if you want to keep it, you'll have to start storing the data in Azure Storage, from where you can visualize it in a number of ways:
Use Server Explorer in Visual Studio to view your storage resources.
Use Azure Storage Explorer by Neudesic.
Use Azure Diagnostics Manager by Cerebrata; this tool lets you download the logs if you want, as well as visualize them.
Latest: you can use Power BI to visualize your audit logs as well; I think this is the coolest of them all.
https://azure.microsoft.com/en-us/documentation/articles/cloud-services-dotnet-diagnostics-storage/
http://blogs.msdn.com/b/powerbi/archive/2015/09/30/monitor-azure-audit-logs-with-power-bi.aspx
I found this, which is exactly what I'm looking for.
So now I just have to write the script and then have Splunk run it on a schedule.
Thanks guys!
I have Azure Diagnostics set up and logging to all the WAD tables.
How am I supposed to read all those logs? I have Azure Storage Explorer, but I don't see how it can be useful, and the logs are also loaded with a considerable amount of garbage. Is there any way to view the diagnostics data in a more sensible way?
You will either have to write a parsing tool yourself to read all the data, or purchase something like Cerebrata's diagnostics tool to interpret it. Unfortunately, the data in storage is just raw data, and there is no built-in interpretation.
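If you go the roll-your-own route, a minimal sketch along these lines can dump recent WADLogsTable entries (assuming the azure-data-tables client library; the tick-based partition-key filter and the Message/Level column names follow the conventional WAD schema, so verify them against your own tables):

```java
// A minimal sketch of the "write your own parsing tool" route, assuming the
// azure-data-tables library and the conventional WADLogsTable layout
// (partition keys are "0" + .NET tick count; Message/Level are WAD columns).
import com.azure.data.tables.TableClient;
import com.azure.data.tables.TableClientBuilder;
import com.azure.data.tables.models.ListEntitiesOptions;
import com.azure.data.tables.models.TableEntity;

import java.time.Duration;
import java.time.Instant;

public class WadLogsReader {
    // Offset between the Unix epoch and .NET ticks (100 ns units since 0001-01-01).
    private static final long EPOCH_TICKS = 621_355_968_000_000_000L;

    public static void main(String[] args) {
        String connectionString = "<storage-account-connection-string>"; // placeholder

        TableClient table = new TableClientBuilder()
                .connectionString(connectionString)
                .tableName("WADLogsTable")
                .buildClient();

        // Only fetch entries from the last hour by filtering on the tick-based partition key.
        long fromTicks = Instant.now().minus(Duration.ofHours(1)).toEpochMilli() * 10_000L + EPOCH_TICKS;
        ListEntitiesOptions options = new ListEntitiesOptions()
                .setFilter("PartitionKey ge '0" + fromTicks + "'");

        for (TableEntity entity : table.listEntities(options, null, null)) {
            // Message and Level are standard WADLogsTable columns; adjust for other WAD tables.
            System.out.printf("%s [%s] %s%n",
                    entity.getTimestamp(),
                    entity.getProperty("Level"),
                    entity.getProperty("Message"));
        }
    }
}
```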