How to download the Azure audit log

Is there a way to download the audit log at the subscription level, as well as the diagnostics of virtual machines, virtual networks, storage accounts, etc.?
Edit: for more context -- I'm thinking of a PowerShell script that Splunk will run on a schedule. The script will download the audit log and diagnostics, then save them to a directory that Splunk monitors.
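A rough sketch of the kind of script I have in mind, assuming the Az PowerShell module and an already-authenticated session (Connect-AzAccount); the output folder is a placeholder:

# Pull the last 24 hours of subscription-level activity (audit) log entries
# and write them as a JSON file into a folder that Splunk monitors.
# Assumes the Az module; Get-AzLog reads the subscription activity log.
$outputDir = 'C:\SplunkWatch'    # placeholder: the directory Splunk watches
$stamp = (Get-Date).ToString('yyyyMMddHHmm')

Get-AzLog -StartTime (Get-Date).AddDays(-1) |
    ConvertTo-Json -Depth 10 |
    Set-Content -Path (Join-Path $outputDir ("auditlog_" + $stamp + ".json"))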

There is no direct way to download the diagnostics data in Azure. By default, no data is stored; if you want to keep it, you'll have to start persisting the data to an Azure storage account, from where you can visualize it in a number of ways:
Use Server Explorer in Visual Studio to view your storage resources.
Use Azure Storage Explorer by Neudesic.
Use Azure Diagnostics Manager by Cerebrata; this tool lets you download the logs as well as visualize them.
Latest: you can use Power BI to visualize your audit logs as well. I think this is the coolest of them all.
https://azure.microsoft.com/en-us/documentation/articles/cloud-services-dotnet-diagnostics-storage/
http://blogs.msdn.com/b/powerbi/archive/2015/09/30/monitor-azure-audit-logs-with-power-bi.aspx
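To start that collection from PowerShell, a rough sketch, assuming the Az modules, placeholder resource names, and a standard diagnostics public-config XML file already on disk:

# Enable the guest-level diagnostics extension on a VM so its logs and
# counters are persisted to the given storage account. Resource names and
# the config path are placeholders; assumes an authenticated Az session.
Set-AzVMDiagnosticsExtension -ResourceGroupName 'my-rg' `
    -VMName 'my-vm' `
    -DiagnosticsConfigurationPath 'C:\configs\DiagnosticsConfig.xml' `
    -StorageAccountName 'mydiagstorage'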

I found this, which is exactly what I'm looking for.
So now I just have to write the script and have Splunk run it on a schedule.
Thanks, guys!

Related

Can we save Azure Functions logs to a file?

I know this might be a repeated question, but I am asking it again because I am not able to find any specific answer.
I am new to Azure Functions. I have written an Azure Function in Java. I have a requirement to save the logs to files that roll daily (i.e. a new log file should be created each day, named %fileName%_ddmmyy).
When I use context.getLogger(), I am able to see the logs under Application Insights and Azure Monitor, but I can't find any option to save them to a log file. If I use Log4j or the like, I cannot see the logs under Application Insights and Azure Monitor.
I want to be able to see the logs under Application Insights and Azure Monitor as well as save them to a log file that rotates daily.
Is there any way I can achieve this scenario? Any help would be appreciated.
PS: I need it in Java only. I am using Java 8.
AFAIK, you can manually export the logs each day using the export option in the Logs section of the function app or of Application Insights.
Alternatively, you can use a diagnostic setting on the function app and send the logs to your storage account or to a Log Analytics workspace; a scripted equivalent is sketched after the steps below. This is an example workaround:
https://stackoverflow.com/a/73383532/17623802
Go to the Function App resource -> find Diagnostic settings under Monitoring
Click on Add diagnostic setting
Give your diagnostic setting a name
You can choose to export all logs and metrics, or select specific categories.
Then select Archive to a storage account
Select the subscription and storage account
DONE.
For more info -> https://learn.microsoft.com/en-us/azure/azure-functions/functions-monitor-log-analytics?tabs=csharp
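A scripted equivalent of the steps above, as a sketch assuming the Az.Monitor module; the subscription and resource names are placeholders, and FunctionAppLogs is the function-app log category:

# Create a diagnostic setting that archives function-app logs to a storage
# account. IDs below are placeholders; assumes an authenticated Az session.
$functionAppId = '/subscriptions/<sub-id>/resourceGroups/my-rg/providers/Microsoft.Web/sites/my-func-app'
$storageId = '/subscriptions/<sub-id>/resourceGroups/my-rg/providers/Microsoft.Storage/storageAccounts/mylogstorage'

Set-AzDiagnosticSetting -ResourceId $functionAppId `
    -Name 'archive-to-storage' `
    -StorageAccountId $storageId `
    -Category 'FunctionAppLogs' `
    -Enabled $true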
ALTERNATE
If the above doesn't work, you can also create a Stream Analytics job and dump the log data into a storage account.
https://learn.microsoft.com/en-us/azure/stream-analytics/app-insights-export-stream-analytics

Azure Cloud Service (Classic) - Any way to log Diagnostics.Trace logs to blob storage

I've been asked to change an old Azure Cloud Service worker's logging to the System.Diagnostics.Trace style of logging. I've done that, and now I'm about ready to deploy it to Azure.
The client requirement is that these logs should appear in blob storage, similar to how the more modern app service logs can be configured to write their diagnostics to blob storage. There is an expectation that logs can be batched up and uploaded periodically (perhaps time or number of lines based).
Is there a NuGet package, another library, or some configuration I should enable to connect the application to blob storage? I've spent about 20 minutes searching here and online for a solution, but the information seems mainly to talk about writing logs to Table Storage.
Edit: More detail:
This is an existing app (C# .Net Framework 4.5) that used to use an external logging service.
I assumed (incorrectly, I think) that the logging to blob storage was something I could configure in the Azure Portal.
As things are right now, no log file of any kind is generated, but when I run the code in Visual Studio, I can see some output from the logging statements.
I have updated the code to use a standard (custom) logging system that eventually boils down to statements like the one below:
Trace.TraceInformation($"DEBUG: {message}");
Here are some links I found with related information:
Streaming from command line
Trace listener question
Adding Trace to existing website
Performance Impact of Logging
Smarx Library
The logging is configured by the diagnostics.wadcfgx file, which you can see in your solution.
This holds all of the diagnostic information that you want to collect. It can be controlled via the "Properties" of the Web\Worker role (right-click -> Properties).
From there, there is also an option to specify the storage account.
This isn't always ideal if you are deploying to multiple environments, so you should be able to alter the configuration from the Azure Portal by downloading and uploading a new configuration, following these instructions.
Think of logging to blob storage as uploading existing files to blob storage. If your current app creates log files, you should use the Put Blob operation or an append blob to add those files to blob storage, which means you must interact with the storage SDK to make these transactions. You could also leverage Logic Apps, which has connectors to blob storage and can perform certain actions based on specific triggers (a timestamp and other conditions).
If you would like to see logs generated about Azure Storage itself, you'll have to enable storage diagnostics, but those logs pertain to the storage account, not your app.
Since you mentioned that you can see the output, you have to write that output to an artifact such as a text file and then upload it to the storage account. You can find SDK information for C# here. I hope this helps.
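If scripting the transfer is easier than wiring up the C# SDK, the same upload can be sketched with the Az.Storage PowerShell module; the account name, key, container, and file path are all placeholders:

# Upload a batched log file to blob storage. Credentials and paths are
# placeholders; assumes the Az.Storage module.
$ctx = New-AzStorageContext -StorageAccountName 'mylogstorage' -StorageAccountKey '<account-key>'
Set-AzStorageBlobContent -File 'C:\logs\worker.log' `
    -Container 'worker-logs' `
    -Blob 'worker.log' `
    -Context $ctx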

View Azure Blob Metadata Online

Is there a way to examine an Azure blob's metadata through a web interface or the Azure portal?
I'm running into a problem where I set metadata on a blob programmatically, without any problems, but when I go back to read the metadata in another section of the program there isn't any. So I'd like to confirm that the metadata was, in fact, written to the cloud.
One of the simplest ways to set/get an Azure Storage Blob's metadata is by using the cross-platform Microsoft Azure Storage Explorer, which is a standalone app from Microsoft that allows you to easily work with Azure Storage data on Windows, macOS and Linux.
Just right-click the blob you want to examine and select Properties; you will see the metadata list if any metadata exists.
Note: Version tested - 0.8.7
There is no way to check this in the portal; however, you can try the Storage Explorer tool.
If you want to check the metadata in your code, please try this: Get blob metadata.
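And for checking it from a script, a small sketch assuming the Az.Storage PowerShell module; the account, container, and blob names are placeholders:

# Read back a blob's metadata to confirm it was actually written. Names are
# placeholders; assumes the Az.Storage module.
$ctx = New-AzStorageContext -StorageAccountName 'mystorage' -StorageAccountKey '<account-key>'
$blob = Get-AzStorageBlob -Container 'mycontainer' -Blob 'myblob.txt' -Context $ctx
$blob.ICloudBlob.Metadata    # the metadata key/value dictionary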

Azure - best way to review logs (web app, Redis)?

I store logs from an Azure Web App and Redis Cache in storage accounts, but I wonder: what is the best way to analyze them?
Redis seems to store its diagnostics information in WADMetrics* tables, while the web app writes .csv and .log files into storage, but I don't see either of those as an option under Log Analytics > Workspace data sources > Storage account logs.
Is there a standard (Azure) way to consume, analyze and (preferably) automatically act upon content of those logs?
Answering my own question based on the investigation I've done so far; maybe it will help someone :)
Log Analytics doesn't digest the log data from web apps (I have no idea why, since they seem to be rather standard IIS logs).
The only reasonable way I found to consume and analyze the log data is with Power BI. You can easily set up the storage account as the data source, then massage the data and get the reports you need.
So far I haven't come up with a way to generate alerts based on the content of the logs without using tools like Splunk or Sumo Logic.

Culling/Managing Azure Log Files / Failed Request Logs in Blob Storage

I've just discovered that I have hundreds of GB of log files and failed-request logs in Azure blob storage that have been accumulating over the years. Is there a tool or technique for managing them? The directory structure is convoluted, so it's not as easy as just sorting by date. (I use Cloud Storage Studio as an Azure management tool.)
[With apologies in advance if this feels like a product plug] You could look into Azure Diagnostics Manager (http://www.cerebrata.com/Products/AzureDiagnosticsManager). This tool is built specifically for viewing/managing Windows Azure Diagnostics. You could also look into Azure Management Studio (http://www.cerebrata.com), which combines Cloud Storage Studio and Azure Diagnostics Manager into one product and is currently in public beta.
Both tools allow you to purge old data and search for log data based on date ranges.
(Disclosure: I'm part of Cerebrata team)
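If a scripted purge is an option alongside a GUI tool, a sketch along these lines, assuming the Az.Storage module with the container name, credentials, and retention window as placeholders, culls by date regardless of how convoluted the directory structure is:

# Delete diagnostics blobs older than 90 days from one container.
# Names, key, and the cutoff are placeholders; assumes the Az.Storage module.
$ctx = New-AzStorageContext -StorageAccountName 'mydiagstorage' -StorageAccountKey '<account-key>'
$cutoff = (Get-Date).ToUniversalTime().AddDays(-90)

Get-AzStorageBlob -Container 'wad-iis-logfiles' -Context $ctx |
    Where-Object { $_.LastModified.UtcDateTime -lt $cutoff } |
    Remove-AzStorageBlob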
