How to get CRUD-level access logs for Azure Files

How do I get CRUD-level access logs for Azure Files? Has anyone used them to check who accessed a file, and whether they updated or deleted it? Please let me know if you can help.

As far as I know, Azure Storage has diagnostic settings that can write logs for the Blob, Queue, and Table services, but not for Files. You can find the diagnostic settings on the storage account.
When you enable the logging, the logs are written and stored in the storage account itself, in the hidden $logs container.
That is what I know about CRUD-level logging; only the File service lacks the feature. I am not sure whether there is another way to achieve what you want, but most likely there is not.
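To illustrate what those entries look like, here is a minimal C# sketch that reads the classic Storage Analytics logs back out of the hidden $logs container with the Azure.Storage.Blobs SDK. The connection string is a placeholder; in the documented log format the operation type (e.g. DeleteBlob) is the third semicolon-delimited field.

using System;
using Azure.Storage.Blobs;

var logs = new BlobContainerClient("<storage-connection-string>", "$logs");

// Classic log blobs are organized as <service>/YYYY/MM/DD/hhmm/<counter>.log.
foreach (var blob in logs.GetBlobs(prefix: "blob/"))
{
    string text = logs.GetBlobClient(blob.Name).DownloadContent().Value.Content.ToString();
    foreach (string line in text.Split('\n'))
    {
        string[] fields = line.Split(';');
        if (fields.Length > 2 && fields[2].Contains("Delete"))
            Console.WriteLine(line); // surface delete operations; adjust for other CRUD verbs
    }
}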

Related

How to find who has deleted a file share folder from a storage account in Azure

Someone has deleted a file share folder from a storage account in Azure. It can be recovered because soft delete is enabled, but how can I find out who deleted it?
It is possible to view operations within an Azure resource using resource logs; monitoring Azure Blob Storage this way is a feature of Azure Monitor.
You would first start by creating a diagnostic setting: https://learn.microsoft.com/en-us/azure/storage/blobs/monitor-blob-storage?tabs=azure-portal#creating-a-diagnostic-setting
Then view the logged activity using a Log Analytics query, or go to the destination you are forwarding the logs to (as set up in the diagnostic setting) and look for the respective API operation, for example "DeleteBlob" or "DeleteContainer".
However, if you had not already set up a diagnostic setting forwarding data to a destination before the deletion, it may not be possible to retrieve this information now. Hope this helps!
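If the logs land in a Log Analytics workspace, they can also be pulled programmatically. Here is a minimal C# sketch using the Azure.Monitor.Query SDK; the workspace ID is a placeholder, and note that for a file share the analogous table would be StorageFileLogs (with operations such as DeleteFile or DeleteDirectory) rather than StorageBlobLogs.

using System;
using Azure.Identity;
using Azure.Monitor.Query;

var client = new LogsQueryClient(new DefaultAzureCredential());

// KQL: surface delete operations recorded by the diagnostic setting.
string kql = @"
StorageBlobLogs
| where OperationName in ('DeleteBlob', 'DeleteContainer')
| project TimeGenerated, OperationName, AuthenticationType, RequesterObjectId, Uri";

var result = await client.QueryWorkspaceAsync(
    "<workspace-id>", // placeholder Log Analytics workspace GUID
    kql,
    new QueryTimeRange(TimeSpan.FromDays(7)));

foreach (var row in result.Value.Table.Rows)
    Console.WriteLine(string.Join(" | ", row));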

Can we save Azure Functions logs to a file?

I know this might be a repeated question, but I am asking it again because I am not able to find any specific answer.
I am new to Azure Functions. I have written an Azure Function in Java. I have a requirement to save the logs into files which will be daily rolling log files (i.e. a new log file should be created each day with the name %fileName%_ddmmyy).
When I use context.getLogger(), I am able to see the logs under Application Insights and Azure Monitor, but I can't find any option to save them to a log file. If I use Log4j etc., I cannot see the logs under Application Insights and Azure Monitor.
I want to be able to see the logs under Application Insights and Azure Monitor as well as save them to a log file which rotates daily.
Is there any way I can achieve this scenario? Any help would be appreciated.
PS: I need it in Java only. I am using Java 8.
AFAIK, you can manually export the logs each day using the export option in the Logs section of the function app or of Application Insights.
Alternatively, you can use a diagnostic setting on the function app and send the logs to your storage account or to a Log Analytics workspace. This is an example workaround:
https://stackoverflow.com/a/73383532/17623802
Go to the Function App resource -> find Diagnostic settings under Monitoring
Click on Add diagnostic setting
Give your diagnostic setting a name
You can choose to export all logs and metrics, or select specific categories
Then select Archive to a storage account
Select the subscription and storage account
DONE.
For more info: https://learn.microsoft.com/en-us/azure/azure-functions/functions-monitor-log-analytics?tabs=csharp
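As a rough illustration of turning those archived logs into the daily rolling file the question asks for, here is a hedged C# sketch (the asker needs Java, but the storage SDK calls translate directly). It assumes the diagnostic setting archives to a container named insights-logs-functionapplogs, which depends on the chosen log category; the connection string and file-name pattern are placeholders.

using System;
using System.IO;
using Azure.Storage.Blobs;

var container = new BlobContainerClient(
    "<storage-connection-string>", "insights-logs-functionapplogs");

// Archived blobs are partitioned by date (".../y=2025/m=01/d=31/...").
DateTime today = DateTime.UtcNow;
string datePart = $"y={today:yyyy}/m={today:MM}/d={today:dd}";
string dailyFile = $"mylogs_{today:ddMMyy}.log"; // the %fileName%_ddmmyy pattern

using var output = File.AppendText(dailyFile);
foreach (var blob in container.GetBlobs())
{
    if (!blob.Name.Contains(datePart)) continue; // keep only today's blobs
    var content = container.GetBlobClient(blob.Name).DownloadContent();
    output.WriteLine(content.Value.Content.ToString());
}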
ALTERNATE
If the above doesn't work, you can also create a Stream Analytics job and dump the log data into a storage account.
https://learn.microsoft.com/en-us/azure/stream-analytics/app-insights-export-stream-analytics

Diagnostic Logs (Storage) not showing up in Azure Storage Account

I'm using services such as the on-premises data gateway, Logic Apps, etc., and I've configured diagnostics on these instances to send logs to both Log Analytics and Storage.
I go to the storage account, and I see zero log files.
I go to containers --> it's empty.
I go to tables --> it's empty.
I really need to see the logs. What do I do?
Here are some things you need to check.
1. There is a latency before the logs take effect. Please wait a few minutes, then check whether the logs are stored in blob storage.
2. Please check your diagnostic settings and confirm everything is configured correctly, e.g. that you chose the right log types and metrics.
3. Make sure you check the logs in the blob storage account that you configured in your diagnostic settings.
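As a quick sanity check for point 3, a short sketch (connection string is a placeholder) that lists the containers in the account, so you can confirm whether the insights-logs-* containers created by the diagnostic setting exist at all:

using System;
using Azure.Storage.Blobs;

var service = new BlobServiceClient("<storage-connection-string>");
foreach (var container in service.GetBlobContainers())
    Console.WriteLine(container.Name); // look for insights-logs-... entries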

Azure Cloud Service (Classic) - Any way to log Diagnostics.Trace logs to BLOB storage

I've been asked to change an old Azure Cloud Service worker to the System.Diagnostics.Trace style of logging. I've done that, and now I'm about ready to deploy it to Azure.
The client requirement is that these logs should appear in blob storage, similar to how the more modern app service logs can be configured to write their diagnostics to blob storage. There is an expectation that logs can be batched up and uploaded periodically (perhaps time or number of lines based).
Is there a NuGet package, other library, or config I should enable to connect the application to blob storage? I've spent about 20 minutes searching here and online for a solution, but the information I find mainly talks about writing logs to Table Storage.
Edit: More detail:
This is an existing app (C# .NET Framework 4.5) that used to use an external logging service.
I assumed (incorrectly, I think) that logging to blob storage was something I could configure in the Azure Portal.
As things stand right now, no log file of any kind is generated, but when I run the code in Visual Studio I can see some output from the logging statements.
I have updated the code to use a standard (custom) logging system that eventually boils down to statements like the one below:
Trace.TraceInformation($"DEBUG: {message}");
Here are some links I found with related information:
Streaming from command line
Trace listener question
Adding Trace to existing website
Performance Impact of Logging
Smarx Library
The logging is configured by the diagnostics.wadcfgx file which you can see in your solution.
This holds all of the diagnostic information that you want to collect. This can be controlled via the "Properties" of the Web\Worker role (right-click -> Properties).
From there, there is also the option to specify the storage account.
This isn't always ideal if you are deploying to multiple environments, so you should be able to alter the configuration from the Azure Portal, by downloading and uploading new configuration, following these instructions.
Think of logging to blob storage as uploading existing files to blob storage. If your app creates log files, you should use the Put Blob or Append Blob operations to add them to blob storage, which means interacting with the storage SDK. You could also leverage Logic Apps, which has connectors to blob storage and can perform actions based on specific triggers (timestamps and other conditions).
If you would like to see generated logs in Azure Storage, you'll have to enable Azure diagnostics, but those logs pertain to the storage account itself, not your app.
Since you mentioned that you can see the output, you need to capture that output as an object (e.g. a text file) and upload it to the storage account. You can find the C# SDK information here. I hope this helps.
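One way to wire this up, sketched on the assumption that taking a dependency on the Azure.Storage.Blobs package is acceptable (it targets newer .NET Framework versions than the 4.5 mentioned above; on 4.5 the legacy WindowsAzure.Storage package with CloudAppendBlob would be the equivalent): a custom TraceListener that appends every Trace message to an append blob. The container and blob names are placeholders, and a production version would batch writes rather than hit storage on every line.

using System;
using System.Diagnostics;
using System.IO;
using System.Text;
using Azure.Storage.Blobs.Specialized;

class AppendBlobTraceListener : TraceListener
{
    private readonly AppendBlobClient _blob;

    public AppendBlobTraceListener(string connectionString)
    {
        // One append blob per day gives a simple daily rolling log.
        _blob = new AppendBlobClient(
            connectionString, "worker-logs", $"trace_{DateTime.UtcNow:ddMMyy}.log");
        _blob.CreateIfNotExists();
    }

    public override void Write(string message) => Append(message);
    public override void WriteLine(string message) => Append(message + Environment.NewLine);

    private void Append(string message)
    {
        using (var stream = new MemoryStream(Encoding.UTF8.GetBytes(message)))
        {
            _blob.AppendBlock(stream);
        }
    }
}

You would register it once at role startup, e.g. Trace.Listeners.Add(new AppendBlobTraceListener("<storage-connection-string>")); after that, the existing Trace.TraceInformation calls flow to the blob.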

Azure - best way to review logs (web app, Redis)?

I store logs from an Azure Web App and Redis Cache in storage accounts, but I wonder: what is the best way to analyze them?
Redis seems to store its diagnostic information in WADMetrics* tables, while the web app puts .csv and .log files into storage, but I don't see either of those as an option under Log Analytics > Workspace data sources > Storage account logs.
Is there a standard (Azure) way to consume, analyze and (preferably) automatically act upon content of those logs?
Answering my own question based on the investigation I've done so far; maybe it will help someone :)
Log Analytics doesn't ingest the log data from web apps (I have no idea why, since they seem to be rather standard IIS logs).
The only reasonable way I found to consume and analyze the log data is with Power BI. You can easily set up the storage account as the data source, then massage the data and get the reports you need.
So far I haven't come up with a way to generate alerts based on the content of the logs without using tools like Splunk or Sumo Logic.
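For what it's worth, the WADMetrics* tables can also be read directly with the Azure.Data.Tables SDK rather than Power BI. A hedged sketch that discovers the generated table names by prefix (the connection string is a placeholder):

using System;
using System.Linq;
using Azure.Data.Tables;

var service = new TableServiceClient("<storage-connection-string>");

// Table names sort lexicographically, so a range filter matches the prefix.
foreach (var table in service.Query(
             filter: "TableName ge 'WADMetrics' and TableName lt 'WADMetricsz'"))
{
    Console.WriteLine("-- " + table.Name);
    var client = service.GetTableClient(table.Name);
    foreach (var entity in client.Query<TableEntity>().Take(5)) // peek at a few rows
        Console.WriteLine(entity.PartitionKey + " " + entity.RowKey);
}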
