How to monitor read/write activities on Azure Blob Storage

I need to figure out how to log/retrieve information about who (which Azure AD user) has read or written blobs in our Azure Blob Storage.
I know you can turn on logging at the storage account level.
I can see in the logs the different API calls that have been performed on the blobs, but when I opened some of the blobs myself via the Azure portal, I could not see this activity recorded in the logs. Any ideas how to monitor this? I need it for auditing purposes.

When you enable Storage Analytics in the portal, a $logs container is created in your Blob service and holds the storage logs.
When you are using Azure AD authentication, you need to configure version 2.0 logs and use the UserPrincipalName column to identify the user; parse the JSON in the AuthorizationDetail.action column to identify the action performed on storage, e.g. Microsoft.Storage/storageAccounts/blobServices/containers/read for listing the blobs in a container.
You will not capture OAuth/Azure AD authenticated requests with log format 1.0.
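As a rough illustration, a minimal Python sketch for scanning the $logs container for a given user's activity might look like this (it assumes a connection string in an environment variable and version 2.0 log entries; the UPN and action strings are placeholders to adapt):

    import os
    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient.from_connection_string(os.environ["AZURE_STORAGE_CONNECTION_STRING"])
    logs = service.get_container_client("$logs")

    target_upn = "user@contoso.com"  # hypothetical Azure AD user to audit
    target_action = "Microsoft.Storage/storageAccounts/blobServices/containers/read"

    # Blob-service log blobs are organized as blob/YYYY/MM/DD/hhmm/NNNNNN.log
    for blob in logs.list_blobs(name_starts_with="blob/"):
        text = logs.download_blob(blob.name).readall().decode("utf-8")
        for line in text.splitlines():
            # Version 2.0 entries are semicolon-delimited and carry the
            # UserPrincipalName and AuthorizationDetail columns for OAuth requests.
            if line.startswith("2.0;") and target_upn in line and target_action in line:
                print(blob.name, line)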
On Azure Storage UserVoice there is also a request for integration with Log Analytics to simplify log monitoring; the private preview should start this month.

Related

How to get request details coming to Azure Storage for monitoring?

For the past few days, I have been charged for LRS read and write operations on my Azure storage container. I have checked all the metrics for API calls to that container and everything looks normal, but the storage container metrics show lots of requests every minute.
The request count is around 35k over the last few days, but I don't know where all these requests are coming from.
To get details of the requests coming to Azure Storage for monitoring, you can make use of Storage Analytics.
Azure Storage Analytics performs logging and provides metrics data for a storage account. You can use this data to trace requests, analyze usage trends, and diagnose issues with your storage account.
Please check whether you have enabled Azure Storage Analytics metrics; if not, enable it as below:
Go to Azure portal -> Storage Accounts -> Your storage account -> Monitoring -> Diagnostic settings
The diagnostic logs will be saved in a container called $logs, which appears after Storage Analytics logging is enabled.
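If you would rather enable it from code than through the portal, a minimal sketch with the azure-storage-blob Python SDK could look like the following (the connection string is a placeholder; the version "2.0" setting follows the first answer above and may need checking against your SDK/service version):

    import os
    from azure.storage.blob import BlobServiceClient, BlobAnalyticsLogging, RetentionPolicy

    service = BlobServiceClient.from_connection_string(os.environ["AZURE_STORAGE_CONNECTION_STRING"])
    service.set_service_properties(
        analytics_logging=BlobAnalyticsLogging(
            version="2.0",  # per the first answer, 2.0 captures Azure AD (OAuth) requests
            read=True,
            write=True,
            delete=True,
            retention_policy=RetentionPolicy(enabled=True, days=7),
        )
    )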
Also check whether you have any Azure App Services or Azure Functions that use the storage account in the background, by enabling their diagnostic settings.
If the issue still persists, please raise an Azure support request to find out where exactly it is going wrong.
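Before raising a ticket, you can also get a rough idea of where the requests come from by tallying operation types and caller IPs from the $logs entries. This sketch assumes the documented version 1.0 column order (operation-type at index 2, requester-ip-address at index 15); verify the indices against your own log lines:

    import csv, io, os
    from collections import Counter
    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient.from_connection_string(os.environ["AZURE_STORAGE_CONNECTION_STRING"])
    logs = service.get_container_client("$logs")

    ops, ips = Counter(), Counter()
    for blob in logs.list_blobs():
        text = logs.download_blob(blob.name).readall().decode("utf-8")
        # Fields are semicolon-delimited and quoted, so use csv rather than a naive split.
        for row in csv.reader(io.StringIO(text), delimiter=";", quotechar='"'):
            if len(row) > 15:
                ops[row[2]] += 1                 # operation-type
                ips[row[15].split(":")[0]] += 1  # requester-ip-address (strip the port)

    print(ops.most_common(10))
    print(ips.most_common(10))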
References:
How can I find the source of my Hot LRS Write Operations on Azure Storage Account? - Stack Overflow
Azure Storage Analytics metrics (classic) | Microsoft Docs

Unable to link storage account to Log Analytics workspace

We are using Fluent Bit to send application logs to an Azure Log Analytics workspace. The application logs appear in the workspace as a table under the Logs blade, Custom Logs category. So far so good.
Because the maximum retention period of a Log Analytics workspace is limited to 730 days, I thought linking a storage account of type Custom logs & IIS logs under Linked storage accounts would solve the problem for me. My understanding is that once a storage account is linked as Custom logs & IIS logs, all Custom Logs will be written into the nominated storage account instead of the default storage account that comes with the creation of the Log Analytics workspace. Is this understanding correct?
Secondly, after clicking on the Custom logs & IIS logs item and selecting a storage account from the pop-up blade on the left-hand side, the Azure portal reported the message "Successfully linked storage account". However, the Linked storage accounts view still reports "No linked storage accounts".
Browsing the target storage account, no logs seem to be written to it.
Update 1
Storage account network configuration.
Update 2
The answer is accepted as it is technically correct. However, a few steps/details are missing from the documentation. In order to map a customer storage account to a Log Analytics workspace, one must build the following resources:
Create an AMPLS (Azure Monitor Private Link Scope) resource.
Link the AMPLS resource to your LA workspace.
Create a private endpoint on the target VNet for the AMPLS resource.
Create a storage account.
Create private endpoints (blob type) on the target VNet.
Link the storage account to the LA workspace.
We need to follow a few prerequisites before linking the storage account to the workspace:
The storage account should be in the same region as the Log Analytics workspace.
You need to grant permissions for other services to access the storage account.
Allow Azure Monitor to access the storage account. If you chose to allow only selected networks to access your storage account, select the exception "Allow trusted Microsoft services to access this storage account" (see the sketch after this list).
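For illustration, setting that exception programmatically might look like this sketch with the azure-mgmt-storage Python SDK (the subscription, resource group, and account names are placeholders):

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.storage import StorageManagementClient
    from azure.mgmt.storage.models import NetworkRuleSet, StorageAccountUpdateParameters

    client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")
    client.storage_accounts.update(
        "my-rg",
        "mystorageaccount",
        StorageAccountUpdateParameters(
            network_rule_set=NetworkRuleSet(
                default_action="Deny",   # only selected networks may connect
                bypass="AzureServices",  # the trusted-services exception
            )
        ),
    )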
For the rest of the configuration information, refer to MS Docs.
By following the above documentation, I could link the storage account successfully.
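For reference, the same link can also be created programmatically. This is only a sketch assuming the azure-mgmt-loganalytics package's linked_storage_accounts operations (operation and model names may vary across SDK versions), with placeholder resource names:

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.loganalytics import LogAnalyticsManagementClient

    client = LogAnalyticsManagementClient(DefaultAzureCredential(), "<subscription-id>")
    client.linked_storage_accounts.create_or_update(
        resource_group_name="my-rg",
        workspace_name="my-workspace",
        data_source_type="CustomLogs",  # the "Custom logs & IIS logs" link type
        parameters={
            "storage_account_ids": [
                "/subscriptions/<subscription-id>/resourceGroups/my-rg"
                "/providers/Microsoft.Storage/storageAccounts/mystorageaccount"
            ]
        },
    )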

Logs stored in storage account monitored by Azure Sentinel

I want to understand: if I am sending logs from a Log Analytics workspace to a storage account in Azure, will those logs be monitored by Azure Sentinel when the storage connector is used, or will Azure Sentinel only monitor the storage account's own logs?
It sounds like you wish to read the contents of Azure Storage and ingest them into Sentinel. If so, you will need a custom connector; there is an Azure Function example on GitHub.
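The core of such a custom connector is reading the blobs and pushing their records to the workspace via the HTTP Data Collector API. A minimal Python sketch of the posting half (the workspace ID, key, and table name are placeholders):

    import base64, datetime, hashlib, hmac, json
    import requests

    WORKSPACE_ID = "<workspace-id>"
    SHARED_KEY = "<primary-key>"  # workspace primary key (base64-encoded)
    LOG_TYPE = "MyStorageLogs"    # records land in the MyStorageLogs_CL table

    def post_to_log_analytics(records: list) -> int:
        body = json.dumps(records)
        date = datetime.datetime.utcnow().strftime("%a, %d %b %Y %H:%M:%S GMT")
        # Signature format as documented for the HTTP Data Collector API.
        string_to_sign = f"POST\n{len(body)}\napplication/json\nx-ms-date:{date}\n/api/logs"
        signature = base64.b64encode(
            hmac.new(base64.b64decode(SHARED_KEY),
                     string_to_sign.encode("utf-8"),
                     hashlib.sha256).digest()
        ).decode()
        resp = requests.post(
            f"https://{WORKSPACE_ID}.ods.opinsights.azure.com/api/logs?api-version=2016-04-01",
            data=body,
            headers={
                "Content-Type": "application/json",
                "Authorization": f"SharedKey {WORKSPACE_ID}:{signature}",
                "Log-Type": LOG_TYPE,
                "x-ms-date": date,
            },
        )
        return resp.status_code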

How to display log messages from Azure IoT device client code

I see log messages in the Azure IoT device client source code like this:
log.debug("Connection already opened by TransportClient."); or
log.info("Device client opened successfully");
My question is: where do these log messages go? How can I get these messages for debugging purposes?
Thanks
In general, Blob Storage is added as a 'logging endpoint', which encompasses a storage account, a container in the account, and blobs in the container. Blobs of type 'Block blob' are used for storing text and binary data.
All logs get stored in block blobs in a container named $logs, which is automatically created when Storage Analytics is enabled for a storage account. The $logs container is located in the blob namespace of the storage account, for example: http://<accountname>.blob.core.windows.net/$logs
To view and analyze your log data, you should download the blobs that contain the log data you are interested in to a local machine. There are many tools, such as AzCopy, the Azure Storage Data Movement library, and the Azure Import/Export service, to import or export data to and from your storage account. To view the logs, you can also use any tool that can access Azure Blob Storage, such as Visual Studio or Cerebrata Azure Management Studio.
In the case of the azure-iot-sdk, each IoT hub exposes a set of endpoints (service endpoints) for the solution's back end to communicate with the devices. An IoT hub has a default built-in endpoint (messages/events). By default, messages are routed to this built-in service-facing endpoint, which is compatible with Event Hubs. You can refer to the link below for the various methods to read from the built-in endpoint: https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-devguide-messages-read-builtin
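For example, reading the built-in endpoint with the Event Hubs SDK for Python could look like this sketch (the Event Hub-compatible connection string, available on the hub's Built-in endpoints blade, is a placeholder):

    from azure.eventhub import EventHubConsumerClient

    # The Event Hub-compatible connection string includes the EntityPath,
    # so no separate eventhub_name is needed here.
    client = EventHubConsumerClient.from_connection_string(
        "<event-hub-compatible-connection-string>",
        consumer_group="$Default",
    )

    def on_event(partition_context, event):
        print(partition_context.partition_id, event.body_as_str())
        partition_context.update_checkpoint(event)

    with client:
        client.receive(on_event=on_event, starting_position="-1")  # from the beginning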
You can also create custom endpoints to route messages to, by linking other services in your subscription to the IoT hub. If custom endpoints are created, a message is routed to every endpoint whose routing query it matches. There are two storage services IoT Hub can route messages to: Azure Blob Storage and ADLS (Azure Data Lake Storage) Gen2 accounts. You can refer to the link below for the various methods to read from a custom endpoint: https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-devguide-messages-read-custom
For logs from the IoT SDK itself, the logs are written to stdout or stderr depending on the log type and the deployment environment, and can be redirected as required. The Node.js SDK uses the debug library for detailed logs, which is turned on via the DEBUG environment variable. The link below gives additional details: https://github.com/Azure/azure-iot-sdk-node/wiki/Troubleshooting-Guide-Devices

How to store Azure App Services diagnostic logs in Azure Table Storage?

I want to store my API's logs (hosted on App Services) in Azure Table Storage using Azure Diagnostics. Currently I can store the logs in a blob container, but I am unable to find an option to store them in Table Storage.
After some Google searching I found that the classic Azure portal supported storing logs in Table Storage. When I try to log in to the classic portal, it automatically redirects me to the current portal.
Basically I want to view the logs using Azure Log Analytics, but I'm unable to view logs from a blob container there.
Please shed some light on this issue: either show me a way to get logs from a blob container into Azure Log Analytics, or show me a way to store App Services diagnostic logs in Table Storage.
