Logs stored in storage account monitored by Azure Sentinel - azure

I want to understand: if I am sending logs from a Log Analytics workspace to a storage account in Azure, is it possible that those logs will be monitored by Azure Sentinel when the storage connector is used, or will Azure Sentinel only monitor the storage account's own logs?

It sounds like you wish to read the contents of Azure Storage and ingest them into Sentinel. If so, you will need a custom connector. There is an Azure Function example on GitHub here.
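If it helps, below is a minimal sketch of what such a custom connector could look like, assuming a Log Analytics workspace backing Sentinel and the HTTP Data Collector API; the workspace ID, key, container and blob names are placeholders, and the GitHub sample referenced above should be preferred for production use:

import base64, hashlib, hmac, json, datetime
import requests
from azure.storage.blob import BlobClient

WORKSPACE_ID = "<log-analytics-workspace-id>"   # placeholder
SHARED_KEY = "<workspace-primary-key>"          # placeholder
LOG_TYPE = "ArchivedStorageLogs"                # custom table name (appears as ArchivedStorageLogs_CL)

def build_signature(date, content_length):
    # String-to-sign mandated by the Data Collector API
    string_to_sign = f"POST\n{content_length}\napplication/json\nx-ms-date:{date}\n/api/logs"
    digest = hmac.new(base64.b64decode(SHARED_KEY), string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    return f"SharedKey {WORKSPACE_ID}:{base64.b64encode(digest).decode()}"

def post_to_log_analytics(records):
    body = json.dumps(records)
    date = datetime.datetime.utcnow().strftime("%a, %d %b %Y %H:%M:%S GMT")
    headers = {
        "Content-Type": "application/json",
        "Log-Type": LOG_TYPE,
        "x-ms-date": date,
        "Authorization": build_signature(date, len(body)),
    }
    url = f"https://{WORKSPACE_ID}.ods.opinsights.azure.com/api/logs?api-version=2016-04-01"
    requests.post(url, data=body, headers=headers).raise_for_status()

# Read one exported log blob from the storage account and forward its JSON lines
blob = BlobClient.from_connection_string(
    "<storage-connection-string>",                 # placeholder
    container_name="insights-logs-auditlogs",      # placeholder: whichever container the export lands in
    blob_name="<path/to/exported-log.json>")
lines = blob.download_blob().readall().decode("utf-8").splitlines()
post_to_log_analytics([json.loads(line) for line in lines if line.strip()])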

Related

How to get request details coming to azure storage for monitoring?

For the past few days I have been charged for LRS read and write operations on Azure container storage. I have checked all the metrics for API calls to that Azure container and everything looks normal, but the storage container metrics show lots of requests every minute.
The request count is around 35k over the last few days, but I don't know where all these requests are coming from.
To get request details coming to Azure Storage for monitoring, you can make use of Storage Analytics.
Azure Storage Analytics performs logging and provides metrics data for a storage account. You can use this data to trace requests, analyze usage trends, and diagnose issues with your storage account.
Please check whether you have enabled Azure Storage Analytics metrics; if not, enable them as below:
Go to Azure portal -> Storage Accounts -> Your storage account -> Monitoring -> Diagnostic settings
The diagnostic logs will be saved in a container called $logs, which appears after Storage Analytics logging is enabled.
Please also check whether you have any Azure App Services or Azure Functions that use this storage account in the background, by enabling their diagnostic settings.
If the issue still persists, please raise an Azure support request to find out where exactly it is going wrong.
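If you prefer to turn the classic logging on from code rather than the portal, a rough sketch with the azure-storage-blob SDK could look like this (the connection string is a placeholder, and these settings only cover the blob service):

from azure.storage.blob import BlobAnalyticsLogging, BlobServiceClient, RetentionPolicy

service = BlobServiceClient.from_connection_string("<storage-connection-string>")
service.set_service_properties(
    analytics_logging=BlobAnalyticsLogging(
        version="1.0",
        read=True,       # log read requests (the LRS reads you are being billed for)
        write=True,      # log write requests
        delete=True,
        retention_policy=RetentionPolicy(enabled=True, days=7),
    )
)
# Per-request log blobs then start appearing in the $logs container.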
References:
How can I find the source of my Hot LRS Write Operations on Azure Storage Account? - Stack Overflow
Azure Storage Analytics metrics (classic) | Microsoft Docs

Is it possible to find details about ingress/egress for Azure Storage account?

Is there any way to find out on our own where the ingress/egress activities are originating from/to (at least some details) for Azure Storage account?
You should be able to find this information out by analyzing contents of $logs blob container in your storage account. This blob container contains Storage Analytics Logs. You may need to enable storage analytics logging for your storage account as I believe it is not enabled by default.
You can find more information about storage analytics logging here: https://learn.microsoft.com/en-us/azure/storage/common/storage-analytics-logging?tabs=dotnet.
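A rough sketch of that analysis in Python, using azure-storage-blob to pull the $logs blobs and tally requests by operation and requester IP; the field positions are taken from the documented 1.0 log format and should be verified against your log version:

from collections import Counter
from azure.storage.blob import ContainerClient

logs = ContainerClient.from_connection_string("<storage-connection-string>", "$logs")
by_operation, by_ip = Counter(), Counter()

for blob in logs.list_blobs():                       # names look like blob/2023/07/31/1800/000000.log
    text = logs.download_blob(blob.name).readall().decode("utf-8")
    for line in text.splitlines():
        fields = line.split(";")                     # Storage Analytics entries are semicolon-delimited
        if len(fields) > 15:
            by_operation[fields[2]] += 1             # operation-type
            by_ip[fields[15].split(":")[0]] += 1     # requester-ip-address (strip the port)

print(by_operation.most_common(10))
print(by_ip.most_common(10))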

How to display log messages from azure iot device client code

I see log messages in the Azure IoT device client source code like this:
log.debug("Connection already opened by TransportClient."); or
log.info("Device client opened successfully");
My question is: where are these log messages going? How can I get these messages for debugging purposes?
Thanks
In general, Blob Storage is added as a 'logging endpoint', which encompasses a storage account, a container in the account, and blobs in the container. Block blobs are used for storing text and binary data.
All logs get stored in block blobs in a container named $logs, which is automatically created when Storage Analytics is enabled for a storage account. The $logs container is located in the blob namespace of the storage account, for example: http://<accountname>.blob.core.windows.net/$logs
To view and analyze your log data, you should download the blobs that contain the log data you are interested in to a local machine. There are many tools, such as AzCopy, the Azure Storage Data Movement library, and the Azure Import/Export service, for importing or exporting data to and from your storage account. To view the logs, you can also use any tool that can access Azure Blob Storage, such as Visual Studio or Cerebrata Azure Management Studio.
In the case of the azure-iot-sdk, each IoT hub exposes a set of endpoints (service endpoints) for the solution's back end to communicate with the devices. An IoT hub has a default built-in endpoint (messages/events). By default, messages are routed to this built-in service-facing endpoint, which is compatible with Event Hubs. You can refer to the link below for the various ways to read from the built-in endpoint: https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-devguide-messages-read-builtin
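To make that concrete, here is a small sketch of reading the built-in endpoint with the azure-eventhub package; the Event Hubs-compatible connection string and name come from the IoT hub's Built-in endpoints blade and are placeholders here:

from azure.eventhub import EventHubConsumerClient

client = EventHubConsumerClient.from_connection_string(
    conn_str="<event-hubs-compatible-connection-string>",   # placeholder
    consumer_group="$Default",
    eventhub_name="<event-hubs-compatible-name>",           # placeholder
)

def on_event(partition_context, event):
    # Prints each device-to-cloud message as it arrives
    print(partition_context.partition_id, event.body_as_str())

with client:
    client.receive(on_event=on_event, starting_position="-1")   # "-1" = read from the beginning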
You can also create custom endpoints to route messages to by linking other services in your subscription to the IoT hub. If custom endpoints are created, a message is routed to every endpoint whose routing query it matches. IoT Hub can route messages to two storage services: Azure Blob Storage and Azure Data Lake Storage (ADLS) Gen2 accounts. You can refer to the link below for the various ways to read from a custom endpoint: https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-devguide-messages-read-custom
For logs from the IoT SDK itself, the logs are written to stdout or stderr depending on the type of log and the deployment environment, and can be redirected as required. The SDK uses the debug library for detailed logs. The link below gives additional details: https://github.com/Azure/azure-iot-sdk-node/wiki/Troubleshooting-Guide-Devices
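The Node.js SDK's detailed logs are gated behind the debug library's DEBUG environment variable, per the guide above. If you happen to be on the Python device SDK (azure-iot-device) instead, the analogous approach is the standard logging module; a small sketch, with the connection string as a placeholder:

import logging
from azure.iot.device import IoTHubDeviceClient

logging.basicConfig(level=logging.INFO)                         # root handler writes to stderr
logging.getLogger("azure.iot.device").setLevel(logging.DEBUG)   # surface the SDK's debug/info messages

client = IoTHubDeviceClient.create_from_connection_string("<device-connection-string>")
client.connect()      # the SDK's connection logging now shows up alongside your own logs
client.disconnect()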

How to monitor read write activities on Azure Blob Storage

I need to figure out how to log/retrieve information about who (which Azure AD user) has read or written blobs in our Azure Blob Storage.
I know you can turn on logging at the storage account level.
I can see in the logs the different API calls that have been performed on the blob, but when I went via the Azure portal myself to open some of the blobs, I could not see this activity recorded in the logs. Any ideas how to monitor this? I need it for auditing purposes.
When you enable Storage Analytics in the portal, you will have a $logs folder in your blob storage containing the storage logs.
When you are using Azure AD authentication you need to configure 2.0 logs, use the UserPrincipalName column to identify the user, and parse the JSON column AuthorizationDetail.action to identify the user's action on storage, e.g. Microsoft.Storage/storageAccounts/blobServices/containers/read for listing the blobs in a container.
You will not capture OAuth/Azure AD authenticated requests with log format 1.0.
On Azure Storage UserVoice there is also a request for integration with Log Analytics to simplify log monitoring; the private preview should start this month.
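As a rough way to pull the Azure AD-authenticated entries out of those 2.0-format logs for inspection, a sketch along these lines might help; only the leading field positions (version, operation-type, authentication-type) are assumed from the documented layout, and the UPN lookup is a simple heuristic, so map the exact 2.0 columns against the schema before relying on it:

from azure.storage.blob import ContainerClient

logs = ContainerClient.from_connection_string("<storage-connection-string>", "$logs")
for blob in logs.list_blobs():
    for line in logs.download_blob(blob.name).readall().decode("utf-8").splitlines():
        fields = line.split(";")
        # keep 2.0 entries that look like they carry a UserPrincipalName
        if fields and fields[0] == "2.0" and "@" in line:
            print(fields[2], fields[7], line)   # operation-type, authentication-type, full entry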

How to store Azure app services diagnostic logs to Azure Table Storage?

I want to store my API's logs (the API is hosted on App Services) in Azure Table Storage using Azure Diagnostics. Currently I can store the logs in a blob container, but I am unable to find an option to store them in Table Storage.
After some Google searching I found that the classic Azure portal supported storing logs in Table Storage, but when I try to log in to the classic portal it automatically redirects me to the current portal.
Basically, I want to view the logs using Azure Log Analytics, but I'm unable to view logs from a blob container there.
Please shed some light on this issue: either show me a way to read logs into Azure Log Analytics from a blob container, or show me a way to store App Services diagnostic logs in Table Storage.
