Azure Storage Account - Tracking SAS consumption

I have been searching for a few hours without success: is it possible to track the consumption of a SAS credential on a blob container?
I'm going to give SAS credentials to a couple of customers and I want to be able to track their usage of their SAS (number of operations, bandwidth usage, ...).
Is there a way to do it?
Thanks

You can check or track the SAS usage details with a KQL query in the Logs section. First, go to the Storage Account, click on Logs, and then execute the query below:
StorageBlobLogs
| where AuthenticationType contains "SAS"
This query returns a wide range of details about SAS usage, such as the operation name, status, caller IP address, and request/response sizes.
You can also set the time range over which you want to check the details.
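If you need the numbers the question asks about (operation counts and bandwidth), you can aggregate the same StorageBlobLogs data. Below is a minimal sketch that runs such a query with the azure-monitor-query Python SDK; the workspace ID is a placeholder, and it assumes the storage account's resource logs are already being sent to that Log Analytics workspace.

from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

# Placeholder: the Log Analytics workspace receiving the StorageBlobLogs resource logs.
WORKSPACE_ID = "<log-analytics-workspace-id>"

# Count operations and sum request/response sizes for SAS-authenticated requests,
# grouped by operation and caller IP address.
QUERY = """
StorageBlobLogs
| where AuthenticationType contains "SAS"
| summarize Operations = count(),
            IngressBytes = sum(RequestBodySize),
            EgressBytes = sum(ResponseBodySize)
  by OperationName, CallerIpAddress
| order by Operations desc
"""

client = LogsQueryClient(DefaultAzureCredential())
result = client.query_workspace(WORKSPACE_ID, QUERY, timespan=timedelta(days=7))

# Assumes the query succeeds; production code should also handle partial results.
for table in result.tables:
    for row in table.rows:
        print(dict(zip(table.columns, row)))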

Related

Select limited columns from App Insights log data while transferring to storage account using Diagnostic settings

I have configured a diagnostic setting in Application Insights to transfer telemetry data to a storage account. I do not want to transfer the user_authenticationId column from the pageViews data. How can I prevent it from being transferred to the storage account using diagnostic settings?
• Using a ‘Diagnostic setting’, it is not possible to exclude columns from the ‘pageViews’ category exported from Application Insights. Instead, you can exclude the ‘user_authenticationID’ column with an Application Insights log query run against the ‘pageViews’ table, then save that query as a function to be executed at a time of your choosing, or export the query's output for a particular time range to an Excel file or to a storage account of your choosing.
Please find below the Application Insights log query for excluding the column as stated above:
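As a rough sketch, the query drops the column with the project-away operator; here it is executed outside the portal with Python's requests library against the Application Insights query REST API (the app ID and API key are placeholders, and the column is commonly surfaced as user_AuthenticatedId, so check the exact name in your pageViews schema):

import requests

# Placeholders: the Application ID and an API key created under "API Access"
# on the Application Insights resource.
APP_ID = "<application-insights-app-id>"
API_KEY = "<application-insights-api-key>"

# Drop the unwanted column; adjust the name to whatever your pageViews schema shows.
QUERY = """
pageViews
| project-away user_AuthenticatedId
"""

response = requests.get(
    f"https://api.applicationinsights.io/v1/apps/{APP_ID}/query",
    headers={"x-api-key": API_KEY},
    params={"query": QUERY, "timespan": "PT24H"},  # last 24 hours
)
response.raise_for_status()

for table in response.json()["tables"]:
    print([column["name"] for column in table["columns"]])  # the excluded column is gone
    for row in table["rows"][:5]:
        print(row)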
Also, see the documentation link below for more detailed information on exporting the query results to a storage account and the requirements for doing so:
https://learn.microsoft.com/en-us/azure/azure-monitor/app/export-telemetry
In this way, you can achieve the desired result.

Searching Storage Account with Azure Log Analytics

Using Log Analytics, is it possible to search through data stored in a container inside an Azure storage account? We have an Azure Function that reaches out to an API in O365 for log data and then pushes that data into a storage account. We would like to be able to query this data.
You can push the content of your container to a Log Analytics workspace using the Log Analytics HTTP Data Collector API.
You need to build your own integration that sends the container content to Log Analytics by leveraging the HTTP Data Collector API.
You may refer to the suggestion mentioned in this article:
https://learn.microsoft.com/en-us/azure/azure-monitor/platform/data-collector-api
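For illustration, here is a minimal Python sketch of that kind of integration, following the signing scheme described in the article above; the workspace ID, primary key, and the MyContainerLogs log type name are all placeholders.

import base64
import datetime
import hashlib
import hmac
import json

import requests

# Placeholders: Log Analytics workspace ID and primary key (Agents management blade).
WORKSPACE_ID = "<log-analytics-workspace-id>"
SHARED_KEY = "<log-analytics-primary-key>"
LOG_TYPE = "MyContainerLogs"  # example name; records land in a MyContainerLogs_CL table

def post_records(records):
    body = json.dumps(records)
    rfc1123_date = datetime.datetime.utcnow().strftime("%a, %d %b %Y %H:%M:%S GMT")
    # HMAC-SHA256 signature over the canonical string required by the Data Collector API.
    string_to_sign = f"POST\n{len(body)}\napplication/json\nx-ms-date:{rfc1123_date}\n/api/logs"
    signature = base64.b64encode(
        hmac.new(
            base64.b64decode(SHARED_KEY),
            string_to_sign.encode("utf-8"),
            digestmod=hashlib.sha256,
        ).digest()
    ).decode()

    response = requests.post(
        f"https://{WORKSPACE_ID}.ods.opinsights.azure.com/api/logs?api-version=2016-04-01",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"SharedKey {WORKSPACE_ID}:{signature}",
            "Log-Type": LOG_TYPE,
            "x-ms-date": rfc1123_date,
        },
    )
    response.raise_for_status()

# Example: push one parsed O365 log entry read from a blob.
post_records([{"Operation": "FileAccessed", "User": "someone@example.com"}])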
Additional information: any of the following can run such an integration on a schedule:
- Azure Functions
- Azure Automation
- Logic Apps
With any of these, you have a schedule that runs at a certain interval. When it runs, you execute a query against Log Analytics to get the data and transfer the results to Azure Storage, for example as a blob. You might have to do some transformation on the data depending on your scenario. The most important thing is to make sure that you do not miss data or upload the same data twice to the storage; the Log Analytics query language allows you to specify a time frame for the results. I hope this helps.
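Here is a minimal sketch of such a scheduled step, assuming the ingested data landed in a hypothetical custom table named O365Logs_CL, and that each run covers one explicit hourly window so nothing is missed or duplicated; it uses the azure-monitor-query and azure-storage-blob SDKs with placeholder names.

import json
from datetime import datetime, timedelta, timezone

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient
from azure.storage.blob import BlobServiceClient

WORKSPACE_ID = "<log-analytics-workspace-id>"              # placeholder
ACCOUNT_URL = "https://<account>.blob.core.windows.net"    # placeholder
CONTAINER = "exported-logs"                                # assumed to exist already
TABLE = "O365Logs_CL"                                      # hypothetical custom log table

credential = DefaultAzureCredential()
logs = LogsQueryClient(credential)
blobs = BlobServiceClient(account_url=ACCOUNT_URL, credential=credential)

# Query a fixed, explicit window so reruns neither miss data nor duplicate it.
end = datetime.now(timezone.utc).replace(minute=0, second=0, microsecond=0)
start = end - timedelta(hours=1)

result = logs.query_workspace(
    WORKSPACE_ID, f"{TABLE} | sort by TimeGenerated asc", timespan=(start, end)
)
rows = [dict(zip(t.columns, r)) for t in result.tables for r in t.rows]

# One blob per window, named after the window, so rerunning the same hour is idempotent.
blob_name = f"{TABLE}/{start:%Y%m%d%H}.json"
blobs.get_blob_client(CONTAINER, blob_name).upload_blob(
    json.dumps(rows, default=str), overwrite=True
)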

Azure activity logs not displaying any write data

I'm trying to set up logging for a storage resource (a table specifically, though it seems the activity log doesn't go to that level and just logs the entire storage account).
The logging captures my ListKeys operations and occasional access from ApplicationInsights, but isn't logging any writes or reads I'm making to the tables themselves through either my app or the Microsoft Azure Storage Explorer. This table has been written to multiple times over the past few weeks, yet none of that activity shows up.
Am I misinterpreting this page, which states that this activity log should track posts/deletes? Do I need any additional setup to track these operations?
Per my understanding, you could leverage Storage Analytics logging to log the operations on your storage. For the detailed operations that are logged for the corresponding storage service, you could refer to this official document.
According to your description, I tested operations against table storage using both the REST API and the Storage Explorer tool. My test results showed the operations recorded in both the Table Storage Analytics logs and the Table Storage Metrics.
As noted in this document:
As requests are logged, Storage Analytics will upload intermediate results as blocks. Periodically, Storage Analytics will commit these blocks and make them available as a blob.
In summary, please follow this tutorial to enable and configure Storage Analytics, then wait for some time and check your table storage logging.
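As a quick way to check once logging is enabled, you can list the blobs that Storage Analytics writes into the hidden $logs container for the table service. A rough sketch with the azure-storage-blob SDK (the connection string is a placeholder):

from azure.storage.blob import BlobServiceClient

# Placeholder: connection string of the storage account whose table traffic is being logged.
CONNECTION_STRING = "<storage-account-connection-string>"

service = BlobServiceClient.from_connection_string(CONNECTION_STRING)
logs_container = service.get_container_client("$logs")

# Storage Analytics writes log blobs under <service>/YYYY/MM/DD/hhmm/;
# the "table/" prefix filters to table-service operations.
for blob in logs_container.list_blobs(name_starts_with="table/"):
    print(blob.name)
    log_text = logs_container.download_blob(blob.name).readall().decode("utf-8")
    # Each line is one logged operation in the semicolon-delimited Storage Analytics format.
    for line in log_text.splitlines()[:5]:
        print("  ", line)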
If you are leveraging the Azure Activity log, remember that it is meant for control-plane operations, so ListKeys would show up there.
If you are looking for data-plane operations (such as entity writes into a table), then make sure Diagnostics are turned on inside the storage account that you are writing to.
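If those diagnostics route the table service's resource logs to a Log Analytics workspace, a sketch like the one below will surface the data-plane writes that the Activity log never shows (the workspace ID is a placeholder, and StorageTableLogs is only populated once the StorageWrite/StorageRead/StorageDelete categories are being sent there):

from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

WORKSPACE_ID = "<log-analytics-workspace-id>"  # placeholder

# Write-style table operations; adjust the operation names to the ones you care about.
QUERY = """
StorageTableLogs
| where OperationName has_any ("InsertEntity", "UpdateEntity", "MergeEntity", "DeleteEntity")
| project TimeGenerated, OperationName, CallerIpAddress, Uri, StatusText
| order by TimeGenerated desc
"""

client = LogsQueryClient(DefaultAzureCredential())
result = client.query_workspace(WORKSPACE_ID, QUERY, timespan=timedelta(days=1))
for table in result.tables:
    for row in table.rows:
        print(dict(zip(table.columns, row)))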
The Azure Activity Log is only for management-plane records made through Azure Resource Manager (ARM), specifically PUT/DELETE/POST requests, which includes ListKeys since it is an HTTP POST.
For storage analytics logging, you can use this article to see the types of data logged.

Azure: how to count storage transactions

Can anyone explain to me how to count storage transactions?
For example, I need storage for 10 GB, and the daily incremental is about 100 MB.
How do I count the transactions?
The Azure Storage team published a blog post on this a long time back - http://blogs.msdn.com/b/windowsazurestorage/archive/2010/07/09/understanding-windows-azure-storage-billing-bandwidth-transactions-and-capacity.aspx. To understand how you're going to get charged for using Azure Storage, I would highly recommend reading this post.
Azure Storage also provides detailed analytics on the operations performed against your storage account. You can find information about the transactions by looking at storage analytics data. You may find this link helpful for that: http://blogs.msdn.com/b/windowsazurestorage/archive/tags/analytics+2d00+logging+_2600_amp_3b00_+metrics/.
Every single access to the storage counts as one transaction (even local access, e.g. from a web app to storage). Then you just have to calculate an average.
Transactions include both read and write operations to storage.
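Nowadays the per-account transaction count is also exposed as the Transactions metric in Azure Monitor, which you can pull programmatically. A small sketch with the azure-monitor-query metrics client (the resource ID is a placeholder):

from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import MetricsQueryClient, MetricAggregationType

# Placeholder: full resource ID of the storage account.
STORAGE_ACCOUNT_ID = (
    "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/"
    "Microsoft.Storage/storageAccounts/<account-name>"
)

client = MetricsQueryClient(DefaultAzureCredential())
result = client.query_resource(
    STORAGE_ACCOUNT_ID,
    metric_names=["Transactions"],
    timespan=timedelta(days=1),
    granularity=timedelta(hours=1),
    aggregations=[MetricAggregationType.TOTAL],
)

# Sum the hourly totals to get the transaction count for the day.
for metric in result.metrics:
    for series in metric.timeseries:
        total = sum(point.total or 0 for point in series.data)
        print(f"{metric.name}: {total:.0f} transactions in the last 24 hours")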

How can I tell how full an Azure Storage account is?

Is there any way of determining the used and/or remaining capacity of an Azure Storage account? I know that the current size limit is 100TB per storage account and I'm looking for either number of bytes used/remaining or, failing that, percentage used/remaining. However I can't see any way of monitoring this even in the Azure portal, let alone doing it programmatically via the API.
You have to enable Storage Analytics. Then read about Storage Account Monitoring.
Finally, take a look at the Analytics Metrics table(s). Note that it takes a minimum of 15 minutes until the metrics are updated.
And by the way, the Azure Storage account limit is 500 TB, as per Azure Subscription and Service Limits.
UPDATE
After reconsidering, the only way you can get a full storage capacity report is via the Account Billing page. Simply click on your name at the top right (management portal v.Current, a.k.a. https://manage.windowsazure.com/), then choose the "View My Bill" option.
This page is updated on a daily basis.
There is no API or other programmatic way to get these statistics.
Now it's possible to get it in Azure Monitor. In the Azure Portal, navigate to All services -> Monitor, click Explore Metrics and select your storage account. There are various useful metrics, and the Capacity metric is among them.
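The same Capacity data can also be read programmatically. Below is a minimal sketch using the azure-monitor-query metrics client to fetch the account-level UsedCapacity metric (the resource ID is a placeholder); the metric is reported in bytes.

from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import MetricsQueryClient, MetricAggregationType

STORAGE_ACCOUNT_ID = (
    "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/"
    "Microsoft.Storage/storageAccounts/<account-name>"
)  # placeholder

client = MetricsQueryClient(DefaultAzureCredential())
result = client.query_resource(
    STORAGE_ACCOUNT_ID,
    metric_names=["UsedCapacity"],  # account-wide used capacity, in bytes
    timespan=timedelta(days=1),
    granularity=timedelta(hours=1),
    aggregations=[MetricAggregationType.AVERAGE],
)

# Print the most recent populated data point.
for metric in result.metrics:
    for series in metric.timeseries:
        points = [p for p in series.data if p.average is not None]
        if points:
            latest = points[-1]
            print(f"Used capacity: {latest.average / 1024**3:.2f} GiB at {latest.timestamp}")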
If you open the storage account in the Azure Portal, there is a Metrics tab (in the menu on the left, not a conventional tab) which provides graphs of account utilisation, including used capacity.
This tab works even without using Storage Analytics or Azure Monitor.
Storage Metrics only stores capacity metrics for the blob service because blobs typically account for the largest proportion of stored data (at the time of writing, it is not possible to use Storage Metrics to monitor the capacity of your tables and queues). You can find this data in the $MetricsCapacityBlob table if you have enabled monitoring for the Blob service. Take a look at this Monitoring Capacity document for information on how to monitor the capacity of your storage accounts.
For help estimating the size of various storage objects such as blobs, see the blog post Understanding Azure Storage Billing – Bandwidth, Transactions, and Capacity.
Note that Storage does have APIs for accessing metric data programmatically using the CloudAnalyticsClient API. See CloudAnalyticsAPI for a summary.
