How can I tell how full an Azure Storage account is?

Is there any way of determining the used and/or remaining capacity of an Azure Storage account? I know that the current size limit is 100 TB per storage account, and I'm looking for either the number of bytes used/remaining or, failing that, the percentage used/remaining. However, I can't see any way of monitoring this even in the Azure portal, let alone doing it programmatically via the API.

You have to enable Storage Analytics, then read about Storage Account Monitoring, and finally take a look at the Analytics Metrics table(s). Note that it takes a minimum of 15 minutes until metrics are updated.
And btw, the Azure Storage account limit is 500 TB, as per Azure Subscription and Service Limits.
UPDATE
After reconsidering, the only way you can get a full storage capacity report is via the Account Billing page. Simply click on your name at the top right (management portal v.Current, a.k.a. https://manage.windowsazure.com/), then choose the "View My Bill" option.
This page is updated on a daily basis.
There is no API or other programmatic way to get these statistics.

Now it's possible to get it in Azure Monitor. In the Azure Portal, navigate to All services -> Monitor, click Explore Metrics and select your storage account. There are several useful metrics, and the Capacity metric is among them.
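For a programmatic route to the same numbers, here is a minimal sketch using the azure-monitor-query Python package to pull the account-level UsedCapacity metric; the subscription, resource group, and account names are placeholders you must fill in, and it assumes you are already authenticated (e.g. via Azure CLI) so DefaultAzureCredential can pick up credentials:

    from datetime import timedelta
    from azure.identity import DefaultAzureCredential
    from azure.monitor.query import MetricsQueryClient

    # Placeholder resource URI -- substitute your own subscription/resource group/account.
    resource_uri = (
        "/subscriptions/<subscription-id>/resourceGroups/<resource-group>"
        "/providers/Microsoft.Storage/storageAccounts/<account-name>"
    )

    client = MetricsQueryClient(DefaultAzureCredential())
    response = client.query_resource(
        resource_uri,
        metric_names=["UsedCapacity"],   # account-level capacity, in bytes
        timespan=timedelta(days=1),      # last 24 hours
        granularity=timedelta(hours=1),
    )

    for metric in response.metrics:
        for series in metric.timeseries:
            for point in series.data:
                if point.average is not None:
                    print(f"{point.timestamp}: {point.average / 1024**3:.2f} GiB used")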

If you open the storage account in the Azure Portal, there is a Metrics tab (in the menu on the left, not a conventional tab) which will provide you with graphs on account utilisation, including used capacity.
This tab works even without enabling Storage Analytics or Azure Monitor.

Storage Metrics only stores capacity metrics for the blob service because blobs typically account for the largest proportion of stored data (at the time of writing, it is not possible to use Storage Metrics to monitor the capacity of your tables and queues). You can find this data in the $MetricsCapacityBlob table if you have enabled monitoring for the Blob service. Take a look at this Monitoring Capacity document for information on how to monitor the capacity of your storage accounts.
For help estimating the size of various storage objects such as blobs, see the blog post Understanding Azure Storage Billing – Bandwidth, Transactions, and Capacity.
Note that Storage does have APIs for accessing metrics data programmatically using the CloudAnalyticsClient API. See CloudAnalyticsAPI for a summary.
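If you'd rather read the raw analytics table yourself, here is a minimal sketch (Python, azure-data-tables) that queries the $MetricsCapacityBlob table directly; it assumes metrics are already enabled and the connection string is a placeholder. Each day has a "data" row carrying your blobs' capacity in bytes and an "analytics" row covering the analytics data itself:

    from azure.data.tables import TableClient

    # Placeholder connection string for the storage account being monitored.
    conn_str = "<storage-account-connection-string>"
    table = TableClient.from_connection_string(
        conn_str, table_name="$MetricsCapacityBlob"
    )

    # RowKey 'data' = your blob data; RowKey 'analytics' = the analytics data itself.
    for entity in table.query_entities("RowKey eq 'data'"):
        capacity_gib = int(entity["Capacity"]) / 1024**3
        print(f"{entity['PartitionKey']}: {capacity_gib:.2f} GiB")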

Related

Why are there so many storage transactions when querying an Azure blob storage account?

We are investigating using Application Insights. We have set up a system where we continuously export telemetry from Application Insights to an Azure storage account, where they are stored as blobs. The continuous export seems to batch up events (JSON objects) so that each blob tends to contain more than one telemetry event.
We currently have around 2500 (test) events in the Azure storage account. They are stored in 280 blobs.
We are using Power BI to query the data from the Azure storage account. In the case above, I would have expected that querying all the data would amount to around 280 transactions (1 per blob). Instead, the Azure Portal is telling me that the "total requests" is over 2500, which is similar to the number of events inside those blobs.
Any idea why so many requests are being made? This obviously makes a big difference, as storage transactions are taken into account when calculating the cost of the service.

How can I programmatically find how much Azure Storage space I have consumed so far?

There's a volume limitation per Azure Storage Account of 200 TB (two hundred terabytes). This sounds really large, but if you store files in blob storage at 25 megabytes each, you can have about eight million of them stored, which is nice but not impossible to exhaust.
I want to craft some code that would periodically check how much space I've used and raise an alert.
How can I programmatically find how much space I have already consumed in my storage account?
It looks like the current limit for an Azure Storage Account is 500 TB (see here).
If your Azure Storage Account holds only blobs, you can use metrics to fetch the current capacity; note that the current metrics only show the capacity taken by blobs. See the Storage Metrics documentation and how to enable Storage Analytics.
Maybe this would help you: http://www.amido.com/richard-slater/windows-azure-storage-capacity-metrics-with-powershell/
Not sure about that, but it looks like you can create an alert in the Azure portal on this metric.
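If metrics don't cover your scenario, a brute-force sketch with the azure-storage-blob Python package is to list every blob and sum the sizes; the connection string and the 80% alert threshold here are placeholders, and bear in mind that every listing page is itself a billable transaction, so don't run this too often:

    from azure.storage.blob import BlobServiceClient

    conn_str = "<storage-account-connection-string>"  # placeholder
    service = BlobServiceClient.from_connection_string(conn_str)

    # Walk every container and add up the blob sizes.
    total_bytes = 0
    for container in service.list_containers():
        container_client = service.get_container_client(container.name)
        for blob in container_client.list_blobs():
            total_bytes += blob.size

    used_tb = total_bytes / 1024**4
    print(f"Used: {used_tb:.3f} TiB")
    if used_tb > 0.8 * 500:  # alert at 80% of the 500 TB account limit
        print("Warning: approaching the storage account capacity limit")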

Azure blob storage limitation and filter

Azure storage for a multi-tenant application: we are working to develop a multi-tenant application on Azure, with approximately 10,000 tenants and approximately 100 GB to 1 TB of data storage required per tenant. The application is to maintain documents and binary content, along with the metadata, for each tenant separately. We are leaning towards Azure Block Blob storage to store the data. Since the requirement is to keep the data separate for each tenant, we came up with the following approach (a brief sketch follows the list):
Create a separate storage account for each tenant
That helps to maintain the usage tenant-wise, which in turn helps with billing as well
Create a separate container in each storage account to segregate based on category
Store documents in block blob storage along with the metadata.
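To make the approach concrete, here is a minimal sketch (azure-storage-blob Python package) of storing a document as a block blob with metadata; the tenant connection string, container name, and metadata keys are illustrative placeholders:

    from azure.storage.blob import BlobServiceClient

    # Hypothetical: each tenant has its own storage account, hence its own connection string.
    conn_str = "<tenant-storage-account-connection-string>"
    service = BlobServiceClient.from_connection_string(conn_str)

    # One container per category, e.g. 'contracts'.
    container = service.get_container_client("contracts")
    with open("agreement.pdf", "rb") as data:
        container.upload_blob(
            name="agreement.pdf",
            data=data,
            metadata={"author": "alice", "department": "legal"},  # per-blob metadata
        )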
We have the following queries with respect to our approach:
Is it a good idea to store documents or binary content in block blobs along with the metadata? Or is there any better way of achieving it (probably using SQL Azure for metadata and blobs for content, or better)?
How do we query the data with some filter condition on metadata, i.e. retrieve all blobs where metadata1 = value1 and metadata2 = value2?
Is it a good idea to create a separate storage account for each tenant?
a. If not, then what would be the model through which we can store tenant-specific data in Azure storage so that the application can use it efficiently?
Is there a bandwidth or any other limitation on the number of requests to read/write data on Blob storage, in the context of scalability and high availability?
As per the Azure pricing model, storage is charged in slabs, i.e. the first 1 TB at $0.095/GB and the next 49 TB at $0.08/GB. Are these charges applicable per storage account or per subscription?
a. Similarly, is the transaction cost applicable per storage account or per subscription?
Is it a good idea to store documents or binary content in block blobs along with the metadata? Or is there any better way of achieving it (probably using SQL Azure for metadata and blobs for content, or better)?
How do we query the data with some filter condition on metadata, i.e. retrieve all blobs where metadata1 = value1 and metadata2 = value2?
To answer 1 and 2: you can't query on metadata in blob storage. So I guess your best option would be to use SQL Azure or Azure Table Storage, as both of them have querying capabilities. Given that you'll be storing a huge number of blobs (and thus even more metadata), I'm more inclined towards table storage, but that would require special design considerations like proper partitioning.
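As a concrete illustration of the table-storage option, here is a minimal sketch (azure-data-tables Python package) that keeps one metadata entity per blob, partitioned by tenant so queries stay within a single partition; the table name, keys, and properties are hypothetical:

    from azure.data.tables import TableServiceClient

    conn_str = "<storage-account-connection-string>"  # placeholder
    service = TableServiceClient.from_connection_string(conn_str)
    table = service.create_table_if_not_exists("BlobMetadata")

    # One entity per blob; PartitionKey scopes queries to a single tenant.
    table.upsert_entity({
        "PartitionKey": "tenant-001",
        "RowKey": "doc-000123",                 # unique id (RowKey disallows '/')
        "blobName": "contracts/agreement.pdf",  # where the content actually lives
        "category": "contract",
        "author": "alice",
    })

    # Retrieve all blobs where category = 'contract' and author = 'alice'.
    results = table.query_entities(
        "PartitionKey eq @tenant and category eq @cat and author eq @author",
        parameters={"tenant": "tenant-001", "cat": "contract", "author": "alice"},
    )
    for entity in results:
        print(entity["blobName"])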
Is it a good idea to create a separate storage account for each tenant?
a. If not, then what would be the model through which we can store tenant-specific data in Azure storage so that the application can use it efficiently?
I can think of 3 reasons why having a separate storage account per tenant is a good idea:
It simplifies your billing.
It will help you maintain scalability targets.
Since you mentioned that each tenant can potentially store up to 1 TB of data, given the current storage account limit of 200 TB, you can only maintain a maximum of 200 tenants per storage account. After that you would need to find another storage account and start storing the data there.
All in all, keeping a separate storage account per tenant is the more elegant solution. The challenge would be getting the default limit of 20 storage accounts per subscription increased; you would need to chat with support for that.
Is there a bandwidth or any other limitation on the number of requests to read/write data on Blob storage, in the context of scalability and high availability?
Yes. Please read the scalability targets post from the Windows Azure Blob Storage team: http://blogs.msdn.com/b/windowsazurestorage/archive/2010/05/10/windows-azure-storage-abstractions-and-their-scalability-targets.aspx
As per the Azure pricing model, storage is charged in slabs, i.e. the first 1 TB at $0.095/GB and the next 49 TB at $0.08/GB. Are these charges applicable per storage account or per subscription?
a. Similarly, is the transaction cost applicable per storage account or per subscription?
Not sure about this, but I am guessing it's per storage account. You may want to contact support about this.
Hope this helps.

Azure storage pricing: charged for 30 GB?

Hi people, I don't understand how Azure storage came to be charged for around 34 GB in my subscription. We haven't used that much storage space.
I heard there is a Quest tool, Azure Storage Explorer. How useful is that?
Many thanks.
Are you using Virtual Machines? If that's the case, you have to know that persisted disks are stored as page blobs in your storage account, and you're charged for that. The pricing details page explains why:
Compute hours do not include any Windows Azure Storage costs associated with the image running in Windows Azure Virtual Machines. These costs are billed separately. For a full description of how compute hours are calculated, please refer to the Cloud Services section.
If you want to know more details on how much data you've used per storage account/day/location/..., I suggest you take a look at the subscriptions page. After choosing a subscription, you can export a detailed CSV file that you can analyse.
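If you want to see exactly which page blobs (the VM disks) are occupying the space, here is a minimal sketch with the azure-storage-blob Python package; the connection string is a placeholder, and note that a page blob's reported size is its provisioned maximum, while billing is based on the pages actually written:

    from azure.storage.blob import BlobServiceClient

    conn_str = "<storage-account-connection-string>"  # placeholder
    service = BlobServiceClient.from_connection_string(conn_str)

    # VM disks are page blobs (classically kept in a 'vhds' container).
    for container in service.list_containers():
        container_client = service.get_container_client(container.name)
        for blob in container_client.list_blobs():
            if blob.blob_type == "PageBlob":
                print(f"{container.name}/{blob.name}: {blob.size / 1024**3:.1f} GiB")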

Azure storage metrics data

I am trying to implement Azure storage metrics code in my role, but I am checking whether there is an easy way to get Azure storage metrics data about my file usage. My code is stable and I do not want to change it again.
Actually, if you already have a Windows Azure role running, you don't need to make any changes to your code and you can still get Windows Azure Blob storage metrics data.
I have written a blog post about it: Collecting Windows Azure Storage REST API level metrics data without a single line of programming, just by using tools.
Please try the above and see if it works for you.
Storage analytics is disabled by default, so any operations against your storage up until now have not been logged for analysis.
You may choose to enable analytics at any time, for both logging (detailed access information for every single object) and metrics (hourly rollups). Further, you may choose which specific storage service to track (blobs, tables, queues) and which operations to track (read, write, delete). Once analytics are enabled, you may access the resulting analytics data from any app (as long as you have the storage account name + key).
Persistent Systems just published a blog post on enabling storage analytics for Java apps. The same principles may be applied to a .NET app (and the SDKs are very similar).
Additionally, Full Scale 180 published a sample app encapsulating storage analytics (based on the REST API, as it was written before SDK v1.6 came out).
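For completeness, enabling the metrics side of storage analytics needs no role changes either; here is a minimal sketch with the azure-storage-blob Python package (the connection string and the 7-day retention are placeholders):

    from azure.storage.blob import BlobServiceClient, Metrics, RetentionPolicy

    conn_str = "<storage-account-connection-string>"  # placeholder
    service = BlobServiceClient.from_connection_string(conn_str)

    # Turn on hourly metrics (service- and API-level rollups), kept for 7 days.
    service.set_service_properties(
        hour_metrics=Metrics(
            enabled=True,
            include_apis=True,
            retention_policy=RetentionPolicy(enabled=True, days=7),
        )
    )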
