I've spent quite some time trying to find a way to do a storage audit on our subscription.
Currently we have a mixture of unmanaged disks in storage accounts (premium and standard), managed disks (premium/standard), and storage accounts with assorted blob storage in them.
From a cost perspective, I can use Cost Management to see how much "storage" is costing us on a monthly basis, but I've recently been asked about getting the total size of our storage broken down by type.
e.g. What is the total capacity of storage for:
+ Storage accounts
+ Managed Disks
++ Total storage for Premium managed disks vs standard managed disks
I just can't seem to find anything that will help me collate that data. Has anyone done something similar and can you point me in the right direction please?
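One answer-style approach: pull the disk inventory first (for example with `az disk list` or the Azure SDK), then aggregate it yourself. Below is a minimal sketch of the aggregation step only; the disk records are hypothetical sample data standing in for whatever your inventory export returns.

```python
from collections import defaultdict

# Hypothetical inventory records -- in practice you would populate these
# from `az disk list` output or the Azure management SDK.
disks = [
    {"name": "vm1-os",   "sku": "Premium_LRS",  "size_gb": 128},
    {"name": "vm2-os",   "sku": "Standard_LRS", "size_gb": 256},
    {"name": "vm2-data", "sku": "Premium_LRS",  "size_gb": 512},
]

def total_by_sku(disks):
    """Sum provisioned disk capacity (GB) per SKU tier."""
    totals = defaultdict(int)
    for d in disks:
        tier = "Premium" if d["sku"].startswith("Premium") else "Standard"
        totals[tier] += d["size_gb"]
    return dict(totals)

print(total_by_sku(disks))  # {'Premium': 640, 'Standard': 256}
```

The same pattern extends to unmanaged disks and blob containers once you have their sizes in a flat list.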
Related
I have Azure Storage files (blobs) in various states of Hot, Cool, and Archive. However, I have no idea how many files are in each tier or how much space they are taking up or the cost per tier.
Does anyone know how I can get this information either programmatically or through the Azure Portal? So far the portal only shows monthly totals for all files, size, and cost.
Thanks,
Nick
If you go to the Metrics blade you can use the following selection for a metric:
Metric Namespace: Blob
Metric: Blob Capacity or Blob Count
Then you can add a filter on the Blob Tier. That should give you good insight into the total space used and the number of blobs in each tier. Using the Azure Calculator you can then easily calculate what the storage costs.
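If you'd rather compute the per-tier breakdown yourself (for example from a blob listing, where each blob exposes its access tier and size), the aggregation is straightforward. The `(tier, size)` pairs below are hypothetical sample data standing in for a real listing.

```python
from collections import defaultdict

# Hypothetical (tier, size-in-bytes) pairs -- in practice you would build
# these from a blob listing that includes each blob's access tier and size.
blobs = [
    ("Hot", 1_000_000),
    ("Cool", 250_000),
    ("Hot", 4_000_000),
    ("Archive", 9_000_000),
]

def capacity_per_tier(blobs):
    """Return per-tier blob count and total bytes."""
    stats = defaultdict(lambda: {"count": 0, "bytes": 0})
    for tier, size in blobs:
        stats[tier]["count"] += 1
        stats[tier]["bytes"] += size
    return dict(stats)

for tier, s in capacity_per_tier(blobs).items():
    print(f"{tier}: {s['count']} blobs, {s['bytes']} bytes")
```

Multiply each tier's byte total by that tier's per-GB rate to estimate the cost split.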
Is there any way of determining the used and/or remaining capacity of an Azure Storage account? I know that the current size limit is 100TB per storage account and I'm looking for either number of bytes used/remaining or, failing that, percentage used/remaining. However I can't see any way of monitoring this even in the Azure portal, let alone doing it programmatically via the API.
You have to enable Storage Analytics, then read about Storage Account Monitoring.
Finally, take a look at the Analytics Metrics table(s). Note that it takes a minimum of 15 minutes until metrics are updated.
And btw, Azure Storage account limit is 500 TB as per Azure Subscription and Service Limits.
UPDATE
After reconsidering, the only way you can get the full storage capacity report is via the Account Billing page. Simply click on your name at the top right (management portal v.Current, a.k.a. https://manage.windowsazure.com/), then choose the "View My Bill" option.
This page is updated on a daily basis.
There is no API or other programmatic way to get these statistics.
Now it's possible to get it in Azure Monitor. In the Azure Portal navigate to All services -> Monitor, click Explore Metrics and select your storage account. There are several useful metrics, and the Capacity metric is among them.
If you open the storage account in the Azure Portal, there is a Metrics tab (in the menu on the left, not a conventional tab) which will provide you with graphs of account utilisation, including used capacity.
This tab works even without enabling Storage Analytics or Azure Monitor.
Storage Metrics only stores capacity metrics for the blob service because blobs typically account for the largest proportion of stored data (at the time of writing, it is not possible to use Storage Metrics to monitor the capacity of your tables and queues). You can find this data in the $MetricsCapacityBlob table if you have enabled monitoring for the Blob service. Take a look at this Monitoring Capacity document for information on how to monitor the capacity of your storage accounts.
For help estimating the size of various storage objects such as blobs, see the blog post Understanding Azure Storage Billing – Bandwidth, Transactions, and Capacity.
Note that Storage does have APIs for accessing metric data programmatically using the CloudAnalyticsClient API. See CloudAnalyticsAPI for a summary.
The volume limit per Azure Storage Account is 200 TB (two hundred terabytes). This sounds very large, but if you store files in blob storage at 25 megabytes each you can have about eight million of them, which is nice but not impossible to exhaust.
I want to craft some code that would periodically check how much space I've used and raise an alert.
How can I programmatically find how much space I have already consumed in my storage account?
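Once you can fetch the used-capacity figure (from the analytics metrics described in the answers below, however you retrieve it), the periodic check itself is trivial. A minimal sketch, assuming a 500 TB account limit and an 80% alert threshold (the threshold is my own arbitrary choice):

```python
ACCOUNT_LIMIT_BYTES = 500 * 10**12  # 500 TB account limit
ALERT_THRESHOLD = 0.80              # alert at 80% utilisation (assumption)

def check_capacity(used_bytes, limit_bytes=ACCOUNT_LIMIT_BYTES,
                   threshold=ALERT_THRESHOLD):
    """Return (utilisation_fraction, should_alert)."""
    utilisation = used_bytes / limit_bytes
    return utilisation, utilisation >= threshold

# Example: 450 TB used out of 500 TB -> 90% utilisation, alert fires.
util, alert = check_capacity(450 * 10**12)
print(f"{util:.0%} used, alert={alert}")  # 90% used, alert=True
```

Schedule this with any timer (cron, a worker role, etc.) and wire `should_alert` to your notification channel of choice.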
It looks like the current limit for an Azure Storage Account is 500 TB (see here).
If your Azure Storage Account contains only blobs, you can use metrics to fetch the current capacity, but the current metrics only show the capacity taken by blobs. See the Storage Metrics documentation and how to enable Storage Analytics.
Maybe this would help you: http://www.amido.com/richard-slater/windows-azure-storage-capacity-metrics-with-powershell/
Not sure about that, but it looks like you can create an alert in the Azure portal on this metric.
Azure storage for a multi-tenant application. We are working on developing a multi-tenant application on Azure, with approximately 10,000 tenants and approximately 100 GB to 1 TB of data storage required per tenant. The application is to maintain the documents and binary content, along with the metadata, for each tenant separately. We are leaning towards Azure Block Blob storage to store the data. Since the requirement is to maintain the data separately for each tenant, we came up with the following approach.
Create a separate storage account for each tenant
That helps to maintain the usage tenant-wise, which again helps with billing as well
Create a separate container in each storage account to segregate based on category
Store documents in block blob storage along with the metadata.
We have the following queries with respect to our approach:
Is it a good idea to store documents or binary content in block blobs along with the metadata? Or is there any better way of achieving it (probably using SQL Azure for metadata and blobs for content, or better)?
How do we query the data with some filter condition on metadata? I.e., retrieve all blobs where metadata1 = value1 and metadata2 = value2.
Is it a good idea to create a separate storage account for each tenant?
a. If not, then what would be the model through which we can store tenant-specific data in Azure storage so the application can efficiently use it?
Is there a bandwidth or any other limitation on the number of requests to read/write data in Blob storage, in the context of scalability and high availability?
As per the Azure pricing model, storage is charged in slabs, i.e. the first 1 TB at $0.095/GB, the next 49 TB at $0.08/GB. Are these charges applicable per storage account or per subscription?
a. Similarly, is the transaction cost applicable per storage account or per subscription?
Is it a good idea to store documents or binary content in block blobs along with the metadata? Or is there any better way of achieving it (probably using SQL Azure for metadata and blobs for content, or better)?
How do we query the data with some filter condition on metadata? I.e., retrieve all blobs where metadata1 = value1 and metadata2 = value2.
To answer 1 and 2: you can't query on metadata in blob storage, so I guess your best option would be to use SQL Azure or Azure Table Storage, as both of them have querying capabilities. Given that you'll be storing a huge number of blobs (and thus even more metadata), I'm more inclined towards table storage, but that would require special design considerations like proper partitioning.
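To make the point concrete: whatever store holds the metadata, the query the question asks for is just an AND over key/value pairs. A minimal sketch with a hypothetical in-memory metadata index (in a real system this would be a SQL or Table Storage query instead):

```python
# Hypothetical metadata index: blob name -> metadata dict. In practice this
# would live in SQL Azure or Azure Table Storage, since Blob storage itself
# cannot be queried by metadata.
index = {
    "doc1.pdf": {"metadata1": "value1", "metadata2": "value2"},
    "doc2.pdf": {"metadata1": "value1", "metadata2": "other"},
    "doc3.pdf": {"metadata1": "value1", "metadata2": "value2"},
}

def find_blobs(index, **criteria):
    """Return blob names whose metadata matches every key=value criterion."""
    return [name for name, meta in index.items()
            if all(meta.get(k) == v for k, v in criteria.items())]

print(find_blobs(index, metadata1="value1", metadata2="value2"))
# ['doc1.pdf', 'doc3.pdf']
```

The matched names are then used as keys to fetch the actual blobs from blob storage.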
Is it a good idea to create a separate storage account for each tenant?
a. If not, then what would be the model through which we can store tenant-specific data in Azure storage so the application can efficiently use it?
I can think of 3 reasons why having a separate storage account per tenant is a good idea:
It simplifies your billing.
It will help you stay within scalability targets.
Since you mentioned that each tenant can potentially store up to 1 TB of data, given the current storage account limit of 200 TB, you can only maintain a maximum of 200 tenants per storage account. After that you would need to create another storage account and start storing data there.
All in all, a separate storage account per tenant is the more elegant solution. The challenge would be getting the default limit of 20 storage accounts per subscription increased; you would need to contact support for that.
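The back-of-the-envelope arithmetic behind that answer, using the limits quoted above (200 TB per account, 1 TB worst case per tenant, 10,000 tenants, 20 accounts per subscription by default):

```python
ACCOUNT_LIMIT_TB = 200   # per-account capacity limit cited above
PER_TENANT_TB = 1        # worst-case data per tenant
TENANTS = 10_000         # tenant count from the question
DEFAULT_ACCOUNTS = 20    # default storage accounts per subscription

# If tenants shared accounts, this many worst-case tenants fit in one:
tenants_per_account = ACCOUNT_LIMIT_TB // PER_TENANT_TB
print(tenants_per_account)  # 200

# With one account per tenant, 10,000 tenants need 10,000 accounts,
# far beyond the 20-account default -- hence the support request.
accounts_needed = TENANTS
print(accounts_needed > DEFAULT_ACCOUNTS)  # True
```

Either model requires a limit increase well before reaching 10,000 tenants.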
Is there a bandwidth or any other limitation on the number of requests to read/write data in Blob storage, in the context of scalability and high availability?
Yes. Please read the scalability targets post from the Windows Azure Blob Storage team: http://blogs.msdn.com/b/windowsazurestorage/archive/2010/05/10/windows-azure-storage-abstractions-and-their-scalability-targets.aspx
As per the Azure pricing model, storage is charged in slabs, i.e. the first 1 TB at $0.095/GB, the next 49 TB at $0.08/GB. Are these charges applicable per storage account or per subscription?
a. Similarly, is the transaction cost applicable per storage account or per subscription?
Not sure about this, but I am guessing it's per storage account. You may want to contact support about this.
Hope this helps.
Hi all, I don't understand how Azure storage is charged for around 34 GB in my subscription. We haven't used that much storage space.
I heard there is a Quest tool, Azure Storage Explorer. How useful is that?
Many Thanks.
Are you using Virtual Machines? If that's the case, you have to know that persisted disks are stored as page blobs in your storage account, and you're charged for that. The pricing details page explains why:
Compute hours do not include any Windows Azure Storage costs associated with the image running in Windows Azure Virtual Machines. These costs are billed separately. For a full description of how compute hours are calculated, please refer to the Cloud Services section.
If you want to know more details on how much data you've used per storage account/day/location/..., I suggest you take a look at the subscriptions page. After choosing a subscription you can export a detailed CSV file you can analyse.