Limiting initial size of an Azure Storage container

I would like to know if there's a way to create an Azure Storage container of a specific size, say, 20 GB. I know it can be created without any restriction (I think up to 200 TB?), but can it be created with a specific size? What if I need that kind of setup, like giving a user 20 GB initially and then at a later time increasing it to, say, 50 GB? Is that possible?
In other words, how do I create that boundary/limitation for a new user who signs up for my app?

This is not possible with the service by itself; it has to be a feature implemented in your app.

As mentioned in the other answer, this is not possible with Blob Storage at the service level, and you will have to implement your own logic to calculate the size of the blob container. A minimal sketch is shown below.
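For illustration, here is a rough sketch of such app-side logic using the Azure Blob Storage SDK for Java. The 20 GB quota, the per-user container name, and the AZURE_STORAGE_CONNECTION_STRING environment variable are all assumptions for this example:

```java
import com.azure.storage.blob.BlobContainerClient;
import com.azure.storage.blob.BlobContainerClientBuilder;
import com.azure.storage.blob.models.BlobItem;

public class ContainerQuotaCheck {
    // Hypothetical per-user quota enforced by the app, not by the service.
    private static final long QUOTA_BYTES = 20L * 1024 * 1024 * 1024; // 20 GB

    // Sum the sizes of all blobs currently in the container.
    static long containerSizeInBytes(BlobContainerClient container) {
        long total = 0;
        for (BlobItem blob : container.listBlobs()) {
            Long size = blob.getProperties().getContentLength();
            if (size != null) {
                total += size;
            }
        }
        return total;
    }

    // Gatekeeper your upload path would call before accepting new data.
    static boolean canUpload(BlobContainerClient container, long uploadSize) {
        return containerSizeInBytes(container) + uploadSize <= QUOTA_BYTES;
    }

    public static void main(String[] args) {
        BlobContainerClient container = new BlobContainerClientBuilder()
                .connectionString(System.getenv("AZURE_STORAGE_CONNECTION_STRING"))
                .containerName("user-12345") // hypothetical per-user container
                .buildClient();

        System.out.println("Used bytes: " + containerSizeInBytes(container));
        System.out.println("Room for 1 MB more: " + canUpload(container, 1024 * 1024));
    }
}
```

Note that listing every blob on each upload becomes slow for large containers, so in practice you would likely keep a running total in your own database and reconcile it periodically.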
If restricting container size is the most important feature you are after, you may want to look at Azure File Storage. The equivalent of a blob container there is a file share, and you can set a quota for a file share and change it dynamically. The quota of a file share can be any value between 1 GB and 5 TB (100 TB in the case of a Premium File Storage account) at the time of writing this answer.
Azure File Storage and Blob Storage are somewhat similar, but they are meant to serve different purposes. For simple object storage, however, you can use either of the two (File Storage is more expensive than Blob Storage, though).
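As a sketch of the question's 20 GB to 50 GB scenario with the File Storage SDK for Java, using the setQuota method exposed by ShareClient (the per-user share name and the connection-string variable are assumptions):

```java
import com.azure.storage.file.share.ShareClient;
import com.azure.storage.file.share.ShareClientBuilder;

public class ShareQuotaExample {
    public static void main(String[] args) {
        ShareClient share = new ShareClientBuilder()
                .connectionString(System.getenv("AZURE_STORAGE_CONNECTION_STRING"))
                .shareName("user-12345") // hypothetical per-user share
                .buildClient();

        share.create();     // create the user's share...
        share.setQuota(20); // ...and cap it at 20 GB

        // Later, when the user upgrades their plan, raise the cap to 50 GB.
        share.setQuota(50);
    }
}
```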

Related

Can I use an Azure Function app storage account for other purposes, like storing files in blob storage?

Can I use an Azure Function app storage account for other purposes, like storing files in blob storage? If yes, will it be in line with Microsoft guidelines, and will it cause any performance issues, especially when the size of the blob storage grows to GBs?
I am close to production, so please share any suggestions, best practices, or solutions as soon as possible.
Can I use an Azure Function app storage account for other purposes, like storing files in blob storage?
Yes, you can.
If yes, will it be in line with Microsoft guidelines, and will it cause any performance issues, especially when the size of the blob storage grows to GBs?
It depends. Each Azure Storage account has some pre-defined throughput limits. As long as you stay within those limits, you should be fine.
Having said this, you should ideally use a separate storage account. Since creating a storage account doesn't cost you anything until you perform transactions against it, you may be better off creating a separate account to store the data required by your application.
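If you do split the accounts, the idea is simply to leave the Functions runtime on its built-in AzureWebJobsStorage setting and point your application data at a second connection string. A minimal sketch in Java, where the APP_DATA_STORAGE_CONNECTION_STRING app setting and the container name are assumptions:

```java
import com.azure.storage.blob.BlobContainerClient;
import com.azure.storage.blob.BlobServiceClient;
import com.azure.storage.blob.BlobServiceClientBuilder;

public class AppStorageClient {
    public static void main(String[] args) {
        // Leave AzureWebJobsStorage to the Functions runtime and point
        // application data at its own account via a dedicated app setting.
        BlobServiceClient appData = new BlobServiceClientBuilder()
                .connectionString(System.getenv("APP_DATA_STORAGE_CONNECTION_STRING"))
                .buildClient();

        BlobContainerClient uploads = appData.getBlobContainerClient("uploads"); // hypothetical container
        if (!uploads.exists()) {
            uploads.create();
        }
    }
}
```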

How to limit or change the size of Azure BLOB container?

Is there any way to change the size of an Azure blob container? For example, initially, I will set the limit to X. Then, using the Java SDK, I want to change that limit.
Is such an implementation possible? If not, what are the alternatives?
It is not possible to set a quota for a blob container; it can grow up to the maximum size of the storage account.
If it's an option for you, take a look at Azure File Service. The equivalent of a blob container in Blob Storage is a share in File Storage, and you can define a quota for a share as well as change it later. The methods you would want to use there are setQuota or setQuotaWithResponse, available on ShareClient.
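A minimal sketch of changing a share's quota with the Java SDK's ShareClient, as mentioned above (the share name, the quota value of 100 GB, and the connection-string variable are assumptions):

```java
import java.time.Duration;
import com.azure.core.util.Context;
import com.azure.storage.file.share.ShareClient;
import com.azure.storage.file.share.ShareClientBuilder;

public class ChangeShareQuota {
    public static void main(String[] args) {
        ShareClient share = new ShareClientBuilder()
                .connectionString(System.getenv("AZURE_STORAGE_CONNECTION_STRING"))
                .shareName("data") // hypothetical share name
                .buildClient();

        // Change the quota (in GB); the WithResponse variant also exposes the raw HTTP response.
        share.setQuotaWithResponse(100, Duration.ofSeconds(30), Context.NONE);

        // Read the quota back to confirm the change.
        System.out.println("Quota (GB): " + share.getProperties().getQuota());
    }
}
```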

Can we attach Azure Premium storage to multiple VMs?

Is it possible to attach the same premium storage to multiple VMs so that the files stored in the storage can be accessed from all of them?
The idea is to have a CPU-optimized VM that will calculate something and write results to the storage, and a low-cost VM that will read the results and do other operations.
So if by "same" you mean the same storage account: yes, you can do that. If by "same" you mean the same VHD: no, you can't simultaneously attach the same VHD to different VMs.
But you can have Azure Files take on that role; it works like an SMB share where you can store the results and the other nodes can read them. Or you could just create a share on the VM that is supposed to read the results and store the results there.
Either way, it's perfectly doable.
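For the results hand-off via Azure Files, a rough sketch with the Java SDK (the share name, file name, and connection-string variable are all hypothetical, and the share is assumed to already exist); you could equally mount the share over SMB and use plain file I/O:

```java
import com.azure.storage.file.share.ShareClient;
import com.azure.storage.file.share.ShareClientBuilder;
import com.azure.storage.file.share.ShareFileClient;

public class SharedResults {
    private static ShareFileClient resultsFile() {
        ShareClient share = new ShareClientBuilder()
                .connectionString(System.getenv("AZURE_STORAGE_CONNECTION_STRING"))
                .shareName("results") // hypothetical, pre-existing share
                .buildClient();
        return share.getRootDirectoryClient().getFileClient("output.csv");
    }

    // On the CPU-optimized VM: publish the computed results.
    static void publish(String localPath, long sizeInBytes) {
        ShareFileClient file = resultsFile();
        file.create(sizeInBytes); // the service needs the file size up front
        file.uploadFromFile(localPath);
    }

    // On the low-cost VM: fetch the results for further processing.
    static void fetch(String localPath) {
        resultsFile().downloadToFile(localPath);
    }
}
```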

How to archive Azure blob storage content?

I need to store some temporary files for maybe 1 to 3 months. Only the last three months' files need to be kept; older files need to be deleted. How can I do this in Azure Blob Storage? Is there any other option in this case besides Blob Storage?
IMHO the best options to store files in Azure are either Blob Storage or File Storage; however, neither of them supports auto-expiration of content (based on age or some other criteria).
This feature was requested for Blob Storage long ago, but unfortunately no progress has been made so far (https://feedback.azure.com/forums/217298-storage/suggestions/7010724-support-expiration-auto-deletion-of-blobs).
You could, however, write something of your own to achieve this. It's rather simple: periodically (say, once a day) your program fetches the list of blobs and compares each blob's last-modified date with the current date. If the last-modified date is older than the desired period (1 or 3 months, as you mentioned), you simply delete the blob; see the sketch below.
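A minimal sketch of that cleanup logic with the Azure Blob Storage SDK for Java (the container name and connection-string variable are assumptions):

```java
import java.time.OffsetDateTime;
import com.azure.storage.blob.BlobContainerClient;
import com.azure.storage.blob.BlobContainerClientBuilder;
import com.azure.storage.blob.models.BlobItem;

public class OldBlobCleanup {
    public static void main(String[] args) {
        BlobContainerClient container = new BlobContainerClientBuilder()
                .connectionString(System.getenv("AZURE_STORAGE_CONNECTION_STRING"))
                .containerName("temp-files") // hypothetical container name
                .buildClient();

        // Anything not modified in the last three months is fair game.
        OffsetDateTime cutoff = OffsetDateTime.now().minusMonths(3);

        for (BlobItem blob : container.listBlobs()) {
            OffsetDateTime lastModified = blob.getProperties().getLastModified();
            if (lastModified != null && lastModified.isBefore(cutoff)) {
                container.getBlobClient(blob.getName()).delete();
                System.out.println("Deleted: " + blob.getName());
            }
        }
    }
}
```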
You can use WebJobs, Azure Functions, or Azure Automation to schedule your code to run on a periodic basis. In fact, there's ready-made code available to you if you want to use the Azure Automation service: https://gallery.technet.microsoft.com/scriptcenter/Remove-Storage-Blobs-that-aae4b761.
As far as I know, Azure Blob Storage is an appropriate approach for storing temporary files. For your scenario, since there is no built-in option to delete old files, you need to delete your temporary files programmatically or manually.
As a simple approach, you could upload your blobs (files) using a date-based naming format (e.g. https://<your-storagename>.blob.core.windows.net/containerName/2016-11/fileName or https://<your-storagename>.blob.core.windows.net/2016-11/fileName), then manually manage your files via Microsoft Azure Storage Explorer.
Also, you could check your files and delete the old ones before uploading a new temporary file. For more details, you could follow storage-blob-dotnet-store-temp-files and override the method CleanStorageIfReachLimit to implement your logic for deleting blobs (files).
Additionally, you could leverage a scheduled Azure WebJob to clean up your blobs (files).
You can use Azure Cool Blob Storage.
It is cheaper than the Hot tier of Blob Storage and is more suitable for archives.
You can store your less frequently accessed data in the Cool access tier at a low storage cost (as low as $0.01 per GB in some regions), and your more frequently accessed data in the Hot access tier at a lower access cost.
Here is a document that explains its features:
https://azure.microsoft.com/en-us/blog/introducing-azure-cool-storage/
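If you go this route, moving an existing blob to the Cool tier is a one-liner with the Java SDK; a sketch, where the container and blob names are hypothetical:

```java
import com.azure.storage.blob.BlobClient;
import com.azure.storage.blob.BlobClientBuilder;
import com.azure.storage.blob.models.AccessTier;

public class MoveBlobToCool {
    public static void main(String[] args) {
        BlobClient blob = new BlobClientBuilder()
                .connectionString(System.getenv("AZURE_STORAGE_CONNECTION_STRING"))
                .containerName("temp-files")    // hypothetical container
                .blobName("2016-11/report.csv") // hypothetical blob
                .buildClient();

        // Move an infrequently accessed blob to the Cool tier to cut storage cost.
        blob.setAccessTier(AccessTier.COOL);
    }
}
```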

How can I programmatically find how much Azure Storage space I have consumed so far?

There's a volume limit per Azure Storage account of 200 TB (two hundred terabytes). This sounds really large, but if you store files in blob storage at 25 megabytes each, you can have about eight million of them stored, which is nice but not something impossible to exhaust.
I want to craft some code that would periodically check how much space I've used and raise an alert.
How can I programmatically find how much space I have already consumed in my storage account?
It looks like the current limit for an Azure Storage account is 500 TB.
If your Azure Storage account holds only blobs, you can use metrics to fetch the current capacity; note that the current metrics only show the capacity taken up by blobs. See the Storage Metrics documentation and how to enable Storage Analytics.
Maybe this would help you: http://www.amido.com/richard-slater/windows-azure-storage-capacity-metrics-with-powershell/
I'm not sure about this, but it looks like you can also create an alert on this metric in the Azure portal.
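If metrics don't fit your setup, the brute-force alternative is to walk all containers and sum the blob sizes yourself. A sketch in Java, assuming the AZURE_STORAGE_CONNECTION_STRING variable and a hypothetical alert threshold of 80% of a 500 TB account limit:

```java
import com.azure.storage.blob.BlobContainerClient;
import com.azure.storage.blob.BlobServiceClient;
import com.azure.storage.blob.BlobServiceClientBuilder;
import com.azure.storage.blob.models.BlobContainerItem;
import com.azure.storage.blob.models.BlobItem;

public class AccountCapacityCheck {
    public static void main(String[] args) {
        BlobServiceClient service = new BlobServiceClientBuilder()
                .connectionString(System.getenv("AZURE_STORAGE_CONNECTION_STRING"))
                .buildClient();

        long totalBytes = 0;
        // Walk every container and sum the size of every blob.
        for (BlobContainerItem item : service.listBlobContainers()) {
            BlobContainerClient container = service.getBlobContainerClient(item.getName());
            for (BlobItem blob : container.listBlobs()) {
                Long size = blob.getProperties().getContentLength();
                if (size != null) {
                    totalBytes += size;
                }
            }
        }

        double usedTb = totalBytes / Math.pow(1024, 4);
        System.out.printf("Consumed: %d bytes (%.4f TB)%n", totalBytes, usedTb);
        // Hypothetical alert: warn at 80% of a 500 TB account limit.
        if (usedTb > 500 * 0.8) {
            System.out.println("WARNING: approaching the account capacity limit");
        }
    }
}
```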
