How to prevent large file uploads to Azure via Put Blob

Because of bandwidth issues, I'm currently sending my files directly to Azure Blob Storage.
But I don't want anyone with the access token to be able to upload files that are too large. According to the Microsoft documentation:
Note that the storage emulator only supports blob sizes up to 2 GB.
Is there any way I can prevent users from uploading files larger than 10 MB to my blob storage via Azure settings or policies?

Please note that the limit you quoted applies only to the storage emulator, which is a local virtual storage service.
For the real Azure Blob Storage limits, please refer to this:
https://learn.microsoft.com/en-us/azure/azure-resource-manager/management/azure-subscription-service-limits#azure-blob-storage-limits
As for your requirement: Azure Storage has no setting or policy that lets you cap the size of an uploaded blob; you can only enforce the limit in your own code, for example with a cleanup job like the one sketched below.
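As an illustration, here is a minimal sketch of such a cleanup job, assuming the classic Microsoft.WindowsAzure.Storage C# SDK and a hypothetical container named "uploads". Since the service itself will not reject an oversized Put Blob, the job inspects each blob after upload and deletes anything over the cap:

using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

const long maxBytes = 10 * 1024 * 1024; // 10 MB cap

// 'connectionString' is assumed to come from your configuration.
CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
CloudBlobContainer container = account.CreateCloudBlobClient()
                                      .GetContainerReference("uploads"); // hypothetical name

foreach (IListBlobItem item in container.ListBlobs(useFlatBlobListing: true))
{
    if (item is CloudBlockBlob blob)
    {
        blob.FetchAttributes();                // populate blob.Properties
        if (blob.Properties.Length > maxBytes)
            blob.DeleteIfExists();             // enforce the 10 MB limit
    }
}

Running this on a timer (or from an Event Grid-triggered function) limits how long an oversized blob can linger, but it cannot stop the upload itself; only a proxy endpoint in front of storage can do that.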

Related

File sync from network folder to Azure blob without using the command line (as in AzCopy)

I want to sync my files from a network folder to an Azure blob, but without using AzCopy. Basically, I don't want a command line to be part of the process; it should perform a simple, direct sync from the folder to the Azure blob (if one file updates or is deleted, the other storage should also update automatically). Is it possible to do that?
• You can use the 'Azure Storage Explorer' software in this case. Before using Azure Storage Explorer, I would recommend making sure that the RBAC assignments are correctly granted to the Azure AD members who will be accessing it for this purpose.
The snapshots below demonstrate blob files syncing through the Azure Storage Explorer software:
[Screenshots: Storage Explorer and the Azure Blob container after uploading files, and the same views after deleting a file in Azure Storage Explorer.]
Thus, as you can see from the images above, a file uploaded from the local system through Azure Storage Explorer appears in the blob container in Azure, and when you delete a file from Azure Blob Storage, the change is automatically reflected in the Azure Storage Explorer application as well.
For more details on configuring the Azure storage account in Storage Explorer, refer to the documentation below:
https://learn.microsoft.com/en-us/azure/vs-azure-tools-storage-manage-with-storage-explorer?tabs=windows

Detailed logging in an Azure storage account?

Every 70 minutes or so, something downloads a few hundred megabytes of data from one of my Azure storage accounts. Is there any way to figure out what the cause is?
All the logs and statistics I've looked at only give me aggregate graphs, which tell me that data has been downloaded, but not what was downloaded or by whom.
You can find this information in the Storage Analytics data, specifically the logging data contained in the $logs blob container of the storage account in question.
You can use Microsoft Storage Explorer (or any other storage explorer) to browse the contents of this container and download the appropriate log blobs. The operation to look for is GetBlob; that is the request sent to Azure Storage when a blob is downloaded.
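For illustration, a minimal sketch assuming the classic Microsoft.WindowsAzure.Storage C# SDK: it scans the $logs container and prints the GetBlob entries (in the Storage Analytics log format, the operation type is the third semicolon-delimited field of each line).

using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

// 'connectionString' is assumed to come from your configuration.
CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
CloudBlobContainer logs = account.CreateCloudBlobClient().GetContainerReference("$logs");

foreach (IListBlobItem item in logs.ListBlobs(useFlatBlobListing: true))
{
    string content = ((CloudBlockBlob)item).DownloadText();
    foreach (string line in content.Split('\n'))
    {
        string[] fields = line.Split(';');
        if (fields.Length > 2 && fields[2] == "GetBlob")
            Console.WriteLine(line); // shows requester IP, blob name, bytes served, etc.
    }
}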

Is Azure Blob storage the right place to store many (small) communication logs?

I am working with a program that connects to multiple APIs; the logs for each operation (HTML/XML/JSON) need to be stored for possible later review. Is it feasible to store each request/reply in an Azure blob? There can be hundreds of requests per second (all of which need storing), varying in size with an average of about 100 KB.
Because the logs need to be searchable by metadata, my plan is to store them in Azure Blob storage and put the metadata (blob locations, custom application-related request and content identifiers, etc.) in an easily searchable database.
You can store logs in Azure Table storage or Blob storage, but Microsoft itself recommends Blob storage; Azure Storage Analytics stores its own log data in Blob storage.
The 'Azure Storage Table Design Guide' points out several drawbacks of using Table storage for logs and also provides details on how to use Blob storage for them. Read the 'Log data anti-pattern' section in particular for this use case.
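For illustration, a minimal sketch of the blob side of the plan in the question, assuming the classic Microsoft.WindowsAzure.Storage C# SDK; the container name, metadata keys, and variables are hypothetical:

using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
CloudBlobContainer container = account.CreateCloudBlobClient()
                                      .GetContainerReference("api-logs"); // hypothetical

// One block blob per request/reply, partitioned by date; the metadata set
// here is sent with the Put Blob request and mirrored in the search database.
CloudBlockBlob blob = container.GetBlockBlobReference(
    $"{DateTime.UtcNow:yyyy/MM/dd}/{Guid.NewGuid()}.json");
blob.Metadata["requestId"] = requestId; // hypothetical identifiers
blob.Metadata["api"] = apiName;
blob.UploadText(payload);               // the ~100 KB request/reply body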

How to determine whether Azure storage is Page or Block Blob type?

I'm trying to configure an online backup to an Azure Storage account. Some of the files I am backing up are larger than 200 GB, so I have to use the page blob type.
I believe that, at the moment, this is the kind of storage I have configured; however, my backup of the files larger than 200 GB fails, stating that the "block blob maximum size is 200GB."
How can I check what kind of storage my Azure storage account is configured as? And how can I ensure that I configure the correct type in the future?
An Azure Storage account can contain block, append, and page blobs in the same container. There is no configuration at the account or container level; the difference is that you use different APIs in the SDK (or different REST APIs) to create and work with each blob type.
You can refer to https://msdn.microsoft.com/en-us/library/azure/dd135733.aspx for more info.
As for your requirement: for blobs that would be larger than 200 GB, you can split them into several block blobs, and set a custom content type or metadata on the pieces to mark them as parts of one file, as sketched below.
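A minimal sketch of that split-into-pieces idea, assuming the classic Microsoft.WindowsAzure.Storage C# SDK; the file name, chunk size, and metadata marker are all hypothetical:

using System;
using System.IO;
using Microsoft.WindowsAzure.Storage.Blob;

const long chunkBytes = 100L * 1024 * 1024 * 1024; // 100 GB per piece, under the 200 GB cap

// 'container' is an already-resolved CloudBlobContainer.
using (FileStream fs = File.OpenRead(@"D:\backups\backup.vhd"))
{
    for (int part = 0; fs.Position < fs.Length; part++)
    {
        CloudBlockBlob piece = container.GetBlockBlobReference($"backup.vhd.part{part:D3}");
        piece.Metadata["originalFile"] = "backup.vhd"; // marker used to reassemble the file
        piece.Metadata["partIndex"] = part.ToString();
        long length = Math.Min(chunkBytes, fs.Length - fs.Position);
        piece.UploadFromStream(fs, length);            // uploads 'length' bytes from the current position
    }
}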
If you have any further concerns, please feel free to let me know.
It depends on how you upload the files to Azure Storage: when you create a blob, you specify which type you want, either a page, block, or append blob.
Ex:
// Create a page blob of a fixed size (the size must be a multiple of 512 bytes)
CloudPageBlob blob = container.GetPageBlobReference("file name");
blob.Properties.ContentType = "binary/octet-stream";
blob.Create(size);
Then you have to divide your stream into 512-byte-aligned pages, iterate over it, and upload each range to the blob, as in the sketch below.
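A minimal sketch of that loop, assuming the same classic SDK, that 'source' is the hypothetical input Stream, and that its total length is a multiple of 512 bytes (pad the final chunk otherwise):

const int pageChunk = 4 * 1024 * 1024;   // WritePages accepts at most 4 MB per call
byte[] buffer = new byte[pageChunk];
long offset = 0;
int read;
while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
{
    // Assumes each Read fills the buffer until the final chunk,
    // so 'offset' stays 512-byte aligned as WritePages requires.
    using (var chunk = new System.IO.MemoryStream(buffer, 0, read))
        blob.WritePages(chunk, offset);
    offset += read;
}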

Azure Blob storage and HDF file storage

I am in the middle of developing a cloud server, and I need to store HDF files (http://www.hdfgroup.org/HDF5/) using blob storage.
Functions related to creating, reading, writing, and modifying data elements within the file come from the HDF APIs.
I need the file path to create the file, or to read or write it.
Can anyone please tell me how to create a custom file on Azure Blob storage?
I need to be able to use the API as shown below, but passing the Azure storage path to the file.
http://davis.lbl.gov/Manuals/HDF5-1.4.3/Tutor/examples/C/h5_crtfile.c
The files I am trying to create can get really huge (~10-20 GB), so downloading them locally and modifying them is not an option for me.
Thanks
Shashi
One possible approach, admittedly fraught with challenges, would be to create the file in a temporary location using the code you included, and then use the Azure API to upload the file to Azure as a file input stream. I am still researching how size restrictions are handled in Azure storage, so I can't say whether an entire 10-20 GB file could be moved in a single upload operation. But since the Azure API reads from an input stream, you should be able to build a combination of operations that results in the information you need residing in Azure storage; a sketch follows.
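For illustration, a minimal sketch of that approach, assuming the classic Microsoft.WindowsAzure.Storage C# SDK; the container and file names are hypothetical:

using System.IO;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

// 1. Create and populate the file locally with the HDF5 C API (e.g. H5Fcreate).
string tempPath = Path.Combine(Path.GetTempPath(), "dataset.h5");

// 2. Stream the finished file into a block blob; the SDK splits large
//    streams into blocks internally, so a multi-GB upload is possible.
CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
CloudBlockBlob blob = account.CreateCloudBlobClient()
                             .GetContainerReference("hdf-files")
                             .GetBlockBlobReference("dataset.h5");
using (FileStream fs = File.OpenRead(tempPath))
    blob.UploadFromStream(fs);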
Can anyone please tell me how to create a custom file on Azure Blob?
I need to be able to use the API like shown below, but passing the Azure storage path to the file.
http://davis.lbl.gov/Manuals/HDF5-1.4.3/Tutor/examples/C/h5_crtfile.c
Windows Azure Blob storage is a service for storing large amounts of unstructured data that can be accessed via HTTP or HTTPS, so from an application's point of view an Azure blob does not work like a regular disk.
Microsoft provides quite good APIs (C#, Java) to work with blob storage. They also provide the Blob Service REST API to access blobs from any other language where a dedicated blob storage client library is not provided (such as C++).
A single block blob can be up to 200 GB, so it should easily hold files of ~10-20 GB.
I am afraid the provided example will not work with Windows Azure Blob as-is. However, I do not know the HDF file storage library; maybe it provides some Azure Blob storage support.
