Upload blob directly from client to container - Azure

Can anyone please advise me if it is possible to directly upload a file to a blob container without routing it through my web server? I'm thinking some sort of client-side JS/jQuery script or a 3rd party upload module that streams the file directly to the blob container.
With Amazon S3 I used a component called Flajaxian Direct Uploader to achieve this.
I need to upload zip files of 50 MB to 200 MB to an Azure blob container, and routing them via the web server is slower and consumes additional bandwidth.

Yes, it is possible. This can be achieved by having the client contact your web server and ask for a Shared Access Signature with (w)rite-only access and a limited expiry. Your client can then use the simple REST API to upload the blob. The trick here is that if your blob is bigger than 64 MB, you must use the Put Block and Put Block List operations. The latter is not as straightforward for a JavaScript client. If your client can use curl, it works well; a minimal browser sketch follows the references below.
Reference:
PUT Blob
PUT Block
PUT Block List
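
For the simple case (a blob small enough for a single Put Blob), the browser side can be as small as the sketch below. The /api/sas endpoint and parameter names are made up; it stands for whatever server route hands out the write-only SAS URL, and CORS must be enabled on the storage account for browser calls (see the later answers on this page).

async function uploadWithSas(file) {
  // Ask your own server for a short-lived, write-only SAS URL for this blob.
  const res = await fetch(`/api/sas?blobName=${encodeURIComponent(file.name)}`);
  const { sasUrl } = await res.json();

  // Put Blob: one request for small blobs; larger files should use
  // Put Block / Put Block List instead.
  const put = await fetch(sasUrl, {
    method: 'PUT',
    headers: {
      'x-ms-blob-type': 'BlockBlob',
      'Content-Type': file.type || 'application/octet-stream'
    },
    body: file
  });
  if (!put.ok) throw new Error(`Upload failed with status ${put.status}`);
}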

Related

Multipart byteranges on Azure static website

It seems Azure now has an option to publicly serve the contents of Blob Storage over HTTP, mainly intended for static websites. It is rather new and currently tagged as "preview".
I'd like to store binary releases (about 3 GB each version) of a game on Azure Storage, and allow players to perform a differential update to any version using a zsync-like algorithm. For this to work, it is crucial to be able to download only specified chunks of a large file. Normally, this is achieved over HTTP by sending a multipart byteranges GET request.
The question is: is it possible to enable HTTP requests with multiple byteranges on the Azure "static website"?
UPDATE: As Itay Podhajcer mentioned, I don't need the "static website" feature to serve my blob storage over HTTP; I can simply make my storage container public. However, multipart byteranges requests do not work with direct access to blob storage either.
The Azure Static Website service is intended more for Single Page Applications (such as Angular and React apps).
If you only need to store binary content to be downloaded by clients, I think you should be fine using the "regular" Blob container.
To specify a Range header on a GET request, you can follow Specifying the Range Header for Blob Service Operations.
Hope it helps!
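For reference, a multi-range request is just a GET with a comma-separated Range header; a rough browser sketch (the URL is a placeholder) is below. As the question's update notes, blob storage hit directly only honours a single range, which is what the CDN setup in the next answer works around.

async function fetchRanges(url) {
  // Ask for two separate byte ranges in one request. A server that supports
  // this replies 206 with Content-Type: multipart/byteranges; otherwise you
  // typically get a single range or the whole object back.
  const res = await fetch(url, {
    headers: { Range: 'bytes=0-1023,1048576-1049599' }
  });
  console.log(res.status, res.headers.get('Content-Type'));
  return res.arrayBuffer(); // multi-range bodies need the multipart boundaries parsed out
}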
I managed to get multipart byteranges working by using CDN.
The full list of actions is:
Screw "static website" feature: as Itay Podhajcer wrote, it is absolutely unnecessary.
Upgrade DefaultServiceVersion using Cloud Shell, as said here (to ensure that Accept-Ranges is included in HTTP headers).
Enable "Standard Akamai" CDN to serve Blob Storage directly.
After that I can send multipart byteranges requests to the CDN endpoint, and it gives back multipart response with exactly the data I requested.
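The linked answer performs the DefaultServiceVersion upgrade from Cloud Shell; a rough equivalent with the @azure/storage-blob package (my substitution, not the answer's exact method; the connection string and version value are placeholders) might look like this:

const { BlobServiceClient } = require('@azure/storage-blob');

async function bumpDefaultServiceVersion(connectionString) {
  const service = BlobServiceClient.fromConnectionString(connectionString);
  // Root elements omitted from Set Blob Service Properties are left unchanged,
  // so only DefaultServiceVersion is touched here. Any version from 2011-08-18
  // onwards should make the service return Accept-Ranges.
  await service.setProperties({ defaultServiceVersion: '2019-12-12' });
}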

Should the client connect directly to Azure Blob Storage

Is it a good idea for the client to communicate directly with Azure Blob Storage? If we do it this way, how will we perform server-side validation?
For example, say I want to use blob storage to manage my uploaded images, but I want to prevent users from uploading certain image types and also files larger than 10 MB. How can I implement server-side validation for this?
You have to handle the files in your back end; you cannot let clients upload files directly to Storage in this case.
So you take the file in, validate it, and then upload it to Storage.
And don't give the keys to the client.
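A minimal sketch of that flow, assuming Express with the multer and @azure/storage-blob packages (the route, field name and container name are made up for the example):

const express = require('express');
const multer = require('multer');
const { BlobServiceClient } = require('@azure/storage-blob');

// Keep the upload in memory and let multer enforce the 10 MB cap.
const upload = multer({
  storage: multer.memoryStorage(),
  limits: { fileSize: 10 * 1024 * 1024 }
});
const containerClient = BlobServiceClient
  .fromConnectionString(process.env.AZURE_STORAGE_CONNECTION_STRING)
  .getContainerClient('images');

const app = express();
app.post('/api/images', upload.single('image'), async (req, res) => {
  const allowed = ['image/png', 'image/jpeg'];
  if (!req.file || !allowed.includes(req.file.mimetype)) {
    return res.status(400).send('Only PNG and JPEG images are accepted.');
  }
  // Validation passed: push the bytes on to Blob Storage from the server.
  const blob = containerClient.getBlockBlobClient(req.file.originalname);
  await blob.uploadData(req.file.buffer, {
    blobHTTPHeaders: { blobContentType: req.file.mimetype }
  });
  res.status(201).send('Uploaded.');
});
app.listen(3000);

Note that checking req.file.mimetype only validates the declared content type; inspect the actual bytes (magic numbers) if that distinction matters to you.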

Streaming large files (2GB+) to Google Cloud Storage bucket by way of Node/Express server

I'm looking for the best way (or a way) to upload large files (1GB+) from the client side of my app to my Google Cloud Storage bucket.
The bucket is private and I'm currently trying to send the file to my node/express server and stream it to my GCS bucket. I think I'm running into issues due to file-size constraints on the server, though. It works fine for smaller files, but the 1 GB file I'm using to test it isn't getting through.
These are the alternate methods I'm exploring:
file splitting client side and then reassembling server-side prior to sending to GCS
create a temporary service account through the Google IAM API (with GCS write access), send it to the client so it can upload the file, and delete the account after the upload is confirmed (not ideal, since it exposes the service account)
???
Just looking to get pointed in the right direction at this point.
The most direct transfer would be from the client directly to GCS. You'd probably want to send the client a signed URL that they would use to start and then carry out the upload. You'd likely want to use a resumable upload unless you expect your customers to have Internet connections fast enough that resuming isn't worth the bother.
It would probably be a good idea to have clients upload to some staging GCS bucket and then notify your app that the upload is complete, at which point your app would then copy it to the final bucket (which is an instant operation if the objects are in the same region and storage class), although that's not necessarily required.
I would probably try to avoid streaming data through your app unless you wanted to transform it in some way or want to use multiple cloud providers transparently.
Creating one-off service accounts is probably not a great idea. I'm not sure what the limit on those is off-hand, but you may run into issues scaling that up.
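A rough sketch of the signed-URL route with the @google-cloud/storage Node client (bucket name, object name and expiry are placeholders): the server mints a V4 resumable signed URL, and the client starts the session and uploads to the session URI it gets back.

const { Storage } = require('@google-cloud/storage');
const storage = new Storage();

// Server side: hand this URL to the client.
async function getResumableUploadUrl(objectName) {
  const [url] = await storage
    .bucket('my-staging-bucket')
    .file(objectName)
    .getSignedUrl({
      version: 'v4',
      action: 'resumable',
      expires: Date.now() + 30 * 60 * 1000 // 30 minutes
    });
  return url;
}

// Client side (browser): start the resumable session, then PUT the file to it.
async function uploadFile(signedUrl, file) {
  const start = await fetch(signedUrl, {
    method: 'POST',
    headers: { 'x-goog-resumable': 'start' }
  });
  const sessionUri = start.headers.get('Location');
  await fetch(sessionUri, { method: 'PUT', body: file });
}

For browser clients you will likely also need a CORS rule on the bucket that allows your origin and exposes the Location response header.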

How to bypass a webapi proxy and upload directly to the blob service [duplicate]

I have seen a few examples where a file is transferred to the server side and then uploaded to Azure Blob Storage.
But I have files that are a few GB in size.
Is there a way I can upload a file directly to Azure Blob Storage using client-side scripts instead of doing it from the server side, to save time?
Updating my answer, now that CORS is supported in Windows Azure Storage and the OP has not accepted any answer :).
Yes, it is possible to upload large files directly from your browser to Windows Azure Storage. You may find these steps useful:
First, create a Shared Access Signature (SAS) URL with at least Write permission on the blob container into which you wish to upload the files. Since you're uploading large files, I would recommend keeping the SAS expiry time long enough.
Next, enable CORS on your storage account. If you wish to do it programmatically, you may find this post useful: http://gauravmantri.com/2013/12/01/windows-azure-storage-and-cors-lets-have-some-fun/. If you want to use a tool, my company has released a free tool to do just that. You can read more about this tool and download it here: http://blog.cynapta.com/2013/12/cynapta-azure-cors-helper-free-tool-to-manage-cors-rules-for-windows-azure-blob-storage/.
I wrote a blog post some time back on uploading very large files into blob storage, which you can read here: http://gauravmantri.com/2013/02/16/uploading-large-files-in-windows-azure-blob-storage-using-shared-access-signature-html-and-javascript/. Once CORS is enabled on your storage account, the code mentioned in the blog should work just fine.
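For the first step, a sketch of issuing such a SAS with the current @azure/storage-blob package (not the library this post was written against; the account name, key and container name are placeholders):

const {
  StorageSharedKeyCredential,
  generateBlobSASQueryParameters,
  ContainerSASPermissions
} = require('@azure/storage-blob');

function makeUploadSas(accountName, accountKey, containerName) {
  const credential = new StorageSharedKeyCredential(accountName, accountKey);
  // Write-only SAS on the container, valid for two hours.
  const sas = generateBlobSASQueryParameters({
    containerName,
    permissions: ContainerSASPermissions.parse('w'),
    expiresOn: new Date(Date.now() + 2 * 60 * 60 * 1000)
  }, credential);
  return `https://${accountName}.blob.core.windows.net/${containerName}?${sas.toString()}`;
}

The client appends the blob name to the container URL path (before the query string) and can then issue Put Blob or Put Block requests against it.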
Actually there's a way though there are some preconditions/caveats.
Because CORS is not supported in Blob Storage just yet, your HTML and JS files need to live in the same blob storage account. They should be in a public blob container.
Since you're uploading large files, they will need to be split into chunks no larger than 4 MB. HTML5 has a File API which can split the file into chunks, but not all browsers support this feature.
I wrote a blog post some time ago about uploading large files using pure JavaScript and Shared Access Signature. You can read that post here: http://gauravmantri.com/2013/02/16/uploading-large-files-in-windows-azure-blob-storage-using-shared-access-signature-html-and-javascript.
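A sketch of that chunked pattern in browser JavaScript, assuming blobUrlWithSas is a "<blob URL>?<SAS token>" string handed out by your server and that CORS is enabled as described in the previous answer:

async function uploadInBlocks(file, blobUrlWithSas, chunkSize = 4 * 1024 * 1024) {
  const blockIds = [];
  for (let offset = 0, i = 0; offset < file.size; offset += chunkSize, i++) {
    // Block IDs must be Base64 strings of equal length within a blob.
    const blockId = btoa(String(i).padStart(6, '0'));
    blockIds.push(blockId);
    const chunk = file.slice(offset, offset + chunkSize);
    await fetch(`${blobUrlWithSas}&comp=block&blockid=${encodeURIComponent(blockId)}`, {
      method: 'PUT',
      body: chunk
    });
  }
  // Commit the uploaded blocks in order with Put Block List.
  const blockListXml =
    '<?xml version="1.0" encoding="utf-8"?><BlockList>' +
    blockIds.map(id => `<Latest>${id}</Latest>`).join('') +
    '</BlockList>';
  await fetch(`${blobUrlWithSas}&comp=blocklist`, {
    method: 'PUT',
    headers: { 'Content-Type': 'application/xml' },
    body: blockListXml
  });
}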

Upload VHD to an Azure page blob in chunks

I am trying to upload a VHD (at least 30 GB in size) to a page blob in Azure Storage from an MVC web application. Because of the file size I cannot upload it in a single request, as browsers won't allow a request that large. So the only option is to upload the file in chunks (e.g. 4 MB). On the client side I am able to do the chunking, and I am sending the chunks to my server-side controller through an AJAX request (in a loop). But using the .NET SDK for Azure I cannot find a way to upload the chunks to a page blob.
P.S. There is a way to upload a file in chunks to a block blob using the PutBlock() and PutBlockList() methods, and I am able to upload that way, but I need to create a VM image out of the uploaded VHD, and for that it needs to be a page blob.
So I would welcome any guidance on how to upload a VHD in chunks to a page blob using the Azure .NET SDK.
You can try the AzCopy tool without writing any code.
AzCopy /Source:C:\myfolder
/Dest:https://myaccount.blob.core.windows.net/mycontainer
/DestKey:mykey /Pattern:abc.vhd /BlobType:Page
You could use the CloudPageBlob.WritePages method to upload the chunks of data. See this blog post from the Azure Storage team for an example of using this method: http://blogs.msdn.com/b/windowsazurestorage/archive/2010/04/11/using-windows-azure-page-blobs-and-how-to-efficiently-upload-and-download-page-blobs.aspx.
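The .NET call is CloudPageBlob.WritePages, as the answer says; since the other examples on this page are JavaScript, here is the same chunked pattern sketched with the @azure/storage-blob PageBlobClient instead (my substitution, with placeholder connection string, container and blob names). Page blob writes must be 512-byte aligned and at most 4 MB per request; a fixed-size VHD satisfies the alignment, and the loop handles the 4 MB limit.

const fs = require('fs');
const { BlobServiceClient } = require('@azure/storage-blob');

async function uploadVhdAsPageBlob(connectionString, vhdPath) {
  const pageBlob = BlobServiceClient
    .fromConnectionString(connectionString)
    .getContainerClient('vhds')
    .getPageBlobClient('my-image.vhd');

  const { size } = fs.statSync(vhdPath);   // fixed VHDs are 512-byte aligned
  await pageBlob.create(size);             // allocate the page blob up front

  const chunkSize = 4 * 1024 * 1024;       // 4 MB of pages per request
  const fd = fs.openSync(vhdPath, 'r');
  const buffer = Buffer.alloc(chunkSize);
  for (let offset = 0; offset < size; offset += chunkSize) {
    const length = Math.min(chunkSize, size - offset);
    fs.readSync(fd, buffer, 0, length, offset);
    await pageBlob.uploadPages(buffer.slice(0, length), offset, length);
  }
  fs.closeSync(fd);
}

You can also skip ranges that are entirely zero to save bandwidth, since unwritten pages of a page blob read back as zeros anyway.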
