Azure IoT File Upload

I have successfully used this feature to upload files to a storage container in Azure Blob storage. Does this form of file upload perform the same MD5-hash integrity check as a normal blob storage upload? This link describes that feature for blob storage. It also seems this check is optional; if so, is there a way to ensure it happens when I upload from my IoT device using the Azure IoT SDKs?

I have tested this with the Azure IoT SDK for C#. When uploading a file to Azure IoT Hub, I used Fiddler to capture the HTTP request; the request headers do in fact contain Content-MD5. So if you use the Azure IoT SDK for C#, you do not need to enable the MD5 check yourself. You can also refer to the method UploadFromStreamAsync in the Microsoft Azure Storage SDK for .NET; that is the method invoked under the hood when a file is uploaded via UploadToBlobAsync.
Update:
MD5 cannot be calculated for an existing blob because it would require reading the existing data. Please disable storeBlobContentMD5.
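For reference, here is a minimal device-side sketch of that upload call, assuming the Microsoft.Azure.Devices.Client package; the connection string and file name are placeholders. The Content-MD5 header comes from the underlying Storage SDK call, not from anything this code does explicitly.

```csharp
using System;
using System.IO;
using System.Threading.Tasks;
using Microsoft.Azure.Devices.Client;

class Program
{
    static async Task Main()
    {
        // Placeholder connection string for the device identity.
        const string deviceConnectionString =
            "HostName=<yourHub>.azure-devices.net;DeviceId=<deviceId>;SharedAccessKey=<key>";

        using var deviceClient =
            DeviceClient.CreateFromConnectionString(deviceConnectionString, TransportType.Mqtt);

        using var fileStream = File.OpenRead("telemetry.csv"); // placeholder file
        // UploadToBlobAsync wraps the whole sequence: it asks IoT Hub for a SAS
        // reference, PUTs the blob (the Storage SDK adds Content-MD5 here),
        // and notifies IoT Hub of the result.
        await deviceClient.UploadToBlobAsync("telemetry.csv", fileStream);
    }
}
```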

Related

Best way to upload medium-sized videos to Azure blob storage?

I am currently on a student plan for Azure (gotta stay finessing as a college student lol) and am looking for the best way to upload videos to Azure blob storage. Currently, I am using an Azure Function API to upload the video, but I am encountering a "JavaScript heap out of memory" error when I try to multipart-parse big video files.
Ideally, I'd be able to quickly upload 3.5 minute music videos from mobile and desktop to azure blob storage with this method.
Either a better way of uploading videos to blob storage from my front end or a solution for the JavaScript heap out of memory error would be an amazing help.
Here's the link to that other post, if you are curious: How to fix JavaScript heap out of memory on multipart.Parse() for azure function api
Approaches:
As a workaround for your issue, I would suggest using Azure Media Services.
Media Services can be integrated with Azure CDN; see Media Services - Managing streaming endpoints.
All supported formats use HTTP to transport data and benefit from HTTP caching. In live streaming, the actual video/audio data is separated into fragments, which are cached in CDNs.
To start, I recommend using the Azure Storage SDK with Node.js. The SDK will handle everything for you. A few uploader references are listed below.
Examples of uploading a video to Azure Blob storage:
Refer to the MS documentation and the SO thread by Gopi for uploading a video with the .mp4 extension to Azure blob storage using C#.
You can upload a video using Azure Functions directly, but to use Azure Functions you must create a back-end component written in .NET, Java, JavaScript, or Python.
You can use the Azure Storage REST API to upload files/video files using a storage account, as you mentioned. The Azure Storage REST API documentation will get you to the desired result.

Azure Storage Blob with CodeIgniter

I have a PHP (CodeIgniter) application that I want to migrate from AWS S3 to Azure Blob storage. The application uploads all media files to an S3 bucket, and S3 generates a link, stored in the database, through which each media file can be accessed. I want to do the same with Azure Blob storage. I'm facing a technical hindrance, as I can't find the right resources (libraries/code samples) to achieve this goal. I tried the Azure PHP SDK but it didn't work out.
Actually, there is a detailed sample for using the Azure Storage PHP SDK. You may refer to: https://github.com/Azure/azure-storage-php/blob/master/samples/BlobSamples.php
To run that sample, you just need to replace the following placeholder with your own values:
$connectionString = 'DefaultEndpointsProtocol=https;AccountName=<yourAccount>;AccountKey=<yourKey>';
Suggestion:
I see that you want to generate an access URL and store it in the database. I am not familiar with AWS S3, but with Azure Storage you may need to set the public access level on the container or blob.
Otherwise, you cannot access the blob directly; you may need to create a SAS token.
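For illustration, here is a minimal sketch of generating a time-limited read SAS URL, shown in C# only because that is the language used elsewhere on this page (the Azure Storage PHP SDK offers comparable SAS helpers). The account, key, container, and blob names are placeholders.

```csharp
// Sketch only: a time-limited read SAS makes a private blob addressable by
// URL, which can then be stored in the database like an S3 link.
using System;
using Azure.Storage;
using Azure.Storage.Sas;

var sasBuilder = new BlobSasBuilder
{
    BlobContainerName = "media",
    BlobName = "photo.jpg",                       // placeholder blob
    Resource = "b",                               // "b" = blob-level SAS
    ExpiresOn = DateTimeOffset.UtcNow.AddDays(7), // link lifetime
};
sasBuilder.SetPermissions(BlobSasPermissions.Read);

var credential = new StorageSharedKeyCredential("<account>", "<key>");
string sas = sasBuilder.ToSasQueryParameters(credential).ToString();
Console.WriteLine($"https://<account>.blob.core.windows.net/media/photo.jpg?{sas}");
```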

Uploading file to Azure BLOB using IoT Hub - Permissions

I'm uploading files from a Raspberry Pi to Azure Blob storage using an Azure IoT hub, using this microsoft tutorial as the basis for my C# code, and it's working fine.
Looking at the Microsoft documentation for the method UploadToBlobAsync(), "If the blob already exists, it will be overwritten."
I'm wondering if there's any way to restrict the device's permissions to create-only in the Azure portal or via PowerShell. My concern is that, should someone access the device's storage and obtain the device ID and key, they would have the means to delete or overwrite files previously uploaded by that device in the storage container.
As a work-around I could have a server-side process pick up files once they've been received and move them elsewhere, but if the device id/key was restricted to create-only then I wouldn't need this overhead.
The method UploadToBlobAsync (assembly Microsoft.Azure.Devices.Client.UWP) is a wrapper around a sequence of REST API calls that uploads a blob to the Azure Storage container.
The following sequence is processed (the original answer illustrated each call with a screenshot of the request):
1. A REST API call to the Azure IoT Hub obtains a reference for uploading the blob. As the response shows, the sasToken for this operation is generated with read/write permissions.
2. My suggestion: once the device has received the above response, it can call the Get Blob Metadata REST API to find out whether the blob already exists. Based on that result, the rest of the sequence can either be skipped or continued.
3. The blob is actually uploaded using the REST API PUT Blob call.
4. The last step of the sequence (very important): the device needs to send a notification to the Azure IoT Hub with the status of the uploading sequence.
As you can see, step 2 is what decides between skipping and overwriting the uploaded blob.
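To make step 2 concrete, here is a hedged device-side sketch using the Azure.Storage.Blobs package; it assumes the SAS URI was obtained from IoT Hub's file-upload endpoint, and the method name is hypothetical.

```csharp
using System;
using System.IO;
using System.Threading.Tasks;
using Azure.Storage.Blobs;

static class DeviceUpload
{
    // Returns true when the upload happened, false when it was skipped.
    public static async Task<bool> UploadIfAbsentAsync(Uri blobSasUri, Stream content)
    {
        var blob = new BlobClient(blobSasUri);

        // The equivalent of step 2 above: probe the blob before writing it.
        if (await blob.ExistsAsync())
            return false;                   // skip; keep the existing blob

        await blob.UploadAsync(content);    // step 3: PUT Blob
        return true;
    }
}
```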

FineUploader to Azure Storage

When uploading to Azure Storage, does FineUploader send the file directly to Azure Storage or to the server first?
I noticed on the website that with S3, one can upload directly but the fact that S3 was singled out got me curious.
I'm looking for a really robust solution to upload files - even large files up to 10 GB - to Azure Storage. Wanted to see if FineUploader could be the answer for me.
When uploading to Azure Storage, does FineUploader send the file directly to Azure Storage or to the server first?
Fine Uploader Azure sends the files directly to Azure Cloud Storage. You do need a server to generate Shared Access Signatures for each request; Fine Uploader Azure will contact your SAS server before each upload request (or before any request to Azure) to obtain a SAS. More information is on the Azure feature page at http://docs.fineuploader.com/features/azure.html.
Fine Uploader S3 functions using a similar workflow, but there is also an option to upload files directly to S3 without maintaining your own signature server. That particular option is not available with Fine Uploader Azure.
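To give a concrete picture of the signature-server side, here is a hedged sketch of a minimal SAS endpoint as a C# (ASP.NET Core) minimal API. The query parameter names (bloburi, _method) and the plain-text SAS-URI response are my reading of the Fine Uploader Azure docs, not verified here; the account name and key are placeholders.

```csharp
// Sketch of the SAS server Fine Uploader Azure contacts before each request.
using System;
using Azure.Storage;
using Azure.Storage.Blobs;
using Azure.Storage.Sas;

var app = WebApplication.CreateBuilder(args).Build();

// Fine Uploader sends the target blob URI and the REST verb it is about
// to use; the endpoint answers with the SAS-signed URI as plain text.
app.MapGet("/sas", (string bloburi, string _method) =>
{
    var target = new BlobUriBuilder(new Uri(bloburi));
    var sasBuilder = new BlobSasBuilder
    {
        BlobContainerName = target.BlobContainerName,
        BlobName = target.BlobName,
        Resource = "b",
        ExpiresOn = DateTimeOffset.UtcNow.AddMinutes(15),
    };
    // Grant only the permission the announced verb needs.
    sasBuilder.SetPermissions(_method == "DELETE"
        ? BlobSasPermissions.Delete
        : BlobSasPermissions.Write);

    var credential = new StorageSharedKeyCredential("<account>", "<key>");
    string sas = sasBuilder.ToSasQueryParameters(credential).ToString();
    return Results.Text(bloburi + "?" + sas);
});

app.Run();
```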

Azure Blob storage and HDF file storage

I am in the middle of developing a cloud server and I need to store HDF files ( http://www.hdfgroup.org/HDF5/ ) using blob storage.
Functions related to creating, reading, writing, and modifying data elements within the file come from the HDF APIs.
I need to get the file path in order to create the file or read or write it.
Can anyone please tell me how to create a custom file on Azure Blob ?
I need to be able to use the API like shown below, but passing the Azure storage path to the file.
http://davis.lbl.gov/Manuals/HDF5-1.4.3/Tutor/examples/C/h5_crtfile.c
The files I am trying to create can get really huge, ~10-20 GB, so downloading them locally and modifying them is not an option for me.
Thanks
Shashi
One possible approach, admittedly fraught with challenges, would be to create the file in a temporary location using the code you included, and then use the Azure API to upload the file to Azure as a file input stream. I am still researching how size restrictions are handled in Azure storage, so I can't say whether an entire 10-20 GB file could be moved in a single upload operation; but since the Azure API reads from an input stream, you should be able to combine operations so that the information you need ends up in Azure storage.
Can anyone please tell me how to create a custom file on Azure Blob ?
I need to be able to use the API like shown below, but passing the
Azure storage path to the file.
http://davis.lbl.gov/Manuals/HDF5-1.4.3/Tutor/examples/C/h5_crtfile.c
Windows Azure Blob storage is a service for storing large amounts of unstructured data that can be accessed via HTTP or HTTPS. So from the application's point of view, Azure Blob storage does not work like a regular disk.
Microsoft provides quite a good API (C#, Java) to work with blob storage. They also provide the Blob Service REST API to access blobs from any other language where a dedicated blob storage API is not provided, such as C++.
A single block blob can be up to 200 GB, so it should easily store files of ~10-20 GB.
I am afraid that the provided example will not work with Windows Azure Blob storage as-is. However, I do not know HDF file storage; maybe they provide some Azure Blob storage support.
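Following the first answer's idea (create the HDF5 file locally with the HDF APIs, then move it), here is a hedged C# sketch using the Azure.Storage.Blobs package that stages a large file in fixed-size blocks, so a 10-20 GB file never has to fit in memory. The connection string, container, and file names are placeholders.

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using Azure.Storage.Blobs.Specialized;

var blockBlob = new BlockBlobClient(
    "<storage-connection-string>", "hdf-files", "dataset.h5"); // placeholders

const int blockSize = 4 * 1024 * 1024;          // 4 MB per staged block
var blockIds = new List<string>();
using var file = File.OpenRead("dataset.h5");   // file written by the HDF5 API

var buffer = new byte[blockSize];
int read, index = 0;
while ((read = await file.ReadAsync(buffer, 0, blockSize)) > 0)
{
    // Block IDs must be base64 strings of equal length.
    string blockId = Convert.ToBase64String(BitConverter.GetBytes(index++));
    using var block = new MemoryStream(buffer, 0, read);
    await blockBlob.StageBlockAsync(blockId, block);  // upload one block
    blockIds.Add(blockId);
}
await blockBlob.CommitBlockListAsync(blockIds);       // assemble the blob
```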
