In my app I would like my users to upload local videos, which I want to store in Blob Storage, and I would like to achieve this using Azure Functions. Is this possible? If so, I couldn't find any resource pointing me in the right direction. If not, what would be the ideal way to achieve this? I am building the app using Flutter, for which there are no Azure SDKs yet. Any help is appreciated.
There's no SDK available for Dart. To use Azure Functions, you'll have to build a back-end component in .NET, Java, JavaScript, or Python.
But you can use the Azure Storage REST API to store the videos as blobs using a Storage Account.
Take a look at the official reference here. Using this, you'll be able to store the videos over HTTP.
Also, this tutorial might be useful.
The fella there is using the File Service instead of the Blob Service.
One point of interest: you have to keep some limitations in mind. It's not impossible to do what you want, but it's good to be aware of them:
There are limitations to the storage service.
[...] You can only upload 4 MB “chunks” per upload. So if your files exceed 4 MB you have to split them into parts. If you are a good programmer you can make use of tasks and await to start multiple threads. Please consult the Azure limits documentation to see if any other restrictions apply.
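To make that concrete, here's a minimal Python sketch of a chunked upload through the Blob REST API (Put Block followed by Put Block List). It assumes you already have a SAS URL with write permission for the target blob; the URL and file path are placeholders, not real endpoints:

```python
# Minimal sketch: chunked upload via the Blob REST API (Put Block /
# Put Block List). Assumes `sas_url` is a blob URL that already carries
# a SAS token with write permission (placeholder, not a real endpoint).
import base64
from urllib.parse import quote

import requests

CHUNK_SIZE = 4 * 1024 * 1024  # stay within the 4 MB per-request limit

def upload_in_blocks(sas_url: str, file_path: str) -> None:
    block_ids = []
    with open(file_path, "rb") as f:
        index = 0
        while chunk := f.read(CHUNK_SIZE):
            # Block IDs must be base64-encoded and all the same length.
            block_id = base64.b64encode(f"{index:08d}".encode()).decode()
            requests.put(
                f"{sas_url}&comp=block&blockid={quote(block_id)}",
                data=chunk,
            ).raise_for_status()
            block_ids.append(block_id)
            index += 1

    # Commit the blocks in order so the service assembles the final blob.
    block_list = "".join(f"<Latest>{b}</Latest>" for b in block_ids)
    requests.put(
        f"{sas_url}&comp=blocklist",
        data="<?xml version='1.0' encoding='utf-8'?>"
             f"<BlockList>{block_list}</BlockList>",
        headers={"x-ms-blob-content-type": "video/mp4"},
    ).raise_for_status()
```

Because the blocks are only committed at the end, a failed chunk can be retried on its own instead of restarting the whole upload; the tasks/await advice in the quote amounts to sending several Put Block requests in parallel.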
Related
I am currently on a student plan for Azure (gotta stay finessing as a college student lol) and am looking for the best way to upload videos to Azure Blob Storage. Currently, I am using an Azure Function API to upload the video, but I am encountering a "JavaScript heap out of memory" error when I try to multipart-parse big video files.
Ideally, I'd be able to quickly upload 3.5-minute music videos from mobile and desktop to Azure Blob Storage with this method.
Either a better way of uploading videos to Blob Storage from my front end or a solution for the JavaScript heap out of memory error would be an amazing help.
Here's the link to that other post, if you are curious: How to fix JavaScript heap out of memory on multipart.Parse() for azure function api
Approaches:
Based on your issue, as a workaround I would suggest that you use Azure Media Services.
Media Services can be integrated with Azure CDN; see Media Services - Managing streaming endpoints.
All supported formats use HTTP to transport data and benefit from HTTP caching. In live streaming, actual video/audio data is separated into fragments, which are cached in CDNs.
To start, I recommend that you use the Azure Storage SDK with Node.js. The SDK will handle everything for you. Attaching a few uploader examples below for reference.
Upload a video to Azure Blob examples
Refer to the MS Doc & the SO thread by @Gopi for uploading a video with the .mp4 extension to Azure Blob Storage using C#.
You can upload a video using Azure Functions directly. But to use Azure Functions, you must create a back-end component written in .NET, Java, JavaScript, or Python.
You can use the Azure Storage REST API to upload files/video files using a storage account, as you mentioned. You will be able to get the desired result by using the Azure Storage REST API (MS Docs).
I want to spike whether Azure and the cloud are a good fit for us.
We have a website where users upload documents to our currently hosted website.
Every document has an equivalent record in a database.
I am using Terraform to create the Azure infrastructure.
What is my best way of migrating the documents from the local file path on the server to Azure?
Should I be using File Storage or Blob Storage? I am confused about the difference.
Is there anything in Terraform that can help with this?
Based on your comments, I would recommend storing them in Blob Storage. This service is suited for storing and serving unstructured data like files and images. There are many other features like redundancy, archiving etc. that you may find useful in your scenario.
File Storage is more suitable in Lift-and-Shift kind of scenarios where you're moving an on-prem application to the cloud and the application writes data to either local or network attached disk.
You may also find this article useful: https://learn.microsoft.com/en-us/azure/storage/common/storage-decide-blobs-files-disks
UPDATE
Regarding uploading files from local computer to Azure Storage, there are actually many options available:
Use a Storage Explorer like Microsoft's Storage Explorer.
Use AzCopy command-line tool.
Use Azure PowerShell Cmdlets.
Use Azure CLI.
Write your own code using any available Storage client library or by directly consuming the REST API (see the sketch below).
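As a sketch of that last option, here's what a one-off migration script might look like in Python with the azure-storage-blob package (the container name and local directory are hypothetical):

```python
# Sketch: push every document in a local folder up to a blob container
# (azure-storage-blob package; container name and path are hypothetical).
import os

from azure.storage.blob import ContainerClient

container = ContainerClient.from_connection_string(
    os.environ["AZURE_STORAGE_CONNECTION_STRING"], "documents")

local_dir = "/var/www/uploads"  # hypothetical on-prem document folder
for name in os.listdir(local_dir):
    path = os.path.join(local_dir, name)
    if os.path.isfile(path):
        with open(path, "rb") as data:
            # The blob name mirrors the file name; prefix it with
            # "some/path/" if you want a folder-like hierarchy.
            container.upload_blob(name=name, data=data, overwrite=True)
```

Note that Terraform can create the storage account and container, but it isn't a data-migration tool; a script like this (or AzCopy) does the actual copying.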
There is a tutorial in the Microsoft docs that you can see here:
Tutorial: Build your first pipeline to transform data using Hadoop cluster
in "Prerequisites" section, in step 6, they wrote "Use tools such as Microsoft Azure Storage Explorer".
the question is, can I use some other tools? especially, is it possible to use scripting languages like Python directly?
I need to do all these 7 steps dynamically, using something like Azure Function Apps. do you know is it possible and if it is, from where I should start?
Short answer is YES. But again, you have not shared details on what functionality you are looking for specifically.
What you can do is call the REST API endpoints for the corresponding service.
Depending on whether you are using Blobs, Tables, or Queues, there are specific APIs that you can call.
Azure Storage Services REST API Reference
Blob service REST APIs
Queue Service REST API
Table Service REST API
File Service REST API
Taking Blobs as an example, there are APIs to upload content using the PUT method. See this for more details: Put Blob
Similarly, there are APIs for reading containers, listing containers, etc.
There is also some samples on working with Azure Storage in Python on Github. See this: Azure-Samples/storage-python-getting-started
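For a flavor of the Python route, here's a minimal sketch using the current azure-storage-blob package (the samples repo above uses an older SDK; the connection string is a placeholder):

```python
# Minimal sketch: upload a blob and list containers from Python
# (azure-storage-blob package; connection string is a placeholder).
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<connection-string>")

# Equivalent of the Put Blob REST call mentioned above.
blob = service.get_blob_client(container="mycontainer", blob="hello.txt")
blob.upload_blob(b"hello from python", overwrite=True)

# List the containers in the account.
for container in service.list_containers():
    print(container.name)
```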
HTH
I have a web app that is hosted in Azure; one of its functionalities is to make a few cuts from a video (generate 2 or 3 small videos of 5-10 seconds from a larger video).
The videos are persisted in Azure Blob Storage.
How do you suggest to accomplish this in the Azure environment?
The actual cutting of the videos will be initiated by a web job. I'm also concerned about the pricing (within the Azure environment); I'm taking into account the possibility of high traffic.
Any feedback is appreciated.
Thank you.
Assuming you have video-cutting code that operates on files through normal I/O: You'd need to download the video file from blob, process it via code (or whatever library you've employed), and then store the result back in blob storage. You cannot reference a blob directly with normal standard IO libraries.
If, however, videos are stored in Azure File storage (which is an SMB layer on top of blob storage), then you will be able to directly manipulate your video files.
Web Jobs run within an App Service (just like Web Apps), so you have access to a certain amount of local disk space (depending on App Service tier) for use. You should have no problem temporarily storing a video file within your web app's disk space, for editing operations.
You asked about cost: Again, assuming you're talking about running code within a Web Job (app service), you're just paying for whatever App Service tier you've chosen.
How you actually do those edit operations is entirely up to you (language, library, etc).
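A rough sketch of that download-edit-upload loop, using the azure-storage-blob Python package; ffmpeg is just one possible cutting tool, shown purely for illustration, and all names are placeholders:

```python
# Rough sketch of the download -> edit -> upload flow described above.
# azure-storage-blob package assumed; ffmpeg is one possible cutting
# tool; container/blob names are placeholders.
import os
import subprocess
import tempfile

from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<connection-string>")
source = service.get_blob_client("videos", "source.mp4")

with tempfile.TemporaryDirectory() as workdir:  # web job's local disk
    local_in = os.path.join(workdir, "source.mp4")
    local_out = os.path.join(workdir, "clip.mp4")

    # 1. Download the blob to local disk.
    with open(local_in, "wb") as f:
        source.download_blob().readinto(f)

    # 2. Cut a 10-second clip starting at the 30-second mark.
    subprocess.run(["ffmpeg", "-ss", "30", "-t", "10", "-i", local_in,
                    "-c", "copy", local_out], check=True)

    # 3. Upload the result back to blob storage.
    with open(local_out, "rb") as f:
        service.get_blob_client("videos", "clip.mp4").upload_blob(
            f, overwrite=True)
```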
Azure Blob Storage is simply an object store which stores the data. It does not have the capability you're looking for.
Azure Media Service however is the service you should look into. The media served by this service makes use of Azure Blob Storage.
For editing video, may I suggest you take a look at Video Editor Plugin for Azure Media Player. You can read more about this plugin here: https://azure.microsoft.com/en-in/blog/video-editor-plugin/. You can also try it out here: http://ampdemo.azureedge.net/amp_editor.html.
I need to store multiple files that users upload, and then provide these users with the capability of accessing their files via http. There are two key considerations:
- Storage (which is my primary concern here)
- Security (which let's leave aside for now)
The question is:
What is the most cost efficient and performant way of storing all these files and giving access to them later? I believe the answer is:
- Store files within Azure Storage Account, and have a key that references them in an SQL Azure database.
Am I correct on this?
Is blob storage flat? Or can I create something like folders inside it to better organize my files?
The idea of using SQL Azure to store metadata for your blobs is a pretty common scenario, which allows you to take advantage of SQL for searching, and blobs for storage.
Blobs are organized by container. So you'd have something like:
http://mystorage.blob.core.windows.net/mycontainer/myfile.doc
You can also simulate a hierarchy using a delimiter in the blob name (e.g. a blob named reports/2023/summary.doc appears nested), but in reality there's just container plus blob.
If you keep the container or blob private, the user would either have to go through your web front end (or web service), or you'd have to provide them with a special URL with a Shared Access Signature appended, which is a time-limited URL.
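To illustrate the Shared Access Signature option, here's a small Python sketch with the azure-storage-blob package (account, container, and blob names reuse the example above; the account key is a placeholder):

```python
# Sketch: mint a time-limited read URL for a private blob via a
# Shared Access Signature (account key is a placeholder).
from datetime import datetime, timedelta, timezone

from azure.storage.blob import BlobSasPermissions, generate_blob_sas

sas = generate_blob_sas(
    account_name="mystorage",
    container_name="mycontainer",
    blob_name="myfile.doc",
    account_key="<account-key>",
    permission=BlobSasPermissions(read=True),
    expiry=datetime.now(timezone.utc) + timedelta(hours=1),  # 1-hour link
)
url = f"https://mystorage.blob.core.windows.net/mycontainer/myfile.doc?{sas}"
# Store the blob name in your SQL record and mint SAS links on demand.
```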
I would recommend you take a look at the BlobShare sample, a simple file-sharing application that demonstrates the storage services of the Windows Azure Platform together with the authentication and authorization capabilities of Access Control Service (ACS). The full sample code is located at the following link:
http://blobshare.codeplex.com/
You can use this sample code immediately, just by plugging in your Windows Azure account credentials. The best thing about this sample is that you can provide blob access directly through Access Control Service. You can also modify the code to add SAS support, as well as blob downloads from public containers. Once you have it working and understand the concept, you can tweak it to work the way you want.