Process videos on Azure VM

I am building an application that will process uploaded videos. Each uploaded video will be trimmed into multiple shorter parts, and the cuts will be concatenated to create a highlight of the original file. All the processing is done with ffmpeg.
I am using Azure File Storage to upload the videos and to be able to access them via an SMB (Samba) layer.
I also have an Azure VM where I mounted the shared folders.
What is the best approach for the worker?
Should I build a console app and run it as a Windows Service inside an Azure VM?
Is there another way of doing everything I described above? I am looking for an approach that can be scaled up in production.

If you want scaling, then hosting it in your VM as a Windows Service is not the best solution. You can use Azure Batch for that: https://azure.microsoft.com/en-gb/services/batch/
I also recommend looking at Azure Media Services: https://learn.microsoft.com/en-us/azure/media-services/
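Whichever host you pick, the ffmpeg step itself looks the same. Below is a minimal sketch in Python of the trim-and-concatenate pipeline the question describes, assuming ffmpeg is on PATH; the `make_highlight` helper and its `segments` argument are hypothetical names, and where the cut points come from is up to your application:

```python
import os
import subprocess
import tempfile

def make_highlight(source: str, segments: list[tuple[float, float]], output: str) -> None:
    """Cut (start, duration) segments out of `source` and concatenate
    them into `output`. Assumes ffmpeg is installed and on PATH."""
    with tempfile.TemporaryDirectory() as workdir:
        part_paths = []
        for i, (start, duration) in enumerate(segments):
            part = os.path.join(workdir, f"part{i}.mp4")
            # -ss/-t before -i seeks on the input; -c copy avoids re-encoding
            # (fast, but cuts land on keyframes; re-encode if you need exact cuts).
            subprocess.run(
                ["ffmpeg", "-y", "-ss", str(start), "-t", str(duration),
                 "-i", source, "-c", "copy", part],
                check=True)
            part_paths.append(part)

        # The concat demuxer needs a manifest file listing the parts;
        # -safe 0 allows the absolute paths used here.
        manifest = os.path.join(workdir, "parts.txt")
        with open(manifest, "w") as f:
            for p in part_paths:
                f.write(f"file '{p}'\n")

        subprocess.run(
            ["ffmpeg", "-y", "-f", "concat", "-safe", "0",
             "-i", manifest, "-c", "copy", output],
            check=True)

# e.g. make_highlight("match.mp4", [(10, 5), (120, 8)], "highlight.mp4")
```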

Related

Azure Background Services For File Processing

We currently have a Windows service to process inbound/outbound files.
For inbound files, we read the data, perform some calculations, and store the results in a database.
For outbound files, we generate data from the database.
We want to migrate to Azure now, and I have the following questions:
1) What is the best way to store files in Azure (Blob Storage or a File Share)? We only have ".pdf", ".txt", and ".xlsx" formats, no videos.
2) Which service is better for processing the files: WebJobs, a virtual machine with a Windows service installed, Azure Batch jobs, Azure Kubernetes Service, or Service Fabric?
Can someone please help me with this?
Thanks
How are you receiving the files: API, FTP, or some other way? There are a ton of details needed to really answer this, but here are my thoughts.
Blob storage would be more cost-effective. You only need to use a file share if you want to be able to map a network drive from a VM.
If processing one file completes in less than 10 minutes, I would look at Azure Functions for that (sketched below). If you're processing thousands of files per day, Azure Functions would be expensive, so I would look at running the workload in an App Service or on VMs, or moving to Service Fabric.
If you have a web site that's used to upload the files and you're already using Azure App Service, then you could use WebJobs.
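To illustrate the Azure Functions option above: a minimal sketch of a blob-triggered function, assuming the Python v2 programming model and a hypothetical "inbound" container (the connection name and the processing logic are placeholders):

```python
import logging
import azure.functions as func

app = func.FunctionApp()

# Fires whenever a new blob lands in the (hypothetical) "inbound" container.
@app.blob_trigger(arg_name="inbound", path="inbound/{name}",
                  connection="AzureWebJobsStorage")
def process_inbound(inbound: func.InputStream):
    data = inbound.read()
    logging.info("Processing %s (%d bytes)", inbound.name, len(data))
    # ... perform the calculations and store the results in your database ...
```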

Best way to download many images into Azure App Service .Net Core app for processing

I have over 500 large image files that I need to process in my .NET Core app hosted in an Azure App Service. Specifically, I need to download all of the images and run them through a machine-learning categorization function in my code. I currently use blob storage as my mechanism for storing the images, but downloading all those images via the blob REST API is slow. Is there a better architecture in Azure that I should be making use of to greatly increase the performance of processing these images? Perhaps a storage mechanism much faster than blob storage?
Yes, I tried this on my side. Even when the Storage Account is in the same location as my web app, it takes about 3-6 seconds to download a 30 MB file. (From a VM, it takes less than 1 second.)
My suggestions:
You can zip your pictures into one archive file and download it. That would be faster than downloading them one by one.
You can use the DownloadToFileParallelAsync method to download a file. It would be a little faster (see the sketch after this list).
You can refer to the official tutorial, Download large amounts of random data from Azure storage.
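For illustration, here is a rough Python equivalent of the parallel-download suggestion; DownloadToFileParallelAsync is from the .NET storage SDK, and in the Python SDK (azure-storage-blob v12) the analogous knob is max_concurrency. The connection string, container, and blob names below are hypothetical:

```python
from azure.storage.blob import BlobClient

blob = BlobClient.from_connection_string(
    conn_str="<connection string>",   # assumption: taken from your app settings
    container_name="images",          # hypothetical container
    blob_name="photo-0001.jpg")

with open("photo-0001.jpg", "wb") as f:
    # max_concurrency splits the blob into ranges and downloads them in parallel.
    blob.download_blob(max_concurrency=8).readinto(f)
```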

Cut videos from Azure Blob Storage

I have a web app that is hosted in Azure; one of its functions is to make a few cuts from a video (generate 2 or 3 small videos of 5-10 seconds from a larger video).
The videos are persisted in Azure Blob Storage.
How would you suggest accomplishing this in the Azure environment?
The actual cutting of the videos will be initiated by a WebJob. I'm also concerned about pricing (within the Azure environment), taking into account the possibility of high traffic.
Any feedback is appreciated.
Thank you.
Assuming you have video-cutting code that operates on files through normal I/O: you'd need to download the video file from blob storage, process it via code (or whatever library you've employed), and then store the result back in blob storage. You cannot reference a blob directly with standard I/O libraries.
If, however, the videos are stored in Azure File storage (which is an SMB layer on top of blob storage), then you will be able to manipulate your video files directly.
Web Jobs run within an App Service (just like Web Apps), so you have access to a certain amount of local disk space (depending on the App Service tier). You should have no problem temporarily storing a video file within your web app's disk space for editing operations.
You asked about cost: Again, assuming you're talking about running code within a Web Job (app service), you're just paying for whatever App Service tier you've chosen.
How you actually do those edit operations is entirely up to you (language, library, etc).
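As a sketch of that download-process-upload loop, assuming the azure-storage-blob v12 SDK and ffmpeg on PATH (the container name, blob layout, and `cut_clip` helper are hypothetical):

```python
import os
import subprocess
import tempfile
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<connection string>")
container = service.get_container_client("videos")   # hypothetical container

def cut_clip(blob_name: str, start: float, duration: float) -> None:
    with tempfile.TemporaryDirectory() as workdir:
        src = os.path.join(workdir, "source.mp4")
        clip = os.path.join(workdir, "clip.mp4")

        # 1. Download the source video to local (App Service) disk.
        with open(src, "wb") as f:
            container.download_blob(blob_name).readinto(f)

        # 2. Cut the requested range with ffmpeg (no re-encode).
        subprocess.run(["ffmpeg", "-y", "-ss", str(start), "-t", str(duration),
                        "-i", src, "-c", "copy", clip], check=True)

        # 3. Upload the result back to blob storage.
        with open(clip, "rb") as f:
            container.upload_blob(f"clips/{blob_name}", f, overwrite=True)
```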
Azure Blob Storage is simply an object store which stores the data. It does not have the capability you're looking for.
Azure Media Service however is the service you should look into. The media served by this service makes use of Azure Blob Storage.
For editing video, may I suggest you take a look at the Video Editor Plugin for Azure Media Player. You can read more about this plugin here: https://azure.microsoft.com/en-in/blog/video-editor-plugin/. You can also try it out here: http://ampdemo.azureedge.net/amp_editor.html.

Access Azure Files Services from Azure WebSites

As the title says, I'm looking for a way to access an Azure Files share (in preview) directly from an Azure website. I cannot use a REST API or anything like that; I was looking for the possibility of mounting an SMB share directly into the website (through the new portal or any other way).
I found the following links, from which I understand that this is still under review (http://feedback.azure.com/forums/169385-web-apps-formerly-websites/suggestions/6084609-allow-map-azure-file-share-microsoft-azure-file-s) and also a SO question (Can the new Azure File Service be used from Azure WebSites?) that doesn't answer my question.
To be honest, and for the sake of giving more details, my scenario is pretty simple: I have some websites and also some virtual machines that should access the files on the Azure Files service. For the VMs, the approach is pretty straightforward and easy, but for the websites I can't find any way to do it at the moment.
On the other hand, regardless of the answer to the above question, does it make sense to (or do I have the possibility to) enable CDN over an Azure Files Share?
Thank you very much.
As of today, no single technology will serve your purpose. You can't use the File Service, as you don't have the capability to mount a share in an Azure Website, and it is not suited for streaming purposes (all access to files there needs to be authorized, and there's no concept of a Shared Access Signature in the File Service today).
I guess you would have to pick one of the two technologies (Blob Service or File Service) and make some compromises to make it work in both Websites and Virtual Machines.
Assuming you go with the File Service, you can mount the shares in the Virtual Machines and process the files there. On the website front, you would need to use the Storage Client library to download the relevant files into some folder in your website and stream them from there.
Assuming you go with the Blob Service, you can simply stream the files in your website directly from blob storage (no need to keep them in your website). In the Virtual Machine, when you need to process those files (blobs), you would download them to the VM for processing and then re-upload them to blob storage.
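Note this answer predates SAS support on the File Service. With today's blob SDK, the "stream directly from blob storage" option usually means handing the player a short-lived read-only SAS URL; a minimal sketch, with hypothetical account, container, and blob names:

```python
from datetime import datetime, timedelta, timezone
from azure.storage.blob import BlobSasPermissions, generate_blob_sas

# Hypothetical account/container/blob names.
sas = generate_blob_sas(
    account_name="mystorageacct",
    container_name="media",
    blob_name="movie.mp4",
    account_key="<account key>",
    permission=BlobSasPermissions(read=True),
    expiry=datetime.now(timezone.utc) + timedelta(hours=1))

url = f"https://mystorageacct.blob.core.windows.net/media/movie.mp4?{sas}"
# Hand `url` to the browser/player; the bytes then stream from blob storage,
# not through your website.
```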
Does it make sense to (or do I have the possibility to) enable CDN over an Azure Files Share?
Currently it is not possible to serve Azure File Service files via CDN.

Is it possible to mount blob storage to my local machine for deployment?

I have a build script that it would be very useful to configure to dump some files into Azure blob storage so they can be picked up by my Azure web role.
My preferred plan was to find some way of mounting the blob storage on my build server as a mapped drive and simply using Robocopy to copy the files over. This would involve the least amount of friction, as I'm already deploying some files like this to other web servers using WebDrive.
I found a piece of software that will allow me to do that: http://www.gladinet.com/
However, on further investigation I found that it needs port 80 to run, short of some hairy-looking hacking about on the server.
So is there another piece of software I could use or perhaps another way I haven't considered, such as deploying the files to a local folder that is automagically synced with blob storage?
Update in response to @David Makogon
I am using http://waacceleratorumbraco.codeplex.com/; this performs two-way synchronisation between the blob storage and the web roles. I have tested this with http://cloudberrylab.com/: I can deploy files manually to the blob and they are deployed correctly to the web roles. I have also done the reverse and updated files in the web roles, which were then synced back to the blob, and I have subsequently edited/downloaded them from blob storage.
What I'm really looking for is a way to automate the CloudBerry side of things, so I don't have a manual step to copy a few files over. I will investigate the PowerShell solutions in the meantime.
I know this is an old post - but in case someone else comes here... the answer is now "yes". I've been working on a CodePlex project to do exactly that. (All source code is available).
http://azuredrive.codeplex.com/
If you're comfortable using PowerShell in your build process, then you could use the Cerebrata cmdlets to upload the files. If that doesn't work for you, you could write a custom activity (but this sounds quite a bit more involved).
Mounting a cloud drive from a non-Windows Azure compute instance (e.g. your local build machine) is not supported.
Having said that: Even if you could mount a Cloud Drive from your build machine, your compute instances would need access to it too, and there can only be one writer. If your compute instances only needed read-only access, they'd need to create a snapshot after you upload new files.
This really doesn't sound like a good idea though. As knightpfhor suggested, the Cerebrata cmdlets provide this capability (look at Import-File). This allows you to push individual files into their own blobs. You can optimize further by pushing a single ZIP file into a blob. You can then use a technique similar to the one described by Nate Totten in his multi-tenant web role sample, to detect new zip files and expand them to your local storage. Nate's blog post is here.
Oh, and if you don't want to use the Cerebrata cmdlets, you can upload blobs directly with the Windows Azure Storage REST API (though the cmdlets are very simple to use and integrate seamlessly with PowerShell).
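For anyone doing this today: a rough sketch of the zip-and-upload step in Python with the current azure-storage-blob SDK, as a stand-in for the cmdlets or raw REST calls (the paths and container name are hypothetical):

```python
import shutil
from azure.storage.blob import BlobServiceClient

# Zip the build output into one archive (fewer round-trips than per-file blobs).
archive = shutil.make_archive("site-drop", "zip", root_dir="./build-output")

service = BlobServiceClient.from_connection_string("<connection string>")
container = service.get_container_client("deployments")   # hypothetical container

with open(archive, "rb") as f:
    container.upload_blob("site-drop.zip", f, overwrite=True)
# Your web role can then poll for new zips and expand them locally,
# as in Nate Totten's multi-tenant sample.
```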
