Multiple media encode and merge (join) media files in Azure

I am new to Azure. I followed the media encoding sample tutorial and it works fine, but when I upload multiple files they upload successfully, while the encoding job fails: my job.Submit() method throws an exception.
Also, I want a single file as output (in short, I want to merge the video files and add some still images in between).

No, currently (as of October 7th) you can't join/combine multiple files into a single one.
As for the errors you get from job.Submit(): in order to receive a proper answer, you have to ask the question properly. Provide the relevant code lines/snippet, the full stack trace, exact steps to reproduce, and so on.
Also, when doing media transcoding, always keep an eye on the list of supported media codecs here. Make sure you are using supported codecs before asking a question.

Azure's Media Encoder Standard (MES) supports combining multiple video files using a custom preset.
Azure MES Doc: Concatenate two or more video files
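For reference, here is a minimal sketch of what such a concatenation job could look like with the Media Services v2 .NET SDK. This is an illustration rather than the documented sample: context is assumed to be an authenticated CloudMediaContext, asset1/asset2 are your already-uploaded input assets, and ConcatPreset.json is a placeholder for the custom preset described in the doc above.

using System;
using System.IO;
using System.Linq;
using Microsoft.WindowsAzure.MediaServices.Client;

// Pick the latest Media Encoder Standard processor.
IMediaProcessor processor = context.MediaProcessors
    .Where(p => p.Name == "Media Encoder Standard")
    .ToList()
    .OrderBy(p => new Version(p.Version))
    .Last();

// The custom preset lists one entry per input under "Sources",
// which is what tells MES to stitch the inputs into one output.
string presetJson = File.ReadAllText("ConcatPreset.json");

IJob job = context.Jobs.Create("Concatenation job");
ITask task = job.Tasks.AddNew("Concatenate videos", processor, presetJson, TaskOptions.None);

// Input assets are stitched in the order they are added.
task.InputAssets.Add(asset1);
task.InputAssets.Add(asset2);
task.OutputAssets.AddNew("Merged output", AssetCreationOptions.None);

job.Submit();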

Related

Play Audio File Stored in AWS S3

I have successfully uploaded some audio files to AWS via Node.js, and the file URL is returned from my function. I plan to save this URL in MongoDB Atlas as a reference to the original file, but before doing that, I tried to play the file (from the URL) in my mobile app and it won't play.
The file is in .m4a format. How do I get this to work in any audio player for mobile and web? I'm using Flutter for both. I don't want to do piping, chunking, and streaming manually, as this is just a dummy test of the system. The original files to be used in the app will be much larger.
Here's the file URL: https://empty-bouquet.s3.af-south-1.amazonaws.com/Dax+-Dear+God.m4a
Thanks
.m4a audio isn't natively streamed from S3, but after a test I can verify that .mp3 files are. Most browsers will recognize that file type and render a built-in player for you.
You can convert from one format to another using a lot of free tools; I used Audacity.
And yes, you need to make at least that file public. Or, if you're going to do this a lot, I would recommend a bucket policy that makes everything public, no matter what you throw in there.
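If you control the upload, setting the content type and ACL at upload time is enough for the browser to stream the converted file. Here's a hedged sketch using the AWS SDK for .NET rather than the Node.js code from the question; the key and local file path are made-up placeholders.

using System.Threading.Tasks;
using Amazon;
using Amazon.S3;
using Amazon.S3.Model;

// Upload the converted .mp3 with an explicit Content-Type and a
// public-read ACL so browsers can stream it directly from S3.
var s3 = new AmazonS3Client(RegionEndpoint.AFSouth1);
await s3.PutObjectAsync(new PutObjectRequest
{
    BucketName = "empty-bouquet",
    Key = "Dax-Dear-God.mp3",                // the converted file
    FilePath = @"C:\audio\Dax-Dear-God.mp3", // placeholder local path
    ContentType = "audio/mpeg",
    CannedACL = S3CannedACL.PublicRead
});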

How to uncompress .rar files using Azure Data Factory

We have a new client. While landing the project, we gave them blob storage to drop files in, so we could later automate and process the information.
The idea is to use Azure Data Factory, but we can find no way of dealing with .rar files, and even .zip files coming from Windows are giving us trouble. Since it is the client providing the .rar format, we wanted to make absolutely sure there is no way to process it before asking them to change it, or before deploying Databricks or a similar service just for the purpose of transforming the file.
Is there any way to get a .rar file from blob storage, uncompress it, and then process it?
I have been looking at posts like this one and the related official documentation, and the closest we have come is ZipDeflate, but it does not seem to meet our requirement.
Thanks in advance!
The only compression types Data Factory supports are GZip, Deflate, BZip2, and ZipDeflate.
For unsupported file types and compression formats, Data Factory provides some workarounds:
You can use the extensibility features of Azure Data Factory to transform files that aren't supported. Two options include Azure Functions and custom tasks using Azure Batch.
You can see a sample that uses an Azure Function to extract the contents of a tar file. For more information, see the Azure Functions activity.
You can also build this functionality using a custom .NET activity. Further information is available here.
From there, you would need to work out how to use an Azure Function to extract the contents of a .rar file; a sketch of what that could look like follows.
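As a hedged sketch (not production code), the core of such a Function could download the .rar blob and extract it with the SharpCompress NuGet package, which can read, though not create, RAR archives. The container and blob names here are assumptions.

using System.IO;
using Azure.Storage.Blobs;
using SharpCompress.Archives;
using SharpCompress.Archives.Rar;

public static void ExtractRar(string connectionString)
{
    var container = new BlobContainerClient(connectionString, "landing");

    // RAR extraction needs a seekable stream, so buffer the blob in
    // memory (or in a temp file for very large archives).
    using var source = new MemoryStream();
    container.GetBlobClient("incoming/data.rar").DownloadTo(source);
    source.Position = 0;

    // Write each archive entry back to blob storage as its own file,
    // ready for Data Factory to pick up.
    using var archive = RarArchive.Open(source);
    foreach (var entry in archive.Entries)
    {
        if (entry.IsDirectory) continue;
        using var entryStream = entry.OpenEntryStream();
        container.GetBlobClient($"extracted/{entry.Key}").Upload(entryStream, overwrite: true);
    }
}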
You can use Logic Apps, or a Webhook activity calling an Automation runbook; both are easier than using a custom activity.

Unable to encode an asset with two videos in Azure Media Services v3

I am capturing multiple videos and trying to store them all in one media asset and encode them later. But I always get the error below. There are no issues, however, if I upload one file to the asset and run the encoding job.
Output 'BuiltInStandardEncoderPreset_0', Error : Microsoft.Azure.Management.Media.Models.JobError : While trying to download the input files, the files were not accessible, please check the availability of the source.
Do I need to use one media asset for each upload? Is having multiple files in one asset not the right approach?
FYI: I tried all these steps in the Azure Media Services Explorer desktop app, so please note I didn't write any code for this.
You would indeed need to use a separate asset for each video. We'll follow up with a service update to improve the error message in this situation.
I think the problem here is not the upload of multiple files into one asset, but the fact that the encoding job will not pick up each blob of the asset and encode it. That's why you need to use one asset per video file to be encoded.
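For reference, a minimal sketch of the one-asset-per-video pattern with the v3 .NET SDK (Microsoft.Azure.Management.Media); the resource group, account, and transform names are placeholders.

using System.Threading.Tasks;
using Microsoft.Azure.Management.Media;
using Microsoft.Azure.Management.Media.Models;

// Submit one encoding job per input asset, since the encoder expects
// a single primary video per asset.
static async Task EncodeEachVideoAsync(IAzureMediaServicesClient client, string[] inputAssetNames)
{
    foreach (var assetName in inputAssetNames)
    {
        string outputAssetName = assetName + "-encoded";
        await client.Assets.CreateOrUpdateAsync("myResourceGroup", "myAccount", outputAssetName, new Asset());

        await client.Jobs.CreateAsync(
            "myResourceGroup", "myAccount", "myTransform", "encode-" + assetName,
            new Job
            {
                Input = new JobInputAsset(assetName: assetName),
                Outputs = new JobOutput[] { new JobOutputAsset(outputAssetName) }
            });
    }
}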

Azure Media Player gives "no compatible source was found for this media 0x10600000" error on Android

Azure Media Player gives the "no compatible source was found for this media 0x10600000" error on Android for downloaded video files in a few formats, such as MPEG. The HTTP URL for the same file plays fine, but if we download the file to local storage and try to play it from there, it throws this error.
I think it might be related to the media server being able to serve different encodings/formats to match the player's supported formats; once a file is downloaded locally, if the player doesn't support the format internally, it won't be able to play it.
I would suggest you try to tweak the encoding/format using HTTP headers (I think it's the Accept header) to trigger a download in a format the player supports.
Hope it helps!

CloudBerry PowerShell Multipart

I've written a PowerShell script to upload from a Windows system to an Amazon S3 bucket. The script successfully uploads all files except those over 5 GB. I have a CloudBerry Explorer Pro license, which allows multipart upload for files up to 5 TB, but there is no flag for multipart in the PowerShell snap-in documentation. CloudBerry support directed me here, as they only support the GUI, not the PowerShell snap-in. When running my script I get the error:
"WARNING: Your proposed upload exceeds the maximum allowed object size (5 Gb)".
So the question is: does anyone know of a command-line option, or another way, to enable multipart upload to Amazon S3 using CloudBerry Explorer Pro's PowerShell snap-in?
Set-CloudOption UseChunks=true
I'm looking for the same in PowerShell.
I believe the original chunking mechanism in the GUI has been deprecated. I have not tested it myself, but I assume the PowerShell option UseChunks=true still uses the old mechanism? If so, files may be split into multiple parts and not automatically recombined when they arrive on S3. The new GUI multipart upload facility sorts all this out for you.
It's annoying that CloudBerry still advertises PowerShell as a component of Explorer (Free & Pro) but doesn't support it, even for fully paid-up Pro support customers.
We did purchase the CloudBerry Explorer Pro license for the native multipart upload capability, but we wanted to automate it. Based on their documentation, I believe the old chunk method is deprecated in favor of the new multipart functionality. We wound up testing the options listed in the PowerShell documentation. Those options are as follows:
Set-CloudOption -UseChunks -ChunkSizeKB
"Defines a size of chunk in KB; files larger than a chunk will be divided into chunks."
We verified that this successfully uploaded files beyond the 5 GB restriction to our S3 bucket. I attempted to get a response from CloudBerry as to whether this was the old chunking method or the new multipart method, but I was unable to get a straight answer. They confirmed that, because we were using Pro, this PowerShell option was supported, but they would not confirm which mechanism the PowerShell command uses.
From what I can tell, CloudBerry's legacy chunking mechanism would actually just break the file into individual files, which would then appear in S3 as multiple files. The chunk-transparency mechanism in CloudBerry Explorer makes those chunks appear as a single file in the CloudBerry Explorer GUI only. Since I can see the file as a single file on the S3 side, I'm assuming the PowerShell option uses the new multipart functionality and not the legacy chunking. Again, I was not able to confirm this through CloudBerry, so it's speculation on my part.
However, I can confirm that using the PowerShell options above will get you around the 5 GB upload limit when using PowerShell.
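As an aside, if a scripted upload that bypasses CloudBerry entirely is acceptable, the AWS SDK for .NET performs S3 multipart upload automatically for large files via TransferUtility. A minimal sketch, with a placeholder region, bucket, and path:

using Amazon;
using Amazon.S3;
using Amazon.S3.Transfer;

// TransferUtility switches to multipart upload automatically for
// large files, so objects over 5 GB upload without any extra flags.
var transfer = new TransferUtility(new AmazonS3Client(RegionEndpoint.USEast1));
transfer.Upload(@"D:\backups\huge-archive.bak", "my-backup-bucket");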
