Creating and Streaming a composite manifest file in Windows Azure

I'm currently developing an application using Windows Azure Media Services for video processing.
The scenario is: a user manages a sequence of video clips, puts them in a specific order and submits them to Media Services for processing.
I managed to stream the clips individually using their *.ism/Manifest URL generated by Azure Media Services, but I want to stream them as a single clip.
I understand Azure Media Services does not provide merging/stitching capability for now, so the alternative is to use a composite streaming manifest file (*.csm) and reference the path of all the individual clips there.
The problems I ran into are:
- I could not find a single playable *.csm file on the internet to use as a reference.
- I used this tool http://code.msdn.microsoft.com/wpapps/Smooth-Streaming-Manifest-b1c3c9f9/view/SourceCode to generate a .csm based on an existing, playable ism/manifest, but I don't know how to play it.
The ism streaming url is something like:
http://.origin.mediaservices.windows.net/dd754ce8-3de8-457f-9e57-380723794e66/clipName.ism/Manifest
and is stored in Media Services, while the actual .ism and .ismv files are stored in a storage container such as asset-1707d318-0484-4f8c-8f66-890786ccb1e3.
Where should I store the *.csm file in Azure? Is there any running .csm player with samples available so I can test my generated .csm?
Thanks,
Florin

You can use an Azure storage container to store the CSM file, and to test any CSM file you can use the SMF health monitor: http://smf.cloudapp.net/healthmonitor.

Related

Getting insights from Azure Video Analyzer using an IP camera (no IoT Edge capability) live stream

I want to use an IP camera without IoT Edge support to live stream video footage to Azure, and I want to get insights from the video using Azure Video Analyzer for Media (aka Video Indexer).
I have come across two possible ways to achieve this in Azure:
1. I came across the LiveStreamAnalysis GitHub repo, but the functions are not getting deployed because it uses the older version of Media Services (v2). I read about the newer version of Media Services but didn't find a live stream sample to start with.
2. I found the Video Analyzer (preview) documentation, but I am only able to stream and record the live stream using a simulated IP camera live stream.
I want to do further analysis on the video using the Video Indexer APIs, but I didn't find any way to achieve that with the 2nd approach. It only explains using IoT Edge device pipelines and workflows.
How can I achieve this?
Thank you for bringing (1) to our attention.
I reached out to the relevant contact.
There isn't any other built-in integration; your (2) option uses Azure Video Analyzer (not "for Media"), which is a different service.
The most promising path at the moment is (1), once a fix is in place.
Regarding (1), I am the main author of the Live Stream Analysis sample, and it is true that the functions need to be updated to use the AMS v3 API; the logic apps need to be updated as well to use the new functions. I started the work to build the new functions, but the work is not complete. They will be posted to https://github.com/Azure-Samples/media-services-v3-dotnet-core-functions-integration/tree/main/Functions
You can see that there is a SubmitSubclipJob which will be key for the sample.

Convert MP4 to ISM in Azure Storage

I need to generate .ISM files from MP4 files that are uploaded to Azure blob storage. Probably as soon as the user uploads an MP4 file to blob storage, I should be able to fire up an Azure Function that does the conversion.
Can someone please explain how to do the conversion from MP4 to .ISM?
Note: I do not want to use Azure Media Service, it is too expensive.
The .ism file probably won't help you much at all for this situation.
If you are trying to avoid using AMS completely and just do static packaging, you should generate HLS or DASH content directly into storage blobs. You could do that with FFMPEG or the Shaka Packager tool from existing Mp4 file. There are lots of OSS solutions out there that can generate static HLS and DASH content if that is your goal.
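As an illustration of the static-packaging route, an ffmpeg invocation for turning an existing MP4 into static HLS could be assembled like this (file names and segment settings are hypothetical; this is a sketch of the approach, not a definitive pipeline):

```python
def build_hls_packaging_command(source_mp4: str, out_playlist: str,
                                segment_seconds: int = 6) -> list[str]:
    """Build an ffmpeg command that repackages an existing MP4 into
    static HLS without re-encoding (streams are copied as-is)."""
    return [
        "ffmpeg",
        "-i", source_mp4,
        "-c", "copy",                       # no transcoding, just remux
        "-hls_time", str(segment_seconds),  # target segment duration
        "-hls_playlist_type", "vod",        # finite, on-demand playlist
        "-hls_segment_filename", "segment_%03d.ts",
        out_playlist,
    ]

cmd = build_hls_packaging_command("input.mp4", "index.m3u8")
# subprocess.run(cmd, check=True) would execute it (requires ffmpeg on PATH)
print(" ".join(cmd))
```

The resulting index.m3u8 playlist and segment_*.ts files can then be uploaded to a blob container and served directly (or through a CDN), with no streaming endpoint involved.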
The .ism file is primarily a feature of AMS: it provides the information that the Streaming Endpoint (origin server) needs to dynamically package standard MP4 files on the fly into MPEG-DASH, HLS, and Smooth Streaming, and to add DRM encryption for Widevine, FairPlay, and PlayReady. If you have no need for multi-format dynamic packaging from MP4, then AMS is probably not the right solution for your needs.
If you can share: which parts are too expensive for you? The encoding, the monthly cost of the streaming endpoint (the standard endpoint cost?), or the overall egress bandwidth needed to deliver content from Azure (which won't go away with a storage-based solution and is normally 90% of the cost of streaming if you have popular content)?
If you are trying to avoid encoding costs, you can encode locally or with ffmpeg on a server VM at your own cost, and then upload and stream with AMS. I have a good sample of that here: https://github.com/Azure-Samples/media-services-v3-dotnet/tree/main/Streaming/StreamExistingMp4
Thanks,
John

Record stream video with Azure

I need to record a streaming video (say I have a URL) and save the last N minutes of it to Azure.
My guess is that I need to use Azure Media Services for that.
I've already created an Azure Media Services account.
Could anybody give me a hint on where to start?
Update:
I'd prefer to use C#
The stream can be from blob:http://ipcamlive.com/a5fe3312-2a33-4b53-8b83-42af7928abb0 or from any web camera. Currently I'm not sure about the video format.
You haven't really given much info on what type of video or what language you'll use, so it's probably best to start with the Azure Media Services documentation.
Here you can find a tutorial on encoding from HTTPS source using .NET
If you can give more info on what you're looking to do, you'll likely get better hints; right now this smells like an XY Problem
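If a plain capture outside of Media Services turns out to be acceptable, one way to keep only the last N minutes is ffmpeg's segment muxer with a wrapping filename counter, so the oldest segment file is overwritten once the window is full (the URL and window sizes below are hypothetical; this is only a sketch, assuming ffmpeg is available):

```python
def build_rolling_capture_command(stream_url: str, minutes_to_keep: int,
                                  segment_seconds: int = 60) -> list[str]:
    """Build an ffmpeg command that records a live stream into a rolling
    buffer of fixed size: segment_wrap makes the segment counter wrap
    around, reusing (overwriting) the oldest segment files."""
    wrap = (minutes_to_keep * 60) // segment_seconds
    return [
        "ffmpeg",
        "-i", stream_url,
        "-c", "copy",                       # record without re-encoding
        "-f", "segment",
        "-segment_time", str(segment_seconds),
        "-segment_wrap", str(wrap),         # rolling window of `wrap` files
        "buffer_%02d.ts",
    ]

cmd = build_rolling_capture_command("http://example.com/camera/stream", 10)
print(" ".join(cmd))
```

Recovering "the last N minutes" then means concatenating the newest segments in timestamp order; the segment files could also be uploaded to blob storage as they rotate.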

Scalable cloud storage

I'm going to publish a video on a web page for streaming. I expect more than 100,000 visits per day within a month. I want to upload my video to a server (or service) that offers the same bandwidth for all clients, even if there are hundreds of thousands of clients connected simultaneously.
I will connect the player with the external video.
Note: I cannot use YouTube or Vimeo because the video uses 360° technology, so I need to use my custom player.
Please, could you suggest any service that offers this feature?
Thanks!!
I would say this is mostly a question of the streaming technology you'd like to use, not of the storage alone.
E.g. if you wish to stream via a binary protocol like RTMP, you'll have to use software like Wowza for transcoding and delivery. Hence load balancing for proper use of bandwidth will also be handled by that software, e.g. a Wowza load balancer.
So you should decide what protocols and other technologies you plan on using. This will narrow your search parameters.

Azure components for photo / video content management

I am evaluating Windows Azure for my photo / video management software, which will have the following functionality:
1) uploading the photos / videos, tagging, album creation, etc.
2) Live streaming of content from server
3) Download of content from server.
CDN and AppFabric Cache will definitely be helpful here. Can anyone please let me know if there are built-in components, off-the-shelf components, or specific design patterns in Azure that can facilitate fast development? For example, if there is something else that can help with fast streaming of content, that would be good to know.
Thanks.
As you've noted, the CDN and caching will definitely help you. However, I would mostly look at the CDN. I would use caching for relatively small chunks of data (such as database-driven lists of details, e.g. lists of cities or countries) or for slowly-changing data. I would not put large media content in AppFabric Cache.
As for leveraging blobs/CDN for streaming, you may want to check this example.
UPDATE
Well, will you have some photo/video manipulation on the server side? Or is whatever people (users or admins) upload served to users as-is?
If not, then there is nothing additional.
If you will have some image/video processing on the server side, however, I suggest that you split your app into a Web Role (for user upload/download/streaming) and a Worker Role (for processing). You can check out this lab to get an understanding of how to decouple the web from the worker role and how to submit work items to the worker.
The reason to have a separate worker role for processing is to be able to independently scale either the web or the worker on demand.
