Chromecast support for Smooth Streaming with PlayReady - google-cast

I'm aware that the developer preview of the Chromecast receiver does not fully support Smooth Streaming manifest URLs (see Update #1).
I have tested content from the Microsoft PlayReady(TM) Test Server - Smooth Streaming assets - using the sample receiver app provided in the GitHub project.
Smooth Streaming Support
As expected, the manifest file does not work (see Update #1). But I was able to play individual ismv files (only at low bitrates). When I use a higher bitrate, the video container stays black.
PlayReady Support
When I tried to play a PlayReady-protected low-bitrate ismv file, I was expecting some sort of callback to MediaProtocolMessageStream.onKeyRequested(). But that did not happen. Here is my Android CustomMediaProtocolMessageStream implementation.
So, does anybody know how PlayReady or Widevine is supposed to work with Chromecast? I have seen that Netflix invokes some binary shell command when its app is loaded on Chromecast, but I assume they worked with Google to accomplish this.
Additional SO Resources
How to play smooth streaming video in Chromecast?
Is it actually possible to play SmoothStreaming videos on Chromecast without using (format=mpd-time-csf)?
Playing Smoothstreaming URL by providing Manifest file of smoothstreaming to Chromecast device
Update #1
Based on Les Vogel's answer, Smooth Streaming manifest files for adaptive bitrate streaming are supported by Chromecast, but you need a custom player to handle them.
As far as I am aware, there are currently two JS players which can handle that, but I don't know if they will work on Chromecast.
dash.js - By DASH Industry Forum (https://github.com/Dash-Industry-Forum/dash.js)
Microsoft HTML5 Player Framework - Part of Microsoft Media Platform (http://playerframework.codeplex.com/)
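For illustration, wiring dash.js to a video element takes only a few lines. This is a hedged sketch based on the current dash.js API (`dashjs.MediaPlayer().create()` / `initialize`); the API available at the time of the original post may have differed, and the manifest URL would be your own:

```javascript
// Sketch: attaching dash.js to a <video> element (browser-only).
// Assumes dash.js has been loaded, e.g. via <script src="dash.all.min.js">.
function attachDashPlayer(videoElement, manifestUrl) {
  const player = dashjs.MediaPlayer().create();
  player.initialize(videoElement, manifestUrl, /* autoplay */ true);
  return player;
}

// Small helper to sanity-check the manifest type before wiring a player.
// ".mpd" is the usual DASH manifest extension; Smooth Streaming manifests
// typically end in "/Manifest".
function manifestKind(url) {
  if (/\.mpd(\?|$)/i.test(url)) return "dash";
  if (/\/Manifest(\(|\?|$)/i.test(url)) return "smooth";
  return "unknown";
}
```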

Currently, you need to write your own media player to support adaptive bitrate streaming on Chromecast.
Unfortunately, the MS test server assets do not correctly provide a CORS header, which would be needed if you wrote a JavaScript player.
PlayReady and Widevine are both supported. We'll be providing additional documentation shortly.
EDIT We announced the beta of the Cast Media Player Library today (2/3/14) - it supports HLS, Smooth Streaming, and MPEG-DASH.

Yes, you can use "com.microsoft.playready" for PlayReady and "com.widevine.alpha" for Widevine.
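To illustrate those key system strings, here is a hedged sketch of probing for them with the standard EME API (navigator.requestMediaKeySystemAccess); the codec string and configuration values are assumptions for the example, not part of the original answer:

```javascript
// Key system identifiers from the answer above.
const KEY_SYSTEMS = ["com.microsoft.playready", "com.widevine.alpha"];

// Build a minimal EME configuration; the contentType codec string is an
// example value and would depend on your actual media.
function buildKeySystemConfig(contentType) {
  return [{
    initDataTypes: ["cenc"],
    videoCapabilities: [{ contentType }],
  }];
}

// Browser-only: probe which DRM key system the platform supports.
async function probeKeySystems() {
  const config = buildKeySystemConfig('video/mp4; codecs="avc1.42E01E"');
  for (const ks of KEY_SYSTEMS) {
    try {
      await navigator.requestMediaKeySystemAccess(ks, config);
      return ks; // first supported key system
    } catch (e) {
      // not supported on this platform; try the next one
    }
  }
  return null;
}
```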

Related

What protocol should be used to broadcast live video?

A server is in the middle, and we want to send live video to it, and on the other hand watch it over HTTPS at a URL like https://server/live.mp4.
What protocols can be used for this purpose?
I used to do this experimentally with Node.js dgram and ffmpeg over raw UDP, and it worked fine, but stability and security are issues that must be addressed.
Most live (or VOD) video services, where quality is important and/or there are a large number of users, use Adaptive Bit Rate streaming protocols at this time.
The two leading ones are HLS and MPEG DASH.
As a very high level general rule, noting that there are exceptions:
Android, Chrome Browser, Edge Browser - use MPEG DASH
iOS, Safari browser - use HLS
The introduction of CMAF (Common Media Application Format) has consolidated the two formats, making life easier for service providers: the media streams can be the same for both, and just the manifest or index files pointing to the streams are specific to DASH and HLS. Unfortunately, encrypted stream support for CMAF is not yet rolled out across all consumer devices, so many services cannot use it fully yet.
On the security side, nearly all services use DRM to encrypt content and control entitlements. The leading DRMs are Google Widevine, Apple FairPlay and Microsoft PlayReady. Again, these are generally platform specific, with the usual usage being:
Android, Chrome browser - Widevine
Edge Browser, X-Box - PlayReady
iOS, Safari - FairPlay

What is the recommended way to use HLS and DASH + DRM in a player?

I'm using the VideoJS player and have a CMAF video, so I am using both HLS and DASH. I'm also using all three types of DRM: FairPlay for HLS, Widevine and PlayReady for DASH.
My question is: should I include both HLS and DASH sources in the player and let the player decide which one to play, or should I detect the browser and only insert the correct URL based on that? The same question applies to DRM: can I just add all DRM configurations to the player, or should I only add the one that applies?
The use of HLS vs DASH is typically dictated by the end device and client capabilities and rules.
iOS and Safari typically use HLS and FairPlay; Android, Firefox and Chrome use DASH and Widevine; and Windows and Edge use DASH and PlayReady.
Note that Widevine and PlayReady can use the same DASH stream - CENC, the Common Encryption standard, allows the same stream to include both Widevine and PlayReady DRM information.
At this time, Apple iOS devices must use HLS for content greater than 10 mins over a mobile network:
2.5.7 Video streaming content over a cellular network longer than 10 minutes must use HTTP Live Streaming and include a baseline 192 kbps HTTP Live stream.
(https://developer.apple.com/app-store/review/guidelines/)
For this reason streams served to Apple devices are usually HLS, while DASH is used for other devices.
CMAF greatly reduces the impact of this by allowing the same segmented media stream be used for both HLS and DASH, with just the 'index' or manifest files being specific to each protocol.
For encrypted content it is a bit more complicated. At this time, FairPlay uses a different AES encryption mode, AES-CBC, than Widevine and PlayReady, which use AES-CTR. This means you still need two copies of the media to serve encrypted content streams.
This is changing as Widevine and PlayReady now have announced support for AES-CBC as well as AES-CTR, but it will take some time for this to roll out to deployed devices.
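The platform mapping described above can be reduced to a simple selection function. This is a hypothetical sketch: the user-agent heuristics are illustrative only (real players typically probe EME/native HLS support instead), and the key system strings are the standard identifiers for each DRM:

```javascript
// Map a user-agent string to the usual streaming format + DRM pairing
// described above. Coarse heuristic for illustration only.
function pickStreamAndDrm(userAgent) {
  const ua = userAgent.toLowerCase();
  // Safari/iOS: HLS + FairPlay. Chrome ships "safari" in its UA, so
  // exclude Chromium-family browsers explicitly.
  const isApple =
    /iphone|ipad|ipod/.test(ua) ||
    (/safari/.test(ua) && !/chrome|chromium|edg/.test(ua));
  if (isApple) return { format: "hls", drm: "com.apple.fps" };
  // Edge: DASH + PlayReady.
  if (/edg/.test(ua)) return { format: "dash", drm: "com.microsoft.playready" };
  // Everything else (Chrome, Firefox, Android): DASH + Widevine.
  return { format: "dash", drm: "com.widevine.alpha" };
}
```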

Does PlayReady play encrypted HLS on iOS only?

It seems PlayReady can't play HLS on platforms other than iOS, but I have found no obvious proof.
It seems PlayReady can't play HLS on platforms other than iOS but no obvious proofs found.
That is a wrong assumption.
PlayReady is a DRM technology; HLS is a streaming technology. These technologies work at different levels of processing during video playback. The DRM technology doesn't even know anything about the streaming technology used (or about content delivery in general).
HLS is an adaptive streaming technology which lets you obtain content from the server and dynamically change the quality based on device capabilities or network conditions. HLS works at the level of MPEG-2 TS or fragmented MP4 streams.
PlayReady is a DRM technology protecting the content. It does not protect the whole segments or fragments delivered by adaptive streaming. You first need to demux the transport container (fMP4 or MPEG-2 TS) and get an elementary stream of samples. Individual samples (or sub-samples) are protected by the DRM.
There is a clear separation between DRM and adaptive streaming technologies which allows mixing them. The only place where these technologies meet is the adaptive streaming manifest, where you may have a description of the protection header or key IDs, which may be DRM specific.
Adaptive streaming technologies are not bound to a platform - you can write an HLS player from scratch on any platform. The situation is not the same for DRM. You either work on a platform where the DRM technology is already present, or you have access to a porting kit and go through the process of porting it to the platform and meeting the robustness requirements (but usually only device manufacturers have the resources to go down this path).
So can you use PlayReady and HLS on another platform? Definitely! But in most cases the answer is more like: you can use HLS with PlayReady on all platforms where a PlayReady port is already available to you.
I think your confusion comes from Microsoft providing the iOS PlayReady Client SDK. It is a ready-to-use player with PlayReady and HLS support. You don't need to port anything; you just pay the license and use the player. You can still consume HLS-delivered content protected with PlayReady, for example in a Windows 10 UWP application. You can also have a JavaScript HLS implementation and use it with PlayReady in the Internet Explorer or Edge browsers. You only need to write (or use an existing) HLS component.
I'm quite sure that many Smart TV manufacturers also have ready-to-use HLS players with support for PlayReady.
PlayReady can be used on iOS, via an SDK like the official Microsoft PlayReady iOS SDK:
https://www.microsoft.com/playready/features/ClientOptions.aspx
Here are the supported streaming formats for that SDK at the time of writing:
iOS platform
Includes a basic reference media player to build a final app
Smooth Streaming (VoD/Live)
MPEG-DASH (ISOBFF, VoD/Live)
Key rotation and blackouts
Support for HLS on iOS (VoD/Live)
PlayReady ND-Receiver functionality on iOS clients

Can anyone explain the correlation between MSE, DASH and HLS?

I am new to media streaming, just started learning about adaptive streaming.
I have a few queries; please clarify:
Does MSE support only DASH streaming? I mean, if a website uses DASH and my browser supports MSE with DASH, it will play. But if a website uses HLS, then my browser does not play the video content, although it has MSE.
Is that because MSE does not support HLS, or because my browser's MSE does not have an implementation of HLS?
If I inspect a webpage playing a video stream, I see many sites use a video tag with the "src" attribute set to a blob. Does a blob mean it is using MSE?
Can we have a blob in the "src" attribute for DASH (I checked on YouTube) and for HLS (as on Dailymotion or Twitch) as well?
I was reading a few articles on twitch.tv; does twitch.tv only support HLS through an HTML5 player or Flash? If a browser supports neither Flash nor HLS through an HTML5 player, is there no way to play twitch.tv content in the browser?
Thanks
MediaSource Extensions (MSE) supports anything you can de-mux in JavaScript and send to the browser's native codecs. Browsers don't support DASH natively. Some browsers support HLS natively but most don't. It is possible to use both DASH and HLS in browsers that support MSE with the correct JavaScript library for handling each.
The blob you see could be a regular blob (an immutable chunk of binary), but more than likely it's coming from MSE.
I can't speak to what Twitch does internally.
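As a concrete illustration of the point above about blobs and MSE, here is a hedged sketch of how a blob: src typically arises (browser-only; the MIME/codec strings are example assumptions, and a real player library would do the fragment fetching):

```javascript
// Sketch: the blob: URL you see in a video tag's src attribute is usually
// produced by URL.createObjectURL on a MediaSource object (browser-only).
function attachMediaSource(videoElement, mimeType) {
  const mediaSource = new MediaSource();
  videoElement.src = URL.createObjectURL(mediaSource); // src becomes "blob:..."
  mediaSource.addEventListener("sourceopen", () => {
    const sourceBuffer = mediaSource.addSourceBuffer(mimeType);
    // A player library (dash.js, hls.js, ...) would now fetch media
    // fragments and call sourceBuffer.appendBuffer(arrayBuffer) for each.
  });
  return mediaSource;
}

// Helper to build the MIME string passed to addSourceBuffer; the codec
// identifiers would come from the manifest in a real player.
function fmp4Mime(codecs) {
  return `video/mp4; codecs="${codecs.join(", ")}"`;
}
```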
Your questions don't really make sense as asked, so I can't answer 1, 2 and 3 directly, but I can clear up some of your confusion. HLS and DASH are each a collection of technologies, not single competing technologies. Most HTTP-based streaming protocols are made up of a binary video format and a text-based manifest format. DASH uses an overly complex XML manifest format with a fragmented MP4 video format. HLS uses an m3u8 manifest with fragmented transport streams for the video format; as of iOS 10, HLS also supports fragmented MP4.
MSE can play fragmented MP4, but browsers don't read manifests. Hence a player application must be used to download and parse the manifest, download the video fragments, and then hand them to the browser to play. Twitch uses HLS with transport streams, but runs custom software in the browser to convert them to MP4 fragments (or FLV streams in the case of Flash). When you see a src with a blob, that is a normal (non-fragmented) MP4, which is completely different. Safari is an exception: it can play HLS natively, using an m3u8 manifest as the source.

Azure Media Services player

I feel this is a really dumb question, but my research tells me I have to create my own player. Is that true?
I have a link (publish URL) from Azure Media services like this:
http://streamvideotest.streaming.mediaservices.windows.net/4ed49a08-f82d-462e-a05e-acea910064a5/7g91d5d8-b213-406c-90d8-75a3a5e2456d.ism/Manifest
I would like to hand this out to a few people to play the video, or the live feed on that channel. But you need some type of player? I've tried Windows Media Player (Open URL), but that always fails.
It depends on which platform you want to reach.
If you are trying to play a live stream on a PC using Smooth Streaming, you could use the Silverlight Player (http://smf.cloudapp.net/healthmonitor). Or, if you want to stream DASH through a modern browser (IE or Chrome), you could do it through the HTML5 video tag natively.
If you are trying to reach the iOS platform, you could do it natively by delivering an HLS stream - append (format=m3u8-aapl) to your link above.
This is an article that describes the different players you could use on different platforms: https://msdn.microsoft.com/en-us/library/azure/dn223283.aspx.
Additionally, Azure Media Services has just rolled out the Azure Media Player, which can detect the capabilities of the platform and feed in the right streaming protocol using the right player technology. Please check out http://azure.microsoft.com/blog/2015/04/15/announcing-azure-media-player/.
Thanks,
Mingfei Yan
The Azure Media Player has officially been rolled out at: http://amsplayer.azurewebsites.net/
Of course, Mingfei is the Queen of Azure Media Services and the most definitive answer to any and all AMS questions!
Just to add to this thread - as Mingfei mentioned, it depends on where you want your video to work. However, if you have a browser-based offering, we'd recommend starting with the Azure Media Player.
http://azure.microsoft.com/blog/2015/04/15/announcing-azure-media-player/