There is a server in the middle. We want to send live video to it and, on the other hand, watch it over HTTPS at a URL like https://server/live.mp4.
What protocols can be used for this purpose?
I have done this experimentally with Node.js dgram and ffmpeg over raw UDP and it worked fine, but stability and security are issues that must be addressed.
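For reference, a rough sketch of that experimental setup, assuming MPEG-TS packets arrive over UDP and ffmpeg is available on the server to repackage them as HLS. The port, output path and ffmpeg options are illustrative assumptions, not a production setup, and packet loss/reordering is not handled:

```
// Receive raw UDP packets with Node.js dgram and pipe them into ffmpeg,
// which repackages the stream as HLS that a web server can expose over HTTPS.
const dgram = require('dgram');
const { spawn } = require('child_process');

const UDP_PORT = 5000; // assumed ingest port

const ffmpeg = spawn('ffmpeg', [
  '-i', 'pipe:0',          // read the incoming stream from stdin
  '-c', 'copy',            // no re-encoding, just repackaging
  '-f', 'hls',
  '-hls_time', '4',        // 4-second segments
  '-hls_list_size', '6',   // keep a short live window in the playlist
  '/var/www/live/live.m3u8'
]);

const socket = dgram.createSocket('udp4');
socket.on('message', (packet) => {
  ffmpeg.stdin.write(packet); // forward each UDP datagram to ffmpeg
});
socket.bind(UDP_PORT);
```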
Most live (or VOD) video services, where quality is important and/or there is a large number of users, currently use Adaptive Bit Rate (ABR) streaming protocols.
The two leading ones are HLS and MPEG DASH.
As a very high level general rule, noting that there are exceptions:
Android, Chrome Browser, Edge Browser - use MPEG DASH
iOS, Safari browser - use HLS
The introduction of CMAF (Common Media Application Format) has consolidated the two formats, making life easier for service providers: the media streams can be the same for both, and only the manifest or index files pointing to the streams are specific to DASH and HLS. Unfortunately, encrypted-stream support for CMAF has not yet rolled out across all consumer devices, so many services cannot use it fully yet.
On the security side, nearly all services use DRM to encrypt content and control entitlements. The leading DRMs are Google Widevine, Apple FairPlay and Microsoft PlayReady. Again, these are generally platform specific, with the usual usage being (see the probe sketch below the list):
Android, Chrome browser - Widevine
Edge Browser, X-Box - PlayReady
iOS, Safari - FairPlay
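As an illustration of how a web player would discover which of these DRMs the current platform supports, here is a hedged sketch using the standard Encrypted Media Extensions capability probe (the capability object is a minimal assumption; real players request the exact codecs and robustness levels they need):

```
// Probe the platform for the common DRM key systems via EME.
const keySystems = [
  'com.widevine.alpha',      // Widevine (Android, Chrome)
  'com.microsoft.playready', // PlayReady (Edge, Xbox)
  'com.apple.fps.1_0',       // FairPlay (Safari; older Safari versions only expose a WebKit-prefixed API)
];

const config = [{
  initDataTypes: ['cenc'],
  videoCapabilities: [{ contentType: 'video/mp4; codecs="avc1.42E01E"' }],
}];

for (const keySystem of keySystems) {
  navigator.requestMediaKeySystemAccess(keySystem, config)
    .then(() => console.log(keySystem, 'is supported'))
    .catch(() => console.log(keySystem, 'is not supported'));
}
```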
There is an audio stream which is sent from a mobile device to the server, and the server sends chunks of data (via WebSockets) to the web client.
The question is: what should we use to play this audio in live mode? There should also be a possibility to rewind, listen to what was played earlier, and then switch back to live mode.
I considered possibilities such as the Media Source API, but it is not supported by Safari and Chrome on iOS, is it? And we need that support.
There is also the Web Audio API, which is supported by modern browsers, but I'm not sure whether it allows listening to audio in live mode and rewinding it.
Any ideas or guides on how to implement it?
I considered possibilities such as the Media Source API, but it is not supported by Safari and Chrome on iOS, is it? And we need that support.
Then, you can't use MediaSource Extensions. Thanks Apple!
And the server sends chunks of data (via WebSockets) to the web client.
Without MediaSource Extensions, you have no way of using this data from a web socket connection. (Unless it's PCM, or you're decoding it to PCM, in which case you could use the Web Audio API, but this is totally impractical, inefficient, and not something you should pursue.)
You have to change how you're streaming. You have a few choices:
Best Option: HLS
If you switch to HLS, you'll get the compatibility you need, as well as the ability to go back in time and what not. This is what you should do.
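A hedged sketch of what HLS playback looks like in the browser, assuming a library such as hls.js on platforms without native HLS support. The stream URL is a placeholder, and "rewinding" just means seeking within whatever DVR window your playlist retains:

```
// Play a live HLS stream; Safari/iOS play HLS natively, other browsers use hls.js (MSE-based).
const audio = document.querySelector('audio');
const src = 'https://example.com/live/stream.m3u8'; // placeholder URL

if (audio.canPlayType('application/vnd.apple.mpegurl')) {
  audio.src = src;                 // native HLS (Safari, iOS)
} else if (Hls.isSupported()) {
  const hls = new Hls();
  hls.loadSource(src);
  hls.attachMedia(audio);
}

// "Rewind" by seeking back within the DVR window the playlist keeps,
// then jump back towards the live edge to resume live playback.
function rewind(seconds) {
  audio.currentTime = Math.max(audio.currentTime - seconds, 0);
}
function backToLive() {
  if (audio.seekable.length) {
    audio.currentTime = audio.seekable.end(audio.seekable.length - 1);
  }
}
```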
Mediocre Option: HTTP Progressive
This is a fine way to stream for most use cases but there isn't any built-in way to handle the stream seeking that you want. You'd have to build it, which is not worth your time since you could just use HLS.
Even More Mediocre Option: WebRTC
You could switch to WebRTC for streaming, but you have greatly increased infrastructure costs and complexity. And, you still need to figure out how you're going to handle seeking. The only reason you'd want to go the WebRTC route is if you absolutely needed the lowest latency.
I'm using the VideoJS player and have a CMAF video, so I am using both HLS and DASH. I'm also using all three types of DRM: FairPlay for HLS, Widevine and PlayReady for DASH.
My question is: should I include both the HLS and DASH sources in the player and let the player decide which one to play, or should I detect the browser and only insert the correct URL based on that? The same goes for DRM: can I just add all the DRMs to the player, or should I only add the one that applies?
The use of HLS vs DASH is typically dictated by the end device and client capabilities and rules.
iOS and Safari typically use HLS and FairPlay; Android, Firefox and Chrome use DASH and Widevine; and Windows and Edge use DASH and PlayReady.
Note that Widevine and PlayReady can use the same DASH stream - CENC, Common Encryption standard, allows the same stream to include both Widevine and PlayReady DRM information.
At this time, Apple iOS devices must use HLS for content greater than 10 mins over a mobile network:
2.5.7 Video streaming content over a cellular network longer than 10 minutes must use HTTP Live Streaming and include a baseline 192 kbps HTTP Live stream.
(https://developer.apple.com/app-store/review/guidelines/)
For this reason streams served to Apple devices are usually HLS, while DASH is used for other devices.
CMAF greatly reduces the impact of this by allowing the same segmented media stream be used for both HLS and DASH, with just the 'index' or manifest files being specific to each protocol.
For encrypted content it is a bit more complicated. At this time, FairPlay uses a different AES encryption mode, AES-CBC, from Widevine and PlayReady, which use AES-CTR. This means you still need two copies of the media to serve encrypted content streams.
This is changing, as Widevine and PlayReady have now announced support for AES-CBC as well as AES-CTR, but it will take some time for this to roll out to deployed devices.
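One way to act on this in the player setup is to detect what the platform supports and hand the player only the matching source and key system. A hedged sketch: the URLs are placeholders, and the exact option names for wiring the chosen key system into Video.js depend on the EME plugin you use (for example videojs-contrib-eme):

```
// Decide between the FairPlay/HLS pair and the CENC (Widevine/PlayReady)/DASH pair
// by asking the browser what it can actually do.
async function pickSource() {
  const hlsSrc  = { src: 'https://cdn.example.com/asset/master.m3u8', type: 'application/x-mpegURL' };
  const dashSrc = { src: 'https://cdn.example.com/asset/manifest.mpd', type: 'application/dash+xml' };

  const video = document.createElement('video');
  const cencConfig = [{
    initDataTypes: ['cenc'],
    videoCapabilities: [{ contentType: 'video/mp4; codecs="avc1.42E01E"' }],
  }];

  // Safari/iOS: native HLS + FairPlay.
  if (video.canPlayType('application/vnd.apple.mpegurl')) {
    return { source: hlsSrc, keySystem: 'com.apple.fps.1_0' };
  }
  // Everything else: DASH + whichever CENC DRM the browser exposes.
  for (const keySystem of ['com.widevine.alpha', 'com.microsoft.playready']) {
    try {
      await navigator.requestMediaKeySystemAccess(keySystem, cencConfig);
      return { source: dashSrc, keySystem };
    } catch (e) { /* try the next key system */ }
  }
  throw new Error('No supported source/DRM combination found');
}
```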
It seems PlayReady can't play HLS on platforms other than iOS, but I have found no clear evidence for this.
It seems PlayReady can't play HLS on platforms other than iOS, but I have found no clear evidence for this.
That is a wrong assumption.
PlayReady is a DRM technology; HLS is a streaming technology. These technologies work at different levels of processing during video playback. The DRM technology does not know anything at all about the streaming technology being used (or about content delivery in general).
HLS is an adaptive streaming technology which lets you obtain content from the server and dynamically change the quality based on device capabilities or network conditions. HLS works at the level of MPEG-2 TS or fragmented MP4 streams.
PlayReady is a DRM technology protecting the content. It does not protect the whole segments or fragments delivered by adaptive streaming: you first need to demux the transport container (fMP4 or MPEG-2 TS) and get an elementary stream of samples, and it is the individual samples (or sub-samples) that are protected by the DRM.
There is a clear separation between DRM and adaptive streaming technologies, which allows them to be mixed. The only place where the two meet is the adaptive streaming manifest, where you may have a description of the protection header or key IDs that may be DRM specific.
Adaptive streaming technologies are not bound to a platform - you can write an HLS player from scratch on any platform. The situation is not the same for DRM: you either work on a platform where the DRM technology is already present, or you have access to a porting kit and go through the process of porting it to the platform and meeting the robustness requirements (but usually only device manufacturers have the resources to go down this path).
So can you use PlayReady and HLS on another platform? Definitely! But in most cases the answer is more like: you can use HLS with PlayReady on all platforms where a PlayReady port is already available to you.
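To illustrate that separation in a browser context: the segments can arrive any way you like (for example via a JavaScript HLS component feeding Media Source Extensions), while the DRM side is handled independently through the Encrypted Media Extensions 'encrypted' event. A hedged sketch, with a placeholder license server URL:

```
// The streaming layer (HLS/DASH) feeds media to the <video> element;
// the DRM layer reacts to 'encrypted' events and talks to the license server.
const video = document.querySelector('video');

video.addEventListener('encrypted', async (event) => {
  const access = await navigator.requestMediaKeySystemAccess('com.microsoft.playready', [{
    initDataTypes: [event.initDataType],
    videoCapabilities: [{ contentType: 'video/mp4; codecs="avc1.42E01E"' }],
  }]);
  const mediaKeys = await access.createMediaKeys();
  await video.setMediaKeys(mediaKeys);

  const session = mediaKeys.createSession();
  session.addEventListener('message', async (msg) => {
    // Forward the key request to your PlayReady license server (placeholder URL).
    const response = await fetch('https://license.example.com/playready', {
      method: 'POST',
      body: msg.message,
    });
    session.update(await response.arrayBuffer());
  });
  await session.generateRequest(event.initDataType, event.initData);
});
```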
I think your confusion comes from Microsoft providing the iOS PlayReady Client SDK. It is a ready-to-use player with PlayReady and HLS support; you don't need to port anything, you just pay the license and use the player. You can still consume HLS-delivered content protected with PlayReady in, for example, a Windows 10 UWP application. You can also have a JavaScript HLS implementation and use it with PlayReady in the Internet Explorer or Edge browsers. You only need to write (or use an existing) HLS component.
I'm quite sure that many Smart TV manufacturers also have a ready-to-use HLS player with support for PlayReady.
PlayReady can be used on iOS, via an SDK like the official Microsoft PlayReady iOS SDK:
https://www.microsoft.com/playready/features/ClientOptions.aspx
Here are the supported streaming formats for that SDK at the time of writing:
iOS platform
Includes a basic reference media player to build a final app
Smooth Streaming (VoD/Live)
MPEG-DASH (ISOBFF, VoD/Live)
Key rotation and blackouts
Support for HLS on iOS (VoD/Live)
PlayReady ND-Receiver functionality on iOS clients
I am trying to build a website and mobile apps (iOS, Android) for an internet radio station.
Website users broadcast their music or radio, and mobile users just listen to radio stations and chat with other listeners.
I spent a week researching and made a prototype with the Wowza engine (using HLS and RTMP) and a SHOUTcast server on Amazon EC2.
Using HLS gives a delay of about 5 seconds, but RTMP and SHOUTcast have a 2-second delay.
With these results I think I should choose RTMP or SHOUTcast, but I am not sure they are the best protocols. :(
What protocol should I choose?
Do I need to provide various protocols to cover all platforms?
This is a very broad question. Let's start with the distribution protocol.
Streaming Protocol
HLS has the advantage of allowing users to get the stream in the bitrate that is best for their connection. Clients can scale up/down seamlessly without stopping playback. This is particularly important for video, but for audio even mobile clients are capable of playing 128kbit streams in most areas. If you intend to have a variety of bitrates available and want to change quality mid-stream, then HLS is a good protocol for you.
The downside of HLS is compatibility. iOS supports it, but that's about it. Android has HLS support but it is still buggy. (Maybe in another year or two once all the Android 3.0 folks are gone, this won't be as much of an issue.) JWPlayer has some hacks to make HLS work in Flash for desktop users.
I wouldn't bother with RTMP unless you're only concerned with Flash users.
Pure progressive streaming with HTTP is the route I almost always choose to go. Everything can play it. (Even my Palm Pilot's default media player from 12 years ago.) It's simple to implement and well understood.
SHOUTcast is effectively HTTP, but a poorly implemented version that has compatibility issues, particularly on mobile devices. It has a non-standard status line in its response which breaks a lot of clients. Icecast is a good alternative, and is what I would recommend for production use today. As another option, I have created my own streaming service called AudioPump which is HTTP as well, and has been specifically built to fix compatibility with oddball mobile clients, such as native Android players on old hardware. It isn't generally available yet, but you can contact me at brad#audiopump.co if you want to try it.
Latency
You mentioned a latency of 2 seconds being desirable. If you're getting 2-second latency with SHOUTcast, something is wrong. You don't want latency that low, particularly if you're streaming to mobile clients. I usually start with a 20-second buffer at a minimum, which is flushed to the client as fast as it can receive it. This enables immediate starting of the stream playback (as it fills up the client-side buffer so it can begin decoding) while providing some protection against buffer underruns due to network conditions. It's not uncommon for mobile users to walk around the corner of a building and lose their nice signal quality. You want your stream to survive that as best as possible, so if you have already sent the data to cover the drop out, the user doesn't have to know or care that their connection became mediocre for a short period of time.
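To make this burst-on-connect approach concrete, here is a rough Node.js sketch of an Icecast-style HTTP audio endpoint: it keeps a rolling buffer of the most recent MP3 data and flushes it to each new listener before relaying live chunks. How the encoded audio reaches this process is left out, and the buffer size is an illustrative assumption:

```
// Minimal HTTP progressive audio endpoint with a burst-on-connect buffer.
const http = require('http');

const BURST_BYTES = 320 * 1024;   // ~20 s of a 128 kbit/s MP3 stream (assumption)
let burstBuffer = Buffer.alloc(0);
const listeners = new Set();

// Call this with each encoded MP3 chunk coming from your encoder/ingest.
function onAudioChunk(chunk) {
  burstBuffer = Buffer.concat([burstBuffer, chunk]);
  if (burstBuffer.length > BURST_BYTES) {
    burstBuffer = burstBuffer.slice(burstBuffer.length - BURST_BYTES);
  }
  for (const res of listeners) res.write(chunk);
}

http.createServer((req, res) => {
  res.writeHead(200, {
    'Content-Type': 'audio/mpeg',  // plain audio/mpeg plays almost everywhere
    'Cache-Control': 'no-cache',
  });
  res.write(burstBuffer);          // burst so playback can start immediately
  listeners.add(res);
  req.on('close', () => listeners.delete(res));
}).listen(8000);
```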
If you do require low latency, you're looking at the wrong technology entirely. For low latency, check out WebRTC.
You certainly can tweak your traditional internet radio setup to reduce latency, but rarely is that a good idea.
Codec
Codec choice is what will dictate your compatibility more than anything else. MP3 is easily the most compatible, and AAC isn't far behind. If you go with AAC, you get better quality audio for a given bitrate. Most folks use this to reduce their bandwidth bill.
There are licensing fees with MP3, and there may be with AAC depending on what you're using for a codec. Check with a lawyer. I am not one, and the licensing is extremely complicated.
Other codecs include Vorbis and Opus. If you can use Opus, do so as the licensing is wide open and you get good quality for the bandwidth. Client compatibility here though is the killer of Opus. (Maybe in a few years it will be better.) Vorbis is a mediocre codec, but is free and clear.
On the extreme end, I have some stations doing their streaming in FLAC. This is lossless audio quality, but you're paying for 8x the bandwidth compared with a medium-quality MP3 station. FLAC-over-HTTP streaming compatibility is not good at the moment, but it works alright in VLC.
It is very common to support multiple codecs for your streams. Depending on your budget, if you can't do that, you're best off with MP3.
Finally on encoding, don't go from a lossy codec to another lossy codec if you can help it. Try to get the output stream as close to the input as possible. If you re-encode audio, you lose quality every time.
Recording from Browser
You mentioned users streaming from a browser. I built something like this a couple years ago with the Web Audio API where the audio is captured and then encoded and sent off to Icecast/SHOUTcast servers. Check it out here: http://demo.audiopump.co:3000/ A brief explanation of how it works is here: https://stackoverflow.com/a/20850467/362536
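For a rough idea of the browser side, here is a hedged sketch. Note that it uses MediaRecorder rather than the Web Audio API encoding pipeline the demo above is built on, and the upload endpoint is a placeholder:

```
// Capture microphone audio in the browser and ship encoded chunks to a server.
async function startBroadcast() {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const recorder = new MediaRecorder(stream); // browser picks a supported codec (often Opus/WebM)

  recorder.addEventListener('dataavailable', (event) => {
    // POST each chunk to the ingest endpoint (placeholder URL);
    // the server side would transcode/repackage for listeners.
    fetch('https://ingest.example.com/stream', { method: 'POST', body: event.data });
  });

  recorder.start(1000); // emit a chunk roughly every second
}
```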
Anyway, I hope this helps you get started.
Streaming straight audio/mpeg (mp3 packets) has worked everywhere I've tried.
If you are developing an app, go with AAC; if you are simply playing via a web browser, you need an HTML5 implementation, which means MP3. Custom protocols like RTMP or SHOUTcast require additional UI to be built. There are some third-party players available in the open source world; you can either use them or stick to HTML5 MP3/OGG, since most people nowadays are using the Chrome browser or other HTML5-compliant browsers.
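For the plain HTML5 route, a minimal hedged example (the stream URL is a placeholder):

```
// Play an MP3 (HTTP progressive) stream with the built-in HTML5 audio element.
const player = new Audio('https://radio.example.com/stream.mp3'); // placeholder URL
player.play(); // play() usually needs to be triggered by a user gesture due to autoplay policies
```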
I'm aware that the developer preview of the Chromecast receiver does not fully support Smooth Streaming manifest URLs (see Update #1).
I have tested content provided by the Microsoft PlayReady(TM) Test Server - Smooth Streaming assets - using the sample receiver app provided in the GitHub project.
Smooth Streaming Support
As expected, the manifest file does not work (see Update #1), but I was able to play an individual ismv file (only at low bitrates). When I use a higher bitrate, the video container stays black.
PlayReady Support
When I tried to play a PlayReady-protected low-bitrate ismv file, I was expecting some sort of callback to MediaProtocolMessageStream.onKeyRequested(), but that did not happen. Here is my Android CustomMediaProtocolMessageStream implementation.
So, does anybody know how PlayReady or Widevine is supposed to work with Chromecast? I have seen that Netflix invokes some binary shell command when its app is loaded on the Chromecast, but I assume they worked with Google to accomplish this.
Additional SO Resources
How to play smooth streaming video in Chromecast?
Is it actually possible to play SmoothStreaming videos on Chromecast without using (format=mpd-time-csf)?
Playing Smoothstreaming URL by providing Manifest file of smoothstreaming to Chromecast device
Update #1
Based on Les Vogel's answer, the smooth streaming manifest file for adaptive bitrate streaming is supported by Chromecast; you need a custom player to handle it.
As far as I am aware, there are currently two JS players which can handle that, but I don't know whether they will work on Chromecast.
dash.js - By DASH Industry Forum (https://github.com/Dash-Industry-Forum/dash.js)
Microsoft HTML5 Player Framework - Part of Microsoft Media Platform (http://playerframework.codeplex.com/)
Currently, you need to write your own media player to support adaptive bitrate streaming on Chromecast.
Unfortunately, the MS test server assets do not correctly provide a CORS header, which would be needed if you wrote a javascript player.
PlayReady and Widevine are both supported. We'll be providing additional documentation shortly.
EDIT: We announced the beta of the Cast Media Player Library today (2/3/14) - it supports HLS, SmoothStreaming, and MPEG-DASH.
Yes, you can use "com.microsoft.playready" for PlayReady and "com.widevine.alpha" for Widevine.