Can anyone explain the correlation between MSE, DASH and HLS?

I am new to media streaming and have just started learning about adaptive streaming.
I have a few questions; please clarify:
1. Does MSE support only DASH streaming? I mean, if a website uses DASH and my browser supports MSE with DASH, the video plays. But if a website uses HLS, my browser does not play the video content even though it has MSE. Is that because MSE does not support HLS, or because my browser's MSE implementation does not handle HLS?
2. When I inspect a webpage playing a video stream, I see that many sites use a video tag whose "src" attribute is a blob. Does a blob mean the site is using MSE? Can the "src" attribute be a blob for DASH (as I checked on YouTube) and for HLS (as on Dailymotion or twitch.tv) as well?
3. I have read a few articles about twitch.tv. Does twitch.tv only support HLS with an HTML5 player or Flash? If a browser supports neither Flash nor HLS through an HTML5 player, is there no way to play twitch.tv content in that browser?
Thanks

Media Source Extensions (MSE) support anything you can de-mux in JavaScript and send to the browser's native codecs. Browsers don't support DASH natively. Some browsers support HLS natively, but most don't. It is possible to use both DASH and HLS in browsers that support MSE, given the correct JavaScript library for handling each.
The blob you see could be a regular blob (an immutable chunk of binary), but more than likely it's coming from MSE.
I can't speak to what Twitch does internally.
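To make the MSE-and-blob connection concrete, here is a minimal sketch of what a player library does under the hood. The segment URLs and the codec string are made-up examples, not anything a particular site uses:

```javascript
// Minimal MSE sketch: the video's src becomes a blob: URL pointing at a
// MediaSource object, and the player appends fragmented MP4 segments to it.
// The segment URLs and codec string are made-up examples.
const video = document.querySelector('video');
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource); // this is the blob: you see in devtools

mediaSource.addEventListener('sourceopen', async () => {
  const sourceBuffer = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.42E01E, mp4a.40.2"');
  for (const url of ['init.mp4', 'seg1.m4s', 'seg2.m4s']) {
    const data = await fetch(url).then((r) => r.arrayBuffer());
    await new Promise((resolve) => {
      sourceBuffer.addEventListener('updateend', resolve, { once: true });
      sourceBuffer.appendBuffer(data);
    });
  }
  mediaSource.endOfStream();
});
```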

Your questions don't really make sense as asked, so I can't answer 1, 2, 3 directly. But I can clear up some of your confusion.

HLS and DASH are collections of technologies, not single competing technologies. Most HTTP streaming protocols are made up of a binary video format and a text-based manifest format. DASH uses an overly complex XML manifest format with a fragmented MP4 video format. HLS uses an m3u8 manifest with fragmented MPEG transport streams for the video format. As of iOS 10, HLS also supports fragmented MP4.

MSE can play fragmented MP4, but browsers don't read manifests. Hence a player application must be used to download and parse the manifest, download the video fragments, and then give them to the browser to play. Twitch uses HLS with transport streams, but runs custom software in the browser to convert them to MP4 fragments (or FLV streams in the case of Flash). When you see a src with a blob, that is a normal (not fragmented) MP4, which is completely different. Safari is an exception: it can play HLS using an m3u8 manifest as the source.
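As a sketch of what such a player application does, this is roughly how a library like hls.js is wired up in a page. The manifest URL here is a placeholder, and hls.js is just one example of such a library:

```javascript
// Sketch: playing HLS via MSE in browsers without native support.
// The manifest URL is a placeholder; hls.js is one example library.
const video = document.querySelector('video');
const src = 'https://example.com/live/master.m3u8';

if (video.canPlayType('application/vnd.apple.mpegurl')) {
  // Safari: the browser parses the m3u8 manifest itself.
  video.src = src;
} else if (Hls.isSupported()) {
  // Everyone else: hls.js downloads and parses the manifest, fetches the
  // segments, remuxes TS to fragmented MP4, and feeds the result to MSE.
  const hls = new Hls();
  hls.loadSource(src);
  hls.attachMedia(video);
}
```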

Related

When video or audio is played from a URI, is it streamed or downloaded fully and then played?

I am building a content creation site and I'm confused about audio and video.
If a content creator's audio or video is stored in S3 and I want to display their file, will the HTML video or audio player stream the media, or will it download the file fully and then play it?
I ask because the video or audio could be quite long, two hours for example, and I need to know how to handle that use case.
Lastly, what file type is most suitable for viewing on web pages? It seems like MPEG-4 is the best bet. Is that true?
Most video player clients and browsers will attempt to stream the video if they can.
For an mp4 video file hosted on a server, as long as the header is at the start of the file and the server accepts range requests, the player downloads the video in chunks and starts playing as soon as it has enough data to decode the first frames.
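One quick way to confirm a server supports those range requests is to look at its response headers. A small sketch, with a placeholder URL:

```javascript
// Sketch: checking whether a server supports the range requests that make
// progressive mp4 streaming possible. The URL is a placeholder.
const url = 'https://example-bucket.s3.amazonaws.com/video.mp4';

(async () => {
  const head = await fetch(url, { method: 'HEAD' });
  console.log(head.headers.get('Accept-Ranges')); // "bytes" means ranges are supported

  // Ask for just the first kilobyte; HTTP 206 Partial Content confirms it.
  const part = await fetch(url, { headers: { Range: 'bytes=0-1023' } });
  console.log(part.status); // 206 if the server honours the Range header
})();
```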
For more professional streaming services, they will generally use an adaptive bit rate streaming protocol like DASH or HLS (see this answer: https://stackoverflow.com/a/42365034/334402) and again the video will be streamed in chunks, or segments, and will start playing while it is streaming.
To answer your last question, you need to be aware that the raw video is encoded (e.g. H.264, VP9, etc.) and that the video, audio, subtitle, etc. tracks are stored in a video container (e.g. mp4, WebM, etc.).
The most common format at this time is probably H.264 encoding in an mp4 container.
The particular profile for h.264 can matter also depending on the device - baseline is probably the most supported profile at this time. You can find examples of media support for different devices online, e.g. for Android: https://developer.android.com/guide/topics/media/media-formats
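You can also probe support at runtime in the browser. The codec strings below are illustrative examples; "avc1.42E01E" is one way of writing H.264 Baseline profile, level 3.0:

```javascript
// Sketch: probing codec/container support in the current browser.
// The codec strings are examples; "avc1.42E01E" is H.264 Baseline, level 3.0.
const video = document.createElement('video');
console.log(video.canPlayType('video/mp4; codecs="avc1.42E01E, mp4a.40.2"')); // usually "probably"
console.log(video.canPlayType('video/webm; codecs="vp9, opus"'));             // "", "maybe" or "probably"
```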
Mick's answer is spot on. I'll just add that mp4 (with H.264 encoding) will work in just about every browser out there.
The issue with mp4 files (especially with a 2-hour-long movie) isn't so much the seeking and streaming. If your creator uploads a 4K video, that's what you'll deliver to everyone (even mobile phones). HLS streaming, on the other hand, has adaptive bitrates, where the video adapts to both the screen and the available network speed. You'll get better playback results with less buffering (and, if you're using AWS, a lot less data egress) with adaptive streaming.
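For illustration, an HLS master playlist lists several renditions and the player picks one based on measured bandwidth and screen size. The bitrates, resolutions and paths below are made up:

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
360p/playlist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2800000,RESOLUTION=1280x720
720p/playlist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=14000000,RESOLUTION=3840x2160
2160p/playlist.m3u8
```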
(there are a bunch of APIs and services that can help you do this - including api.video (where I work), Mux and others).

What protocol should be used to broadcast live video?

There is a server in the middle, and we want to send live video to it. On the other side, viewers watch it over HTTPS, e.g. https://server/live.mp4.
What protocols can be used for this purpose?
I have done this experimentally with Node.js dgram and ffmpeg over raw UDP, and it worked fine! But stability and security are issues that must be addressed.
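Roughly what I did experimentally looked like the sketch below. The port and the ffmpeg arguments are assumptions for illustration, not a tested pipeline:

```javascript
// Sketch of the experimental setup described above: receive raw video over UDP
// with Node's dgram module and pipe it into ffmpeg, which repackages it into a
// format that can be served over HTTP(S). Port and ffmpeg flags are assumptions.
const dgram = require('dgram');
const { spawn } = require('child_process');

const ffmpeg = spawn('ffmpeg', [
  '-i', 'pipe:0',            // read the incoming stream from stdin
  '-c', 'copy',              // no re-encode, just repackage
  '-f', 'hls', 'live.m3u8',  // write HLS segments + manifest for HTTPS serving
]);

const socket = dgram.createSocket('udp4');
socket.on('message', (packet) => ffmpeg.stdin.write(packet));
socket.bind(5000);
```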
Most live (or VOD) video services, where quality is important and/or there are a large number of users, use adaptive bitrate streaming protocols at this time.
The two leading ones are HLS and MPEG DASH.
As a very high level general rule, noting that there are exceptions:
Android, Chrome Browser, Edge Browser - use MPEG DASH
iOS, Safari browser - use HLS
The introduction of CMAF has consolidated the two formats, making life easier for service providers: the media streams can be the same for both, and just the manifest or index files for the streams are specific to DASH and HLS. Unfortunately, encrypted-stream support for CMAF is not yet rolled out across all consumer devices, so many services cannot use it fully yet.
On the security side, nearly all services use DRM to encrypt content and control entitlements. The leading DRMs are Google Widevine, Apple FairPlay and Microsoft PlayReady. Again, these are generally platform specific, with typical usage being:
Android, Chrome browser - Widevine
Edge Browser, X-Box - PlayReady
iOS, Safari - FairPlay
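A player can ask the browser which of these DRM systems it supports through the Encrypted Media Extensions API. A rough sketch, where the configuration object is a minimal example and the FairPlay key-system string varies by Safari version:

```javascript
// Sketch: probing which DRM key systems the current browser/device supports
// via Encrypted Media Extensions. The config is a minimal example.
const keySystems = [
  'com.widevine.alpha',       // Widevine (Android, Chrome)
  'com.microsoft.playready',  // PlayReady (Edge, Xbox)
  'com.apple.fps',            // FairPlay (Safari; exact string varies by version)
];

const config = [{
  initDataTypes: ['cenc'],
  videoCapabilities: [{ contentType: 'video/mp4; codecs="avc1.42E01E"' }],
}];

for (const ks of keySystems) {
  navigator.requestMediaKeySystemAccess(ks, config)
    .then(() => console.log(ks, 'supported'))
    .catch(() => console.log(ks, 'not supported'));
}
```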

What benefits does howler.js bring for a basic audio player in comparison to the `audio` element?

Preconditions:
Developing an audio player for a web application.
All target browsers fully support audio tag.
No need for sprites, multiple simultaneous sounds, etc.; just one audio track to be played back at a time.
The audio file has to be streamed from the server, not downloaded all at once; therefore the Web Audio API is not an option.
Why would I want to utilize howler.js or similar library instead of relying on the built-in audio tag in this scenario?
The only howler.js feature that is intriguing is “Handles edge cases and bugs across environments”.
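For reference, the built-in approach I mean is just something like this (the URL is a placeholder):

```javascript
// Sketch of the no-library baseline: the browser streams the file progressively.
// The URL is a placeholder.
const audio = new Audio('https://example.com/track.mp3');
audio.preload = 'metadata'; // don't fetch the whole file up front
audio.addEventListener('error', () => console.log('playback failed', audio.error));
audio.play(); // returns a promise; may reject without a user gesture (autoplay policy)
```

As far as I can tell, a library like howler.js mostly wraps this element and papers over cross-browser quirks such as autoplay unlocking and codec fallbacks, which seems to be what that feature line refers to.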

How to play live audio stream using Google Actions Dialogflow

I have been trying to find a way to play a live audio stream (MP3) using Google Actions, but haven't found a way to do so.
I tried Media Response as well but as mentioned in the documentation it doesn't support live stream.
I followed this thread but it doesn't have any examples to help me with.
Is it possible to play live mp3 stream using Google Actions?
I've had relatively good results with the Media Player being able to handle mp3 "streams". There are a couple of problems doing this, however:
There is a time limit on the audio playback (4 hours last time I checked, but it may have changed).
There isn't any such thing as an mp3 "stream". The player treats it as a single mp3 file that it downloads in chunks using HTTP headers, unlike some of the streaming protocols that allow for varying bitrate based on network and other conditions.
If this is an issue, one alternative might be to use the Interactive Canvas (which uses Chrome on the device) to present an HTML page that has an <audio> tag in it that you control (see the sketch after the list below). This gives you a little more control (most streaming protocols are either supported or have JavaScript libraries that can do the work), but there are some downsides:
This will only work on Smart Displays and Android. Smart Speakers aren't supported.
Interactive Canvas is only allowed for certain types of Actions. Currently it must be a game, a story, or an educational Action.
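For illustration, the Canvas page's audio handling could be as simple as this sketch, where the stream URL is a placeholder and any JS streaming library could sit behind it:

```javascript
// Sketch: the Interactive Canvas page just controls a regular <audio> element.
// The stream URL is a placeholder.
const audio = document.createElement('audio');
audio.src = 'https://example.com/live/stream.mp3'; // or an HLS/DASH source via a JS library
document.body.appendChild(audio);
audio.play();
```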

Chromecast support for Smooth Streaming with PlayReady

I'm aware that the developer preview of the Chromecast receiver does not fully support Smooth Streaming manifest URLs (see Update #1).
I have tested content provided by Microsoft PlayReady(TM) Test Server - Smooth Streaming assets using sample receiver app provider in GitHub project.
Smooth Streaming Support
As expected, the manifest file does not work (see Update #1), but I was able to play individual ismv files (only at low bitrates). When I use a higher bitrate, the video container stays black.
PlayReady Support
When I tried to play a PlayReady-protected low-bitrate ismv file, I was expecting some sort of callback to MediaProtocolMessageStream.onKeyRequested(), but that did not happen. Here is my Android CustomMediaProtocolMessageStream implementation.
So, does anybody know how PlayReady or Widevine is supposed to work with Chromecast? I have seen that Netflix invokes some binary shell command when its app is loaded on Chromecast, but I assume they worked with Google to accomplish this.
Additional SO Resources
How to play smooth streaming video in Chromecast?
Is it actually possible to play SmoothStreaming videos on Chromecast without using (format=mpd-time-csf)?
Playing Smoothstreaming URL by providing Manifest file of smoothstreaming to Chromecast device
Update #1
Based on Les Vogel's answer, a Smooth Streaming manifest file for adaptive bitrate streaming is supported by Chromecast; you need a custom player to handle it.
As far as I am aware, there are currently two JS players which can handle that, but I don't know whether they will work on Chromecast.
dash.js - By DASH Industry Forum (https://github.com/Dash-Industry-Forum/dash.js)
Microsoft HTML5 Player Framework - Part of Microsoft Media Platform (http://playerframework.codeplex.com/)
Currently, you need to write your own media player to support adaptive bitrate streaming on Chromecast.
Unfortunately, the MS test server assets do not correctly provide a CORS header, which would be needed if you wrote a JavaScript player.
PlayReady and Widevine are both supported. We'll be providing additional documentation shortly.
EDIT: We announced the beta of the Cast Media Player Library today (2/3/14); it supports HLS, Smooth Streaming, and MPEG-DASH.
Yes, you can use "com.microsoft.playready" for PlayReady and "com.widevine.alpha" for Widevine.
