VideoJS: select the source according to browser

I have a DASH and an HLS stream. I can play them separately on Safari, Chrome, and Edge. However, I want to create one player and detect the browser so I can pass the right configuration as a source to the player.
I tried something like the following:
myPlayer.src([
  { type: "video/mp4", src: "http://www.example.com/path/to/video.mp4" },
  { type: "video/webm", src: "http://www.example.com/path/to/video.webm" },
  { type: "video/ogg", src: "http://www.example.com/path/to/video.ogv" }
]);
However, it just plays the first one, and if the first one is a DASH stream and you open it in Safari, it gives an error. The above is just an example; my sources have DRM info and lots of options. Can you help me with a sample example?

The interaction between DRMs and packaging or streaming protocols is a little complex.
HLS and DASH are Adaptive Bit Rate streaming protocols. The server creates multiple fragmented bit-rate versions of the video, which allows the client device or player to download the video in chunks, e.g. 10-second chunks, and select the next chunk from the bit rate most appropriate to the current network conditions. See some more info in this answer also: https://stackoverflow.com/a/42365034/334402
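To make the chunk-selection idea concrete, here is a toy sketch (not any real player's actual algorithm) of picking the next rendition from the measured bandwidth; the rendition list mirrors a manifest's variant entries:

// Toy illustration only: choose the highest-bandwidth rendition that fits
// the currently measured network throughput.
function pickRendition(renditions, measuredBps) {
  // renditions are assumed sorted by ascending bandwidth
  var candidates = renditions.filter(function (r) {
    return r.bandwidth <= measuredBps;
  });
  return candidates.length ? candidates[candidates.length - 1] : renditions[0];
}

// pickRendition([{ bandwidth: 350000 }, { bandwidth: 1500000 }, { bandwidth: 3700000 }], 2000000)
// returns the 1.5 Mbps rendition; the player fetches the next chunk from it.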
There is an index or manifest file, which is just a text/XML file listing the different video, audio, subtitle, etc. streams and providing URLs for them. This is the .mpd or .m3u8 file.
Most browsers do not support these HLS and DASH 'manifest' files directly at this time (Safari, which does support HLS, is the exception). You need to use an HTML5 player like video.js, Shaka, Bitmovin, etc.
DRMs allow the content to be encrypted and the keys for the content to be shared securely between the server and the clients.
As a general rule, the following DRMs are supported natively on devices and browsers - natively means that the DRM is usually built into the OS or the browser when you purchase the device:
Android devices - Widevine
Chrome browser on a PC or Mac - Widevine
Firefox - Widevine
iOS device - FairPlay
Safari browser - FairPlay
Internet Explorer browser - PlayReady
The interaction between DRMs and packaging gets a bit complicated - MPEG-DASH (often just called DASH) is intended to be the industry standard, and both Google and Microsoft seem to favour it, but Apple devices still favour HLS.
DASH supports CENC (Common Encryption), which allows a single stream to support multiple DRM types. HLS is generally used with FairPlay, although it can support other schemes also.
So, with the caveat that this is not absolute and it is possible to find other examples, the typical case for a service to reach all devices at this time would be:
MPEG-DASH - single stream with Widevine and PlayReady DRM
HLS with FairPlay DRM
You can see from the above that HTML5 players like video.js have to check the browser they are running on and the types of streams available to make the best choice of stream.
So going back to your question: you can actually specify HLS and DASH streams in your video.js configuration, rather than mp4, webm, etc. as in your code extract above. This will look like:
var player = videojs('some-video-id');

player.src({
  src: 'https://d2zihajmogu5jn.cloudfront.net/bipbop-advanced/bipbop_16x9_variant.m3u8',
  type: 'application/x-mpegURL',
  withCredentials: true
});
This example is from https://github.com/videojs/http-streaming, which is now part of the standard video.js build. The documentation and examples are very HLS-heavy, but it should work with DASH also.
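Since your sources include DRM info: below is a hedged sketch of what a DRM-capable configuration might look like, assuming the videojs-contrib-eme plugin is loaded; all URLs and the element id here are hypothetical placeholders, not confirmed parts of your setup.

// Hedged sketch: requires the videojs-contrib-eme plugin (an assumption).
// License and certificate URLs below are hypothetical placeholders.
var player = videojs('some-video-id');
player.eme(); // initialise the EME plugin

if (videojs.browser.IS_ANY_SAFARI) {
  // Safari: HLS with FairPlay
  player.src({
    src: 'https://example.com/stream/master.m3u8',
    type: 'application/x-mpegURL',
    keySystems: {
      'com.apple.fps.1_0': {
        certificateUri: 'https://example.com/fairplay/certificate',
        licenseUri: 'https://example.com/fairplay/license'
      }
    }
  });
} else {
  // Chrome/Firefox/Edge: DASH (CENC) with Widevine and/or PlayReady
  player.src({
    src: 'https://example.com/stream/manifest.mpd',
    type: 'application/dash+xml',
    keySystems: {
      'com.widevine.alpha': 'https://example.com/widevine/license',
      'com.microsoft.playready': 'https://example.com/playready/license'
    }
  });
}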

I had the same problem. I'm using the video.js browser module (https://docs.videojs.com/module-browser.html) to check the browser and use the appropriate source. That module exposes a number of detection flags, and you might need to choose a different one according to your needs.
My implementation is like this:
if (videojs.browser.IS_ANY_SAFARI) {
  player.src({ src: '<HLS source>', type: 'application/x-mpegURL' });
} else {
  player.src({ src: '<DASH source>', type: 'application/dash+xml' });
}

Related

Azure Media Services HLS Stream URL Change Resolution No Audio

Below is the manifest file we get from the Azure Media Services HLS URL.
The default HLS stream provided has video and audio, but when we try to change the resolution, it ends up with only video and no audio.
#EXTM3U
#EXT-X-VERSION:7
#EXT-X-SESSION-KEY:METHOD=SAMPLE-AES,KEYFORMAT="com.apple.streamingkeydelivery",KEYFORMATVERSIONS="1",URI="skd://petronastownhallmedia.keydelivery.southeastasia.media.azure.net/FairPlay/?kid=4881e415-fb2d-45e4-a8dd-505a405cf93d"
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="audio",NAME="audio_und",LANGUAGE="und",DEFAULT=YES,AUTOSELECT=YES,CHANNELS="2",URI="QualityLevels(128000)/Manifest(audio_und,format=m3u8-aapl)"
#EXT-X-STREAM-INF:BANDWIDTH=351536,RESOLUTION=340x192,CODECS="avc1.64000d,mp4a.40.5",AUDIO="audio"
QualityLevels(200000)/Manifest(video,format=m3u8-aapl)
#EXT-X-I-FRAME-STREAM-INF:BANDWIDTH=351536,RESOLUTION=340x192,CODECS="avc1.64000d",URI="QualityLevels(200000)/Manifest(video,format=m3u8-aapl,type=keyframes)"
#EXT-X-STREAM-INF:BANDWIDTH=709236,RESOLUTION=384x216,CODECS="avc1.640015,mp4a.40.5",AUDIO="audio"
QualityLevels(550000)/Manifest(video,format=m3u8-aapl)
#EXT-X-I-FRAME-STREAM-INF:BANDWIDTH=709236,RESOLUTION=384x216,CODECS="avc1.640015",URI="QualityLevels(550000)/Manifest(video,format=m3u8-aapl,type=keyframes)"
#EXT-X-STREAM-INF:BANDWIDTH=1015836,RESOLUTION=512x288,CODECS="avc1.640015,mp4a.40.5",AUDIO="audio"
QualityLevels(850000)/Manifest(video,format=m3u8-aapl)
#EXT-X-I-FRAME-STREAM-INF:BANDWIDTH=1015836,RESOLUTION=512x288,CODECS="avc1.640015",URI="QualityLevels(850000)/Manifest(video,format=m3u8-aapl,type=keyframes)"
#EXT-X-STREAM-INF:BANDWIDTH=1526836,RESOLUTION=704x396,CODECS="avc1.64001e,mp4a.40.5",AUDIO="audio"
QualityLevels(1350000)/Manifest(video,format=m3u8-aapl)
#EXT-X-I-FRAME-STREAM-INF:BANDWIDTH=1526836,RESOLUTION=704x396,CODECS="avc1.64001e",URI="QualityLevels(1350000)/Manifest(video,format=m3u8-aapl,type=keyframes)"
#EXT-X-STREAM-INF:BANDWIDTH=2395536,RESOLUTION=960x540,CODECS="avc1.64001f,mp4a.40.5",AUDIO="audio"
QualityLevels(2200000)/Manifest(video,format=m3u8-aapl)
#EXT-X-I-FRAME-STREAM-INF:BANDWIDTH=2395536,RESOLUTION=960x540,CODECS="avc1.64001f",URI="QualityLevels(2200000)/Manifest(video,format=m3u8-aapl,type=keyframes)"
#EXT-X-STREAM-INF:BANDWIDTH=3724136,RESOLUTION=1280x720,CODECS="avc1.64001f,mp4a.40.5",AUDIO="audio"
QualityLevels(3500000)/Manifest(video,format=m3u8-aapl)
#EXT-X-I-FRAME-STREAM-INF:BANDWIDTH=3724136,RESOLUTION=1280x720,CODECS="avc1.64001f",URI="QualityLevels(3500000)/Manifest(video,format=m3u8-aapl,type=keyframes)"
#EXT-X-STREAM-INF:BANDWIDTH=138976,CODECS="mp4a.40.5",AUDIO="audio"
QualityLevels(128000)/Manifest(audio_und,format=m3u8-aapl)
We have tried using / modifying it into the following URLs:
Audio but no video:
https://xxxx/c13459a8-065e-4d86-a2aa-c95f5ededafc/365a25ee-df07-4d1f-8679-0a029fccd397.ism/QualityLevels(128000)/Manifest(audio_und,format=m3u8-aapl)
No audio and no video:
https://xxxx/c13459a8-065e-4d86-a2aa-c95f5ededafc/365a25ee-df07-4d1f-8679-0a029fccd397.ism/QualityLevels(128000)/Manifest(video,audio_und,format=m3u8-aapl)
https://xxxx/c13459a8-065e-4d86-a2aa-c95f5ededafc/365a25ee-df07-4d1f-8679-0a029fccd397.ism/QualityLevels(850000)/Manifest(video,format=m3u8-aapl,audiotrack=audio)
Any idea or suggestion regarding this issue?
Thanks
What player framework are you using? Are you seeing this issue in HLS.js, Shaka Player, or in the native iOS AVPlayer?
The first URL is pointing to the audio-only AAC track at 128 Kbps.
The other tracks appear to include video and audio, so I'm not sure without testing those on my own, which I can't do unless you submit a support ticket through the Azure portal and include the details of your account and streaming endpoint. I suggest that path for faster support.
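One way to narrow this down while waiting on support: the rendition URLs above (QualityLevels(...)/Manifest(video,...)) point at video-only media playlists, while the audio lives in the separate EXT-X-MEDIA group in the master playlist, so fetching a rendition URL directly bypasses the audio group. A hedged sketch of switching resolution through the master playlist instead, using hls.js; note this stream's FairPlay encryption would still need Safari/native playback in practice, so the sketch only illustrates the level-switching mechanics, and the URL is a placeholder:

// Hedged sketch: force a rendition via the master playlist so the
// EXT-X-MEDIA audio group stays attached. The URL is a hypothetical placeholder.
var video = document.querySelector('video');
var hls = new Hls();
hls.attachMedia(video);
hls.loadSource('https://example.com/path/manifest(format=m3u8-aapl)');

hls.on(Hls.Events.MANIFEST_PARSED, function () {
  hls.currentLevel = 2; // e.g. the 512x288 variant, counted from the parsed levels
});

hls.on(Hls.Events.AUDIO_TRACK_SWITCHED, function (event, data) {
  console.log('audio track still active:', data.id);
});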

Google Action should play radio stream

I need to develop a Google Action which streams an audio/radio stream.
I thought about a media response.
But the documentation says: "Audio for playback must be in a correctly formatted .mp3 file. Live streaming is not supported."
Documentation
Can someone give me a hint on what I have to do to stream an audio stream? I found a German Google Action, "baden fm", which streams their radio, but I'm not sure how they do it.
Kind Regards
Stefan
The only ways to do this currently:
Stream it in chunks of MP3 files, using the callback at the end of each chunk to stream the next one (see the sketch after this list)
Getting listed on TuneIn, Radio.com or iHeartRadio. From observation, Baden FM seems to be using TuneIn
Through an App Action
Use a Web site link that starts streaming via BrowseCarousel or Button
The last two options are not helpful if you're going after non-browser-enabled devices.
Also saw this thread which has some insight on MP3 size/duration: How can I tell Actions on Google to stream audio?
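For the first option, here is a hedged sketch assuming the actions-on-google Node.js client library; the chunk URLs and intent names are hypothetical placeholders. The media status callback fires when a chunk finishes, and the fulfillment responds with the next one:

// Hedged sketch: chunked "live" MP3 playback via media responses.
const { dialogflow, MediaObject, Suggestions } = require('actions-on-google');
const app = dialogflow();

// Hypothetical helper: the server would expose rolling chunks of the live feed.
const chunkUrl = (i) => `https://example.com/radio/chunk-${i}.mp3`;

app.intent('play.radio', (conv) => {
  conv.data.chunk = 0;
  conv.ask('Starting the stream.');
  conv.ask(new MediaObject({ name: 'Chunk 0', url: chunkUrl(0) }));
  conv.ask(new Suggestions('Stop')); // screens require a suggestion chip with media
});

// Dialogflow intent bound to the actions_intent_MEDIA_STATUS event.
app.intent('media.status', (conv) => {
  const status = conv.arguments.get('MEDIA_STATUS');
  if (status && status.status === 'FINISHED') {
    conv.data.chunk += 1;
    conv.ask(new MediaObject({
      name: `Chunk ${conv.data.chunk}`,
      url: chunkUrl(conv.data.chunk)
    }));
    conv.ask(new Suggestions('Stop'));
  } else {
    conv.close('Stream stopped.');
  }
});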
Google Actions do not currently support live audio streaming. I'm in contact with them, but it seems they have no ETA for supporting this.
I was successful doing so with an MP3 live stream:
NPR: https://npr-ice.streamguys1.com/live.mp3?ck=1597372625378
but not with MPD:
BBC test stream: https://rdmedia.bbc.co.uk/dash/ondemand/testcard/1/client_manifest-audio.mpd
or with the HLS stream that my company uses (.m3u8, can't publish the link publicly).
Note: I added the links as text/code since I'm not sure whether the companies' policies are cool with them being indexed.

Record Screen's Happenings (Audio + Video)

I am new to WebRTC and want to implement a system like video conferencing, live streaming, or Skype using WebRTC and NodeJS.
I am confused about one thing, as it is one of our client's requirements. Suppose a page hosts a video conference, say one moderator answering many audience members one by one. There should be a single video created which continuously records all of this together and sends a live stream to the server to be saved in our database.
Is this kind of thing implementable or not?
Any help please.
You can capture video by grabbing JPEG images from a canvas element. You could also capture the entire page (if there are numerous videos in the same page) by grabbing the page itself through Chrome.
For audio, recording remote audio with the Audio API is still an issue, but locally grabbed audio is not an issue.
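A minimal sketch of that JPEG-grabbing approach, assuming a video element that is already playing (e.g. from getUserMedia) and a hypothetical WebSocket endpoint on the server that reassembles the frames into video:

// Hedged sketch: grab JPEG frames from a canvas and ship them over a websocket.
var video = document.querySelector('video');
var canvas = document.createElement('canvas');
var ctx = canvas.getContext('2d');
var ws = new WebSocket('wss://example.com/frames'); // hypothetical endpoint

setInterval(function () {
  canvas.width = video.videoWidth;
  canvas.height = video.videoHeight;
  ctx.drawImage(video, 0, 0);
  // Each frame goes up as a JPEG data URL; the server syncs and muxes them.
  ws.send(canvas.toDataURL('image/jpeg', 0.7));
}, 100); // roughly 10 fps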
See RecordRTC and my modified version for recording streams either to file or through websockets, respectively.
See 'Capture a page to a stream' on how to record or screen-share an entire page of Chrome.
If you have multiple different videos, not all in the same page, but want to combine them all, I would suggest recording them as above and then combining and syncing them up server-side (not in JavaScript but probably in C or C++).
If you MUST record remote audio, then I would suggest that you have those particular pages send their audio data over websockets themselves so that you can sync them up with their video and with the other sessions.

Will the chromecast play media segment files (.ts) from an m3u8 playlist?

I've noticed a lot of websites use m3u8 playlists in their HTML5 video tags, and the segment files inside those playlists appear to be H.264 encoded, so I'm guessing the container is the only thing the Chromecast doesn't support in this case, although I know very little about video containers and codecs, so I'm probably just making no sense. So with all this in mind, is there any chance the Chromecast will one day play those files?
Here is an example http://stream.gravlab.net/003119/sparse/v1d30/posts/2014/barcelona/barcelona.m3u8
Thanks.
Yes - you can, using either the Default Receiver, a Styled Receiver, or a Custom Receiver with the Media Player Library. Of course, you (the owner of the data) must turn on CORS headers for the m3u8 manifest, any sub-manifests, and for the segments and any keys on your server / CDN to support this. This requirement is due to our player being written in JavaScript and running in Chrome on the Chromecast device.
Note - for the Default Receiver & Styled Receiver, the URL to allow CORS from is www.gstatic.com. For your Custom Receiver, it will be the URL where you host your Receiver.
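To illustrate the CORS requirement, here is a hedged sketch of serving the manifest and segments with the needed headers, assuming a Node/Express static server; the directory and port are placeholders:

// Hedged sketch: CORS headers for Cast playback, using Express (an assumption).
const express = require('express');
const app = express();

app.use((req, res, next) => {
  // The Default/Styled Receiver is served from www.gstatic.com, so allow that origin.
  res.set('Access-Control-Allow-Origin', 'https://www.gstatic.com');
  res.set('Access-Control-Allow-Headers', 'Content-Type, Range');
  res.set('Access-Control-Expose-Headers', 'Content-Length, Content-Range');
  next();
});

app.use(express.static('media')); // .m3u8 manifests and .ts segments live here
app.listen(8080);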

Send content to chromecast from native application

Is it possible to send video to the Chromecast device from a native application? It would be nice to share any window on a system instead of only Chrome tabs. Also, is there any documentation of the communication used by Chrome to communicate with the Chromecast? It is my understanding that the Chromecast essentially loads content from an embedded Chrome instance, but there appear to be more direct ways of communicating with the device, since it is able to stream content from a Chrome tab using the extension.
You need to whitelist your receiver device if you are developing a receiver application. That would be a Chrome app that runs on the receiver's Chrome instance.
You need to whitelist a sender URL if you are developing a Chrome app that will cast its contents.
Video casting works by sending a url to the receiver device, which the device will load directly.
Tab casting works by encoding the tab contents using WebM/Opus (in the Chrome cast extension) and streaming that to the receiver device. (This is limited to 720p, see this question)
Chrome apps can only use Video casting.
The chrome cast extension is going to be the only way to stream directly to the device.
So the answer to your question is no, you cannot stream video directly to the device. The receiver must load the video from the url you provide.
There is some speculation whether the receiver can be provided with a local url or if it must already be available on the internet. This has yet to be clarified.
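For reference, a minimal sketch of that Video casting path from a sender page, assuming the Chrome sender API of that era; the video URL is a hypothetical placeholder, and the receiver (not the sender) fetches it:

// Hedged sketch: "Video casting" - the sender only passes a URL.
var mediaInfo = new chrome.cast.media.MediaInfo(
    'https://example.com/video.mp4', 'video/mp4');
var request = new chrome.cast.media.LoadRequest(mediaInfo);

chrome.cast.requestSession(function (session) {
  session.loadMedia(request,
      function (media) { console.log('Receiver is playing the URL'); },
      function (err) { console.error('Load failed', err); });
}, function (err) { console.error('No session', err); });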
From how I understand the Chromecast architecture:
You can display any URL you want on the TV (you have to whitelist your app and register the URL first). It must be a URL. This can include HTML, JS, CSS, etc. Anything that is already on the internet.
To receive data from a device (say, the URL of a video to load), you must implement logic to interpret messages from channels. These messages are encoded as JSON, which makes it difficult to send videos or pictures (binary data). It is obviously easiest to upload things like this to some website, and have the receiver display them.
People have asked, "well, then how does the tab/screen sharing work?" The JSON encoding is just what Google provides in their SDK. In their own source, they don't have this restriction.
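A hedged sketch of that JSON channel messaging, with a hypothetical custom namespace; the sender sends a URL, and the whitelisted receiver page interprets it:

// Hedged sketch: custom-namespace messaging (namespace and URL are hypothetical).
var NAMESPACE = 'urn:x-cast:com.example.demo';

// Sender side, given an existing session:
session.sendMessage(NAMESPACE, JSON.stringify({ url: 'https://example.com/video.mp4' }),
    function () { console.log('sent'); },
    function (err) { console.error(err); });

// Receiver side, in the whitelisted receiver page:
var manager = cast.receiver.CastReceiverManager.getInstance();
var bus = manager.getCastMessageBus(NAMESPACE,
    cast.receiver.CastMessageBus.MessageType.STRING);
bus.onMessage = function (event) {
  var data = JSON.parse(event.data); // event.data is a string in STRING mode
  document.querySelector('video').src = data.url;
};
manager.start();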
Update:
It turns out you can actually stream local videos to your TV by just opening the local file in Chrome, and then casting that to your TV.
