Capture video frames of DRM-protected video in Chrome

I'm building a video player for my application, which will play DRM-protected videos (Widevine on Chrome and Firefox) using dash.js and video.js. As part of this application I want to be able to annotate the video and send the data back to the server.
Annotation data should be attached to a frame of the video rather than to a timestamp, and the application should send the frame and the related annotation data to the server. Is it possible to capture the raw frames of a Widevine DRM-protected video in Chrome or Firefox and send them to the server using WebGL?

No, it's not possible to capture the decrypted frames of a DRM-protected video; allowing that would, to some degree, defeat the purpose of the DRM system. It's only possible for non-DRM-protected streams.
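For a clear (non-DRM) stream, a frame can be grabbed from the video element and uploaded together with the annotation data. A minimal sketch using a 2D canvas; the element selector, upload endpoint and JPEG quality are placeholders, and the same restriction applies if the frame is pulled through WebGL instead:

```js
const video = document.querySelector('video');

function captureAndUploadFrame() {
  const canvas = document.createElement('canvas');
  canvas.width = video.videoWidth;
  canvas.height = video.videoHeight;
  // With a Widevine-protected source this produces a blank/black image (or an
  // error), because the decrypted frames are never exposed to the page.
  canvas.getContext('2d').drawImage(video, 0, 0);
  canvas.toBlob((blob) => {
    const form = new FormData();
    form.append('frame', blob, 'frame.jpg');
    form.append('time', String(video.currentTime));
    fetch('/api/annotations', { method: 'POST', body: form }); // hypothetical endpoint
  }, 'image/jpeg', 0.9);
}
```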

Related

How to handle cross-browser and device compatibility for MediaRecorder and Audio control

My web app needs to support recording audio through a web interface and then playing back the same audio files.
My preliminary analysis shows that Chrome supports "audio/webm" and Safari supports "audio/mp4" for recording audio with MediaRecorder. This would have been fine, but the HTML5 audio element on these browsers has the same format limitation, and this is causing device/OS fragmentation.
I am using AWS as my application's backend.
Is there a way to get around this browser fragmentation?
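The split described above can at least be detected at runtime with MediaRecorder.isTypeSupported, so the recorder always uses a container the current browser can produce. A sketch of that feature-detection approach; the candidate list and the upload step are illustrative, and it does not by itself solve the playback fragmentation:

```js
async function recordAudio() {
  // Pick whichever container this browser can actually record.
  const candidates = ['audio/webm;codecs=opus', 'audio/webm', 'audio/mp4'];
  const mimeType = candidates.find((t) => MediaRecorder.isTypeSupported(t)) || '';

  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const recorder = new MediaRecorder(stream, mimeType ? { mimeType } : {});
  const chunks = [];
  recorder.ondataavailable = (e) => chunks.push(e.data);
  recorder.onstop = () => {
    const blob = new Blob(chunks, { type: recorder.mimeType });
    // Upload the blob (e.g. to S3 via a pre-signed URL); if a single playback
    // format is needed everywhere, transcode server-side rather than in the browser.
  };
  recorder.start();
  return recorder;
}
```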

How to play an RTSP stream from IP video cameras and an NVR on a user web page

I want to play RTSP streams from IP video cameras (MP4, H264) on my intranet web page; I use React. I have 12 cameras and an NVR.
I have not found a way to do this without an intermediate server (WebRTC is not suitable) that spends resources on transcoding the H264 stream to MJPEG.
If I set a high resolution and quality for the stream, a lot of resources are spent on transcoding, and, most importantly, streaming the MJPEG images takes a lot of traffic.
Is there a way or solution to stream from the IP camera directly to the web page so that decoding happens on the user's web browser side?
This would free the intermediate server from the heavy load of large streams.
Playback also needs to work on mobile phones.
Thanks for the answer.
There is no way to stream an RTSP camera's H264 video directly to a web browser.
But cameras do support outputting still JPEG images, so you can create a web page that displays a fresh snapshot from the camera every 200 ms or so.
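A minimal sketch of that snapshot approach; the element id and snapshot URL are placeholders, and the actual JPEG snapshot endpoint depends on the camera model:

```js
// Refresh an <img> from the camera's JPEG snapshot URL every 200 ms.
const img = document.getElementById('cam1');
setInterval(() => {
  img.src = 'http://192.168.1.10/snapshot.jpg?t=' + Date.now(); // cache-buster
}, 200);
```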
If you are not happy with the above solution, you must use a media server in between, which will pull the RTSP stream from the camera and convert it to a protocol the browser understands. You are mistaken about one thing, though: no video transcoding is involved. I don't know why WebRTC is not an option for you, but most media servers will offer four types of output:
Low latency:
WebRTC
Websockets to MSE
High latency:
HLS
MPEG-Dash
All these methods do NOT require transcoding of your original H264 video encoded by the RTSP camera/NVR (a browser-side sketch of the WebSockets-to-MSE option follows the demo link below). Some media servers you can use: Unreal Media Server, Wowza, Janus.
Live demo: http://www.umediaserver.net/umediaserver/demos.html
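To illustrate the WebSockets-to-MSE option: the media server repackages (not transcodes) the camera's H264 into fragmented MP4 and pushes the fragments over a WebSocket, and the browser appends them to a SourceBuffer. A rough sketch, assuming a server that already delivers fMP4 fragments; the WebSocket URL and codec string are placeholders:

```js
const video = document.querySelector('video');
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', () => {
  // The codec string must match what the camera/NVR actually produces.
  const sb = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.640028"');
  const queue = [];
  const ws = new WebSocket('wss://media-server.example/cam1'); // hypothetical endpoint
  ws.binaryType = 'arraybuffer';

  ws.onmessage = (e) => {
    if (sb.updating || queue.length > 0) queue.push(e.data);
    else sb.appendBuffer(e.data);
  };
  sb.addEventListener('updateend', () => {
    if (queue.length > 0 && !sb.updating) sb.appendBuffer(queue.shift());
  });
});
```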
No browser has native RTSP support, so if you want decoding to happen on the end user's side, you'll have to write your own custom web player.
You can start by looking at an open-source solution like this one:
git://github.com/Streamedian/html5_rtsp_player.git
It works on PC and Android, but it didn't work on an iPhone for me (you can try it yourself at https://streamedian.com/demonstration/, maybe it's just my issue), so perhaps you can find a better alternative or fork it and make it work on all devices.
It still requires a middle-man proxy server, though, because it relies on WebSockets; but since it doesn't do any video converting or decoding, it shouldn't consume many resources at all.

Black screen observed sometimes while playing FairPlay Streaming protected content

We are using AVPlayerLayer to play content in an iOS application. Sometimes a black screen appears while playing the content, although the audio keeps playing. This can happen at random points during playback, when playback first starts, or after seeking to a specific time. It does not happen every time, but it occurs frequently. Please note that we are playing FairPlay Streaming protected content using AVPlayer.
Sometimes we don't hit the issue at all and the player plays the content smoothly for the whole session.
This issue is not related to the player as such. Rather, it is caused by the DRM flags used in the FairPlay license.
It looks like the HDCP enforcement is what is affecting your playback. The licensing server can set different HDCP flags (HDCP not required, Type-0, Type-1). By default, Type-0 is enforced by the FPS server unless changed.
Generally, a 'Type-0' or 'HDCP not required' enforcement should let your content play back seamlessly.

Chromecast support for Smooth Streaming with PlayReady

I'm aware that the developer preview of the Chromecast receiver does not fully support Smooth Streaming manifest URLs (see Update #1).
I have tested content provided by the Microsoft PlayReady(TM) Test Server - Smooth Streaming assets, using the sample receiver app provided in the GitHub project.
Smooth Streaming Support
As expected, the manifest file does not work (see Update #1), but I was able to play individual ismv files (only at low bitrates, though). When I use a higher bitrate, the video container stays black.
PlayReady Support
When I tried to play a PlayReady-protected low-bitrate ismv file, I was expecting some sort of callback to MediaProtocolMessageStream.onKeyRequested(), but that did not happen. Here is my Android CustomMediaProtocolMessageStream implementation.
So, does anybody know how PlayReady or Widevine is supposed to work with Chromecast? I have seen that Netflix invokes some binary shell command when its app is loaded on the Chromecast, but I assume they worked with Google to accomplish this.
Additional SO Resources
How to play smooth streaming video in Chromecast?
Is it actually possible to play SmoothStreaming videos on Chromecast without using (format=mpd-time-csf)?
Playing Smoothstreaming URL by providing Manifest file of smoothstreaming to Chromecast device
Update #1
Based on Les Vogel's answer, Smooth Streaming manifest files for adaptive bitrate streaming are supported by Chromecast; you need a custom player to handle them.
As far as I am aware, there are currently two JS players which can handle that, but I don't know whether they will work on Chromecast.
dash.js - By DASH Industry Forum (https://github.com/Dash-Industry-Forum/dash.js)
Microsoft HTML5 Player Framework - Part of Microsoft Media Platform (http://playerframework.codeplex.com/)
Currently, you need to write your own media player to support adaptive bitrate streaming on Chromecast.
Unfortunately, the MS test server assets do not correctly provide a CORS header, which would be needed if you wrote a JavaScript player.
PlayReady and Widevine are both supported. We'll be providing additional documentation shortly.
EDIT: We announced the beta of the Cast Media Player Library today (2/3/14); it supports HLS, Smooth Streaming, and MPEG-DASH.
Yes, you can use "com.microsoft.playready" for PlayReady and "com.widevine.alpha" for Widevine.
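Those are the standard EME key-system identifiers. As a sketch of how they are typically wired up in a dash.js based receiver page (the license server URLs and manifest URL are placeholders; the Cast Media Player Library exposes its own API for the same key systems):

```js
const player = dashjs.MediaPlayer().create();
player.setProtectionData({
  'com.widevine.alpha': {
    serverURL: 'https://license.example.com/widevine'   // placeholder license server
  },
  'com.microsoft.playready': {
    serverURL: 'https://license.example.com/playready'  // placeholder license server
  }
});
player.initialize(document.querySelector('video'), 'https://example.com/stream.mpd', true);
```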

Audio Streaming Using J2ME

I've got audio online in the form of MP3 files; how do I stream this audio from my J2ME app? A website gives the app a list of audio tracks to play; the user selects one and it must then stream from the website.
Sample code would be nice. Thanks.
There is no reliable way to ensure that a MIDlet will stream audio data, because you don't control how the phone manufacturer implemented JSR-135 (the specification that gives you the API to play audio in a MIDlet).
Technically, creating a Java media player using javax.microedition.media.Manager.createPlayer(String aUrl) should make the JSR-135 implementation stream the audio data located at that URL.
Unfortunately, only streaming of very simple audio content (WAV more often than MP3), if any, is usually supported over a network connection, and, more often than not, a call to createPlayer(String aUrl) will throw an exception if the URL doesn't begin with "file://".
There are probably devices where the manufacturer managed to plug a more complete audio/networking module into the JSR-135 implementation, but finding them will require a lot of testing on your part.
J2ME won't let you do this over HTTP: it will download the entire audio file before playback starts. What you need is to host it on an RTP server instead; only then will J2ME stream the audio.
If that's no good, then you might be stuck looking for devices that have their own proprietary libraries for this kind of thing.
There's a better way to do this.
Create your own class that extends InputStream, say MyHTTPInputStream, and implement all of its methods. Run a thread that retrieves the data over HTTP and stores it in a buffer; when the Player class calls InputStream.read(), serve the data from that buffer.
Before using this class with the Player, test MyHTTPInputStream against a dummy WAV file stored in phone memory or on the memory card, so you can see which InputStream methods the Player calls and in what sequence it calls them. A sketch of the approach follows.
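A minimal sketch of that buffering approach, assuming MIDP 2.0 / JSR-135; the class name, chunk size and error handling are illustrative only:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import javax.microedition.io.Connector;
import javax.microedition.io.HttpConnection;

// Buffers an HTTP response in a background thread and serves it to the Player
// through the standard InputStream interface.
class MyHTTPInputStream extends InputStream implements Runnable {
    private final String url;
    private final ByteArrayOutputStream buffer = new ByteArrayOutputStream();
    private int readPos = 0;
    private boolean done = false;

    MyHTTPInputStream(String url) {
        this.url = url;
        new Thread(this).start();          // start filling the buffer immediately
    }

    public void run() {
        try {
            HttpConnection conn = (HttpConnection) Connector.open(url);
            InputStream in = conn.openInputStream();
            byte[] chunk = new byte[4096];
            int n;
            while ((n = in.read(chunk)) != -1) {
                synchronized (this) {
                    buffer.write(chunk, 0, n);
                    notifyAll();           // wake a reader blocked in read()
                }
            }
            in.close();
            conn.close();
        } catch (IOException e) {
            // error handling omitted in this sketch
        } finally {
            synchronized (this) { done = true; notifyAll(); }
        }
    }

    // Called by the Player; blocks until the download thread has more data.
    public synchronized int read() throws IOException {
        while (readPos >= buffer.size() && !done) {
            try { wait(); } catch (InterruptedException e) { /* ignore */ }
        }
        if (readPos >= buffer.size()) return -1;   // end of stream
        // toByteArray() copies the buffer on each call; fine for a sketch only.
        return buffer.toByteArray()[readPos++] & 0xFF;
    }
}
```

The stream is then handed to the media framework with Manager.createPlayer(new MyHTTPInputStream(url), "audio/x-wav") (or the appropriate MIME type), followed by realize() and start(); as suggested above, test against a local WAV file first.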
