To play VOD streams in chrome using videojs - node.js

I am trying to play an HLS stream (.m3u8) containing .ts segments.
It played fine earlier when I tried it with some open sources. Now that I am trying to play the streams provided by my service provider, it plays in all major browsers except Chrome.
P.S.: I am using videojs to accomplish this. I also tested Viblast, but no luck there.
For reference, here is my code:
<script>
  // myPlayer is the video.js player attached to my <video> element
  myPlayer.src({
    type: "application/x-mpegURL; application/vnd.apple.mpegurl",
    src: encodeURI(some m3u8 link)
  });
  myPlayer.play();
</script>
This code works in every other browser, but it falls over in Chrome. The error Chrome reports is below:
VIDEOJS: ERROR: (CODE:3 MEDIA_ERR_DECODE) The media playback was aborted due to a corruption problem or because the media used features your browser did not support.
MediaError {code: 3, message: "The media playback was aborted due to a corruption…media used features your browser did not support."}
Note: I am getting my streams from ScaleEngine.

I came across this error when using an MP4 with a WebM fallback. Everything worked well in Firefox, but I was getting this error in Chrome. I switched the order of the fallbacks so that video.js used WebM first and MP4 as the fallback. This fixed it, for me at least.
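For illustration, a minimal sketch of that source ordering with video.js; the file URLs are placeholders, not from the original post:
myPlayer.src([
  // WebM listed first so Chrome tries it before falling back to MP4
  { type: "video/webm", src: "https://example.com/video.webm" },
  { type: "video/mp4", src: "https://example.com/video.mp4" }
]);
myPlayer.play();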

Related

Latency in mp3 live streaming on Google home assistant

I'm trying to play an MP3 live stream using a "media response", as shown in the Actions on Google guide. Here is the code:
// MediaObject and Suggestions come from the 'actions-on-google' package
if (!conv.surface.capabilities.has('actions.capability.MEDIA_RESPONSE_AUDIO')) {
  conv.ask('Sorry, this device does not support audio playback.');
} else {
  conv.ask(new MediaObject({
    name: 'Radio one',
    url: 'my_streaming_url.mp3',
    description: 'A funky Jazz tune',
  }));
  conv.ask(new Suggestions(['Radio two']));
}
Everything works, but there is about 20 seconds of audio latency on Google Home and Google Home Mini. There is no latency in the Google Assistant Android app or in the Actions on Google simulator, and none when "url" points to a plain MP3 file. Any idea why there is this delay?
Google Home's media player appears to buffer roughly 20-30 seconds of playable audio before starting.
If you control the Icecast streaming server, increase the <burst-size> value either in the <limits /> section or <mount />. By default, it is set to 65536 bytes.
You can work out the ideal burst-size limit for your stream in bytes by calculating the following:
bitrateKbps * bufferSeconds * 1024 / 8
For a 128kbps stream, try 327680.
(Also, make sure the server's <queue-size /> is bigger than the burst-size)
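As a quick sanity check, here is that calculation as a small Node.js helper; the 128 kbps / 20 s figures are just the example numbers above:
// burst size in bytes = bitrateKbps * bufferSeconds * 1024 / 8
function icecastBurstSize(bitrateKbps, bufferSeconds) {
  return bitrateKbps * bufferSeconds * 1024 / 8;
}
console.log(icecastBurstSize(128, 20)); // 327680 -> value for <burst-size>; keep <queue-size /> larger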
The 20-second latency with live streaming URLs appears to be a thing on Google Home (Mini) and Chromecast, not only when initiated from Google Assistant but also when using the Google Cast API. I have no idea why they use so much buffering.

Need help for audio conference using Kurento composite media element in Nodejs

I am referring to the code from GitHub for an audio and video conference using the Kurento composite media element. It works fine for audio and video streaming over WebRTC.
But I need an audio-only conference over WebRTC, so I made changes to the GitHub code above and uploaded the new code to a GitHub repository.
I added the changes below in the static/js/index.js file:
var constraints = {
  audio: true, video: false
};
var options = {
  localVideo: undefined,
  remoteVideo: video,
  onicecandidate: onIceCandidate,
  mediaConstraints: constraints
}
webRtcPeer = kurentoUtils.WebRtcPeer.WebRtcPeerSendrecv(options, function(error) {
When I run this code there are no errors from the Node server or in the Chrome console, but the audio stream never starts; it only shows a spinner for a long time. The Chrome console log is here.
As per the reply to my previous Stack Overflow question, we need to specify MediaType.AUDIO in the Java code like below:
webrtc.connect(hubport, MediaType.AUDIO);
hubport.connect(webrtc, MediaType.AUDIO);
But I want to implement it in Node.js using kurento-client.js, and I did not find any reference for setting MediaType.AUDIO when connecting the hubPort and the webRtcEndpoint in the Node.js API.
Could someone please help me make the corresponding changes in Node.js, or point me to a reference, so I can implement an audio-only conference using the composite media element and Node.js?
This should do it:
function connectOnlyAudio(source, sink, callback) {
  source.connect(sink, "AUDIO", function(error) {
    if (error) {
      return callback(error);
    }
    return callback(null);
  });
}
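For example, mirroring the Java snippet from the question, a usage sketch assuming webRtcEndpoint and hubPort have already been created through kurento-client:
// Connect both directions audio-only, like webrtc.connect(hubport, MediaType.AUDIO) in Java
connectOnlyAudio(webRtcEndpoint, hubPort, function(error) {
  if (error) return console.error(error);
  connectOnlyAudio(hubPort, webRtcEndpoint, function(error) {
    if (error) return console.error(error);
    console.log('Audio-only connection established in both directions');
  });
});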
We are in the process of improving the documentation for the project. I hope this will all be made clearer in the new docs.
EDIT 1
It is important to make sure that you are indeed sending something, and that the connection between your client and the media server is negotiated correctly. Going through your bower.json, I found that you are leaving the adapter dependency unpinned, so to speak. In the latest releases they have done some refactoring that makes the kurento-utils-js library fail. We haven't yet adapted to those changes, so you need to pin the adapter.js dependency like so:
"adapter.js": "v0.2.9"

Capturing desktop video and microphone audio from a chrome extension

I am using the navigator.webkitGetUserMedia API to capture the desktop, and the microphone to capture audio. When I make the following call:
navigator.webkitGetUserMedia({
  audio: true,
  video: {
    mandatory: {
      chromeMediaSource: 'desktop',
      chromeMediaSourceId: id,
      maxWidth: screen.width,
      maxHeight: screen.height
    }
  }
}, gotStream, getUserMediaError);
I am getting a screen capture error. Does this API not support the above scenario?
I am able to capture audio and desktop video individually, but not together. Also, since I am capturing the desktop and not webcam video, does that make any difference?
Chrome does not allow you to request an audio stream alongside a chromeMediaSource.
See Why Screen Sharing Fails here for more info.
You may be able to work around this by sending separate getUserMedia requests - one for the audio stream and one for the desktop.
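A rough sketch of that workaround, reusing the id, gotStream, and getUserMediaError names from the question; merging the tracks at the end is optional:
// Request the desktop stream first, then the microphone stream separately
navigator.webkitGetUserMedia({
  audio: false,
  video: {
    mandatory: {
      chromeMediaSource: 'desktop',
      chromeMediaSourceId: id,
      maxWidth: screen.width,
      maxHeight: screen.height
    }
  }
}, function(desktopStream) {
  navigator.webkitGetUserMedia({ audio: true, video: false }, function(micStream) {
    // Optionally merge the microphone audio track into the desktop stream
    desktopStream.addTrack(micStream.getAudioTracks()[0]);
    gotStream(desktopStream);
  }, getUserMediaError);
}, getUserMediaError);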

Playing Smoothstreaming URL by providing Manifest file of smoothstreaming to Chromecast device

We want to play a Smooth Streaming URL by providing the Smooth Streaming Manifest file to a Chromecast device.
We could play the following on a Chromecast device:
1. .mp4 file
2. .ismv file
3. .isma file
But if we provide a Manifest file such as the following, we are not able to play it on the Chromecast device:
http://playready.directtaps.net/smoothstreaming/SSWSS720H264/SuperSpeedway_720.ism/Manifest
Please let us know how to play a Smooth Streaming URL on a Chromecast device,
or whether we need to play the .ismv files one by one by providing them in a loop.
The Chromecast has support for Smooth Streaming content through its Media Player Library: https://developers.google.com/cast/docs/player
Below is a bare-bones implementation.
Google provides a proper example on GitHub which takes advantage of the MediaManager and accounts for other streaming formats: https://github.com/googlecast/CastMediaPlayerStreamingDRM
var $mediaElement = $('<video>').attr('autoplay', ''),
    mediaElement = $mediaElement[0],
    mediaUrl = "http://playready.directtaps.net/smoothstreaming/SSWSS720H264/SuperSpeedway_720.ism/Manifest",
    mediaHost,
    mediaPlayer;

cast.receiver.CastReceiverManager.getInstance().start();
$('body').append(mediaElement);

mediaHost = new cast.player.api.Host({
  mediaElement: mediaElement,
  url: mediaUrl
});

var protocol = cast.player.api.CreateSmoothStreamingProtocol(mediaHost);
mediaPlayer = new cast.player.api.Player(mediaHost);
mediaPlayer.load(protocol);
Microsoft's test files (including the ISM) do not return the CORS headers required by the Chromecast. Enable CORS on your server and it will work.
I've encountered this too, and it works when I host the files myself with CORS enabled.
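To illustrate the kind of header involved, here is a minimal Node.js sketch; Express and the ./media folder are assumptions for the example, not part of the original answers:
const express = require('express');
const app = express();
// Add the CORS header the Chromecast receiver needs on every response
app.use(function(req, res, next) {
  res.setHeader('Access-Control-Allow-Origin', '*');
  next();
});
// Serve the Smooth Streaming manifest and fragments from ./media
app.use(express.static('media'));
app.listen(8080);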

Monotouch video streaming with MPMoviePlayerController

I am creating a MonoTouch iPhone app that will display streaming videos. I have been able to get MPMoviePlayerController working with a local file (NSUrl.FromFile), but have not been able to get videos to stream from a media server.
Here is the code I am using to play the video:
string url = @"http://testhost.com/test.mp4";
var nsurl = NSUrl.FromString(url);
mp = new MPMoviePlayerController(nsurl);
mp.SourceType = MPMovieSourceType.Streaming;
//enable AirPlay
//mp.AllowsAirPlay = true;
//Add the MPMoviePlayerController View
this.View.AddSubview(mp.View);
//set the view to be full screen and show animated
mp.SetFullscreen(true, true);
//MPMoviePlayer must be set to PrepareToPlay before playback
mp.PrepareToPlay();
//Play Movie
mp.Play();
Is there something else I am missing in implementing MPMoviePlayerController for video streaming? I also read that videos for iOS should be streamed using Apple's HTTP Live Streaming on the media server; is that a requirement? I have never worked with video streaming on an iOS device before, so I am not sure whether the problem is in my code, the media server setup, or a combination of the two.
I'm pretty sure you need a streaming server to serve a video file over an HTTP URL. In any case, it is a requirement for applications on the App Store to do so:
Important: iPhone and iPad apps that send large amounts of audio or video data over cellular networks are required to use HTTP Live Streaming. See "Requirements for Apps."
The good news is that your existing code should not have to be changed to handle this (it's server-side, not client-side).
