Chrome stops MPEG-DASH playback after some time - node.js

I am working on a project which needs to play an MPEG-DASH stream on top of video.js.
Playback of the stream stops after about 59 seconds.
See below the code and the error I am getting:
<video id="example-video">
<source src="http://hitsradio.videocdn.scaleengine.net/ondemand/play/mp4:sestore8/hitsradio/ZZ Top - Cheap Sunglasses.mp4/manifest.mpd" type="application/dash+xml">
</video>
<script src="path/to/video.js"></script>
<script src="path/to/videojs-dash.js"></script>
<script src="path/to/dash.all.js"></script>
<script>
var myPlayer = document.getElementById("example-video");
myPlayer.play();
</script>
This is the error I get in the browser console:
[60269][bufferController][video] Waiting for more buffer before starting playback.
dash.all.js:11 [60271][scheduleController][audio] Stalling Buffer
dash.all.js:11 [60271][bufferController][audio] Waiting for more buffer before starting playback.
dash.all.js:11 [60272][playbackController] <video> ratechange: 0

This seems to be a bug in video.js. I just tested other web-based players like dash.js, Bitmovin's adaptive streaming player and Google's Shaka Player, and all three of them played the stream without problems.
To the best of my knowledge, video.js uses dash.js for MPEG-DASH playback, so either you are using an old dash.js version (I tested the latest, v2.0.0) or there is a problem in the video.js DASH plugin. If the latter is the problem, you should open an issue in their GitHub repository.
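A quick way to isolate the problem is to load the same stream with dash.js alone, bypassing video.js entirely. A minimal test page for dash.js v2 might look like this (the script path is an assumption, as in the question):

```html
<!-- dash.js-only test page; no video.js involved -->
<video id="dash-test" controls></video>
<script src="path/to/dash.all.js"></script>
<script>
  var player = dashjs.MediaPlayer().create();
  player.initialize(
    document.querySelector("#dash-test"),
    "http://hitsradio.videocdn.scaleengine.net/ondemand/play/mp4:sestore8/hitsradio/ZZ Top - Cheap Sunglasses.mp4/manifest.mpd",
    true /* autoplay */
  );
</script>
```

If this page plays past the 59-second mark, the problem is in the video.js DASH plugin rather than in dash.js itself.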

Related

How to play Audio sound at startup automatically in GWT

I want to play a sound at the start of the GWT client app.
I used the following code:
import com.google.gwt.media.client.Audio;

Audio mistakeAudio = Audio.createIfSupported();
if (mistakeAudio != null) {
  mistakeAudio.setSrc("waves/Gong.wav");
  mistakeAudio.load();
  mistakeAudio.setLoop(true);
  mistakeAudio.play();
}
I tested various variations, but the sound only plays if at least one mouse click has happened in the client window before mistakeAudio.play() is called. The click does not even have to do anything.
Is there a workaround to start playback purely programmatically?
You can use HTML5 instead:
<audio id="gong" autoplay>
<source src="waves/Gong.wav" type="audio/wav">
Your browser does not support the audio element.
</audio>
or use JavaScript for the autoplay:
<script>
var audio = document.getElementById("gong");
audio.autoplay = true;
audio.load();
</script>
which you could also code in GWT with JsInterop and Elemental2:
HTMLAudioElement audio = (HTMLAudioElement) DomGlobal.document.getElementById("gong");
audio.autoplay = true;
audio.load();

NodeJS Express video streaming server with server controlled range

I don't currently have any code; this is a general question. I've seen multiple articles and SO questions that handle this issue, except that in all of them the byte-range header, which essentially specifies what segment of the video is sent back to the client, is specified by the client. I want the server to keep track of the current video position and stream the video back to the client from there.
The articles and SO Questions I've seen for reference:
https://blog.logrocket.com/build-video-streaming-server-node/
Streaming a video file to an html5 video player with Node.js so that the video controls continue to work?
Here's a solution that does not involve explicitly changing the header for the <video> tag src request or controlling the byte range piped from the server.
The video element has a property called currentTime (in seconds) which allows the client to control its own range header. A separate request to the server for an initial value for 'currentTime' would allow the server to control the start time of the video.
<video id="video" muted><source src="/srcendpoint" type="video/mp4" /></video>
<script>
const video = document.getElementById("video")
getCurrentSecAndPlay()
async function getCurrentSecAndPlay() {
  const response = await fetch('/currentPosition')
  const data = await response.json()
  video.currentTime = data.currentSec
  video.play()
}
</script>
This is a workaround to mimic livestream behaviour, which may be good enough for your case. I do not have a lot of knowledge of HLS and RTMP, but if you want to build a true livestream you should study those.
sources:
https://developer.mozilla.org/en-US/docs/Web/API/HTMLMediaElement
https://www.w3.org/2010/05/video/mediaevents.html

Samsung Smart TV App (not Tizen) audio plays only once

In a Samsung Smart TV app up to 2014 (not Tizen) I have tried two ways to play a short (about one second) audio file, tested in the 2014 emulator 5.1:
index.html:
<!-- HTML5 audio tag -->
<audio id="audio" src="http://luniks.net/other/0-99A-Z/1.ogg"></audio>
<!-- Player plugin -->
<object id="pluginPlayer" classid="clsid:SAMSUNG-INFOLINK-PLAYER"></object>
Main.js:
Main.keyDown = function() {
  var keyCode = event.keyCode;
  // arrow left on remote control, just for testing
  if (keyCode == 4) {
    // HTML5 audio tag
    document.getElementById("audio").play();
  }
  // arrow right on remote control, just for testing
  if (keyCode == 5) {
    // Player plugin
    var playerObj = document.getElementById('pluginPlayer');
    playerObj.Play("http://luniks.net/other/0-99A-Z/1.ogg");
  }
};
Either way, the audio file plays only once during the app's lifecycle and only plays again after restarting the app.
I am not sure if the sound is muted or if audio doesn't play at all after it played once.
The same with a <video> tag works fine, the video can be repeatedly played without problems.
When a <video> tag is present (defined after the <audio> tag), the <audio> tag does nothing.
When playing audio with the Player plugin while the video is playing, the video continues to play but is muted.
My final goal is to play a series of short audio files while a video that possibly has (low volume) sound is playing. From my experience so far, I can forget about that. Am I right?
Just add loop="true" in the audio tag and test. It works fine for me.

Soundjs on safari and mobile browsers

Currently my audio won't play on Safari or on mobile devices.
It works fine on a desktop PC in Firefox, Chrome and IE.
var manifest = [
  { id: "correct", src: 'assets/correct.mp3|assets/correct.ogg' },
  { id: "wrong", src: 'assets/wrong.mp3|assets/wrong.ogg' }
];
var queue = new createjs.LoadQueue();
queue.installPlugin(createjs.Sound);
queue.loadManifest(manifest, true);
And I'm calling the play function like this:
createjs.Sound.play("correct");
This function is written inside a function that's called when a user presses a div.
That code looks like it should work. Web Audio is initially muted on iOS devices, but when play is called inside of a user event it unmutes.
There are a couple of possibilities (without seeing the rest of the code):
1. You are working on iPad 1, which does not support Web Audio and has HTML audio disabled by default due to severe limitations.
2. You are not waiting for the audio to finish loading before calling play:
queue.addEventListener("complete", loadComplete);
3. The audio file path is incorrect and therefore the load is failing, which you can detect by listening for an "error" event.
4. You are using a non-default encoding for the mp3 files that is not supported by Safari. Generally that would break in other browsers as well, though.
5. Safari requires QuickTime for HTML audio to play, so that could be a problem.
6. Using createjs.Sound.registerPlugins, SoundJS has been set to use a plugin that is unsupported on mobile, such as FlashPlugin. You can check the current plugin with:
createjs.Sound.activePlugin.toString();
You may find the Mobile Safe Tutorial useful. Hope that helps.
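The "wait for loading before calling play" point can be sketched with a plain-JS stub. The stub queue below is not the real createjs API; it only mimics the addEventListener call used above to show the ordering issue:

```javascript
// Stand-in for a load queue; NOT the createjs API beyond the
// addEventListener name. It only demonstrates the event ordering.
function makeStubQueue() {
  const listeners = {};
  return {
    addEventListener(type, fn) {
      (listeners[type] = listeners[type] || []).push(fn);
    },
    emit(type) {
      (listeners[type] || []).forEach((fn) => fn());
    },
  };
}

const stubQueue = makeStubQueue();
let loaded = false;
stubQueue.addEventListener("complete", () => { loaded = true; });

// Guarded play: only call createjs.Sound.play once loading completed.
function tryPlay() {
  if (!loaded) return false;         // too early: sound not loaded yet
  // createjs.Sound.play("correct"); // safe here in the real app
  return true;
}

const early = tryPlay();   // false: "complete" has not fired yet
stubQueue.emit("complete");
const late = tryPlay();    // true: now safe to play
```

In the real app the same guard means wiring play() calls to run only after the LoadQueue's "complete" event has fired, instead of immediately on page load.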
There is a way to hack around it: play an empty mp3, then play the actual audio.
First, load the empty mp3 in the manifest array:
var manifest = [
...
{ id: "empty", src: 'assets/empty.mp3|assets/empty.ogg' }
];
...
Before playing the sound, play the empty mp3:
createjs.Sound.play("empty");
createjs.Sound.play("correct");

How to play audio in background with firefox os?

In my manifest file I've add the audio-channel-content in permissions:
"permissions": {
"audio-channel-content":{"description":"Use the audio channel for the music player"}
}
In my index.html I've got an audio tag like:
<audio mozaudiochannel="content" preload="none" src="http://my-stream-url"></audio>
I can play my audio stream for about two minutes in total:
The first minute while the phone is unlocked.
After one minute the phone auto-locks the screen, and the stream continues playing for another minute.
Is it possible to play the audio stream for more than one minute after the lock?
Thanks in advance.
The code and permission block you have are correct, and I can confirm it works in Firefox OS 1.1. You can also do the whole thing in JavaScript:
var audio = new Audio();
audio.preload = 'none';
audio.mozAudioChannelType = 'content';
audio.src = 'http://my-stream-url';
Could it be that the Wi-Fi is shut down after the screen turns off (auto-lock)? Are you using the dev version of Gaia? Wi-Fi is set to always stay connected in the production version.
