Feasibility of one-to-many audio broadcast using a "fake" /stream.mp3 endpoint (node.js)

I wanted to experiment with something outside my comfort zone and prototype an "online radio" app.
I then fell into the rabbit hole of WebRTC streaming, media servers, WebRTC gateways, P2P network graphs...
It seems WebRTC is not suited for this kind of task: browsers limit the number of simultaneous peer connections (commonly cited at around 10), and scaling WebRTC to large audiences requires significant infrastructure work. For example:
WebRTC - scalable live stream broadcasting / multicasting
Then it occurred to me that simple live audio streams without JavaScript have existed for a while, in this form:
http://stream.radioreklama.bg/radio1.opus
The client for such streams can be a plain HTML <audio> tag.
Now all I have to do is create that "magic" URL where a live audio stream is available. Is this possible using Node.js?
The missing parts to create my prototype are:
1: Send a "live" audio stream from a client (broadcaster) to the server (using getUserMedia and socket.io).
2: Pipe this audio stream to a "/stream.mp3" URL with the proper encoding.
If feasible, I think this would be an interesting approach to solve the large-scale one-to-many streaming problem for audio, but maybe I'm missing some core information.
Ideal client:
import io from 'socket.io-client';
import React from 'react';

const socket = io.connect('//localhost:8888');

// Broadcasting code
navigator.mediaDevices
  .getUserMedia({ audio: true, video: false })
  .then((userMediaStream) => {
    const mediaRecorder = new MediaRecorder(userMediaStream, { mimeType: 'audio/webm' });
    mediaRecorder.ondataavailable = (event) => {
      socket.emit('sound-blob', event.data);
    };
    mediaRecorder.start();
  });

// Could be just a static html file
const App = () => (
  <div>
    <h1>Audio streaming client</h1>
    <audio controls>
      {/* the endpoint serves mp3, so audio/mpeg rather than audio/webm */}
      <source src="http://example.com/stream.mp3" type="audio/mpeg" />
    </audio>
  </div>
);
Ideal server:
const express = require('express');
const http = require('http');
const stream = require('stream');

const app = express();
const io = require('socket.io').listen(8888);

// PassThrough implements _read(), unlike a bare stream.Readable
const audioStream = new stream.PassThrough();

app.get('/stream.mp3', (req, res) => {
  audioStream.pipe(res);
});

io.on('connection', (socket) => {
  socket.on('sound-blob', (blob) => {
    audioStream.write(blob);
  });
});

const server = http.createServer(app);
server.listen(8080);
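One part I'm unsure about is whether a single PassThrough piped into every response behaves well as listeners come and go. The alternative I have in mind is per-listener fan-out (a sketch that replaces the route and handler above; untested):

// Track every open response and write each incoming chunk to all of them
const listeners = new Set();

app.get('/stream.mp3', (req, res) => {
  res.set('Content-Type', 'audio/mpeg');
  listeners.add(res);
  req.on('close', () => listeners.delete(res));
});

io.on('connection', (socket) => {
  socket.on('sound-blob', (blob) => {
    for (const res of listeners) res.write(blob);
  });
});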
Right now, the ondataavailable event only fires when the recording ends, but I think it should be possible to split the recording into chunks and stream them in real time. I'm not sure of the appropriate approach for this.
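From the MediaRecorder docs, passing a timeslice (in milliseconds) to start() should produce periodic chunks, though I haven't verified this end to end:

// Fires ondataavailable roughly every 250 ms instead of only on stop()
mediaRecorder.start(250);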
Once the stream reaches the server, there will probably be some encoding/converting to do before piping it to the /stream.mp3 endpoint, though I don't know for sure whether this is necessary either.
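If transcoding turns out to be needed, my rough idea is to shell out to ffmpeg and pipe through it (a sketch that slots into the server above; assumes ffmpeg with libmp3lame is installed on the server, untested):

const { spawn } = require('child_process');

// webm/opus chunks in on stdin, mp3 frames out on stdout
const ffmpeg = spawn('ffmpeg', [
  '-i', 'pipe:0',            // read the broadcaster's webm from stdin
  '-codec:a', 'libmp3lame',  // encode the audio as mp3
  '-f', 'mp3',               // raw mp3 stream, no seekable container needed
  'pipe:1'                   // write to stdout
]);

io.on('connection', (socket) => {
  socket.on('sound-blob', (blob) => ffmpeg.stdin.write(blob));
});

app.get('/stream.mp3', (req, res) => {
  res.set('Content-Type', 'audio/mpeg');
  ffmpeg.stdout.pipe(res);
});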
Would this even be possible to do? Any pitfalls I'm not seeing?
Thanks for sharing your thoughts!

Related

Push local WebRTC stream to a NodeJS server in the cloud

I have a task, but I can't seem to get it done.
I've created a very simple WebRTC stream on a Raspberry Pi which will function as a video-chat camera.
With Ionic I made a simple mobile application which can display my WebRTC stream when the phone is connected to the same network. This all works.
So right now I have my own local stream which shows on my app.
I now want to broadcast this stream from my phone to a live server, so other people can watch it.
I know how to create a NodeJS server which serves my webcam via the getUserMedia function, but I want to 'push' my WebRTC stream to a live server so I can retrieve a public URL for it.
Is there a way to push my local Websocket to a live environment?
I'm using a local RTCPeerConnection to create a MediaStream object
this.peerconnection = new RTCPeerConnection(this.peerservers);
this.peerconnection.onicecandidate = (event) => {
  if (event.candidate && event.candidate.candidate) {
    var candidate = {
      sdpMLineIndex: event.candidate.sdpMLineIndex,
      sdpMid: event.candidate.sdpMid,
      candidate: event.candidate.candidate
    };
    var request = {
      what: "addIceCandidate",
      data: JSON.stringify(candidate)
    };
    this.websockets.send(JSON.stringify(request));
  } else {
    console.log("End of candidates.");
  }
};
And to bind the stream object to my HTML Video tag I'm using this
onTrack(event) {
  this.remoteVideo.srcObject = event.streams[0];
}
My stream URL is something like MyLocalIP:port/streams/webrtc.
So I want to create a public URL out of it to broadcast it.
Is there a way to push my local Websocket to a live environment?
It's not straightforward, because you need more than vanilla WebRTC (which is peer-to-peer). What you want is an SFU (Selective Forwarding Unit). Take a look at mediasoup.
To see why this is needed, think about how the WebRTC connection is established in your current app: it's a negotiation between two parties, facilitated by a signaling server. To turn this into a multicast setup, you need a relay of sorts that establishes separate peer-to-peer connections to all senders and receivers.
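To give a feel for the server side, here's roughly what bootstrapping mediasoup looks like (sketched against the v3 API; the signaling that shuttles transport and producer parameters between browser and server is omitted):

const mediasoup = require('mediasoup');

async function startSfu() {
  // One worker (a C++ subprocess) can host many routers
  const worker = await mediasoup.createWorker();

  // A router is roughly one "room"; it relays media between transports
  const router = await worker.createRouter({
    mediaCodecs: [
      { kind: 'audio', mimeType: 'audio/opus', clockRate: 48000, channels: 2 },
      { kind: 'video', mimeType: 'video/VP8', clockRate: 90000 }
    ]
  });

  // Broadcaster: create a WebRtcTransport and produce() their track on it.
  // Each viewer: another WebRtcTransport, then consume() the producer.
  return { worker, router };
}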
You can do it with Socket.io & WebRTC, see the sample here
var offerer = new PeerConnection('http://domain:port', 'message', 'offerer');
offerer.onStreamAdded = function(e) {
  document.body.appendChild(e.mediaElement);
};

var answerer = new PeerConnection('http://domain:port', 'message', 'answerer');
answerer.onStreamAdded = function(e) {
  document.body.appendChild(e.mediaElement);
};

answerer.sendParticipationRequest('offerer');

How could I stream a video with a range from a FTP server in node.js

I'm using Node.js with Express and this FTP node package:
https://www.npmjs.com/package/ftp
here is what I do:
var express = require('express');
var Client = require('ftp');

var app = express();

// the route the front-end player requests (route name illustrative)
app.get('/video', function(req, res) {
  var c = new Client();
  c.on('ready', function() {
    c.get('foo.txt', function(err, stream) {
      if (err) throw err;
      stream.once('close', function() { c.end(); });
      stream.pipe(res);
    });
  });
  c.connect();
});
and on the front end I simply use a video player that gets its stream from that server.
The issue I'm having is that the .get method does not take a range parameter, so I cannot fetch a specific part of a video (e.g. a stream that starts at 5 minutes in); I can only stream from the beginning.
How could I open a stream of a video on an FTP server at a given offset, so I can serve a specific part of that video using the Range header coming from the client?
Thanks a lot
Have you found this example? Streaming a video file to an html5 video player with Node.js so that the video controls continue to work?
You didn't provide any details on how you are loading the video on the frontend; add some snippets of how you wrote that, both front and back end.
If you just need a way to pass a range parameter through the GET request, you can use the query string, but you would have to implement that manually, and I don't believe you'd want to do that (/video.mpg?range=99).
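That said, the ftp package does expose restart(byteOffset, callback), which sends the FTP REST command before the next transfer. A rough sketch of mapping the HTTP Range header onto it (assumes the FTP server supports REST; the route and filename are illustrative, and a real version would also set Content-Range/Content-Length):

const express = require('express');
const Client = require('ftp');

const app = express();

app.get('/video', (req, res) => {
  // "bytes=START-" from the player's Range header -> FTP byte offset
  const range = req.headers.range || 'bytes=0-';
  const start = parseInt(range.replace(/bytes=/, '').split('-')[0], 10) || 0;

  const c = new Client();
  c.on('ready', () => {
    c.restart(start, (err) => {      // REST: resume next transfer at `start`
      if (err) { res.sendStatus(500); return c.end(); }
      c.get('video.mp4', (err, stream) => {
        if (err) { res.sendStatus(500); return c.end(); }
        res.status(206);             // Partial Content
        stream.once('close', () => c.end());
        stream.pipe(res);
      });
    });
  });
  c.connect();
});

app.listen(3000);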

Error video-stream socket.io + socket.io-stream (Maximum call stack size exceeded)

I'm trying to create a webcam video stream through my Node.js server with the help of socket.io and socket.io-stream.
I want to capture the video in /camera, open a stream through socket.io (with the help of socket.io-stream) with the video, and receive it on the index URL.
When I connect to the server through /camera, and thus initiate the stream, the server crashes with the error "RangeError: Maximum call stack size exceeded".
The error seems to come from "/node_modules/socket.io/node_modules/has-binary/index.js:48:23".
In the examples I left out most of the unrelated code, as the server/connection works fine for transferring data snippets.
Here is my current setup:
Server:
var ioStream = require('socket.io-stream');

io.on('connection', function(socket) {
  ioStream(socket).on('videoStream', function(stream, data) {
    socket.broadcast.emit('videoStream', stream);
  });
});
Camera
window.glob_socket = io();

var video = document.getElementById('camera');

navigator.getUserMedia = navigator.getUserMedia || navigator.webkitGetUserMedia || navigator.mozGetUserMedia;

if (navigator.getUserMedia) {
  navigator.getUserMedia({
    audio: false,
    video: {
      width: 320,
      height: 240
    }
  }, function(videoStream) {
    // Local preview
    video.src = window.URL.createObjectURL(videoStream);
    video.onloadedmetadata = function(e) {
      video.play();
    };
    // Stream
    var stream = ss.createStream();
    ss(glob_socket).emit('videoStream', stream, videoStream);
    fs.createReadStream(videoStream).pipe(stream);
  }, function(err) {
    console.log("The following error occurred: " + err.name);
  });
} else {
  console.log("getUserMedia not supported");
}
Index
var video = document.getElementById('camera');
ss(glob_socket).on('videoStream', function(stream) {
video.src = window.URL.createObjectURL(stream);
video.onloadedmetadata = function(e) {
video.play();
};
});
I'm unable to test the code on the server/index as the server crashes when the camera initiates the stream.
Does anyone have an idea what's wrong here?
Unfortunately, you can't do that. The socket.io-stream library deals with binary streams such as static files, not with live video streams (and fs.createReadStream expects a file path, not a MediaStream, and doesn't exist in the browser at all).
To share a live video stream you should use WebRTC. There are a couple of libraries that may help you get started:
https://simplewebrtc.com/
http://peerjs.com/
It's worth noting that WebRTC won't transfer video through your server (in most cases). It transfers the video stream from one peer to another directly, which is good for your server's bandwidth. But there can be problems when peers are behind symmetric NAT; in that case the video stream has to be relayed through a TURN server.
You can find more information about WebRTC here.
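With PeerJS, for example, the whole exchange boils down to something like this (a sketch; the peer ids are placeholders, and the default PeerJS cloud broker handles signaling):

// Broadcaster: call a known peer id with the webcam stream
const broadcaster = new Peer('broadcaster-id');
navigator.mediaDevices.getUserMedia({ video: true, audio: false })
  .then((stream) => {
    broadcaster.call('viewer-id', stream); // one call per viewer
  });

// Viewer: answer the incoming call and attach the remote stream
const viewer = new Peer('viewer-id');
viewer.on('call', (call) => {
  call.answer(); // answer without sending a stream back
  call.on('stream', (remoteStream) => {
    document.getElementById('camera').srcObject = remoteStream;
  });
});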

Wav to Blob in nodejs

I'm not sure how to create a blob from a wav file in node. Do I just use Buffer like so?...
var blippityBlob = new Buffer(filePathToWave);
Maybe you could take a look at BinaryJS
Quoting:
BinaryJS is a lightweight framework that utilizes websockets to send, stream, and pipe binary data bidirectionally between browser javascript and Node.js.
Server Code
var server = BinaryServer({port: 9000});

server.on('connection', function(client) {
  client.on('stream', function(stream, meta) {
    var file = fs.createWriteStream(meta.file);
    stream.pipe(file);
  });
});
Client Code
var client = BinaryClient('ws://localhost:9000');

client.on('open', function() {
  var stream = client.createStream({file: 'hello.txt'});
  stream.write('Hello');
  stream.write('World!');
  stream.end();
});
The answer lies in a combination of these two posts:
Node.js canĀ“t create Blobs?
Convert a binary NodeJS Buffer to JavaScript ArrayBuffer
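Put together, it looks roughly like this (the filename is a placeholder):

const fs = require('fs');

// Read the wav into a Node Buffer; note that new Buffer(path) would only
// wrap the path string itself, not the file contents
const buffer = fs.readFileSync('sound.wav');

// Slice out exactly the bytes this Buffer views, as an ArrayBuffer,
// which is what a browser-side Blob constructor accepts
const arrayBuffer = buffer.buffer.slice(
  buffer.byteOffset,
  buffer.byteOffset + buffer.byteLength
);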

Broadcast web cam with socket.io?

I can get stream from browser with these lines of code:
var socket = io.connect('127.0.0.1:9000');
navigator.getUserMedia = navigator.getUserMedia ||
navigator.webkitGetUserMedia ||
navigator.mozGetUserMedia ||
navigator.msGetUserMedia;
var cam;
navigator.getUserMedia({video: true, audio: true}, function(stream) {
//var call = peer.call('another-peers-id', stream);
//call.on('stream', function(remoteStream) {
// Show stream in some video/canvas element.
//});
cam = stream;
console.log(stream);
}, function(err) {
console.log('Failed to get local stream' ,err);
});
Now I want to send the live stream to a socket.io server and then broadcast it from there.
Is there any simple code to do it?
I tried for a few days to get something like this working, and after going down the rabbit hole I ended up just firing up an instance of Wowza media server on AWS (following these instructions) and managing the server with my node instance instead of trying to do the video.
It worked beautifully. Scales well (auto-scaling even), relatively easy to deploy, and has great support on their forums. A++, would code again.
Also, ultimately you're probably going to need to do some transcoding/scaling/watermarking if this is to be a commercial project, and Wowza leverages NVENC on the GPU on Amazon's graphics instances, which just blows anything else out of the water.
