Send MediaStream through NodeJS

I have the MediaStream object returned from getUserMedia and can show it on my own screen.
The thing is, I don't know how to send / pipe / stream that MediaStream from point A through NodeJS using socket.io to point B.
My code right now is:
// Camera
if (navigator.getUserMedia) {
  navigator.getUserMedia({ audio: true, video: true }, function (stream) {
    video.src = window.URL.createObjectURL(stream);
    webcamstream = stream;
  }, onVideoFail);
} else {
  alert('failed');
}

function onVideoFail(e) {
  console.log('webcam fail!', e);
}
I need a way to send this stream constantly to another user through NodeJS.

The comments made in the answer to Audio and video conference with NodeJS are still valid:
if you were to send the streams through socket.io instead of a peer connection, you would be sending the raw video content (bytes). Compared to webRTC, you would lose:
the streaming part (RTP/RTCP), and the corresponding packet loss concealment
the bandwidth adaptation
the encryption (DTLS)
the media engine (jitter correction, …)
Why not implement an SFU in node.js? Use node.js/socket.io for the signaling (i.e. the initial handshake), but also use node.js as a peer connection endpoint which relays/routes the media stream to (an)other peer(s). You would have an intermediate server like you seem to want, plus all the advantages of webRTC.
Another solution is to use an MCU; google webrtc+mcu and you will find many.

This might be a little late, but there's now a library called wrtc/node-webrtc. Use that! https://github.com/node-webrtc/node-webrtc
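For illustration, here is a minimal sketch of the SFU/relay idea using node-webrtc's standard RTCPeerConnection API. The signaling (exchanging offers/answers and ICE candidates, e.g. over socket.io) is omitted, and the variable names are made up:

const { RTCPeerConnection } = require('wrtc');

// The server is itself a WebRTC peer: one connection faces the sender,
// another faces the receiver, and tracks are forwarded between them.
const fromSender = new RTCPeerConnection();
const toReceiver = new RTCPeerConnection();

fromSender.ontrack = (event) => {
  // Relay each incoming audio/video track to the receiving peer.
  toReceiver.addTrack(event.track, ...event.streams);
};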

Related

Is there a better solution than socket.io for slow-speed in-game chat?

I am creating a browser game with node.js (backend API) and Angular (frontend). My goal is to implement an in-game chat to allow communication between players on the same map. The chat is not an essential part of the game, so messages don't need to be instant (a few seconds of latency should be OK). It is just a cool feature for talking together now and then.
A good solution could be to implement socket.io for real-time communication. But as the chat is not an essential component and is the only thing that would require websockets, I'm wondering whether there is an alternative that avoids overloading the server with socket handling.
I thought about polling my REST API every 2 or 3 seconds to ask for new messages, but that may overload the server just the same... What are your recommendations?
Thank you for your advice.
There's a pretty cool package called signalhub. It has a nodejs server component and stuff you can use in your users' browsers. It uses a not-so-well-known application of the HTTP(S) protocol called EventSource. EventSource basically opens a persistent HTTP(S) connection to a web server.
It's a reliable and lightweight setup. (The README talks about WebRTC signalling, but it's useful for much more than that.)
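For context, plain EventSource usage in a browser looks like this ('/events' is a hypothetical endpoint that keeps the response open and writes text/event-stream data):

// Opens a persistent HTTP(S) connection; the server pushes
// "data: ..." lines which arrive as message events.
const source = new EventSource('/events')
source.onmessage = (event) => {
  console.log('received:', event.data)
}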
On the server side, a simple but effective server setup might look like this:
module.exports = function makeHubServer (port) {
  const signalhubServer = require('signalhub/server')
  const hub = signalhubServer({ maxBroadcasts: 0 })

  hub.on('subscribe', function (channel) {
    /* you can, but don't have to, keep track of subscriptions here. */
  })

  hub.on('publish', function (channel, message) {
    /* you can, but don't have to, keep track of messages here. */
  })

  hub.listen(port, null, function () {
    const addr = hub.address() // e.g. log addr.port here
  })

  return hub
}
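Using that module could then be as simple as this (the filename is hypothetical):

const makeHubServer = require('./hub-server') // hypothetical filename
const hub = makeHubServer(8080)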
In a browser you can do this sort of thing. It uses GET to open a persistent EventSource connection for receiving messages, and when it's time to send a message, it POSTs it. (Chromium's devtools Network tab knows all about EventSource connections.)
const hub = signalhub('appname', [hubUrl])
...
/* to receive */
hub.subscribe('a-channel-name')
  .on('data', message => {
    /* here's a payload */
    console.log(message)
  })
...
/* to send */
hub.broadcast('a-channel-name', message)

discord.js playing audio on multiple servers

I made a test bot for discord.js using Node.js / ffmpeg that plays a radio station in a channel on Discord, and there are no problems there. However, I was wondering how it could stream the same station to multiple servers/channels efficiently.
For example, if I am playing it on "Discord server one":
var voiceChannel = message.member.voiceChannel;
voiceChannel.join().then(connection => {
  console.log("joined channel");
  const dispatcher = connection.playArbitraryInput('http://philae.shoutca.st:8950/live', { volume: 0.5 });
  dispatcher.on("end", end => {
    console.log("left channel");
    voiceChannel.leave();
  });
}).catch(err => console.log(err));
However, if I want to play it on another server that my bot is a member of, do I need to create the audio stream again?
If so, that kinda sucks. My bot would have to encode the stream separately for each instance?
I would like to know if there is any way I can reuse the audio stream and re-stream it, so that if I had my bot in 100 channels it's only one download of the stream and 100 uploads.
I hope I am making sense, but let me write some fake code to try to help explain.
Let's say instead I could do something like this:
let cachedAudioStream = cacheArbitraryInput('http://philae.shoutca.st:8950/live', {});
then for every outgoing instance:
connection1.playArbitraryInput(cachedAudioStream, { volume: 0.5 });
connection2.playArbitraryInput(cachedAudioStream, { volume: 0.8 });
connection3.playArbitraryInput(cachedAudioStream, { volume: 1 });
and so on
Thanks
Since the stream is managed by Discord.js (you only give it a link, and Discord.js uses FFmpeg to handle it), you can't do this without modifying the Discord.js code.
Your question is relevant, but I don't think it's feasible, at least if you use Discord.js. You can still open an issue on their GitHub repository to ask.
discord.js has a 'Voice Broadcast' for things like radio bots. Yes, you do have to connect each call to the broadcast, but that's all.
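As a rough sketch against the discord.js v11 API the question is using (names are illustrative): a broadcast encodes the station once, and each voice connection replays the already-encoded audio.

// Encode the radio station once, bot-wide.
const broadcast = client.createVoiceBroadcast();
broadcast.playArbitraryInput('http://philae.shoutca.st:8950/live');

// Every server's connection reuses the same encoded audio.
voiceChannel.join().then(connection => {
  connection.playBroadcast(broadcast);
});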

Stream data from Browser to nodejs server using getUserMedia

I'm trying to send data (video and audio) from the browser to a NodeJS server. On the client side, I'm using getUserMedia to get a media stream and I'm trying to send it over websockets via SockJS. Here is my code so far:
navigator.mediaDevices.getUserMedia({
  audio: true,
  video: true
})
.then(function (stream) {
  // trying to send stream
  var video = window.URL.createObjectURL(stream);
  // send stream
  mystream.send(video.play());
})
Where mystream is a SockJS instance.
My need is to persist the video as it is watched by a peer.
Has anyone ever sent a video stream to a server? I'm out of ideas on this one. Any help/hint is appreciated.
After hours of researching, I just gave up and used Kurento. It is well documented and there are some pretty interesting examples involving NodeJS sample code. I'll leave the question open in case someone comes up with a better idea.
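For reference, here is a minimal sketch of the receiving side of a record-and-send approach with sockjs-node. SockJS frames are strings, so the browser would base64-encode each recorded chunk before sending; the filenames and prefix are illustrative:

const fs = require('fs');
const http = require('http');
const sockjs = require('sockjs');

const stream = sockjs.createServer();
stream.on('connection', function (conn) {
  // One file per connection; SockJS delivers strings, so decode base64.
  const out = fs.createWriteStream('capture-' + conn.id + '.webm');
  conn.on('data', function (msg) {
    out.write(Buffer.from(msg, 'base64'));
  });
  conn.on('close', function () {
    out.end();
  });
});

const server = http.createServer();
stream.installHandlers(server, { prefix: '/mystream' });
server.listen(9999, '0.0.0.0');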

Send webRTC getUserMedia webCam stream over socketio

I have this piece of code:
navigator.mediaDevices.getUserMedia(param)
  .then(function (stream) {
    video.srcObject = stream;
    video.play();
  })
  .catch(function (err) {});
With this code I want to send the stream over socket.io to a NodeJS server, so that I can use it on the receiving end to display it in a video element.
How can I achieve this?
I think something like this is your best bet: https://stackoverflow.com/a/17938723/5915143
You'd record the stream using MediaStreamRecorder and send it with emit() calls over socket.io to your server.
Alternatively you can use a streaming library built on socket.io like Endpoint.js to handle the stream.
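A minimal sketch of that record-and-emit idea using the browser's standard MediaRecorder API (the answer names the MediaStreamRecorder library, but the built-in API follows the same pattern); `socket` is assumed to be a connected socket.io client:

navigator.mediaDevices.getUserMedia({ audio: true, video: true })
  .then(function (stream) {
    // Emit a compressed chunk roughly every second; the server can
    // concatenate the chunks into a playable webm file.
    const recorder = new MediaRecorder(stream, { mimeType: 'video/webm' });
    recorder.ondataavailable = function (event) {
      if (event.data.size > 0) {
        socket.emit('video-chunk', event.data); // socket.io handles Blobs
      }
    };
    recorder.start(1000); // timeslice in ms
  });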

Stream audio simultaneously from soundcloud source with node

I am using the SoundCloud API from a node server. I want to stream an audio track simultaneously to multiple users.
I tried something like this (using the code from this question: Streaming audio from a Node.js server to HTML5 <audio> tag), but it does not work. Any idea how I could do this?
var radio = require("radio-stream");
var http = require('http');

var url = "http://api.soundcloud.com/tracks/79031167/stream?client_id=db10c5086fe237d1718f7a5184f33b51";
var stream = radio.createReadStream(url);

var clients = [];

stream.on("connect", function () {
  console.error("Radio Stream connected!");
  console.error(stream.headers);
});

stream.on("data", function (chunk) {
  if (clients.length > 0) {
    for (client in clients) {
      clients[client].write(chunk);
    }
  }
});

stream.on("metadata", function (title) {
  console.error(title);
});

var server = http.createServer(function (req, res) {
  res.writeHead(200, {
    "Content-Type": "audio/mpeg",
    'Transfer-Encoding': 'chunked'
  });
  clients.push(res);
  console.log('Client connected; streaming');
});

server.listen("8000", "0.0.0.0");
console.log('Server running at http://127.0.0.1:8000');
There are several problems here.
Follow Redirects
The radio-stream module that you're using hasn't been updated in 4 years. That's an eternity in Node.js API terms. I recommend not using it, as there are undoubtedly compatibility issues with current and future versions of Node.js. At a minimum, there are much better ways of handling this now with the new streams API.
In any case, that module does not follow HTTP redirects, and the SoundCloud API is redirecting you to the actual media file.
Besides, the radio-stream module is built to demux SHOUTcast/Icecast-style metadata, not MP3 ID3 data. It won't help you.
All you need is a simple http.get(). You can then either follow the redirect yourself, or use the request package. More here: How do you follow an HTTP Redirect in Node.js?
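As an illustration, here is a minimal sketch of following the redirect by hand (assuming http URLs like the SoundCloud one above; a production version should also handle errors and protocol changes):

var http = require('http');

function getFollowingRedirects(url, onResponse, hops) {
  hops = hops || 5; // cap the hop count to avoid redirect loops
  http.get(url, function (res) {
    var isRedirect = res.statusCode >= 300 && res.statusCode < 400 &&
      res.headers.location;
    if (isRedirect && hops > 0) {
      res.resume(); // discard the redirect body
      getFollowingRedirects(res.headers.location, onResponse, hops - 1);
    } else {
      onResponse(res); // res is now the readable media stream
    }
  });
}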
Chunked Encoding
Many streaming clients cannot deal with chunked encoding. Node.js (correctly) adds it when you have streaming output. For our purposes though, let's disable it.
res.useChunkedEncodingByDefault = false;
https://stackoverflow.com/a/11589937/362536
Building a Coherent Stream
In theory, you can just append MPEG stream after MPEG stream and all will work fine. In practice, this doesn't work. ID3 tags will corrupt the stream. One file might be at a different sample rate than the next, and most software will not be able to switch the hardware to that new sample rate on the fly. Basically, you cannot reliably do what you're trying to do.
The only thing you can do is re-encode the entire stream by playing back these audio files, and getting a solid stream out the other end. This gives you the added bonus that you can handle other codecs and formats, not just MP3.
To handle many of your codec issues, you can utilize FFmpeg. However, you're going to need a way to play back those files to FFmpeg for encoding.
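For example, here is a sketch of piping audio through FFmpeg so everything comes out as one consistently-encoded MP3 stream (the flags are illustrative, not tuned, and `inputStream` stands in for whatever source you play back):

var spawn = require('child_process').spawn;

// Re-encode whatever comes in on stdin to a fixed-format MP3 on stdout.
var ffmpeg = spawn('ffmpeg', [
  '-i', 'pipe:0',   // input from stdin
  '-f', 'mp3',      // force MP3 output
  '-ar', '44100',   // fixed sample rate, so clients never have to switch
  '-b:a', '128k',   // fixed bitrate
  'pipe:1'          // output to stdout
]);

inputStream.pipe(ffmpeg.stdin);
ffmpeg.stdout.on('data', function (chunk) {
  // broadcast chunk to connected clients here
});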
Rate Limiting
You must stream audio at the rate of playback. (You can send an initial buffer to get clients started quickly, but you can't keep slamming data at them as fast as possible.) If you don't do this, you will run out of memory on the server very quickly: clients will lower their TCP window size down to zero and stay there until the audio has caught up enough to allow buffering more data. Since you're not using pipe, your streams are in flowing mode and will buffer indefinitely on the server. This is actually a good thing in one way, because it prevents one slow client from slowing down the others. It's a bad thing, though, in that your code streams as fast as possible rather than at the rate of playback.
If you play back the audio to another encoder, use the real-time clock, averaged over several seconds, as your pacing clock. It doesn't have to be perfect; that's what client buffers are for. If you're playing back to an audio device, it has its own clock, of course, which will be used.
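As an illustration of pacing, here is a sketch that buffers the source and forwards it at roughly the playback bitrate (128 kbps is an assumption; `sourceStream` and `clients` refer to the question's code):

var BYTES_PER_SECOND = 128 * 1000 / 8; // assumed 128 kbps MP3
var TICK_MS = 250;

var backlog = [];
sourceStream.on('data', function (chunk) {
  backlog.push(chunk);
});

// A wall-clock timer, not the download rate, drives the writes.
setInterval(function () {
  var budget = BYTES_PER_SECOND * TICK_MS / 1000;
  var sent = 0;
  while (backlog.length > 0 && sent < budget) {
    var chunk = backlog.shift();
    sent += chunk.length;
    clients.forEach(function (res) {
      res.write(chunk);
    });
  }
}, TICK_MS);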
What you should actually do
You've stumbled into a huge project. I strongly recommend using Liquidsoap instead. There are ways you can control it from Node.js. From there, use a server like Icecast for your streaming.
