Push local WebRTC stream to a NodeJS server in the cloud - node.js

I have a task, but I can't seem to get it done.
I've created a very simple WebRTC stream on a Raspberry Pi which will function as a videochat-camera.
With Ionic I made a simple mobile application which can display my WebRTC stream when the phone is connected to the same network. This all works.
So right now I have my own local stream which shows on my app.
I now want to be able to broadcast this stream from my phone to a live server, so other people can spectate it.
I know how to create a NodeJS server which deploys my webcam with the 'getUserMedia' function. But I want to 'push' my WebRTC stream to a live server so I can retrieve a public URL for it.
Is there a way to push my local Websocket to a live environment?
I'm using a local RTCPeerConnection to create a MediaStream object
this.peerconnection = new RTCPeerConnection(this.peerservers);
this.peerconnection.onicecandidate = (event) => {
    if (event.candidate && event.candidate.candidate) {
        var candidate = {
            sdpMLineIndex: event.candidate.sdpMLineIndex,
            sdpMid: event.candidate.sdpMid,
            candidate: event.candidate.candidate
        };
        var request = {
            what: "addIceCandidate",
            data: JSON.stringify(candidate)
        };
        this.websockets.send(JSON.stringify(request));
    } else {
        console.log("End of candidates.");
    }
};
And to bind the stream object to my HTML Video tag I'm using this
onTrack(event) {
    this.remoteVideo.srcObject = event.streams[0];
}
My stream url is something like: MyLocalIP:port/streams/webrtc
So I want to create a public URL out of it to broadcast it.

Is there a way to push my local Websocket to a live environment?
It's not straightforward, because you need more than vanilla WebRTC (which is peer-to-peer). What you want is an SFU (Selective Forwarding Unit). Take a look at mediasoup.
To see why this is needed, think about how the WebRTC connection is established in your current app: it's a negotiation between two parties, facilitated by a signaling server. To turn this into a one-to-many setup you will need a proxy of sorts that then establishes separate peer-to-peer connections to all senders and receivers.
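As a rough illustration of what the SFU side involves, here is a minimal mediasoup (v3) sketch. The codec list and YOUR_PUBLIC_IP are placeholders, and the signaling (how the Pi and the viewers exchange transport and producer parameters, e.g. over the WebSocket you already use for ICE candidates) is deliberately left out, so treat it as an outline rather than a working server:
// npm install mediasoup
const mediasoup = require('mediasoup');

async function startSfu() {
    const worker = await mediasoup.createWorker();
    const router = await worker.createRouter({
        mediaCodecs: [
            { kind: 'audio', mimeType: 'audio/opus', clockRate: 48000, channels: 2 },
            { kind: 'video', mimeType: 'video/VP8', clockRate: 90000 }
        ]
    });

    // One WebRTC transport for the Raspberry Pi (the sender)...
    const sendTransport = await router.createWebRtcTransport({
        listenIps: [{ ip: '0.0.0.0', announcedIp: 'YOUR_PUBLIC_IP' }], // placeholder
        enableUdp: true,
        enableTcp: true,
        preferUdp: true
    });

    // ...over your own signaling channel the Pi then calls produce(), which gives
    // the server a Producer it can forward to every viewer:
    //   const producer = await sendTransport.produce({ kind, rtpParameters });
    // Each viewer gets its own transport plus a consumer:
    //   const consumer = await viewerTransport.consume({ producerId: producer.id, rtpCapabilities });
}

startSfu();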

You can do it with Socket.io & WebRTC, see the sample here
var offerer = new PeerConnection('http://domain:port', 'message', 'offerer');
offerer.onStreamAdded = function(e) {
    document.body.appendChild(e.mediaElement);
};

var answerer = new PeerConnection('http://domain:port', 'message', 'answerer');
answerer.onStreamAdded = function(e) {
    document.body.appendChild(e.mediaElement);
};

answerer.sendParticipationRequest('offerer');

Related

Adding a user to a room connected to a different server with Node, SocketIO and Redis

I am working on writing server-side code in Node.js for a Swift-based iOS application. Currently, the code works when running it on one EC2 instance, but I am working on setting up a network load balancer so that it can more appropriately scale with incoming user traffic. I decided that the easiest way to achieve this is to use the Redis adapter. So now, my server.js file includes:
const app = new Koa();
const server = http.createServer(app.callback());
const io = require('socket.io')(server);
const redisAdapter = require('socket.io-redis');
io.adapter(redisAdapter({ host: 'my-elasticache-redis-endpoint', port: 6379 }));
Based on the documentation, this seemed like the only step necessary from a code standpoint to get Redis working in the project, but I may be missing something. From an architecture perspective, I enabled sticky sessions on the target group and set up two servers, both running this code. In addition, if I print out the Socket.IO information, I can see that it has connected to the Redis endpoint.
The issue is as follows. Let's say I have two people, Person A and Person B, each connected to a different server. The application is supposed to function like so:
Person A adds person B to a socket room. Then the server emits an event to everyone in that room saying that person B has joined, so the front end can respond accordingly.
This is done through the following function:
protected async r_joinRoom(game: GameEntity, player: PlayerEntity): Promise<void> {
    return new Promise((res, rej) => {
        let socket: any;
        socket = this._io.sockets.connected[player.socket_id];
        if (!socket) {
            socket = this._socket;
        }
        socket.join(`game/${game.id}`, (err: any) => {
            if (err) {
                return rej(new GameError(`Unable to join the room=${game.id}.\n${err}`));
            }
            res();
        });
    });
}
The premise here is that Person B is a player, and as a player he has an associated socket id that the server keeps track of. I believe the issue, however, is that socket = this._io.sockets.connected[player.socket_id]; does not find the connected player, because he is technically connected to a different server. Printing out the socket shows it as null, and if I subsequently run that exact same function on the server Person B is connected to, he joins the room no problem. Therefore, when the emitted event takes place after 'adding' Person B to the room, only Person A's phone gets the event, and not B's. So is this an issue with my Redis setup? Or is there a way to see all the clients connected to any of the servers running the Node.js app?
I ended up answering my own question. When you add a socket to the room, you have to do it directly through the adapter. From the documentation, that means I would switch socket.join... to
io.of('/').adapter.remoteJoin('<my-id>', 'room1', (err) => {
    if (err) { /* unknown id */ }
    // success
});
Using that remoteJoin function worked right off the bat.
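For context, here is a rough sketch of how the question's r_joinRoom could be rewritten on top of remoteJoin. GameEntity, PlayerEntity, GameError, this._io and player.socket_id all come from the question; how the promise is wired around the callback is my assumption:
protected async r_joinRoom(game: GameEntity, player: PlayerEntity): Promise<void> {
    return new Promise((res, rej) => {
        // remoteJoin asks the Redis adapter to join the socket to the room,
        // regardless of which server instance that socket is actually connected to.
        this._io.of('/').adapter.remoteJoin(player.socket_id, `game/${game.id}`, (err: any) => {
            if (err) {
                return rej(new GameError(`Unable to join the room=${game.id}.\n${err}`));
            }
            res();
        });
    });
}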

How could I stream a video with a range from an FTP server in node.js

I'm using nodejs with express and this FTP node package
https://www.npmjs.com/package/ftp
here is what I do:
var Client = require('ftp');
var fs = require('fs');

var c = new Client();
c.on('ready', function() {
    c.get('foo.txt', function(err, stream) {
        if (err) throw err;
        stream.once('close', function() { c.end(); });
        stream.pipe(res); // res is the Express response object for the incoming request
    });
});
c.connect();
and on the front end I simply use a video player that gets its stream from that server.
The issue I'm having is that the .get method does not provide a range parameter, so I cannot get a specific part of a video (e.g. a stream that starts 5 minutes into the video). I'm only able to get a stream from its start.
How could I open a stream of a video on an FTP server with a given range, so that I can later stream a specific part of that video using the Range header coming from the client?
Thanks a lot
Have you found this example? Streaming a video file to an html5 video player with Node.js so that the video controls continue to work?
You didn't provide any details on how you are loading the video on the frontend; add some snippets of how you wrote that, both on the front end and the back end.
If you just need a way to pass a range parameter through a GET request, you can use a query string, but you would have to implement that manually, and I don't believe you would want to do that (/video.mpg?range=99)
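If the FTP server supports the REST command, one way to honour ranges is to set a byte offset before the download starts; the ftp package exposes restart(byteOffset, callback) for that, alongside size() for the total length. The sketch below maps the browser's Range header onto that offset. The file name, host and MIME type are placeholders and error handling is minimal, so treat it as an outline of the idea rather than production code:
var express = require('express');
var Client = require('ftp');

var app = express();

app.get('/video', function(req, res) {
    var c = new Client();
    c.on('ready', function() {
        c.size('foo.mp4', function(err, total) {
            if (err) { res.status(500).end(); return c.end(); }

            // Parse "Range: bytes=START-" sent by the player; default to the beginning.
            var start = 0;
            if (req.headers.range) {
                start = parseInt(req.headers.range.replace(/bytes=/, '').split('-')[0], 10) || 0;
            }

            // REST tells the FTP server to begin the transfer at this byte offset.
            c.restart(start, function(err) {
                if (err) { res.status(500).end(); return c.end(); }
                c.get('foo.mp4', function(err, stream) {
                    if (err) { res.status(500).end(); return c.end(); }
                    res.writeHead(206, {
                        'Content-Range': 'bytes ' + start + '-' + (total - 1) + '/' + total,
                        'Accept-Ranges': 'bytes',
                        'Content-Length': total - start,
                        'Content-Type': 'video/mp4'
                    });
                    stream.once('close', function() { c.end(); });
                    stream.pipe(res);
                });
            });
        });
    });
    c.connect({ host: 'ftp.example.com' }); // placeholder: your FTP host/credentials
});

app.listen(3000);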

Is Node-XMPP useless? Choosing an XMPP server

I am choosing an XMPP server, and am currently trying NodeXMPP.
I installed the complete NodeXMPP stack (core, server, client, component, dependencies...).
What strikes me is that I have to do all the back-end stuff myself: making clients speak to each other, etc. Other XMPP servers (Tigase, ejabberd, ...) handle this out of the box.
My tiny instance:
I create a server and store clients in an array, then search for a client when another one tries to speak:
var xmpp = require('../index')

var c2s = new xmpp.C2SServer({
    port: 5222,
    domain: 'localhost'
})

var clients = new Array();

c2s.on('connect', function(client) {
    client.on('authenticate', function(opts, cb) {
        console.log('AUTH' + opts.jid + ' -> ' + opts.password)
        clients.push(client);
    })
    client.on('stanza', function(stanza) {
        if (stanza.is('message') && (stanza.attrs.type !== 'error')) {
            var interlocuteur = getClient(stanza.attrs.to)
            if (interlocuteur)
                interlocuteur.send(stanza)
        }
    })
    client.on('disconnect', function() {
        console.log('DISCONNECT')
    })
    client.on('online', function() {
        console.log('ONLINE')
        client.send(new xmpp.Message({ type: 'chat' }).c('body').t('Hello there, little client.'))
    })
})
And my question: do I really need to code these basic operations myself?
If so, what is the point of Node-XMPP? Maybe it's meant to be used with NodeJS alongside another XMPP server like Prosody?
node-xmpp is "just" a library of components that allows you to build your own XMPP client, component or even server.
Being a library, it does not provide a complete solution for a particular use case, but a set of building blocks that let you build one.
If you are in the market for a complete, ready-made, boxed XMPP server solution, installing Prosody is a good bet. :^)
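To make the building-blocks point concrete, here is a minimal sketch that uses node-xmpp purely as a client library against an existing server such as Prosody, which then does all the routing. It assumes the node-xmpp-client package and uses placeholder JIDs and passwords:
var Client = require('node-xmpp-client');

var client = new Client({ jid: 'alice@localhost', password: 'secret' });

client.on('online', function() {
    // Routing, rosters, offline storage, etc. are handled by the server (e.g. Prosody);
    // node-xmpp only builds, sends and parses the stanzas.
    var stanza = new Client.Stanza('message', { to: 'bob@localhost', type: 'chat' });
    stanza.c('body').t('Hello from node-xmpp');
    client.send(stanza);
});

client.on('stanza', function(stanza) {
    console.log('incoming stanza: ' + stanza.toString());
});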

Socket.io with multiple Node.js hosts, emit to all clients

I am new to Socket.io and trying to get my head around the best approach to solve this issue.
We have four instances of a Node.js app running behind a load balancer.
What I am trying to achieve is for another app to POST some data to the load balancer URL, which will hand it off to one of the instances.
The receiving instance will store the data, then use Socket.io to emit the data to the connected clients.
The issue is that browser/client can only be connected to a single instance at one time.
I am trying to determine if there is a way to emit to all clients at once?
Or have the clients connect to multiple servers using io.connect?
Or is this a case for Redis?
Publish/Subscribe is what you need here. Redis will give you the functionality you're looking for out of the box. You just need to create a Redis client on each of your app server nodes and subscribe it to an update channel. Then, publish the update when a POST is successful (or whatever). Finally, on each incoming message on that channel, emit a Socket.IO event:
(truncated for brevity)
var express = require('express')
  , http = require('http')
  , socketio = require('socket.io')
  , redis = require('redis')
  , rc = redis.createClient()      // subscriber connection
  , rcPub = redis.createClient()   // a client in subscriber mode cannot publish, so use a second one
  ;

var app = express();
var server = http.createServer(app);
var io = socketio.listen(server);
server.listen(3000);

// each connecting client joins the 'update' room so the broadcasts below reach it
io.sockets.on('connection', function(socket){
    socket.join('update');
});

app.post('/targets', function(req, res){
    // assumes a JSON body-parsing middleware is configured
    rcPub.publish('update', JSON.stringify(req.body));
    res.end();
});

rc.on('connect', function(){
    // subscribe to the update channel
    rc.subscribe('update');
});

rc.on('message', function(channel, msg){
    // util.log('Channel: ' + channel + ' msg: ' + msg);
    var data = JSON.parse(msg);
    io.sockets.in('update').emit('message', {
        channel: channel,
        msg: data
    });
});
Then in the JS app, listen for that emitted message:
socket.on('message', function(data){
    debugger;
    // do something with the updated data
});
Of course, introducing this new Redis server adds another single point of failure. A more robust implementation might use a message broker speaking AMQP, ZeroMQ, or a similar networking library that provides pub/sub capabilities.
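The same pattern is also packaged up in the socket.io-redis adapter that appears in the earlier related question on this page; with the adapter installed, a plain io.emit() is relayed through Redis to the clients connected to every instance behind the load balancer. A minimal sketch, with a placeholder Redis endpoint:
var http = require('http');
var socketio = require('socket.io');
var redisAdapter = require('socket.io-redis');

var server = http.createServer();
var io = socketio(server);
io.adapter(redisAdapter({ host: 'my-redis-endpoint', port: 6379 })); // placeholder endpoint
server.listen(3000);

// With the adapter in place, this broadcast reaches clients connected to every node.
io.emit('message', { msg: 'hello from whichever instance handled the POST' });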

Broadcast web cam with socket.io?

I can get a stream from the browser with these lines of code:
var socket = io.connect('127.0.0.1:9000');

navigator.getUserMedia = navigator.getUserMedia ||
    navigator.webkitGetUserMedia ||
    navigator.mozGetUserMedia ||
    navigator.msGetUserMedia;

var cam;
navigator.getUserMedia({video: true, audio: true}, function(stream) {
    //var call = peer.call('another-peers-id', stream);
    //call.on('stream', function(remoteStream) {
    //    // Show stream in some video/canvas element.
    //});
    cam = stream;
    console.log(stream);
}, function(err) {
    console.log('Failed to get local stream', err);
});
Now I want to send the live stream to a socket.io server and then broadcast it from that server.
Is there any simple code to do it?
I tried for a few days to get something like this working, and after going down the rabbit hole I ended up just firing up an instance of Wowza Media Server on AWS (following these instructions) and managing the server from my Node instance instead of trying to handle the video myself.
It worked beautifully. Scales well (auto-scaling even), relatively easy to deploy, and has great support on their forums. A++, would code again.
Also, ultimately you're probably going to need to do some transcoding/scaling/watermarking if this is to be a commercial project, and Wowza leverages NVENC on the GPU on Amazon's graphics instances, which just blows anything else out of the water.
