Why is my client receiving socket emits from all child node cluster workers when it should only be connected to one? - node.js

I'm trying to make a scalable socket connection handler using node.js, express, socket.io, cluster, socket.io-adapter-mongo, and mubsub. This is my first attempt to use sockets, so forgive me if I reveal my noobness here, but it is my understanding that a cluster worker can only emit to the sockets that are connected to it.
In my dev environment I have cluster forking 8 workers (the number of CPUs).
I have my workers subscribe to the mubsub database so that they will pick up events published from other workers.
if (cluster.isMaster) {
  var cpuCount = require("os").cpus().length;
  for (var cp = 0; cp < cpuCount; cp++) {
    cluster.fork();
  }
} else {
  io.adapter(mongo({ host: 'localhost', port: 27017, db: 'mubsub' }));
  var client = mubsub('mongodb://localhost:27017/mubsub');
  var channel = client.channel('test');

  channel.subscribe('testEvent', function (message) {
    console.log(message);
    io.sockets.emit('testEvent', { message: cluster.worker.id });
  });

  io.sockets.on('connection', function (socket) {
    console.log('connected to ' + cluster.worker.id);
    channel.publish('testEvent', { message: 'connection3' });
  });
  ...
  server.listen(8080);
}
So when I try to connect from the client, the 'connection' event fires and a single console log is written by the worker that receives the connection.
That event is published to the database only once.
Each worker is subscribed to that event, and should emit to all sockets connected to that worker.
For some reason though, my connected client receives 8 messages, one for each worker.
How is the client picking up the emits from workers it should not be connected to? Am I overlooking some cluster magic here?

Not sure what version you are using, but this should be true for most current versions.
From the socket.io docs (http://socket.io/docs/server-api/#server#emit):
Server#emit
Emits an event to all connected clients. The following two are equivalent:
var io = require('socket.io')();
io.sockets.emit('an event sent to all connected clients');
io.emit('an event sent to all connected clients');
So the method you are using will broadcast to all connected clients. If you want to split them across the workers, that is something you need to manage yourself.
There are a number of ways to address individual sockets (clients) that socket.io's API can help you with, but it's probably best to refer to the docs for this:
http://socket.io/docs/rooms-and-namespaces/
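For example, one way to manage that split (a rough sketch adapted to the question's cluster setup, not taken from the docs): put each connection in a room named after its worker and emit to that room instead of to everyone.
// Sketch only: reuses the io, cluster and channel variables from the question.
io.sockets.on('connection', function (socket) {
  // each socket joins a room specific to the worker it connected to
  socket.join('worker-' + cluster.worker.id);
});

channel.subscribe('testEvent', function (message) {
  // emit to this worker's room only, instead of io.sockets.emit(...)
  io.sockets.in('worker-' + cluster.worker.id).emit('testEvent', { message: cluster.worker.id });
});
With that, each client receives the event once, from the worker it is actually connected to.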

Related

Socket IO Server Clusters working with Redis Pub/Sub

So firstly, I have built a microservice that fetches a football API and, through Redis pub/sub, publishes any livescore changes.
Now my server, with sockets and routes, will run in cluster mode. I have already set this up with socket.io-redis. Here is a snippet of this setup:
const io = require('socket.io')();
const sRedis = require('socket.io-redis');
const adapter = sRedis({ host: 'localhost', port: 6379 });
const { promisify } = require('util');
const Redis = require('ioredis');
const redis = new Redis();

redis.subscribe('livescore');

io.adapter(adapter);

const ioa = io.of('/').adapter;
ioa.clients = promisify(ioa.clients);
ioa.clientRooms = promisify(ioa.clientRooms);
ioa.remoteJoin = promisify(ioa.remoteJoin);
ioa.remoteLeave = promisify(ioa.remoteLeave);
ioa.allRooms = promisify(ioa.allRooms);

// notice this listener
redis.on('message', (channel, message) => {
  io.emit('livescore', message);
});

io.on('connect', async (socket) => {
  socket.clientRooms = () => ioa.clientRooms(socket.id);
  socket.remoteJoin = (room) => ioa.remoteJoin(socket.id, room);
  socket.remoteLeave = (room) => ioa.remoteLeave(socket.id, room);
  socket.remoteDisconnect = () => ioa.remoteDisconnect(socket.id);

  socket.on('join room', async (id) => {
    await socket.remoteJoin(id);
    socket.emit('join room', `You have joined room ${id}`);
    socket.broadcast.emit('join room', `${socket.id} has joined.`);
  });

  socket.on('leave room', (id) => {
    socket.remoteLeave(id);
  });
});

module.exports = io;
So, if I run a single instance of this node app, everything works perfectly.
But if I run it in cluster mode, let's say with 4 instances (I'm running cluster mode with pm2), the following happens:
The microservice publishes an event.
Each instance has a subscription to the 'livescore' channel.
Each instance does io.emit() (to all clients).
The client gets 4 identical events at almost the same time.
I figured out why the client gets 4 identical events, but I want to know the right way of handling this.
My only thought on a solution is to do the Redis subscription on only one instance and publish everything from that one, but I fear that would be too much work for a single instance.
Any ideas?
There are probably multiple ways to fix this; you could, for example:
Use a message queue instead of pub/sub
Depending on the kind of processing, you probably only want one node to process each message. Pub/sub is not what you want in that case. You could, for example, store your messages in a list and use the LPOP command to fetch and delete a message. Then you could say "the first one catches it": this way only one of your servers does the work, but basically a random one (a rough sketch follows below).
You could also use a dedicated message queue like RabbitMQ, SQS, etc.
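A rough sketch of that list-based idea (not a tested implementation), using ioredis as the question already does; the key name 'livescore:queue' is made up:
const Redis = require('ioredis');

// producer side (the microservice): push to a list instead of publishing
// new Redis().lpush('livescore:queue', JSON.stringify(score));

// consumer side, started once in every instance
const queue = new Redis();

async function consumeLivescores(io) {
  while (true) {
    // BRPOP blocks until an item arrives and hands it to exactly one of the
    // competing instances, so only that instance emits it
    const [, raw] = await queue.brpop('livescore:queue', 0);
    io.emit('livescore', raw);
  }
}
Because the emit still goes through the socket.io-redis adapter, the one instance that pops the message reaches the clients connected to every instance.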
Use socket.io-emitter to send messages
Since you're using socket.io-redis anyway, your messages already get distributed across your nodes. There's a companion project to socket.io-redis called socket.io-emitter, which can send messages to all your nodes without being a socket.io node itself. If you implement that in your worker microservice (the one that currently writes the message to "livescore"), you can send messages directly to your clients.
That might not work if you need to process the messages in your node app though.
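A minimal sketch of that emitter approach in the microservice, assuming the same Redis instance the adapter uses (the function name here is made up):
// In the livescore microservice: no socket.io server is needed here.
const emitter = require('socket.io-emitter')({ host: 'localhost', port: 6379 });

function publishLivescore(score) {
  // each app instance's socket.io-redis adapter picks this up and delivers it
  // to its own connected clients, so every client receives it exactly once
  emitter.emit('livescore', JSON.stringify(score));
}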

How to send a message to a specific client with socket.io if the application runs as a cluster in several processes on different ports?

The application starts in cluster mode; each worker sets up its own socket server using the Redis adapter:
app.set('port', httpPort);
let server = http.createServer(app);
let io = require('./socketServer')(server);
io.adapter(redis({host: host, port: port}));
app.set('io', io);
Then we hook up the main socket.io file (socketServer), where, after the socket is authorized and the connection event fires, we save the session ID in the variable socketID and store the current socket connection in the io.clients map:
io.sockets.on('connection', (socket) => {
  var socketID = socket.handshake.user.sid;
  io.clients[socketID] = socket;
  io.clients[socketID].broadcast.emit('loggedIn', socket.handshake.user.data);

  socket.on('disconnect', () => {
    delete io.clients[socketID];
  });
});
In front of the Node.js app we have nginx with a customized "upstream" to provide sticky sessions (http://socket.io/docs/using-multiple-nodes/#nginx-configuration).
Then, when we want to send a message to a particular client, the controller gets the user id, looks up the session id for that id (we store these mappings in Redis at authorization time), and then simply sends the message:
this.redis.getByMask(`sid_clients:*`, (err, rdbData) => {
  Async.each(clients, (client, next) => {
    let sid = `sid_clients:${client}`;
    let currentClient = rdbData[sid];
    if (!currentClient || !this.io.clients[currentClient]) return next();
    this.io.clients[currentClient].emit(event, data);
    return next();
  });
});
It works fine when we run the application in a single process, but it doesn't work in cluster mode. The connection message "loggedIn" is sent to all clients on all processes. But when one process tries to send a message to a client that is connected to a server in another process, it fails, because each process has its own io.clients array with different contents, so the message cannot reach the right client.
So, how do I send events to a specific client in cluster mode? How can I keep all connected sockets in one place to avoid situations like mine?
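One common pattern with the Redis adapter (a rough, untested sketch against the question's setup, not from this thread): room broadcasts are propagated across processes, so each socket can join a room named after its session id and be addressed through that room instead of through a per-process io.clients array.
// Sketch only: variable names follow the question's code.
io.sockets.on('connection', (socket) => {
  var socketID = socket.handshake.user.sid;
  // joining a room named after the session id replaces the local io.clients map
  socket.join(socketID);
  socket.broadcast.emit('loggedIn', socket.handshake.user.data);
});

// later, from any process: the Redis adapter delivers the room emit to
// whichever process actually holds that client's socket
io.to(sessionId).emit(event, data);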

Socket.io with multiple Node.js hosts, emit to all clients

I am new to Socket.io and trying to get my head around the best approach to solve this issue.
We have four instances of a Node.js app running behind a load balancer.
What I am trying to achieve is for another app to POST some data to the load balancer URL, which will hand it off to one of the instances.
The receiving instance will store the data, then use Socket.io to emit the data to the connected clients.
The issue is that the browser/client can only be connected to a single instance at a time.
I am trying to determine if there is a way to emit to all clients at once?
Or have the clients connect to multiple servers using io.connect?
Or is this a case for Redis?
Publish/Subscribe is what you need here. Redis will give you the functionality you're looking for out of the box. You just need to create a Redis client and subscribe to an update channel on each of your app server nodes. Then publish the update when a POST is successful (or whatever). Finally, have the Redis client listen for messages on the update channel and emit a socket.io event when one arrives:
(truncated for brevity)
var express = require('express')
  , http = require('http')
  , socketio = require('socket.io')
  , redis = require('redis')
  , rc = redis.createClient()      // subscriber connection
  , rcPub = redis.createClient()   // a client in subscriber mode cannot publish, so use a second one
  ;

var app = express();
var server = http.createServer(app);
var io = socketio.listen(server);
server.listen(3000);

app.post('/targets', function(req, res){
  rcPub.publish('update', JSON.stringify(req.body));
  res.end();
});

rc.on('connect', function(){
  // subscribe to the update channel
  rc.subscribe('update');
});

rc.on('message', function(channel, msg){
  // util.log('Channel: ' + channel + ' msg: ' + msg);
  msg = JSON.parse(msg);
  // broadcast to every client connected to this node; since each node runs
  // its own subscriber, clients on all instances receive the update
  io.sockets.emit('message', {
    channel: channel,
    msg: msg
  });
});
Then in the JS app, listen for that emitted message:
socket.on('message', function(data){
debugger;
// do something with the updated data
});
Of course, introducing this new Redis server adds another single point of failure. A more robust implementation might use a message broker over AMQP, ZeroMQ, or some similar networking library that provides pub/sub capabilities.
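For illustration, a rough sketch of that broker idea with a fanout exchange using amqplib (only a sketch, not a drop-in replacement; the exchange name 'updates' is made up):
const amqp = require('amqplib');

// run once per app instance, passing in the socket.io server
async function subscribeToUpdates(io) {
  const conn = await amqp.connect('amqp://localhost');
  const ch = await conn.createChannel();
  await ch.assertExchange('updates', 'fanout', { durable: false });
  // exclusive, auto-named queue per instance, bound to the fanout exchange
  const { queue } = await ch.assertQueue('', { exclusive: true });
  await ch.bindQueue(queue, 'updates', '');
  ch.consume(queue, (msg) => {
    if (msg) io.sockets.emit('message', JSON.parse(msg.content.toString()));
  }, { noAck: true });
}

// the POST handler would then publish instead of using Redis, e.g.:
// ch.publish('updates', '', Buffer.from(JSON.stringify(req.body)));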

Is it possible to broadcast from socket.io-emitter

I'm currently using socket.io-emitter to emit messages (in a namespace) from a worker in my app. However, now I need to broadcast to all sockets connected to the namespace when something happens. Is there any workaround for this?
For example, this is a socket.io (HTTP-exposed) emit and broadcast using the socket.io Redis adapter so that different socket.io instances can run in different processes:
var io = require('socket.io')(http);
io.adapter(redis(config.redis));
io.of('/namespace').on('connection', function(socket){
  socket.emit('message', 'Hi you!');
  socket.broadcast.emit('broadcast', 'Heya all!');
});
Now this is a different process (an MQ worker) that is emitting events to the clients:
var io = require('socket.io-emitter')(redis(config.redis));
var socket = io.of('/namespace');
socket.emit('message', 'Hi you!'); // This works
socket.broadcast('broadcast', 'Heya all!'); // This won't work
It doesn't work this way.
With socket.io-emitter you can only emit; it's then up to the server process to decide what to do with that event.
Server-side:
socket.on('msg', function (msg) {
  socket.broadcast.emit('msg', msg);
});
Client-side:
socket.emit('msg', 'msg');

How can I send packets between the browser and server with socket.io, but only when there is more than one client?

In my normal setup, the client will emit data to my server regardless of whether or not there is another client to receive it. How can I make it so that it only sends packets when the user-count is > 1? I'm using node with socket.io.
To do this you would want to listen to the connection event on your server (as well as disconnect) and maintain a list of connected clients in a 'global' variable. When more than one client is connected, send out a message to all connected clients so they know they can start sending messages, like so:
var app = require('express').createServer(),
    io = require('socket.io').listen(app);

app.listen(80);
//setup express

var clients = [];

io.sockets.on('connection', function (socket) {
  clients.push(socket);
  if (clients.length > 1) {
    io.sockets.emit('start talking');
  }

  socket.on('disconnect', function () {
    var index = clients.indexOf(socket);
    clients = clients.slice(0, index).concat(clients.slice(index + 1));
    if (clients.length <= 1) {
      io.sockets.emit('quiet time');
    }
  });
});
Note: I'm making an assumption here that the socket is passed to the disconnect event; I'm pretty sure it is, but I haven't had a chance to test.
The disconnect event won't receive the socket as an argument, but because the event handler is registered within the closure scope of the initial connection, you still have access to it.
