Scaling Socket.IO across multiple servers - node.js

I've been searching around looking for help on setting up a multi-server cluster for a Node.js Socket.IO install. This is what I am trying to do:
Have 1 VIP in an F5 loadbalancer, pointing to n number of Node servers running Express, and Socket.IO
Have the client connect to that 1 VIP via io.connect and have the load balancer route it to one of the servers behind it.
When a message is emitted on any one of those servers, it is sent to all users who are listening for that event, including those connected via the other servers.
For example - if we have Server A, Server B and Server C behind LB1 (F5), and User A is connected to Server A, User B is connected to Server B and User C is connected to Server C.
In a "chat" scenario - basically if a message is emitted from Server A on the message event, Servers B and C should also send the message to their connected clients. I read that this is possible using socket.io-redis, but it needs a Redis box - which server should that be installed on? If all the servers are connected to the same Redis box, does this work automatically?
var io = require('socket.io')(server);
var redis = require('socket.io-redis');
io.adapter(redis({ host: 'localhost', port: 6379 }));
Any help would be greatly appreciated thanks!

The answer to this question is that you must set up a single Redis server, typically outside your Socket.IO cluster, and have all nodes connect to it.
Then you simply add this at the top of your code, and broadcasting across nodes works automatically.
var io = require('socket.io')(server);
var redis = require('socket.io-redis');
io.adapter(redis({ host: 'localhost', port: 6379 }));
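To make the moving parts concrete, here is a minimal sketch of what one of the nodes behind the VIP could look like, assuming Express and the classic socket.io-redis adapter API from the question; the ports, host names and event names are illustrative placeholders:

```javascript
// Sketch: one of the N Socket.IO nodes behind the load balancer.
// Assumes the socket.io 2.x / socket.io-redis API used in the question;
// host names, ports and event names are illustrative placeholders.
function startNode(httpPort, redisHost, redisPort) {
  const app = require('express')();
  const server = require('http').createServer(app);
  const io = require('socket.io')(server);
  const redisAdapter = require('socket.io-redis');

  // Every node points at the SAME Redis instance; the adapter relays
  // emits between nodes over Redis pub/sub.
  io.adapter(redisAdapter({ host: redisHost, port: redisPort }));

  io.on('connection', (socket) => {
    socket.on('chat message', (msg) => {
      // With the adapter in place this reaches clients on ALL nodes,
      // not just the ones connected to this process.
      io.emit('chat message', msg);
    });
  });

  server.listen(httpPort);
  return server;
}
```

Each node would call something like startNode(3000, 'redis.internal', 6379). The Redis instance can live on its own small box or alongside one of the nodes; it just has to be reachable from every node.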

Related

Socket.io using Node.JS Cluster with PM2

I have tried https://socket.io/docs/using-multiple-nodes/
const io = require('socket.io')(3000);
const redis = require('socket.io-redis');
io.adapter(redis({ host: 'localhost', port: 6379 }));
but it didn't work across multiple cores of the server. Can any expert here guide me? It would be much appreciated.
I am using PM2 for Node process clustering,
and I am facing an issue: users on different threads cannot connect with Socket.IO, but connections between users on the same Socket.IO thread work fine.
In short, I want to cluster multiple Socket.IO servers for load balancing.

Socket.io on multiple servers with HaProxy

I have HAProxy serving multiple Node.js servers running Express. I added Socket.IO to that Express setup, and to make the servers work together I tried to connect them with socket.io-redis and socket.io-ioredis. Everything looks connected without any error, but when one user's socket is connected to a different server than another user's, their emits are not delivered to the other servers.
Nodejs setup
var app = express();
var server = require('http').Server(app);
var io = require('socket.io').listen(server);
var redis = require('socket.io-ioredis');
io.adapter(redis({ host: 'serverIP', port: 6565 }));
server.listen(6565);
How I do the emit:
io.to(roomID).emit(event, object);
The actual problem was that the port Redis was listening on was blocked by the server's firewall.

How to use socket.io-redis with multiple servers?

I have the following code on two machines:
var server = require('http').createServer(app);
io = require('socket.io')(server);
var redisAdapter = require('socket.io-redis');
io.adapter(redisAdapter({host: config.redis.host, port: config.redis.port}));
server.listen(config.port, function () { /* ... */ });
I store the socket.id of every client connected to these two machines in a central DB. The socket IDs are saved correctly, and sending events on the same server works flawlessly, but when I try to send a message to a socket on the other server it doesn't work.
subSocket = io.sockets.connected[userSocketID];
subSocket.emit('hello',{a:'b'})
How can I know that Redis is working correctly?
How can I send a message to a socket connected on another server?
You can't. Socket.IO requires sticky sessions. The socket must communicate solely with the originating process.
docs
You can have the socket.io servers communicate to each other to pass events around, but the client must continue talking to the process with which it originated.
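For instance, with socket.io-redis in place you would not look up the raw socket object at all (io.sockets.connected only knows about the local process). Every socket automatically joins a room named after its own id, so you can address it through the adapter instead. A sketch, where userSocketID is the id fetched from the central DB as in the question:

```javascript
// Emit to a socket that may live on ANOTHER node: address it via the
// room named after its socket id and let the adapter route the event.
// (io.sockets.connected[userSocketID] would be undefined on every
// node other than the one the socket connected to.)
function sendToUser(io, userSocketID, event, payload) {
  io.to(userSocketID).emit(event, payload);
}
```

sendToUser(io, userSocketID, 'hello', { a: 'b' }) then works regardless of which of the two machines the target socket is connected to.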
I'm having a similar issue, but I can answer your first question.
You can monitor all the commands processed by Redis by running this command in the terminal:
redis-cli monitor
http://redis.io/commands/MONITOR
Unfortunately I cannot help you further, as I am still having issues even though both servers are sending something to Redis.

How to check socket is alive (connected) in socket.io with multiple nodes and socket.io-redis

I am using socket.io with multiple nodes, socket.io-redis and nginx. I followed this guide: http://socket.io/docs/using-multiple-nodes/
What I am trying to do: in a server-side function, I want to query by socket id whether that socket is connected or disconnected.
I tried io.of('namespace').connected[socketid], but it only works for the current process (it can only check sockets belonging to the current process).
Can anyone help me? Thanks in advance.
How can I check whether a socket is alive (connected) by its socket id? I tried
namespace.connected[socketid], but it only works for the current process.
As you said, separate processes mean that sockets are only registered on the process they first connected to. You need to use socket.io-redis to connect all your nodes together, and what you can do is broadcast an event each time a client connects/disconnects, so that each node has an updated real-time list of all the clients.
Check out here
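Another route, assuming the socket.io-redis adapter's clients() method (available in adapter 4.x+), is to ask the adapter itself: its clients() callback gathers the socket ids connected to every node, not just the local one. A sketch:

```javascript
// Cross-node liveness check for a socket id. Assumes the socket.io-redis
// adapter, whose clients() callback returns ids from ALL nodes, unlike
// namespace.connected, which only sees the local process.
function isSocketAlive(io, socketId, done) {
  io.of('/').adapter.clients((err, clients) => {
    if (err) return done(err);
    done(null, clients.indexOf(socketId) !== -1);
  });
}
```

The same adapter also exposes clientRooms(id, fn) and remoteDisconnect(id, close, fn), which cover most "query or act on another node's socket" needs.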
As mentioned above, you should use socket.io-redis to get it working on multiple nodes.
var io = require('socket.io')(3000);
var redis = require('socket.io-redis');
io.adapter(redis({ host: 'localhost', port: 6379 }));
I had the same problem and no convenient solution, so I logged the client object to see the different methods and variables I could use. There is the client.conn.readyState property for the state of the connection ("open"/"closed") and the client.onclose() function to capture the closing of the connection.
const server = require('http').createServer(app);
const io = require('socket.io')(server);
let clients = [];
io.on('connection', (client) => {
  clients.push(client);
  console.log(client.conn.readyState);
  client.onclose = () => {
    // do something
    console.log(client.conn.readyState);
    clients.splice(clients.indexOf(client), 1);
  };
});
When deploying a Socket.IO application on a multi-node cluster (that is, multiple Socket.IO servers), there are two things to take care of:
Using the Redis adapter, and enabling sticky sessions: when a request comes from a Socket.IO client (browser) to your app, it gets associated with a particular session id, and requests for that session must keep being routed to the same process (Pod, in Kubernetes) that originated the id.
you can learn more about this from this Medium story (source code available) https://saphidev.medium.com/socketio-redis...

How does io.adapter work under the hood?

I'm working on 1-1 chat rooms application powered by node.js + express + socket.io.
I am following the article: Socket.IO - Rooms and Namespaces
In the article they demonstrate how to initiate the io.adapter using the module socket.io-redis:
var io = require('socket.io')(3000);
var redis = require('socket.io-redis');
io.adapter(redis({ host: 'localhost', port: 6379 }));
Two questions:
In the docs, they mention two more arguments: pubClient and subClient. Should I supply them? What's the difference?
How does the io.adapter behave? For example, if user A is connected to server A and user B is connected to server B, and they want to "talk" with each other, what's going on under the hood?
Thanks.
You do not need to pass your own pubClient/subClient. If you pass host/port, they will be created for you. But if you want to create them yourself for any reason (e.g. you want to tweak reconnection timeouts), you create those two clients and pass them to the adapter.
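A sketch of that second option, assuming the node_redis package; the retry timings are purely illustrative, and note that socket.io-redis documents that the sub client needs return_buffers: true:

```javascript
// Sketch: supplying your own pub/sub clients instead of host/port,
// e.g. to control reconnection behavior. Assumes the `redis` (node_redis)
// and `socket.io-redis` packages; the retry timings are illustrative.
function attachTunedAdapter(io, host, port) {
  const redis = require('redis');
  const redisAdapter = require('socket.io-redis');

  const retry_strategy = (opts) => Math.min(opts.attempt * 100, 3000);
  const pubClient = redis.createClient(port, host, { retry_strategy });
  const subClient = redis.createClient(port, host, {
    return_buffers: true, // the adapter's subscriber receives binary payloads
    retry_strategy,
  });

  io.adapter(redisAdapter({ pubClient, subClient }));
}
```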
The adapter broadcasts all emits internally, which is what gives you the cluster feature. E.g. let's suppose you have a chat application and three node.js servers behind a load balancer (so they share a single URL). Let's also assume six different browsers connect to the load balancer URL and are routed to the three separate node.js processes, two users per node.js server. If client #1 sends a message, node.js #1 will do something like io.to('chatroom').emit('msg from user #1'). Without the adapter, both of server #1's users will receive the emit, but not the remaining four users. With the adapter, however, node.js #2 and node.js #3 will be notified that the emit happened and will issue an identical emit to their clients - and all six users will receive the initial message.
I've been struggling with this same issue, but have found an answer that seems to be working for me, at least in my initial testing phases.
I have a clustered application running 8 instances using express, cluster, socket.io and socket.io-redis, and NOT sticky-sessions - because using sticky seemed to cause a ton of bizarre bugs.
what I think is missing from the socket.io docs is this:
io.adapter(redis({ host: 'localhost', port: 6379 })); only supports WebSockets (or at the very least it doesn't support long polling), so the client needs to specify that WebSocket is the only transport available. As soon as I did that I was able to get it going. So on the client side, I added {transports:['websocket']} to the socket constructor... so instead of this...
var socketio = io.connect( window.location.origin );
use this
var socketio = io.connect( window.location.origin , {transports:['websocket']} );
I haven't been able to find any more documentation from socket.io to support my theory, but adding that got it going.
I forked this great chat example that wasn't working and got it working here: https://github.com/squivo/chat-example-cluster so there's finally a working example online :D
