Socket.io using Node.js Cluster with PM2 - node.js

I have tried the approach from https://socket.io/docs/using-multiple-nodes/:
const io = require('socket.io')(3000);
const redis = require('socket.io-redis');
io.adapter(redis({ host: 'localhost', port: 6379 }));
but it didn't work across multiple server cores. Can any expert here guide me? It would be much appreciated.
I am using PM2 for Node process clustering.
The issue I am facing: users connected to different worker processes cannot reach each other over Socket.IO, but all users connected to the same Socket.IO process can communicate fine.
In short, I want to cluster multiple Socket.IO servers for load balancing.
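For context, here is a minimal per-worker sketch of the kind of setup this question is about, assuming socket.io 2.x with socket.io-redis and Redis on localhost:6379 as in the snippet above. Restricting transports to websocket is an assumption, not part of the original setup: long-polling needs sticky routing, which PM2's cluster mode does not provide by itself.
// Each PM2 cluster worker runs this same file; the Redis adapter relays
// events between workers. transports: ['websocket'] is an assumption here,
// used to sidestep the sticky-session requirement of long-polling.
const io = require('socket.io')(3000, { transports: ['websocket'] });
const redis = require('socket.io-redis');

io.adapter(redis({ host: 'localhost', port: 6379 }));

io.on('connection', (socket) => {
  socket.on('chat message', (msg) => {
    // Emitting through `io` goes through the Redis adapter, so clients
    // connected to other workers receive the event as well.
    io.emit('chat message', msg);
  });
});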

Related

Socket.io-redis configuration not working

I'm trying to set up socket.io-redis for a kubernetes deployment, but something is wrong with the configuration. It is partially working, in that I can see socket.io messages in redis using redis-cli PSUBSCRIBE, but I don't have access to any of socket.io-redis' functions.
io.of('/').adapter.sockets(), io.of('/').adapter.allRooms() and every other function that should be available upon successful configuration of socket.io-redis are undefined. My configuration is below.
const app = require('express')();
const server = require('http').createServer(app);
const io = require('socket.io')(server, {transports: ['websocket']});
const redisAdapter = require('socket.io-redis');
io.adapter(redisAdapter({port: 6379, host: '127.0.0.1'}));
I can't find any other cases of difficulty in what should be a simple configuration. I am using socket.io 2.3.0 and socket.io-redis 5.4.0 which should be compatible according to the docs.
When deploying a Socket.IO application on a Kubernetes cluster, which means running multiple Socket.IO servers (Pods), there are two things to take care of:
Enabling the sticky session feature:
when a request comes from a Socket.IO client (browser) to your app, it gets associated with a particular session id, and subsequent requests must be routed to the same process (Pod in Kubernetes) that originated that id.
Using the Redis adapter:
you can learn more about this from this Medium story (source code available).
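As a point of comparison, here is a minimal sketch of the same configuration with the adapter queried only after the server is listening. It assumes socket.io 2.3.0 and socket.io-redis 5.x, where helpers such as allRooms take a Node-style callback rather than returning a value; 127.0.0.1:6379 and port 3000 are placeholders.
const app = require('express')();
const server = require('http').createServer(app);
const io = require('socket.io')(server, { transports: ['websocket'] });
const redisAdapter = require('socket.io-redis');

// Attach the adapter before any namespace is used, so io.of('/').adapter
// is the Redis adapter rather than the default in-memory one.
io.adapter(redisAdapter({ host: '127.0.0.1', port: 6379 }));

server.listen(3000, () => {
  // socket.io-redis 5.x helpers are asynchronous and take a callback.
  io.of('/').adapter.allRooms((err, rooms) => {
    if (err) return console.error(err);
    console.log('rooms across all pods:', rooms);
  });
});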

Why does connecting to a cluster constantly loop in IoRedis?

I am currently trying to connect to my Redis cluster, stored on another instance, from a server running my application. I am using ioredis to interface between my application and my Redis instance, and it worked fine when there was only a single Redis node running. However, after trying to set up the cluster connection in my Node application, it constantly loops on the connection. My cluster setup itself works correctly.
So far I have tried the following configuration in my application to connect to the cluster. The issue is that the 'connect' event constantly loops, printing out 'Connected to Redis!'. The 'ready' and 'error' events are never fired.
const cache: Cluster = new Cluster([{
    port: 8000,
    host: REDIS_HOST
}, {
    port: 8001,
    host: REDIS_HOST
}, {
    port: 8002,
    host: REDIS_HOST
}]);
cache.on('connect', () => {
    console.log('Connected to Redis!');
});
In the end, the 'connect' event should only fire once. Does anyone have any thoughts on this?
This kind of error, as I discovered today, is not related to ioredis but to the Redis instance setup. In my case, the problem I had with p3x-redis-ui, which uses ioredis, was that the cluster had not been initialized.
See https://github.com/patrikx3/redis-ui/issues/48
Maybe you'll find some clues there to help you resolve your bug.
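If it helps to narrow such problems down, here is a small diagnostic sketch that logs the other ioredis Cluster lifecycle events; REDIS_HOST and the ports are placeholders matching the question, and with a cluster that was never initialized you would typically see 'connect' repeat while 'ready' never fires.
const Redis = require('ioredis');

const REDIS_HOST = '127.0.0.1'; // placeholder for the real cluster host

const cache = new Redis.Cluster([
  { host: REDIS_HOST, port: 8000 },
  { host: REDIS_HOST, port: 8001 },
  { host: REDIS_HOST, port: 8002 }
]);

// With a misconfigured or uninitialized cluster this may fire repeatedly
// as the client keeps retrying the connection.
cache.on('connect', () => console.log('Connected to Redis!'));

// 'ready' only fires once the cluster is actually usable.
cache.on('ready', () => console.log('Cluster is ready'));

// Errors that would otherwise stay silent show up here.
cache.on('error', (err) => console.error('Redis error:', err.message));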

Socket.io on multiple servers with HaProxy

I have HAProxy serving multiple servers running Node.js with Express. I added Socket.IO to that Express setup, and to make the instances work together I tried connecting them with socket.io-redis and socket.io-ioredis. Everything looks connected without any error, but when one user's socket is connected to a different server than another user's, their emits don't reach the other servers.
Nodejs setup
var app = express();
var server = require('http').Server(app);
var io = require('socket.io').listen(server);
var redis = require('socket.io-ioredis');
io.adapter(redis({ host: 'serverIP', port: 6565 }));
server.listen(6565);
This is how I do the emit:
io.to(roomID).emit(event, object);
The actual problem was that the port Redis was connecting on was blocked by the server's firewall.

Scaling Socket.IO across multiple servers

I've been searching around looking for help on setting up a multi-server cluster for a Node.js Socket.IO install. This is what I am trying to do:
Have 1 VIP in an F5 load balancer, pointing to n Node servers running Express and Socket.IO
Have the client connect to that one VIP via io.connect and then have it filter down to one of the servers behind the load balancer.
When a message is emitted on any one of those servers, it is sent to all users who are listening for that event, including those connected via the other servers.
For example - if we have Server A, Server B and Server C behind LB1 (F5), and User A is connected to Server A, User B is connected to Server B and User C is connected to Server C.
In a "chat" scenario - basically if a message is emitted from Server A on the message event, Servers B and C should also send the message to their connected clients. I read that this is possible using socket.io-redis, but it needs a Redis box - which server should that be installed on? If all the servers are connected to the same Redis box, does this work automatically?
var io = require('socket.io')(server);
var redis = require('socket.io-redis');
io.adapter(redis({ host: 'localhost', port: 6379 }));
Any help would be greatly appreciated thanks!
The answer to this question is that you must set up a single Redis server, which can live outside your Socket.IO cluster, and have all nodes connect to it.
Then you simply add this at the top of your code and it just works magically without any issues.
var io = require('socket.io')(server);
var redis = require('socket.io-redis');
io.adapter(redis({ host: 'localhost', port: 6379 }));
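For instance, here is a minimal sketch of the chat scenario: every server behind the F5 runs the same code and points at the same Redis box ('redis.internal' below is a placeholder hostname for that shared instance).
var app = require('express')();
var server = require('http').createServer(app);
var io = require('socket.io')(server);
var redis = require('socket.io-redis');

// The same Redis host for Server A, B and C.
io.adapter(redis({ host: 'redis.internal', port: 6379 }));

io.on('connection', function (socket) {
  socket.on('message', function (msg) {
    // Broadcasting through `io` goes through the Redis adapter, so users
    // connected to the other servers behind the load balancer receive it too.
    io.emit('message', msg);
  });
});

server.listen(3000);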

Errors going to 2 dynos on Heroku with socket.io / socket.io-redis / rediscloud / node.js

I have a node.js / socket.io app running on Heroku. I am using socket.io-redis with RedisCloud to allow users who connect to different dynos to communicate, as described here.
From my app.js:
var express = require('express'),
app = express(),
http = require('http'),
server = http.createServer(app),
io = require('socket.io').listen(server),
redis = require('redis'),
ioredis = require('socket.io-redis'),
url = require('url'),
redisURL = url.parse(process.env.REDISCLOUD_URL),
And later in app.js ...
var sub1 = redis.createClient(redisURL.port, redisURL.hostname, {
    no_ready_check: true,
    return_buffers: true
});
sub1.auth(redisURL.auth.split(":")[1]);
var pub1 = redis.createClient(redisURL.port, redisURL.hostname, {
    no_ready_check: true,
    return_buffers: true
});
pub1.auth(redisURL.auth.split(":")[1]);
var redisOptions = {
    pubClient: pub1,
    subClient: sub1,
    host: redisURL.hostname,
    port: redisURL.port
};
if (io.adapter) {
    io.adapter(ioredis(redisOptions));
    console.log("mylog: io.adapter found");
}
It is kind of working -- communication is succeeding between dynos.
Three issues that happen with 2 dynos but not with 1 dyno:
1) There is a login prompt which comes up and works reliably with 1 dyno but is hit-and-miss with 2 dynos -- may not come up and may not work if it does come up. It is (or should be) triggered by the io.sockets.on('connection') event.
2) I'm seeing a lot of disconnects in the server log.
3) Also lots of errors in the client console on Chrome, for example:
socket.io.js:5039 WebSocket connection to 'ws://example.mydomain.com/socket.io/?EIO=3&transport=websocket&sid=F8babuJrLI6AYdXZAAAI' failed: Error during WebSocket handshake: Unexpected response code: 503
socket.io.js:2739 POST http://example.mydomain.com/socket.io/?EIO=3&transport=polling&t=1419624845433-63&sid=dkFE9mUbvKfl_fiPAAAJ net::ERR_INCOMPLETE_CHUNKED_ENCODING
socket.io.js:2739 GET http://example.mydomain.com/socket.io/?EIO=3&transport=polling&t=1419624842679-54&sid=Og2ZhJtreOG0wnt8AAAQ 400 (Bad Request)
socket.io.js:3318 WebSocket connection to 'ws://example.mydomain.com/socket.io/?EIO=3&transport=websocket&sid=ITYEPePvxQgs0tcDAAAM' failed: WebSocket is closed before the connection is established.
Any thoughts or suggestions would be welcome.
Yes, like generalhenry said, the issue is that Socket.io requires sticky sessions (meaning that requests from a given user always go to the same dyno), and Heroku doesn't support that.
(It works with 1 dyno because when there's only 1 then all requests go to it.)
https://github.com/Automattic/engine.io/issues/261 has a lot more good info; apparently WebSockets don't really require sticky sessions, but long-polling does. It also mentions a couple of potential work-arounds:
Roll back to socket.io version 0.9.17, which tries websockets first
Only use SSL connections, which makes websockets more reliable (because ISPs and corporate proxies and whatnot can't tinker with the connection as easily).
You might get the best results from combining both of those.
You could also spin up your own load balancer that adds sticky session support, but by that point, you're fighting against Heroku and might be better off on a different host.
RE: your other question about the Node.js cluster module: it wouldn't really help here. It's for using up all of the available CPU cores on a single server/dyno, not for coordinating connections across multiple dynos.
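If it helps, here is a minimal client-side sketch of the "websockets don't need sticky sessions" idea from that engine.io thread. The hostname is taken from the error logs above as a placeholder, and forcing the websocket transport is an assumption rather than something this answer prescribes.
// Skip long-polling entirely, so requests don't have to stick to one dyno.
var socket = io('https://example.mydomain.com', {
  transports: ['websocket']
});

socket.on('connect', function () {
  console.log('connected over a websocket');
});

socket.on('disconnect', function (reason) {
  console.log('disconnected:', reason);
});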
