Using (socket.io + RedisStore) to communicate across multiple servers - node.js

I am working on a multiplayer online game using Node.js and Socket.io. I expect a lot of players to join the game, so I am hosting it on Amazon OpsWorks.
The problem is that the servers aren't able to send socket events to clients connected to a different server. I am using RedisStore to manage socket.io sessions, and I assumed RedisStore and socket.io took care of this inter-server communication seamlessly under the hood. Here is a reference to another question: How does socket.io send messages across multiple servers?
But that doesn't seem to be the case. Messages do not reach clients connected to a different server; the app works with a single server, but fails when I use multiple servers load-balanced with ELB on OpsWorks.
This is just an extract from the whole code, so please ignore any syntax errors.
app.js
//Redis client initialization
var redis = require("redis");
var redis_client = require('redis-url').connect('redis://xxxxx');

//setting RedisStore for socket.io
var RedisStore = require('socket.io/lib/stores/redis')
  , redis = require('socket.io/node_modules/redis')
  , pub = redis.createClient(11042, '-----')
  , sub = redis.createClient(11042, '-----')
  , client = redis.createClient(11042, '-----');

// using RedisStore with socket.io to allow communication across multiple servers
io.set('store', new RedisStore({
  redis: redis,
  redisPub: pub,
  redisSub: sub,
  redisClient: client
}));

//socket communication specific code
io.of('/game').on('connection', function (socket) {
  socket.on('init', function (data) {
    var user_id = data.user_id; // collecting the user_id that was sent by the client
    var socket_id = socket.id;
    redis_client.set("user_socket:" + user_id, socket_id, function (err, reply) {
      // stored a reference to the socket id for that user in the redis database
    });
  });

  socket.on('send_message', function (data) {
    var sender = data.sender_id;
    var reciepient = data.reciepient_id; // id of the user to whom the message is to be sent
    redis_client.get("user_socket:" + reciepient, function (err, socket_id) {
      if (socket_id) {
        var socket = io.of('/game').sockets[socket_id];
        socket.emit("message", {sender: sender}); // This fails: messages to clients on other servers don't go through.
      }
    });
  });
});

You can't emit directly to socket objects that live on other servers. What Redis gives you is the ability to broadcast to 'rooms' across servers. Thankfully, with socket.io 1.x, every new connection automatically joins a room named after its socket id. To solve your problem, change:
if (socket_id) {
  var socket = io.of('/game').sockets[socket_id];
  socket.emit("message", {sender: sender}); // This fails: messages to clients on other servers don't go through.
}
to emit to the room instead of calling emit on a socket object:
if (socket_id) {
  io.of('/game').to(socket_id).emit("message", {sender: sender}); // emits to the room named after the socket id, relayed to every server
}
And you might have more luck.
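For completeness, here is a minimal sketch of what the 1.x setup the answer refers to might look like, using the socket.io-redis adapter in place of the old RedisStore. The Redis port and the /game namespace come from the question; everything else (hostnames, the listen port) is an assumption:
// minimal sketch: socket.io 1.x with the socket.io-redis adapter
var http = require('http').createServer();
var io = require('socket.io')(http);
var redisAdapter = require('socket.io-redis');
var redisClient = require('redis').createClient(11042, '-----');

// the adapter replaces RedisStore and relays room broadcasts between servers
io.adapter(redisAdapter({ host: '-----', port: 11042 }));

io.of('/game').on('connection', function (socket) {
  socket.on('init', function (data) {
    // remember which socket id belongs to this user
    redisClient.set('user_socket:' + data.user_id, socket.id);
  });

  socket.on('send_message', function (data) {
    redisClient.get('user_socket:' + data.reciepient_id, function (err, socketId) {
      if (socketId) {
        // emit to the room named after the socket id; the adapter delivers it
        // even when that socket is connected to another server
        io.of('/game').to(socketId).emit('message', { sender: data.sender_id });
      }
    });
  });
});

http.listen(3000);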

Related

optimize number of redis connections with a node.js application

I have a question about redis connections.
I'm developing an app in React Native which will use websockets for chat messages. My backend consists of a Node.js app with Redis as the pub/sub mechanism for socket.io.
I'm planning on deploying to Heroku. I'm currently on the free hobby plan, which has a limit of 20 connections to Redis.
My question is: how can I optimize my code so that a minimum number of connections is used? I'm of course planning to upgrade my Heroku plan once I launch, but I still want to optimize.
My Node.js code looks like this (simplified):
const Redis = require('ioredis');

// pubClient and subClient are presumably wired into the socket.io Redis adapter (not shown);
// socketClient is used for regular commands like hset/hget
const pubClient = new Redis(/* redis url */);
const subClient = new Redis(/* redis url */);
const socketClient = new Redis(/* redis url */);

const io = require('socket.io')(server);

io.on('connection', async (socket) => {
  // store socket.id in redis so I can send messages to individual users
  // based on the user ID
  const userId = socket.handshake.query.userId;
  await socketClient.hset('socketIds', userId, socket.id);

  socket.on('message', async (data) => {
    /**
     * data {
     *   userId,
     *   message
     * }
     */
    const data2 = JSON.parse(data);
    // get the socket.id based on the user ID
    const socketId = await socketClient.hget('socketIds', data2.userId);
    // send the message to the correct socket.id
    io.to(socketId).emit('message', data2.message);
  });
});
So when I deploy this code to Heroku, it will create 3 connections to the same Redis server on startup. But what if 2, 3, 4, ... people connect to this Node.js server? If 2 people connect, will there be 6 Redis connections, or only 3? In other words: will the Node.js server open 3 new Redis connections every time a user accesses the server, or will it always be 3 connections?
I'm trying to track all connections with CLIENT LIST in redis-cli, but it does not seem to give me the right numbers. I was testing my code with only one user connected to the socket server, and it showed 1 client in Redis (instead of 3 connections).
Thanks in advance.
It doesn't matter how many people are using the app: each Redis client instance holds only 1 connection at any time, which means you'll see at most 3 Redis connections per Node process.
You see only 1 connection because ioredis does not necessarily initiate the connection when the client is created; with the lazyConnect option, it connects when the first command is executed. You can call client.connect() in order to initiate the socket without executing a command.
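As a rough sketch of the connection count (the URL and lazyConnect here are assumptions for illustration): the clients are created once at process startup and reused by every socket.io handler, so the number of Redis connections per Node process stays constant no matter how many users connect.
const Redis = require('ioredis');

// created once at process startup; every socket.io handler reuses this client,
// so the Redis connection count does not grow with the number of users
const socketClient = new Redis('redis://localhost:6379', { lazyConnect: true });

async function main() {
  // with lazyConnect, the TCP connection is only opened here (or on the first command)
  await socketClient.connect();

  // both commands travel over the same single connection
  await socketClient.hset('socketIds', 'user-1', 'socket-abc');
  console.log(await socketClient.hget('socketIds', 'user-1'));
}

main().catch(console.error);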

Can't receive redis data from socket io

I'm building a realtime visualization using Redis as a pub/sub messenger between Python and Node. There's a Python script always running which sets a Redis hash with HMSET. That side of the app is working fine: if I enter the example command HGETALL sellers-80183917 in a Redis client, I get the proper data back.
The problem is on the JS side. I'm using the socket.io and redis Node.js libraries to listen to the Redis instance and publish the results online through a d3.js visualization.
I run the following code with node:
var express = require('express');
var app = express();
var redis = require('redis');

app.use(express.static(__dirname + '/public'));

var http = require('http').Server(app);
var io = require('socket.io')(http);
var sredis = require('socket.io-redis');
io.adapter(sredis({ host: 'localhost', port: 6379 }));

var redisSubscriber = redis.createClient(6379, 'localhost', {});

redisSubscriber.on('message', function (channel, message) {
  io.emit(channel, message);
});

app.get('/sellers/:seller_id', function (req, res) {
  var seller_id = req.params.seller_id;
  redisSubscriber.subscribe('sellers-'.concat(seller_id));
  res.render('seller.ejs', { seller: seller_id });
});

http.listen(3000, '127.0.0.1', function () {
  console.log('listening on *:3000');
});
And this is the relevant part of the seller.ejs file that's receiving the user requests and outputting the viz:
var socket = io('http://localhost:3000');
var stats;
var seller_key = 'sellers-'.concat(<%= seller %>);

socket.on(seller_key, function (msg) {
  stats = [];
  console.log('Im in');
  var seller = $.parseJSON(msg);
  var items = seller['items'];
  for (item in items) {
    var item_data = items[item];
    stats.push({
      'title': item_data['title'],
      'today_visits': item_data['today_visits'],
      'sold_today': item_data['sold_today'],
      'conversion_rate': item_data['conversion_rate']
    });
  }
  setupData(stats);
});
The problem is that the socket.on() handler never receives anything, and I don't see where the problem is, as everything else seems to be working fine.
I think that you might be confused as to what Pub/Sub in Redis actually is. It's not a way to listen to changes on hashes; you can have a Pub/Sub channel called sellers-1, and you can have a hash with the key sellers-1, but those are unrelated to each other.
As documented here:
Pub/Sub has no relation to the key space.
There is a thing called keyspace notifications that can be used to listen to changes in the key space (through Pub/Sub channels); however, this feature isn't enabled by default because it'll take up more resources.
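If you do want to go the keyspace-notification route, a minimal sketch on the Node side might look like the following; it assumes the feature has been enabled (e.g. CONFIG SET notify-keyspace-events Kh) and that the hashes live in database 0:
var redis = require('redis');
var notificationSubscriber = redis.createClient(6379, 'localhost');

// keyspace notifications arrive on __keyspace@<db>__:<key> channels;
// the message is the name of the command that touched the key (e.g. "hset")
notificationSubscriber.psubscribe('__keyspace@0__:sellers-*');

notificationSubscriber.on('pmessage', function (pattern, channel, command) {
  var key = channel.replace('__keyspace@0__:', '');
  console.log(key + ' was modified by ' + command);
  // fetch the new hash contents with a separate, non-subscriber client and emit them via socket.io
});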
Perhaps an easier method would be to publish a message after the HMSET, so any subscribers would know that the hash got changed (they would then retrieve the hash contents themselves, or the published message would contain the relevant data).
This brings us to the next possible issue: you only have one subscriber connection, redisSubscriber.
From what I understand from the Node.js Redis driver, calling .subscribe() on such a connection would remove any previous subscriptions in favor of the new one. So if you were previously subscribed to the sellers-1 channel and subscribe to sellers-2, you wouldn't be receiving messages from the sellers-1 channel anymore.
You can listen on multiple channels by either passing an array of channels, or by passing them as arguments:
redisSubscriber.subscribe([ 'sellers-1', 'sellers-2', ... ])
// Or:
redisSubscriber.subscribe('sellers-1', 'sellers-2', ... )
You would obviously have to track each "active" seller subscription. Either that, or create a new connection for each subscription, which also isn't ideal.
It's probably a better idea to have a single Pub/Sub channel on which all changes would get published, instead of a separate channel for each seller.
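A minimal sketch of that single-channel idea, combined with the publish-after-HMSET suggestion above (the sellers-updates channel name and message format are assumptions; the Python side would PUBLISH the seller id on that channel right after the HMSET, and io refers to the socket.io server from the question's code):
var redis = require('redis');
// separate clients: a connection in subscriber mode can't run HGETALL
var updatesSubscriber = redis.createClient(6379, 'localhost');
var reader = redis.createClient(6379, 'localhost');

updatesSubscriber.subscribe('sellers-updates');

updatesSubscriber.on('message', function (channel, sellerId) {
  // the published message is just the seller id; fetch the fresh hash ourselves
  reader.hgetall('sellers-' + sellerId, function (err, sellerData) {
    if (err || !sellerData) return;
    // forward it to the browsers listening for this seller
    io.emit('sellers-' + sellerId, JSON.stringify(sellerData));
  });
});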
Finally: if your seller id's aren't hard to guess (for instance, if it's based on an incremental integer value), it would be trivial for someone to write a client that would make it possible to listen in on any seller channel they'd like. It might not be a problem, but it is something to be aware of.

How to send a message to a specific client with socket.io when the application runs as a cluster with several processes on different ports?

The application starts in cluster mode; each worker sets up its own socket.io server using the Redis adapter:
app.set('port', httpPort);
let server = http.createServer(app);
let io = require('./socketServer')(server);
io.adapter(redis({host: host, port: port}));
app.set('io', io);
Then we require the main socket.io file (socketServer); after the socket is authorized and the connection event fires, we save the session ID in the variable socketID and store the current socket connection in the io.clients object:
io.sockets.on('connection', (socket) => {
  var socketID = socket.handshake.user.sid;
  io.clients[socketID] = socket;
  io.clients[socketID].broadcast.emit('loggedIn', socket.handshake.user.data);

  socket.on('disconnect', () => {
    delete io.clients[socketID];
  });
});
In front of the Node.js app we have nginx with a customized "upstream" block to provide sticky sessions (http://socket.io/docs/using-multiple-nodes/#nginx-configuration).
Then, when we want to send a message to a particular client, we get the user id in the controller, look up the session id for that user (we store these mappings in Redis during authorization), and then just send the message:
this.redis.getByMask(`sid_clients:*`, (err, rdbData) => {
  Async.each(clients, (client, next) => {
    let sid = `sid_clients:${client}`;
    let currentClient = rdbData[sid];
    if (!currentClient || !this.io.clients[currentClient]) return next();
    this.io.clients[currentClient].emit(event, data);
    return next();
  });
});
It works fine when we run the application in a single process, but not in cluster mode. The connection message "loggedIn" is sent to all clients on all processes, but when one process tries to send a message to a client connected to another process, it fails: each process has its own io.clients object with different contents, so the message can never reach the right client.
So how do I send events to a specific client in cluster mode? How can I keep all connected sockets in one place to avoid situations like mine?
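Since every worker already uses the Redis adapter, one way to make this work is to stop keeping socket objects in the per-process io.clients map and instead store only socket ids (plain strings), then emit to the room each socket automatically joins (named after its socket id); the adapter routes the event to whichever process owns the connection. A minimal sketch, where redisClient is an assumed plain Redis client and getByMask, clients, event and data come from the question's code:
// on connection, keep the sid -> socket.id mapping in Redis instead of an in-process object
io.sockets.on('connection', (socket) => {
  let socketID = socket.handshake.user.sid;
  redisClient.hset('socket_ids', socketID, socket.id);
  socket.on('disconnect', () => redisClient.hdel('socket_ids', socketID));
});

// later, from the controller, emit through the adapter instead of a local socket object
this.redis.getByMask(`sid_clients:*`, (err, rdbData) => {
  Async.each(clients, (client, next) => {
    let sid = rdbData[`sid_clients:${client}`];
    if (!sid) return next();
    redisClient.hget('socket_ids', sid, (err, socketId) => {
      // io.to(room) goes through the Redis adapter, so it reaches the socket
      // even when it is connected to another worker process
      if (socketId) this.io.to(socketId).emit(event, data);
      return next();
    });
  });
});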

Sticky socket.io sessions by cookie for node.js cluster without sticky express sessions

I am working with the express and socket.io libraries of Node.js on the same server, listening on the same port. I would like to use the cluster module to support round-robin load balancing, but I want the load-balancing behavior for express and socket.io to be different. The behavior is as follows:
Incoming connections for HTTP/S should connect to any single worker
Incoming connections for WS/S should connect to a specific worker, based on a cookie value (more broadly, based on some value available in the request)
Are there any available libraries to accomplish my desired behaviors? If not, how should I go about accomplishing these behaviors?
There are a bunch of ways you could do this. I'm going to link you to this guide on using Redis as a pub/sub, but I'll also give you a short overview of what it could look like.
So spawn two workers at startup, or however many you want:
var aWorker = cluster.fork();
var bWorker = cluster.fork();
then you need to set them up to listen on their respective ports, so using the net module:
var server1 = require('net').createServer([options], function (connection) {
  aWorker.send('ConnectionEvent', connection);
}).listen(80); // HTTP/WS

var server2 = require('net').createServer([options], function (connection) {
  bWorker.send('ConnectionEvent', connection);
}).listen(443); // HTTPS/WSS
In your worker process:
var app_server = require('express')().listen(0, 'localhost');
var io = require('socket.io')(app_server);
io.adapter(require('socket.io-redis')({ host: '127.0.0.1', port: **REDIS PORT** }));

io.on('connection', function (socket) {
  // Rest of your io server code
  ...
});

process.on('message', function (message, connection) {
  if (connection && message === 'ConnectionEvent') {
    app_server.emit('connection', connection);
    connection.resume();
  }
});
I believe the rooms feature of Socket.io would accomplish what you're trying to do in your second point, rather than relying on creating new worker tasks. That's just my opinion, though.
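To illustrate that last suggestion, a minimal sketch of using rooms instead of pinning a user to one worker: each worker runs the Redis adapter, every socket joins a room derived from the cookie/user value, and an emit to that room reaches the socket no matter which worker accepted the connection (the userId query parameter and the room naming are assumptions):
var http = require('http').createServer();
var io = require('socket.io')(http);
io.adapter(require('socket.io-redis')({ host: '127.0.0.1', port: 6379 }));

io.on('connection', function (socket) {
  // derive the grouping value from the handshake (a cookie, a query parameter, ...)
  var userId = socket.handshake.query.userId;
  socket.join('user:' + userId);
});

// called from any worker: the Redis adapter delivers the event to the user's
// sockets regardless of which worker accepted their connection
function notifyUser(userId, payload) {
  io.to('user:' + userId).emit('notification', payload);
}

http.listen(0, 'localhost');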

Socket.io with multiple Node.js hosts, emit to all clients

I am new to Socket.io and trying to get my head around the best approach to solve this issue.
We have four instances of a Node.js app running behind a load balancer.
What I am trying to achieve is for another app to POST some data to the load balancer URL which will hand if off to one of the instances.
The receiving instance will store the data, then use Socket.io to emit the data to the connected clients.
The issue is that a browser/client can only be connected to a single instance at a time.
I am trying to determine if there is a way to emit to all clients at once?
Or have the clients connect to multiple servers using io.connect?
Or is this a case for Redis?
Publish/Subscribe is what you need here. Redis will give you the functionality you're looking for out of the box. You just need to create a Redis client and subscribe to an update channel on each of your app server nodes. Then, publish the update when a POST is successful (or whatever). Finally, when a message arrives on the update channel, emit a socket.io event:
(truncated for brevity)
var express = require('express')
  , http = require('http')
  , socketio = require('socket.io')
  , redis = require('redis')
  // a client that has subscribed cannot publish, so use separate pub and sub clients
  , rcPub = redis.createClient()
  , rcSub = redis.createClient()
  ;

var app = express();
var server = http.createServer(app);
var io = socketio.listen(server);
server.listen(3000);

app.post('/targets', function (req, res) {
  // publish the POSTed data so every app server node hears about it
  rcPub.publish('update', JSON.stringify(req.body));
  res.end();
});

rcSub.on('connect', function () {
  // subscribe to the update channel
  rcSub.subscribe('update');
});

rcSub.on('message', function (channel, msg) {
  // util.log('Channel: ' + channel + ' msg: ' + msg);
  var msg = JSON.parse(msg);
  // emit to every socket that joined the 'update' room on this node
  io.sockets.in('update').emit('message', {
    channel: channel,
    msg: msg
  });
});
Then in the JS app, listen for that emitted message:
socket.on('message', function (data) {
  debugger;
  // do something with the updated data
});
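Note that io.sockets.in('update') only delivers to sockets that have joined the 'update' room, which the truncated server code above doesn't show; the connection handler would need something like the sketch below (or you could simply use io.sockets.emit(...) to broadcast to every connected client):
io.sockets.on('connection', function (socket) {
  // put every client in the 'update' room so the room-targeted emit reaches them
  socket.join('update');
});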
Of course, introducing this new Redis Server adds another single point of failure. A more robust implementation may use something like a message broker with AMQP or ZeroMQ or some similar networking library which provides pub/sub capabilities.
