I am trying to scale a simple socket.io app across multiple processes and/or servers.
Socket.io supports RedisStore but I'm confused as to how to use it.
I'm looking at this example,
http://www.ranu.com.ar/post/50418940422/redisstore-and-rooms-with-socket-io
but I don't understand how using RedisStore in that code would be any different from using MemoryStore. Can someone explain it to me?
Also, what is the difference between configuring socket.io to use RedisStore vs. creating your own Redis client and setting/getting your own data?
I'm new to node.js, socket.io and redis so please point out if I missed something obvious.
but I don't understand how using RedisStore in that code would be any different from using MemoryStore. Can someone explain it to me?
The difference is that when using the default MemoryStore, any message that you emit in a worker will only be sent to clients connected to the same worker, since there is no IPC between the workers. Using the RedisStore, your message will be published to a Redis server, which all your workers are subscribing to. Thus, the message will be picked up and broadcast by all workers, and all connected clients.
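For reference, a minimal sketch of what configuring RedisStore looked like in the 0.9-era socket.io API, assuming a Redis server on localhost (the same setup appears in a later answer below):

var RedisStore = require('socket.io/lib/stores/redis');
var redis = require('socket.io/node_modules/redis');

// every worker runs this same setup; Redis then relays
// emitted messages between all of them
io.set('store', new RedisStore({
    redisPub: redis.createClient(),
    redisSub: redis.createClient(),
    redisClient: redis.createClient()
}));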
Also what is difference between configuring socket.io to use redisstore vs. creating your own redis client and set/get your own data?
I'm not intimately familiar with RedisStore, and so I'm not sure about all differences. But doing it yourself would be a perfectly valid practice. In that case, you could publish all messages to a redis server, and listen to those in your socket handler. It would probably be more work for you, but you would also have more control over how you want to set it up. I've done something similar myself:
var redis = require("redis");

// Publishing a message somewhere
var pub = redis.createClient();
pub.publish("messages", JSON.stringify({type: "foo", content: "bar"}));

// Socket handler: each connection gets its own subscriber client,
// since a Redis client in subscriber mode can't issue other commands
io.sockets.on("connection", function(socket) {
    var sub = redis.createClient();
    sub.subscribe("messages");
    sub.on("message", function(channel, message) {
        socket.send(message);
    });
    socket.on("disconnect", function() {
        sub.unsubscribe("messages");
        sub.quit();
    });
});
This also means you have to take care of more advanced message routing yourself, for instance by publishing/subscribing to different channels. With RedisStore, you get that functionality for free by using socket.io channels (io.sockets.of("channel").emit(...)).
A potentially big drawback with this is that socket.io sessions are not shared between workers. This will probably mean problems if you use any of the long-polling transports.
I set up a small GitHub project that uses Redis as a data store, so you can run multiple socket.io server processes:
https://github.com/markap/socket.io-scale
Also what is difference between configuring socket.io to use redisstore vs. creating your own redis client and set/get your own data?
The difference is that when you use RedisStore, socket.io itself saves the socket heartbeat and session info into Redis, so clients keep working if you use cluster with node.js.
Without Redis, a client might hit a different node.js process the next time, and the session would be lost.
The difference is that if you have a cluster of node.js instances running, MemoryStore won't work, since its data is only visible to a single process.
Related
I have a nodejs application with cluster and I use Hazelcast for scalability. I need to add socket.io for real-time messaging, but when I connect to the socket on the worker processes, my client's socket object is lost.
I researched for a while but didn't find any solutions. There are many examples with Redis, but I cannot use Redis; I must use Hazelcast.
How can I share the sockets to every process and also every server with Hazelcast? Is there a way?
Node version: v10.16.3
Socket.io version: 2.3.0
Thanks in advance.
Hazelcast provides both a pub/sub mechanism and in-memory caching on a JVM-based Hazelcast cluster, and you can connect to the cluster via the Hazelcast node.js client. So it gives you the ability to manage and scale a Socket.io application via pub/sub topics, storing socket ids in a Hazelcast IMap data structure.
I can share some example pseudo-code with you:
/* Assumed multi-instance socket.io servers are running and
** listening to the Hazelcast pub/sub topics, so every socket.io
** server (and your other applications) can communicate via those topics.
** `client` is assumed to be an already-connected Hazelcast client.
*/
const topic = await client.getTopic('one-of-socket-rooms');
const map = await client.getMap('user-socket-ids');
io.on('connection', async (socket) => {
    console.log('a user connected with socketId: ' + socket.id);
    // assumed you already have userId
    await map.put(userId, socket.id);
    socket.join('one-of-socket-rooms');
    topic.publish('User with ' + socket.id + ' joined the room');
});
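And here is a rough sketch of the receiving side, so each server can relay topic messages to its own locally connected sockets (hedged: addMessageListener and message.messageObject follow the Hazelcast node.js client's topic listener API, and the relayed event name is made up):

// On every socket.io server: relay messages published to the
// Hazelcast topic to the sockets connected to this process
topic.addMessageListener((message) => {
    io.to('one-of-socket-rooms').emit('room-message', message.messageObject);
});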
If you can share a specific code piece, I can edit my answer according to your exact requirements.
You can't share sockets across processes. What you can do is distribute messages across processes using Redis or some other system. If you may use the long-polling fallback of socket.io, you will need to ensure sticky sessions are enabled; otherwise you need to ensure you are only using WebSockets. Socket.io provides an out-of-the-box solution for integrating Redis into your system to distribute messages across processes if your use case is simple.
I am developing a website where I will use microservices.
I will have a couple or more Node.js application that will use Socket.io.
I was trying to figure out how I will architecture this.
Can I use multiple Node.js apps with Socket.io connecting to a user, or will I run into conflicts? I can use NGINX as a proxy and a UUID to identify which microservice to send the request to. Does that make sense? Is there a better way?
Or I was also thinking of using a Node.js app as a proxy that receives all the Socket.io connections and then creates a connection with the user. But this seems to add to the network load because I am adding another microservice.
Anyways, I would love your views on this.
You can set up multiple Node.js services that all use the socket.io-redis lib to connect to a Redis server and share the socket IDs of all the Node.js services. That means all socket info is stored on the Redis server. When you emit an event, it will automatically be emitted across all Node.js services.
const server = require('http').createServer();
const io = require('socket.io')(server);
const redis = require('socket.io-redis');

io.adapter(redis({ host: 'localhost', port: 6379 }));

io.on('connection', socket => {
    // put the user's socket into a room named after user_id
    socket.join(user_id);
});

// emit an event to a specific user; every node process holding
// that user's connection will emit it
io.to(user_id).emit('hello', {key: 'ok'});
Depending on how your project goes, it may or may not be a good idea to build microservice communication over Socket.io.
The first thing that comes to mind is the poor guarantees of the socket.io messaging system, especially the fact that your messages are not stored on disk. While this might be convenient, it is also a real problem when it comes to building audit trails, debugging, or replayability.
Another issue I see is scalability/cluster-ability: yes, you can have multiple services communicating over one nodeJS broker, but what happens when you have to add more? How will you pass messages from one broker to the other? This will lead you to building an entire messaging system, and you will find it safer to use existing solutions (RabbitMQ, Kafka...).
I'm actually desperate now. I need to design and code a solution I never even thought I'd be making, and it seems completely unrealistic (for me) to achieve with plain Express. But I might be wrong.
What I need is:
REST API - done
Socket.IO server - done
bunch of Socket.IO-client connections - can be done
But now, what I need is some efficient way of making all of these communicate with each other. So the app basically has to be a bridge between all of them. I was thinking of using some kind of state-management library, like a Redux solution; that way I could (at least I think) achieve it pretty easily. But I haven't found anything of this kind for Node, nor found how to achieve it on the backend.
So the question is: what's the best solution to achieve one-to-many and many-to-one communication between multiple socket.io connections? Again, to make it clear:
socket-client connection = connecting to an already existing server, as a node app.
I am not speaking of making a simple solution for clients to communicate with each other on one server. I want to create one server and connect it to multiple others.
Okay, so, what you probably want to do is to create one "server" application, then connect your Electron app as well as the bots as clients to that server.
For the bots, which would be running as their own node scripts, you would be using the socket.io-client package, and then use that the same way you use the client in the browser. Call io() with your connection settings, get a socket from that, and start binding your events as usual:
let io = require('socket.io-client');
let socket = io('http://YOUR_SERVER_IP/');
socket.on('connect', function(){});
socket.on('event', function(data){});
socket.on('disconnect', function(){});
If you want to tell the difference between the bots and the Electron clients, I would send out some special event on connect that your server can then handle:
socket.on('connect', function(){
    socket.emit('identify_as_bot', {id: IDENTIFIER});
});
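On the server side, handling that event could look roughly like this (a sketch; the 'bots' room and the bot_command event are my own illustration, not from the question):

// Server side (sketch): mark sockets that identify themselves as bots
io.on('connection', function(socket) {
    socket.on('identify_as_bot', function(data) {
        socket.join('bots'); // lets you address all bots at once
        console.log('bot connected with id ' + data.id);
    });
});

// later, message only the bots:
io.to('bots').emit('bot_command', {action: 'ping'});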
I hope that helps.
From the socket.io docs [http://socket.io/docs/rooms-and-namespaces/#sending-messages-from-the-outside-world] I read the following but I can't seem to connect it to any use case in my head:
Sending messages from the outside-world

In some cases, you might want to emit events to sockets in Socket.IO namespaces / rooms from outside the context of your Socket.IO processes.

There’s several ways to tackle this problem, like implementing your own channel to send messages into the process.
To facilitate this use case, we created two modules:
socket.io-redis
socket.io-emitter
By implementing the Redis Adapter:
var io = require('socket.io')(3000);
var redis = require('socket.io-redis');
io.adapter(redis({ host: 'localhost', port: 6379 }));
you can then emit messages from any other process to any channel
var io = require('socket.io-emitter')();
setInterval(function(){
    io.emit('time', new Date);
}, 5000);
If you have a cluster of servers and want to talk to clients that are connected to different instances, you'll need a common storage -- that's when you use Redis.
You also mention io-emitter, which is a way for other processes to post messages to your clients. For example, if a worker needs to emit messages to your clients, it can use io-emitter. Redis is the common glue for sharing messages across different processes/servers.
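For example, a background worker with no Socket.IO server of its own could publish like this (a sketch, assuming Redis on localhost; the event name and payload are made up):

// in a separate worker process: socket.io-emitter publishes
// straight to Redis, no Socket.IO server needed here
var io = require('socket.io-emitter')({ host: '127.0.0.1', port: 6379 });
io.emit('job_finished', {jobId: 42});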
The module is needed only when you want to spread your solution across multiple servers or node processes. Through the Redis adapter, the multiple servers can broadcast to each other's clients.
Basically, suppose you have two servers, each running their own Socket.IO instance. Server A has three clients; Server B has two different clients. These two servers do not share any client information, so you won't be able to broadcast a message to all the users. The adapter gives you the ability to connect these different servers into one (using Redis), so you can broadcast to all the users.
There is also a good presentation about socket.io and Redis: http://www.slideshare.net/YorkTsai/jsdc2013-28389880.
I'm building a nodejs/socket.io based game and I'm trying to implement node clustering to take advantage of multicore machines (a few machines, each with a few cores). I figured that memcache would be a nice solution, but I'm not completely sure it would survive a high load, because each game will do about 50 writes/reads per second. Also, what would be the best solution to broadcast a message to all clients while they're connected to different servers? For example, player X is connected to node 1; he does a simple action, and how can I broadcast the action to player Y, who is connected to node 2?
If you are going to be clustering across threads, then using Redis as your Socket.IO store is a perfect solution.
Redis is an extremely fast database, entirely run from memory, that can support thousands of publish/subscribe messages in a single second. Socket.IO has built-in support for Redis and when using it, any message emitted from one instance of Socket.IO is published to Redis, and all Socket.IO instances that have subscribed to the same Redis instance will also emit the same message.
This is how you would set up each Socket.IO instance:
var RedisStore = require('socket.io/lib/stores/redis');
var redis = require('socket.io/node_modules/redis');

io.set('store', new RedisStore({
    redisPub: redis.createClient(),
    redisSub: redis.createClient(),
    redisClient: redis.createClient()
}));
When Socket.IO is set up like this and you have a Redis server, a simple io.sockets.emit() will broadcast to all clients on any server, regardless of which server executed the code, as long as all servers are publishing/subscribing to the same Redis instance.
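As a quick usage sketch (the event name and payload are made up):

// run from any one of the clustered processes; with the RedisStore
// configured as above, every client on every server receives it
io.sockets.emit('announcement', {text: 'hello, everyone'});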
You can actually serialize the object into a JSON string, pass it as an argument, and de-serialize it in each worker (one per CPU).
Otherwise you can use cluster.workers[<id>].send(<object>), which will automatically take care of serializing/de-serializing.
For more information: check the Node API docs.
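A minimal sketch of the second approach, using the standard Node cluster module (the message shape is made up):

const cluster = require('cluster');
const os = require('os');

if (cluster.isMaster) {
    // fork one worker per CPU
    for (let i = 0; i < os.cpus().length; i++) {
        cluster.fork();
    }
    // send an object to every worker; cluster takes care of
    // serializing/de-serializing it automatically
    for (const id in cluster.workers) {
        cluster.workers[id].send({type: 'broadcast', content: 'hello'});
    }
} else {
    // each worker receives the already de-serialized object
    process.on('message', (msg) => {
        console.log('worker ' + process.pid + ' received:', msg);
    });
}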