Subscribe to multiple channels with ioredis - node.js

I have a broadcast server built with socket.io and ioredis in Node.
However, in its current form I can only subscribe to one channel at a time:
var Redis = require('ioredis');
var redis = new Redis();
redis.subscribe('mychannel');
Considering that I will have a large number of channels (one for each registered user, for instance), I cannot hard-code every channel name on the Node server.
I've also tried redis.subscribe('*'), but without success.
Any light?

Using redis.psubscribe('*') and redis.on('pmessage', handlerFunction) did the trick.
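For reference, a minimal sketch of both approaches with ioredis: subscribe accepts several channel names in one call, and psubscribe matches channels by pattern (the user:* naming below is only an example):
var Redis = require('ioredis');
var redis = new Redis();

// Option 1: subscribe to several explicit channels in one call
redis.subscribe('news', 'alerts', function (err, count) {
  // count = number of channels this client is now subscribed to
});
redis.on('message', function (channel, message) {
  console.log('received', message, 'on', channel);
});

// Option 2: pattern subscription, e.g. one channel per user ('user:42', 'user:43', ...)
redis.psubscribe('user:*', function (err, count) {});
redis.on('pmessage', function (pattern, channel, message) {
  console.log('received', message, 'on', channel, 'matching', pattern);
});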

Related

Call a Node.js method without loading a view

In my application, Node.js is running on port xxx. A mouse click event triggers a Node.js function using socket.io. My application has a real-time notification feature.
My question is:
Can I trigger a Node.js method when a real-time notification comes from Google, given that the method handling this notification does not load any view?
For pushing from your server to Node you can use Redis. You can use code like the snippet below; for that you need to install Redis (installation steps are easy to find on Google).
On your server side, suppose you are receiving the data in your function:
$redisobject = new Redis();
$redisobject->connect('127.0.0.1');
$purchase_info = json_encode(array('user_id' => '2', 'purchase_information' => 'sample_data')); // JSON-encoded data you want to publish
$redisobject->publish('redis_data_php', $purchase_info); // your channel name and the data
Inside your Node.js app, after installing the redis module via npm:
var redis = require('redis');
var redis_client = redis.createClient();
redis_client.subscribe('redis_data_php'); // same channel name as on the PHP side
redis_client.on('message', function (channel, message) {
  var purchase_data = JSON.parse(message);
  console.log(purchase_data);
  // 'socket' is assumed to be a connected socket.io socket for the target client
  socket.emit('redis_data', { "info": purchase_data.purchase_information }); // emit the data to the client as you need
});
Basically, Redis creates a channel between PHP and Node.js, so you can communicate over that channel.
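For completeness, a minimal sketch of the browser-side counterpart that listens for the 'redis_data' event emitted above (the connection URL is left to your setup):
// Browser-side socket.io client
var socket = io(); // connects to the host/port that serves the page
socket.on('redis_data', function (data) {
  console.log('purchase info:', data.info);
});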
You need to install the following:
Install Redis (if you are using Ubuntu, follow this guide: https://www.digitalocean.com/community/tutorials/how-to-install-and-use-redis)
Then install Redis for PHP (https://joshtronic.com/2014/05/12/how-to-install-phpredis-on-ubuntu-1404-lts/)
Install the Redis client for Node (npm install redis)
Hope this will work for you...

Redis pub/sub limit

Using Redis pub/sub, is there a way to limit the number of listeners that Redis will publish to?
http://redis.io/commands/publish
For example, using Node.js parlance:
var redis = require('redis');
var rcPub = redis.createClient();
var rcSub = redis.createClient();
rcPub.publish('channel','messageA',{limit:1}); //desired functionality/syntax
Basically, when I send messageA to Redis, I want to tell Redis to publish the message to only one listener/subscriber. It seems possible, but can Redis do this?
In the Redis docs, the command is:
PUBLISH channel message
what I am looking for is:
PUBLISH channel message limit
where limit is an integer. Seems reasonable, and easy to implement from Redis' perspective.
You can create a channel per subscriber, and then publish to a single user's channel when you need to communicate with that user.
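A minimal sketch of that approach with node_redis, where the per-user channel naming (user:<id>) is only an illustrative convention:
var redis = require('redis');
var rcPub = redis.createClient();
var rcSub = redis.createClient();

var userId = 42; // illustrative user id
// Each user listens on a dedicated channel, e.g. "user:42"
rcSub.subscribe('user:' + userId);

rcSub.on('message', function (channel, message) {
  console.log('only this subscriber receives', message, 'on', channel);
});

// Publish once the subscription is confirmed, to that single user's channel
rcSub.on('subscribe', function (channel, count) {
  rcPub.publish(channel, 'messageA');
});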

How to check socket is alive (connected) in socket.io with multiple nodes and socket.io-redis

I am using socket.io with multiple nodes, socket.io-redis and nginx. I followed this guide: http://socket.io/docs/using-multiple-nodes/
What I am trying to do: in a function (server side), I want to check by socket id whether that socket is currently connected or disconnected.
I tried io.of('namespace').connected[socketid], but it only works for the current process (i.e. it can only see sockets connected to the current process).
Can anyone help me? Thanks in advance.
How can I check whether a socket is alive (connected) given its socket id? I tried
namespace.connected[socketid], but it only works for the current process.
As you said, separate processes mean that sockets are only registered in the process they first connected to. You need to use socket.io-redis to connect all your nodes together, and what you can do is broadcast an event each time a client connects/disconnects, so that each node keeps an up-to-date, real-time list of all the clients.
Check out here
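One way to implement the shared, real-time client list described above is to keep it in Redis itself rather than in each process; this is a minimal sketch using node_redis, where the set key online_sockets is just an illustrative name:
var redis = require('redis');
var client = redis.createClient();
var io = require('socket.io')(3000);

io.on('connection', function (socket) {
  // Record this socket id in a shared Redis set (key name is illustrative)
  client.sadd('online_sockets', socket.id);

  socket.on('disconnect', function () {
    client.srem('online_sockets', socket.id);
  });
});

// Any node can now check whether a given socket id is connected somewhere in the cluster
function isSocketAlive(socketId, callback) {
  client.sismember('online_sockets', socketId, function (err, result) {
    callback(err, result === 1);
  });
}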
As mentioned above, you should use socket.io-redis to get it working across multiple nodes:
var io = require('socket.io')(3000);
var redis = require('socket.io-redis');
io.adapter(redis({ host: 'localhost', port: 6379 }));
I had the same problem and found no ready-made solution, so I logged the client object to see which methods and variables I could use. There is the client.conn.readyState property for the state of the connection ("open"/"closed") and the client.onclose() function to capture the closing of the connection.
const app = require('express')(); // assumption: an Express app (any request handler works here)
const server = require('http').createServer(app);
const io = require('socket.io')(server);
let clients = [];

io.on('connection', (client) => {
  clients.push(client);
  console.log(client.conn.readyState); // state of the underlying connection

  client.onclose = () => {
    // do something when the connection closes
    console.log(client.conn.readyState);
    clients.splice(clients.indexOf(client), 1);
  };
});
When deploying a Socket.IO application on a multi-node cluster (that is, multiple Socket.IO servers), there are two things to take care of:
Using the Redis adapter.
Enabling the sticky-session feature: when a request comes from a Socket.IO client (browser) to your app, it gets associated with a particular session id, and subsequent requests must keep being routed to the same process (Pod in Kubernetes) that originated that id.
you can learn more about this from this Medium story (source code available) https://saphidev.medium.com/socketio-redis...

SocketIO on a Node.js cluster

I have a standalone Node.js app with a Socket.IO server that listens on a certain port, e.g. 8888. Now I am trying to run this app in a cluster, and because the cluster assigns requests to workers randomly, Socket.IO clients in XHR polling mode that have handshaken and been authorized with one worker get routed to another worker where they are not handshaken, and the mess begins.
And because workers don't share anything, I can't find a workaround. Is there a known solution to this issue?
There is no "simple" solution. What you have to do is the following:
When a client connects to a worker, save the connection id together with the worker id and, optionally, an additional identification id in a global store that is accessible to all workers (e.g. Redis).
If a client gets routed to another worker, use the store to look up which worker is responsible for this client (either by the connection id or by the additional identification id), and then hand the request over to that worker (either via Node.js worker/master/worker communication or via Redis pub/sub).
I have implemented such a thing with SockJS and an additional degree of complexity: I have two Node.js servers with four workers each, so I had to use Redis pub/sub for worker-to-worker communication, because it is not guaranteed that they are on the same machine.
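A rough sketch of the bookkeeping described above, assuming the node_redis client; the key and channel names (socket-owner, worker:<id>) are illustrative:
var cluster = require('cluster');
var redis = require('redis');
var store = redis.createClient();
var pub = redis.createClient();
var sub = redis.createClient();

var workerId = cluster.isWorker ? String(cluster.worker.id) : 'master';

// 1. When a client connects to this worker, remember which worker owns it
function registerConnection(connectionId) {
  store.hset('socket-owner', connectionId, workerId);
}

// 2. When a request for a known connection lands on the wrong worker,
//    look up the owner and hand the payload over via Redis pub/sub
function forwardToOwner(connectionId, payload) {
  store.hget('socket-owner', connectionId, function (err, ownerId) {
    if (err || !ownerId) return;
    pub.publish('worker:' + ownerId, JSON.stringify({ connectionId: connectionId, payload: payload }));
  });
}

// 3. Every worker listens on its own channel for handed-over messages
sub.subscribe('worker:' + workerId);
sub.on('message', function (channel, message) {
  var data = JSON.parse(message);
  // ... deliver data.payload to the locally registered connection data.connectionId
});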
Actually, there is a simple solution: using Redis to store the sockets' state.
Everything is explained in Socket.IO documentation:
The default 'session' storage in Socket.IO is in memory (MemoryStore).
The MemoryStore only allows you to deploy socket.io on a single
process. If you want to scale to multiple process and / or multiple
servers you can use our RedisStore which uses the Redis NoSQL database
as man in the middle.
So in order to change the store instance to RedisStore we add this:
var RedisStore = require('socket.io/lib/stores/redis')
  , redis = require('socket.io/node_modules/redis')
  , pub = redis.createClient()
  , sub = redis.createClient()
  , client = redis.createClient();

// Needs to be done after 'listen()'
io.set('store', new RedisStore({
    redisPub : pub
  , redisSub : sub
  , redisClient : client
}));
Of course you will need to have a redis server running.

socketio and redisstore scaling efficiency

I am working on a pretty big project that involves sending data between clients, so I am researching some of the newer technologies out there. Anyway, I thought I'd give Node.js a try. I just have a question about socket.io and Redis.
When we use the pub/sub functions in socket.io, does every client connection create a new connection to Redis? Or does socket.io use at most three connections (in total, regardless of the number of clients) to do the pub/sub work?
From the source, it seems that each client connection has two associated subscriptions to Redis (this.store in the code), but that each socket.io server has only three connections to Redis (source).
this.store.subscribe('message:' + data.id, function (packet) {
  self.onClientMessage(data.id, packet);
});

this.store.subscribe('disconnect:' + data.id, function (reason) {
  self.onClientDisconnect(data.id, reason);
});
Redis should be able to handle a lot of connections as well as subscriptions, but benchmarking is recommended as always.
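If you want a rough feel for the subscription overhead before committing, a small benchmark sketch like the following (using node_redis; the channel name and client count are arbitrary) opens many subscriber connections and times a single fan-out publish:
var redis = require('redis');
var NUM_SUBSCRIBERS = 500; // arbitrary, tune for your workload
var pub = redis.createClient();
var ready = 0;
var received = 0;
var start;

for (var i = 0; i < NUM_SUBSCRIBERS; i++) {
  var sub = redis.createClient();
  sub.subscribe('bench-channel');
  sub.on('subscribe', function () {
    ready++;
    if (ready === NUM_SUBSCRIBERS) {
      // all subscriptions confirmed: time one fan-out publish
      start = Date.now();
      pub.publish('bench-channel', 'ping');
    }
  });
  sub.on('message', function () {
    received++;
    if (received === NUM_SUBSCRIBERS) {
      console.log('fan-out to', NUM_SUBSCRIBERS, 'subscribers took', (Date.now() - start), 'ms');
      process.exit(0);
    }
  });
}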
