Subscribing to multiple topics: use wildcards or create instances? - node.js

I am currently in the middle of a project which includes the use of MQTT. I'm writing my application in node.js. My project requires connecting/communicating with multiple devices, where each device has its own specified topic(s). Later, the data from each message will be stored in a database (MongoDB). I'm using the mqtt package from npmjs.com.
Below is the example code from the mqtt package:
var mqtt = require('mqtt');
var client = mqtt.connect('mqtt://test.mosquitto.org');

client.on('connect', function () {
  client.subscribe('presence');
  client.publish('presence', 'Hello mqtt');
});

client.on('message', function (topic, message) {
  // message is Buffer
  console.log(message.toString());
  client.end();
});
My problem is how I should get messages from the devices. I can easily listen to everything by subscribing to the "#" wildcard, but then I have to manually sort/split the topic string, and so on.
However, I'm thinking of another option in which I would create a new mqtt client instance for each topic, but I do not know if there is any limit on the number of instances. I might use the forever function from the async package. My code might look like this:
var async = require('async');
var mqtt = require('mqtt');
var client = mqtt.connect("URL of MQTT broker");

var Subscriber = function(topic){
  this._topic = topic;
  var self = this;
  client.on('connect', function () {
    client.subscribe(self._topic);
  });
  async.forever(
    function(next){
      client.on('message', function (topic, message) {
        // TO DO store message
      });
    },
    function(err){
      client.end();
    }
  );
};

module.exports = Subscriber;
Does anyone have any recommendation?

I would not recommend creating a separate connection for each subscription you want to make. Each connection is a new TCP connection and would waste resources in both your application and the broker.
The normal pattern here would be to use a wildcard subscription. The message callback is handed the topic each message arrived on, so, as long as you structure your topic space sensibly, there is very little overhead in routing each message appropriately in your application.
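For example, here is a minimal sketch using the mqtt package, assuming your devices publish under a shared prefix such as devices/<deviceId>/<sensor> (the topic layout and the MongoDB step are illustrative, not prescribed):

var mqtt = require('mqtt');
var client = mqtt.connect('mqtt://test.mosquitto.org');

client.on('connect', function () {
  // one connection, one wildcard subscription covering every device
  client.subscribe('devices/+/+');
});

client.on('message', function (topic, message) {
  // topic might be e.g. 'devices/sensor42/temperature'
  var parts = topic.split('/');
  var deviceId = parts[1];
  var reading = parts[2];
  // TODO: insert { deviceId, reading, value: message.toString() } into MongoDB here
  console.log(deviceId, reading, message.toString());
});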

Related

Websocket + Redis: multiple channels, specific subscriptions/publishing

I'm new to websockets, and am wondering how best to go about this.
My scenario: I have a server that handles different classes of users. For this example, let's say the classes are "mice", "cats", and "dogs".
Each of those classes should have its own channel to listen to for changes, e.g. "mice-feed", "cat-feed", and "dog-feed".
My question is: after the server authenticates and determines the class of the current user, what's the best way to subscribe them to a specific channel or channels, so that when I broadcast messages to said channel(s), only members of the relevant classes receive them (as opposed to everyone currently connected to the server)?
My current code setup looks like this:
var ws = require('ws');
var redis = require('redis');

/* LOCATION 1 */
// prep redis, for websocket channels
var pub = redis.createClient();
var sub = redis.createClient();

// subscribe to our channels
sub.subscribe('mice-feed');
sub.subscribe('cat-feed');
sub.subscribe('dog-feed');

// declare the server
const wsServer = new ws.Server({
  noServer: true,
  path: "/",
});

/* ... removing some code for brevity... */

wsServer.on("connection", function connection(websocketConnection, connectionRequest) {
  /* LOCATION 2 */
});
Do I put the redis declarations in LOCATION 1 (where they currently are), or in LOCATION 2 (when a successful connection is established)? Or neither of the above?
(also: I know it's possible to do this on the websocket end directly, i.e. iterate through every client and ws.send if some criterion is matched, but iteration can become costly, and I'm wondering if I can do it as a redis-channel-wide operation instead)
If I were building this, my first approach would be this:
// connect to Redis
const { createClient } = require('redis');
const client = createClient();
client.on('error', (err) => console.log('Redis Client Error', err));
await client.connect();

// declare the server
const wsServer = new ws.Server(...elided...);

// handle connection
wsServer.on('connection', async (websocketConnection, connectionRequest) => {
  // a dedicated subscriber connection for this user
  const sub = client.duplicate();
  await sub.connect();
  // figure out the feed
  const feed = 'animal-feed';
  await sub.subscribe(feed, message => {
    ...do stuff...
  });
});
It's pretty straightforward, but it would result in every user having a dedicated connection to Redis. That may or may not matter depending on how many users you anticipate having.
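If the connection count ever becomes a concern, a LOCATION 1 variant with one shared subscriber is also possible. A rough sketch, where feedFor() is a hypothetical helper that maps the authenticated user to their class:

// one shared subscriber for every user (LOCATION 1 style)
const sub = client.duplicate();
await sub.connect();

// track which sockets belong to which feed
const feeds = {
  'mice-feed': new Set(),
  'cat-feed': new Set(),
  'dog-feed': new Set(),
};

for (const feed of Object.keys(feeds)) {
  await sub.subscribe(feed, (message) => {
    // fan out only to the sockets registered for this feed
    for (const conn of feeds[feed]) {
      conn.send(message);
    }
  });
}

wsServer.on('connection', (websocketConnection, connectionRequest) => {
  const feed = feedFor(connectionRequest); // hypothetical: derive the class from auth
  feeds[feed].add(websocketConnection);
  websocketConnection.on('close', () => feeds[feed].delete(websocketConnection));
});

The trade-off flips: one Redis connection total, but your app now has to maintain the socket bookkeeping itself.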

Socket IO Server Clusters working with Redis Pub/Sub

So firstly, I have built a microservice that fetches a football API and, through Redis's pub/sub system, publishes any changes to the livescores.
Now my server, with sockets and routes, will run in cluster mode. I have already set this up with socketio-redis. Here is a snippet of this setup:
const io = require('socket.io')();
const sRedis = require('socket.io-redis');
const adapter = sRedis({ host: 'localhost', port: 6379 });
const { promisify } = require('util');
const Redis = require('ioredis');

const redis = new Redis();
redis.subscribe('livescore');

io.adapter(adapter);

const ioa = io.of('/').adapter;
ioa.clients = promisify(ioa.clients);
ioa.clientRooms = promisify(ioa.clientRooms);
ioa.remoteJoin = promisify(ioa.remoteJoin);
ioa.remoteLeave = promisify(ioa.remoteLeave);
ioa.allRooms = promisify(ioa.allRooms);

// notice this listener
redis.on('message', (channel, message) => {
  io.emit('livescore', message);
});

io.on('connect', async (socket) => {
  socket.clientRooms = () => ioa.clientRooms(socket.id);
  socket.remoteJoin = (room) => ioa.remoteJoin(socket.id, room);
  socket.remoteLeave = (room) => ioa.remoteLeave(socket.id, room);
  socket.remoteDisconnect = () => ioa.remoteDisconnect(socket.id);

  socket.on('join room', async (id) => {
    await socket.remoteJoin(id);
    socket.emit('join room', `You have joined room ${id}`);
    socket.broadcast.emit('join room', `${socket.id} has joined.`);
  });

  socket.on('leave room', (id) => {
    socket.remoteLeave(id);
  });
});

module.exports = io;
So, if I run a single instance of this node app, everything works perfectly.
But if I run it in cluster mode, let's say with 4 instances (I'm running cluster mode with pm2), the following happens:
The microservice publishes an event.
Each instance has a subscription to the 'livescore' channel.
Each instance does io.emit() (to all clients).
The client gets the same event 4 times at almost the same time.
I figured out why the client gets 4 identical events, but I want to know what the right way of handling this is.
My only thought for a solution is to do the redis sub on just one instance and publish everything from that one, but I fear that would be too much work for a single instance?
Any ideas?
There are probably multiple solutions to fix this; you could, for example:
Use a message queue instead of pub/sub
Depending on the amount of processing, you probably only want one node to process each message. Pub/sub is not what you want in that case. You could, for example, store your messages in a list and use the LPOP command to fetch and delete a message. Then you could say "the first one catches it": this way only one of your servers will do the work, but essentially a random one.
You could also use a distinct message queue like RabbitMQ, SQS, etc.
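A rough sketch of the list-based variant with ioredis (the livescore-queue key name is made up, and the microservice would LPUSH to it instead of publishing):

const Redis = require('ioredis');
const redis = new Redis();

// the microservice side would do: redis.lpush('livescore-queue', message)
async function consume() {
  while (true) {
    // BRPOP blocks until a message arrives; exactly one instance wins each message
    const [, message] = await redis.brpop('livescore-queue', 0);
    // the socket.io-redis adapter then fans this emit out to every instance's clients
    io.emit('livescore', message);
  }
}
consume();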
Use socket.io-emitter to send messages
Since you're using socket.io-redis anyway, your messages get distributed to your nodes. There's a project that is part of socket.io-redis, called socket.io-emitter. It can be used to send messages to all your nodes without being one itself. If you implement that in your worker microservice (the one that currently writes the messages to "livescore"), you can send messages directly to your clients.
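Roughly, the microservice side would look like this (assuming it points at the same Redis instance the adapter uses; this would replace the existing redis.publish('livescore', ...) call):

// in the worker microservice
const emitter = require('socket.io-emitter')({ host: 'localhost', port: 6379 });

// message is the same update payload as before; this goes through the adapter,
// so each connected client receives it exactly once
emitter.emit('livescore', message);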
That might not work if you need to process the messages in your node app though.

Can't receive redis data from socket io

I'm building a realtime visualization using redis as a pub/sub messenger between python and node. There's a python script running continuously which sets a redis hash with HMSET. That side of the app is working fine: if I enter an example command like HGETALL 'sellers-80183917' in a redis client, I get the proper data back.
The problem is on the js side. I'm using the socket.io and redis node.js libraries to listen to the redis instance and publish the results online through a d3.js viz.
I run the following code with node:
var express = require('express');
var app = express();
var redis = require('redis');

app.use(express.static(__dirname + '/public'));

var http = require('http').Server(app);
var io = require('socket.io')(http);
var sredis = require('socket.io-redis');
io.adapter(sredis({ host: 'localhost', port: 6379 }));

redisSubscriber = redis.createClient(6379, 'localhost', {});

redisSubscriber.on('message', function(channel, message) {
  io.emit(channel, message);
});

app.get('/sellers/:seller_id', function(req, res){
  var seller_id = req.params.seller_id;
  redisSubscriber.subscribe('sellers-'.concat(seller_id));
  res.render( 'seller.ejs', { seller: seller_id } );
});

http.listen(3000, '127.0.0.1', function(){
  console.log('listening on *:3000');
});
And this is the relevant part of the seller.ejs file that's receiving the user requests and outputting the viz:
var socket = io('http://localhost:3000');
var stats;
var seller_key = 'sellers-'.concat(<%= seller %>);

socket.on(seller_key, function(msg){
  stats = [];
  console.log('Im in');
  var seller = $.parseJSON(msg);
  var items = seller['items'];
  for(item in items) {
    var item_data = items[item];
    stats.push({
      'title': item_data['title'],
      'today_visits': item_data['today_visits'],
      'sold_today': item_data['sold_today'],
      'conversion_rate': item_data['conversion_rate']
    });
  }
  setupData(stats);
});
The problem is that the socket.on() handler never receives anything, and I can't see where the problem is, as everything else seems to be working fine.
I think that you might be confused as to what Pub/Sub in Redis actually is. It's not a way to listen to changes on hashes; you can have a Pub/Sub channel called sellers-1, and you can have a hash with the key sellers-1, but those are unrelated to each other.
As documented here:
Pub/Sub has no relation to the key space.
There is a thing called keyspace notifications that can be used to listen to changes in the key space (through Pub/Sub channels); however, this feature isn't enabled by default because it'll take up more resources.
Perhaps an easier method would be to publish a message after the HMSET, so any subscribers would know that the hash got changed (they would then retrieve the hash contents themselves, or the published message would contain the relevant data).
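As an illustration in Node (your writer is Python, but the same two steps apply there; the updateSeller name is made up):

var redis = require('redis');
var writer = redis.createClient(6379, 'localhost');

function updateSeller(sellerId, stats) {
  writer.hmset('sellers-' + sellerId, stats, function (err) {
    if (err) throw err;
    // tell any subscribers the hash changed; here the message carries the data itself
    writer.publish('sellers-' + sellerId, JSON.stringify(stats));
  });
}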
This brings us to the next possible issue: you only have one subscriber connection, redisSubscriber.
From what I understand from the Node.js Redis driver, calling .subscribe() on such a connection would remove any previous subscriptions in favor of the new one. So if you were previously subscribed to the sellers-1 channel and subscribe to sellers-2, you wouldn't be receiving messages from the sellers-1 channel anymore.
You can listen on multiple channels either by passing an array of channels, or by passing them as separate arguments:
redisSubscriber.subscribe([ 'sellers-1', 'sellers-2', ... ])
// Or:
redisSubscriber.subscribe('sellers-1', 'sellers-2', ... )
You would obviously have to keep track of each "active" seller subscription. Either that, or create a new connection for each subscription, which isn't ideal either.
It's probably a better idea to have a single Pub/Sub channel on which all changes would get published, instead of a separate channel for each seller.
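A minimal sketch of that idea, assuming the publisher includes the seller id in the payload:

// one channel for all sellers; each message says which seller it concerns
redisSubscriber.subscribe('sellers');

redisSubscriber.on('message', function (channel, message) {
  var update = JSON.parse(message); // e.g. { "seller_id": "80183917", "items": [...] }
  io.emit('sellers-' + update.seller_id, message);
});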
Finally: if your seller IDs aren't hard to guess (for instance, if they're based on an incremental integer value), it would be trivial for someone to write a client that listens in on any seller channel they'd like. It might not be a problem, but it is something to be aware of.

ZeroMQ: Can subscribers publish?

I am trying to implement an idea I have for a pub/sub app I'm building in Node.js with 0mq. I want to use the publishing of messages as a kind of event system.
Here's an example: Say I have a publisher and two subscribers.
The publisher sends a message whenever a file is changed:
//publisher.js
'use strict';
const fs = require('fs'),
      zmq = require('zmq'),
      // create publisher endpoint
      publisher = zmq.socket('pub'),
      filename = 'target.txt';

fs.watch(filename, function(){
  // send message to any subscribers
  publisher.send('files changed');
});

// listen on TCP port 5432
publisher.bind('tcp://*:5432', function(err) {
  console.log('Listening for zmq subscribers...');
});
Subscriber #1 logs everything that happens to the console:
//subscriber1.js
'use strict';
const zmq = require('zmq'),
      subscriber = zmq.socket('sub');

// subscribe to all messages
subscriber.subscribe('');

// handle messages from publisher
subscriber.on("message", function(data) {
  console.log('Got a message: ' + data);
});

// connect to publisher
subscriber.connect("tcp://localhost:5432");
Subscriber #2 validates the changes to the file and sends a message to let us know that it looks ok:
//subscriber2.js
'use strict';
const zmq = require('zmq'),
      subscriber = zmq.socket('sub');

// subscribe to all messages
subscriber.subscribe('');

// handle messages from publisher
subscriber.on("message", function(data) {
  //pretend I did some work here
  subscriber.send('File looks great!');
});

// connect to publisher
subscriber.connect("tcp://localhost:5432");
What I'd expect is that sending a message back through the socket from subscriber 2 would forward it on to subscriber 1, who would log it. But that doesn't seem to be happening.
I am probably fundamentally misunderstanding pub/sub, but is it possible to do what I'm talking about? If so, do I need to use a different pattern, or am I just using the zmq API wrong?
I had to do a little extra plumbing to get the results I was looking for.
I had to add a publisher socket to subscriber2.js (turning it into a subscriber-publisher) and publish from it on a different port. Then I had to subscribe subscriber1.js to the two endpoints bound by publisher.js and the new subscriber-publisher.
So now I understand that in order to do this with pub/sub, I have to build more sockets, and do a bit more manual subscription than I thought.
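For anyone curious, the plumbing looked roughly like this (port 5433 is arbitrary):

//subscriber-publisher.js (sketch)
'use strict';
const zmq = require('zmq'),
      subscriber = zmq.socket('sub'),
      publisher = zmq.socket('pub');

subscriber.subscribe('');

subscriber.on('message', function(data) {
  // pretend I did some work here, then report the result on our own PUB socket
  publisher.send('File looks great!');
});

subscriber.connect('tcp://localhost:5432');

// bind a second endpoint for downstream subscribers
publisher.bind('tcp://*:5433', function(err) {
  if (err) console.log(err);
});

subscriber1.js then connects to both endpoints, since a SUB socket can connect to several publishers at once:

subscriber.connect('tcp://localhost:5432');
subscriber.connect('tcp://localhost:5433');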

Socket.io with multiple Node.js hosts, emit to all clients

I am new to Socket.io and trying to get my head around the best approach to solve this issue.
We have four instances of a Node.js app running behind a load balancer.
What I am trying to achieve is for another app to POST some data to the load balancer URL, which will hand it off to one of the instances.
The receiving instance will store the data, then use Socket.io to emit the data to the connected clients.
The issue is that a browser/client can only be connected to a single instance at a time.
I am trying to determine: is there a way to emit to all clients at once?
Or have the clients connect to multiple servers using io.connect?
Or is this a case for Redis?
Publish/subscribe is what you need here. Redis will give you the functionality you're looking for out of the box. You just need to create a redis client and subscribe to an update channel on each of your app server nodes. Then, publish to the update channel when a POST is successful (or whatever). Finally, have the redis client listen for messages and, on each message, emit a socket.io event:
(truncated for brevity)
var express = require('express')
  , http = require('http')
  , socketio = require('socket.io')
  , redis = require('redis')
  // a connection in subscriber mode can't publish, so use two clients
  , rcPub = redis.createClient()
  , rcSub = redis.createClient()
  ;

var app = express();
var server = http.createServer(app);
var io = socketio.listen(server);
server.listen(3000);

app.post('/targets', function(req, res){
  // note: req.body (not res.body); requires a body-parsing middleware
  rcPub.publish('update', JSON.stringify(req.body));
  res.end();
});

rcSub.on('connect', function(){
  // subscribe to the update channel
  rcSub.subscribe('update');
});

rcSub.on('message', function(channel, msg){
  // util.log('Channel: ' + channel + ' msg: ' + msg);
  var data = JSON.parse(msg);
  io.sockets.in('update').emit('message', {
    channel: channel,
    msg: data
  });
});
Then in the JS app, listen for that emitted message:
socket.on('message', function(data){
  debugger;
  // do something with the updated data
});
Of course, introducing this new Redis server adds another single point of failure. A more robust implementation might use a message broker speaking AMQP, or ZeroMQ, or some similar networking library that provides pub/sub capabilities.
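For example, a rough sketch of the same fan-out with amqplib and a fanout exchange (the 'updates' exchange name is arbitrary):

const amqp = require('amqplib');

async function start() {
  const conn = await amqp.connect('amqp://localhost');
  const ch = await conn.createChannel();
  await ch.assertExchange('updates', 'fanout', { durable: false });

  // each app server gets its own exclusive queue bound to the fanout exchange
  const { queue } = await ch.assertQueue('', { exclusive: true });
  await ch.bindQueue(queue, 'updates', '');

  // the POST handler would do:
  //   ch.publish('updates', '', Buffer.from(JSON.stringify(req.body)));
  ch.consume(queue, function (msg) {
    if (msg) {
      io.sockets.emit('message', { msg: JSON.parse(msg.content.toString()) });
    }
  }, { noAck: true });
}

start();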
