Mosca MQTT broker: read published message - node.js

Hello, I am using the Mosca MQTT broker and I want to read the message that a client publishes to a topic.
Is there any way to do that?
In the 'published' event I log packet.payload, but it only prints the client id and the topic.
server.on('published', function(packet, client) {
  console.log('Published', packet.payload);
});
Thank you

You could use the 'published' callback to log the published message, either to a file or maybe to your database, so you can access it whenever you need it, like:
server.on('published', function(packet, client) {
  // Do what you want with your message here;
  // packet.payload is a Buffer, so decode it to get the text
  var msg = packet.payload.toString('utf8');
});
Be careful of internal messages, though: this callback does not filter them for you, so you have to do it yourself.
Refer to this answer here.

server.on('published', function(packet, client) {
  console.log('Published: ', packet.payload.toString('utf8'));
});
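Separately, a minimal sketch of filtering out the broker's internal messages before logging; it assumes the internal topics are the ones prefixed with $SYS/ (adjust the checks to your setup):
server.on('published', function(packet, client) {
  // Assumption: broker-internal notifications arrive on topics starting with "$SYS/"
  if (packet.topic.indexOf('$SYS/') === 0) {
    return; // skip internal messages
  }
  // "client" is undefined when the broker itself published the packet
  if (!client) {
    return;
  }
  console.log('Published on', packet.topic, ':', packet.payload.toString('utf8'));
});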

Related

How do I let a socket.io server re-emit a message to a particular socket when that socket re-connects?

My server code uses io.to(socketId).emit('xxx'); to send a message to a particular socket, as described at https://socket.io/docs/emit-cheatsheet/
But when the connection is bad and the client disconnects and connects again, socketId changes, and io.to(socketId).emit('xxx'); then fails. So how do I implement a re-emit mechanism to handle the reconnection?
My question is kind of the opposite of this one, socket.io stop re-emitting event after x seconds/first failed attempt to get a response; from it I learned that the socket.io client can re-emit when it reconnects, but the server side seems to lack that support.
--- update ---
As the comment asked, here is the code context where I save the socketId value that gets stale.
First, each user has a unique id in our system. When the socket connects, the client sends a login message, so I save the userId/socketId pair. When I get the disconnect event I clear that info.
Second, the user gets messages through Redis pub/sub (from another service), and I send each message to the client through the socketId I recorded. So when the socket connects I subscribe to a channel identified by the userId, and I unsubscribe from it in the disconnect event.
Third, my problem happens when I get a Redis pub event while the socketId value is stale. Actually the stored socketId is the latest one, but the socket may still fail to send the message.
io.on('connection', async function (socket) {
  socket.on('login', async function (user_id, fn) {
    ...
    await redis
      .multi()
      .hset('websocket:socket_user', socket.id, user_id)
      .hset(`websocket:${user_id}`, 'socketid', socket.id)
      .exec()
    await sub.subscribe(user_id)
    ...
  })

  socket.on('disconnect', async function (reason) {
    let user_id = await redis.hget('websocket:socket_user', socket.id)
    await redis.del(`websocket:${user_id}`)
    sub.unsubscribe(user_id)
    ...
  })
})

// Here is where the problem happens: the socket is trying to reconnect
// while I get a Redis pub message
sub.on('message', async function (channel, message) {
  let socketid = await redis.hget(`websocket:${channel}`, 'socketid')
  if (!socketid) {
    return
  }
  // so now I assume the socketid is valid,
  // but it turns out clients still complain they didn't receive the message
  io.to(socketid).emit('msg', message) // When it fails, how do I re-emit?
})
I would suggest you use your unique user id as a room; that is the best solution I have come up with. No matter how many times the user connects, they will only ever be joined to a single room (named after their unique id); a rough sketch follows the link below.
See my answer on
NodeJs Socket.io Rooms
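As a rough sketch of that room-based idea (not taken from the linked answer, and assuming the client sends its user id in a login event as in your code), the server joins the socket to a room named after the user id and always emits to that room instead of to a stored socket id:
io.on('connection', function (socket) {
  socket.on('login', function (user_id) {
    // Every socket belonging to this user ends up in the same room
    socket.join(`user:${user_id}`)
  })
})

// When a Redis message arrives, target the room, not a stored socket id
sub.on('message', function (channel, message) {
  // channel is assumed to be the user id, as in the question
  io.to(`user:${channel}`).emit('msg', message)
})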
You can generate a custom socket.id for your socket clients by overriding the default method that generates the socket id. In this case the socket.id is known to you, so you can re-emit to the same id when that client comes back online.
As per the socket.io docs, the function is called with a Node request object (http.IncomingMessage) as its first parameter:
io.engine.generateId = (req) => {
  return "custom:id:" + user_unique_id; // custom id must be unique
}
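A minimal sketch of one way to feed user_unique_id into that hook; the userId query parameter is an assumption and would have to be sent by the client when it connects:
const url = require('url')

io.engine.generateId = (req) => {
  // Assumption: the client connects with io('http://host', { query: { userId: '...' } })
  const userId = url.parse(req.url, true).query.userId
  return 'custom:id:' + userId // must stay unique per connected client
}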
I can't find any existing solution for this, so here is what I came up with:
Record the messages that fail to be sent to the socket.
I can't figure out a good way to deal with these failed messages, e.g. if they have already failed I don't know whether re-sending will make any difference.
When I get a socket connect event I check whether it comes from the same client; if yes, I check whether I have failed messages for the old socket, and if so I send those messages to the new socket.
How to record the failed messages is another thing I found socket.io doesn't support, so this is what I do:
Record the message I want to send.
Use the ack callback to delete the message, so the messages that are not deleted are the failed ones.
To use an ack I can't do io.to(socketId).emit('xxx'); because "acknowledgements are not supported when emitting from namespace", so I first need to get the socket object from its id using io.sockets.connected[socketid].
I could start a timer to check whether a message is still stored in Redis and re-emit it if so, but I don't know if re-emitting would make any difference, so I have not done that yet.
The code to record failed messages looks something like this:
let socket = io.sockets.connected[socketid]
if (!socket) return

let list_key = `websocket:${socketid}`
// store the message first, then remove it only when the client acks it
await redis.sadd(list_key, message)
socket.emit('msg', message, async (ack) => {
  await redis.srem(list_key, message)
})
If someone can come up with a better solution I am all ears! Thanks.
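A rough sketch of the re-delivery step described above; it assumes the failed-message set is keyed by user id rather than by socket id so that it survives the reconnect (the key name websocket:failed:${user_id} is made up for this example):
io.on('connection', function (socket) {
  socket.on('login', async function (user_id) {
    const failedKey = `websocket:failed:${user_id}` // hypothetical key, see above
    const pending = await redis.smembers(failedKey)
    for (const message of pending) {
      // re-emit and only remove the stored copy once the reconnected client acks it
      socket.emit('msg', message, async () => {
        await redis.srem(failedKey, message)
      })
    }
  })
})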

How to get socket.io broadcast messages on the client side?

I'm trying to send a broadcast message from the server to the clients, following the docs available at https://socket.io/docs/
It doesn't work.
I tried two forms on the server side:
io.emit('update_status',{ device: device.id, status: 'Dead'});
or
socket.broadcast.emit({ device: device.id, status: 'Dead'});
On the client side:
socket.on('update_status', function(data) {
  console.log('update_status', data);
});
or
socket.on('message', function(data) {
  console.log('message ', data);
});
No messages arrive on the client side.
I need to understand what I am doing wrong.
UPDATE
I found the solution with the help of @Azka. I do not know why, but using the socket variable right after initializing the socket on the client side works; I started receiving messages.
var socket = io();
socket.on('message', function (data) {
  console.log('message', data);
});
Another thing which helped me was enabling debug output on the client side.
Paste this into the browser console to log all incoming messages:
localStorage.debug = 'socket.io-client:socket';
The remaining problem is: after the socket was added to a room, it stopped receiving broadcasts.
Any suggestions?

Socket.io + Redis - clients are joining each other's "private" rooms

I've just started working with Socket.io and Redis for pub/sub messaging and it's pretty great. One important feature of my application is that the server needs to be able to broadcast messages to all subscribers of a room, and also choose 1 subscriber in that room and narrowcast a message just to them. For now, that subscriber is chosen at random. Based on reading socket.io's documentation, I think I can accomplish this.
However, I've come across something I don't understand. In Socket.io's Default Room documentation (https://socket.io/docs/rooms-and-namespaces/#default-room), they say that each socket automatically joins a room named after its socket ID. This looks like it would solve my narrowcast requirement -- look at the list of client IDs connected to my "big" room, choose one at random, and then send a message to the room with the same name as the chosen ID.
However, it doesn't work because for some reason all clients are joining each others' default rooms. I'm not seeing any exceptions in my code, but my "narrowcast" messages are going to all clients.
Here's my server-side code:
var io = require('socket.io');
var redisAdapter = require('socket.io-redis');

var server = io();
server.adapter(redisAdapter({ host: 'localhost', port: 6379 }));

server.on('connect', (socket) => {
  console.log(`${socket.id} connected!`);
  socket.join('new');
  server.emit('welcome', `Please give a warm welcome to ${socket.id}!`);
  server.to(socket.id).emit('private', 'Just between you and me, I think you are going to like it here');
});

server.listen(3000);

setInterval(whisperAtRandom, 2000);

function whisperAtRandom() {
  server.in('new').adapter.clients((err, clients) => {
    if (err) throw err;
    console.log('Clients on channel "new": ', clients);
    chosenOne = clients[Math.floor(Math.random() * clients.length)];
    console.log(`Whispering to ${chosenOne}`);
    server.to(chosenOne).emit('private', { message: `Psssst... hey there ${chosenOne}` });
    server.in(chosenOne).adapter.clients((err, clients) => {
      console.log(`Clients in ${chosenOne}: ${clients}`);
    });
  });
}
It stands up the server, listens on port 3000, and then every 2 seconds sends a message to a random client in the "new" room. It also logs the list of clients connected to the "new" room and to the chosen client's own (socket-id) room.
Here's the client-side code:
var sock = require('socket.io-client')('http://localhost:3000');
sock.connect();

sock.on('update', (data) => {
  console.log('Updating...');
  console.log(data);
});

sock.on('new', (data) => {
  console.log(`${sock.id} accepting request for new`);
  console.log(data.message);
});

sock.on('welcome', (data) => {
  console.log('A new challenger enters the ring');
  console.log(data);
});

sock.on('private', (data) => {
  console.log(`A private message for me, ${sock.id}???`);
  console.log(data);
});
My problem is that all my clients appear to be connected to each other's default socket-id rooms. Here's a sample from my logs:
0|socket-s | Clients on channel "new": [ 'dJaoZd6amTfdQy5NAAAA', 'bwG1yTT46dr5R_G6AAAB' ]
0|socket-s | Whispering to bwG1yTT46dr5R_G6AAAB
2|socket-c | A private message for me, dJaoZd6amTfdQy5NAAAA???
2|socket-c | Psssst... hey there bwG1yTT46dr5R_G6AAAB
1|socket-c | A private message for me, bwG1yTT46dr5R_G6AAAB???
1|socket-c | Psssst... hey there bwG1yTT46dr5R_G6AAAB
0|socket-s | Clients in bwG1yTT46dr5R_G6AAAB: dJaoZd6amTfdQy5NAAAA,bwG1yTT46dr5R_G6AAAB
You can see that the "private" message is received by both clients, dJaoZ... and bwG1y..., and that both clients are connected to the default room for bwG1y....
Why is this? Does it have something to do with the fact that both of my clients are running on the same machine (with different Node processes)? Am I missing something in Socket.io's documentation? Any help is appreciated!
PS -- to add even more confusion, the private messaging that happens in server.on('connect', ...) works! Each client receives the "Just between you and me ..." message exactly once, right after it connects to the server.
Your main problem could be caused by server.to in the whisperAtRandom function; use socket instead of server for the private message:
socket.to(chosenOne).emit('private', { message: `Psssst... hey there ${chosenOne}` });
To address your other problems, try changing these:
server.emit('welcome', `Please give a warm welcome to ${socket.id}!`);
server.to(socket.id).emit('private', 'Just between you and me, I think you are going to like it here');
To:
socket.broadcast.emit('welcome', `Please give a warm welcome to ${socket.id}!`);
socket.emit('private', 'Just between you and me, I think you are going to like it here');
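Putting those suggestions together, the connection handler from the question would look roughly like this (a sketch, not a drop-in fix):
server.on('connect', (socket) => {
  console.log(`${socket.id} connected!`);
  socket.join('new');

  // Announce the newcomer to everyone except the newcomer itself
  socket.broadcast.emit('welcome', `Please give a warm welcome to ${socket.id}!`);

  // Message only the newly connected client
  socket.emit('private', 'Just between you and me, I think you are going to like it here');
});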
Check out my Socket.IO Cheatsheet:
// Socket.IO Cheatsheet

// Add socket to room
socket.join('some room');

// Remove socket from room
socket.leave('some room');

// Send to current client
socket.emit('message', 'this is a test');

// Send to all clients, including sender
io.sockets.emit('message', 'this is a test');

// Send to all clients except sender
socket.broadcast.emit('message', 'this is a test');

// Send to all clients in 'game' room (channel) except sender
socket.broadcast.to('game').emit('message', 'this is a test');

// Send to all clients in 'game' room (channel), including sender
io.sockets.in('game').emit('message', 'this is a test');

// Send to an individual socket id
socket.to(socketId).emit('hey', 'I just met you');

Get MQTT message header in a node.js server with the Mosquitto broker

How can I get the header agent or device/browser details from an MQTT message with the Mosquitto broker? My MQTT code sample:
var mqtt = require('mqtt');
var client = mqtt.connect('mqtt://127.0.0.1:1883', {
  username: 'xxxx',
  password: 'xxxx'
});

client.on('connect', function (err, done) {
  if (err) {
    console.log(err)
  } else {
    console.log("Connected...")
    client.subscribe('test');
  }
})

client.on('message', function (topic, message) {
  // want to get the header details here.
})
You can not get anything other than the message payload and topic from an MQTT message, because no other information is included in the message format. This is by design: in pub/sub messaging the only things that matter are the topic and the payload, not who sent it.
The on('message', function(){}) callback can take a 3rd parameter, which is the raw mqtt-packet object. You can see the full list of available data in the doc here, but the only extra information is the duplicate flag, the QoS, and whether the message is retained.
client.on('message', function (topic, message, packet) {
  // packet is the raw mqtt-packet object (dup, qos, retain, ...)
  ...
});
If you need more information, you have to include it in the message payload yourself when the client publishes.
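For example, a minimal sketch of embedding device details in a JSON payload; the device and userAgent field names are made up for illustration:
// Publisher: wrap the real data together with the metadata you need
var payload = JSON.stringify({
  device: 'android-tablet',            // hypothetical metadata
  userAgent: 'MyApp/1.2 (Android 9)',  // hypothetical metadata
  data: 'actual message content'
});
client.publish('test', payload);

// Subscriber: parse it back out in the message handler
client.on('message', function (topic, message) {
  var parsed = JSON.parse(message.toString());
  console.log(parsed.device, parsed.userAgent, parsed.data);
});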

Get messages to persist in RabbitMQ when there are no consumers

My scenario: I want my app to publish logs to RabbitMQ and have another process consume those logs and write to a DB. Additionally, logs should persist in RabbitMQ even if there is no consumer at the moment. However, with the code I have now, my logs don't show up in RabbitMQ unless I start a consumer. What am I doing wrong?
My code:
var amqp = require('amqp');

var connection = amqp.createConnection({
  host: "localhost",
  port: 5672
});

connection.on('ready', function() {
  // Immediately publish
  setTimeout(function() {
    connection.publish('logs',
      new Buffer('hello world'), {},
      function(err, res) {
        console.log(err, '|', res);
      });
  }, 0);

  // Wait a second to subscribe
  setTimeout(function() {
    connection.queue('logs', function(q) {
      q.subscribe(function(message) {
        console.log(message.data);
      });
    });
  }, 1000);
});
The general setup with RabbitMQ is often for the publisher to declare an exchange and publish to it, and for the consumer to declare the same exchange (which just ensures it exists if it is already there, and creates it if the consumer starts first). That is not right for your usage: you need the queue to exist from the moment you start publishing to it.
The publisher must create the exchange and the queue, and the queue needs autoDelete=false; durable only helps if you plan to restart your RabbitMQ server. The publisher then publishes to the exchange, and the messages are delivered to the queue, where they wait for your consumer to connect and read everything it missed. The consumer must declare the queue with exactly the same parameters the producer used. Because the queue is autoDelete=false, it stays alive and retains its messages no matter when the consumer comes and goes.
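A rough sketch of that setup using the amqp package from the question; the exchange name, queue options, and routing key here are assumptions and must be matched exactly on the consumer side:
var amqp = require('amqp');

var connection = amqp.createConnection({ host: 'localhost', port: 5672 });

connection.on('ready', function() {
  // Declare the exchange ourselves instead of relying on the consumer to do it
  connection.exchange('logs-exchange', { type: 'direct', durable: true, autoDelete: false }, function(exchange) {
    // Declare the queue up front so published messages have somewhere to wait
    connection.queue('logs', { durable: true, autoDelete: false }, function(q) {
      q.bind('logs-exchange', 'logs');
      // deliveryMode: 2 marks the message itself as persistent
      exchange.publish('logs', 'hello world', { deliveryMode: 2 });
    });
  });
});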
