Fetching data from MQTT to Vue app instance - node.js

How can I fetch data from MQTT into a Vue app? I've established a properly working connection and I can console.log the data, but I'm not able to load the data into the component's data property.
created() {
  client.on("connect", function() {
    console.log("MQTT Connected");
    client.subscribe("#", function(err) {
      console.log(err);
    });
  });
  client.on("message", (topic, message) => {
    console.log("topic:", topic);
    console.log(message.toString());
    this.mqttData = JSON.parse(message.toString());
  });
},
data() {
  return {
    mqttData: {}
  }
};
Whenever I try to log mqttData in the console it appears to be an empty object. When I printed this inside the client.on callback I got the correct Vue instance with all of its fields and methods. This really bothers me because I can access the Vue object but I cannot modify its contents.

Maybe try this in the mounted lifecycle hook. Here's an example of something I use that listens to a websocket; the implementation should be similar for your application:
mounted() {
  let connection = new WebSocket('wss://somesocket.net/ws')
  connection.onmessage = (event) => {
    console.log("Data received!")
    console.log(event.data)
    const data = JSON.parse(event.data)
    this.ws_data = data
  }
  connection.onopen = (event) => {
    console.log("Connected! Waiting for data...")
  }
},
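The same pattern carries over to MQTT. A rough sketch with an mqtt.js client (the broker URL is a placeholder; the arrow functions keep this bound to the component instance):

mounted() {
  const client = mqtt.connect('ws://localhost:9001'); // placeholder URL
  client.on('connect', () => {
    console.log('MQTT Connected');
    client.subscribe('#', (err) => {
      if (err) console.log(err);
    });
  });
  client.on('message', (topic, message) => {
    // arrow function: `this` is the Vue component, so this assignment is reactive
    this.mqttData = JSON.parse(message.toString());
  });
},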

This is how I did it using the vue-mqtt package.
export default {
  data () {
    return {
      sensor: ''
    }
  },
  mqtt: {
    /** Read incoming messages from topic test */
    'test' (data) {
      this.sensor = data
      console.log(data)
    }
  },
  created () {
  },
  async mounted () {
    this.$mqtt = await this.$mqtt
    this.$mqtt.publish('test', 'hello from vuemqtt yo !!!!')
    this.$mqtt.subscribe('test')
  }
}
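Note that this assumes the vue-mqtt plugin was registered beforehand, roughly along these lines (the broker URL is a placeholder; check the vue-mqtt README for the exact options):

import Vue from 'vue'
import VueMqtt from 'vue-mqtt'

// wires up $mqtt on every component instance
Vue.use(VueMqtt, 'ws://localhost:9001')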

Related

NodeJS net.socket pipeline transform while loop not wait

My transform pipeline does not wait until the message-buffer loop has finished, so the destination does not receive the messages individually. A message only goes out when another message triggers it, and the two are then sent together; the destination receives both messages at once.
How can I make it wait until the message buffer has been read to the end, so that individual messages can be sent out, please?
Start.js
import Proxy from "./Proxy.js";
const proxy = Proxy();
proxy.listen('4000');
Proxy.js
import net from "net";
import { pipeline } from "stream";
import { filter } from "../utils/transformStream.js";

export default () =>
  net.createServer((con1) => {
    const con2 = net.connect('1234', '127.0.0.1');
    pipeline(con1, con2, filter(), con1, (err) => {
      console.log("pipeline closed", err);
    });
    con2.on("data", (data) => {
      console.log("sent from con2:", data);
    });
    con1.on("data", (data) => {
      console.log("sent from con1:", data);
    });
  });
transformStream.js
import { Transform } from 'stream';
import { messageBuffer2 } from './msgBuffer2.js';
const filter = () => {
  return new Transform({
    async transform(chunk, encoding, cb) {
      if (chunk.toString("utf-8").includes("Unprocessed")) {
        cb(null, '');
      } else {
        messageBuffer2.push(chunk.toString("utf-8"));
        while (!messageBuffer2.isFinished()) {
          const message = messageBuffer2.handleData();
          cb(null, message);
        }
      }
    }
  });
};
export { filter };
Thanks.
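One likely culprit in transformStream.js, for what it's worth: the transform callback cb may only be called once per chunk, but the while loop above calls it once per message. Node streams let you call this.push() any number of times and then signal completion with a single cb(). A sketch of a possible fix (untested against the messageBuffer2 implementation):

const filter = () => {
  return new Transform({
    transform(chunk, encoding, cb) {
      if (chunk.toString("utf-8").includes("Unprocessed")) {
        cb();
      } else {
        messageBuffer2.push(chunk.toString("utf-8"));
        // emit each buffered message as its own chunk...
        while (!messageBuffer2.isFinished()) {
          this.push(messageBuffer2.handleData());
        }
        // ...and report "chunk processed" exactly once
        cb();
      }
    }
  });
};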

How to stream json data from one nodejs server to another and process it at the receiver at runtime?

What I'm basically trying to achieve is to get all items of a MongoDB collection on one Node.js server, stream these items (in JSON format) over REST to another Node.js server, and pipe the incoming read stream into stream-json so the parsed objects can be persisted in another MongoDB afterwards.
(I need to use streams because the items can be deeply nested objects, which would otherwise consume a lot of memory. Additionally, I'm unable to access the first MongoDB from the second server directly due to strict network segmentation.)
Well, the code I have so far actually already works for smaller amounts of data, but one collection holds about 1.2 GB, and with it the processing on the receiving side keeps failing.
Here's the code of the sending server:
export const streamData = async (res: Response) => {
  try {
    res.type('json');
    const amountOfItems = await MyModel.count();
    if (JSON.stringify(amountOfItems) !== '0') {
      const cursor = MyModel.find().cursor();
      let first = true;
      cursor.on('error', (err) => {
        logger.error(err);
      });
      cursor.on('data', (doc) => {
        if (first) {
          // open json array
          res.write('[');
          first = false;
        } else {
          // add the delimiter before every object that isn't the first
          res.write(',');
        }
        // add json object
        res.write(`${JSON.stringify(doc)}`);
      });
      cursor.on('end', () => {
        // close json array
        res.write(']');
        res.end();
        logger.info('REST-API-Call to fetchAllItems: Streamed all items to the receiver.');
      });
    } else {
      res.write('[]');
      res.end();
      logger.info('REST-API-Call to fetchAllItems: Streamed an empty response to the receiver.');
    }
  } catch (err) {
    logger.error(err);
    return [];
  }
};
And that's the receiving side:
import { MyModel } from '../models/my-model';
import axios from 'axios';
import { logger } from '../services/logger';
import StreamArray from 'stream-json';
import { streamArray } from 'stream-json/streamers/StreamArray';
import { pipeline } from 'stream';

const persistItems = async (items: Item[], ip: string) => {
  try {
    await MyModel.bulkWrite(items.map(item => {
      return {
        updateOne: {
          filter: { 'itemId': item.itemId },
          update: item,
          upsert: true,
        },
      };
    }));
    logger.info(`${ip}: Successfully upserted items to mongoDB`);
  } catch (err) {
    logger.error(`${ip}: Upserting items to mongoDB failed due to the following error: ${err}`);
  }
};

const getAndPersistDataStream = async (ip: string) => {
  try {
    const axiosStream = await axios(`http://${ip}:${process.env.PORT}/api/items`, { responseType: 'stream' });
    const jsonStream = StreamArray.parser({ jsonStreaming: true });
    let items: Item[] = [];
    const stream = pipeline(axiosStream.data, jsonStream, streamArray(),
      (error) => {
        if (error) {
          logger.error(`Error: ${error}`);
        } else {
          logger.info('Pipeline successful');
        }
      },
    );
    stream.on('data', (i: any) => {
      items.push(<Item>i.value);
      // wait until the array contains 500 objects, then bulkWrite them to the database
      if (items.length === 500) {
        persistItems(items, ip);
        items = [];
      }
    });
    stream.on('end', () => {
      // bulkWrite the last items to the mongodb
      persistItems(items, ip);
    });
    stream.on('error', (err: any) => {
      logger.error(err);
    });
    await new Promise(fulfill => stream.on('finish', fulfill));
  } catch (err) {
    if (err) {
      logger.error(err);
    }
  }
};
As I said, the problem occurs only with the bigger collection holding about 1.2 GB of data.
It seems to occur a few seconds after the sending server closes the stream.
This is the error message I get at the receiving server:
ERROR: Premature close
err: {
"type": "NodeError",
"message": "Premature close",
"stack":
Error [ERR_STREAM_PREMATURE_CLOSE]: Premature close
at IncomingMessage.onclose (internal/streams/end-of-stream.js:75:15)
at IncomingMessage.emit (events.js:314:20)
at Socket.socketCloseListener (_http_client.js:384:11)
at Socket.emit (events.js:326:22)
at TCP.<anonymous> (net.js:676:12)
"code": "ERR_STREAM_PREMATURE_CLOSE"
}
Can I somehow prevent the read stream from closing too early?
The only workaround I can imagine right now is to save the stream locally to a file first, then create a new read stream from that file, process/persist the data, and delete the file afterwards, although I would prefer not to do that. Additionally, I'm not quite sure whether that would even work, or whether the closing-read-stream issue would remain when saving a large dataset to a file.
Edit: Well, as I guessed, this approach results in the same error.
Is there a better approach I'm not aware of?
Thanks in advance!
Found a solution using a combination of websockets with the stream API, and websocket-express to trigger the streaming over websockets via routes.
Backend
app.ts
import router from './router/router';
import WebSocketExpress from 'websocket-express';

const app = new WebSocketExpress();
const port = `${process.env.APPLICATION_PORT}`;

app.use(router);
app.listen(port, () => {
  console.log(`App listening on port ${port}!`);
});
router.ts
import { Router } from 'websocket-express';
import streamData from './streamData';

const router = new Router();
router.ws('/my/api/path', streamData);
export default router;
streamData.ts (refactored a bit from the version above)
import { MyModel } from '../models/my-model';
import { createWebSocketStream } from 'ws';

export const streamData = async (res: Response) => {
  const ws = await res.accept();
  try {
    const duplex = createWebSocketStream(ws, { encoding: 'utf8' });
    duplex.write('[');
    let prevDoc: any = null;
    // ignore _id since it's going to be upserted into another database
    const cursor = MyModel.find({}, { _id: 0 }).cursor();
    cursor.on('data', (doc) => {
      if (prevDoc) {
        duplex.write(`${JSON.stringify(prevDoc)},`);
      }
      prevDoc = doc;
    });
    cursor.on('end', () => {
      if (prevDoc) {
        duplex.write(`${JSON.stringify(prevDoc)}`);
      }
      duplex.end(']');
    });
    cursor.on('error', (err) => {
      ws.close();
    });
    duplex.on('error', (err) => {
      ws.close();
      cursor.close();
    });
  } catch (err) {
    ws.close();
  }
};
Client (or the receiving server)
import { MyModel } from '../models/my-model';
import StreamArray from 'stream-json';
import { streamArray } from 'stream-json/streamers/StreamArray';
import { pipeline } from 'stream';
import WebSocket, { createWebSocketStream } from 'ws';

export const getAndPersistDataStream = async (ip: string) => {
  try {
    const ws = new WebSocket(`ws://${ip}:${process.env.PORT}/my/api/path`);
    try {
      const duplex = createWebSocketStream(ws, { encoding: 'utf8' });
      const jsonStream = StreamArray.parser({ jsonStreaming: true });
      let items: Items[] = [];
      const stream = pipeline(duplex, jsonStream, streamArray(), error => {
        if (error) {
          ws.close();
        }
      });
      stream.on('data', (i: any) => {
        items.push(<Items>i.value);
        if (items.length === 500) {
          persistItems(items, ip);
          items = [];
        }
      });
      stream.on('end', () => {
        persistItems(items, ip);
        ws.close();
      });
      stream.on('error', (err: any) => {
        ws.close();
      });
      await new Promise(fulfill => stream.on('finish', fulfill));
    } catch (err) {
      ws.close();
    }
  } catch (err) {
  }
};
(I removed a lot of the error-logging, which is why the catch block is empty...)

Socket.io and React: Warning: Can't perform a React state update on an unmounted component

I am a relative newcomer to React, implementing a chat app with React, socket.io, and Node.
When the user is chatting with someone, they connect to a socket. This works fine, but if the user leaves for another page and RE-ENTERS the same chat again, a second socket is connected (bad), and I get the "Can't perform a React state update on an unmounted component" warning.
I did implement code to have the socket leave the room (in the second useEffect) when the user leaves the page, but that doesn't seem to work.
Thanks in advance,
Partial React frontend code:
useEffect(() => {
  socket.emit("enter chatroom", { conversationId: props.chatId });
  setChatId(props.chatId);
  getMessagesInConversation(props.chatId, (err: Error, result: any) => {
    if (err) {
      setCerror(err.message);
    } else {
      setMessages(buildMessages(result.messages).reverse());
    }
  });
  socket.on("received", (data: any) => {
    setMessages((messages: any) => {
      console.log("messages");
      console.log(messages);
      return [...messages.slice(-4), ...buildMessages(data.newMsg)];
    });
  });
}, []);

useEffect(() => {
  return () => {
    socket.emit("leaveChatroom", {
      conversationId: chatId,
    });
  };
}, [chatId]);
Simplified Node.js code:
private ioMessage = () => {
  this._io.on("connection", (socket) => {
    console.log("socket io connected");
    const socketUser = socket.request.user;
    socket.on("enter chatroom", (data) => {
      const room_status = Object.keys(socket.rooms).includes(
        data.conversationId
      );
      if (!room_status) { // should only join if socket is not already joined
        socket.join(data.conversationId);
      }
    });
    socket.on("leaveChatroom", (data) => {
      socket.leave(data.conversationId);
    });
    socket.on("chat message", async (msg) => {
      const database = getDB();
      const newMessage = {
        ...msg,
        createdAt: Date.now(),
      };
      await database.collection("message").insertOne(newMessage);
      const messages = await database
        .collection("message")
        .find({ conversationId: msg.conversationId })
        .toArray();
      this._io.to(msg.conversationId).emit("received", {
        newMsg: [messages[messages.length - 1]],
      });
    });
    socket.on("disconnect", (socket) => {
      delete this._users[socketUser.userId];
      console.log("--- disconnect : ", socketUser);
      console.log("--- active users: ", this._users);
    });
  });
};
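One thing worth checking in the frontend snippet above (a sketch, not a verified fix for this exact app): the "received" listener registered in the first useEffect is never detached, so re-entering the chat adds a second listener to the same shared socket, and the stale one keeps calling setMessages after the component has unmounted. The usual pattern is to remove the handler in the effect's cleanup:

useEffect(() => {
  const onReceived = (data: any) => {
    setMessages((messages: any) => [
      ...messages.slice(-4),
      ...buildMessages(data.newMsg),
    ]);
  };
  socket.on("received", onReceived);
  return () => {
    // detach this component's handler on unmount so no state update
    // can fire after the component is gone
    socket.off("received", onReceived);
  };
}, []);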

Acknowledge RabbitMQ message after socket IO event received from React browser

I have a Node server which consumes messages from a RabbitMQ queue and forwards them to a React frontend as a socket.io event. In the frontend, I have a button click which sends a socket.io event back to the Node server.
Currently, the Node server only logs the receipt of the socket.io event. In addition to logging, I would like to send a message ack to the RabbitMQ server upon receipt of the socket.io event.
The logging is working fine, but I've been struggling with the message acknowledgement part.
My node server looks like this:
server.js
const io = require('./socket');
const amqp = require('amqplib/callback_api');
const CONFIG = require('./config.json');
amqp.connect(`amqp://${CONFIG.host}`, (err, connection) => {
  if (err) {
    throw err;
  }
  connection.createChannel((err, channel) => {
    if (err) {
      throw err;
    }
    const queue = CONFIG.queueName;
    channel.assertQueue(queue, {
      durable: true
    });
    console.log(` [*] Waiting for messages in ${queue}.`);
    channel.consume(queue, function(msg) {
      console.log(' [x] Request received from RabbitMQ: %s', msg.content.toString());
      io.client.emit('sendReview', msg.content.toString());
    }, {
      noAck: false
    });
  });
});
socket.js
const io = require('socket.io')();
const port = process.env.PORT || 8000;

module.exports = {
  client: io.on('connection', (client) => {
    console.log(' [*] New client connected with ID: ' + client.id);
    client.on('reportReview', (msg) => { console.log(` [x] Response received from browser: ${msg}`); });
    client.on('disconnect', () => console.log(` [*] User ${client.id} disconnected.`));
  })
};

io.listen(port);
console.log(`Listening on port ${port}`);
My frontend looks like this:
App.js
import React, { Component } from "react";
import * as API from './api';

export default class App extends Component {
  constructor(props, context) {
    super(props, context);
    this.state = {
      data: ["Whoops - no reviews available"],
    };
    this.updateReview = this.updateReview.bind(this);
    this.onMessageReceived = this.onMessageReceived.bind(this);
    this.handleClick = this.handleClick.bind(this);
  }

  handleClick() {
    API.reportClick(this.state.data[0]);
    this.updateReview();
  }

  updateReview() {
    const newArray = this.state.data.slice(1);
    if (newArray.length === 0) {
      this.setState({ data: ["Whoops - no reviews available"] });
    } else {
      this.setState({ data: newArray });
    }
  }

  onMessageReceived(msg) {
    console.log(`Request for review received: ${msg}`);
    const updatedData = this.state.data.concat(msg);
    this.setState({ data: updatedData });
    if (this.state.data[0] === "Whoops - no reviews available") {
      this.updateReview();
    }
  }

  componentDidMount() {
    API.subscribe(this.onMessageReceived);
  }

  render() {
    return (
      <div className="App">
        <p className="App-intro">
          Click to confirm review #: {this.state.data[0]}
        </p>
        <button onClick={this.handleClick}>Click</button>
      </div>
    );
  }
}
Api.js
import clientSocket from 'socket.io-client';

const socket = clientSocket('http://localhost:8000');

function subscribe(onMessageReceived) {
  socket.on('sendReview', onMessageReceived);
}

function reportClick(msg) {
  socket.emit('reportReview', msg);
}

export { reportClick, subscribe };
As far as I understand it, in order to send a message ack I would have to call channel.ack(msg); somewhere on the Node server. However, I am not sure how to pass the channel object to the io module. I have also tried putting the socket.io code in server.js so I would have access to the channel object, but I have not been able to get that to work either; I have not managed to get the amqp connection and the socket.io connection to work together other than with my current approach of a separate io module.
Any help would be very much appreciated.
I ended up getting it to work by having the socket code in server.js like this:
const io = require('socket.io')();

function socketIOHandler(callback) {
  io.on('connection', (socket) => {
    socket.on('error', function(err) {
      console.log(err.stack);
    });
    callback(socket);
  });
}

var amqpConn = null;

// start amqp connection to rabbit mq
function start() {
  amqp.connect(`amqp://${CONFIG.host}`, (err, connection) => {
    if (err) {
      throw err;
    }
    amqpConn = connection;
    // start consume worker when connected
    startWorker();
  });
}

function startWorker() {
  socketIOHandler((socket) => {
    amqpConn.createChannel((error, channel) => {
      // ... <---- all the bits as before
      socket.on('msgSent', (msg) => {
        channel.ack(msg);
      });
    });
  });
  io.listen(port);
}

start();

Node/RabbitMQ - Send consumer response to nodejs route

I am handling Node.js requests through RabbitMQ. My producer receives requests through a Node.js route and sends them to the consumer, which then creates a document in the DB from the data received in the request.
Here is my route
router.post("/create-user", async (req: Request, res: Response) => {
  const msg = JSON.stringify(req.body);
  const send = await Producer(msg);
});
Here is my Producer class
import amqp from "amqplib/callback_api";

export async function Producer(message: string) {
  amqp.connect("amqp://localhost", (error0, connection) => {
    if (error0) {
      throw error0;
    }
    connection.createChannel((error1, channel) => {
      if (error1) {
        throw error1;
      }
      let queue = "hello";
      channel.assertQueue(queue, {
        durable: false,
      });
      channel.sendToQueue(queue, Buffer.from(message));
      console.log(" [x] Sent %s", message);
    });
  });
}
And my consumer
import amqp from "amqplib/callback_api";
import { User } from "../models/user";

export class ConsumerClass {
  public async ConsumerConnection() {
    amqp.connect("amqp://localhost", (error0, connection) => {
      if (error0) {
        throw error0;
      } else {
        this.ConsumerTask(connection);
      }
    });
  }

  public async ConsumerTask(connection: amqp.Connection) {
    connection.createChannel((error1, channel) => {
      if (error1) {
        throw error1;
      }
      let queue = "hello";
      channel.assertQueue(queue, {
        durable: false,
      });
      channel.prefetch(1);
      console.log(" [*] Waiting for messages in %s. To exit press CTRL+C", queue);
      channel.consume(queue, async (msg) => {
        console.log(" [x] Received %s", msg.content.toString());
        const data = JSON.parse(msg.content.toString());
        const user = new User({
          name: data.name,
          phone: data.phone,
          company: data.company,
        });
        await user.save();
      }, {
        noAck: true,
      });
    });
  }
}
I want to send the JSON document of the created user from the consumer back to the route, so that the client can get the created user as a response. How can I achieve this, and what am I doing wrong?
What you want is a response event from the consumer back to the producer. This is where you can create a function that acts as a Remote Procedure Call (RPC).
So instead of a single fire-and-forget event, there will be two events, e1 (the request) and e2 (the response). Here's a small diagram to explain this stuff (disclaimer: I am bad at drawing). I guess you can manage the coding part of this.
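For reference, a minimal sketch of what that coding part could look like with amqplib's callback API (the queue name "hello" matches the snippets above; everything else is illustrative): the producer publishes the request (e1) with a replyTo queue and a correlationId, and the consumer publishes the created user back to that queue (e2).

// producer side (e1): publish the request and wait for the matching reply
import amqp from "amqplib/callback_api";
import { randomUUID } from "crypto";

export function ProducerRpc(message: string, onResponse: (reply: string) => void) {
  amqp.connect("amqp://localhost", (error0, connection) => {
    if (error0) throw error0;
    connection.createChannel((error1, channel) => {
      if (error1) throw error1;
      // exclusive, server-named queue that receives the consumer's reply (e2)
      channel.assertQueue("", { exclusive: true }, (error2, q) => {
        if (error2) throw error2;
        const correlationId = randomUUID();
        channel.consume(q.queue, (msg) => {
          // only react to the reply that belongs to this request
          if (msg && msg.properties.correlationId === correlationId) {
            onResponse(msg.content.toString());
          }
        }, { noAck: true });
        // the request (e1), carrying the address for the reply
        channel.sendToQueue("hello", Buffer.from(message), {
          correlationId,
          replyTo: q.queue,
        });
      });
    });
  });
}

On the consumer side, the channel.consume callback would send the saved user back after user.save():

// consumer side (e2): reply to the queue named in the request's properties
channel.consume(queue, async (msg) => {
  const data = JSON.parse(msg.content.toString());
  const user = new User({ name: data.name, phone: data.phone, company: data.company });
  await user.save();
  channel.sendToQueue(
    msg.properties.replyTo,
    Buffer.from(JSON.stringify(user)),
    { correlationId: msg.properties.correlationId }
  );
}, { noAck: true });

In the route, the response can then be sent inside the callback, e.g. ProducerRpc(msg, (reply) => res.send(reply)).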
