Node Stream to emulate duplex socket - node.js

I'm using Node with the mqtt-connection and aedes MQTT libraries. I am using aedes to run an MQTT server, and I wish to connect to that server from within Node using a stream, rather than opening a TCP socket. Both libraries will accept a Duplex stream.
Why does something like this work:
const mqttCon = require('mqtt-connection');
const duplex = require('net').createConnection(1883);
const client = mqttCon(duplex, {
  protocolVersion: 3
});
While something like this fails (closes the stream) after the first data exchange?
const stream = require('stream');
const mqttCon = require('mqtt-connection');
const duplex = new stream.Transform();
duplex._transform = (chunk, encoding, callback) => {
  duplex.push(chunk);
  console.log(chunk);
  callback();
};
const client = mqttCon(duplex, {
  protocolVersion: 3
});
aedes.handle(duplex);
I feel like I must have some fundamental misconception about how streams are supposed to work. Basically I want to create something that acts like a TCP socket, allowing these two "processes" to communicate internally within Node.
Typical use of aedes to create an MQTT server would look like this:
const aedes = require('aedes')({
  concurrency: 500,
  maxClientsIdLength: 100
});
const server = require('net').createServer(aedes.handle);
Edit: More details about the failure.
The client feeds data to the stream, then as soon as aedes finishes responding, duplex.on('close') fires. There are no error messages and no indication from either side that there has been an error; the stream just closes, so each side then shuts down gracefully. I'm guessing that one side or the other sees an "end" to the stream, so it closes.

The problem is that "duplex" is a single stream where it really needs two streams bridged: with a single Transform, both sides write into and read from the same stream, so each side can read back its own frames instead of the other side's. Here is what works:
const duplexAedes = new stream.Duplex({
  write: function (chunk, encoding, next) {
    setImmediate(function () {
      duplexClient.push(chunk);
    });
    next();
  },
  read: function (size) {
    // Placeholder
  }
});
const duplexClient = new stream.Duplex({
  write: function (chunk, encoding, next) {
    setImmediate(function () {
      duplexAedes.push(chunk);
    });
    next();
  },
  read: function (size) {
    // Placeholder
  }
});
duplexAedes.authentication = clientId;
const client = mqttCon(duplexClient, {
  protocolVersion: 3
});
aedes_handle(duplexAedes);
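Not from the original post, but the same bridging can be wrapped in a small reusable helper. This is a minimal sketch using only the built-in stream module; the names makeDuplexPair, clientSide, and serverSide are illustrative:

const stream = require('stream');

// Returns two Duplex streams wired back-to-back: whatever is written
// to one side is pushed to the readable side of the other, like the
// two ends of an in-memory TCP connection.
function makeDuplexPair() {
  const a = new stream.Duplex({
    write(chunk, encoding, next) {
      setImmediate(() => b.push(chunk)); // a's writes become b's reads
      next();
    },
    read() {} // data is pushed from the peer's write()
  });
  const b = new stream.Duplex({
    write(chunk, encoding, next) {
      setImmediate(() => a.push(chunk)); // b's writes become a's reads
      next();
    },
    read() {}
  });
  return { clientSide: a, serverSide: b };
}

// Usage: hand one end to mqtt-connection and the other to aedes.
const { clientSide, serverSide } = makeDuplexPair();
const client = mqttCon(clientSide, { protocolVersion: 3 });
aedes.handle(serverSide);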

Related

Advanced - Socket.io Join Room in external file on HTTP request

I am getting confused with this Node.js, Angular 13, and Socket.IO scenario.
First of all, let's assume we are already saving all required info in a database, like roomId, roomOwner, username, etc.
So, let's say we want to create an online quiz game using sockets to sync all players, 6 max for this scenario. HOWEVER, this is the problem...
In the Angular code there is this service which connects the client with the back end:
SocketService.ts
import { io } from "socket.io-client";

export class SocketService {
  socket: any;
  readonly url: string = "ws://localhost:3000";
  constructor() {
    this.socket = io(this.url);
  }
}
On the server side, index.js initializes the webSocket module:
index.js
const express = require('express');
const app = express();
const io = require('./sockets/websocket')(app);
Inside webSocket.js we create the Socket.IO instance to be exported and used across all the back-end controllers as needed:
webSocket.js
module.exports = function(app){
  this.server = require('http').createServer(app);
  this.socket = require('socket.io');
  this.io = socket(server, {
    cors: {
      origin: "https://localhost:4200",
      credentials: true
    }
  });
  this.server.listen(3000, () => {
    console.log("Socket IO is listening on port 3000");
  });
  io.on("connection", function (socket) {
    console.log("A user connected");
  });
  this.registerSocketToRoom = function(roomId){
    try {
      console.log('[socket]', 'join room :', roomId);
      io.join(roomId);
      io.sockets.to(roomId).emit('user joined', socket.id);
    } catch(e) {
      console.log('[error]', 'join room :', e);
      io.emit('error', 'couldnt perform requested action');
    }
  };
};
This is an example controller. We import the Socket.IO instance exported from the webSocket.js file. Let's say we want to join a room when the client makes an HTTP request to join it. HOWEVER, WE DID NOT join the room "on socket connection", so we have to do it now. We try to use the exported method registerSocketToRoom.
GameRoomManagerController.js
require('../../sockets/websocket');
... // Some code here
exports.joinGameRoom = function(req, res){
  const roomId = req.params.roomId;
  console.log(roomId);
  registerSocketToRoom(roomId);
  return res.send({ status: "success", msg: `joined Room: ${roomId}` });
};
When executing the process of creating a room -> saving the info to the DB -> joining the room, the following error occurs:
TypeError: io.sockets.join is not a function
In theory this sounds right to me, but I think I am misunderstanding the difference between io and socket.
Can someone explain to me what's going on here? Is it even possible to export the same instance of io to be used in any place of the back-end?
Is it even possible to join a room AFTER the connection was created?
What's the difference between io and socket?
Before starting on the topic, it is better to get acquainted with some terms from the socket.io library.
io
In fact, it refers to all sockets connected to the server. You can send messages individually, in groups, or to all sockets.
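For instance, the different emit scopes look like this (a minimal sketch; the 'msg' event name is illustrative):

io.on('connection', socket => {
  socket.emit('msg', 'only this socket');           // one client
  socket.broadcast.emit('msg', 'everyone else');    // all except the sender
  io.to('room1').emit('msg', 'everyone in room1');  // all sockets in a room
  io.emit('msg', 'every connected socket');         // all clients
});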
Your idea of the socket is the one written this way:
io.on('connection', socket => {
  socket.on('message', data => {
  });
});
Inside such a handler you can only read the information related to that event, or transfer that information between sockets.
Well, now we are going to solve this problem. The reason for this error is that your code does not follow this hierarchy: join and leave are methods of an individual socket, not of io. I suggest you refer to the socket.io documentation next time and strengthen your foundation.
And finally, I will provide you with a simple example of the correct implementation method
let app = require('express')(),
  http = require('http').Server(app),
  io = require('socket.io')(http);
let listOfRoom = [];
io.on('connection', socket => {
  let joinUserInRoom = (roomId) => {
      if (socket.adapter.rooms.has(roomId) === false) {
        listOfRoom.push(roomId);
        socket.join(roomId);
      }
    },
    leaveUserInRoom = (roomId) => {
      if (listOfRoom.includes(roomId)) {
        listOfRoom.splice(listOfRoom.indexOf(roomId), 1);
        socket.leave(roomId);
      }
    };
  socket.on('joinRoom', data => {
    joinUserInRoom(data.roomId);
  });
  // Note: the 'disconnect' event receives a reason string, not custom
  // data, so the room id would have to be tracked per socket.
  socket.on('disconnect', data => {
    leaveUserInRoom(data.roomId);
  });
  socket.on('messageRoom', data => {
    io.to(data.roomId).emit('eventMessageRoom', data); // send data to a specific room
  });
});
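As for sharing one io instance across the back end: one common pattern, sketched here (not part of the original answer; the module path sockets/io.js and the helper names are illustrative), is to create the instance once and export an accessor:

// sockets/io.js
let io = null;

module.exports.init = function (server) {
  io = require('socket.io')(server, {
    cors: { origin: "https://localhost:4200", credentials: true }
  });
  return io;
};

module.exports.getIO = function () {
  if (!io) throw new Error('socket.io has not been initialized');
  return io;
};

Joining a room after the connection was created is also possible from an HTTP controller, provided the request carries the socket id (for example in the body): in socket.io v4, getIO().sockets.sockets.get(socketId) returns the Socket instance, and calling .join(roomId) on it works at any time.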

Socket IO not sending new sql data after client connects

I am creating a real-time app using Socket.IO, Node.js, Express.js, React for the frontend, and Microsoft SQL for the database. I only want to send data when the database is updated or when a new client connects. When a client first connects, the connection handler fires and sends my data to the new client, but when I make a change to my database, the data never gets sent. My code is below. I feel as though I am close, but I am just missing something that makes the code work. I appreciate any kind of help.
const app = express();
const httpServer = require('http').createServer(app);
const io = require('socket.io')(httpServer);
const path = __dirname + '/views/';
let sqlQuery = require('./controllers/sqlController').queryDatabase;
let currentQueryData = {};
let connectedSocketID;
let objectMatched = true;
app.use(express.static(path));
app.get('/', function (req, res) {
  res.sendFile(path + "index.html");
});
// Function to emit data only when a change is found.
const sendData = (data, socket) => {
  socket.emit('markerCreation', data);
};
// Compare both objects and return a boolean value.
const compareObjects = (object1, object2) => {
  return JSON.stringify(object1) === JSON.stringify(object2);
};
httpServer.listen(3001, () => {
  console.log(`Server listening at ${3001}`);
});
io.on('connection', async socket => {
  // Get new query data, then compare it with the currently saved query data.
  let newQueryData = await sqlQuery();
  objectMatched = compareObjects(currentQueryData, newQueryData);
  if (!objectMatched) { // If the objects don't match, save the new data in currentQueryData and send it to the client.
    currentQueryData = newQueryData;
    sendData(currentQueryData, socket);
  } else if (connectedSocketID !== socket.id) { // If this socket is not already connected, save it and send data to the client.
    connectedSocketID = socket.id;
    sendData(currentQueryData, socket);
  }
  // Issue: Socket IO will stop sending to the connected client. If a new update
  // happens on the SQL database, the change isn't passed along to the client.
});
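No answer is recorded for this one, but the symptom follows from the code above: the query runs only inside the 'connection' handler, so it executes once per new client and never again afterwards. A common fix, sketched below under the assumption that sqlQuery() can safely be called repeatedly, is to poll outside the handler and broadcast changes to every connected client (the 5-second interval is arbitrary):

// Poll the database periodically and broadcast when something changed.
setInterval(async () => {
  const newQueryData = await sqlQuery();
  if (!compareObjects(currentQueryData, newQueryData)) {
    currentQueryData = newQueryData;
    io.emit('markerCreation', currentQueryData); // all connected clients
  }
}, 5000);

io.on('connection', (socket) => {
  // New clients still receive the current snapshot immediately.
  socket.emit('markerCreation', currentQueryData);
});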

How to disconnect a socket after streaming data?

I am making use of "socket.io-client" and "socket.io-stream" to make a request and then stream some data. I have the following code that handles this logic.
Client Server Logic
router.get('/writeData', function(req, res) {
  var io = req.app.get('socketio');
  var nameNodeSocket = io.connect(NAMENODE_ADDRESS, { reconnect: true });
  var nameNodeData = {};
  async.waterfall([
    checkForDataNodes,
    readFileFromS3
  ], function(err, result) {
    if (err !== null) {
      res.json(err);
    } else {
      res.json("Finished Writing to DN's");
    }
  });
  function checkForDataNodes(cb) {
    nameNodeSocket.on('nameNodeData', function(data) {
      nameNodeData = data;
      console.log(nameNodeData);
      cb(null, nameNodeData);
    });
    if (nameNodeData.numDataNodes === 0) {
      cb("No datanodes found");
    }
  }
  function readFileFromS3(nameNodeData, cb) {
    for (var i in nameNodeData['blockToDataNodes']) {
      var IP = nameNodeData['blockToDataNodes'][i]['ipValue'];
      var dataNodeSocket = io.connect('http://' + IP + ":5000");
      var ss = require("socket.io-stream");
      var stream = ss.createStream();
      var byteStartRange = nameNodeData['blockToDataNodes'][i]['byteStart'];
      var byteStopRange = nameNodeData['blockToDataNodes'][i]['byteStop'];
      paramsWithRange['Range'] = "bytes=" + byteStartRange.toString() + "-" + byteStopRange.toString();
      //var file = require('fs').createWriteStream('testFile' + i + '.txt');
      var getFileName = nameNodeData['blockToDataNodes'][i]['key'].split('/');
      var fileData = {
        'mainFile': paramsWithRange['Key'].split('/')[1],
        'blockName': getFileName[1]
      };
      ss(dataNodeSocket).emit('sendData', stream, fileData);
      s3.getObject(paramsWithRange).createReadStream().pipe(stream);
      //dataNodeSocket.disconnect();
    }
    cb(null);
  }
});
Server Logic (that gets the data)
var dataNodeIO = require('socket.io')(server);
var ss = require("socket.io-stream");
dataNodeIO.on('connection', function(socket) {
  console.log("Successfully connected!");
  ss(socket).on('sendData', function(stream, data) {
    var IP = data['ipValue'];
    var blockName = data['blockName'];
    var mainFile = data['mainFile'];
    dataNode.makeDir(mainFile);
    dataNode.addToReport(mainFile, blockName);
    stream.pipe(fs.createWriteStream(mainFile + '/' + blockName));
  });
});
How can I properly disconnect the connections in the function readFileFromS3? I have noticed that using dataNodeSocket.disconnect() at the end does not work, as I cannot verify the data was received on the second server. But if I comment it out, I can see the data being streamed to the second server.
My objective is to close the connections on the client server side.
It appears that the main problem with closing the socket is that you weren't waiting for the stream to be done writing before trying to close the socket. So, because the writing is all asynchronous and finishes sometime later, you were trying to close the socket before the data had been written.
Also, because you were putting asynchronous operations inside a for loop, you were running all your operations in parallel, which may not be exactly what you want, as it makes error handling more difficult and increases server load.
Here's the code I would suggest that does the following:
Create a function streamFileFromS3() that streams a single file and returns a promise that will notify when it's done.
Use await in a for loop with that streamFileFromS3() to serialize the operations. You don't have to serialize them, but then you would have to change your error handling to figure out what to do if one errors while the others are already running, and you'd have to be more careful about concurrency issues.
Use try/catch to catch any errors from streamFileFromS3().
Add error handling on the stream.
Change all occurrences of data['propertyName'] to data.propertyName. The only time you need to use brackets is if the property name contains a character that is not allowed in a Javascript identifier or if the property name is in a variable. Otherwise, the dot notation is preferred.
Add socket.io connection error handling logic for both socket.io connections.
Set returned status to 500 when there's an error processing the request
So, here's the code for that:
const ss = require("socket.io-stream");

router.get('/writeData', function(req, res) {
  const io = req.app.get('socketio');

  function streamFileFromS3(ip, data) {
    return new Promise((resolve, reject) => {
      const dataNodeSocket = io.connect(`http://${ip}:5000`);
      dataNodeSocket.on('connect_error', reject);
      dataNodeSocket.on('connect_timeout', () => {
        reject(new Error(`timeout connecting to http://${ip}:5000`));
      });
      dataNodeSocket.on('connect', () => {
        // dataNodeSocket connected now
        const stream = ss.createStream().on('error', reject);
        paramsWithRange.Range = `bytes=${data.byteStart}-${data.byteStop}`;
        const filename = data.key.split('/')[1];
        const fileData = {
          'mainFile': paramsWithRange.Key.split('/')[1],
          'blockName': filename
        };
        ss(dataNodeSocket).emit('sendData', stream, fileData);
        // get S3 data and pipe it to the socket.io stream
        s3.getObject(paramsWithRange).createReadStream().on('error', reject).pipe(stream);
        stream.on('close', () => {
          dataNodeSocket.disconnect();
          resolve();
        });
      });
    });
  }

  function connectError(msg) {
    res.status(500).send(`Error connecting to ${NAMENODE_ADDRESS}`);
  }

  const nameNodeSocket = io.connect(NAMENODE_ADDRESS, { reconnect: true });
  nameNodeSocket.on('connect_error', connectError).on('connect_timeout', connectError);
  nameNodeSocket.on('nameNodeData', async (nameNodeData) => {
    try {
      for (let item of nameNodeData.blockToDataNodes) {
        await streamFileFromS3(item.ipValue, item);
      }
      res.json("Finished Writing to DN's");
    } catch(e) {
      res.status(500).json(e);
    }
  });
});
Other notes:
I don't know what paramsWithRange is, as it is not declared here, and when you were doing everything in parallel it was getting shared among all the connections, which is asking for a concurrency issue. In my serialized implementation it's probably safe to share, but the way it is now bothers me as it's a concurrency issue waiting to happen.
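One way to sidestep that worry, sketched here (not from the original answer; baseS3Params is a hypothetical object holding the shared Bucket/Key values), is to build a fresh params object per file instead of mutating a shared one:

function streamFileFromS3(ip, data) {
  // Copy the shared base params so each call works on its own object.
  const params = Object.assign({}, baseS3Params, {
    Range: `bytes=${data.byteStart}-${data.byteStop}`
  });
  // ...then use params everywhere paramsWithRange was used.
}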

How to stream data over socket.io to client

I have socket.io sending a basic object from server to client. This bit works fine.
Now I want to send a stream from server to client, using event-stream (specifically the results of a blockchain query). I am getting unexpected results in the browser console.
var io = require('socket.io')(server);
var dsteem = require('dsteem');
var es = require('event-stream');
var util = require('util');
var client = new dsteem.Client('https://api.steemit.com');
var stream = client.blockchain.getBlockStream();
/* This sends results to stdout, fine
io.on('connection', function(socket){
  stream.pipe(es.map(function(block, callback) {
    callback(null, util.inspect(block) + '\n')
  })).pipe(process.stdout);
  // And this sends a simple object to the client
  socket.emit('blockchainOps', {"Foo!":"Doo!"} );
});
*/
// Putting both together sends strange connection data to client
io.on('connection', function(socket){
  socket.emit('blockchainOps', function() {
    stream.pipe(es.map(function(block, callback) {
      callback(null, util.inspect(block) + '\n');
    }))
  });
});
What I get in the client console appears to be some kind of TCP socket function,
ƒ (){if(!n){n=!0;var r=a(arguments);u("sending ack %j",r),e.packet({type:i.ACK,id:t,data:r})}}
Can anyone help me understand what's going on and what I'm doing wrong?
== EDIT UPDATE ==
As suggested in comments, I've tried socket.io-stream to augment event-stream.
var es = require('event-stream');
var util = require('util');
var ss = require('socket.io-stream');
var stream = ss.createStream();
io.on('connection', function(socket){
  ss(socket).emit('blockchainOps', stream, function(){
    client.blockchain.getBlockStream()
      .pipe(es.map(function(block, callback) {
        callback(null, util.inspect(block) + '\n')
      }))
      .pipe(process.stdout)
  }());
});
This time I get a socket object returned in the browser console, which does not seem to be the stream data I was hoping for.
If anyone is looking for a working socket.io stream example:
// server side
const { pipeline } = require('stream')
const server = require('http').Server().listen(8080)
const io = require('socket.io')(server)
const ss = require('socket.io-stream')

io.on('connection', (socket) => ss(socket).on('stream', (stream) => {
  pipeline(stream, process.stdout, (err) => err && console.log(err))
}));

// client side (run as a separate process)
const { pipeline } = require('stream')
const ss = require('socket.io-stream')
const client = require('socket.io-client')
const socket = client.connect('http://localhost:8080')

socket.on('connect', () => {
  const stream = ss.createStream()
  ss(socket).emit('stream', stream)
  pipeline(process.stdin, stream, (err) => err && console.log(err))
});
You're using socket.emit wrong: you're passing the ACK callback to the client instead of your stream. Have a look at the socket.emit signature: socket.emit(eventName[, ...args][, ack]).
You probably want something like
socket.emit('blockchainOps', client.blockchain.getBlockStream());
However, I don't think plain socket.io supports passing a Stream like that. To pipe a stream down to the client you could use socket.io-stream. It would look like this:
var ss = require('socket.io-stream');
var stream = ss.createStream();
ss(socket).emit('blockchainOps', stream);
client.blockchain.getBlockStream().pipe(stream);
EDIT:
On the client, you should be able to read your stream like this:
<script src="socket.io/socket.io.js"></script>
<script src="socket.io-stream.js"></script>
...
ss(socket).on('blockchainOps', function(stream) {
  var binaryString = "";
  stream.on('data', function(data) {
    for (var i = 0; i < data.length; i++) {
      binaryString += String.fromCharCode(data[i]);
    }
  });
  stream.on('end', function(data) {
    console.log(binaryString);
    binaryString = "";
  });
});
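A side note, not part of the original answer (and an assumption about the payload): if the stream carries UTF-8 text, the per-byte loop can be replaced with the browser's TextDecoder, which also handles multi-byte characters split across chunk boundaries:

var decoder = new TextDecoder('utf-8');
var text = "";
stream.on('data', function(data) {
  // { stream: true } keeps partial multi-byte sequences buffered
  // until the next chunk arrives.
  text += decoder.decode(data, { stream: true });
});
stream.on('end', function() {
  console.log(text);
});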

How can I resume after an error event in a piped stream in Node.js?

After I emit an error event in MyWritableStream, data transmission stops. What do I need to do to resume the data transfer?
var readable = fs.createReadStream('test.txt');
var writable = new MyWritableStream();
writable.on('error', function(error) {
  console.log('error', error);
  // How can I resume?
});
writable.on('finish', function(){
  console.log('finished');
});
readable.pipe(writable);
I know this question is old, but you might wanna check out https://github.com/miraclx/xresilient
I built this for this exact same reason (works best with seekable streams).
You define a function that returns a readable stream, and the library measures the number of bytes that have passed through until an error is met.
Once the readable stream encounters an error event, it recalls the defined function with the number of bytes read so you can index the stream source.
Example:
const fs = require('fs');
const xresilient = require('xresilient');
const readable = xresilient(({bytesRead}) => {
  return generateSeekableStreamSomehow({start: bytesRead});
}, {retries: 5});
const writable = fs.createWriteStream('file.test');
readable.pipe(writable);
File streams are indexable with the start option of the fs.createReadStream() function.
HTTP Requests are indexable with the Range HTTP Header.
Check it out.
https://www.npmjs.com/package/xresilient
I am not sure if this is normal practice, but I can't see another solution for now, and it works for me. If you can suggest a more accurate solution, please do.
We can track the readable stream instance using the pipe event on the writable one:
const util = require('util');
const { Writable } = require('stream');

function WriteableStream(options) {
  Writable.call(this, options);
  this.source = null;
  var instance = this;
  this.on('pipe', function(source){
    instance.source = source;
  });
}
util.inherits(WriteableStream, Writable);
So, when we emit the error event and the readable stream is unpiped automatically, we can re-pipe it ourselves:
WriteableStream.prototype._write = function(chunk, encoding, done) {
  this.emit('error', new Error('test')); // unpipes readable
  done();
};
WriteableStream.prototype.resume = function() {
  this.source.pipe(this); // re-pipes readable
};
Finally, we will use it in the following way:
var readable = fs.createReadStream(file);
var writeable = new WriteableStream();
writeable.on('error', function(error) {
  console.log('error', error);
  writeable.resume();
});
readable.pipe(writeable);
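One caveat worth adding (an assumption based on newer Node versions, not part of the original answer): streams now default to autoDestroy: true, so emitting 'error' destroys the writable, and a destroyed stream cannot be re-piped. For this pattern to keep working, the writable has to opt out:

// Disable autoDestroy so the writable survives its own 'error'
// event and can be re-piped by resume(). The options object is
// forwarded to Writable.call(this, options) in the constructor above.
var writeable = new WriteableStream({ autoDestroy: false });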
