So, I want to move a WebSocket instance on my Node.js server from one array to another. I use JSON.parse(JSON.stringify(theitem)), and the copy goes into the new array just fine - but then I get a peculiar error.
var ws = require("ws");
var server = new ws.Server({server: app.server, path: "/whatever", port: 8080});
var clients = [];
var OtherClients = [];

server.on("connection", function(socket){
    socket.send("Hello!"); // This works.
    socket.on("message", function(msg){ console.log("whatever"); });
    clients.push({socket: socket, OtherInformation: "whatever!"});
    NextFunction();
});

function NextFunction(){
    for(let i = 0; i < clients.length; i += 2){ // Every second one, meant to represent arbitrary logic...
        clients.forEach(TheClient => {
            TheClient.socket.send("Arbitrary string!"); // This one works.
        });
    }
    OtherClients[0] = JSON.parse(JSON.stringify(clients[clients.length - 1]));
    clients.splice(0, 1);
    OtherClients[0].socket.send("Another arbitrary string!"); // Crash here.
}
This gives me the error TypeError: OtherClients[0].socket.send is not a function. How?
Moreover, I noticed that send doesn't even show up as a property when I inspect the socket object (even on the original, where calling it works). What's going on?
Am I really not allowed to copy a WebSocket object, or am I missing the point entirely?
You can't stringify a webSocket object and then parse it back into a functioning object. JSON.stringify() only serializes an object's own enumerable data properties: methods are dropped (and send() in particular lives on the prototype, which is also why it doesn't show up when you inspect the object), and the underlying TCP connection can't be represented in JSON at all. What JSON.parse() gives you back is a plain data object with no methods, hence the "not a function" error.
If you just want to move the socket object from one array to another, just move the object reference itself - no serialization to JSON necessary. You could do it like this:
// remove last socket from clients array and
// add it to the beginning of the OtherClients array
OtherClients.unshift(clients.pop());
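To illustrate why the round-tripped copy loses its methods, here is a small sketch (the Thing class is made up purely for demonstration, not anything from ws):

class Thing {
    constructor() { this.label = "original"; }
    send(msg) { console.log("sending", msg); } // lives on Thing.prototype, like socket.send
}

const original = new Thing();
const copy = JSON.parse(JSON.stringify(original));

console.log(typeof original.send); // "function"
console.log(typeof copy.send);     // "undefined" - methods and prototypes are not serialized
console.log(copy);                 // { label: 'original' } - a plain data object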
Related
Get Socket Ref On Socket Close
Following this SO answer, I expect to be able to shut down the server and restart it again. To do that reliably, it sounds like I need to not just call close() on the server instance but also destroy any remaining open sockets. Memory leaks are a concern, so I would like to store the socket instances in a Set and clean up expired connections as they close.
You need to
subscribe to the connection event of the server and add opened sockets to an array
keep track of the open sockets by subscribing to their close event and removing the closed ones from your array
call destroy on all of the remaining open sockets when you need to terminate the server
...
const sockets = new Set(); // use a set to easily get/delete sockets
...
function serve(compilations) {
    const location = './dist', directory = express.static(location);
    const server = app
        .use('/', log, directory)
        .get( '*', (req, res) => res.redirect('/') )
        .listen(3000)
    ;
    server.on('connection', handleConnection); // subscribe to connections (#1)
    console.log(`serving...(${location})`);
    return server;
}
...
function handleSocketClose(socket, a, b, ...x) { // #param#socket === false (has-error sentinel) (#2.2)
    const closed = sockets.delete(socket);
    console.log(`HANDLE-SOCKET-CLOSE...(${sockets.size})`, closed, a, b); // outputs "HANDLE-SOCKET-CLOSE...(N) false undefined undefined"
}
function handleConnection(socket) {
    sockets.add(socket); // add socket/client to set (#1)
    console.log(`HANDLE-CONNECTION...(${sockets.size})`);
    socket.on('close', handleSocketClose); // subscribe to closures on individual sockets (#2.1)
}
From the SO answer (above), I have #1 & #3 completed. In fact, I even have #2.1 done, as I am currently listening on each socket for the close event -- which does fire. The problem is #2.2: "...and removing the closed ones from your array".
My expectation while implementing the handleSocketClose callback was that one of the argument values would be the socket object itself, so that I could then clean up the sockets collection.
Question
How in Laniakea can I obtain a reference to the socket in question at close time (without iteration and checking the connection status)?
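For reference (this is not part of the original post): Node's net sockets emit 'close' with only a hadError boolean, so the socket is never passed to the listener as an argument. One common way to keep the reference is to close over the socket in the handler registered inside handleConnection, roughly like this sketch:

function handleConnection(socket) {
    sockets.add(socket);
    console.log(`HANDLE-CONNECTION...(${sockets.size})`);
    // close over `socket` so the handler knows exactly which one to remove (#2.2)
    socket.on('close', hadError => {
        sockets.delete(socket);
        console.log(`HANDLE-SOCKET-CLOSE...(${sockets.size})`, hadError);
    });
}

// when terminating the server (#3), destroy whatever is still open
function shutdown(server) {
    for (const socket of sockets) socket.destroy();
    server.close();
}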
var express = require('express');
var app = express();
var server = app.listen(3000);
var replyFromBot;
app.use(express.static('public'));
var socket = require('socket.io');
var io = socket(server);
io.sockets.on('connection' , newConnection);
function newConnection(socket) {
    console.log(socket.id);
    listen = true;
    socket.on('Quest', reply);
    function reply(data) {
        replyFromBot = bot.reply("local-user", data);
        console.log(socket.id + " " + replyFromBot);
        socket.emit('Ans', replyFromBot);
    }
}
I've created a server-based chat-bot application using Node.js, socket.io and Express, but the thing is: the first time I ask a question the socket.on handler runs once, the second time it runs twice, the third time three times, and so on. I've tackled this by setting a flag on my client so that the answer is only displayed once. I just want to know whether my code is logically correct - I mean, is this good code? Because if the client asks a question for the 10th time, the listeners array will have 10+9+8+...+1 listeners, and it keeps growing with the number of questions asked, which is not good.
I tried using removeListener; it removes the listener once, but then the callback doesn't fire the second time. What do you guys recommend? Do I go with this, or is there another way to add the listener when socket.on is called, remove it once it has executed, and add it again the next time?
Thank you.
client code:
function reply() {
    socket.emit('Quest', Quest);
    flag = true;
    audio.play();
    socket.on('Ans', function(replyFromBot) {
        if (flag) {
            console.log("hi");
            var para = document.createElement("p2");
            x = document.getElementById("MiddleBox");
            para.appendChild(document.createTextNode(replyFromBot));
            x.appendChild(para);
            x.scrollTop = x.scrollHeight;
            flag = false;
        }
    });
}
The problem is caused by your client code. Each time you call the reply() function in the client, you set up an additional socket.on('Ans', ...) event handler, which means they accumulate. You can change that to socket.once() and the handler will remove itself each time after it gets the Ans message. You can then also remove your flag variable.
function reply() {
    socket.emit('Quest', Quest);
    audio.play();
    // change this to .once()
    socket.once('Ans', function(replyFromBot) {
        console.log("hi");
        var para = document.createElement("p2");
        x = document.getElementById("MiddleBox");
        para.appendChild(document.createTextNode(replyFromBot));
        x.appendChild(para);
        x.scrollTop = x.scrollHeight;
    });
}
Socket.io is not really built as a request/response system, which is what you are trying to use it as. An even better way to implement this would be to use the ack capability that socket.io has, so you get a direct response back to the Quest message you send.
You also need to fix your shared variables replyFromBot and listen on your server because those are concurrency problems waiting to happen as soon as you have multiple users using your server.
Better Solution
A better solution would be to use the ack capability that socket.io has to get a direct response to a message you sent. To do that, you'd change your server to this:
function newConnection(socket) {
    console.log(socket.id);
    socket.on('Quest', function(data, fn) {
        let replyFromBot = bot.reply("local-user", data);
        console.log(socket.id + " " + replyFromBot);
        // send ack response
        fn(replyFromBot);
    });
}
And, change your client code to this:
function reply() {
    audio.play();
    socket.emit('Quest', Quest, function(replyFromBot) {
        console.log("hi");
        var para = document.createElement("p2");
        x = document.getElementById("MiddleBox");
        para.appendChild(document.createTextNode(replyFromBot));
        x.appendChild(para);
        x.scrollTop = x.scrollHeight;
    });
}
Doing it this way, you're hooking into a direct reply to the message you sent, so it works as request/response much better than the way you were doing it.
Instead of socket.on('Quest' ,reply); try socket.once('Quest' ,reply);
The bug in your code is that each time newConnection() is called, Node registers another 'Quest' event listener. So the first time newConnection() is called, the number of listeners for the 'Quest' event is one; the second time the function is called, it increases to two; and so on.
socket.once() ensures that the number of 'Quest' listeners bound to the socket is exactly one.
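As a side note (not from either answer): if you want to verify where listeners are actually piling up, you can log the listener count on the emitter right after registering the handler - a purely diagnostic sketch:

socket.on('Quest', reply);
console.log(socket.listeners('Quest').length); // how many 'Quest' handlers this socket has;
                                               // if this number keeps growing, listeners are accumulating here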
In my node.js application I have a collection of client sockets as an array. When a communication error occurs, I simply call destroy on the socket.
My question is: should I destroy the socket before or after removing it from the array? The documentation doesn't say much.
var clientSockets = []
var destroySocketBefore = function(socket) {
    socket.destroy()
    var socketIdx = clientSockets.indexOf(socket)
    if (socketIdx > -1) {
        clientSockets.splice(socketIdx, 1)
    }
}

var destroySocketAfter = function(socket) {
    var socketIdx = clientSockets.indexOf(socket)
    if (socketIdx > -1) {
        clientSockets.splice(socketIdx, 1)
    }
    socket.destroy()
}
In the case of destroySocketBefore, I am not sure whether the socket will still be found in the array if I destroy it before searching for it, so there is a possibility that the array still contains invalid sockets in subsequent logic.
In the case of destroySocketAfter, I am not sure whether calling destroy on a socket that was already removed from the array will have the desired result. Is there a possibility that the system deletes the socket object once it's spliced out of the array, so that I sometimes end up calling destroy on a null object?
I tested both methods and they seem to work with no visible difference, so I am not sure which one is correct.
Either solution is valid and the two are effectively the same. The destroyed socket will get removed from the array no matter what, since there are no race conditions or anything like that (JavaScript execution in Node all happens on the same thread).
splice will only remove the socket from a user-defined array and has no effect on whether the socket is closed, so of your two variants the second method is the best option.
It seems that socket.io cannot send the list of connected users, like this:
socket.emit('users', sIo.sockets.clients());
It gives me the following error:
/Users/enrico/Desktop/helloExpress/node_modules/socket.io/lib/parser.js:75
data = JSON.stringify(ev);
^
TypeError: Converting circular structure to JSON
Apparently it cannot stringify the value returned from sIo.sockets.clients(). Any ideas on how to fix this? Thanks in advance.
Since the problem is a circular reference, there is no real "fixing it". A circular reference means that some object in the structure points back to another part of the same structure, which JSON.stringify cannot represent. What you can do is something like this.
var returnList = [];
var clients = sIo.sockets.clients(); // grab list of clients, assuming it's an array

for (var i = 0; i < clients.length; i++) {
    var client = clients[i]; // next client in array
    // push values into return list
    returnList.push({
        name: client.name,
        someOther: client.value,
        another: client.thing
    });
}

// emit cleaned up list
socket.emit('users', returnList);
With this code you can cherry-pick the values you want and send only those. This is good for other reasons as well: since the clients list is likely an internal implementation detail, sending it wholesale might also expose information about other clients' connections.
This is all also pretty speculative, as I'm not 100% sure which libraries you're using; it looks like Socket.IO, but I cannot find any socket.clients() method.
The problem is that sending large serialized JSON (over 16,000 characters) over a net socket gets split into chunks. Each chunk fires the data event on the receiving end. So simply running JSON.parse() on the incoming data may fail with SyntaxError: Unexpected end of input.
The workaround I've managed to come up with so far is to append a null character ('\u0000') to the end of the serialized JSON, and check for that on the receiving end. Here is an example:
var partialData = '';

client.on( 'data', function( data ) {
    data = data.toString();
    if ( data.charCodeAt( data.length - 1 ) !== 0 ) {
        partialData += data;
        // if data is incomplete then no need to proceed
        return;
    } else {
        // append all but the null character to the existing partial data
        partialData += data.substr( 0, data.length - 1 );
    }
    // pass parsed data to some function for processing
    workWithData( JSON.parse( partialData ));
    // reset partialData for next data transfer
    partialData = '';
});
One of the failures of this model is if the receiver is connected to multiple sockets, and each socket is sending large JSON files.
The reason I'm doing this is because I need to pass data between two processes running on the same box, and I prefer not to use a port. Hence using a net socket. So there would be two questions: First, is there a better way to quickly pass large JSON data between two Node.js processes? Second, if this is the best way then how can I better handle the case where the serialized JSON is being split into chunks when sent?
You can use try...catch on every data event to see whether you have valid JSON yet. Not very good performance, though.
You can calculate the size of your JSON on the sending side and send it as a length prefix before the JSON itself.
You can append a boundary string that's unlikely to appear in the JSON. Your \u0000 - yes, that seems to be a legit way. But the most popular choice is a newline (see the sketch further down).
You can use external libraries like dnode, which should already do something like what I mentioned above. I'd recommend trying that. Really.
One of the failures of this model is if the receiver is connected to multiple sockets, and each socket is sending large JSON files.
Use different buffers for every socket. No problem here.
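Not from the original answer, but as an illustration of the newline-delimiter idea: have the sender write one JSON document per line, and have the receiver buffer until a newline arrives (the sendJson name is made up; client and workWithData are the ones from the question):

// sender side: one JSON document per line
function sendJson(socket, obj) {
    socket.write(JSON.stringify(obj) + '\n');
}

// receiver side: split on newlines, parse complete lines, keep the partial tail
var pending = '';
client.on('data', function(chunk) {
    pending += chunk.toString();
    var pieces = pending.split('\n');
    pending = pieces.pop(); // the last piece is incomplete (or ''), keep it for later
    pieces.forEach(function(piece) {
        if (piece.length > 0) {
            workWithData(JSON.parse(piece));
        }
    });
});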
It is possible to identify each socket individually and build a buffer for each one. I add an id to each socket when I receive a connection, and then when data arrives I append it to that socket's buffer.
var net = require('net');
var buffers = {}; // one buffer per socket id

net.createServer(function(socket) {
    // There are many ways to assign an id, this is just an example.
    socket.id = Math.random() * 1000;
    buffers[socket.id] = ''; // start this socket's buffer out empty
    socket.on('data', function(data) {
        // 'this' refers to the socket calling this callback.
        buffers[this.id] += data;
    });
});
On each data event you can then check whether that "key" delimiter has arrived in the buffer, which tells you that a complete message is ready to be used.
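As a rough sketch of that check (not from the original answer), reusing the per-socket buffers above and the \u0000 delimiter from the question:

socket.on('data', function(data) {
    buffers[this.id] += data;
    var buffered = buffers[this.id];
    var delimiterIndex = buffered.indexOf('\u0000');
    while (delimiterIndex !== -1) {
        // everything before the delimiter is one complete JSON message
        workWithData(JSON.parse(buffered.slice(0, delimiterIndex)));
        buffered = buffered.slice(delimiterIndex + 1);
        delimiterIndex = buffered.indexOf('\u0000');
    }
    buffers[this.id] = buffered; // keep any trailing partial message
});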