NodeJS 0.10. No 'data' event emitted on net.Socket

I have code that logs every connection to my HTTP server at the socket level and also logs any incoming data.
This code was originally written for NodeJS 0.8 and works fine there.
Now my project has been migrated to 0.10.24 and the socket-logging code has stopped working.
Here is my code:
var netLogStream = fs.createWriteStream('net.log');

(function(f) {
  net.Server.prototype.listen = function(port) {
    var rv = f.apply(this, arguments);       // (1)
    rv.on('connection', function(socket) {   // (2)
      socket.on('data', function(data) {
        data.toString().split('\n').forEach(function(line) { // (3)
          netLogStream.write('... some logging here ... ' + line);
        });
      });
    });
    return rv;
  };
})(net.Server.prototype.listen);
On 0.10 I can get to (1) and I get a Socket instance at (2), but I never get to (3). At the same time, my whole application works fine without any issues.
Added: my server is created with Express 3.4.x.

I'm not sure why the results are different between node v0.8 and v0.10, but if I had to guess, I'd be looking at the return value of net.Server.prototype.listen.
According to the documentation, this is an asynchronous method that emits the 'listening' event and invokes its callback once the server is bound. You're not listening for that event, but rather capturing the return value of listen, which for an async method may not be well-defined. It's obviously not null or undefined since you don't get a runtime error, but the return value may not be the same between v0.8 and v0.10.
I honestly don't know for sure because I don't do low-level socket coding, but I have two suggestions to try:
Since the 'connection' event is emitted from the Server object, perhaps you need this.on instead of rv.on.
Set up the 'connection' event listener before you invoke listen, just to minimize the risk of race conditions.
Try this and see what happens:
var netLogStream = fs.createWriteStream('net.log');

(function(f) {
  net.Server.prototype.listen = function(port) {
    this.on('connection', function(socket) {   // (2)
      socket.on('data', function(data) {
        data.toString().split('\n').forEach(function(line) { // (3)
          netLogStream.write('... some logging here ... ' + line);
        });
      });
    });
    return f.apply(this, arguments);           // (1)
  };
})(net.Server.prototype.listen);

Related

Express.js - while loop before sending response

I'm trying to implement an existing solution in node.js, specifically using the express.js framework. Now, the existing solution works as follows:
server exposes a GET service that clients can connect to
when a client calls the GET service, the client count (a global variable) is incremented and then the number of clients is checked;
if there are not at least 3 clients connected, the service sits in an endless loop, waiting for more clients to connect
if (or rather, when) the remaining two clients connect, the service sends a response to everyone saying that enough clients are connected (a 'true' value).
So what basically happens is, the client connects and the connection is active (in a loop) until enough clients connect, then and only then there is a response (to all clients at the same time).
Now, I'm no expert in these architectures, but to my mind this is not a correct or good solution. My initial thought was: this must be solved with sockets. However, since the existing solution works like that (it's not written in node.js), I tried to emulate that behaviour:
var number = (function(){
  var count = 0;
  return {
    increase: function() {
      count++;
    },
    get: function(){
      return count;
    }
  };
})();

app.get('/test', function(req, res){
  number.increase();
  while (number.get() < 3) {
    //hold it here, until enough clients connect
  }
  res.json(number.get());
});
Now while I think that this is not a correct solution, I have a couple of questions:
Is there any alternative to solving this issue, besides using sockets?
Why does this "logic" work in C#, but not in express.js? The code above hangs, and no other request is processed.
I know node.js is single-threaded, but what if we have a more conventional service that responds immediately, and there are 20 requests all at the same time?
I would probably use an event emitter for this:
var EventEmitter = require('events').EventEmitter;
var emitter = new EventEmitter();

app.get('/', function(req, res) {
  // Increase the number
  number.increase();

  // Get the current value
  var current = number.get();

  // If it's less than 3, wait for the event emitter to trigger.
  if (current < 3) {
    return emitter.once('got3', function() {
      return res.json(number.get());
    });
  }

  // If it's exactly 3, emit the event so we wake up other listeners.
  if (current === 3) {
    emitter.emit('got3');
  }

  // Fall through.
  return res.json(current);
});
I would like to stress that @Plato is correct in stating that browsers may time out when a response takes too long to complete.
EDIT: as an aside, some explanation of the return emitter.once(...) construct.
The code above can be rewritten like so:
if (current < 3) {
  emitter.once('got3', function() {
    res.json(number.get());
  });
} else if (current === 3) {
  emitter.emit('got3');
  res.json(number.get());
} else {
  res.json(number.get());
}
But instead of using those if/else statements, I return from the request handler after creating the event listener. Since request handlers are asynchronous, their return value is discarded, so you can return anything (or nothing). As an alternative, I could also have used this:
if (current < 3) {
  emitter.once(...);
  return;
}
if (current === 3) {
  ...etc...
Also, even though you return from the request handler function, the event listener is still referencing the res variable, so the request handler scope is maintained by Node until res.json() in the event listener callback is called.
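As an aside on the browser-timeout point above, one possible safety net is to pair each waiting listener with a timer and answer with an error if the third client never shows up. This is only a sketch; REQUEST_TIMEOUT_MS and the 408 response are assumptions, not part of the original answer:
var REQUEST_TIMEOUT_MS = 30000; // assumed value: how long a client is willing to wait

app.get('/', function(req, res) {
  number.increase();
  var current = number.get();

  if (current < 3) {
    var onGot3 = function() {
      clearTimeout(timer);
      res.json(number.get());
    };
    // If 'got3' never fires in time, answer with an error ourselves instead of
    // letting the browser time the request out on its own.
    var timer = setTimeout(function() {
      emitter.removeListener('got3', onGot3);
      res.statusCode = 408;
      res.json({ error: 'timed out waiting for 3 clients' });
    }, REQUEST_TIMEOUT_MS);
    return emitter.once('got3', onGot3);
  }

  if (current === 3) {
    emitter.emit('got3');
  }
  return res.json(current);
});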
Your HTTP approach should work.
You are blocking the event loop, so node refuses to do any other work while it is stuck in the while loop.
You're really close; you just need to check every now and then instead of constantly. I do this below with setImmediate(), but setTimeout() would also work:
var number = (function(){
  var count = 0;
  return {
    increase: function() {
      count++;
    },
    get: function(){
      return count;
    }
  };
})();

function waitFor3(callback){
  var n = number.get();
  if (n < 3) {
    // Check again on the next turn of the event loop instead of busy-waiting.
    setImmediate(function(){
      waitFor3(callback);
    });
  } else {
    callback(n);
  }
}

function bump(){
  number.increase();
  console.log('waiting');
  waitFor3(function(){
    console.log('done');
  });
}

setInterval(bump, 2000);

/*
app.get('/test', function(req, res){
  number.increase();
  waitFor3(function(){
    res.json(number.get());
  });
});
*/

Meteor, Future and Node.JS net

I'm trying to write a simple asynchronous TCP/IP client that runs alongside a Meteor server, communicating with a remote server and posting data to MongoDB. I got it working using net.on callbacks, but the code was messy and it was failing at random times. I decided to try rewriting it using fibers/Futures to clean it up so I could focus on the failures. The code currently looks like:
var Future = Npm.require('fibers/future'), wait = Future.wait;

var coreComm = function(coreClient) {
  console.log('coreCommm started')
  try {
    var running = true
    while (running) {
      console.log('calling onF.wait()')
      var ev = onF.wait();
      console.log('ev received', ev)
      switch(ev[0]) {
        default:
          console.log('unknown event from coreClient: ', ev)
          break;
        case 'readable':
          console.log('read', ev)
          break;
      }
    }
  } catch(err) {
    console.log('comm error: ', err)
  }
}.future()

function tryConnect(options) {
  var connect = new Future
  onF = new Future
  coreClient = net.connect(options, function() {
    console.log('connected,')
    connect.return()
  })
  connect.wait()
  coreClient.on('readable',
    function() { console.log('readable event,'); onF.return(['readable'])})
  console.log('coreClient connected to core');
  coreComm(coreClient)
}

Meteor.startup(function () {
  tryConnect({port: 9987});
});
The output when a message is sent looks like:
=> Meteor server running on: http://localhost:3000/
I2038-10:42:18.160(-5)? starting
I2038-10:42:18.392(-5)? connected,
I2038-10:42:18.398(-5)? coreClient connected to core
I2038-10:42:18.402(-5)? coreCommm started
I2038-10:42:18.409(-5)? calling onF.wait()
I2038-10:42:18.413(-5)? readable event,
As far as I can tell, the message is received from the remote server, the readable event is sent, I call onF.return(...) and nothing happens except Meteor goes to 100% CPU.
Any suggestions as to why the onF.wait() call isn't returning like it's supposed to?

socket.io data seems to be sent multiple times (nodejs and craftyjs)

I am following this tutorial on making HTML5 games. I wanted to try and mix node in to make it multiplayer. I am using node.js(v0.10.4) on server and crafty.js on front end.
I am using socket.io to send and receive messages. For now it's just me (not multiple clients). The weird thing is that the message coming back from the server seems to arrive multiple times. I turned on debug mode in socket.io and it only seems to be sending the data once, yet on the front end the data comes in in multiples. I put an incrementing counter on the data, and the counter is not being incremented multiple times; instead I am getting multiple copies of the same data.
Here's the node code:
var http = require('http').createServer(handler),
    static = require('node-static'),
    io = require('socket.io').listen(http);

io.set('log level', 3);
http.listen(80);

//attach the socket to our server
var file = new static.Server(); //Create a file object so we can serve the files in the correct folder

function handler(req, res) {
  req.addListener('end', function() {
    file.serve(req, res);
  }).resume();
}

io.sockets.on('connection', function (socket) { //listen for any sockets that will come from the client
  socket.on('collected', function(data) {
    /**** here's where the data is being sent back to the client *****/
    socket.emit('messageFromServer', { data: data.number });
  });
});
and here's front end code:
//messenger entity
Crafty.c('SendRecieveMessages', {
  count: 0,
  sendMessageToServer : function() {
    console.log('got a village');
    /**** Here's where we send the message to the server ****/
    socket.emit('collected', { village : "The message went to the server and back. it's collected", number : this.count });
    this.count++;
  },
  recieveMessageFromServer : function() {
    socket.on('messageFromServer', function(data) {
      /*** This data seems to be coming back or logging multiple times? ***/
      console.log(data);
    });
  }
});
Lastly, here's a screenshot of the debugging in progress. As you can see, the number is not always incrementing; it almost looks like the data is getting stored. Thanks!
http://cl.ly/image/0i3H0q2P1X0S
It looks like every time you call Crafty.c, recieveMessageFromServer() is getting called too. Every time recieveMessageFromServer is invoked, it attaches an additional event listener on the socket. That's why the first time data comes back you get one copy, then the second time you get two, the third time you get three, and so on.
You either need to prevent recieveMessageFromServer from being called multiple times, or use removeListener or removeAllListeners to remove the previously attached listeners.
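For illustration, here is a minimal sketch of that second option, assuming the socket.io client socket exposes removeAllListeners:
recieveMessageFromServer : function() {
  // Sketch only: drop whatever handler a previous call attached, so repeated
  // calls to recieveMessageFromServer always leave exactly one listener.
  socket.removeAllListeners('messageFromServer');
  socket.on('messageFromServer', function(data) {
    console.log(data);
  });
}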
Thanks to @Bret Copeland for helping me figure this one out. As he pointed out, every time socket.on() is called, it seems to add another listener. To prevent this...
I declared a variable as a property in my Game object (this is craftyjs, so use whatever fits your setup):
Game = {
  //lots of other code here...

  //need this to use later for socket.io
  send_message : true
}
Then I edited my recieveMessageFromServer() function to check whether it's OK to send the message or not:
recieveMessageFromServer : function() {
  console.log('does this show up multiple times?');
  /* Check whether send_message is true before sending */
  if (Game.send_message) {
    socket.on('messageFromServer', function(data) {
      console.log(data);
      Game.send_message = false;
    });
  }
}

Adding handlers to socket in a loop doesn't work :/

To handle events sent to the socket in a more organised way, I've made a router. In that router I want to assign each module to a specific event. I've assigned the event strings and their handlers to a "handlers" object. Then I wanted to attach listeners to a given socket in a loop. After the assignment I listed all events and their handlers on the socket to check, and everything seemed fine. Unfortunately, it doesn't work. The socket acts as if it had assigned every event in the handlers object to the first handler in that object. The handmade version works fine, but I just can't see why the simple loop fails :/
Here is the socket.io code that hands the socket to the router:
var socketOptions = {transports: ['flashsocket', 'websocket', 'htmlfile', 'xhr-polling', 'jsonp-polling']};

var io = socketio.listen(server, socketOptions).on('connection', function (socket) {
  streamRouter(io, socket);
});
And here is the code of the router. I've included both the handmade version of assigning the handlers and the looped version; normally the second one is commented out.
var handlers = {
  "message": require("./message").doAction,
  "subscribe": require("./subscribe").doAction
}

exports.handleConnection = function(io, socket) {
  //handmade version
  socket.on("subscribe", function(msg){
    require("./subscribe").doAction(io, socket, msg);
  });
  socket.on("message", function(msg){
    require("./message").doAction(io, socket, msg);
  });

  //loop version
  for (var event in handlers) {
    socket.on(event, function(msg){
      handlers[event](io, socket, msg);
    });
  }
}
I'd be grateful for any advice on where the bug lies. Before long I'll have many handlers, and assigning them one by one would be an ugly copy-and-paste across many lines of code :/
In your for-in loop, you're constructing functions that are all closed over the same event variable. As a result, when those functions get executed, they will all refer to the same value. Additionally, you're not guarding your for-in loop against prototype members (which may or may not be intended).
Instead, try this:
Object.keys(handlers).forEach(function(event){
  socket.on(event, function(msg){
    handlers[event](io, socket, msg);
  });
});
For your loop to work, you need to create a new scope for each handler:
for (var event in handlers) {
  (function(handler) {
    socket.on(event, function(msg){
      handler(io, socket, msg);
    });
  })(handlers[event]);
}
This has to do with scoping: JavaScript doesn't create a 'new' event variable for each iteration of the loop, so by the time an event handler is called, event will have been overwritten (and will contain the value it had in the last iteration of the loop).
This page offers more explanation.

Sending out real-time data to web clients: error trapping

I'm trying to send data from a serial device to web clients. I am using a serial-to-network proxy, ser2net, to make the data available to a server that acts on the data and sends a manipulated version of it to the web clients. The clients specify the location of the ser2net host and port. The core of this is coded in node.js as shown here:
function getDataStream(socket, dataSourcePort, host) {
  var dataStream = net.createConnection(dataSourcePort, host),
      dataLine = "",
      line = "";

  dataStream.on('error', function(error){
    socket.emit('error', {message: "Source not found on host:" + host + " port:" + dataSourcePort});
    console.log(error);
  });

  dataStream.on('connect', function(){
    socket.emit('connected', {message: "Data Source Found"});
  });

  dataStream.on('close', function(){
    console.log("Close socket");
  });

  dataStream.on('end', function(){
    console.log('socket ended');
    dataConnection.emit('lost', {connectInfo: {host: host, port: dataSourcePort}});
  });

  dataStream.on('data', function(data) {
    // Collect a line from the host
    line += data.toString();
    // Split collected data by delimiter
    line.split(delimiter).forEach(function (part, i, array) {
      if (i !== array.length - 1) { // Fully delimited line.
        // push on to buffer and emit when bufferSendCommand is present
        dataLine = part.trim();
        buffer.push(part.trim());
        if (part.substring(0, bufferSendCommand.length) == bufferSendCommand) {
          gotALine.emit('new', buffer);
          buffer = [];
        }
      }
      else {
        // Last split part might be partial. We can't announce it just yet.
        line = part;
      }
    });
  });

  return dataStream;
}
io.sockets.on('connection', function(socket){
  var stream = getDataStream(socket, dataSourcePort, host);

  //dispense incoming data from data server
  gotALine.on('new', function(buffer){
    socket.emit('feed', {feedLines: buffer});
  });

  dataConnection.on('lost', function(connectInfo){
    setTimeout(function(){
      console.log("Trying --- to reconnect ");
      stream = getDataStream(socket, connectInfo.port, connectInfo.host);
    }, 5000);
  });

  // Handle Client request to change stream
  socket.on('message', function(data) {
    var clientMessage = JSON.parse(data);
    if ('connectString' in clientMessage
        && clientMessage.connectString.dataHost !== ''
        && clientMessage.connectString.dataPort !== '') {
      stream.destroy();
      stream = getDataStream(socket,
                             clientMessage.connectString.dataPort,
                             clientMessage.connectString.dataHost);
    }
  });
});
This works well enough until the serial device drops off and ser2net stops sending data. My attempt to catch the end of the socket and reconnect is not working. The event gets emitted properly, but the setTimeout only runs once. I would like to find a way to keep trying to reconnect while sending a message to the client informing it of the retry attempts. I am a node.js newbie and this may not be the best way to do this. Any suggestions would be appreciated.
OK, I think I figured it out. In the dataStream.on('data', ...) handler I added a timeout:
clearTimeout(connectionMonitor);
connectionMonitor = setTimeout(function(){doReconnect(socket);}, someThresholdTime);
The timeout only fires if data stops coming in, since it is cleared each time data arrives. The doReconnect function keeps trying to connect and sends a message to the client saying something bad is going on.
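To make the shape of this clearer, here is a rough sketch of how the watchdog slots into the 'data' handler. doReconnect and someThresholdTime come from the snippet above, but the body of doReconnect, the threshold value, and the 'reconnecting' message name are assumptions, not the poster's actual code:
var connectionMonitor = null;
var someThresholdTime = 10000; // assumed: treat 10s of silence as a stall

dataStream.on('data', function(data) {
  // Every chunk of data pushes the stall timer back out; if the serial
  // device goes quiet, doReconnect eventually fires.
  clearTimeout(connectionMonitor);
  connectionMonitor = setTimeout(function(){ doReconnect(socket); }, someThresholdTime);
  // ... existing line-splitting and buffering code ...
});

function doReconnect(socket) {
  // Tell the web client we are retrying, then rebuild the upstream connection.
  // 'stream' refers to the variable held in the io.sockets.on('connection') scope.
  socket.emit('reconnecting', { message: 'Data source stalled, retrying...' });
  stream = getDataStream(socket, dataSourcePort, host);
}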
