NodeJS HTTPServer takes a long time to close - node.js

I'm working on a Zappa app, and I'm currently trying to make a little watch script that stops the server, clears require.cache then re-requires and restarts the server when a file changes, something like:
# watch all dependent files
for file of require.cache
  fs.watch file, ->
    # attach a handler to 'close'
    # -- here's the issue: this takes far too long to trigger
    server.on 'close', ->
      server = require './server'
      server.start()
      # log the new server's id
      console.log server.id
    # stop the current server instance
    server.stop()
    # clear require's cache
    delete require.cache[f] for f of require.cache
I also have a console.log server.id line in my request handler so I can check if the IDs match.
So, what happens is: when I change a dependency, the server stops, a new one starts and the new ID is logged, which is all gravy. However, for a random amount of time after, requests to the server still log the old ID, indicating that the old listener is still attached somehow. Eventually, the listener seems to 'switch over' and the new ID is logged.
Update: it seems this is related to the close event (unsurprisingly) - if I attach a simple console.log 'close' callback to the close event, the ID starts changing after 'close' appears. However, it can take a long time (10s+) for the close event to be fired, why might it take so long?

According to the node.js docs:
server.close()
Stops the server from accepting new connections. See net.Server.close().
So, your server stops accepting new connections, but it won't actually close (and emit the 'close' event) until the existing connections have ended. My guess is that you have clients still connected to it, perhaps via keep-alive requests.

I was searching for the same problem. To give you and others searching for this a solution: if it is acceptable in your project, you can tell clients to close their connections immediately. This solved the slow-closing problem for me completely.
app.use((req, res, next) => {
  res.setHeader('Connection', 'close');
  next();
});

Why isn't my simple socket.io event system working?

I am running into a problem while using socket.io to do some event handling. For some reason, the following code snippet does not handle the event 'update', or any event for that matter. Let me explain the situation.
I have created a file named updates.js to create a socket.io socket variable named socket_8888 that is bound to port 8888. I then use module.exports to make that socket variable available to any other file that imports updates.js using require('updates.js'). I structured my application this way because I need to emit events from several different files.
In app.js:
var updates = require('updates.js');
setTimeout(function () {
  updates.regular.on("update", function () {
    console.log("Updated.");
  });
}, 1000);
setTimeout(function () {
  // Verifying that there is actually a listener bound to the socket -> prints ['update']
  console.log(updates.regular.eventNames());
  updates.regular.emit("update", 100);
}, 1500);
In updates.js:
var io = require("socket.io");
var socket_8888 = io(8888);
var updates = {
  regular: socket_8888
};
module.exports = updates;
However, a few simple tests have uncovered that events are not being handled, and I really cannot figure out why. The word "Updated" should print a second and a half after I run the application using "node www", but it does not.
The reason I started doing this simple testing was because I am trying to revive an old website of mine, but after a couple years, API updates have rendered a lot of my code useless. So I am trying to rebuild. I am not trying to send events between different files on the server. I am only testing the events locally because the events were not firing to the browser client. For this reason, I decided to investigate using this simple test, and it turns out the events can not even be emitted/listened to on the actual server, let alone be handled on a client that is on a whole different network.
I have already verified that the listener is actually binding to the socket. However, I do not know how to check whether or not the socket is actually emitting the event "update".
I have written the listener to bind only after one second because attempting to bind the moment the application starts does not give Express enough time to set everything up. Otherwise, the socket would still be undefined.
I do not get any error messages. The code just does not work as I expected.
I would really appreciate it if the community can tell me why the event 'update' is not being handled.
Try this; it works perfectly. To include the update module (updates.js), export it and require it with a relative path:
module.exports = updates;
var updates = require('./updates');

restart node.js forever process if response time too big

I have a forever script managing a node.js site.
Sometimes the node.js site hangs and the response time goes above 30 seconds, at which point the site is effectively down. A quick cure is restarting forever:
$ forever restart 3
where 3 is the script's number in forever list.
Is it possible to do this automatically? Is there an option in forever that makes it restart if the response time exceeds, say, 2 seconds?
Or maybe I have to run an external script that checks the response time and decides to restart the hanging forever process.
Or maybe I need to write this logic inside my node.js site?
I am assuming you want to restart the server if most replies are taking longer than x seconds. There are many tools that help you restart your instances based on their health; Monit is one of them. In this guide, monit restarts the instance if a reply doesn't come back within 10 seconds.
If you want to kill the instance when any single request takes too long, note down the time when the request comes in and the time when the response leaves. If the difference is too large, throw an exception that you know will not be caught, and the supervisor will restart the server for you. If you use express, check the code for its logger in development mode, as it tracks the response time.
Aside from leorex's solution, I have used something like this before to send a 500 on timed-out requests:
// Inside a request handler (or middleware), where `res` is the response:
var writeHead = res.writeHead;
var timeout = setTimeout(function () {
  res.statusCode = 500;
  res.end('Response timed out');
  // Ignore any writes attempted after the timeout fires, to avoid errors.
  res.writeHead = res.write = res.end = function () {};
}, 40000);
res.writeHead = function () {
  // writeHead was called in time; cancel the timeout.
  clearTimeout(timeout);
  writeHead.apply(this, arguments);
};
You can use the addTimeout module to take away the timeout-clearing part.
Once this is implemented, you can handle the timeout however you like; you can simply call process.exit(1); so that forever immediately replaces the process.
You can make this smarter. In my application, if an uncaught error happens, I signal the supervisor process so that it spins up another worker process, and the old one goes down gracefully (closes the http server and waits for all pending requests to finish). You can do the same in your application, but make sure everything has a timeout callback as a failover/backup plan.
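Tying this back to forever: a small Express-style middleware can enforce a response-time budget and exit the process when it is exceeded, letting forever restart the site. This is only a sketch; the threshold and the name responseTimeGuard are illustrative, not from the original answers.

```javascript
// Express-style middleware: exit the process if a response takes longer
// than limitMs, so a supervisor like forever can restart it.
function responseTimeGuard(limitMs) {
  return function (req, res, next) {
    const timer = setTimeout(() => {
      console.error('Response exceeded ' + limitMs + ' ms, exiting');
      process.exit(1); // forever spins up a replacement
    }, limitMs);
    // 'finish' fires when the response has been fully sent in time.
    res.on('finish', () => clearTimeout(timer));
    next();
  };
}
```

Usage would be `app.use(responseTimeGuard(2000));` before the routes. Note this is deliberately blunt: in-flight requests on other sockets are dropped when the process exits.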

Node.js Ignoring blacklisted event 'disconnect' [duplicate]

I have a socket.io connection using xhr as its only transport. When I load up the app in the browser (tested in chrome and ff), the socket connects and everything works well until I navigate away from the page. If I reload the browser, I can see the 'disconnect' event get sent out by the client, but the server disconnect event doesn't fire for a very long time (presumably when the client heartbeat times out). This is a problem because I do some cleanup work in the server when the client disconnects. If the client reloads, I get multiple connection events before disconnect is fired. I've tried manually emitting a disconnect message from the client in the window's 'beforeunload' event as well, but to no avail. Any ideas?
I debugged the socket.io server, and I can confirm that Manager.prototype.onClientDisconnect is only getting hit for "close timeout" reasons.
After some more debugging, I noticed the following configuration in the socket.io Manager object:
blacklist : ['disconnect']
That causes this branch from namespace.js to not process the event:
case 'event':
  // check if the emitted event is not blacklisted
  if (-~manager.get('blacklist').indexOf(packet.name)) {
    this.log.debug('ignoring blacklisted event `' + packet.name + '`');
  } else {
    var params = [packet.name].concat(packet.args);
    if (dataAck) {
      params.push(ack);
    }
    socket.$emit.apply(socket, params);
  }
The change is detailed in this pull request https://github.com/LearnBoost/socket.io/pull/569. I understand why this is in place for XHR, since anyone could send an HTTP request with random session IDs trying to disconnect other users from the server.
What I plan to do instead is to check each new connection for an existing session id in the server, and make sure to run my disconnect logic before continuing with the connection logic.
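That workaround might look like the following sketch. The names activeSessions, handleConnection, and cleanUp are hypothetical; the idea is simply to run the cleanup that the missing 'disconnect' event would have triggered whenever a session ID connects again:

```javascript
// Sessions we have seen connect but never observed disconnecting.
const activeSessions = {};

// Call this from the 'connection' handler. If the session is already marked
// active, its previous connection never fired 'disconnect', so run the
// cleanup now before proceeding with the new connection.
function handleConnection(sessionId, cleanUp) {
  if (activeSessions[sessionId]) {
    cleanUp(sessionId);
  }
  activeSessions[sessionId] = Date.now();
}
```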

Node: Distinguish between a reload event and a browser exit with socket event

I want to run some code when a socket is closed, but only if the user didn't reload the page...
I have something like
socket.on("close", function () {
  // do something here
});
The problem is that this function runs during a reload as well. Can I somehow pause it to run at a later time, with the value at that later time? I tried using a setTimeout within the callback, but I couldn't access the socket object anymore from inside it.
What's the best way to prevent the function from running if the socket connection is regained shortly after?
The main concept behind my thinking is that you can never know whether a user is reloading or disconnecting for good (perhaps not coming back for a day or more). There are ways to detect that the browser is navigating away from the website, but you can't know, server-side, whether it will return to the same address or a different one.
Instead, whenever a client disconnects, the disconnect event fires on the socket.io server for that socket. Taking that into account, you can set a session variable marking the player as "disconnected". When the client of that session reconnects (i.e. reloads), socket.io fires a 'connection' event, but by reading the session variable you can tell that the client previously disconnected and has now connected again. Timestamps could also apply here, so a reconnection after 15 minutes would have to load some extra data, etc.
So you can try this with sessions (assuming that you use express or something)
sockets.on("connection", function (socket) {
  if (session.isAReload) {
    // this is a reconnection
  }
  socket.set("isAReload", session.isAReload /* defaults to false */);
  // note: the disconnect handler must live inside the connection handler,
  // otherwise `socket` is not in scope
  socket.on("disconnect", function () {
    socket.get("isAReload", function (err, isAReload) {
      if (isAReload) {
        // closed after reconnecting
      } else {
        // just closed the connection, so next time it will be a "reload" or
        // reconnection; a timer of, say, 30s could run to ensure that the
        // client is really reloading
        session.isAReload = true;
      }
    });
  });
});
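Another way to frame the same idea is a grace period: schedule the cleanup on disconnect, and cancel it if the session reconnects in time. A minimal sketch (the function names and the 30-second window are illustrative, not from the original answer):

```javascript
// Pending cleanup timers, keyed by session ID.
const pendingCleanups = {};

// On disconnect, don't clean up immediately; give the client a grace
// period in which a reload can cancel the cleanup.
function onDisconnect(sessionId, cleanUp) {
  pendingCleanups[sessionId] = setTimeout(() => {
    delete pendingCleanups[sessionId];
    cleanUp(sessionId); // the user really left
  }, 30000);
}

// On (re)connection, cancel any pending cleanup: it was just a reload.
function onReconnect(sessionId) {
  if (pendingCleanups[sessionId]) {
    clearTimeout(pendingCleanups[sessionId]);
    delete pendingCleanups[sessionId];
  }
}
```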

node.js - push data to client - only one client can be connected?

I am trying to create a server-side solution which periodically pushes data to the client (no client-side polling) via node.js. The connection should be open permanently and whenever the server has new data, it pushes it down to the client.
Here is my simple sample script:
var sys = require('sys'),
    http = require('http');
http.createServer(function (req, res) {
  res.writeHead(200, {'Content-Type': 'text/html'});
  sys.puts('Start sending...');
  setInterval(function () {
    res.write("<script type='text/javascript'>document.write('test<br>')</script>");
  }, 10000);
}).listen(8010);
This basically works, but it seems that only one client at a time can be connected.
If I open http://127.0.0.1:8010/ in my browser, I see the new output written every 10 seconds. But when I open another tab with the same URL, it just loads forever. Only when I close the first tab do I get content from the server.
What do I need to do in order to serve multiple clients?
This is definitely a bug. What happens is that the browser re-uses the same connection due to keep-alive and HTTP/1.1, and Node screws up.
You can see this at work in Opera 11: open the page twice and it is the exact same page, using the exact same connection.
curl, and everything else that doesn't set Connection: keep-alive, works just fine, but browsers fail to open the same page twice. You can, however, open 'localhost:8010' and 'localhost:8010/foo' and it will work on both pages.
Note: this only affects GET requests; POST requests work just fine, since there's no re-using of the connection there.
I've filed an issue on this.
You should use socket.io. It handles all the heavy lifting for you and is a really cool library.
Be careful with this!
node.js is non-blocking, but the response in this example is never ended, so the connection it occupies is never released; that's why you only see data on the second client once you close the first.
