This Node.js server will shut down cleanly on Ctrl+C once all connections are closed.
var http = require('http');

var app = http.createServer(function (req, res) {
    res.end('Hello');
});

process.on('SIGINT', function () {
    console.log('Closing...');
    app.close(function () {
        console.log('Closed.');
        process.exit();
    });
});

app.listen(3000);
The problem is that this also waits for keep-alive connections. If you open a tab to this app in Chrome and then press Ctrl+C, the server won't shut down for about two minutes, when Chrome finally releases the connection.
Is there a clean way of detecting when there are no more HTTP requests, even if some connections are still open?
By default there is no socket timeout, which means connections stay open until the client closes them. If you want to set a timeout, use socket.setTimeout.
If there are active connections, you simply can't close the server: a graceful shutdown will just hang. The only way around this is to set a timeout and kill the app when it expires.
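As a rough sketch of that idea, replacing the SIGINT handler from the question (the 10-second limit is an arbitrary choice of mine, not something from the original answer):

process.on('SIGINT', function () {
    console.log('Closing...');
    // stop accepting new connections; the callback only fires once all sockets are gone
    app.close(function () {
        console.log('Closed cleanly.');
        process.exit(0);
    });
    // if keep-alive connections never drain, give up after 10 seconds and kill the process
    setTimeout(function () {
        console.log('Forcing shutdown.');
        process.exit(1);
    }, 10000);
});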
If you have workers it's not as simple as killing the app with process.exit(), so I made a module that does exactly what you're asking: grace.
You can hack some request tracking with the finish event on response:
var reqCount = 0;

var app = http.createServer(function (req, res) {
    reqCount++;
    res.on('finish', function () { reqCount--; });
    res.end('Hello');
});
This lets you check whether reqCount is zero when you come to close the server.
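For example, a minimal sketch of how that counter could drive the shutdown, building on the snippet above (the 100 ms polling interval and the poll variable are my own additions, purely for illustration):

process.on('SIGINT', function () {
    app.close(); // stop accepting new connections; idle keep-alive sockets may linger
    var poll = setInterval(function () {
        if (reqCount === 0) { // nothing in flight, safe to exit
            clearInterval(poll);
            process.exit(0);
        }
    }, 100);
});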
The correct thing to do, though, is probably to not care about the old server and just start a new one. Usually the restart is to pick up new code, so you can start a fresh process without waiting for the old one to end, optionally using the child_process module to have a top-level script managing the whole thing. Or even use the cluster module, which lets you start the new process before you've even shut down the old one (since cluster manages balancing traffic between its child instances).
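A hedged sketch of that cluster approach, assuming a SIGHUP triggers the handover and port 3000 is the service port (both are my choices for illustration):

var cluster = require('cluster');
var http = require('http');

if (cluster.isMaster) {
    var worker = cluster.fork();
    // on SIGHUP, start a replacement worker before retiring the old one
    process.on('SIGHUP', function () {
        var replacement = cluster.fork();
        replacement.on('listening', function () {
            worker.disconnect(); // old worker stops accepting work and exits when idle
            worker = replacement;
        });
    });
} else {
    http.createServer(function (req, res) {
        res.end('Hello from worker ' + cluster.worker.id);
    }).listen(3000);
}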
One thing I haven't actually tested very far is whether it's guaranteed safe to start a new server as soon as server.close() returns. If not, the new server could potentially fail to bind. There's an example in the server.listen() docs of how to handle such an EADDRINUSE error.
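Something in the spirit of that docs example, where newServer and the one-second retry delay are placeholders of mine:

newServer.on('error', function (e) {
    if (e.code === 'EADDRINUSE') {
        console.log('Address in use, retrying...');
        setTimeout(function () {
            newServer.close();
            newServer.listen(3000);
        }, 1000);
    }
});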
Related
I'm developing a Node.js application in which a user signs up in three levels.
The problem is that when a user encounters an error at any of the levels, the whole server resets and the rest of the user sessions get closed.
The reason the server restarts is that I'm using forever start app to launch it; otherwise it goes down completely.
How do I stop the server from going down completely when just one user encounters an error?
How can I run each user in an individual thread (by "thread" I mean an isolated environment)?
Considering #Michael's answer:
"unfortunately that's not how Node works. An unhandled error can cause the server to restart."
The only way is to handle exceptions properly. You can check that answer's comments about how to handle an uncaughtException:
var handleRequest = function (req, res) {
    res.writeHead(200);
    res.end('Hello, World!\n');
};

var server = require('http').createServer(handleRequest);

process.on('uncaughtException', function (ex) {
    // do something with the exception
});

server.listen(8080);
console.log('Server started on port 8080');
I'm trying to show some information before the application is closed, so I created an event handler that fires when the server receives a SIGINT. This code works if no connection is ever made. However, if there has been a connection to localhost:4040, the server never closes, because it thinks there is still an active connection (the connection count is 1). The part I don't understand is why Node.js still thinks there is an active connection when the request has already finished. Is there any way to kill a connection, or is my current way of closing the request wrong?
Here is the code of What I'm trying to accomplish:
var http = require('http');

var server = http.createServer(function (req, res) {
    res.end('test');
}).listen(4040);

process.on('SIGINT', function () {
    server.getConnections(function (err, count) {
        console.log('connection:' + count);
    });
    server.close(function () {
        process.exit();
    });
});
Some ideas I have:
saving a copy of the sockets I receive and close them individually
Hope someone can give me some advice on solving this. Thanks.
store all the sockets you have and close them manually (sketched below)
use process.exit() and destroy the entire process instead
use domains and do a scary deprecated domain.dispose()
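A minimal sketch of the first option, tracking live sockets and destroying them on shutdown (the sockets array and its bookkeeping are my own naming, not from the answer above):

var http = require('http');

var sockets = [];
var server = http.createServer(function (req, res) {
    res.end('test');
}).listen(4040);

// remember every connection, and forget it once it closes on its own
server.on('connection', function (socket) {
    sockets.push(socket);
    socket.on('close', function () {
        sockets.splice(sockets.indexOf(socket), 1);
    });
});

process.on('SIGINT', function () {
    server.close(function () {
        process.exit(0);
    });
    // tear down whatever keep-alive sockets are still hanging around
    sockets.forEach(function (socket) {
        socket.destroy();
    });
});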
Suppose I have some unit tests that test a web server. For reasons I don't want to discuss here (outside scope ;-)), every test needs a newly started server.
As long as I don't send a request to the server, everything is fine. But once I do, a call to the http server's close function does not work as expected, as all made requests result in kept-alive connections, hence the server waits for 120 seconds before actually closing.
Of course this is not acceptable for running the tests.
At the moment, the only solutions I can see are either
setting the keep-alive timeout to 0, so a call to close will actually close the server,
or to start each server on a different port, although this becomes hard to handle when you have lots of tests.
Any other ideas of how to deal with this situation?
PS: I asked How do I shutdown a Node.js http(s) server immediately? a while ago and found a viable workaround, but it seems that workaround does not run reliably in every case, as I get strange results from time to time.
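Regarding the first idea: rather than tuning a timeout, a related sketch (assuming you control the server under test) is to disable keep-alive outright by sending a Connection: close header on every response, so no idle sockets hold up close():

var http = require('http');

var server = http.createServer(function (req, res) {
    // tell the client not to keep the connection alive
    res.setHeader('Connection', 'close');
    res.end('Hello');
});

With every response carrying Connection: close, each socket disappears as soon as its request finishes, so server.close() can complete promptly.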
function createOneRequestServer() {
    var server = http.createServer(function (req, res) {
        res.write('write stuff');
        res.end();
        // close the server as soon as the single request has been served
        server.close();
    }).listen(8080);
}
You could also consider forking a child process and killing it after you have run your test against it.
var fork = require('child_process').fork;

var child = fork('serverModuleYouWishToTest.js');

function callback(signalCode) {
    child.kill(signalCode);
}

// run your test, then invoke the callback to kill the child server
runYourTest(callback);
This method is desirable because it does not require you to write special cases of your servers to service only one request, and keeps your test code and your production code 100% independent.
I'm trying to make a simple HTTP server that can be paused and resumed. I've looked at the Node.js API here: http://nodejs.org/docs/v0.6.5/api/http.html
but that didn't help me. I've tried removing the listener on the 'request' event and adding it back; that works, but the listen callback gets called one more time with every pause/resume cycle. Here is some code I wrote:
var httpServer = require('http').Server();
var resumed = 0;

function ListenerHandler() {
    console.log('[-] HTTP Server running at 127.0.0.1:2525');
}

function RequestHandler(req, res) {
    res.writeHead(200, {'Content-Type': 'text/plain'});
    res.end('Hello, World');
}

function pauseHTTP() {
    if (resumed) {
        httpServer.removeAllListeners('request');
        httpServer.close();
        resumed = 0;
        console.log('[-] HTTP Server Paused');
    }
}

function resumeHTTP() {
    resumed = 1;
    httpServer.on('request', RequestHandler);
    httpServer.listen(2525, '127.0.0.1', ListenerHandler);
    console.log('[-] HTTP Server Resumed');
}
I don't know quite what you're trying to do, but I think you're working at the wrong level to do what you want.
If you want incoming connection requests to your web server to block until the server is prepared to handle them, you need to stop calling the accept(2) system call on the socket. (I cannot imagine that node.js, or indeed any web server, would make this task very easy. The request callback is doubtless called only when an entire well-formed request has been received, well after session initiation.) Your operating system kernel would continue accepting connections up until the maximum backlog given to the listen(2) system call. On slow sites, that might be sufficient. On busy sites, that's less than a blink of an eye.
If you want incoming connection requests to your web server to be rejected until the server is prepared to handle them, you need to close(2) the listening socket. node.js makes this available via the close() method, but that will tear down the state of the server. You'll have to re-install the callbacks when you want to run again.
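If it helps, here is a rough sketch of that pause/resume cycle applied to the code in the question, using once() for the listening callback so it doesn't accumulate across cycles (untested against the 0.6 API the question links to):

var http = require('http');
var httpServer = http.createServer();

function requestHandler(req, res) {
    res.writeHead(200, {'Content-Type': 'text/plain'});
    res.end('Hello, World');
}

function resumeHTTP() {
    httpServer.on('request', requestHandler);
    // once() keeps the listening callback from piling up across pause/resume cycles
    httpServer.once('listening', function () {
        console.log('[-] HTTP Server Resumed');
    });
    httpServer.listen(2525, '127.0.0.1');
}

function pauseHTTP() {
    httpServer.removeAllListeners('request');
    httpServer.close(function () {
        console.log('[-] HTTP Server Paused');
    });
}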
I'm making some long polling with node.js.
Basically, the Node.js server accepts a request from the user and then checks for updates. If there are no updates, it checks again after a timeout.
But what if the user has closed the tab or gone to another page? In my case, the script continues working.
Is there a way in node.js to check or detect or to catch an event when user has aborted his request (closed the connection)?
You need to use req.on('close', function(err) { ... }); instead of req.connection.on('close', function(err) { ... });
There is a very important distinction. req.on() adds a listener to this request, while with req.connection.on() you add a listener to the (keep-alive) connection between the client and the server. If you use req.connection.on(), every time the client re-uses a connection, you add one more listener to the same connection. When the connection is finally aborted, all listeners are fired.
Function scoping typically keeps this from screwing up your server logic, but it's a dangerous thing nevertheless. Fortunately, at least Node.js 0.10.26 is smart enough to warn the user of this:
(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace:
at Socket.EventEmitter.addListener (events.js:160:15)
at Socket.Readable.on (_stream_readable.js:689:33)
...
Thanks to Miroshko's and yojimbo87's answers I was able to catch the 'close' event, but I had to make some additional tweaks.
The reason why just catching the 'close' event wasn't fixing my problem is that when the client sends a request to the Node.js server, the server itself can't tell whether the connection is still open until it sends something back to the client (as far as I understood, this is because of the HTTP protocol).
So, the additional tweak was to write something to the response from time to time.
One more thing that was preventing this from working is that I had the 'Content-type' set to 'application/json'. Changing it to 'text/javascript' made it possible to stream 'white spaces' from time to time without closing the connection.
In the end, I had something like this:
var server = http.createServer(function (req, res) {
    res.writeHead(200, {'Content-type': 'text/javascript'});

    req.connection.on('close', function () {
        // code to handle connection abort
    });

    /**
     * Here goes some long polling handler
     * that performs res.write(' '); from time to time
     */

    // some other code...
});

server.listen(NODE_PORT, NODE_LISTEN_HOST);
My original code is much bigger, so I had to cut it a lot just to show the sensitive parts.
I'd like to know if there are better solutions, but this is working for me at the moment.
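For what it's worth, the periodic write hinted at in the comment above might look something like this (the five-second interval is an arbitrary choice of mine):

var server = http.createServer(function (req, res) {
    res.writeHead(200, {'Content-type': 'text/javascript'});

    // heartbeat: writing periodically is what lets the server notice a dead client
    var heartbeat = setInterval(function () {
        res.write(' ');
    }, 5000);

    req.connection.on('close', function () {
        clearInterval(heartbeat); // stop writing once the client is gone
        // code to handle connection abort
    });
});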
Is there a way in node.js to check or detect or to catch an event when
user has aborted his request (closed the connection)?
You can try to use the http.ServerRequest 'close' event. Simple example:
var http = require("http"),
util = require("util");
var httpServer = http.createServer(function(req, res) {
util.log("new request...");
// notify me when client connection is lost
req.on("close", function(err) {
util.log("request closed...");
});
// wait with response for 15 seconds
setTimeout(function() {
res.writeHead(200, {'Content-Type': 'text/plain'});
res.write("response");
res.end();
util.log("response sent...");
}, 15000);
});
httpServer.listen(8080);
util.log("Running on 8080");
I'm using Express.js (~4.10.6) and the following code is working fine for me:
// GET request:
app.get('/', function (req, res) {
    req.on('close', function () {
        console.log('Client closed the connection');
    });
});
As soon as I close the browser's tab, the browser closes the connection, and the callback function gets executed as expected.
It seems that your question is very similar to this one:
NodeJS HTTP request connection's close event fired twice
try:
request.connection.on('close', function () {
    ...
});