Sharing server port with modules which update client via socket.io - node.js

I am implementing a node server which, in addition to serving pages, also consists of a set of sub-modules used to report data via socket.io. Each module is fairly independent of the core server - each module has a timer which processes some data and emits the results back to the web client(s). But due to the structuring of the code/modules, I'm running into a problem with how the port connection to the server is used/shared, and I was wondering if there is a recommended pattern for how to do this?
The server has a very basic setup and then requires the modules:
var app = require('http').createServer(handler);
app.listen(8888);
function handler (req,res) { ... }
// Here's where the sub-processing happens
var module1 = require('./module1.js');
var module2 = require('./module2.js');
...
var moduleN = require('./moduleN.js');
Then each module has the following structure:
// Socket stuff
var io = require('socket.io').listen(port); // Not sure how to share server port???
io.sockets.on('connection', onConnect);

function onConnect(socket) { ... }

function sendUpdateToClients(type, data) {
    io.sockets.emit(type, data);
}

// Timed stuff
setInterval(someProcessing, someInterval);

function someProcessing() {
    ... // process some data here
    sendUpdateToClients(type, data); // Emit the data to clients
}
I currently have this code running separately from my MEAN application, while I try to figure out how best to organize the code.
I guess my questions are:
- What are best practices for organizing sub-modules that push updates to clients like this?
- Should I be passing a socket reference from the server to each module? If so, how would this best be done?
- Or should I be returning something from the modules back to the server, so it does the updating? If so, how would this best be done?
- Should each module use their own port, separate from the server port?
- Or does this whole organization of code suck and is there a better way?

You can export your module as a function.
//module1.js
module.exports = function (port) {
    var io = require('socket.io').listen(port);
    io.sockets.on('connection', onConnect);

    function onConnect(socket) { ... }

    function sendUpdateToClients(type, data) {
        io.sockets.emit(type, data);
    }

    // Timed stuff
    setInterval(someProcessing, someInterval);

    function someProcessing() {
        ... // process some data here
        sendUpdateToClients(type, data); // Emit the data to clients
    }
};
Then inside your server file:
// server.js
var module1 = require('./module1.js')(8888);
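Note that with this approach each module still creates its own socket.io server, so each module needs its own port. If you instead want every module to share the one port the HTTP server already uses, a variation on the same pattern is to attach socket.io to that server once and pass the resulting io instance into each module. A minimal sketch, assuming the 0.9-era socket.io API used above (the module signature is illustrative, not from the original code):

// server.js
var app = require('http').createServer(handler);
var io = require('socket.io').listen(app); // piggybacks on the same server/port
app.listen(8888);

function handler(req, res) { /* serve pages */ }

// Each module receives the shared io instance instead of a port
var module1 = require('./module1.js')(io);

// module1.js
module.exports = function (io) {
    io.sockets.on('connection', function (socket) { /* per-client setup */ });

    setInterval(function () {
        // process some data, then push it to every connected client
        io.sockets.emit('update', { ts: Date.now() });
    }, 1000);
};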

Related

NodeJS on multiple processors (PM2, Cluster, Recluster, Naught)

I am investigating options for running node in a multi-core environment.
I'm trying to determine the best approach, and so far I've seen these options:
Use the built-in cluster library to spin up workers and respond to signals
Use PM2, but its -i (cluster mode) flag is listed as beta
Naught
Recluster
Are there other alternatives? What are folks using in production?
I've been using the default cluster library, and it has worked very well for me. I've had over 10,000 concurrent connections (multiple clusters on multiple servers) without trouble.
It is suggested to use cluster together with domain for error handling.
This is lifted straight from http://nodejs.org/api/domain.html. I've made some changes to how it spawns new workers for each core of your machine, got rid of the if/else, and added express.
var cluster = require('cluster'),
    http = require('http'),
    PORT = process.env.PORT || 1337,
    os = require('os'),
    server;

function forkClusters() {
    var cpuCount = os.cpus().length;
    // Create a worker for each CPU
    for (var i = 0; i < cpuCount; i += 1) {
        cluster.fork();
    }
}

// Master Process
if (cluster.isMaster) {
    // You can also of course get a bit fancier about logging, and
    // implement whatever custom logic you need to prevent DoS
    // attacks and other bad behavior.
    //
    // See the options in the cluster documentation.
    //
    // The important thing is that the master does very little,
    // increasing our resilience to unexpected errors.
    forkClusters();

    cluster.on('disconnect', function (worker) {
        console.error('disconnect!');
        cluster.fork();
    });
}

function handleError(d) {
    d.on('error', function (er) {
        console.error('error', er.stack);
        // Note: we're in dangerous territory!
        // By definition, something unexpected occurred,
        // which we probably didn't want.
        // Anything can happen now! Be very careful!
        try {
            // make sure we close down within 30 seconds
            var killtimer = setTimeout(function () {
                process.exit(1);
            }, 30000);
            // But don't keep the process open just for that!
            killtimer.unref();
            // stop taking new requests.
            server.close();
            // Let the master know we're dead. This will trigger a
            // 'disconnect' in the cluster master, and then it will fork
            // a new worker.
            cluster.worker.disconnect();
        } catch (er2) {
            // oh well, not much we can do at this point.
            console.error('Error sending 500!', er2.stack);
        }
    });
}

// child Process
if (cluster.isWorker) {
    // the worker
    //
    // This is where we put our bugs!
    var domain = require('domain');
    var express = require('express');
    var app = express();
    app.set('port', PORT);

    // See the cluster documentation for more details about using
    // worker processes to serve requests. How it works, caveats, etc.
    var d = domain.create();
    handleError(d);

    // Now run the handler function in the domain.
    //
    // Put all code here. Any code included outside of domain.run will
    // not handle errors on the domain level, but will crash the app.
    d.run(function () {
        // this is where we start our server
        server = http.createServer(app).listen(app.get('port'), function () {
            console.log('Cluster %s listening on port %s', cluster.worker.id, app.get('port'));
        });
    });
}
We use Supervisor to manage our Node.js processes, to start them upon boot, and to act as a watchdog in case a process crashes.
We use Nginx as a reverse proxy to load balance traffic between the processes, which listen on different ports;
this way each process is isolated from the others.
For example: Nginx listens on port 80 and forwards traffic to ports 8000-8003.
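A minimal sketch of that Nginx side (the upstream name and local addresses are illustrative, not from the original setup):

# nginx.conf - reverse proxy load-balancing across four node processes
upstream node_app {
    server 127.0.0.1:8000;
    server 127.0.0.1:8001;
    server 127.0.0.1:8002;
    server 127.0.0.1:8003;
}

server {
    listen 80;

    location / {
        proxy_pass http://node_app;
        proxy_set_header Host $host;
    }
}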
I was using PM2 for quite a while, but their pricing is expensive for my needs: I have my own analytics environment and I don't require support, so I decided to try alternatives. In my case, forever alone did the trick, and it's actually very simple:
forever -m 5 app.js
Another useful example is
forever start app.js -p 8080

How do I stop a Node.js HTTP server programmatically such that the process exits?

I'm writing some tests and would like to be able to start/stop my HTTP server programmatically. Once I stop the HTTP server, I would like the process that started it to exit.
My server is like:
// file: `lib/my_server.js`
var LISTEN_PORT = 3000

function MyServer() {
    http.Server.call(this, this.handle)
}

util.inherits(MyServer, http.Server)

MyServer.prototype.handle = function(req, res) {
    // code
}

MyServer.prototype.start = function() {
    this.listen(LISTEN_PORT, function() {
        console.log('Listening for HTTP requests on port %d.', LISTEN_PORT)
    })
}

MyServer.prototype.stop = function() {
    this.close(function() {
        console.log('Stopped listening.')
    })
}
The test code is like:
// file: `test.js`
var MyServer = require('./lib/my_server')
var my_server = new MyServer();

my_server.on('listening', function() {
    my_server.stop()
})

my_server.start()
Now, when I run node test.js, I get the stdout output that I expect,
$ node test.js
Listening for HTTP requests on port 3000.
Stopped listening.
but I have no idea how to get the process spawned by node test.js to exit and return back to the shell.
Now, I understand (abstractly) that Node keeps running as long as there are bound event handlers for events that it's listening for. In order for node test.js to exit to the shell upon my_server.stop(), do I need to unbind some event? If so, which event and from what object? I have tried modifying MyServer.prototype.stop() by removing all event listeners from it but have had no luck.
I've been looking for an answer to this question for months and I've never yet seen a good answer that doesn't use process.exit. It's quite strange to me that it is such a straightforward request but no one seems to have a good answer for it or seems to understand the use case for stopping a server without exiting the process.
I believe I might have stumbled across a solution. My disclaimer is that I discovered this by chance; it doesn't reflect a deep understanding of what's actually going on. So this solution may be incomplete or maybe not the only way of doing it, but at least it works reliably for me. In order to stop the server, you need to do two things:
Call .end() on the client side of every opened connection
Call .close() on the server
Here's an example, as part of a "tape" test suite:
var net = require('net');
var test = require('tape');

test('mytest', function (t) {
    t.plan(1);
    var server = net.createServer(function(c) {
        console.log("Got connection");
        // Do some server stuff
    }).listen(function() {
        // Once the server is listening, connect a client to it
        var port = server.address().port;
        var sock = net.connect(port);
        // Do some client stuff for a while, then finish the test
        setTimeout(function() {
            t.pass();
            sock.end();
            server.close();
        }, 2000);
    });
});
After the two seconds, the process will exit and the test will end successfully. I've also tested this with multiple client sockets open; as long as you end all client-side connections and then call .close() on the server, you are good.
http.Server#close
https://nodejs.org/api/http.html#http_server_close_callback
module.exports = {
    server: http.createServer(app) // Express App maybe ?
        .on('error', (e) => {
            console.log('Oops! Something happened', e);
            this.stopServer(); // Optionally stop the server gracefully
            process.exit(1); // Or violently
        }),

    // Start the server
    startServer: function() {
        Configs.reload();
        this.server
            .listen(Configs.PORT)
            .once('listening', () => console.log('Server is listening on', Configs.PORT));
    },

    // Stop the server
    stopServer: function() {
        this.server
            .close() // Won't accept new connections
            .once('close', () => console.log('Server stopped'));
    }
}
Notes:
"close" callback only triggers when all leftover connections have finished processing
Trigger process.exit in "close" callback if you want to stop the process too
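For instance, combining the two notes (a sketch using the server object from the snippet above):

// stop accepting new connections, then exit once the last
// in-flight connection has finished and 'close' fires
server.close();
server.once('close', () => process.exit(0));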
To cause the node.js process to exit, use process.exit(status) as described in http://nodejs.org/api/process.html#process_process_exit_code
Update
I must have misunderstood.
You wrote: "...but I have no idea how to get the process spawned by node test.js to exit and return back to the shell."
process.exit() does this.
Unless you're using the child_processes module, node.js runs in a single process. It does not "spawn" any further processes.
The fact that node.js continues to run even though there appears to be nothing for it to do is a feature of its "event loop" which continually loops, waiting for events to occur.
To halt the event loop, use process.exit().
UPDATE
After a few small modifications, such as the proper use of module.exports, addition of semicolons, etc., running your example on a Linux server (Fedora 11 - Leonidas) runs as expected and dutifully returns to the command shell.
lib/my_server.js
// file: `lib/my_server.js`
var util=require('util'),
http=require('http');
var LISTEN_PORT=3000;
function MyServer(){
http.Server.call(this, this.handle);
}
util.inherits(MyServer, http.Server);
MyServer.prototype.handle=function(req, res){
// code
};
MyServer.prototype.start=function(){
this.listen(LISTEN_PORT, function(){
console.log('Listening for HTTP requests on port %d.', LISTEN_PORT)
});
};
MyServer.prototype.stop=function(){
this.close(function(){
console.log('Stopped listening.');
});
};
module.exports=MyServer;
test.js
// file: `test.js`
var MyServer = require('./lib/my_server');
var my_server = new MyServer();
my_server.on('listening', function() {
my_server.stop();
});
my_server.start();
Output
> node test.js
Listening for HTTP requests on port 3000.
Stopped listening.
>
Final thoughts:
I've found that the conscientious use of statement-ending semicolons has saved me from a wide variety of pernicious, difficult to locate bugs.
While most (if not all) JavaScript interpreters provide something called "automatic semicolon insertion" (or ASI) based upon a well-defined set of rules (See http://dailyjs.com/2012/04/19/semicolons/ for an excellent description), there are several instances where this feature can inadvertently work against the intent of the programmer.
Unless you are very well versed in the minutia of JavaScript syntax, I would strongly recommend the use of explicit semicolons rather than relying upon ASI's implicit ones.
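For what it's worth, the classic example of ASI working against the programmer (not from the original answer, just the standard illustration) is a return statement split across lines:

// ASI inserts a semicolon right after `return`, so this
// function returns undefined, not the object literal below it.
function getConfig() {
    return
    {
        port: 3000
    };
}

console.log(getConfig()); // undefined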

How do I shutdown a Node.js http(s) server immediately?

I have a Node.js application that contains an http(s) server.
In a specific case, I need to shut down this server programmatically. What I am currently doing is calling its close() function, but this does not help, as it waits for any kept-alive connections to finish first.
So, basically, this shuts down the server, but only after a minimum wait time of 120 seconds. But I want the server to shut down immediately - even if this means breaking off currently handled requests.
What I can not do is a simple
process.exit();
as the server is only part of the application, and the rest of the application should remain running. What I am looking for is conceptually something such as server.destroy(); or something like that.
How could I achieve this?
PS: The keep-alive timeout for connections is usually required, hence it is not a viable option to decrease this time.
The trick is that you need to subscribe to the server's connection event, which gives you the socket of each new connection. You need to remember this socket and, later on, directly after having called server.close(), destroy it using socket.destroy().
Additionally, you need to listen to each socket's close event so you can remove it from the set of tracked sockets if it leaves naturally because its keep-alive timeout runs out.
I have written a small sample application you can use to demonstrate this behavior:
// Create a new server on port 4000
var http = require('http');

var server = http.createServer(function (req, res) {
    res.end('Hello world!');
}).listen(4000);

// Maintain a hash of all connected sockets
var sockets = {}, nextSocketId = 0;

server.on('connection', function (socket) {
    // Add a newly connected socket
    var socketId = nextSocketId++;
    sockets[socketId] = socket;
    console.log('socket', socketId, 'opened');

    // Remove the socket when it closes
    socket.on('close', function () {
        console.log('socket', socketId, 'closed');
        delete sockets[socketId];
    });

    // Extend socket lifetime for demo purposes
    socket.setTimeout(4000);
});

// Count down from 10 seconds
(function countDown (counter) {
    console.log(counter);
    if (counter > 0)
        return setTimeout(countDown, 1000, counter - 1);

    // Close the server
    server.close(function () { console.log('Server closed!'); });

    // Destroy all open sockets
    for (var socketId in sockets) {
        console.log('socket', socketId, 'destroyed');
        sockets[socketId].destroy();
    }
})(10);
Basically, what it does is to start a new HTTP server, count from 10 to 0, and close the server after 10 seconds. If no connection has been established, the server shuts down immediately.
If a connection has been established and it is still open, it is destroyed.
If it had already died naturally, only a message is printed out at that point in time.
I found a way to do this without having to keep track of the connections or having to force them closed. I'm not sure how reliable it is across Node versions or if there are any negative consequences to this but it seems to work perfectly fine for what I'm doing. The trick is to emit the "close" event using setImmediate right after calling the close method. This works like so:
server.close(callback);
setImmediate(function(){server.emit('close')});
At least for me, this ends up freeing the port so that I can start a new HTTP(S) service by the time the callback is called (which is pretty much instantly). Existing connections stay open. I'm using this to automatically restart the HTTPS service after renewing a Let's Encrypt certificate.
If you need to keep the process alive after closing the server, then Golo Roden's solution is probably the best.
But if you're closing the server as part of a graceful shutdown of the process, you just need this:
var server = require('http').createServer(myFancyServerLogic);

server.on('connection', function (socket) { socket.unref(); });
server.listen(80);

function myFancyServerLogic(req, res) {
    req.connection.ref();

    res.end('Hello World!', function () {
        req.connection.unref();
    });
}
Basically, the sockets that your server uses will only keep the process alive while they're actually serving a request. While they're just sitting there idly (because of a Keep-Alive connection), a call to server.close() will close the process, as long as there's nothing else keeping the process alive. If you need to do other things after the server closes, as part of your graceful shutdown, you can hook into process.on('beforeExit', callback) to finish your graceful shutdown procedures.
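A sketch of that last suggestion (the cleanup routine is hypothetical):

process.on('beforeExit', function () {
    // fires once the event loop drains, i.e. after server.close()
    // and the last ref'd socket have released the process
    finishGracefulShutdown(); // hypothetical cleanup routine
});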
The https://github.com/isaacs/server-destroy library provides an easy way to destroy() a server with the behavior desired in the question (by tracking opened connections and destroying each of them on server destroy, as described in other answers).
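Usage is roughly as follows, per that library's README:

var http = require('http');
var enableDestroy = require('server-destroy');

var server = http.createServer(function (req, res) {
    res.end('Hello world!');
});

// adds a server.destroy() method that closes the server
// and destroys all currently open connections
enableDestroy(server);
server.listen(4000);

// later, when you want it gone immediately:
server.destroy();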
As others have said, the solution is to keep track of all open sockets and close them manually. My node package killable can do this for you. An example (using express, but you can use killable on any http.Server instance):
var killable = require('killable');
var app = require('express')();

app.get('/', function (req, res) {
    res.send('Server is going down NOW!');
    server.kill(function () {
        // the server is down when this is called. That won't take long.
    });
});

var server = app.listen(8080);
killable(server);
Yet another nodejs package to perform a shutdown that kills connections: http-shutdown, which seems reasonably maintained at the time of writing (Sept. 2016) and worked for me on NodeJS 6.x.
From the documentation
Usage
There are currently two ways to use this library. The first is explicit wrapping of the Server object:
// Create the http server
var server = require('http').createServer(function(req, res) {
    res.end('Good job!');
});

// Wrap the server object with additional functionality.
// This should be done immediately after server construction, or before you start listening.
// Additional functionality needs to be added for http server events to properly shutdown.
server = require('http-shutdown')(server);

// Listen on a port and start taking requests.
server.listen(3000);

// Sometime later... shutdown the server.
server.shutdown(function() {
    console.log('Everything is cleanly shutdown.');
});
The second is implicitly adding prototype functionality to the Server object:
// .extend adds a .withShutdown prototype method to the Server object
require('http-shutdown').extend();

var server = require('http').createServer(function(req, res) {
    res.end('Good job!');
}).withShutdown(); // <-- Easy to chain. Returns the Server object

// Sometime later, shutdown the server.
server.shutdown(function() {
    console.log('Everything is cleanly shutdown.');
});
My best guess would be to kill the connections manually (i.e. to forcibly close their sockets).
Ideally, this should be done by digging into the server's internals and closing its sockets by hand. Alternatively, one could run a shell command that does the same (provided the server has the proper privileges, etc.).
I have answered a variation of "how to terminate a HTTP server" many times on different node.js support channels. Unfortunately, I couldn't recommend any of the existing libraries because they were lacking in one way or another. I have since put together a package that (I believe) handles all the cases expected of graceful HTTP server termination.
https://github.com/gajus/http-terminator
The main benefit of http-terminator is that:
it does not monkey-patch Node.js API
it immediately destroys all sockets without an attached HTTP request
it allows graceful timeout to sockets with ongoing HTTP requests
it properly handles HTTPS connections
it informs connections using keep-alive that the server is shutting down by setting a connection: close header
it does not terminate the Node.js process
Usage:
import http from 'http';
import { createHttpTerminator } from 'http-terminator';

const server = http.createServer();

const httpTerminator = createHttpTerminator({
    server,
});

await httpTerminator.terminate();
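If I remember the README correctly, you can also tune how long sockets with ongoing requests are given before being destroyed; treat the option name below as an assumption and check the linked repository:

// assumption: gracefulTerminationTimeout controls how long in-flight
// requests may run before their sockets are forcibly destroyed
const httpTerminator = createHttpTerminator({
    gracefulTerminationTimeout: 5000,
    server,
});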
const Koa = require('koa')
const app = new Koa()

let keepAlive = true

app.use(async (ctx) => {
    let url = ctx.request.url
    // destroy socket
    if (keepAlive === false) {
        ctx.response.set('Connection', 'close')
    }
    switch (url) {
        case '/restart':
            ctx.body = 'success'
            process.send('restart')
            break;
        default:
            ctx.body = 'world-----' + Date.now()
    }
})

const server = app.listen(9011)

process.on('message', (data, sendHandle) => {
    if (data == 'stop') {
        keepAlive = false
        server.close();
    }
})
process.exit(code); // code 0 for success and 1 for fail

Best way using events in node.js

Which is the best approach for listening for and emitting events in node.js?
I've been testing event emitting and listening in node.js by extending the model with EventEmitter, and I'm wondering whether this approach makes sense, since the events are only listened for while there is an instance of the model.
How can I make sure events are listened for the whole time the node app is alive?
Example of extending the model using EventEmitter:
// myModel.js
var util = require('util');
var events2 = require('events').EventEmitter;

var MyModel = function() {
    events2.call(this);

    // Create an event listener
    this.on('myEvent', function(value) {
        console.log('hi!');
    });
};

// inherit before adding prototype methods, so they aren't clobbered
util.inherits(MyModel, events2);

MyModel.prototype.dummyFunction = function(params) {
    // just a dummy function.
};

module.exports = MyModel;
EDIT: To put the question more clearly: how do I keep a permanent process that listens for events during the app's execution and has global scope (something like a running event manager that listens for events produced anywhere in the app)?
Would requiring the file myModel.js in app.js be a solution? How are these kinds of things solved in node.js?
I'm not entirely sure what you mean about events only being active when there is an instance of a model since without something to listen and react to them, events cannot occur.
Having said that, it is certainly reasonable to:
var EventEmitter = require('events').EventEmitter;

// Note: util.inherits expects a constructor function, so it cannot be
// applied to the existing `global` object; setting the prototype directly
// is the working equivalent of what the original snippet intended.
Object.setPrototypeOf(global, EventEmitter.prototype);

global.on('myEvent', function() {
    /* do something useful */
});
which would allow you to:
global.emit('myEvent')
or even:
var ee=new EventEmitter();
ee.on('myEvent',...)
As for how to properly use EventEmitter: it's defined as
function EventEmitter() {}
which does not provide for initialization, so it should be sufficient to:
var Thing=function(){};
util.inherits(Thing,EventEmitter);
which will extend instances of Thing with:
setMaxListeners(num)
emit(type,...)
addListener(type,listener) -- aliased as on()
once(type,listener)
removeListener(type,listener)
removeAllListeners()
listeners(type)
The only possible "gotcha" is that EventEmitter adds its own _events object property to any extended object (this), which suggests you should not give any of your own object properties the same name unless you want unexpected behavior.
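As for the EDIT in the question (a permanent, globally scoped event manager): a common pattern, not covered above, is to export a single shared EventEmitter instance from a module. Node's module cache returns the same instance from every require(), so the emitter lives for as long as the app does. A sketch (file names are illustrative):

// eventBus.js - one shared, app-lifetime emitter
var EventEmitter = require('events').EventEmitter;
module.exports = new EventEmitter();

// anywhere else in the app
var bus = require('./eventBus');

bus.on('myEvent', function (value) {
    console.log('got myEvent:', value);
});

bus.emit('myEvent', 42);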

Nodeunit Execution Order?

I am trying to test my web server using nodeunit:
test.js
exports.basic = testCase({
    setUp: function (callback) {
        this.ws = new WrappedServer();
        this.ws.run(PORT);
        callback();
    },
    tearDown: function (callback) {
        delete this.ws;
        callback();
    },
    testFoo: function(test) {
        var socket = ioClient.connect(URL);
        console.log('before client emit');
        socket.emit('INIT', 1, 1);
        console.log('after client emit');
    }
});
and this is my very simple nodejs server:
WrappedServer.prototype.run = function(port) {
    this.server = io.listen(port, {'log level': 2});
    this.attachCallbacks();
};

WrappedServer.prototype.attachCallbacks = function() {
    var ws = this;
    ws.server.sockets.on('connection', function(socket) {
        ws.attachDebugToSocket(socket);

        console.log('socket before attaching INIT');
        socket.on('INIT', function(userId, roomId) {
            // do something here
        });
        console.log('socket finished attaching INIT');
    });
}
Basically I am getting this error:
[...cts/lolol/nodejs/testing](testingServer)$ nodeunit ws.js
info - socket.io started
before client emit
after client emit
info - handshake authorized 1013616781193777373
The "sys" module is now called "util". It should have a similar interface.
socket before attaching INIT
socket finished attaching INIT
info - transport end
Somehow, the socket emits INIT BEFORE the server attaches callbacks for sockets.
Why is this happening? In addition, what's the right way to do this?
I'm assuming you were expecting the order to be this?
socket before attaching INIT
socket finished attaching INIT
before client emit
after client emit
From the small amount of code given, the issue is probably two things.
First, and probably the main issue, is that your ioClient.connect will not connect immediately. You need to pass some kind of callback (or listen for the client's connect event), emit INIT only once it has actually connected, and then execute the test's callback function.
Second, you should probably do the same thing with your run command. listen will not start listening immediately, so you're going to get inconsistent results occasionally if it hasn't started listening by the time your test executes. You should also pass the setUp's callback through to io.listen.
Update
To be clear for listen, just like most things in node, the socketio server's listen method is asynchronous. Calling the method tells it to start listening, but there is some time in the background where the server sets up the networking stuff to start listening. Just like node's core listen, http://nodejs.org/docs/latest/api/net.html#server.listen, socket.io's version takes a callback argument that is called once the server is up and listening.
io.listen(port, {'log level': 2}, callback);
Unless socket.io starts giving you errors about failing to connect, this probably is not an issue, but it is something to keep in mind. Treating asynchronous actions as if they were instantaneous is an easy way to make bugs that only come up occasionally. Since your run wraps listen, I think in general, not just for testing, passing a callback to run would be a very good idea.
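Putting both suggestions together, run and the test might look roughly like this (a sketch; 'connect' is socket.io's client-side connected event):

WrappedServer.prototype.run = function (port, callback) {
    // pass the callback through so callers know when listening has started
    this.server = io.listen(port, {'log level': 2}, callback);
    this.attachCallbacks();
};

exports.basic = testCase({
    setUp: function (callback) {
        this.ws = new WrappedServer();
        this.ws.run(PORT, callback); // setUp now waits for the server
    },
    testFoo: function (test) {
        var socket = ioClient.connect(URL);
        socket.on('connect', function () {
            // only emit once the client has actually connected
            socket.emit('INIT', 1, 1);
            test.done();
        });
    }
});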
