Node socket.io server crashes when it starts - node.js

I have a socket.io server and a client running correctly. Whenever the server goes down, the client tries to reconnect every 5 seconds. When the server is up again, they connect without problems.
The problem comes when I wait a long time before bringing the server back up: when the server comes up, it crashes showing:
info - socket.io started
debug - client authorized
info - handshake authorized DqN4t2YVP7NiqQi8zer9
debug - setting request GET /socket.io/1/websocket/DqN4t2YVP7NiqQi8zer9
debug - set heartbeat interval for client DqN4t2YVP7NiqQi8zer9
debug - client authorized for
debug - websocket writing 1::
buffer.js:287
pool = new SlowBuffer(Buffer.poolSize);
^
RangeError: Maximum call stack size exceeded
Client reconnection (executed every 5 seconds while the client is not connected):
function socket_connect() {
    if (!socket) {
        socket = io.connect('http://192.168.1.25:8088', { 'reconnect': false, 'connect timeout': 5000 });
    } else {
        socket.socket.connect();
    }
    socket.on("connect", function () {
        clearInterval(connect_interval);
        connect_interval = 0;
        socket.emit('player', { refresh_data: true });
    });
}
On the server side, even with nothing but the socket.io instance, it crashes:
var io = require('socket.io').listen(8088);
I think the problem is this: when the server comes back up, it receives all the connection attempts the client has been making every 5 seconds (15 hours disconnected * 60 m * 60 s / 5-second reconnect interval), and it crashes.
What can I do to close the connections that the server tries to handle?
PS: If I reload the client and then bring the server up, it works.

The main idea of socket.io.js is to reuse an existing connection.
You should connect only once and then exchange messages using socket.emit().
I am not sure why you are creating a new connection between your client and server every 5 seconds. There is a limit on the number of connections the server can create, but that should be more than enough; if you keep opening new ones in a loop, the server will eventually run out of sockets.
io.connect has to be executed once on the client; then, if you need to, you can socket.emit() every 5 seconds. Remove the { 'reconnect':false, 'connect timeout': 5000 } and you will be fine.
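For illustration, a minimal client-side sketch of that idea; the 'player' event and the 5-second period are taken from the question, and socket.socket.connected assumes the 0.9.x client API, so treat this as a sketch rather than a drop-in fix:
var socket = io.connect('http://192.168.1.25:8088');   // connect exactly once

socket.on('connect', function () {
    socket.emit('player', { refresh_data: true });      // initial request
});

// if you really need periodic traffic, emit on the existing connection
setInterval(function () {
    if (socket.socket.connected) {
        socket.emit('player', { refresh_data: true });
    }
}, 5000);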

I found the problem...
Each time the function socket_connect() is called, another socket.on("connect", ...) handler is registered. So when the server comes back up, a new connection is created, but the "connect" handler fires multiple times...
The solution was:
function socket_connect() {
    if (!socket) {
        socket = io.connect('http://192.168.1.25:8088', { 'reconnect': false, 'connect timeout': 5000 });
    } else {
        socket.socket.connect();
    }
}

socket.on("connect", function () {
    clearInterval(connect_interval);
    connect_interval = 0;
    socket.emit('player', { refresh_data: true });
});

Related

Node.JS event loop socket delayed or thread blocked

I have a problem with node.js when sending a lot of concurrent requests. The problem is that sometimes it seems to put some requests at the end of the event pool and gives me the response only after serious time (60+ seconds; normal is under 10 seconds).
The story goes like this: I have 3 scripts, a CONSOLE, a SERVER and 10 CLIENTS.
console.js
// sending message
client.connect(port, host, function (connect_socket) {
    client.sendMessage({ code: 301, /* ... some json-socket message ... */ });
    client.on('message', function (message) {
        if (message.code == 304) {
            console.log(/* ... print data ... */);
        }
    });
});
server.js
server = net.createServer(function (socket) {
    socket = new JsonSocket(socket);
    socket.on('message', function (message) {
        if (message.code == 301) {
            var client_random = get_random_client();
            client_random.sendMessage({ code: 302, /* ... some json-socket message ... */ });
        }
        if (message.code == 303) {
            var client_return = get_client_by_id(message.return_client_id);
            client_return.sendMessage({ code: 304, /* ... some json-socket message ... */ });
        }
    });
});
client.js
client.connect(port, host, function (connect_socket) {
    client.on('message', function (message) {
        if (message.code == 302) {
            execute_command(message.data, function (result_command) {
                client.sendMessage({ code: 303, /* ... some json-socket message (good or bad check) ... */ });
            });
        }
    });
});
Architecture concept: the console sends a message to the server, the server to a random client, the client executes an external program and sends the output back to the server, the server sends the response back to the console, and the console prints it.
console.js => server.js => client.js => server.js => console.js
I start the server and the clients connect with no problem. I open the console and type the command; every time I get the response in under 10 seconds.
Now, I made another PHP script to simulate 600 requests per second. I do the same thing: open the console, send the command, and once every 10 runs (1 of 10) the console waits, waits and waits, and only after 60 seconds does it give me the result (under 10 was normal).
I did some debugging and it seems that server.js does not trigger the message event when receiving from client.js and somehow puts it at the very end of the event pool; it is never forgotten, it does run eventually.
I have double-checked:
console.js sends the message to server.js (always instant)
server.js sends the message to client.js (always instant)
client.js sends the message to server.js (always instant)
[server.js does not fire the message event instantly, and puts it at the very end of the event pool]
server.js sends the message to client.js (always instant)
Also, I have checked for a possible I/O block on the main thread; everything is fine. All operations are async, no sync functions.
It is the kind of bug that sometimes manifests and sometimes doesn't. For example, while one console.js is waiting, you can open another terminal, run console.js, send messages and see it respond right away. As I already said, it happens with a probability of about 1 in 10.
How can I solve this? I have already done a lot of debugging.

Node.js connectListener still called on socket error

I'm having a weird issue with a TCP client - I use socket.connect() to connect to the server instance. However, since the server is not running, I receive an ECONNREFUSED error (so far so good).
I handle it using on('error') and set a timeout to try to reconnect in 10 seconds. This should keep failing as long as the server is down, which is the case.
However, as soon as the server is running, it looks like all of the previous sockets are still active, so now I have several client sockets connected to the server.
I tried calling destroy() at the beginning of the on('error') handler function.
Any ideas how to deal with that?
Thanks!
EDIT: Code snippet:
var mySocket;
var self = this;
...
var onError = function (error) {
    mySocket.destroy(); // this does not change anything...
    console.log(error);
    // Wait 10 seconds and try to reconnect
    setTimeout(function () {
        console.log("reconnecting...");
        self.once('InitDone', function () {
            // do something
            console.log("init is done");
        });
        self.init();
    }, 10000);
};
Inside init function:
...
console.log("trying to connect");
mySocket = tls.connect(options, function () {
    console.log("connected!");
    self.emit('InitDone');
});
mySocket.setEncoding('utf8');
mySocket.on('error', onError);
...
The result of this is something like the following:
trying to connect
ECONNREFUSED
reconnecting...
trying to connect
ECONNREFUSED
reconnecting...
trying to connect
ECONNREFUSED
reconnecting...
--> Starting the server here
trying to connect
connected
init is done
connected
init is done
connected
init is done
connected
init is done
However, I would expect only one connection, since the previous sockets failed to connect. Hope this clarifies the question.
Thanks!
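For what it's worth, here is a minimal sketch of one way to avoid stacking sockets and reconnect timers, reusing the tls, options and self names from the snippet above; this is an assumption about the cause, not a verified fix:
var mySocket = null;
var reconnectTimer = null;

function connect() {
    console.log("trying to connect");
    mySocket = tls.connect(options, function () {
        console.log("connected!");
        self.emit('InitDone');
    });
    mySocket.setEncoding('utf8');
    mySocket.on('error', function (error) {
        console.log(error);
        mySocket.destroy();        // drop the failed socket entirely
        mySocket = null;
        if (!reconnectTimer) {     // never schedule more than one pending retry
            reconnectTimer = setTimeout(function () {
                reconnectTimer = null;
                connect();
            }, 10000);
        }
    });
}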

How do I shutdown a Node.js http(s) server immediately?

I have a Node.js application that contains an http(s) server.
In a specific case, I need to shut down this server programmatically. What I am currently doing is calling its close() function, but this does not help, as it waits for any kept-alive connections to finish first.
So, basically, this shuts down the server, but only after a minimum wait of 120 seconds. But I want the server to shut down immediately - even if this means breaking off currently handled requests.
What I can not do is a simple
process.exit();
as the server is only part of the application, and the rest of the application should remain running. What I am looking for is conceptually something such as server.destroy(); or something like that.
How could I achieve this?
PS: The keep-alive timeout for connections is usually required, hence it is not a viable option to decrease this time.
The trick is that you need to subscribe to the server's connection event, which gives you the socket of each new connection. You need to remember this socket and, later on, directly after having called server.close(), destroy it using socket.destroy().
Additionally, you need to listen to the socket's close event to remove it from the bookkeeping if it leaves naturally because its keep-alive timeout runs out.
I have written a small sample application you can use to demonstrate this behavior:
// Create a new server on port 4000
var http = require('http');
var server = http.createServer(function (req, res) {
    res.end('Hello world!');
}).listen(4000);

// Maintain a hash of all connected sockets
var sockets = {}, nextSocketId = 0;
server.on('connection', function (socket) {
    // Add a newly connected socket
    var socketId = nextSocketId++;
    sockets[socketId] = socket;
    console.log('socket', socketId, 'opened');

    // Remove the socket when it closes
    socket.on('close', function () {
        console.log('socket', socketId, 'closed');
        delete sockets[socketId];
    });

    // Extend socket lifetime for demo purposes
    socket.setTimeout(4000);
});

// Count down from 10 seconds
(function countDown (counter) {
    console.log(counter);
    if (counter > 0)
        return setTimeout(countDown, 1000, counter - 1);

    // Close the server
    server.close(function () { console.log('Server closed!'); });

    // Destroy all open sockets
    for (var socketId in sockets) {
        console.log('socket', socketId, 'destroyed');
        sockets[socketId].destroy();
    }
})(10);
Basically, what it does is to start a new HTTP server, count from 10 to 0, and close the server after 10 seconds. If no connection has been established, the server shuts down immediately.
If a connection has been established and it is still open, it is destroyed.
If it had already died naturally, only a message is printed out at that point in time.
I found a way to do this without having to keep track of the connections or having to force them closed. I'm not sure how reliable it is across Node versions or if there are any negative consequences to this but it seems to work perfectly fine for what I'm doing. The trick is to emit the "close" event using setImmediate right after calling the close method. This works like so:
server.close(callback);
setImmediate(function(){server.emit('close')});
At least for me, this ends up freeing the port so that I can start a new HTTP(S) service by the time the callback is called (which is pretty much instantly). Existing connections stay open. I'm using this to automatically restart the HTTPS service after renewing a Let's Encrypt certificate.
If you need to keep the process alive after closing the server, then Golo Roden's solution is probably the best.
But if you're closing the server as part of a graceful shutdown of the process, you just need this:
var server = require('http').createServer(myFancyServerLogic);
server.on('connection', function (socket) {socket.unref();});
server.listen(80);
function myFancyServerLogic(req, res) {
req.connection.ref();
res.end('Hello World!', function () {
req.connection.unref();
});
}
Basically, the sockets that your server uses will only keep the process alive while they're actually serving a request. While they're just sitting there idle (because of a keep-alive connection), a call to server.close() will take the server down and let the process exit, as long as there's nothing else keeping the process alive. If you need to do other things after the server closes, as part of your graceful shutdown, you can hook into process.on('beforeExit', callback) to finish your graceful shutdown procedures.
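As an illustration of that last point, a small sketch of hooking beforeExit (finishGracefulShutdown is a made-up placeholder for your own cleanup, not part of the answer above):
process.on('beforeExit', function () {
    // fires once the event loop has nothing left to do, i.e. after
    // server.close() and once all unref'ed keep-alive sockets are idle
    finishGracefulShutdown();   // hypothetical cleanup function
});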
The https://github.com/isaacs/server-destroy library provides an easy way to destroy() a server with the behavior desired in the question (by tracking opened connections and destroying each of them on server destroy, as described in other answers).
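If memory serves, usage of that library looks roughly like this (check the repository's README before relying on the exact API):
var http = require('http');
var enableDestroy = require('server-destroy');

var server = http.createServer(function (req, res) {
    res.end('Hello world!');
}).listen(4000);

enableDestroy(server);   // patches the server with a destroy() method

// later, when the server should go away immediately:
server.destroy(function () {
    console.log('server and all of its connections are gone');
});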
As others have said, the solution is to keep track of all open sockets and close them manually. My node package killable can do this for you. An example (using Express, but you can use killable on any http.Server instance):
var killable = require('killable');
var app = require('express')();
var server;

app.route('/', function (req, res, next) {
    res.send('Server is going down NOW!');
    server.kill(function () {
        // the server is down when this is called. That won't take long.
    });
});

server = app.listen(8080);
killable(server);
Yet another Node.js package that performs a shutdown by killing connections: http-shutdown, which seems reasonably maintained at the time of writing (Sept. 2016) and worked for me on Node.js 6.x.
From the documentation
Usage
There are currently two ways to use this library. The first is explicit wrapping of the Server object:
// Create the http server
var server = require('http').createServer(function (req, res) {
    res.end('Good job!');
});

// Wrap the server object with additional functionality.
// This should be done immediately after server construction, or before you start listening.
// Additional functionality needs to be added for http server events to properly shutdown.
server = require('http-shutdown')(server);

// Listen on a port and start taking requests.
server.listen(3000);

// Sometime later... shutdown the server.
server.shutdown(function () {
    console.log('Everything is cleanly shutdown.');
});
The second is implicitly adding prototype functionality to the Server object:
// .extend adds a .withShutdown prototype method to the Server object
require('http-shutdown').extend();
var server = require('http').createServer(function (req, res) {
    res.end('Good job!');
}).withShutdown(); // <-- Easy to chain. Returns the Server object

// Sometime later, shutdown the server.
server.shutdown(function () {
    console.log('Everything is cleanly shutdown.');
});
My best guess would be to kill the connections manually (i.e. to forcibly close their sockets).
Ideally, this should be done by digging into the server's internals and closing its sockets by hand. Alternatively, you could run a shell command that does the same (provided the server has the proper privileges, etc.).
I have answered a variation of "how to terminate an HTTP server" many times on different Node.js support channels. Unfortunately, I couldn't recommend any of the existing libraries because they are lacking in one way or another. I have since put together a package that (I believe) handles all the cases expected of graceful HTTP server termination.
https://github.com/gajus/http-terminator
The main benefits of http-terminator are:
it does not monkey-patch Node.js API
it immediately destroys all sockets without an attached HTTP request
it allows graceful timeout to sockets with ongoing HTTP requests
it properly handles HTTPS connections
it informs connections using keep-alive that the server is shutting down by setting a connection: close header
it does not terminate the Node.js process
Usage:
import http from 'http';
import {
createHttpTerminator,
} from 'http-terminator';
const server = http.createServer();
const httpTerminator = createHttpTerminator({
server,
});
await httpTerminator.terminate();
Another approach (Koa, driven by process messages from a supervisor): once a stop message arrives, respond with a Connection: close header so keep-alive sockets drain as they are next used, and call server.close():
const Koa = require('koa')
const app = new Koa()

let keepAlive = true

app.use(async (ctx) => {
    let url = ctx.request.url
    // tell clients to drop their keep-alive sockets once shutdown has started
    if (keepAlive === false) {
        ctx.response.set('Connection', 'close')
    }
    switch (url) {
        case '/restart':
            ctx.body = 'success'
            process.send('restart')
            break;
        default:
            ctx.body = 'world-----' + Date.now()
    }
})

const server = app.listen(9011)

process.on('message', (data, sendHandle) => {
    if (data == 'stop') {
        keepAlive = false
        server.close();
    }
})
process.exit(code); // code 0 for success and 1 for fail

node.js + socket.IO - socket not reconnecting?

I know socket.io has a built-in feature for reconnecting and everything; however, I don't think it is working - and from what I have seen, it's not working for others either.
If a user puts their computer to sleep, it disconnects them, and then when they open it back up they are no longer connected, so they don't get any of the notifications or anything until they refresh the page. Perhaps it's just something that I'm not doing correctly?
var io = require('socket.io').listen(8080);
var users = {};
////////////////USER CONNECTED/////
console.log("Sever is now running");
io.sockets.on('connection', function (socket) {
//Tell the client that they are connected
socket.emit('connected');
//Once the users session is recieved
socket.on('session', function (session) {
//Add users to users variable
users[socket.id] = {userID:session, socketID:socket};
//When user disconnects
socket.on('disconnect', function () {
//socket.socket.connect();
var count= 0;
for(var key in users){
if(users[key].userID==session)++count;
if(count== 2) break;
}
if(count== 1){
socket.broadcast.emit('disconnect', { data : session});
}
//Remove users session id from users variable
delete users[socket.id];
});
socket.on('error', function (err) {
//socket.socket.connect();
});
socket.emit("connection") needs to be called when the user reconnects, or at least the events that happen in that event need to be called.
Also socket.socket.connect(); doesn't work, it returns with an error and it shuts the socket server down with an error of "connect doesn't exist".
The problem is related to the io.connect params.
Look at this client code (it will try to reconnect forever, with a max delay of 3 seconds between attempts):
ioParams = {'reconnection limit': 3000, 'max reconnection attempts': Number.MAX_VALUE, 'connect timeout':7000}
socketAddress = null
socket = io.connect(socketAddress, ioParams)
There are two important parameters there, related to your problem:
reconnection limit - caps the delay between reconnect attempts. Normally the delay keeps growing during a server outage.
max reconnection attempts - how many times the client will try. The default is 10; in most cases this is why the client stops trying.

How do you kill a redis client when there is no connection?

I have a valid server configuration in which redis can't be accessed, but the server can function correctly (I simply strip away features when redis can't be found).
However, I can't manage the connection errors well. I'd like to know when a connection attempt fails and shut down the client in that case.
I've found that the connection retries never stop. And quit() is effectively swallowed - "Queueing quit for next server connection." - when called.
Is there a way to kill the client in the case where no connection can be established?
var redis = require("redis"),
client = redis.createClient();
client.on("error", function(err) {
logme.error("Bonk. The worker framework cannot connect to redis, which might be ok on a dev server!");
logme.error("Resque error : "+err);
client.quit();
});
client.on("idle", function(err) {
logme.error("Redis queue is idle. Shutting down...");
});
client.on("end", function(err) {
logme.error("Redis is shutting down. This might be ok if you chose not to run it in your dev environment");
});
client.on("ready", function(err) {
logme.info("Redis up! Now connecting the worker queue client...");
});
ERROR - Resque error : Error: Redis connection to 127.0.0.1:6379 failed - connect ECONNREFUSED
ERROR - Redis is shutting down. This might be ok if you chose not to run it in your dev environment
ERROR - Resque error : Error: Redis connection to 127.0.0.1:6379 failed - connect ECONNREFUSED
ERROR - Resque error : Error: Redis connection to 127.0.0.1:6379 failed - connect ECONNREFUSED
ERROR - Resque error : Error: Redis connection to 127.0.0.1:6379 failed - connect ECONNREFUSED
ERROR - Resque error : Error: Redis connection to 127.0.0.1:6379 failed - connect ECONNREFUSED
One thing that is interesting is the fact that the 'end' event gets emitted. Why?
For v3.1.2 of the library
The right way to control the client's reconnect behaviour is to use a retry_strategy.
Upon disconnection, the redisClient will try to reconnect as per the default behaviour. The default behaviour can be overridden by providing a retry_strategy when creating the client.
Example usage of some fine-grained control, from the documentation:
var client = redis.createClient({
    retry_strategy: function (options) {
        if (options.error && options.error.code === 'ECONNREFUSED') {
            // End reconnecting on a specific error and flush all commands with
            // an individual error
            return new Error('The server refused the connection');
        }
        if (options.total_retry_time > 1000 * 60 * 60) {
            // End reconnecting after a specific timeout and flush all commands
            // with an individual error
            return new Error('Retry time exhausted');
        }
        if (options.attempt > 10) {
            // End reconnecting with built in error
            return undefined;
        }
        // reconnect after
        return Math.min(options.attempt * 100, 3000);
    }
});
Ref: https://www.npmjs.com/package/redis/v/3.1.2
For the purpose of killing the client when the connection is lost, we could use the following retry_strategy.
var client = redis.createClient({
retry_strategy: function (options) {
return undefined;
}
});
Update June 2022 (Redis v4.1.0)
The original answer was for an earlier version of the Redis client. Since v4, things have changed in the client configuration. Specifically, the retry_strategy is now called reconnectStrategy and is nested under the socket configuration option of createClient.
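To the best of my knowledge, the v4 equivalent of the snippets above looks roughly like this (verify against the current node-redis documentation before use):
const { createClient } = require('redis');

const client = createClient({
    socket: {
        reconnectStrategy: (retries) => {
            if (retries > 10) {
                // returning an Error stops further reconnect attempts
                return new Error('Retry attempts exhausted');
            }
            // otherwise return the delay in milliseconds before the next attempt
            return Math.min(retries * 100, 3000);
        }
    }
});

client.on('error', (err) => console.log('Redis client error', err));
client.connect();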
You might want to just forcibly end the connection to redis on error with client.end(), rather than using client.quit(), which waits for the completion of all outstanding requests and then sends the QUIT command - which, as you know, requires a working connection with redis to complete.
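Applied to the error handler from the question, that would look something like this (the flush argument to end() is how node_redis 2.x/3.x expects it; treat this as a sketch):
client.on("error", function (err) {
    logme.error("Resque error : " + err);
    // end(true) drops the connection immediately and rejects any
    // commands still queued, instead of waiting the way quit() does
    client.end(true);
});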
