Error connecting mariadb - node.js?

I am using MariaDB in my node.js application and I have the following code:
var nodeMaria = require('node-mariadb');
var connection = nodeMaria.createConnection({
  driverType: nodeMaria.DRIVER_TYPE_HANDLER_SOCKET,
  host: 'localhost',
  port: 9998
});
connection.on('error', function(err){
  console.log(err);
  process.exit(1);
});
connection.on('connect', function(){
  console.log("mariadb connected");
});
Problem
After connecting to the db, it logs "mariadb connected".
After that, the application exits without throwing any error.
Note: I have handled errors with connection.on('error', function(){});
Any help would be great.

If the application is closing, it doesn't necessarily mean that an error occurred while connecting to MariaDB.
In fact, if there isn't anything keeping a node application explicitly open, it simply exits once the code has finished executing.
What keeps a node application open? Event listeners. If something is still listening for events, the process doesn't exit after finishing code execution. For example, an http.listen call starts a web server that keeps listening for incoming HTTP connections, which keeps the process alive.
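For example, a minimal sketch (the HTTP server and port are placeholder additions, not part of the original code): once the MariaDB connection is established, starting something long-lived such as an HTTP server keeps the process running.

var http = require('http');

connection.on('connect', function () {
  console.log("mariadb connected");

  // The listening server is an open handle, so the node process keeps
  // running instead of exiting once this callback returns.
  http.createServer(function (req, res) {
    res.end('ok');
  }).listen(3000);
});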

Related

Redis connection lost without any indication

I'm using a very simple redis pub-sub application, in which I have a redis server in AWS and a nodejs based redis client located inside an office LAN that subscribes to some channel.
This worked great until the network changed and it seems that some device is now interfering with outgoing connections (I also started receiving socket hangups on outbound SSH connections which I mitigated with the ServerAliveInterval 60 setting in the SSH config).
After the network change, whenever the redis client application is executed, it creates a redis client, subscribes to some channel and acts upon published messages in that channel.
It works okay for several minutes, but then it stops receiving any messages.
I registered the redis client to all known connection events (including the "error" event), added a "retry_strategy" handler and also modified the configuration to enable "socket_keepalive" and set "socket_initdelay" to 10 seconds (see code below).
Nevertheless, no event is triggered when the connection is interrupted.
When the application stops receiving the messages, I see that the connection on the redis port is still valid:
dev@server:~> sudo netstat -tlnpua | grep 6379
tcp 0 0 10.43.22.150:52052 <server_ip>:6379 ESTABLISHED 27014/node
I also captured a PCAP on port 6379 on which I don't see any resets or TCP errors, and it seems that from the connection perspective everything is valid.
I tried running another nodejs application from within the LAN in which I create a client that connects to the AWS redis server, registers to all events and only publishes messages once in a while.
After several minutes (in which the connection breaks), I try publishing another command and the error event handler is indeed triggered:
> client.publish("channel", "ANOTHER TRY")
true
> Error: Redis connection to <server_hostname>:6379 failed - read ECONNRESET
Redis connection ended
Redis reconnecting
Redis connected
Redis connection is ready
So if I try publishing via the client after the connection was interrupted, the connection event callbacks are indeed called and I can run some kind of reconnection logic.
But in the scenario in which I subscribe and wait for publishes to the channel, no connection event handler is called and the application is basically broken.
Application code:
const redis = require('redis');
const config = {
  "host": <hostname>,
  "port": 6379,
  "socket_keepalive": true,
  "socket_initdelay": 10
};
config.retry_strategy = function (options) {
  console.log("retry strategy. error code: " + (options.error ? options.error.code : "N/A"));
  console.log("options.attempt", options.attempt, "options.total_retry_time", options.total_retry_time);
  return 2000;
}
const client = redis.createClient(config);
client.on('message', function(channel, message) {
  console.log("Channel", channel, ", message", message);
});
client.on("error", function (err) {
  console.log("Error " + err);
});
client.on("end", function () {
  console.log("Redis connection ended");
});
client.on("connect", function () {
  console.log("Redis connected");
});
client.on("reconnecting", function () {
  console.log("Redis reconnecting");
});
client.on("ready", function () {
  console.log("Redis connection is ready");
});
const channel = "channel";
console.log("Subscribing to channel", channel);
client.subscribe(channel);
I'm using redis#2.8.0 and node v8.11.3.
The solution for this issue is quite sad.
First, there is indeed some network device between the redis client and server, which drops inactive connections after some timeout. It seems that this timeout is really low (several minutes).
Redis has a socket_keepalive configuration which is enabled by default, and its default value is Node.js's default socket keep-alive delay (which is 2 hours, if I'm not mistaken).
As can be seen above, I used a socket_initdelay configuration parameter that should have changed this default value, but unfortunately the code that uses this parameter isn't in the published redis npm package, only in the node-redis repository.
To summarize:
There is no configuration setting to change the keep-alive timeout value in redis#2.8.0 (the latest version at the time of writing).
You can either:
Use node-redis which accepts the socket_initdelay setting.
Modify the timeout manually by running the following:
const client = redis.createClient();
client.on("connect", function () {
  client.stream.setKeepAlive(true, <timeout_value_in_milliseconds>);
});
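Applied to the subscriber from the question, the manual workaround might look like the following sketch (the 10-second value is an illustrative assumption, matching the socket_initdelay the asker intended):

const redis = require('redis');
const client = redis.createClient({ "host": <hostname>, "port": 6379 });

client.on("connect", function () {
  // Override Node's default (~2 hour) keep-alive delay so TCP probes start
  // after 10 seconds of inactivity and the idle connection isn't dropped.
  client.stream.setKeepAlive(true, 10000);
});

client.on("message", function (channel, message) {
  console.log("Channel", channel, ", message", message);
});

client.subscribe("channel");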

Node.JS net module handling unexpected connection loss

I can't figure out a problem I'm having.
I'm using the Net module on my Node.JS server to listen for client connections.
The client connects to the server correctly and the connection remains available to read/write data. So far, so good. But when the client unexpectedly disconnects (e.g. when the internet connection drops on the client side), I want to fire an event server side.
In socket.io this would be done with the 'disconnect' event, but that event doesn't seem to exist for the Net module. How can this be done?
I've searched on Google/StackOverflow and in the Net documentation (https://nodejs.org/api/net.html) but I couldn't find anything useful. I'm sorry if I missed something.
Here is a code snippet I got:
var net = require('net');
var server = net.createServer(function(connection) {
  console.log('client connected');
  connection.wildcard = false; // Connection must be initialised with a configuration stored in the database
  connection.bidirectional = true; // When piped this connection will be configured as bidirectional
  connection.setKeepAlive(true, 500);
  connection.setTimeout(3000);
  connection.on('close', function () {
    console.log('Socket is closed');
  });
  connection.on('error', function (err) {
    console.log('An error happened in connection' + err.stack);
  });
  connection.on('end', function () {
    console.log('Socket did disconnect');
  });
  connection.on('timeout', function () {
    console.log('Socket did timeout');
    connection.end();
  });
  connection.on('data', function (data) {
    // Handling incoming data
  });
});
server.listen(40000, function () {
  console.log('server is listening');
});
None of the events (close, end, error, timeout) fire when I disconnect the client (by pulling out the UTP cable).
Thanks in advance!
EDIT:
I did add a timeout event in the code here above, but the only thing that happens is that the socket times out 3 seconds after every time the client connects again. Isn't KeepAlive enough to keep the socket from being idle? How is it possible to keep the socket from being idle without too much overhead? There may be more than 10,000 connections at the same time which must remain alive as long as they are connected (i.e. respond to the keepalive message).
Update:
I think KeepAlive is not related to the idle state of the socket, sort of.
Here is my test: I removed the following line from your example.
//connection.setKeepAlive(true, 500);
Then I tested this server with one client connecting to it via nc localhost 40000. If no message is sent to the server for 3 seconds, the server logs as below:
Socket did timeout
Socket did disconnect
Socket is closed
The timeout event is triggered without the KeepAlive setting.
For further investigation, refer to the Node.js source code:
function onread(nread, buffer) {
//...
self._unrefTimer();
We can see that the timeout is reset by the socket's onread() operation. Namely, if there is no read operation for 3 seconds, the timeout event will be emitted. To be more precise, not only onread but also a successful write will call _unrefTimer().
In summary, while there are read or write operations on the socket, it is NOT idle.
Actually, the close event is used to detect whether the client connection is alive or not, as also mentioned in this SO question.
Emitted when the server closes. Note that if connections exist, this event is not emitted until all connections are ended.
However, in your case
disconnect the client (by pulling out the UTP cable).
The timeout event should be used to detect connection inactivity. This only notifies that the socket has been idle; the user must manually close the connection. Please refer to this question.
In a TCP connection, the end event fires when the client sends a 'FIN' message to the server.
If the client side does not send a 'FIN' message, that event does not fire.
For example, in your situation,
But when the client unexpectedly disconnects (e.g. when the internet connection drops on the client side) I want to fire an event server side.
there may be no 'FIN' message because the internet connection is gone.
So you should handle this situation with the timeout event rather than keepAlive. If no data is coming in, you should end or destroy the socket.
EDIT: I did add a timeout event in the code here above, but the only thing that happens is that the socket times out 3 seconds after every time the client connects again. Isn't KeepAlive enough to keep the socket from being idle? How is it possible to keep the socket from being idle without too much overhead? There may be more than 10,000 connections at the same time which must remain alive as long as they are connected (i.e. respond to the keepalive message).
Regarding your edit: your devices should send the server some heartbeat message at regular intervals. That way the server understands that the device is alive, and the timeout event will not fire because data keeps arriving. If there is no heartbeat message, there is no way to handle this problem. A minimal sketch of this idea follows below.
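The sketch assumes a plain-text heartbeat, a 5-second send interval and a 10-second server timeout; these are illustrative values, not taken from the question.

var net = require('net');

// Server side: any data (real payload or heartbeat) resets the idle timer;
// sockets that stay silent past the timeout are destroyed.
var server = net.createServer(function (connection) {
  connection.setTimeout(10000);
  connection.on('data', function (data) {
    // handle real payloads here, ignore plain heartbeats
  });
  connection.on('timeout', function () {
    console.log('No data or heartbeat received, dropping connection');
    connection.destroy();
  });
  connection.on('error', function (err) {
    console.log('Connection error: ' + err.message);
  });
});
server.listen(40000);

// Client side: send a small heartbeat periodically so the server never sees
// the connection as idle while the device is alive.
var client = net.connect(40000, 'localhost', function () {
  var timer = setInterval(function () {
    client.write('heartbeat\n');
  }, 5000);
  client.on('close', function () {
    clearInterval(timer);
  });
});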

Socket.io client behavior when server is unreachable

I was wondering what the default behavior is when socket.io-client can't connect to the server and no callback is provided for the error.
Does the client indefinitely try to reconnect until it can reach the server?
I noticed that if I run this code on the client before launching the server, then as soon as the latter is started it receives the 'doSomething' event.
socket.on('connect', function () {
  socket.emit('doSomething', data);
  socket.destroy();
});
How can I prevent the server from receiving data emitted before it was started?
socket.on('connect_error', function () { socket.destroy(); });
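The reason this works: socket.io-client queues emits made while disconnected and flushes them once a connection succeeds, so destroying the socket on connect_error discards that queue before the server ever comes up. A slightly expanded sketch of the same pattern (guarding emits with socket.connected is an extra precaution I'm assuming, not part of the original answer):

var data = {}; // placeholder payload

var socket = io.connect('http://localhost:4004');

// Give up (and drop any queued emits) if the server isn't reachable.
socket.on('connect_error', function () {
  socket.destroy();
});

socket.on('connect', function () {
  // Emit only while actually connected, so nothing is queued for later replay.
  if (socket.connected) {
    socket.emit('doSomething', data);
  }
});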

geddy with socket.io error

I'm doing a realtime app with the geddy framework (the basic chat example), but I get an error when the client tries to establish the connection.
Here's the server-side code (in the init.js file):
var io = require('socket.io').listen(geddy.server);
io.sockets.on('connection', function (socket) {
  console.log("Good!");
  socket.emit('new', { message: 'world' });
  socket.on('newMessage', function (data) {
    console.log(data);
  });
});
and the client-side code:
$(document).ready(function(){
  startSockets();
});
function startSockets(){
  var socket = io.connect('http://localhost:4004');
  socket.on('new', function (data) {
    alert(data);
    //socket.emit('newMessage', { my: 'data' });
  });
}
When I try to connect to localhost:4004/ I get the following warning:
debug - setting request GET /socket.io/1/websocket/G_GapksVv1J4iBZIUVe3
debug - set heartbeat interval for client G_GapksVv1J4iBZIUVe3
debug - websocket writing 7:::1+0
warn - client not handshaken client should reconnect
info - transport end (error)
debug - set close timeout for client G_GapksVv1J4iBZIUVe3
debug - cleared close timeout for client G_GapksVv1J4iBZIUVe3
debug - cleared heartbeat interval for client G_GapksVv1J4iBZIUVe3
debug - discarding transport
Besides, the Chrome console gives this error:
WebSocket is closed before the connection is established.
I don't know what could be causing this. Any ideas?
The problem is that Socket.io needs to connect to the server after it's started, and the server doesn't start until after the code in init.js has run. The current (hacky) solution in the existing built-in RT code (as of Geddy v0.11) is to put this sort of code in an after_start.js file in the config directory, which Geddy runs after the server starts up. This should work as a workaround in this case too, where you're wiring up Socket.io to the server yourself.
This is obviously not ideal, and a major goal for v0.12 is fixing up the RT integration so it's much more usable and useful. If you have input into how you think this should look, definitely hit us up in IRC (#geddy on Freenode.net), or on the mailing list (https://groups.google.com/group/geddyjs).
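A sketch of that workaround, assuming the Socket.io wiring from init.js is simply moved into config/after_start.js (the file Geddy runs once the server is listening, as described above; the handlers are the same as in the question):

// config/after_start.js
// geddy.server is already listening at this point, so the handshake succeeds.
var io = require('socket.io').listen(geddy.server);

io.sockets.on('connection', function (socket) {
  console.log("Good!");
  socket.emit('new', { message: 'world' });
  socket.on('newMessage', function (data) {
    console.log(data);
  });
});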

node.js mongodb closing the connection

I am trying to use node.js with mongodb and following the tutorial at http://howtonode.org/express-mongodb
The code for opening the connection is
ArticleProvider = function(host, port) {
  this.db = new Db('node-mongo-blog', new Server(host, port, {auto_reconnect: true}, {}));
  this.db.open(function(){});
};
However, I cannot see any connections being closed.
But when I look at the logs on the mongo console, I can see connections which open and then close after some time.
Does the connection close automatically? Will it be a problem when a large number of clients try to access the server? Where should the connection be closed?
Thanks
Tuco
In that example application, only a single ArticleProvider object is created for the application to share when serving requests. That object's constructor opens a db connection that won't be closed until the application terminates (which is fine).
So what you should see is that you get a new mongo connection each time you start your app, but no additional connections made no matter how many clients access the server. And shortly after you terminate your app you should see its connection disappear on the mongo side.
node-mongodb-native provides a close method for Db objects and you can close your connection when you are finished by calling it.
var that = this;
this.db.open(function(){
  // do db work here
  // close the connection
  that.db.close();
});
If you don't close your connection, the event loop keeps the connection open and your process doesn't exit. If you are building a web server where your process will not be terminated, it's not necessary to close the connection.
A better reference for node-mongodb-native can be found on https://github.com/mongodb/node-mongodb-native.
Remember to put the db.close() call in the last callback that gets executed, so the connection stays open until all callbacks are finished. Otherwise, it gives an error like:
/usr/local/lib/node_modules/mongodb/lib/utils.js:97
process.nextTick(function() { throw err; });
^
Error
at Error.MongoError (/usr/local/lib/node_modules/mongodb/node_modules/mongodb-core/lib/error.js:13:17)
at Server.destroy (/usr/local/lib/node_modules/mongodb/node_modules/mongodb-core/lib/topologies/server.js:629:47)
at Server.close (/usr/local/lib/node_modules/mongodb/lib/server.js:344:17)
at Db.close (/usr/local/lib/node_modules/mongodb/lib/db.js:267:19)
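A minimal sketch of that ordering, assuming the classic callback API of node-mongodb-native used in the tutorial (the collection name and query are placeholders, not taken from the post):

var mongodb = require('mongodb');

var db = new mongodb.Db('node-mongo-blog',
    new mongodb.Server('localhost', 27017, {auto_reconnect: true}, {}));

db.open(function (err, db) {
  if (err) throw err;
  db.collection('articles', function (err, collection) {
    if (err) throw err;
    collection.find().toArray(function (err, docs) {
      if (err) throw err;
      console.log(docs);
      // This is the last callback to run, so closing here is safe.
      db.close();
    });
  });
});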
