MongoDB connections not releasing after query - node.js

I made a Node app using MongoDB and Socket.IO, and after I insert some documents, every subsequent operation stops working. I figured out that MongoDB stops inserting documents after a certain number have been inserted (I don't know how many exactly). So I checked the connections in the mongodb shell:
db.serverStatus().connections
Before starting the Node app, it says:
{ "current" : 1, "available" : 3275, "totalCreated" : NumberLong(639) }
After inserting some docs:
{ "current" : 51, "available" : 3225, "totalCreated" : NumberLong(708) }
After shutting down the Node app:
{ "current" : 1, "available" : 3275, "totalCreated" : NumberLong(708) }
This is the server-side code. (I'm using an external MongoDB module, so it can be a little different from the original MongoDB module for Node.js; this module is just a simple wrapper around MongoDB with a Promise-based API.)
const app = require('http').createServer();
const io = require('socket.io')(app);
const am2 = require('am2');
...
am2.connect('localhost', '...', { ... });

const sockets = {};

io.on('connection', (socket) => {
  // save the socket into the sockets object
  sockets[socket.id] = socket;

  // release the socket on disconnection
  socket.on('disconnect', () => {
    delete sockets[socket.id];
  });
  ...
  // pseudo doc-inserting event
  socket.on('/insert/some/docs', (data) => {
    am2.insert({ ... }).then(() => {
      socket.emit('/after/insert/some/docs', {});
    }).catch( ... );
  });
});
When a client emits the '/insert/some/docs' event, the server inserts a document into MongoDB. The first few tries work well, but after some insertions it stops working.
I think this happens because lots of connections are still alive after the insertion is done, but I don't know why. If it were an RDBMS like MySQL, every connection would have to be closed after the operation is done, but in MongoDB that shouldn't be necessary (as far as I know).
I don't know why this is happening, so I would very much appreciate a hand.

I solved this by releasing the cursor after getting data from MongoDB. Make sure to release it, or it will keep holding a connection from the pool and eventually your application will stop working.
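For reference, a minimal sketch of what releasing a cursor looks like with the official mongodb driver (the collection name is illustrative; am2 is a wrapper, so its cursor API may differ):
// Minimal sketch with the official 'mongodb' driver; 'docs' is an illustrative collection name.
async function readSomeDocs(db) {
  const cursor = db.collection('docs').find({});
  const first = await cursor.next(); // read only part of the result set
  await cursor.close();              // release the cursor so its connection returns to the pool
  return first;
}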

Related

Scaffolding a Node.js app properly without Express (the app doesn't receive requests)

[This question is quite vague, I apologize for it. I'm trying to address my various troubles by answering the question myself.]
I am building a Node.js app which has to perform various tasks at given intervals. Here is the global scaffold (it involves bluebird promises and mongoose for DB interactions):
var Promise = require("bluebird");
var mongoose = require('mongoose');
mongoose.Promise = require('bluebird');

// Personal modules
var bootApp = require(...);
var doStuffA = require(...);
var doStuffB = require(...);
var doStuffC = require(...);

// running locally, but meant to be deployed at some point
mongoose.connect('mongodb://localhost:27017/myDatabase');
var db = mongoose.connection;

db.on('error', () => {
  console.log("Error: lost connection!");
  process.exit(1);
});

db.once('open', () => {
  bootApp() // always start by booting
    .then(() => { // then start the infinite loop of events
      setInterval(doStuffA, 1000 * 60 * 60); // 1x/1h
      setInterval(doStuffB, 1000 * 60 * 10); // 1x/10min
      setInterval(doStuffC, 1000 * 60 * 3);  // 1x/3min
    }).catch((e) => { // errors are handled by doStuffX(), so we should never catch anything here
      console.log(e.message);
      process.exit(1);
    });
});
Each doStuffX module is a function that returns a Promise, handles its own errors, and should finish at some point.
Expected behaviour for the entire app:
The app should be able to run forever.
The app should try to doStuffX() at the given interval, regardless of whether it succeeded or failed last time.
[Optional:] The app should shut down smoothly, without retrying any doStuff, upon receiving a "shut down" signal.
My question: how do I build a clean scaffold for such an app? Can I get rid of setInterval and use promises instead? One of my main concerns is making sure the previous instance of doStuffX() has finished before starting the next one, even if that involves "killing" it in some way.
I am open to any link about scaffolding apps, but PLEASE DO NOT GIVE ME AN ANSWER/LINK INVOLVING EXPRESS: I don't need Express, since my app doesn't receive any requests. (Everything I found so far starts with Express :/)
If you don't want to start the next doStuffX() until the previous one is done, then you can replace your setInterval() with repeated setTimeout() calls.
function runA() {
  setTimeout(function() {
    doStuffA().then(runA).catch(function(err) {
      // decide what to do differently if doStuffA has an error
    });
  }, 1000 * 60 * 60);
}
runA();
You could also add a timeout so that if doStuffA() doesn't respond within a certain amount of time, you take some other action. This would involve using another timer and a timeout flag, as sketched below.
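A minimal sketch of that idea (not from the original answer; the 30-minute limit and the retry-on-timeout behaviour are illustrative assumptions):
// Illustrative sketch: give doStuffA() 30 minutes, then move on without it.
function runAWithTimeout() {
  setTimeout(function() {
    var timedOut = false;

    var timer = setTimeout(function() {
      timedOut = true;
      // doStuffA() took too long: take some other action here, then reschedule
      runAWithTimeout();
    }, 1000 * 60 * 30);

    doStuffA().then(function() {
      if (timedOut) return;   // the timeout path already rescheduled the next run
      clearTimeout(timer);
      runAWithTimeout();
    }).catch(function(err) {
      if (timedOut) return;
      clearTimeout(timer);
      // decide what to do differently if doStuffA has an error
      runAWithTimeout();
    });
  }, 1000 * 60 * 60);
}
runAWithTimeout();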
[I answer my own question, trying to put here everything I changed afterwards, in case someone stumbles onto this page someday...]
For the Mongoose part of the scaffold, here is what I have so far for a reliable long-term DB connection:
The Mongoose documentation gives a fancy way to ensure the driver will never give up trying to reconnect, with reconnectTries.
I don't really understand socketOptions and keepalive, which seem related to replicas, so I leave them out of my code for now.
Since Mongoose should autoreconnect whenever something goes wrong, I'll keep db.once('open') as the entry point to the app code itself, even though I don't really understand the difference with db.on('connected') yet.
I recommend reading this.
var Promise = require("bluebird");
var mongoose = require('mongoose');
mongoose.Promise = require('bluebird');

// Personal modules
var bootApp = require(...);
var doStuffA = require(...);
var doStuffB = require(...);
var doStuffC = require(...);

// running locally, but meant to be deployed at some point
var uri = 'mongodb://localhost:27017/myDatabase';

// the added option makes sure the app will always try to reconnect...
mongoose.connect(uri, { server: { reconnectTries: Number.MAX_VALUE } });
var db = mongoose.connection;

db.on('error', () => {
  console.log("Error with Mongoose connection.");
});

db.once('open', () => {
  bootApp() // always start by booting
    .then(() => { // then start the infinite loop of events
      //////////////////////////////////
      /// Here goes the actual stuff ///
      //////////////////////////////////
    }).catch((e) => { // errors are handled by doStuffX(), so we should never catch anything here
      console.log(e.message);
    });
});
Now, for the actual repetitive stuff, my objective is to make sure everything runs smoothly and that no process gets stuck. About the changes I made:
The methods used are not native ES6 but are specific to bluebird. You can read about .timeout() and .delay(), which I find very useful for chaining timeouts and intervals in clean code.
In my mind, .then(runA, runA) should always launch ONE UNIQUE instance of runA, but I'm concerned about whether I could actually end up launching two parallel instances...
// Instead of using setInterval in a bluebird promised environment...
setInterval(doStuffA, 1000 * 60 * 60); // 1x/1h

// I would have liked a full promise chain, but as jfriend00 stated,
// it will end up crashing because the initial promise is never resolved...
function runA() {
  return doStuffA()
    .timeout(1000 * 60 * 30) // kill the running instance if it takes longer than 30min
    .delay(1000 * 60 * 60)   // wait 60min
    .then(runA, runA);       // whatever the outcome, restart the process
}
runA();

// Therefore, a solution like jfriend00's seems like the way to go:
function runA() {
  setTimeout(function() {
    doStuffA()
      .timeout(1000 * 60 * 30)
      .then(runA, runA);
  }, 1000 * 60 * 60);
}
runA();

Why mongoose opens two connections?

It's a simple file from the Mongoose quick-start guide:
mongoose.js
var mongoose = require('mongoose');
mongoose.connect('mongodb://localhost/Chat');

var userSchema = mongoose.Schema({
  name: String
});

var User = mongoose.model('User', userSchema);
var user = new User({ name: 'Andy' });

user.save(); // if I comment this out, mongoose keeps one connection
User.find({}, function(err, data) { console.log(data); }); // the same if I comment this out
I tried to use the db.once method, but the effect was the same.
Why does mongoose open the second connection in this case?
Mongoose uses the native mongo driver underneath, which in turn uses connection pooling - I believe the default is 5 connections (check here).
So your mongoose connection will use up to 5 simultaneous connections when it has simultaneous requests.
And since both user.save and User.find are asynchronous, they will be run simultaneously. So what your "program" tells node is:
1. Ok, you need to shoot a `save` request for this user.
2. Also, you need to fire this `find` request.
The node runtime then reads these, runs through the whole of your function (until a return), and then looks at its notes:
I was supposed to call this save
I also need to call this find
Hey, mongo native driver (which is written in C++) - here are two tasks for you!
And then the mongo driver fires the first request. It sees that it is allowed to open more than one connection, so it does, and fires the second request too, without waiting for the first to finish.
If you called find within a callback to save, they would run sequentially, and the driver would probably reuse the connection it already had.
Example:
// open the first connection
user.save(function(err) {
  if (err) {
    console.log('I always do this super boring error check:', err);
    return;
  }
  // Now that the first request is done, we fire the second one, and
  // we probably end up reusing the connection.
  User.find(/*...*/);
});
Or similarly, with promises:
user.save()
  .then(function() {
    return User.find(query);
  })
  .then(function(users) {
    console.log(users);
  })
  .catch(function(err) {
    // if either call fails, the error ends up here
    console.log(err);
  });
By the way, you can tell mongoose to use only one connection if you need to, for some reason:
let connection = mongoose.createConnection(dbUrl, {server: {poolSize: 1}});
That would be the gist of it.
Read more on MongoLab blog and Mongoose website.

Avoid query timeout in oriento (node.js driver for OrientDB)

Given the following node.js + oriento sample code, I run into a timeout, [OrientDB.ConnectionError [2]: read ETIMEDOUT], the first time I make a DB query after a longish inactivity period. Right after the timeout error, the connection is somehow re-initialized and the next query runs fine.
var oriento = require("oriento"),
    server = oriento({...}),
    db = server.use("users");

var getData = function(statement, opts, callback) {
  db.query(statement, opts).then(function(data) {
    callback(null, data);
  }).catch(callback);
};
So I have the following questions:
Is this the right way to go or should I call oriento({...}).use("users") every time I make a query rather than reusing the connection object?
If this is the right way, why is the connection not validated and refreshed automatically?
How can I manually check that I am not going to run into a timeout (i.e. validate the connection) and force a connection refresh?
Any suggestions better than the following fairly ugly hack of keeping the transport socket alive by pinging the DB every minute?
setInterval(function(db) {
  db.query("select from user").then(function(data) {
    console.log("still alive");
  }).catch(function(err) {
    console.error(err);
  });
}, 60000, db);
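No answer was recorded for this one. As a rough sketch only, building on the observation above that the query right after a timeout succeeds, one could retry a failed query once; the helper name and the error check below are assumptions, not part of oriento's documented API:
// Rough sketch (not from the original post): retry a query once when it fails
// with a read timeout. The 'looksLikeTimeout' check is an assumption about the
// error shape, not something oriento documents.
var getDataWithRetry = function(statement, opts, callback) {
  db.query(statement, opts).then(function(data) {
    callback(null, data);
  }).catch(function(err) {
    var looksLikeTimeout = /ETIMEDOUT/.test(String((err && err.message) || err));
    if (!looksLikeTimeout) {
      return callback(err);
    }
    // the failed query appears to re-initialize the connection, so try once more
    db.query(statement, opts).then(function(data) {
      callback(null, data);
    }).catch(callback);
  });
};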

Keeping open a MongoDB database connection

In so many introductory examples of using MongoDB, you see code like this:
var MongoClient = require('mongodb').MongoClient;

MongoClient.connect("mongodb://localhost:port/adatabase", function(err, db) {
  /* Some operation... CRUD, etc. */
  db.close();
});
If MongoDB is like any other database system, open and close operations are typically expensive time-wise.
So, my question is this: is it OK to simply do the MongoClient.connect("... once, assign the returned db value to some module global, have various functions in the module do various database-related work (insert documents into collections, update documents, etc.) when they're called by other parts of the application (and thereby re-use that db value), and then, when the application is done, only then do the close?
In other words, open and close are done once - not every time you need to do some database-related operation. You keep re-using that db object returned by the initial open/connect, and only dispose of it at the end, with the close, when you're actually done with all your database-related work.
Obviously, since all the I/O is async, before the close you'd make sure the last database operation has completed. Seems like this should be OK, but I wanted to double-check just in case I'm missing something, as I'm new to MongoDB. Thanks!
Yes, that is fine and typical behavior. Start your app, connect to the db, do operations against the db for a long time, maybe re-connect if the connection ever dies unexpectedly, and then just never close the connection (rely on the automatic close that happens when your process dies).
mongodb version ^3.1.8
Initialize the connection as a promise:
const MongoClient = require('mongodb').MongoClient
const uri = 'mongodb://...'
const client = new MongoClient(uri)
const connection = client.connect() // initialized connection
And then call the connection whenever you wish to perform an action on the database:
// if I want to insert into the database...
const connect = connection
connect.then(() => {
  const doc = { id: 3 }
  const db = client.db('database_name')
  const coll = db.collection('collection_name')
  coll.insertOne(doc, (err, result) => {
    if (err) throw err
  })
})
The currently accepted answer is correct in that you may keep the same database connection open to perform operations; however, it is missing details on how to retry connecting if the connection closes. Below are two ways to reconnect automatically. They're in TypeScript, but they can easily be translated into plain Node.js if you need to.
Method 1: MongoClient Options
The simplest way to allow MongoDB to reconnect is to define reconnectTries in the options passed to MongoClient. Any time a CRUD operation times out, the driver will use the parameters passed to MongoClient to decide how to retry (reconnect). Setting the option to Number.MAX_VALUE essentially makes it retry forever until it's able to complete the operation. You can check out the driver source code if you want to see which errors will be retried.
import { Db, MongoClient, MongoClientOptions } from 'mongodb';

class MongoDB {
  private db: Db;

  constructor() {
    this.connectToMongoDB();
  }

  async connectToMongoDB() {
    const options: MongoClientOptions = {
      reconnectInterval: 1000,
      reconnectTries: Number.MAX_VALUE
    };
    try {
      const client = new MongoClient('uri-goes-here', options);
      await client.connect();
      this.db = client.db('dbname');
    } catch (err) {
      console.error(err, 'MongoDB connection failed.');
    }
  }

  async insert(doc: any) {
    if (this.db) {
      try {
        await this.db.collection('collection').insertOne(doc);
      } catch (err) {
        console.error(err, 'Something went wrong.');
      }
    }
  }
}
Method 2: Try-catch Retry
If you want more granular control over reconnecting, you can use a try-catch with a while loop. For example, you may want to log an error when it has to reconnect, or do different things based on the type of error. This will also allow you to retry on more conditions than just the standard ones included with the driver. The insert method can be changed to the following:
async insert(doc: any) {
  if (this.db) {
    let isInserted = false;
    while (isInserted === false) {
      try {
        await this.db.collection('collection').insertOne(doc);
        isInserted = true;
      } catch (err) {
        // Add custom error handling if desired
        console.error(err, 'Attempting to retry insert.');
        try {
          await this.connectToMongoDB();
        } catch {
          // Do something if this fails as well
        }
      }
    }
  }
}
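A brief, hypothetical usage sketch for the class above (the document shape is made up; with Method 2, a failed insert keeps retrying until the connection comes back):
// Hypothetical usage of the MongoDB class sketched above.
const mongo = new MongoDB(); // the constructor kicks off the connection
// Later, once the connection is established, inserts go through the shared client:
mongo.insert({ createdAt: new Date(), message: 'hello' });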

Updating Mongo document from Node

So I am inserting a document into MongoDB like this:
io.sockets.on('connection', function (socket) {
  connectLog(socket.handshake.address.address);

  function connectLog(ipAddress) {
    db.collection('tracking', function (err, collection) {
      var currentTime = new Date();
      var doc = { "ip": ipAddress, "connect": currentTime, "culture": "en" };
      collection.insert(doc, function () { });
    });
  }
});
I have another event
function logout(id, disconnect) {
I'd like to update (or replace?) that record and add disconnect: (time) to it. How would I go about doing this? That way I can tell when a person connects and when they disconnect from the chat.
I am using socket.io, so I'll know their exact disconnect time.
Thank you in advance.
First, read this about explicit vs. implicit disconnects: Socket.io: How to handle closing connections?. Basically, your handler for explicit logouts (which are good!) should call the same code as your disconnect handler, in case the user doesn't get a chance to explicitly log out.
So in addition to your logout code, you'll also want:
socket.on('disconnect', handleDisconnect)
And in that disconnect/logout handler, you'll want to find the most recent connection document for that user and update it.
collection.findAndModify(
  { "address": address },                // same address
  [['connect', 'descending']],           // the most recent; findAndModify only changes the first doc
  { $set: { disconnect: currentTime } }, // set the disconnect time
  function(err, object) { /* deal with errors */ }
);
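For completeness, a small sketch (the event and handler names are illustrative, not from the original answer) of routing both the explicit logout and the implicit disconnect through the same code path:
// Illustrative sketch: both the explicit logout and the implicit disconnect
// end up in the same handler, which runs the findAndModify update shown above.
socket.on('logout', function() { handleDisconnect(socket); });
socket.on('disconnect', function() { handleDisconnect(socket); });

function handleDisconnect(socket) {
  var currentTime = new Date();
  var address = socket.handshake.address.address;
  collection.findAndModify(
    { "address": address },
    [['connect', 'descending']],
    { $set: { disconnect: currentTime } },
    function(err, object) { /* deal with errors */ }
  );
}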
