What's the best practice for MongoDB connections on Node.js?

This is something that is a bit unclear to me (I'm just getting started with Node and Mongo), and it really concerns me because of server performance and strain (which I guess is another question, but I'll get to that at the end of the post).
So, assuming I'm writing an API with Node.js and Restify, where each API endpoint corresponds to a function, should I:
a) open the db connection and store it in a global var, and then just use that in every function?
Example:
// requires and so on leave me with a db var, assume {auto_reconnect: true}
function openDB() {
    db.open(function(err, db) {
        // skip err handling and so on
        return db;
    });
}

var myOpenDB = openDB(); // use myOpenDB in every other function I have
b) open the db connection and then just put everything in one giant closure?
Example:
// same as above
db.open(function(err, db) {
    // do everything else here, for example:
    // (server is an instance of a Restify server)
    server.get('/api/dosomething', function doSomething(req, res, next) {
        // use the db object here and so on
    });
});
c) open and close the db each time it is needed?
Example:
// again, same as above
server.get('/api/something', function doSomething(req, res, next) {
    db.open(function(err, db) {
        // do something
        db.close();
    });
});

server.post('/api/somethingelse', function doSomethingElse(req, res, next) {
    db.open(function(err, db) {
        // do something else
        db.close();
    });
});
This last one is what I would do out of intuition, but at the same time I don't feel entirely comfortable doing it. Doesn't it put too much strain on the Mongo server, especially when (and I hope I do get there) it receives hundreds, if not thousands, of calls like this?
Thank you in advance.

I like MongoJS a lot. It lets you use Mongo in a very similar way to the default command line and it's just a wrapper over the official Mongo driver. You only open the DB once and specify which collections you'll be using. You can even omit the collections if you run Node with --harmony-proxies.
var db = require('mongojs').connect('mydb', ['posts']);

server.get('/posts', function (req, res) {
    db.posts.find(function (err, posts) {
        res.send(JSON.stringify(posts));
    });
});

Option A is not a great idea, since there is no guarantee that the DB will have finished opening before an HTTP request is handled (granted, this is very unlikely).
Option C is also not ideal, since it needlessly opens and closes the DB connection on every request.
The way that I like to handle this is using deferreds/promises. There are a bunch of different promise libraries available for Node but the basic idea is to do something like this:
// promise here is a deferred-style promise from a library, not the native ES6 Promise
var promise = new Promise();

db.open(function(err, db) {
    // handle err
    promise.resolve(db);
});

server.get('/api/something', function doSomething(req, res, next) {
    promise.then(function(db) {
        // do something
    });
});
I believe Mongoose handles connections in a way that vaguely resembles this.
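For comparison, here is the same idea with a native Promise instead of a deferred library. This is just a sketch; it assumes the same db object and Restify server from the snippets above.

// Sketch using the built-in Promise; `db` and `server` are assumed from the examples above.
var dbReady = new Promise(function (resolve, reject) {
    db.open(function (err, db) {
        if (err) return reject(err);
        resolve(db);
    });
});

server.get('/api/something', function doSomething(req, res, next) {
    dbReady.then(function (db) {
        // do something with db, then respond
        res.send({ ok: true });
        return next();
    }).catch(next);
});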

Related

NodeJS stop main function when nested callback throws an exception

I'm quite new to NodeJS, and I came across a construction that I can't wrap my head around. Consider the following code:
router.get("/", function(req, res, next) {
let db = new sqlite3.Database(dbname, sqlite3.OPEN_READONLY, (err) => {
if (err) next(err);
});
res.render("views/page");
}
If an error is raised by Database (for instance because dbname does not exist), the callback passes it to the Express error handler (next). But the code does not stop there: it continues to the next line and attempts to render something, which is problematic if the error handler also sends headers. If I add a return statement inside the callback, it only returns from the Database callback, not from the rest of the route handler.
My question: is there any way to prevent the rest of the code from being executed if an error is raised?
Please read the sqlite3 npm documentation. Ideally, you should not be opening a database connection on every request. Instead, you should open the database connection when the server is started, and share a reference to the db variable to be used in the API routes (gets/posts).
That said, if you just want to make your current code work, I believe the below should do it.
router.get("/", function(req, res, next) {
let db = new sqlite3.Database(dbname, sqlite3.OPEN_READONLY, (err) => {
if (err) return next(err);
res.render("views/page");
});
}
EDIT: JavaScript does not allow async/await when instantiating objects, so in this case I believe it would be better to just initialize the database connection outside the router method, like below, and use the error event to handle errors.
const db = new sqlite3.Database(dbname, sqlite3.OPEN_READONLY);
db.on('error', function(err) { /* handle error here */ });

router.get("/", function(req, res, next) {
    res.render("views/page");
});
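To show how the shared connection would then be used from a route, here is a minimal sketch; the items table and the render arguments are made up for illustration.

// Sketch only: `items` is a hypothetical table and the view name is illustrative.
router.get("/items", function (req, res, next) {
    db.all("SELECT * FROM items", [], function (err, rows) {
        if (err) return next(err); // forward to the Express error handler and stop here
        res.render("views/page", { items: rows });
    });
});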

How do I get a MongoDB import script to close the db after inserting results?

I am running a quick little nodejs script to find documents in one collection and insert them into another collection on the same DB. I came up with this, but it never closes, I think because the operations are still running asynchronously.
I have tried placing db.close() in various places and tried mongoClient.close(). No luck, which had me thinking about trying to force a timeout for the async call. I added a connection timeout, but it did not have the desired behaviour.
var MongoClient = require('mongodb').MongoClient
  , assert = require('assert');
const async = require("async");

// Connection URL
var url = 'mongodb://localhost:27017/sourceDB';

// Use connect method to connect to the Server
MongoClient.connect(url, {connectTimeoutMS: "5"}, (err, db) => {
    db.collection('source.collection', function(err, col) {
        assert.equal(null, err);
        col.find().forEach(function (data) {
            console.log(data);
            db.collection('destination.collection').insertOne(data, function(err, res) {
                assert.equal(null, err);
            });
            console.log("Moved");
        });
    });
});
The script does well and picks up the collection and inserts, but the connection remains open.
It is not recommended to explicitly close the connection as shown by this SO thread.
Rather, allow the client library to manage the connection for you.
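That said, if this is a one-off migration script that should exit on its own, one option is to wait for all the inserts to finish and then close the client; a sketch using the driver's promise API (collection names are taken from the question, a 3.x+ driver where connect() yields a client is assumed):

// Sketch of a one-off copy script that closes the connection when done.
const { MongoClient } = require('mongodb');
const url = 'mongodb://localhost:27017';

async function copy() {
    const client = await MongoClient.connect(url);
    try {
        const db = client.db('sourceDB');
        const docs = await db.collection('source.collection').find().toArray();
        if (docs.length > 0) {
            await db.collection('destination.collection').insertMany(docs);
        }
        console.log(`Moved ${docs.length} documents`);
    } finally {
        await client.close(); // now the script can exit
    }
}

copy().catch(console.error);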

Effective way to get data that's needed on all pages

I'm using nodejs and express, and I have a navigation menu that is built using data that is in mongodb. Currently I'm just making a call to the database to get a list of companies and passing that back inside each route. There doesn't seem to be a way to store this information in localstorage client side, so I'm wondering what the most effective way to handle this situation is. A sample of my code:
admin.get('/', function(req, res){
    mongodb.connect(url, function(err, db){
        var collection = db.collection('companies')
        collection.find({}).toArray(function(err, companies){
            // once the page is rendered I would like to save the company list to localstorage.
            res.render('adminview', {companies: companies})
        })
    })
})

admin.get('/:company', function(req, res){
    /* repeating code from above because I need this list */
    mongodb.connect(url, function(err, db){
        var collection = db.collection('companies')
        collection.find({}).toArray(function(err, companies){
            /* more code to do stuff to render the company page */
            res.render('companyadminview', {companies: companies, company: company})
        })
    })
})
I could be going about this the wrong way; I'm new to web development, and this feels wrong to me, but I can't figure out a different way.
So, first off you should be able to store it in localstorage or sessionstorage just fine, unless you're targeting browsers that don't support it.
That said, I think it's best not to, as the fact that you're storing it in the DB implies that it changes with enough frequency that you will get buggy clientside behavior if you cache it there for too long.
Instead, I'd just set up a middleware and attach the list to the locals object on a per-request basis, unless you want to do some kind of cache on the server (a simple cache sketch follows the middleware below):
app.use(function(req, res, next) {
    mongodb.connect(url, function(err, db) {
        if (err) return next(err);
        var collection = db.collection('companies')
        collection.find({}).toArray(function(err, companies) {
            if (err) return next(err);
            res.locals.companies = companies;
            next();
        });
    });
});
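If the company list changes rarely, you could also add a small in-memory cache inside the middleware so MongoDB isn't hit on every request. A sketch, assuming the same mongodb and url variables; the 60-second TTL is arbitrary:

// Sketch of the same middleware with a naive in-memory cache (TTL value is illustrative).
var cachedCompanies = null;
var cachedAt = 0;
var CACHE_TTL_MS = 60 * 1000;

app.use(function (req, res, next) {
    if (cachedCompanies && Date.now() - cachedAt < CACHE_TTL_MS) {
        res.locals.companies = cachedCompanies;
        return next();
    }
    mongodb.connect(url, function (err, db) {
        if (err) return next(err);
        db.collection('companies').find({}).toArray(function (err, companies) {
            if (err) return next(err);
            cachedCompanies = companies;
            cachedAt = Date.now();
            res.locals.companies = companies;
            next();
        });
    });
});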

Does mongodb require connection on each operation

So I am very new to mongodb and I wish to use it in my application. Now, I HATE redundant code, but reading up on how to use mongodb with node.js, it seems that there is a pattern where you always have to connect before making any CRUD operation.
Example from the official documentation:
MongoClient.connect(url, function(err, db) {
    assert.equal(null, err);
    insertDocument(db, function() {
        db.close();
    });
});
My question is: is it possible to make a middleware that keeps the connection open so you only have to call insertDocument (in the above example)?
Yes, of course: just keep the db variable around until you no longer need it, then call close().
var mdb;
MongoClient.connect(url, function(err, db) {
    assert.equal(null, err);
    mdb = db;
    insertDocument(mdb, function() {
        // ...
    });
});

// do other stuff with mdb (keep in mind mdb is only set once the connect callback has run)
You can also look into using Mongoose, since you mentioned middleware. The connection is only opened once (at the global scope) and then you can use it throughout the app.
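A common way to package this is a small module that connects once and hands the same db to every caller; a sketch (the file name, URL, and database name are illustrative):

// db.js - sketch of a connect-once helper (names and URL are illustrative)
var MongoClient = require('mongodb').MongoClient;
var url = 'mongodb://localhost:27017/mydb';

var cachedDb = null;

module.exports = function getDb(callback) {
    if (cachedDb) return callback(null, cachedDb);
    MongoClient.connect(url, function (err, db) {
        if (err) return callback(err);
        cachedDb = db;
        callback(null, db);
    });
};

// elsewhere, e.g. in a route:
// require('./db')(function (err, db) {
//     if (err) throw err;
//     insertDocument(db, function () { /* ... */ });
// });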

Disconnect PostgreSQL when client goes to new page

I'm trying to build a real-time web page and use PostgreSQL as my database. I use node.js and express to build the backend. Since this is a real-time web page that needs to update information very frequently, I keep a long-lived connection to PostgreSQL, which looks like:
app.get('/:A/:B', function(req, res){
    var A = req.params.A;
    var B = req.params.B;
    var client = new pg.Client(config[A][B]);
    client.connect(function(err){
        if (err) {
            console.log("Error occurred when try to connect the database", err);
        }
        else {
            console.log("Connected to the database");
        }
    });
    // Do some queries with current database connection...
});
The problem is, when I change the values of A and B in the browser and try to connect to a new database, I don't disconnect from the old one, so the info on my page still comes from the old database. I'm new to node and web development. Can anyone let me know how to disconnect from the old database when the client goes to a new URL?
I don't think creating a connection for each request is a good idea. If the number of A/B variants is limited, it is better to create a connection pool for each one at startup (see the pool sketch after the error middleware below). If you do connect per request, at least close the client when you are done:
app.get('/:A/:B', function(req, res, next){ // next is used to forward errors
    var A = req.params.A;
    var B = req.params.B;
    var client = new pg.Client(config[A][B]);
    client.connect(function(err){
        if (err)
            return next(err); // go to error middleware
        console.log("Connected to the database");
        // Do some queries with the current database connection...
        // Keep in mind that they're also asynchronous, so a better way is to use promises or async (https://github.com/caolan/async)
        client.end(function (err) {
            if (err)
                next(err);
        });
    });
});

// Error middleware
app.use(function(err, req, res, next) {
    console.log(req.url, err.message);
});
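And a sketch of the pooled alternative mentioned above, assuming config[A][B] holds valid pg connection settings and the set of A/B combinations is known at startup:

// Sketch: one pg.Pool per A/B config, created once at startup (pool keys are illustrative).
var pg = require('pg');
var pools = {};

Object.keys(config).forEach(function (A) {
    Object.keys(config[A]).forEach(function (B) {
        pools[A + '/' + B] = new pg.Pool(config[A][B]);
    });
});

app.get('/:A/:B', function (req, res, next) {
    var pool = pools[req.params.A + '/' + req.params.B];
    if (!pool) return next(new Error('Unknown database'));
    pool.query('SELECT NOW()', function (err, result) { // placeholder query
        if (err) return next(err);
        res.send(result.rows[0]);
    });
});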
