Does mongoose allow for multiple database requests concurrently? - node.js

I read that Mongoose will only open one connection at maximum per collection, and there's no option to change this.
Does this mean that a slow mongo query will make all subsequent queries wait?
I know everything in node.js is non-blocking, but I'm wondering whether a slow query will delay the execution of all subsequent queries, and whether there is a way to change this.

It does use only one connection if you use the default method, where you call mongoose.connect(). To get around this, you can create multiple connections and then tie a model for the same schema to each connection.
Like so:
var conn = mongoose.createConnection('mongodb://localhost/test');
var conn2 = mongoose.createConnection('mongodb://localhost/test');
var model1 = conn.model('Model', Schema);
var model2 = conn2.model('Model', Schema);
model1.find({ /* long query */ }, function() {
    console.log("this will print out last");
});

model2.find({ /* short query */ }, function() {
    console.log("this will print out first");
});
Hope that helps.
Update
Hey, that does work. Updating from the comments: you can create a connection pool using createConnection, which lets you run multiple queries from the same model concurrently:
var conn = mongoose.createConnection('mongodb://localhost/test', {server:{poolSize:2}});
var model = conn.model('Model', Schema);
model.find({ /* long query */ }, function() {
    console.log("this will print out last");
});

model.find({ /* short query */ }, function() {
    console.log("this will print out first");
});
Update 2 -- Dec 2012
This answer may be slightly outdated now--I noticed I've been continuing to get upvotes, so I thought I would update it. The mongodb-native driver that mongoose wraps now has a default connection pool size of 5, so you probably don't need to explicitly specify it in mongoose.

Related

Why does my Azure Function to write to MongoDB Atlas have >70s cold start time?

I wonder whether there is a general problem in my function, which leads to such a long cold start. Mongoose has been installed as a dependency which might increase the time. But 70s?!? Come on...
Here is my code. Quite simple really. Just wanna write some stuff to MongoDB. I appreciate any feedback.
module.exports = function(context, req) {
    context.log("Function started!");

    // Database interaction.
    const mongoose = require('mongoose');
    const DATABASE = process.env.MongodbAtlas;

    // Connect to our Database and handle any bad connections
    mongoose.connect(DATABASE);
    mongoose.Promise = global.Promise; // Tell Mongoose to use ES6 promises
    mongoose.connection.on('error', (err) => {
        context.log(`ERROR→ ${err.message}`);
    });

    // Portfolio Schema.
    require('./portfolioModel');
    const Portfolio = mongoose.model('Portfolio');

    // Create a Portfolio object.
    var portfolio = new Portfolio();
    portfolio.fiat = "EUR";
    portfolio.token[0] = {
        crypto_ticker: "BTC",
        crypto_name: "Bitcoin",
        crypto_qty: 50,
        crypto_invested_sum: 9000
    };

    // Save to db.
    portfolio.save();
    context.done();
};
The cold start time was reduced to around 3s once I used the Azure Functions extension for VS Code. The extension automatically creates a package file and sets WEBSITE_RUN_FROM_PACKAGE=1 (some in-depth info regarding packaging).
Now "Functions as a Service" gets a lot more interesting.
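Separately from packaging, another common serverless pattern that can cut per-invocation overhead is caching the database connection outside the handler, so that warm invocations reuse it instead of reconnecting every time. A minimal sketch of the idea; the `connect` function here is a hypothetical injected stand-in for mongoose.connect, used only to keep the sketch self-contained:

```javascript
// A sketch of the "cached connection" pattern for serverless functions:
// the expensive connect runs at most once per container instance, and
// warm invocations reuse it. `connect` is a hypothetical stand-in for
// mongoose.connect, injected so the sketch is self-contained.
function makeHandler(connect, doWork) {
    let cachedConn = null; // survives across warm invocations of this container

    return async function handler(context) {
        if (!cachedConn) {
            cachedConn = connect(); // only the first (cold) invocation pays this cost
        }
        const conn = await cachedConn;
        return doWork(conn, context);
    };
}
```

Whether this helps depends on how often the platform reuses containers, but it avoids paying the connection cost on every warm request.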

How to make two db queries synchronous, so that if either of them fails then both fail, in node.js

async.parallel([
    function(callback) {
        con.Attandance.insert({'xxx': 'a'}, function(err, data) {
            console.log(data);
            callback(err);
        });
    },
    function(callback) {
        con.Profile.insert({'xxx': 'a'}, function(err) {
            callback(err);
        });
    }
], function(err) {
    console.log('Both a and b are saved now');
});
Attendance.insert() succeeds regardless of whether Profile.insert() succeeds or fails. I want that if either of them fails, no data should be saved in either collection, neither Attendance nor Profile.
What you mean are transactions, which have nothing to do with synchronous / asynchronous execution.
Unfortunately, MongoDB does not support multi-document transactions (they were only added much later, in MongoDB 4.0). To achieve something even remotely close, you have to either perform a two-phase commit or implement custom rollback logic that undoes the changes to Attandance if the changes to Profile fail.
The only possibility to at least achieve atomic (yet still not transactional!) updates is to change your model. If the Profile is a container for all Attandance instances, you can update the entire object at once. It is impossible to update more than one document atomically with MongoDB, and neither is it possible to guarantee a strict ordering of writes.
If you need that, go for an SQL database instead; pretty much all of them support transactions.
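To make the "custom rollback logic" idea concrete, here is a minimal sketch (not a real transaction, and not the full two-phase commit protocol): insert into the first collection, and if the second insert fails, delete the first document again. The `attendance` and `profile` objects are hypothetical collection-like stand-ins with promise-returning insertOne/deleteOne methods:

```javascript
// Best-effort rollback sketch: if the second insert fails, delete the
// document that was just inserted into the first collection, then rethrow.
// `attendance` and `profile` are hypothetical collection-like objects
// exposing promise-returning insertOne / deleteOne methods.
async function insertBoth(attendance, profile, doc) {
    const first = await attendance.insertOne(doc);
    try {
        await profile.insertOne(doc);
    } catch (err) {
        // Undo the first insert. Note this "rollback" can itself fail,
        // which is why this only approximates a transaction.
        await attendance.deleteOne({ _id: first.insertedId });
        throw err;
    }
}
```

Note that the rollback step can itself fail (e.g. if the connection drops between the two operations), which is exactly why this is only an approximation and why the two-phase commit pattern exists.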
I wrote a library that implements the two phase commit system (mentioned in a prior answer) described in the docs. It might help in this scenario. Fawn - Transactions for MongoDB.
var Fawn = require("Fawn");

// initialize Fawn
Fawn.init("mongodb://127.0.0.1:27017/testDB");

/**
  optionally, you could initialize Fawn with mongoose:

  var mongoose = require("mongoose");
  mongoose.connect("mongodb://127.0.0.1:27017/testDB");
  Fawn.init(mongoose);
**/

// after initialization, create a task
var task = Fawn.Task();

task.save("Attendance", {xxx: "a"})
    .save("Profile", {xxx: "a"})
    .run()
    .then(function(results) {
        // task is complete

        // result from first operation
        var firstUpdateResult = results[0];

        // result from second operation
        var secondUpdateResult = results[1];
    })
    .catch(function(err) {
        // Everything has been rolled back.
        // Log the error which caused the failure.
        console.log(err);
    });

Why mongoose opens two connections?

It's a simple file from the mongoose quick-start guide:
mongoose.js
var mongoose = require('mongoose');
mongoose.connect('mongodb://localhost/Chat');

var userSchema = mongoose.Schema({
    name: String
});

var User = mongoose.model('User', userSchema);
var user = new User({name: 'Andy'});

user.save(); // if I comment this out, mongoose keeps just one connection
User.find({}, function(err, data) { console.log(data); }); // same if I comment this out
I tried to use the db.once method, but the effect is the same.
Why does mongoose open a second connection in this case?
Mongoose uses the native mongo driver underneath, and it in turn uses connection pooling; I believe the default pool size is 5 connections.
So your mongoose connection will use up to 5 simultaneous connections when it has simultaneous requests.
And since both user.save and User.find are asynchronous, those will be done simultaneously. So what your "program" tells node:
1. Ok, you need to shoot a `save` request for this user.
2. Also, you need to fire this `find` request.
The node runtime then reads these and runs through the whole of your function (until a return). Then it looks at its notes:
I was supposed to call this save
I also need to call this find
Hey, mongo native driver (which is written in C++) - here are two tasks for you!
and then the mongo driver fires the first request. It sees that it is allowed to open more connections than one, so it does, and fires the second request too, without waiting for the first to finish.
If you called the find within a callback to save, it would be sequential, and the driver would probably reuse the connection it already had.
Example:
// open the first connection
user.save(function(err) {
    if (err) {
        console.log('I always do this super boring error check:', err);
        return;
    }
    // Now that the first request is done, we fire the second one, and
    // we probably end up reusing the connection.
    User.find(/*...*/);
});
Or similar with promises:
user.save().then(function() {
    return User.find(query).exec();
})
.then(function(users) {
    console.log(users);
})
.catch(function(err) {
    // if either fails, the error ends up here.
    console.log(err);
});
By the way, you can tell mongoose to use only one connection if you need to, for some reason:
let connection = mongoose.createConnection(dbUrl, {server: {poolSize: 1}});
That would be the gist of it.
Read more on the MongoLab blog and the Mongoose website.

How to do a atomic find and save using mongoose in Node.js

In my db, I have a "Thing", whose id is "53e5fec1bcb589c92f6f38dd".
I wanna update the count member in it every time doAdd is called, like the code below.
But since the find and the save are separate operations, I don't get the desired result.
Is there a best practice for this situation?
Thanks in advance!
var mongoose = require('mongoose');
mongoose.connect('mongodb://localhost/test');

var ThingSchema = new mongoose.Schema({
    count: Number
});

var Thing = mongoose.model('Thing', ThingSchema);

var doAdd = function(id) {
    Thing.findById(id, function(err, thing) {
        if (******) {
            // perform some logic...
            thing.count++;
        }
        thing.save();
    });
};

var id = "53e5fec1bcb589c92f6f38dd";
doAdd(id);
doAdd(id);
The problem is that you expect the count to increase by 2 because you call doAdd twice. But the code in doAdd is asynchronous, i.e. the second doAdd call runs before the first find-and-save cycle has completed. The results are unpredictable because of this: all mongoose queries and save calls are asynchronous.
So if you call
thing.save();
console.log('saved!!');
The console output appears before the document is actually saved, because node doesn't wait for the save to finish. So if you need any operation to be done after the document is saved, you should place it in the callback:
thing.save(function(err) {
    if (!err) {
        console.log('saved');
    }
});
If you want to avoid nested callbacks, have a look at the async module.
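For the specific counter in the question, there is also a way to sidestep the race entirely: let the server do the increment atomically, in a single operation, instead of find + save. A sketch using Mongoose's findByIdAndUpdate with the $inc update operator; the model is passed in as a parameter here only to keep the sketch self-contained:

```javascript
// A sketch of an atomic increment: instead of findById + save (a
// read-modify-write race), send a single $inc update so the counter is
// incremented server-side. `ThingModel` is passed in as a parameter so
// the sketch stays self-contained; in the question's code it would be
// the Thing model.
function doAdd(ThingModel, id, callback) {
    ThingModel.findByIdAndUpdate(
        id,
        { $inc: { count: 1 } },  // atomic server-side increment
        { new: true },           // return the updated document
        callback
    );
}
```

This works when the "some logic" guarding the increment can be expressed as part of the query or update; arbitrary client-side logic between read and write cannot be made atomic this way.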

Keeping open a MongoDB database connection

In so many introductory examples of using MongoDB, you see code like this:
var MongoClient = require('mongodb').MongoClient;
MongoClient.connect("mongodb://localhost:port/adatabase", function(err, db) {
    /* Some operation... CRUD, etc. */
    db.close();
});
If MongoDB is like any other database system, open and close operations are typically expensive time-wise.
So, my question is this: is it OK to simply do the MongoClient.connect("... once, assign the returned db value to some module global, have various functions in the module do their database-related work (insert documents into collections, update documents, etc.) when they're called by other parts of the application (thereby re-using that db value), and then, when the application is done, only then do the close?
In other words, open and close are each done once, not every time you need to do some database-related operation, and you keep re-using the db object returned by the initial open/connect, disposing of it with close only when you're actually done with all your database work.
Obviously, since all the I/O is asynch, before the close you'd make sure that the last database operation completed before issuing the close. Seems like this should be OK, but i wanted to double-check just in case I'm missing something as I'm new to MongoDB. Thanks!
Yes, that is fine and typical behavior. Start your app, connect to the db, run operations against the db for a long time, maybe re-connect if the connection ever dies unexpectedly, and then just never close the connection (rely on the automatic close that happens when your process dies).
mongodb version ^3.1.8
Initialize the connection as a promise:
const MongoClient = require('mongodb').MongoClient
const uri = 'mongodb://...'
const client = new MongoClient(uri)
const connection = client.connect() // initialized connection
And then use the connection whenever you wish to perform an action on the database:
// if I want to insert into the database...
connection.then(() => {
    const doc = { id: 3 }
    const db = client.db('database_name')
    const coll = db.collection('collection_name')
    coll.insertOne(doc, (err, result) => {
        if (err) throw err
    })
})
The currently accepted answer is correct in that you may keep the same database connection open to perform operations; however, it is missing details on how to retry the connection if it closes. Below are two ways to reconnect automatically. They're in TypeScript, but they can easily be translated into normal Node.js if needed.
Method 1: MongoClient Options
The simplest way to let MongoDB reconnect is to define reconnectTries in the options passed to MongoClient. Any time a CRUD operation times out, the driver will use those parameters to decide how to retry (reconnect). Setting the option to Number.MAX_VALUE essentially makes it retry forever until the operation completes. You can check out the driver source code if you want to see which errors are retried.
class MongoDB {
    private db: Db;

    constructor() {
        this.connectToMongoDB();
    }

    async connectToMongoDB() {
        const options: MongoClientOptions = {
            reconnectInterval: 1000,
            reconnectTries: Number.MAX_VALUE
        };

        try {
            const client = new MongoClient('uri-goes-here', options);
            await client.connect();
            this.db = client.db('dbname');
        } catch (err) {
            console.error(err, 'MongoDB connection failed.');
        }
    }

    async insert(doc: any) {
        if (this.db) {
            try {
                await this.db.collection('collection').insertOne(doc);
            } catch (err) {
                console.error(err, 'Something went wrong.');
            }
        }
    }
}
Method 2: Try-catch Retry
If you want more granular support on trying to reconnect, you can use a try-catch with a while loop. For example, you may want to log an error when it has to reconnect or you want to do different things based on the type of error. This will also allow you to retry depending on more conditions than just the standard ones included with the driver. The insert method can be changed to the following:
async insert(doc: any) {
    if (this.db) {
        let isInserted = false;
        while (isInserted === false) {
            try {
                await this.db.collection('collection').insertOne(doc);
                isInserted = true;
            } catch (err) {
                // Add custom error handling if desired
                console.error(err, 'Attempting to retry insert.');
                try {
                    await this.connectToMongoDB();
                } catch {
                    // Do something if this fails as well
                }
            }
        }
    }
}
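One caveat with the while loop above: it retries immediately and forever. If that is a concern, a bounded retry with a pause between attempts is a small variation. A generic sketch in plain Node.js, with the operation to retry passed in as a function (the names here are illustrative):

```javascript
// A generic bounded-retry sketch: run `operation` up to `maxAttempts`
// times, waiting `delayMs` between attempts, and rethrow the last error
// if every attempt fails.
async function retry(operation, maxAttempts, delayMs) {
    let lastErr;
    for (let attempt = 1; attempt <= maxAttempts; attempt++) {
        try {
            return await operation(); // success: return the result immediately
        } catch (err) {
            lastErr = err;
            if (attempt < maxAttempts) {
                // wait before the next attempt
                await new Promise(resolve => setTimeout(resolve, delayMs));
            }
        }
    }
    throw lastErr; // all attempts failed
}
```

In the class above, the insertOne call (and the reconnect) could be wrapped in such a helper instead of the bare while loop, giving you an upper bound on retries and room for backoff logic.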
