Database in MongoDB connection string - node.js

In the MongoDB Node.js driver, I see some confusion about what the connection URI can be.
On one hand, the page describing the URI (https://docs.mongodb.com/manual/reference/connection-string/) says the path parameter is the "authentication database".
On the other hand, in many of the official examples (http://mongodb.github.io/node-mongodb-native/driver-articles/mongoclient.html#mongoclient-connect) it seems they use the path parameter as the active database (they call db.collection() straight away, without selecting a database first).
Am I missing something?

TL;DR:
Calling db.collection() immediately after connection only works in versions of the driver less than 3.0.
Details:
Firstly, the official examples you cited were for version 1.4.9 of the MongoDB driver; the driver is now at version 3.5.8, so I would suggest checking out the latest documentation and examples here.
To clarify the confusion: the database path specified in the connection URI is the authentication database, i.e. the database used to log in. This is true even for the 1.4.9 version of the driver - reference.
However, the difference you mentioned, i.e. being able to call db.collection() immediately after connecting in some cases, is the result of a change to the MongoClient class in version 3 of the driver - reference.
Before version 3, MongoClient.connect would pass a DB instance to its callback function, and this instance referenced the database specified in the path of the connection URI, so you could call db.collection() straight away:
MongoClient.connect("<connection_URI>", function(err, db) {
// db is a DB instance, so I can access my collections straight away:
db.collection('sample_collection').find();
});
However, an update was made in version 3 such that MongoClient.connect now returns a MongoClient instance, not a DB instance anymore - reference:
MongoClient.connect("<connection_URI>", function(err, client) {
// client is a MongoClient instance, you would have to call
// the Client.db() method to access your database
const db = client.db('sample_database');
// Now you can access your collections
db.collection('sample_collection').find();
});
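As a side note, if you do put a database name in the path of the connection URI, the 3.x driver will, as far as I can tell, use it as the default database when you call client.db() with no arguments. A minimal sketch, assuming driver 3.x and a URI whose path is sample_database:
MongoClient.connect("mongodb://localhost:27017/sample_database", function(err, client) {
  // No argument to client.db(): it falls back to the database named in the URI path
  const db = client.db();
  db.collection('sample_collection').find();
});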

Related

Should I worry to close db connection while connecting cloudant db from nodejs using cloudant module?

I am connecting to Cloudant using the "cloudant" module in Node.js (npm install --save cloudant). Below is the code that initializes the db connection variable and then uses it.
// instantiate a cloudant variable with the Cloudant url. The url has the id and password in it.
var cloudant = require('cloudant')(dbCredentials.url);
// set the middleware with the db name
var db = cloudant.use(dbCredentials.dbName);
// use the db variable to insert data into the db
function insertData(req) {
  db.insert({
    name: req.name,
    age: req.age
  }, id, function(err, doc) {
    ....
  });
}
Should I be worried about closing the connection after I use the db variable? It does not make sense to me, since we are not using any connection pool here. To me we are simply instantiating the db variable with the endpoint, credentials and db name, and later calling the Cloudant resources as REST APIs. I am slightly confused here and don't think we need to do any close-connection step (which in fact would mean nothing more than nullifying the cloudant variable). Can you please share any comments on whether I am wrong or right? Thanks in advance.
By default, the Cloudant library uses the default Node.js connection pooling, so it will respect the server's "Keep-Alive" instruction, but this is nothing you need to worry about. Simply keep making Cloudant library function calls and the library will make HTTP connections when required, reusing existing connections when possible and creating new ones in other cases.
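If you do want more control over how those HTTP connections are pooled, the cloudant module accepts (at least in the versions I have looked at) a requestDefaults option that is forwarded to the underlying request library, so you can supply your own keep-alive agent. A rough sketch; treat requestDefaults and the agent settings as assumptions to verify against your library version:
var https = require('https');

// Assumption: this version of the 'cloudant' module forwards requestDefaults
// to its underlying HTTP client; check the module's README before relying on it.
var cloudant = require('cloudant')({
  url: dbCredentials.url,
  requestDefaults: {
    agent: new https.Agent({ keepAlive: true, maxSockets: 50 })
  }
});
var db = cloudant.use(dbCredentials.dbName);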

Why would setting readPreference nearest generate not master errors?

We are using a node stack to connect to a mongo replica set. Because our replicas are geographically distributed, we would like to use the readPreference option in the URI and set it to nearest. But when we do so, while performance is greatly improved, we start getting "not master" errors.
Am I misunderstanding the use of the flag?
We are using mongo 2.6.3 and version 2.0.24 of the mongodb Node library.
The URI for the connection is:
mongodb://mongo-1:27017,mongo-2:27017,mongo-3:27017,mongo-4:27017,mongo-5:27017/db?replicaSet=main&readPreference=nearest
Option 1:
You could append slaveOk to the end of the connection URI. readPreference tells MongoDB how you'd like to read data, and slaveOk instructs that it's OK to read from secondaries (a bit redundant), but it works.
e.g.
mongodb://mongo-1:27017,mongo-2:27017,mongo-3:27017,mongo-4:27017,mongo-5:27017/db?replicaSet=main&readPreference=nearest&slaveOk=true
Please note the &slaveOk=true at the end of the URI.
https://mongodb.github.io/node-mongodb-native/driver-articles/mongoclient.html#read-preference
Option 2:
If the above solution is not working, you'll need to modify the code:
var client = require('mongodb').MongoClient;
var uri= "mongodb://mongo-1:27017,mongo-2:27017,mongo-3:27017,mongo-4:27017,mongo-5:27017/db?replicaSet=main";
Please note that I have modified the connection URI. Instead of setting readPreference in the URI, I moved it into the db options passed to MongoClient.connect.
var options = { db: { readPreference: 'nearest' } };
client.connect(uri, options, function(err, db) {
  if (err) {
    console.log(err);
    return;
  }
  var collection = db.collection('data');
  collection.findOne({}, function(err, result) {
    console.log(result);
  });
});
I have tested this with Node.js driver 2.2 and am hoping it should work in the 2.0 version too.
It seems like there was a bug in the driver, fixed in 2.0.28, where findAndModify used the readPreference setting (and so could be routed to a secondary, which cannot accept writes). Upgrading the driver to the latest release seems to have fixed the problem.
2.0.x drivers history
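If you cannot upgrade right away, another option is to leave the connection-level read preference at its default (primary) and opt in to nearest only on the reads where it is safe, so commands like findAndModify are never routed to a secondary. This is only a sketch against the 2.x driver API, so double-check it for your exact version:
var MongoClient = require('mongodb').MongoClient;
var ReadPreference = require('mongodb').ReadPreference;

// No readPreference in the URI: writes and findAndModify stay on the primary.
var uri = "mongodb://mongo-1:27017,mongo-2:27017,mongo-3:27017,mongo-4:27017,mongo-5:27017/db?replicaSet=main";

MongoClient.connect(uri, function(err, db) {
  if (err) { return console.log(err); }
  // Opt in to "nearest" only for this particular read.
  db.collection('data')
    .find({}, { readPreference: ReadPreference.NEAREST })
    .toArray(function(err, docs) {
      console.log(docs);
    });
});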

EADDRINUSE Node.js MongoDB Callbacks issue

The problem: (node.js application + mongodb Native driver)
I have a JSON file with more than 60,000 JSON documents. The documents always have a creation date and a unique id called vid, and I need to insert them into a MongoDB collection.
I need to insert the documents with new vids, or update the already existing ones when there is a more recent document.
What I already did:
https://github.com/TelmoIvo/PFC/blob/master/cfginit.js
What is happening:
After inserting/updating about 500 times and ending up with 287 documents in the collection, I get this error:
AssertionError: null == { [MongoError: connect EADDRINUSE] name: 'MongoError', message: 'connect EADDRINUSE' } at the line assert.equal(null, err);
From what I read, it's saying the connection to the DB is already in use, but I close it every time after I insert/update.
Any advice?
I wouldn't be calling MongoClient.connect every time. That causes a ton of connections to open and close all the time, which is overloading mongo. You should let MongoClient manage the connection pool. Change it so that you store the db object from MongoClient.connect; in your init file, add something like:
// store this outside your init so it's accessible to other functions;
// this is what you will use to access the database
var db;

// add this to your init function
MongoClient.connect(url, function(err, database) {
  db = database;
});
Then, in your functions to add and update, use the db object you stored to update your collections, and you won't need to keep opening connections. You can drop all the other MongoClient.connect code, and don't call db.close(), since the connections are shared through that object; let MongoClient manage them.
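For the insert-or-update-by-vid part of the question, a rough sketch of what such a function could look like with the shared db object is an upsert keyed on vid (the collection name 'documents' and the document shape are my own assumptions):
// Minimal sketch: upsert each incoming document by its unique vid,
// reusing the shared db object instead of reconnecting every time.
function saveDocument(doc, callback) {
  db.collection('documents').update(
    { vid: doc.vid },   // match on the unique vid
    { $set: doc },      // overwrite fields with the newer document
    { upsert: true },   // insert if no document with this vid exists yet
    callback
  );
}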

Is the CRUD API of NeDB compatible with MongoDB?

I am looking for a MongoDB API-compatible DB engine that does not require a full blown mongod process to run (a kind of SQLite for Node).
From multiple candidates that persistently store data on a local disk and have a similar API, I ended up with two:
NeDB https://github.com/louischatriot/nedb
tingodb http://www.tingodb.com/
Problem
I have worked with neither of them.
I am also very new to the API of MongoDB, so it is difficult for me to judge compatibility.
Requirements
I need your help/advice on picking only one library that satisfies the following:
It is stable enough.
It is fast enough to handle JSON documents of ~1 MB or bigger on disk.
I want to be able to switch to MongoDB as a data backend in the future, on demand, by changing a config file. I don't want to duplicate code.
The DB initialization API is different
Right now only tingodb claims API compatibility, and even its initialization looks fairly similar.
tingodb
var Db = require('tingodb')().Db, assert = require('assert');
vs
mongodb
var Db = require('mongodb').Db,
Server = require('mongodb').Server,
assert = require('assert');
In the case of NeDB it looks a bit different because it uses the datastore abstraction:
// Type 1: In-memory only datastore (no need to load the database)
var Datastore = require('nedb')
, db = new Datastore();
QUESTION
Obviously, initialization is not compatible. But what about CRUD? How difficult is it to adapt?
Since most of the code I do not want to duplicate will be CRUD operations, I need to know how similar they are, i.e. how agnostic my code can be about which backend I am using.
// If doc is a JSON object to be stored, then
db.insert(doc); // this is an NeDB method which is compatible
// How about *WriteResult*? It does not look like it..
db.insert(doc, function (err, newDoc) {   // Callback is optional
  // newDoc is the newly inserted document, including its _id
  // newDoc has no key called notToBeSaved since its value was undefined
});
I will appreciate your insight in this choice!
Also see:
Lightweight Javascript DB for use in Node.js
Has anyone used Tungus? Is it mature?
NeDB CRUD operations are upwards compatible with MongoDB, but initialization is indeed not. NeDB implements part of MongoDB's API, but not all of it; the part that is implemented is upwards compatible.
It's definitely fast enough for your requirements, and we've made it very stable over the past few months (no more bug reports).
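To illustrate what that upwards compatibility buys you in practice, here is a rough sketch of CRUD code that stays the same whether the handle behind it is an NeDB datastore or a MongoDB collection; only the initialization differs. The config.useMongo flag, file name and collection name are my own assumptions:
// Initialization differs per backend (hypothetical config.useMongo flag):
var config = { useMongo: false };
var db;

if (config.useMongo) {
  // MongoDB: db would be a collection obtained from a connected Db instance
  // (connection code omitted), e.g. db = mongoDb.collection('users');
} else {
  // NeDB: db is a persistent datastore backed by a local file
  var Datastore = require('nedb');
  db = new Datastore({ filename: 'users.db', autoload: true });
}

// The CRUD calls below are written once and work against either handle:
db.insert({ name: 'alice', age: 30 }, function (err, newDoc) {
  db.findOne({ name: 'alice' }, function (err, doc) {
    db.update({ name: 'alice' }, { $set: { age: 31 } }, {}, function (err) {
      db.remove({ name: 'alice' }, {}, function (err) {});
    });
  });
});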

Correct Use of Mongoskin

I usually work with mongoskin because I like to be close to the database. Usually, I do a setup with a file like db.coffee that contains just this:
mongo = require 'mongoskin'
# either local
module.exports = mongo.db 'mongodb://localhost/database'
# or remote
module.exports = mongo.db 'mongodb://<user>:<pass>@<host>:<port>/<db>?auto_reconnect=true'
Then I use it in my other sources:
db = require './db'
users = db.collection 'users'
# Now use the collection in handlers and middleware
This seems to work perfectly fine when I am using a local mongo server; I've had it up for months and it never turned out to be a problem.
However, when I am using the remote setup, I run into problems if the server runs longer than just a few minutes: the connection to the mongodb seems lost, despite auto_reconnect. I guess this is because the localhost connection is never closed automatically.
This led me to wonder whether I am maybe using mongoskin the wrong way, or whether there is simply a bug with auto_reconnect.
Ensure mongoskin is using the 1.0.0 or higher driver.
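A quick way to check which underlying mongodb driver version your mongoskin install actually resolves to is to read its package.json; the nested path below is an assumption about an older, non-flattened node_modules layout, so adjust it if npm has hoisted the dependency:
// Prints the version of the mongodb driver that mongoskin is using.
try {
  // Older npm layouts nest the driver under mongoskin/node_modules.
  console.log(require('mongoskin/node_modules/mongodb/package.json').version);
} catch (e) {
  // Flattened layout: the driver sits at the top level of node_modules.
  console.log(require('mongodb/package.json').version);
}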
