I'm writing a Node.js CLI in which I have to read from one Mongo Atlas DB and write to another Mongo Atlas DB. I'll be reading documents from one DB and writing equivalent documents to the other DB, one document at a time. I have two separate connection files like this:
ReadDB.js:
require('dotenv').config();
const mongoose = require('mongoose');
const read_db_url = process.env.READDB_URI;
const readDB = async () => {
  try {
    await mongoose.connect(read_db_url, {
      useNewUrlParser: true,
      useUnifiedTopology: true,
      dbName: "dbProd"
    });
  } catch (err) {
    console.error(err);
  }
};
module.exports = readDB;
WriteDB.js:
require('dotenv').config();
const mongoose = require('mongoose');
const write_db_url = process.env.WRITEDB_URI;
const writeDB = async () => {
  try {
    await mongoose.connect(write_db_url, {
      useNewUrlParser: true,
      useUnifiedTopology: true,
      dbName: "dbQA"
    });
  } catch (err) {
    console.error(err);
  }
};
module.exports = writeDB;
This is what I have so far for the main application (cli.js):
cli.js:
require('dotenv').config();
const mongoose = require('mongoose');
const connectReadDB = require('./ReadDB.js');
const connectWriteDB = require('./WriteDB.js');
connectReadDB();
connectWriteDB();
const findProduct = async (productId) => {
  const products = await Products.find({ _id: productId });
};
I guess my confusion is: how will Node.js know which DB to read from to begin with? Will I need a separate set of models, one for reading and one for writing? How can I establish two simultaneous connections in the same Node.js app?
Mongoose handles connections via a connection pool: http://mongoosejs.com/docs/connections.html
You can use the server: {poolSize: 5} option to increase or decrease the pool size (the number of parallel connections).
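For example, a minimal sketch (the server.poolSize form applies to older Mongoose releases; current Mongoose/driver versions take maxPoolSize instead):
const mongoose = require('mongoose');
// Older Mongoose (3.x/4.x): the pool size is passed through the server options.
mongoose.connect('mongodb://localhost/db1', { server: { poolSize: 5 } });
// Newer Mongoose / MongoDB driver: use maxPoolSize at the top level instead.
// mongoose.connect('mongodb://localhost/db1', { maxPoolSize: 5 });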
If you need connections to different databases, look here: Mongoose and multiple database in single node.js project
Example of multiple connections:
const mongoose = require('mongoose')
const connection = mongoose.createConnection('mongodb://localhost/db1');
const connection2 = mongoose.createConnection('mongodb://localhost/db2');
const Schema = new mongoose.Schema({})
const model1 = connection.model('User', Schema);
const model2 = connection2.model('Item', Schema);
model1.find({}, function() {
  console.log("this will print out last");
});
model2.find({}, function() {
  console.log("this will print out first");
});
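Applied to your setup, a rough sketch (the productSchema fields here are hypothetical; the env variables and dbName values are taken from your question). Each connection compiles its own model, so you read with one and write with the other:
require('dotenv').config();
const mongoose = require('mongoose');

// Two independent connections, one per Atlas cluster/database.
const readConn = mongoose.createConnection(process.env.READDB_URI, { dbName: 'dbProd' });
const writeConn = mongoose.createConnection(process.env.WRITEDB_URI, { dbName: 'dbQA' });

// Hypothetical schema, shared by both models.
const productSchema = new mongoose.Schema({ name: String, price: Number });

// Register the schema on each connection; every model is bound to its own connection.
const ReadProduct = readConn.model('Product', productSchema);
const WriteProduct = writeConn.model('Product', productSchema);

const copyProduct = async (productId) => {
  const doc = await ReadProduct.findById(productId).lean(); // reads from dbProd
  if (doc) {
    await WriteProduct.create(doc);                         // writes to dbQA
  }
};
So yes, you end up with one set of models per connection (same schema, two model objects). mongoose.connect() only manages the single default connection, which is why calling it twice as in cli.js won't give you two independent databases.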
I would like to connect to my Atlas cluster only once per running Cloud Functions instance.
Here is my code for an instance:
const MongoClient = require("mongodb").MongoClient;
const client = new MongoClient("myUrl", {
  useNewUrlParser: true,
  useUnifiedTopology: true,
});
exports.myHttpMethod = functions.region("europe-west1").runWith({
  memory: "128MB",
  timeoutSeconds: 20,
}).https.onCall((data, context) => {
  console.log("Data is: ", data);
  client.connect(() => {
    const testCollection = client.db("myDB").collection("test");
    testCollection.insertOne(data);
  });
});
And I would like to avoid the client.connect() in each function call, which seems like far too much overhead.
I would like to do something like this:
const MongoClient = require("mongodb").MongoClient;
const client = await MongoClient.connect("myUrl", {
  useNewUrlParser: true,
  useUnifiedTopology: true,
});
const db = client.db("myDB");
exports.myHttpMethod = functions.region("europe-west1").runWith({
  memory: "128MB",
  timeoutSeconds: 20,
}).https.onCall((data, context) => {
  console.log("Data is: ", data);
  const testCollection = db.collection("test");
  testCollection.insertOne(data);
});
But I can't use await at the top level like this.
In my AWS Lambda functions (written in Python) I don't have this issue and I am able to connect only once per instance, so I guess there is an equivalent here, but I don't know much JS / Node.js.
You can store your database client as a global variable. From the documentation,
Cloud Functions often recycles the execution environment of a previous invocation. If you declare a variable in global scope, its value can be reused in subsequent invocations without having to be recomputed.
Try refactoring the code as shown below:
import * as functions from "firebase-functions";
import { MongoClient } from "mongodb";

let client: MongoClient | null;

const getClient = async () => {
  if (!client) {
    const mClient = new MongoClient("[MONGODB_URI]", {});
    client = await mClient.connect();
    functions.logger.log("Connected to MongoDB");
  } else {
    functions.logger.log("Using existing MongoDB connection");
  }
  functions.logger.log("Returning client");
  return client;
};

export const helloWorld = functions.https.onRequest(
  async (request, response) => {
    const db = (await getClient()).db("[DATABASE]");
    const result = await db.collection("[COLLECTION]").findOne({});
    response.send("Hello from Firebase!");
  }
);
This should reuse the connection for that instance.
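If you prefer to keep plain JavaScript and your original onCall handler, the same caching idea could look roughly like this (a sketch; "myUrl", "myDB" and the "test" collection are taken from your question):
const functions = require("firebase-functions");
const { MongoClient } = require("mongodb");

let client; // reused across invocations handled by the same instance

const getClient = async () => {
  if (!client) {
    client = new MongoClient("myUrl", {
      useNewUrlParser: true,
      useUnifiedTopology: true,
    });
    await client.connect();
  }
  return client;
};

exports.myHttpMethod = functions.region("europe-west1").runWith({
  memory: "128MB",
  timeoutSeconds: 20,
}).https.onCall(async (data, context) => {
  const db = (await getClient()).db("myDB");
  await db.collection("test").insertOne(data);
});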
The official documentation of the Node.js Driver version 3.6 contains the following example for the .find() method:
const { MongoClient } = require("mongodb");
// Replace the uri string with your MongoDB deployment's connection string.
const uri = "mongodb+srv://<user>:<password>@<cluster-url>?w=majority";
const client = new MongoClient(uri);
async function run() {
  try {
    await client.connect();
    const database = client.db("sample_mflix");
    const collection = database.collection("movies");
    // query for movies that have a runtime less than 15 minutes
    const query = { runtime: { $lt: 15 } };
    const options = {
      // sort returned documents in ascending order by title (A->Z)
      sort: { title: 1 },
      // Include only the `title` and `imdb` fields in each returned document
      projection: { _id: 0, title: 1, imdb: 1 },
    };
    const cursor = collection.find(query, options);
    // print a message if no documents were found
    if ((await cursor.count()) === 0) {
      console.log("No documents found!");
    }
    await cursor.forEach(console.dir);
  } finally {
    await client.close();
  }
}
run().catch(console.dir);
To me this somewhat implies that I would have to create a new connection for each DB request I make.
Is this correct? If not, what is the best practice for keeping the connection alive across the various routes?
You can use mongoose to set up a connection with your database.
mongoose.connect('mongodb://localhost:27017/myapp', {useNewUrlParser: true});
Then you need to define the models that you will use to communicate with your DB in your routes.
const MyModel = mongoose.model('Test', new Schema({ name: String }));
MyModel.findOne(function(error, result) { /* ... */ });
https://mongoosejs.com/docs/connections.html
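For example, in an Express app the routes simply reuse the connection that mongoose opened at startup; a rough sketch (the app setup, the /tests route and port 3000 are placeholders):
const express = require('express');
const mongoose = require('mongoose');

mongoose.connect('mongodb://localhost:27017/myapp', { useNewUrlParser: true });

const MyModel = mongoose.model('Test', new mongoose.Schema({ name: String }));

const app = express();

// Every request goes through the pooled connection opened above.
app.get('/tests', async (req, res) => {
  const docs = await MyModel.find({});
  res.json(docs);
});

app.listen(3000);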
It's 2022 and I stumbled upon your post because I've been running into the same issue. All the tutorials and guides I've found so far have setups that require reconnecting in order to do anything with the database.
I found one solution from someone on GitHub that creates a class to create, save, and check whether a client connection exists. So it only creates a new client connection if one doesn't already exist.
const MongoClient = require('mongodb').MongoClient;

class MDB {
  static async getClient() {
    if (this.client) {
      return this.client;
    }
    this.client = await MongoClient.connect(this.url);
    return this.client;
  }
}

MDB.url = '<your_connection_url>';
app.get('/yourroute', async (req, res) => {
  try {
    const client = await MDB.getClient();
    const db = client.db('your_db');
    const collection = db.collection('your_collection');
    const results = await collection.find({}).toArray();
    res.json(results);
  } catch (error) {
    console.log('error:', error);
  }
});
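A closely related variant is to cache the connection promise itself in module scope, so that concurrent requests arriving before the first connect finishes don't each open their own client; a sketch with the same placeholder URL and DB name:
const { MongoClient } = require('mongodb');

let clientPromise; // cached for the lifetime of the process

function getClient() {
  if (!clientPromise) {
    const client = new MongoClient('<your_connection_url>');
    clientPromise = client.connect(); // resolves to the connected client
  }
  return clientPromise;
}

module.exports = async function getDb() {
  const client = await getClient();
  return client.db('your_db');
};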
I am currently trying to attach a global Mongoose plugin at runtime with no luck. My plugin requires a few dependencies and options generated during my app's bootstrapping, so I need to add it sequentially. Mongoose seems to ignore everything wrapped within a closure.
const mongoose = require('mongoose');

const config = {};
const { DB_CONNECT } = process.env;

const myPlugin = schema => {
  console.log('done'); // this line is not logged at all
  schema.methods.mymethod = () => {};
};

const connectAndAddPlugins = async () => {
  await mongoose.connect(DB_CONNECT, { ...config });
  mongoose.plugin(myPlugin);
};

connectAndAddPlugins();
Any help will be highly appreciated.
Apparently, once a model has been compiled and loaded by Mongoose, global plugins are no longer attached to it, so models should be registered after the plugins have been added:
const mongoose = require('mongoose');

const config = {};
const { DB_CONNECT } = process.env;

const myPlugin = schema => {
  console.log('done'); // logged when the Cat model is compiled below
  schema.methods.mymethod = () => {};
};

const connectAndAddPlugins = async () => {
  await mongoose.connect(DB_CONNECT, { ...config });
  mongoose.plugin(myPlugin);
};

const loadModels = () => {
  const model = mongoose.model('Cat', { name: String });
};

// Register models only after the connection is up and the plugin is in place.
connectAndAddPlugins().then(loadModels);
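A quick way to verify that the plugin actually ran is to replace the final connectAndAddPlugins().then(loadModels) call with something like this sketch (Cat and mymethod are the names used above):
const verifyPlugin = async () => {
  await connectAndAddPlugins();
  loadModels();
  const Cat = mongoose.model('Cat');
  const cat = new Cat({ name: 'Tom' });
  console.log(typeof cat.mymethod); // "function" once the plugin has been applied
};
verifyPlugin();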
I'm trying to use mongoose to handle my DB logic and transactions. I already have schema definitions and I'm exporting the models.
However, when I try to use a model, it fails with a message like:
return mongoose.model('Report', reportSchema);
} has no method 'find'...
This is my Model export:
module.exports = (function() {
  var mongoose = require('mongoose'),
      Schema = mongoose.Schema,
      ObjectId = Schema.ObjectId;

  var reportSchema = mongoose.Schema({
    category: ObjectId,
    authorName: String,
    authorEmail: String,
    text: String,
    address: String,
    coordinates: {
      type: "Point",
      coordinates: [Number, Number]
    },
    date: {
      type: Date,
      default: new Date()
    },
    comments: Array
  });

  return mongoose.model('Report', reportSchema);
});
And this is how my controller functions are coded using mongoose inside:
module.exports = (function() {
  var mongoose = require('mongoose');
  var Report = require('../models/Report');
  var Category = require('../models/Category');

  function _getReports (request, response, next) {
    var take = request.query.take;
    var skip = request.query.skip;
    Report.find({}).limit(take).skip(skip).exec(function (err, reports) {
      callback(err, reports, response);
    });
  }

  function _getReport (request, response, next) {
    var id = request.params.id;
    Report.findById({_id: id}, function (err, report) {
      callback(err, report);
    });
  }

  function _createReport (request, response) {
    var newReport = new Report();
    newReport.text = request.body.text;
    newReport.category = request.body.category;
    newReport.author = request.session.userId;
    newReport.save(function (err, savedReport) {
      callback(err, savedReport._id, response);
    });
  }

  function _updateReport (request, response) {
    var id = request.params.id;
    var update = request.body.report;
    Report.findByIdAndUpdate(id, update, function (err, foundReport) {
      callback(err, foundReport, response);
    });
  }

  function _deleteReport (request, response) {
    var id = request.params.id;
    Report.findByIdAndRemove(id, function (err, foundReport) {
      callback(err, foundReport, response);
    });
  }

  function _getReports (request, response, next) {
    var take = request.query.take;
    var skip = request.query.skip;
    Report.find({}).limit(take).skip(skip).exec(function (err, reports) {
      callback(err, reports, response);
    });
  }

  function _getCategories (request, response) {
    var take = request.query.take;
    var skip = request.query.skip;
    Report.find({}).limit(take).skip(skip).exec(function (err, reports) {
      callback(err, reports, response);
    });
  }

  function _getCategoryReports (argument) {
    var _id = mongoose.Types.ObjectId(request.params.id);
    Report.find({category: id}, {category: false}, function (err, foundReports) {
      callback(err, foundReports, response);
    });
  }

  function _createCategory (request, response) {
    var newCategory = new Category();
    newCategory.name = request.body.name;
    newCategory.save(function (err, savedCategory) {
      callback(err, savedCategory._id, response);
    });
  }

  function _updateCategory (request, response) {
    var id = request.params.id;
    var update = request.body.category;
    Category.findByIdAndUpdate(id, update, function (err, foundCategory) {
      callback(err, foundCategory, response);
    });
  }

  function _deleteCategory (request, response) {
    var id = request.params.id;
    Category.findByIdAndRemove(id, function (err, foundCategory) {
      callback(err, foundCategory, response);
    });
  }

  function callback (err, object, response) {
    if (err)
      response.status(500).send(JSON.stringify(err));
    response.send(JSON.stringify(object));
  }

  var apiController = {
    getReports: _getReports,
    getReport: _getReport,
    createReport: _createReport,
    updateReport: _updateReport,
    deleteReport: _deleteReport,
    getCategories: _getCategories,
    getCategoryReports: _getCategoryReports,
    createCategory: _createCategory,
    updateCategory: _updateCategory
  };

  return apiController;
})();
Before this, a mongoose connection is ensured:
var connectToMongoose = function (mongoose, app) {
  var connect = function () {
    var options = { server: { socketOptions: { keepAlive: 1 } } };
    mongoose.connect('mongodb://localhost/data4', options);
  };

  mongoose.connection.on('connected', function () {
    console.log('Connected to db');
    app.listen(32884, function() {
      console.log("Listening at \"data4 port\" #:32884");
    });
  });

  mongoose.connection.on('error', function (err) {
    console.log(err);
  });

  mongoose.connection.on('disconnected', function () {
    console.log('Disconnected from db, attempting to connect again...');
    app.close();
    connect();
  });

  connect();
};
module.exports = connectToMongoose;
Which is invoked by require('./db/mongoose-connect.js')(mongoose,app);
What am I doing wrong?
There are a couple of issues here that I caught right off the bat.
First off, I don't see a mongoose.connect() line that explicitly connects your mongoose ODM to a mongo server+database. An example would be:
var mongoose = require('mongoose'),
    Schema = mongoose.Schema,
    ObjectId = mongoose.Schema.ObjectId;

mongoose.connect('mongodb://localhost/db_name');
Your schema export looks fine. But you're using an anonymous function as your export. Since you're doing that, your require statement needs to change a little:
var Report = require('../models/Report')();
var Category = require('../models/Category')();
Notice the () at the end of the require statements. You need to execute the function that you're defining as your model file's module.exports.
EDIT: I see that you added your mongoose connect code. At this point, executing the module.exports function that you assign in the model file should allow your mongoose models to function as intended.
When you export a function:
// file: A.js
module.exports = function () {
  // some logic
};
and you want to use it in another file, then when you require the A file you are importing a function, and in order to use that function you need to invoke it:
// file: B.js
var A = require('./A.js');
A();
So your model is exporting a function:
module.exports = (function() {
  var mongoose = require('mongoose'),
      Schema = mongoose.Schema,
      ObjectId = Schema.ObjectId;
  // ..
  // some code
  // ..
  return mongoose.model('Report', reportSchema);
});
and when you are importing your model from your controller, you need to execute your imported function so that your Report variable contains the model created:
module.exports = (function() {
  var mongoose = require('mongoose');
  var Report = require('../models/Report')();
I have created a gist of how you could write your code using modules without using IIFE.
https://gist.github.com/wilsonbalderrama/d5484f3f530899f101dc
Actually, if you download all those files into a folder and run:
$ sudo npm install
$ mocha
you can see that all the tests created for the controller pass.
In addition, you don't need to use an IIFE in Node.js when you are creating a module, because each module already has its own isolated scope in Node.js.
// IIFE
module.exports = (function() {
  var apiController = {
    getReport: function () {}
  };
  return apiController;
})();
In Node.js you can export an object:
// GOOD WAY
module.exports = {
  getReport: function () {}
};
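Applied to the Report model from the question, that would mean exporting the compiled model directly instead of a function, so the controller's require already returns something with find and friends (a sketch; the schema is abbreviated to a few of the original fields):
// models/Report.js
var mongoose = require('mongoose'),
    Schema = mongoose.Schema,
    ObjectId = Schema.ObjectId;

var reportSchema = new Schema({
  category: ObjectId,
  authorName: String,
  text: String
  // ...remaining fields from the original schema
});

// Export the model itself; no invocation needed when requiring it.
module.exports = mongoose.model('Report', reportSchema);
Then in the controller, var Report = require('../models/Report'); works as-is and Report.find({}) is available without calling the required value as a function.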