Create Bookshelf.js models without connection and make connection in route files - node.js

I have Bookshelf.js models (a general model with parse() and format() functions, plus a model for each table that extends the general model) and routes that use these models. The main database is PostgreSQL. However, my users live in a SQLite database, and each user has their own PostgreSQL connection options. I can retrieve the current user and expose their database connection options on the request, so every route knows how to connect to PostgreSQL. But how can I connect and use my Bookshelf.js models inside a route? This is an example of what I am trying to do in a routes file:
var express = require('express'),
    knex = require('knex'),
    // This should be the file that exposes all models.
    // Every model exposed extends the general model.
    // But it already requires a connection when initializing Bookshelf.js...
    dataBookshelf = require('../models'),
    router = express.Router();

router.get('/items/:id', function (req, res, next) {
    // In req.db I have the PostgreSQL connection options for the current user.
    var connection = req.db;
    // Make a connection to PostgreSQL.
    var db = dataBookshelf(knex(connection));
    // Use the Items model.
    var Items = db.Items;

    Items.read(req.params.id).then(function (item) {
        res.json(item);
    }).catch(Items.NotFoundError, function () {
        var err = new Error('Item not found.');
        res.status(400).json(err);
    });
});

module.exports = router;
If there is another way to store the current user's connection options globally and use them in the module that exposes the Bookshelf.js instance, instead of passing them around on the request, that would be even better and safer.
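One possible direction (a minimal sketch, not a confirmed solution, assuming req.db holds plain knex connection options; the cache key and tableName below are illustrative): export a factory from ../models that accepts connection options, builds a knex/Bookshelf instance once per distinct configuration, and caches it so each user reuses their own pool.

// models/index.js - hypothetical factory, one Bookshelf registry per connection config
var bookshelf = require('bookshelf');
var knex = require('knex');

var cache = {};

module.exports = function (connection) {
    // Assumption: host + database uniquely identifies a user's configuration.
    var key = connection.host + '/' + connection.database;
    if (!cache[key]) {
        var shelf = bookshelf(knex({ client: 'pg', connection: connection }));
        cache[key] = {
            Items: shelf.Model.extend({ tableName: 'items' })
            // ...extend from the general model with parse()/format() here.
        };
    }
    return cache[key];
};

In the route, var db = require('../models')(req.db); would then replace the dataBookshelf(knex(connection)) call, and a knex pool is created only once per distinct user configuration.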

Related

Database Connection using common module is not working [ mongoose and mongodb ]

I am trying to implement a common module for the MongoDB connection using Mongoose, and I want to use that connection in another application for database operations. However, I am facing an issue when trying to use the common database module: the operation halts / hangs after the db connection is created. Here is my codebase.
When I use a module-specific database connection it works fine, but when I use the common database connection it hangs.
Common DB Module
'use strict'

const mongoose = require('mongoose');
const DBOptions = require('./DBOption');
require("dotenv").config();

mongoose.Promise = global.Promise;

let isConnected;

const connectToDatabase = (MONGODB_URL) => {
    if (isConnected) {
        console.log('using existing database connection');
        return Promise.resolve();
    }

    console.log('using new database connection');
    console.log('DBOptions >> ' + JSON.stringify(DBOptions));

    return mongoose.connect(MONGODB_URL, DBOptions)
        .then(db => {
            console.log('db.connections[0].readyState >> ' + db.connections[0].readyState);
            isConnected = db.connections[0].readyState;
        });
};

module.exports = connectToDatabase;
API Controller
const dbConnection = require('../DB/connection') // Internal Class
const DBConnection = require('as-common-util').connectToDatabase; // Common Class

/**
 *
 */
app.get('/usr/alluser', async (req, res) => {
    try {
        // await dbConnection(process.env.MONGODB_URL) // This is working
        await DBConnection(process.env.MONGODB_URL) // Code is hanging for this
        let allUsers = await UserService.getAllUser()
        console.log("All Users >> " + allUsers)
        if (allUsers) {
            return res.status(200).send(
                new APIResponse({
                    success: true,
                    obj: allUsers
                })
            )
        }
    } catch (error) {
        console.log(error)
    }
})
It hangs at the following position:
using new database connection
DBOptions >>
{"useNewUrlParser":true,"useUnifiedTopology":true,"useCreateIndex":true,"useFindAndModify":false,"autoIndex":false,"poolSize":10,"serverSelectionTimeoutMS":5000,"socketTimeoutMS":45000,"family":4}
db.connections[0].readyState >> 1
I am confused why the same code does not work from the common module.
This kind of pattern is not how Mongoose is meant to be used. Under the hood, Mongoose passes the underlying connection to the models in your module without the user really knowing anything about what is going on. That's why you can do magic stuff like MyModel.find() without ever having to create a model object yourself, or pass a db connection object to it.
If your db connection is in another module though, Mongoose won't be able to make those connections between your models and the MongoDB client connection since the models are no longer being registered on the mongoose object that is actually connected, and as a result, any requests you make using your models will break, since they will always be trying to connect through the object in your module.
There are other reasons why this won't, and shouldn't, work though. You're not supposed to be able to split a client. Doing so would make it unclear where communication along a client is coming from, or going to. You could change your function to make it return an established client connection. But your Mongoose models still wouldn't work. You would just be left with raw mongodb. If you want to do that, you might as well just uninstall Mongoose and use the mongodb library. Ultimately, you don't really gain anything from initializing the connection in a shared module. Initializing a connection is just a couple lines of code.
I doubt it's the connection that you want to share; rather, it's the models (I'm guessing). You can put those in a shared module and export them as a kind of connector function that injects a given Mongoose instance into the models. See: Defining Mongoose Models in Separate Module.
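A minimal sketch of that idea (the models.js file name, the register function and userSchema are assumptions for illustration, not code from the answer):

// models.js - shared module that registers models on whatever mongoose instance it is given
const userSchema = require('./schemas/user'); // assumption: the schema is defined elsewhere

module.exports = function register(mongooseInstance) {
    return {
        User: mongooseInstance.model('User', userSchema)
    };
};

// app.js - the consuming application owns the connection
const mongoose = require('mongoose');
const models = require('./models')(mongoose);

mongoose.connect(process.env.MONGODB_URL).then(async () => {
    const users = await models.User.find(); // works: models and connection share one mongoose instance
    console.log(users.length);
});

Because the models are registered on the same mongoose object that connect() was called on, there is no second, never-connected instance for the queries to hang on.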

More than one Mongo endpoint in same API

My NodeJS application has a form with a text input field (for search) and a dropdown for the DEV, UAT and Production Mongo database options.
Based on the user's selection, the respective database has to be accessed.
I want to know how to dynamically handle/change the database endpoint, or change the Node environment, at run time.
One way that comes to my mind is to disconnect and connect again. If you are using mongoose, do something like:
var mongoose = require('mongoose')
...
try {
    mongoose.disconnect();
    mongoose.connect(mongoURL);
} catch (e) {
    console.log(e);
}
every time and take the mongoURL from the user input.
Another way is to use multiple connections:
var mongoose = require('mongoose')
var conn = mongoose.createConnection('mongodb://localhost/db1');
var conn2 = mongoose.createConnection('mongodb://localhost/db2');
and then choose the connection that you want to use depending on the user's choice. I prefer this last approach.
Take a look at this answer for more info:
https://stackoverflow.com/a/32909008/7041393
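For example, a minimal sketch of picking a connection per request (the Item schema, the /search route and the req.body.env field are assumptions for illustration):

var mongoose = require('mongoose');

// One connection per environment, created once at startup.
var connections = {
    DEV: mongoose.createConnection('mongodb://localhost/db_dev'),
    UAT: mongoose.createConnection('mongodb://localhost/db_uat'),
    PROD: mongoose.createConnection('mongodb://localhost/db_prod')
};

// Register the same schema on every connection so the model exists everywhere.
var itemSchema = new mongoose.Schema({ name: String });
Object.keys(connections).forEach(function (env) {
    connections[env].model('Item', itemSchema);
});

app.post('/search', function (req, res) {
    // Assumption: the dropdown value arrives as req.body.env (DEV, UAT or PROD).
    var Item = connections[req.body.env].model('Item');
    Item.find({ name: req.body.query }, function (err, items) {
        if (err) return res.status(500).send(err);
        res.json(items);
    });
});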

NodeJS Express Dependency Injection and Database Connections

Coming from a non-Node background, my first instinct is to define my service as such:
MyService.js
module.exports = function MyService(dbConnection) {
    // service uses the db
};
Now, I want one open db connection per request, so I define in middleware:
res.locals.db = openDbConnection();
And in some consuming Express api code:
api.js
var MyService = require('./services/MyService')
...
router.get('/foo/:id?', function (req, res) {
    var service = new MyService(res.locals.db);
});
Now, being that Node's preferred method of dependency injection is via the require(...) statement, it seems that I shouldn't be using the constructor of MyService for injection of the db.
So let's say I want to have
var db = require('db');
at the top of MyService and then use it somehow, like db.current... but how would I tie the db to the current res.locals object now that db is a module itself? What's a recommended way of handling this kind of thing in Node?
Updated Answer: 05/02/15
If you want to attach a DB connection to each request object and then use that connection in your service, the connection will have to be passed to MyService somehow. The example below shows one way of doing that. If we try to use db.current or something to that effect, we'll be storing state in our DB module. In my experience, that will lead to trouble.
Alternatively, I lay out the approach I've used (and still use) in this previous answer. What this means for this example is the following:
// api.js
var MyService = require('./services/MyService')
...
router.get('/foo/:id?', function (req, res) {
    MyService.performTask(req.params.id);
});

// MyService.js
var db = require('db');

module.exports = {
    performTask: function (id) {
        var connection = db.getOpenConnection();
        // Do whatever you want with the connection.
    }
}
With this approach, we've decoupled the DB module from the api/app/router modules and only the module that actually uses it will know it exists.
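The answer does not show what the db module itself looks like; here is a minimal sketch, assuming PostgreSQL via the pg library (the library choice and connection string are assumptions, only the getOpenConnection name comes from the answer above):

// db.js - one shared pool for the whole process; every module that requires it gets the same pool
var pg = require('pg');

var pool = new pg.Pool({ connectionString: process.env.DATABASE_URL });

module.exports = {
    // Matches the db.getOpenConnection() call used in MyService.js above.
    getOpenConnection: function () {
        return pool; // pool.query() checks out a client and releases it automatically
    }
};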
Previous Answer: 05/01/15
What you're talking about could be done using an express middleware. Here's what it might look like:
var db = require('db');

// Attach a DB connection to each request coming in
router.use(function (req, res, next) {
    res.locals.db = db.getOpenConnection();
    next();
});

// Later on..
router.get('/foo/:id?', function (req, res) {
    // We should now have something attached to res.locals.db!
    var service = new MyService(res.locals.db);
});
I personally have never seen something like new MyService before in Express applications. That doesn't mean it can't be done, but you might consider an approach like this:
// router.js
var MyService = require('MyService');

router.get('/foo/:id?', function (req, res) {
    MyService.foo(res.locals.db);
});

// MyService.js
module.exports.foo = function (connection) {
    // I have a connection!
};

Mongoose dynamic models, cannot use populate()

My CMS is written in Node.js + Express + Mongoose. It's a multisite CMS and each site has its own database, so I needed Mongoose to switch connections on every HTTP request. I looked around for a solution, or for someone in the same situation, but without success. So this is my solution:
HTTP Request:
router.get('/site/pages', function (req, res) {
    pagesBiz.list(req.session, function (err, list) {
        //do stuff
    });
});
BIZ component (pagesBiz):
module.exports = {
    list: function (session, callback) {
        Page(session.database).find(callback);
    }
}
Model (Page)
var mongoose = require('mongoose'); // pageSchema is defined elsewhere in this module

var cached = {};

var getModel = function (database) {
    if (!cached[database]) {
        var conn = mongoose.createConnection('mongodb://localhost/' + database);
        cached[database] = conn.model('Page', pageSchema);
    }
    return cached[database];
};

module.exports = function (database) {
    return getModel(database);
};
So how does it work? When the user logs in, a new user session is created and bound to the user via the session cookie (I use MongoStore + Express Session). The session contains the name of the database, which is used to dynamically instantiate the Mongoose models.
It works perfectly: users of different sites read from their own databases without the risk of "collisions". On the other hand, I have to pass the session (or the database name) around between functions, but I guess this is how Node.js works in a multithreaded context.
The only problem is when I use the populate method(), I get this error:
f:\www\rudsjs\node_modules\mongoose\lib\connection.js:625
throw new MongooseError.MissingSchemaError(name);
^ MissingSchemaError: Schema hasn't been registered for model "User". Use mongoose.model(name, schema)
at NativeConnection.Connection.model (f:\www\rudsjs\node_modules\mongoose\lib\connection.js:625:11)
at populate (f:\www\rudsjs\node_modules\mongoose\lib\model.js:2136:24)
at Function.Model.populate (f:\www\rudsjs\node_modules\mongoose\lib\model.js:2101:5)
at Object.cb (f:\www\rudsjs\node_modules\mongoose\lib\query.js:1159:16)
at Object._onImmediate (f:\www\rudsjs\node_modules\mongoose\node_modules\mquery\lib\utils.js:137:16)
at processImmediate [as _immediateCallback] (timers.js:336:15)
Process finished with exit code 8
I tried to preload the model before the populate() call using this:
User(session.database);
but the problem seems to be related to the way Mongoose caches the models (its cache, not mine). I looked at connection.js:
model = this.base.models[name];

if (!model) {
    throw new MongooseError.MissingSchemaError(name);
}
so I'd need a way to insert my model into this.base.models. Do you have any idea? Or, better yet, do you have a better solution for implementing a multi-site/multi-database environment?
I found a solution. I was caching the models and not the connections, so I was spawning a new Mongoose connection for each model; that's why Mongoose wasn't able to load other models using populate().
My solution is: use a global Cache object that stores the connections that will be used to build all the models.
Cache global object
module.exports = {
    Cache: {
        database: {}
    }
}
Model (Page)
module.exports = function (session) {
    if (!Cache.database[session.database]) {
        debug("Create connection to " + session.database);
        var conn = mongoose.createConnection('mongodb://localhost/' + session.database);
        Cache.database[session.database] = conn;
    }
    debug("[rud] " + session.database + " model created");
    return Cache.database[session.database].model('Page', pageSchema);
}
As you can see, all models will now share the same connection stored in Cache.database[session.database].
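To illustrate why populate() now resolves, a User model built the same way would reuse the cached connection, so Page and User end up registered on the same connection (userSchema is an assumption; the original User model code is not shown):

// Model (User) - hypothetical, following the same pattern as the Page model above
module.exports = function (session) {
    if (!Cache.database[session.database]) {
        Cache.database[session.database] = mongoose.createConnection('mongodb://localhost/' + session.database);
    }
    // Same connection object as Page, so Page queries can populate('user') and find the 'User' model.
    return Cache.database[session.database].model('User', userSchema);
}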

efficiency of mongodb/mongoskin access by multiple modules approach?

I'm developing an Express app that provides a REST API; it uses MongoDB through mongoskin. I wanted a layer that splits routing from db access. I have seen an example that creates a database bridge by creating a module file, for example models/profiles.js:
var mongo = require('mongoskin'),
    db = mongo.db('localhost:27017/profiler'),
    profs = db.collection('profiles');

exports.examplefunction = function (info, cb) {
    //code that accesses the profs collection and does the query
}
Later, this module is required in the routing files.
My question is: if I use this approach of creating one module for each collection, will it be efficient? Would I be connecting to and disconnecting from Mongo multiple (unnecessary) times by doing that?
I was thinking that maybe exporting the db variable from one module to the others that handle each collection would solve the supposed issue, but I'm not sure.
Use a single connection and then create your modules, passing in the shared db instance. You want to avoid setting up separate db pools for each module. One way of doing this is to construct the module as a class.
exports.build = function (db) {
    return new MyClass(db);
}

var MyClass = function (db) {
    this.db = db;
}

MyClass.prototype.doQuery = function () {
}
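A usage sketch of that pattern (file names and the second collection module are assumptions for illustration):

// app.js - open one mongoskin connection and share it with every collection module
var mongo = require('mongoskin');
var db = mongo.db('localhost:27017/profiler');

var profiles = require('./models/profiles').build(db);
var settings = require('./models/settings').build(db); // hypothetical second module, built the same way

// Both objects now query through the same underlying connection pool.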
