Split mongoose models into subsets based on session variable - node.js

I'm migrating an application from MySQL to Node/Mongoose running on Express. In my current LAMP stack I have an "account_id" column in several tables, and the system automatically filters every query by the account_id of the "active account" in the session. Obviously I can do the same in my new setup, but since I'm new to Node, Mongoose, and NoSQL in general, I was wondering if there is a better way to accomplish this with the new technology.
I was thinking:
1) Use multiple databases (but I can't figure out how to determine the database from the session unless I connect to it separately on each request)
2) Add a prefix to the collection name, but then I'd have to reconstruct the mongoose.model object on every request, which I guess wouldn't be so bad (rough sketch below).
What are the costs and benefits to these and other solutions?
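For option 2, here's a rough sketch of what I mean (the helper name is made up; mongoose.models caches compiled models by name, so each account's model would only be built once):

const mongoose = require('mongoose');

const schemas = {
  Other: new mongoose.Schema({ name: String }),
};

// Hypothetical helper: build (or reuse) a per-account model whose
// collection name carries the account prefix.
function modelFor(accountId, name) {
  const modelName = name + '_' + accountId;
  return mongoose.models[modelName]
    || mongoose.model(modelName, schemas[name], accountId + '_' + name.toLowerCase() + 's');
}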

What I would do is have a User model and connect the other models in your project to it, like so:
const mongoose = require('mongoose');
const Schema = mongoose.Schema;
const ObjectId = Schema.Types.ObjectId;

mongoose.model('User', new Schema({
  name: String,
  email: String,
}));

mongoose.model('Other', new Schema({
  user: { type: ObjectId, ref: 'User' },
  name: String,
}));
It shouldn't be much different from your MySQL database.
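Then every account-scoped query just filters on that reference. A minimal sketch, assuming an Express app with the active account's id stored in req.session.accountId (a made-up name):

const Other = mongoose.model('Other');

// Hypothetical route: scope the query to the active account in the
// session, mirroring the old account_id column from MySQL.
app.get('/others', async (req, res) => {
  const others = await Other.find({ user: req.session.accountId });
  res.json(others);
});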

Related

How can two different Node.js servers communicate to one mongoose schema?

I have two servers (a Public server and an Admin server) and I would like to access one mongoose schema from both. The main server is the Public one, where everything happens, but sometimes I would like to access that schema from the Admin server. I could write the schema on both servers, but that would mean duplicated code. If that is the only solution, I will do it. But is there any other way of doing this? For MongoDB I have a third server that hosts only the database. Could I put something there so that when I connect with mongoose to the MongoDB server, I receive the model from there?
Let's say I have this code (somewhere, I don't know where yet).
const mongoose = require('mongoose');

const postSchema = mongoose.Schema({
  title: {
    type: String,
    required: true,
  }
});

const Post = mongoose.model('Post', postSchema);

module.exports = Post;
What I am trying to do in a server file is, for example, to call Post.save() or whatever function I need, without having the schema on both servers.
I used the Mongoose-Gen npm package and created an API in order to get the schema from one server to the other.
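If you'd rather not depend on a library, the same idea can be sketched by hand: have the Public server expose a plain JSON description of the schema, and let the Admin server compile a local model from it (the endpoint and field format below are assumptions, and global fetch needs Node 18+):

const mongoose = require('mongoose');

// Hypothetical endpoint returning something like
// { "title": { "type": "String", "required": true } }
async function loadPostModel() {
  const res = await fetch('http://public-server.internal/schemas/post');
  const definition = await res.json();

  // Map type names back to their constructors before compiling the schema.
  const types = { String, Number, Date, Boolean };
  for (const field of Object.values(definition)) {
    field.type = types[field.type];
  }
  return mongoose.model('Post', new mongoose.Schema(definition));
}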

How to share mongoose models with multiple microservices

I've a User model which looks like:
import mongoose from 'mongoose';
const UserSchema = new mongoose.Schema({
  name: String,
  email: {
    type: String,
    required: true,
    unique: true,
  },
  password: {
    type: String,
    required: true,
  },
});
export default mongoose.model('User', UserSchema);
I'm trying to share this model with multiple microservices, but how do I share it? Should I make a database service exposed over HTTP, should I manually create the models in each server and use them that way, or is there another way to do the same thing?
It's dangerous to share schemas across microservices because they become tightly coupled; at the very least, don't do it like that. It's normal for microservices to use each other's data, but models should not be imported wholesale into another microservice. Instead, a dependent microservice should use a subset: a local representation of the remote model. For this you should use an anti-corruption layer (ACL). The ACL takes remote models as input and produces a local, immutable/read-only representation of that model as output. It lives at the outer boundary of the microservice, i.e. where the remote calls are made.
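A minimal sketch of such an ACL (the remote endpoint and field names are assumptions, and global fetch needs Node 18+):

// Boundary of the dependent microservice: fetch the remote user and
// map it to a frozen, local read-only shape with only the fields needed.
async function fetchRemoteUser(id) {
  const res = await fetch('http://user-service.internal/users/' + id);
  const remote = await res.json();
  return Object.freeze({
    id: remote._id,
    displayName: remote.name,
    email: remote.email,
  });
}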
Also, sharing *schema.js files across microservices would force you to use JavaScript/NodeJS in all the other microservices, which is not good. Each microservice should use whatever programming language is best suited for it.
Should I make a database service exposed over HTTP
The database is private to the owning microservice. It should not be exposed.

Multi tenant (SAAS) using nodejs sequelize

I am trying to build a multi-tenant (software-as-a-service) app using Node.js and Postgres, with Sequelize as the ORM. I decided to go with a separate DB for each client, rather than a single DB where every table has an extra column, for security reasons. I achieved the result, but performance was not good, since I have to initialise models for each DB in Sequelize (for almost every request). Is there a better way to do this? Am I missing something in Sequelize?
A quick implementation of my comment above.
app.js:
const Sequelize = require('sequelize');

const connections = {
  client1: new Sequelize('postgres://user:pass@example.com:5432/client1'),
  client2: new Sequelize('postgres://user:pass@example.com:5432/client2'),
  client3: new Sequelize('postgres://user:pass@example.com:5432/client3'),
};

const userColumns = require('./models/user');
const postColumns = require('./models/post');
const commentColumns = require('./models/comment');

Object.keys(connections).forEach(key => {
  connections[key].define('User', userColumns);
  connections[key].define('Post', postColumns);
  connections[key].define('Comment', commentColumns);
});
models/user.js:
const Sequelize = require('sequelize');

module.exports = {
  id: {
    type: Sequelize.INTEGER,
    primaryKey: true,
    autoIncrement: true
  },
  username: Sequelize.STRING,
  email: Sequelize.STRING
  // etc, etc
};
Obviously, whatever server framework you use, you'll need to detect (from the URL, I imagine) which client connection to use for a given request.
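For example, with Express you could resolve the connection from the subdomain (a sketch, assuming the connections map from app.js above):

const express = require('express');
const app = express();

// Resolve the tenant connection from the subdomain, e.g. client1.example.com.
app.use((req, res, next) => {
  const tenant = req.hostname.split('.')[0];
  req.db = connections[tenant];
  if (!req.db) return res.status(404).send('Unknown tenant');
  next();
});

app.get('/users', async (req, res) => {
  // sequelize.models holds everything defined on that connection.
  const users = await req.db.models.User.findAll();
  res.json(users);
});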
Alternatively, consider writing a single-connection app and deploying multiple instances of it (I'm doing this currently). Might be a simpler choice if you're set on separate DBs.
I prefer the schema approach to multi-tenancy in Postgres. I'm no authority on security, but it should be better than table-based multi-tenancy. Disaster recovery should also be slightly easier than with table-based MT, though still harder than with separate DBs.
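A minimal sketch of the schema approach in Sequelize (one database, one Postgres schema per tenant; Model.schema() re-scopes a model to a named schema):

const Sequelize = require('sequelize');

const sequelize = new Sequelize('postgres://user:pass@localhost:5432/app');
const User = sequelize.define('User', { username: Sequelize.STRING });

// Re-scope the model to the tenant's Postgres schema for this query.
function usersFor(tenant) {
  return User.schema(tenant).findAll();
}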
I managed to implement this in Sequelize using ES6 Proxies. This puts a hard requirement on your Node version, but if you're willing to use at least v6.9.2 you could give my lib a try:
https://www.npmjs.com/package/sequelize-multi-tenant-enhancer

How to use MongoDB with mean.io

I am new to server-side JavaScript. I have started with mean.io and have gained some understanding of Node.js, Express, and MongoDB over the last few days. I have my mean.io app, but I don't know the right way to connect to MongoDB and query it from my JS files.
Is there a guide/blog which can help me work with mongodb from my server side javascript files?
All I want is to store some data mongodb and fetch that at some later point.
By default, you should see there is a mean-dev collection in your MongoDB. The best way I know to get familiar with Mongo and MEAN is to play around with the code (for instance, the article package). Inside /packages/article/system/ you will see how the blog example works.
That works great for me.
I couldn't find one specific to mean.io, but the few links below helped me get started with it.
http://cwbuecheler.com/web/tutorials/2013/node-express-mongo/
https://www.youtube.com/watch?v=AEE7DY2AYvI
https://www.youtube.com/watch?v=5e1NEdfs4is
Edit:
For the past few days I have been working on it, and by trial and error I was able to get things working. I'll share what I know so far.
mean.io uses the Mongoose ODM to connect to MongoDB.
mean.io connects to your DB automatically; you can configure the DB name in development.js: db: 'mongodb://localhost/myDB'. So you won't have to worry about connecting to MongoDB. You just need to start MongoDB with mongod.
How to use mongoose?
To connect to MongoDB through Mongoose you need to build schemas. You can do so in the myApp/app/models directory, since the files there represent models.
Sample model file user.js
var mongoose = require('mongoose');
var Schema = mongoose.Schema;

var userSchema = new Schema({
  name: String,
  email: String,
  DOB: Date,
  address: {
    house_no: String,
    street: String
  }
});

module.exports = mongoose.model('tbl_user', userSchema);
Note: the tbl_user model will be stored as the tbl_users collection in the DB, because Mongoose pluralizes model names.
How to save data to MongoDB?
One would generally save to the DB in a controller. Below I show how to do this.
To make models available to all controllers, add this piece of code to server.js so that all your models get registered during server startup. Alternatively, require individual model files where you need them.
server.js:
var fs = require('fs');

var models_path = __dirname + '/app/models';
var arrFiles = fs.readdirSync(models_path);

arrFiles.forEach(function(file){
  if(file.indexOf('.js') > 0){
    require(models_path + '/' + file);
  }
});
Controller code (myApp/app/controllers/myController.js):
var mongoose = require('mongoose');

var jsonEntry = {
  name: 'Mady',
  email: 'xyz@xyz.com',
  address: { house_no: '12N', street: 'abc' }
};

var User = mongoose.model('tbl_user');
var user = new User(jsonEntry);
user.save();
The above code will create and update the tbl_users collection in MongoDB.

Using sails.js with an existing postgres database

I was looking at using Sails for an app that we are developing.
I'm using the sails-postgresql adapter, which uses the Waterline ORM.
I have an existing database that I want to connect to.
If I create a model using sails generate something, and then in my model I have:
attributes: {
  title: { type: 'String' }
}
If I browse to localhost/something, the ORM deletes all the columns in the something table except title. Is there a way to stop it from doing this? This app should not delete columns in this database.
Thanks!
I am the author of Sails-Postgresql. Sails has an ORM called Waterline that it uses for managing data. The default setting assumes that you want to auto-migrate your database to match your model attributes. Because PostgreSQL is a SQL database, the Sails-Postgresql adapter has a setting called syncable that defaults to true. (It would be false in a NoSQL database like Redis.)
This is easy to turn off if you want to manage your database columns yourself. You can add migrate: safe to your model and it won't try and update your database schema when you start Sails.
module.exports = {
  adapter: 'postgresql',
  migrate: 'safe',
  attributes: {
    title: { type: 'string' }
  }
};
Sails doesn't have anything like Rails migrations. It uses auto-migrations to take that step out of your development process, and leaves updating your production schema to you.
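In a standard Sails project you can also set the strategy once for all models in config/models.js instead of per model:

// config/models.js: applies to every model unless overridden in the model file.
module.exports.models = {
  migrate: 'safe'
};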
