I have been looking around for a simple database abstraction implementation and found this great article: http://howtonode.org/express-mongodb. It is old, but I still like the idea.
Maybe the constructor could take some kind of object literal with database settings.
So the main idea is that there could be different implementations of the UserService, located in different directories, and I would require only the one that's needed.
/data-layer/mongodb/user-service.js
/data-layer/mongodb/post-service.js
/data-layer/mongodb/comment-service.js
/data-layer/couchdb/user-service.js
/data-layer/couchdb/post-service.js
/data-layer/couchdb/comment-service.js
When the database is needed, I will get it with var UserService = require(__dirname + '/data-layer/mongodb/user-service').UserService(db); where var db = "open db object".
Would this be the correct way to do it, or are there better solutions?
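Something like this is what I have in mind, just as a sketch (config.js and its dataLayer setting are placeholders of mine, and db would be the open db object mentioned above):

var config = require('./config'); // e.g. module.exports = { dataLayer: 'mongodb' };
var db = null;                    // the "open db object" from above

var layerDir = __dirname + '/data-layer/' + config.dataLayer;

// every backend directory exposes the same constructors,
// so the calling code never needs to know which implementation it got
var UserService = require(layerDir + '/user-service').UserService(db);
var PostService = require(layerDir + '/post-service').PostService(db);
var CommentService = require(layerDir + '/comment-service').CommentService(db);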
There are a few solutions available via NPM:
Node-DBI: "Node-DBI is a SQL database abstraction layer library, strongly inspired by the PHP Zend Framework Zend_Db API. It provides unified functions to work with multiple database engines, through Adapters classes. At this time, supported engines are mysql, mysql-libmysqlclient and sqlite3". Looks like the development has been paused.
Accessor: "A database wrapper, provide easy access to databases." Supports only MySQL and MongoDB at the moment.
Activerecord: "An ORM written in Coffeescript that supports multiple database systems (SQL, NoSQL, and even REST), as well as ID generation middleware. It is fully extendable to add new database systems and plugins."
Update:
Since I posted this answer, I have abandoned Mongoose for the official MongoDB Node.js driver, as it is fairly intuitive and truer to the concept of NoSQL databases.
Original Answer:
I thought it might be time to update the answer to an old question:
If you want to use MongoDB as your document-oriented database, mongoose is a good choice and easy to use (example from official site):
var mongoose = require('mongoose');
mongoose.connect('mongodb://localhost/test');

var Cat = mongoose.model('Cat', { name: String });

var kitty = new Cat({ name: 'Zildjian' });
kitty.save(function (err) {
  if (err) {
    console.log(err);
  } else {
    console.log('meow');
  }
});
For a more modern approach, Mongorito is a good ODM which uses ES6 generators instead of callbacks.
As of June 2015, I reckon that the best ORM for SQL databases with Node.js/io.js is Sequelize, which supports the following databases:
PostgreSQL
MySQL
MariaDB
SQLite
MSSQL
The setup is fairly easy:
var Sequelize = require('sequelize');

var sequelize = new Sequelize('database', 'username', 'password', {
  host: 'localhost',
  dialect: 'mysql'
});

// Or you can simply use a connection URI
var sequelize = new Sequelize('postgres://user:pass@example.com:5432/dbname');
It also provides transactions, migrations and many other goodies.
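For example, a managed transaction looks roughly like this (the User model and its fields here are made-up examples, not taken from the Sequelize docs):

var User = sequelize.define('user', { username: Sequelize.STRING });

sequelize.transaction(function (t) {
  // everything returned from this callback runs inside the transaction
  return User.create({ username: 'jane' }, { transaction: t });
}).then(function (user) {
  // the transaction has been committed
}).catch(function (err) {
  // the transaction has been rolled back
  console.error(err);
});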
Related
I was trying to understand how to use passport.js with the MongoDB native driver and Express. The problem is that all the references and tutorials show the local strategy using Mongoose, which creates a schema or model, so now I am stuck.
Take a look at the MongoDB documentation for their Node.js driver.
MongoDB Node Driver
Sorry for being a little late here, but maybe my answer will be helpful to others seeking an answer to this kind of question.
I assume that you were struggling with these problems:
How to reuse a database connection across your project
You can define your MongoClient object once and reuse it across multiple modules in your project as follows:
dbUtil.js holds the definition of the MongoClient object:
const MongoClient = require('mongodb').MongoClient;

const SERVER_URI = 'mongodb://localhost:27017'; // link to your server
const DB_NAME = 'test';                         // database name

/* Why these options?
   - forceServerObjectId: let the server assign the ObjectId instead of the Node driver
   - useNewUrlParser & useUnifiedTopology: get rid of deprecation warnings
*/
const clientObj = new MongoClient(`${SERVER_URI}/${DB_NAME}`, {
  forceServerObjectId: true,
  useNewUrlParser: true,
  useUnifiedTopology: true
});

module.exports = {
  client: clientObj,
  dbName: DB_NAME
};
In another module where you need to use the defined connection:
const { client, dbName } = require('./dbUtil');

// Because client.connect() returns a Promise, you should wrap everything
// inside an immediately-invoked async function like this
(async () => {
  await client.connect();        // first, open the client connection
  const dbO = client.db(dbName); // get a handle to the database
  /* perform database operations, for example:
  await dbO.collection('users').insertOne({ name: 'mongoadmin' });
  */
  await client.close();          // remember to close the connection when you're done
})();
So instead of the Mongoose way of using User.find().exec(), with the Mongo native driver you have to open the client connection first and then use dbO.collection('users') and its methods (which return Promises).
What the heck Passport is and why it might be needed for your project
Passport is authentication middleware for Express that supports authentication via Facebook, Google, JWT, ... and many other authentication strategies. It can be helpful when you want to support authentication from multiple providers. However, it's not a must-have.
Sometimes applying another layer of abstraction from third-party libraries not only adds nothing for you and your project, but also over-complicates your existing code base. You chose not to use Mongoose and adopted the MongoDB native driver instead, stating that you didn't need the schema and model stuff. By the same logic, I don't see any necessity for adopting Passport. This link may be helpful to you in some way: another Stackoverflow post
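That said, if you do end up using Passport, a local strategy does not require Mongoose either. Here is a hedged sketch wiring passport-local directly to a native-driver collection; it assumes an Express app and the dbO handle from above, and the field names are made up:

var passport = require('passport');
var LocalStrategy = require('passport-local').Strategy;

passport.use(new LocalStrategy(function (username, password, done) {
  // look the user up directly in the native-driver collection, no model needed
  dbO.collection('users').findOne({ username: username }, function (err, user) {
    if (err) return done(err);
    // NOTE: compare a password hash in real code; this is only a sketch
    if (!user || user.password !== password) return done(null, false);
    return done(null, user);
  });
}));

app.post('/login',
  passport.authenticate('local', { session: false }),
  function (req, res) {
    res.json({ id: req.user._id });
  });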
To apply authentication using JSON web tokens to your Express routes, you need to do the following steps:
Generate token for user signed in
Verify token
Define protected routes and write middlewares for those
All these tasks can be done without any third-party modules/libraries!
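A minimal sketch of those three steps using only Node's built-in crypto module (HS256); the secret, the payload fields and the protected route are hypothetical examples:

const crypto = require('crypto');

const SECRET = 'change-me'; // hypothetical signing secret

// base64url-encode a string or Buffer
function b64url(input) {
  return Buffer.from(input).toString('base64')
    .replace(/=/g, '').replace(/\+/g, '-').replace(/\//g, '_');
}

function sign(headerAndBody) {
  const digest = crypto.createHmac('sha256', SECRET).update(headerAndBody).digest();
  return b64url(digest);
}

// 1. Generate a token for a signed-in user
function generateToken(payload) {
  const header = b64url(JSON.stringify({ alg: 'HS256', typ: 'JWT' }));
  const body = b64url(JSON.stringify(payload));
  return `${header}.${body}.${sign(`${header}.${body}`)}`;
}

// 2. Verify a token, returning its payload or null
function verifyToken(token) {
  const [header, body, signature] = token.split('.');
  if (!signature || sign(`${header}.${body}`) !== signature) return null;
  return JSON.parse(Buffer.from(body, 'base64').toString());
}

// 3. Middleware protecting a route
function requireAuth(req, res, next) {
  const token = (req.headers.authorization || '').replace('Bearer ', '');
  const payload = token && verifyToken(token);
  if (!payload) return res.status(401).json({ error: 'Unauthorized' });
  req.user = payload;
  next();
}
// usage: app.get('/profile', requireAuth, handler);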
I believe your question stems from using MongoDB schema validation instead of a Mongoose schema. You can use another means of authentication, like JWT, which does not directly need models for authentication.
I am looking for a MongoDB API-compatible DB engine that does not require a full-blown mongod process to run (a kind of SQLite for Node).
From multiple candidates that persistently store data on a local disk with a similar API, I ended up with two:
NeDB https://github.com/louischatriot/nedb
tingodb http://www.tingodb.com/
Problem
I have worked with neither of them.
I am also very new to the API of MongoDB, so it is difficult for me to judge their compatibility.
Requirements
I need your help/advice on picking only one library that satisfies the following:
It is stable enough.
It is fast enough to handle JSON documents of ~1 MB or bigger on disk.
I want to be able to switch to MongoDB as a data backend in the future or on demand by changing a config file. I don't want to duplicate code.
The DB initialization API is different
Right now only tingodb claims API compatibility. Even initialization looks fairly similar.
tingodb
var Db = require('tingodb')().Db, assert = require('assert');
vs
mongodb
var Db = require('mongodb').Db,
    Server = require('mongodb').Server,
    assert = require('assert');
In case of NeDB it looks a bit different because it uses the datastore abstraction:
// Type 1: In-memory only datastore (no need to load the database)
var Datastore = require('nedb')
  , db = new Datastore();
QUESTION
Obviously initialization is not compatible. But what about CRUD? How difficult is it to adapt?
Since most of the code I do not want to duplicate will be CRUD operations, I need to know how similar they are, i.e. how agnostic my code can be about which backend I have.
// If doc is a JSON object to be stored, then
db.insert(doc); // this is a NeDB method which is compatible
// How about *WriteResult*? It does not look like it...
db.insert(doc, function (err, newDoc) { // Callback is optional
  // newDoc is the newly inserted document, including its _id
  // newDoc has no key called notToBeSaved since its value was undefined
});
I will appreciate your insight into this choice!
Also see:
Lightweight Javascript DB for use in Node.js
Has anyone used Tungus? Is it mature?
NeDB CRUD operations are upwards compatible with MongoDB, but initialization is indeed not. NeDB implements part of MongoDB's API, but not all of it; the part that is implemented is upwards compatible.
It's definitely fast enough for your requirements, and we've made it very stable over the past few months (no more bug reports).
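For reference, a quick sketch of NeDB's Mongo-style CRUD calls, assuming a file-backed datastore named users.db; the documents and queries are made-up examples:

var Datastore = require('nedb');
var users = new Datastore({ filename: 'users.db', autoload: true });

// insert
users.insert({ name: 'mongoadmin', role: 'admin' }, function (err, newDoc) {
  // newDoc is the inserted document, including its generated _id
});

// find
users.find({ role: 'admin' }, function (err, docs) {
  // docs is an array of matching documents
});

// update
users.update({ name: 'mongoadmin' }, { $set: { role: 'user' } }, {}, function (err, numReplaced) {});

// remove
users.remove({ name: 'mongoadmin' }, {}, function (err, numRemoved) {});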
I am new to MongoDB and Node.js. So far I have been able to create a new MongoDB database and access it via Node.js. However, I want to write a generic set of methods for accessing collections (CRUD), as my list of collections will grow in number. For example, I have collections for books and authors:
var books = db.collection('books');
var authors = db.collection('authors');

exports.getBooks = function(callback) {
  books.find(function(e, list) {
    list.toArray(function(err, array) {
      if (array) callback(null, array);
      else callback(err, "Error!");
    });
  });
};
Similar to this, I have a method for getting authors as well. Now this is getting too repetitive, as I want to add methods for the other CRUD operations too. Is there a way to have common/generic CRUD methods for all my collections?
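To illustrate what I am after, something like this generic helper built on the native driver (the makeCrud name and the methods are just a sketch of my own, not working code I have):

// crud.js - build the same CRUD methods for any collection of an already-open db
module.exports = function makeCrud(db, collectionName) {
  var collection = db.collection(collectionName);
  return {
    getAll: function(callback) {
      collection.find().toArray(callback);
    },
    getById: function(id, callback) {
      collection.findOne({ _id: id }, callback);
    },
    create: function(doc, callback) {
      collection.insert(doc, callback);
    },
    update: function(query, changes, callback) {
      collection.update(query, { $set: changes }, callback);
    },
    remove: function(query, callback) {
      collection.remove(query, callback);
    }
  };
};

// usage:
// var books = makeCrud(db, 'books');
// books.getAll(function(err, list) { /* ... */ });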
You should take a look at Mongoose; it makes it easy to handle MongoDB from Node.js. Mongoose has a schema-based solution, where each schema maps to a MongoDB collection, and you get a set of methods to manipulate these collections via models, which are obtained by compiling the schemas. I was in exactly the same place a couple of months ago and found that Mongoose is good enough for all these needs.
@Dilpa - not sure if you have looked at or are utilizing Mongoose (link), but it can be helpful with implementing CRUD.
I wrote my own service to handle very simple CRUD operations on mongodb documents. Mongoose is excellent, but imposes structure on the documents (which IMO goes against the purpose of mongodb--if you're going to have a schema, why not just use a relational db?).
https://github.com/stupid-genius/MongoCRUD
This service also has the advantage of being implemented as a REST API, so it can be consumed by a node.js app or others. If you send a GET request to the root path of the server, you'll get a screen that shows the syntax for all the CRUD operations (the GUI isn't implemented yet). My basic approach was to specify the db and collection on the URL path host/db/collection and then pass the doc in the POST body. The route handlers then just pass on the doc to an appropriate mongodb function; my service just exposes those methods in a pretty raw state (it does require authentication though).
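As an illustration of that host/db/collection idea (not the actual MongoCRUD code), a hedged Express sketch could look like this, assuming an already-connected MongoClient called client:

var express = require('express');
var app = express();
app.use(express.json()); // parse JSON request bodies

// POST /<db>/<collection> stores the request body as a document
app.post('/:db/:collection', function (req, res) {
  client.db(req.params.db).collection(req.params.collection)
    .insertOne(req.body, function (err, result) {
      if (err) return res.status(500).json({ error: err.message });
      res.json({ insertedId: result.insertedId });
    });
});

app.listen(3000);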
I read:
How do I manage MongoDB connections in a Node.js web application?
http://mongodb.github.io/node-mongodb-native/driver-articles/mongoclient.html
How can I set up MongoDB on a Node.js server using node-mongodb-native in an EC2 environment?
And I am really confused. How should I work with MongoDB from Node.js? I'm a rookie, and my question may look stupid.
var mongodb = require('mongodb');

var db = new mongodb.MongoClient(new mongodb.Server('localhost', 27017));
db.open(function(err, dataBase) {
  // all code here?
  dataBase.close();
});
Or, every time I need something from the db, should I call:
MongoClient.connect("mongodb://localhost:27017/myDB", function(err, dataBase) {
//all code here
dataBase.close();
});
What is the difference between open and connect? I read in the manual that open initializes and connect connects, but what exactly does that mean? I assume both do the same thing in different ways, so when should I use one instead of the other?
I also want to ask whether it's normal that MongoClient needs 4 sockets. I am running two of my web servers at the same time; here's a picture:
http://i43.tinypic.com/29mlr14.png
EDIT:
I want to mention that this isn't a problem (rather a doubt :D); my server works perfectly. I ask because I want to know if I am using the MongoDB driver correctly.
Right now I use the first option: init the mongo driver at the beginning and put all the code inside the open callback.
I'd recommend trying the MongoDB tutorial they offer. I was in the same boat, but this breaks it down nicely. In addition, there's this article on github that explains the basics of DB connection.
In short, it does look like you're doing it right.
MongoClient.connect("mongodb://localhost:27017/myDB", function(err, dataBase) {
//all code here
var collection = dataBase.collection('users');
var document1 = {'name':'John Doe'};
collection.insert(document1, {w:1}, function(err,result){
console.log(err);
});
dataBase.close();
});
You can still sign up for the free course M101JS: MongoDB for Node.js Developers, provided by the MongoDB folks.
Here is a short description:
This course will go over basic installation, JSON, schema design,
querying, insertion of data, indexing and working with language
drivers. In the course, you will build a blogging platform, backed by
MongoDB. Our code examples will be in Node.js.
I had the same question. I couldn't find a proper answer in the mongo documentation.
All the documentation says is to prefer a new db connection and then use open() (rather than using connect()).
http://docs.mongodb.org/manual/reference/method/connect/
So I have been playing with Node.js/Express for a little while now, and I would really like to try to rewrite a relatively large side project using a full JavaScript stack, just to see how it will work. Sails.js seems to be a pretty good choice for a Node.js backend for a REST API with support for web sockets, which is exactly what I am looking for; however, there is one more issue I am looking to resolve, and that is transactional SQL within Node.js.
Most data layers/ORMs I have seen on the Node.js side of things don't seem to support transactions when dealing with MySQL. The ORM provided with Sails.js (Waterline) also does not seem to support transactions, which is weird because I have seen places where it is mentioned that it did, though those comments are quite old. Knex.js has support for transactions, so I was wondering if it is easy to replace the ORM in Sails.js with it (or if Sails.js assumes Waterline in the core framework).
I was also wondering if there is an ORM built on top of Knex.js besides Bookshelf, as I am not a fan of Backbone's Model/Collection system.
You can still write SQL queries directly using Model.query(). Since this is an asynchronous function, you'll have to use promises or async to serialize the queries. For instance, using the MySQL adapter, async, and a model called User:
var async = require('async');

async.auto({
  transaction: function(next) {
    User.query('BEGIN', next);
  },
  user: ['transaction', function(next) {
    User.findOne(req.param('id')).exec(next);
  }],
  // other queries in the transaction
  // ...
}, function(err, results) {
  if (err) {
    // roll back and pass the error along (assuming the action signature is (req, res, next))
    return User.query('ROLLBACK', function() {
      next(err);
    });
  }
  User.query('COMMIT', function() {
    // final tasks
    res.json(results.user);
  });
});
We're working on native support for transactions at the ORM level:
https://github.com/balderdashy/waterline/issues/62
Associations will likely come first, but transactions are next. We just finished GROUP BY and aggregations (SUM, AVG, etc.)
Transactions in SailsJS turned out to be much trickier than anticipated. The goal is to let the ORM adapter know that two very disparate controller actions on models are to be sent through a single MySQL connection.
The natural way to do it is to write a new adapter that accepts additional info to indicate that a query belongs to a transaction. Doing that requires a change in Waterline (the Sails ORM abstraction module) itself.
Check out whether this helps: https://www.npmjs.com/package/sails-mysql-transactions