Accessing MongoDB from Node.js - common CRUD methods

I am new to MongoDB and Node.js. So far I have been able to create a new MongoDB database and access it via Node.js. However, I want to write a generic set of methods for accessing collections (CRUD), as my list of collections will grow. For example, I have collections for books and authors:
var books = db.collection('books');
var authors = db.collection('authors');

exports.getBooks = function(callback) {
  books.find(function(e, list) {
    list.toArray(function(res, array) {
      if (array) callback(null, array);
      else callback(e, "Error!");
    });
  });
};
Similar to this, I have a method for getting authors as well. Now this is getting too repetitive, as I want to add methods for the other CRUD operations too. Is there a way to have common/generic CRUD methods for all my collections?

You should take a look at Mongoose; it makes it easy to work with MongoDB from Node.js. Mongoose has a schema-based solution: each schema maps to a MongoDB collection, and you get a set of methods to manipulate these collections via models, which are obtained by compiling the schemas. I was in exactly the same place a couple of months ago and found that Mongoose is good enough for all these needs.
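As a minimal sketch of what that looks like (the Book model and its fields below are made up for illustration, not taken from the question):

var mongoose = require('mongoose');
mongoose.connect('mongodb://localhost/mydb');

// Each schema maps to a MongoDB collection; the fields here are illustrative.
var bookSchema = new mongoose.Schema({
  title: String,
  author: String
});

// Compiling the schema gives you a model with CRUD methods built in.
var Book = mongoose.model('Book', bookSchema);

Book.create({title: 'Moby-Dick', author: 'Melville'}, function(err, book) { /* ... */ });
Book.find({author: 'Melville'}, function(err, books) { /* ... */ });
Book.update({title: 'Moby-Dick'}, {author: 'Herman Melville'}, function(err) { /* ... */ });
Book.remove({title: 'Moby-Dick'}, function(err) { /* ... */ });

You only repeat the schema definition per collection; the CRUD methods come for free from the compiled model.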

@Dilpa - not sure if you have looked at or are using Mongoose, but it can be helpful for implementing CRUD.

I wrote my own service to handle very simple CRUD operations on MongoDB documents. Mongoose is excellent, but it imposes structure on the documents (which, in my opinion, goes against the purpose of MongoDB: if you're going to have a schema, why not just use a relational DB?).
https://github.com/stupid-genius/MongoCRUD
This service also has the advantage of being implemented as a REST API, so it can be consumed by a Node.js app or by anything else. If you send a GET request to the root path of the server, you'll get a page that shows the syntax for all the CRUD operations (the GUI isn't implemented yet). My basic approach was to specify the db and collection on the URL path (host/db/collection) and pass the document in the POST body. The route handlers then just hand the document to the appropriate MongoDB function; my service exposes those methods in a fairly raw state (it does require authentication, though).
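A minimal sketch of that style of generic route handler, assuming Express 4, body-parser, and a 3.x-era Node MongoDB driver (the route shape and names are illustrative, not the actual MongoCRUD code, and there is no authentication here):

var express = require('express');
var bodyParser = require('body-parser');
var MongoClient = require('mongodb').MongoClient;

var app = express();
app.use(bodyParser.json());

MongoClient.connect('mongodb://localhost:27017', function(err, client) {
  if (err) throw err;

  // Create: POST /<db>/<collection> with the document in the request body
  app.post('/:db/:collection', function(req, res) {
    client.db(req.params.db).collection(req.params.collection)
      .insertOne(req.body, function(err, result) {
        if (err) return res.status(500).json({error: err.message});
        res.json({insertedId: result.insertedId});
      });
  });

  // Read: GET /<db>/<collection> returns every document in the collection
  app.get('/:db/:collection', function(req, res) {
    client.db(req.params.db).collection(req.params.collection)
      .find({}).toArray(function(err, docs) {
        if (err) return res.status(500).json({error: err.message});
        res.json(docs);
      });
  });

  app.listen(3000);
});

The same two handlers serve every database and collection, which is the point of pushing the db/collection choice into the URL.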

Related

Exposing Meteor's mongo DB to a stateless client

I need a legacy Java application to pull information from a Meteor collection.
Ideally, I would need a simple service from which my app could download the latest list of item prices. A scenario like this (through an HTTP GET):
www.mystore.com/listOfPrices
would return a JSON array:
[{"item": "beer", "price": "2.50"}, {"item": "water", "price": "1"}]
The problem is that I cannot make a Meteor page print the result as-is, because Meteor assumes the client supports JavaScript. Note that I do plan to implement the Java DDP client at a later stage, but here I would like to start with a very simple service.
Idea: I thought of running my own Node.js request alongside the running Meteor service in order to retrieve a snapshot of the collection. This request would use a server-side JavaScript DDP client to subscribe, filter, and then return the collection as a JSON document (array) once it is loaded.
Any ideas on how to achieve this?
Looks like you want to provide a REST interface. See the MeteorPedia page on REST for how to expose collection data. It might be as simple as:
prices = new Mongo.Collection('prices');

// Add access points for `GET`, `POST`, `PUT`, `DELETE`
HTTP.publish({collection: prices}, function (data) {
  // here you have access to this.userId, this.query, this.params
  return prices.find({});
});
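Once that is published, the legacy client only needs a plain HTTP GET against whatever path HTTP.publish mounts for the collection. Sketched here with Node's built-in http module just to show the shape of the call (the URL and path are illustrative; check what your HTTP.publish configuration actually exposes):

var http = require('http');

// URL is illustrative; use the path HTTP.publish exposes for `prices`.
http.get('http://www.mystore.com/api/prices', function(res) {
  var body = '';
  res.on('data', function(chunk) { body += chunk; });
  res.on('end', function() {
    var prices = JSON.parse(body);
    console.log(prices); // e.g. [{"item": "beer", "price": "2.50"}, ...]
  });
});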

Is the CRUD API of NeDB compatible with MongoDB?

I am looking for a MongoDB API-compatible DB engine that does not require a full-blown mongod process to run (a kind of SQLite for Node).
From multiple candidates that persistently store data on a local disk and have a similar API, I ended up with two:
NeDB https://github.com/louischatriot/nedb
tingodb http://www.tingodb.com/
Problem
I have worked with neither of them.
I am also very new to the API of MongoDB, so it is difficult for me to judge compatibility.
Requirements
I need your help/advice on picking the one library that satisfies the following:
It is stable enough.
It is fast enough to handle JSON documents of ~1 MB on disk, or bigger.
I want to be able to switch to MongoDB as a data backend in the future, or on demand, by changing a config file. I don't want to duplicate code.
DB initialization API is different
Right now only tingodb claims API compatibility, and even initialization looks fairly similar.
tingodb
var Db = require('tingodb')().Db,
    assert = require('assert');
vs
mongodb
var Db = require('mongodb').Db,
    Server = require('mongodb').Server,
    assert = require('assert');
In the case of NeDB it looks a bit different, because it uses a datastore abstraction:
// Type 1: In-memory only datastore (no need to load the database)
var Datastore = require('nedb')
  , db = new Datastore();
QUESTION
Obviously initialization is not compatible. But what about CRUD? How difficult is it to adapt?
Since most of the code I do not want to duplicate will be CRUD operations, I need to know how similar they are, i.e. how agnostic my code can be about which backend it is running on.
// If doc is a JSON object to be stored, then
db.insert(doc); // this NeDB method is compatible
// How about *WriteResult*? Does not look like it...
db.insert(doc, function (err, newDoc) {   // Callback is optional
  // newDoc is the newly inserted document, including its _id
  // newDoc has no key called notToBeSaved since its value was undefined
});
I would appreciate your insight into this choice!
Also see:
Lightweight Javascript DB for use in Node.js
Has anyone used Tungus? Is it mature?
NeDB CRUD operations are upwards compatible with MongoDB, but initialization indeed is not. NeDB implements part of MongoDB's API, not all of it, and the part that is implemented is upwards compatible.
It's definitely fast enough for your requirements, and we've made it very stable over the past few months (no more bug reports).
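As a rough sketch of what that upwards compatibility buys you (the config flag is hypothetical, and this assumes the older callback-style MongoDB driver): the same insert/findOne calls run against either backend, so only the initialization needs to branch.

var useMongo = false; // hypothetical switch you would read from a config file

function getCollection(callback) {
  if (useMongo) {
    // MongoDB backend (older callback-style driver: connect yields a db)
    require('mongodb').MongoClient.connect('mongodb://localhost:27017/mydb',
      function(err, db) {
        callback(err, db && db.collection('docs'));
      });
  } else {
    // NeDB backend: a single datastore file plays the role of the collection
    var Datastore = require('nedb');
    callback(null, new Datastore({filename: 'docs.db', autoload: true}));
  }
}

// The CRUD calls below are identical for both backends.
getCollection(function(err, docs) {
  if (err) throw err;
  docs.insert({name: 'example'}, function(err, newDoc) {
    docs.findOne({name: 'example'}, function(err, found) {
      console.log(found);
    });
  });
});

Note that find() is the main place the two diverge: NeDB takes a callback directly, while the MongoDB driver returns a cursor you have to call toArray() on.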

How to work with Node.js and MongoDB

I read:
How do I manage MongoDB connections in a Node.js web application?
http://mongodb.github.io/node-mongodb-native/driver-articles/mongoclient.html
How can I set up MongoDB on a Node.js server using node-mongodb-native in an EC2 environment?
And I am really confused. How should I work with MongoDB from Node.js? I'm a rookie, and my question may look stupid.
var mongodb = require('mongodb');
var db = new mongodb.MongoClient(new mongodb.Server('localhost', 27017));
db.open(function(err, dataBase) {
  //all code here?
  dataBase.close();
});
Or, every time I need something from the db, should I call:
MongoClient.connect("mongodb://localhost:27017/myDB", function(err, dataBase) {
  //all code here
  dataBase.close();
});
What is the difference between open and connect? I read in the manual that the first one initializes and the second one connects, but what exactly does that mean? I assume that both do the same thing, just in different ways, so when should I use one instead of the other?
I also want to ask: is it normal that MongoClient needs 4 sockets? I am running two of my web servers at the same time; here's a picture:
http://i43.tinypic.com/29mlr14.png
EDIT:
I want to mention that this isn't a problem (rather a doubt :D); my server works perfectly. I ask because I want to know whether I am using the MongoDB driver correctly.
Right now I use the first option: init the mongo driver at the beginning and put all the code inside the open callback.
I'd recommend trying the MongoDB tutorial they offer. I was in the same boat, but this breaks it down nicely. In addition, there's this article on GitHub that explains the basics of DB connections.
In short, it does look like you're doing it right.
MongoClient.connect("mongodb://localhost:27017/myDB", function(err, dataBase) {
  //all code here
  var collection = dataBase.collection('users');
  var document1 = {'name': 'John Doe'};
  collection.insert(document1, {w: 1}, function(err, result) {
    console.log(err);
    // close only after the write has completed
    dataBase.close();
  });
});
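To the original question of whether to connect on every request: a common pattern (a sketch, not the only way) is to open the connection once at startup and reuse it for the life of the process, since the driver keeps an internal connection pool; only call close() when the app shuts down.

var http = require('http');
var MongoClient = require('mongodb').MongoClient;

MongoClient.connect("mongodb://localhost:27017/myDB", function(err, dataBase) {
  if (err) throw err;

  // Start serving requests only once the connection is ready,
  // and reuse the same dataBase handle for every request.
  http.createServer(function(req, res) {
    dataBase.collection('users').findOne({name: 'John Doe'}, function(err, user) {
      res.end(JSON.stringify(user));
    });
  }).listen(8080);
});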
You can still sign up for the free course M101JS: MongoDB for Node.js Developers, provided by the MongoDB folks.
Here is a short description:
This course will go over basic installation, JSON, schema design, querying, insertion of data, indexing, and working with language drivers. In the course, you will build a blogging platform, backed by MongoDB. Our code examples will be in Node.js.
I had the same question. I couldn't find a proper answer in the Mongo documentation.
All the documentation says is to prefer creating a new db connection and then using open (rather than using connect()):
http://docs.mongodb.org/manual/reference/method/connect/
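For what it's worth, with the 1.x driver shown in the question the two styles end up in the same place; connect() is essentially a URL-based shortcut for constructing the client and opening it yourself. A small sketch, assuming that same driver version:

var mongodb = require('mongodb');

// Style 1: construct the client explicitly, then open it
var client = new mongodb.MongoClient(new mongodb.Server('localhost', 27017));
client.open(function(err, client) {
  var db = client.db('myDB');
  // ... work with db, keeping it open for the life of the app ...
});

// Style 2: let connect() parse a URL and open the connection for you
mongodb.MongoClient.connect('mongodb://localhost:27017/myDB', function(err, db) {
  // ... work with db ...
});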

Transactional SQL with Sails.js

So I have been playing with Node.js/Express for a little while now, and I would really like to try rewriting a relatively large side project using a full JavaScript stack, just to see how it will work. Sails.js seems to be a pretty good choice for a Node.js backend for a REST API with support for web sockets, which is exactly what I am looking for. However, there is one more issue I am looking to resolve, and that is transactional SQL within Node.js.
Most data layers/ORMs I have seen on the Node.js side of things don't seem to support transactions when dealing with MySQL. The ORM provided with Sails.js (Waterline) also does not seem to support transactions, which is odd because I have seen places where it is mentioned that it did, though those comments are quite old. Knex.js has support for transactions, so I was wondering whether it is easy to replace the ORM in Sails.js with it (or whether Sails.js assumes Waterline in the core framework).
I was also wondering whether there is an ORM built on top of Knex.js besides Bookshelf, as I am not a fan of Backbone's Model/Collection system.
You can still write SQL queries directly using Model.query(). Since this is an asynchronous function, you'll have to use promises or the async library to serialize the calls. For instance, using the MySQL adapter, async, and a model called User:
async.auto({
  transaction: function(next) {
    User.query('BEGIN', next);
  },
  user: ['transaction', function(next) {
    User.findOne(req.param('id')).exec(next);
  }],
  // other queries in the transaction
  // ...
}, function(err, results) {
  if (err) {
    // roll back and report the failure
    return User.query('ROLLBACK', function() {
      res.serverError(err);
    });
  }
  User.query('COMMIT', function(commitErr) {
    if (commitErr) return res.serverError(commitErr);
    // final tasks
    res.json(results.user);
  });
});
We're working on native support for transactions at the ORM level:
https://github.com/balderdashy/waterline/issues/62
Associations will likely come first, but transactions are next. We just finished GROUP BY and aggregations (SUM, AVG, etc.)
Transactions in Sails.js turned out to be much trickier than anticipated. The goal is to let the ORM adapter know that two very disparate controller actions on models are to be sent through a single MySQL connection.
The natural way to do this is to write a new adapter that accepts additional info indicating that a query belongs to a transaction. Doing that requires a change in Waterline (the Sails ORM abstraction module) itself.
Check out whether this helps: https://www.npmjs.com/package/sails-mysql-transactions

How does Meteor receive updates to the results of a MongoDB query?

I asked a question a few months ago, to which Meteor seems to have the answer.
Which, if any, of the NoSQL databases can provide stream of *changes* to a query result set?
How does Meteor receive updates to the results of a MongoDB query?
Thanks,
Chris.
You want query.observe() for this. Say you have a Posts collection with a tags field, and you want to get notified when a post with the important tag is added.
http://docs.meteor.com/#observe
// collection of posts that includes an array of tags
var Posts = new Meteor.Collection('posts');

// DB cursor to find all posts with 'important' in the tags array.
var cursor = Posts.find({tags: 'important'});

// watch the cursor for changes
var handle = cursor.observe({
  added: function (post) { /* ... */ },   // run when a post is added
  changed: function (post) { /* ... */ }, // run when a post is changed
  removed: function (post) { /* ... */ }  // run when a post is removed
});
You can run this code on the client if you want to do something in each browser when a post changes, or you can run it on the server if you want to, say, send an email to the team when an important post is added.
Note that added and removed refer to the query, not the document. If you have an existing post document and run
Posts.update(my_post_id, {$addToSet: {tags: 'important'}});
this will trigger the 'added' callback, since the post is getting added to the query result.
Currently, Meteor works really well with one instance/process. In that case all queries go through that instance, and it can broadcast the changes back to the other clients. Additionally, it polls MongoDB every 10 seconds for changes to the database made by outside queries. There are plans for 1.0 to improve scalability and hopefully allow multiple instances to inform each other about changes.
DerbyJS, on the other hand, uses Redis Pub/Sub.
From the docs:
On the server, a collection with that name is created on a backend Mongo server. When you call methods on that collection on the server, they translate directly into normal Mongo operations.
On the client, a Minimongo instance is created. Minimongo is essentially an in-memory, non-persistent implementation of Mongo in pure JavaScript. It serves as a local cache that stores just the subset of the database that this client is working with. Queries on the client (find) are served directly out of this cache, without talking to the server.
When you write to the database on the client (insert, update, remove), the command is executed immediately on the client, and, simultaneously, it's shipped up to the server and executed there too. The livedata package is responsible for this.
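A small illustration of that behaviour (the collection and fields are made up): a write issued on the client lands in the local Minimongo cache immediately, while livedata ships the same write to the server in the background.

var Posts = new Meteor.Collection('posts');

// On the client: the insert is applied to Minimongo right away
// and sent to the server behind the scenes.
Posts.insert({title: 'Hello', tags: ['important']});

// This find() is served from the local cache, so the new post is
// already visible without waiting for a server round trip.
console.log(Posts.find({tags: 'important'}).count());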
That explains client to server.
Server to client, from what I can gather, is handled by the livedata and mongo-livedata packages:
https://github.com/meteor/meteor/tree/master/packages/mongo-livedata
https://github.com/meteor/meteor/tree/master/packages/livedata
Hope that helps.
