How to work with node.js and mongoDB - node.js

I read :
How do I manage MongoDB connections in a Node.js web application?
http://mongodb.github.io/node-mongodb-native/driver-articles/mongoclient.html
How can I set up MongoDB on a Node.js server using node-mongodb-native in an EC2 environment?
And I am really confused. How should I work with MongoDB from Node.js? I'm a rookie, and my question may look stupid.
var db = new db.MongoClient(new db.Server('localhost', 27017));
db.open(function(err, dataBase) {
  // all code here?
  dataBase.close();
});
Or do I need to call this every time I need something from the db:
MongoClient.connect("mongodb://localhost:27017/myDB", function(err, dataBase) {
//all code here
dataBase.close();
});
What is the difference between open and connect? I read in the manual that open "initializes" and connect "connects", but what exactly does that mean? I assume both do the same thing in different ways, so when should I use one instead of the other?
I also want to ask: is it normal that MongoClient needs 4 sockets? I'm running two of my web servers at the same time; here's a picture:
http://i43.tinypic.com/29mlr14.png
EDIT:
I want to mention that this isn't a problem (rather a doubt :D); my server works perfectly. I ask because I want to know whether I am using the MongoDB driver correctly.
Currently I use the first option: initialize the Mongo driver at the beginning and put all the code inside the open callback.

I'd recommend trying the MongoDB tutorial they offer. I was in the same boat, but this breaks it down nicely. In addition, there's this article on github that explains the basics of DB connection.
In short, it does look like you're doing it right.
MongoClient.connect("mongodb://localhost:27017/myDB", function(err, dataBase) {
//all code here
var collection = dataBase.collection('users');
var document1 = {'name':'John Doe'};
collection.insert(document1, {w:1}, function(err,result){
console.log(err);
});
dataBase.close();
});

You can also sign up for the free course M101JS: MongoDB for Node.js Developers, provided by the MongoDB folks.
Here is a short description:
This course will go over basic installation, JSON, schema design,
querying, insertion of data, indexing and working with language
drivers. In the course, you will build a blogging platform, backed by
MongoDB. Our code examples will be in Node.js.

I had the same question and couldn't find a proper answer in the Mongo documentation.
All the documentation says is to prefer creating a new db connection and then using open() (rather than using connect()):
http://docs.mongodb.org/manual/reference/method/connect/

Related

How to integrate PouchDB with a CouchDB application that is built with the jQuery couch API?

I developed an application with CouchDB using the jQuery couch API. Now I want to add PouchDB to my application as well. What should the workflow for this implementation be?
For example, currently I insert data into CouchDB like this:
var mydata = {
  _id: 'dave@gmail.com',
  name: 'David',
  age: 68
};
$.couch.db('dbname').saveDoc(mydata, {
  success: function(data) {
    console.log(data);
  }
});
And in the PouchDB tutorial, they give an example of inserting like the following:
var db = new PouchDB('dbname');
db.put({
  _id: 'dave@gmail.com',
  name: 'David',
  age: 68
});
So now, as I want to add PouchDB to my application, do I need to replace the whole jQuery couch API with the PouchDB API, or can I still achieve this with the jQuery couch API?
Also, my already developed CouchDB application has 3 GB of data. When I try to apply two-way sync, it just doesn't happen. How can I handle this?
You can either write a proxy that takes the $.couch calls and forwards them to PouchDB, or replace the $.couch calls with PouchDB calls. I would personally do the latter; the two APIs are fairly closely matched, so it should be mostly a simple find/replace.
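To illustrate how closely the calls map, here is a rough sketch (not from the original post) of the saveDoc example above rewritten with PouchDB; db.put accepts a Node-style callback alongside its promise interface:

var db = new PouchDB('dbname');

// Roughly equivalent to $.couch.db('dbname').saveDoc(mydata, { success: ... })
db.put({
  _id: 'dave@gmail.com',
  name: 'David',
  age: 68
}, function(err, response) {
  if (err) { return console.log(err); }
  console.log(response); // contains ok, id and rev, similar to the $.couch success data
});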
As for the sync issue, 3 GB is a fair amount of data to load into the browser, but it should work (with prompts) or at least fail in a way that tells you why. If you are seeing something not working, please feel free to file an issue in the PouchDB repo.

Connection to Mongodb-Native-Driver in express.js

I am using mongodb-native-driver in an express.js app. I have around 6 collections in the database, so I have created 6 js files, each having a collection as a JavaScript object (e.g. function collection(){}) with prototype functions handling all the manipulation on that collection. I thought this would be a good architecture.
But the problem I am having is how to connect to the database. Should I create a connection in each of these files and use it? I think that would be overkill, as connect in mongodb-native-driver creates a pool of connections, and having several pools would not be justified.
So how do I create a single connection pool and use it in all the collection .js files? I want the connection handled the way it is implemented in Mongoose. Let me know if any of my thinking about the architecture of the app is wrong.
Using Mongoose would solve these problems, but I have read in several places that it is slower than the native driver, and I would also prefer schema-less models.
Edit: I created a module out of the models. Each collection was in its own file and took the database as an argument. In the index.js file I opened the database connection and kept the resulting database in a variable db (I used the auto-reconnect feature to make sure that the connection wasn't lost). In the same index.js file I exported each of the collections like this:
exports.model1 = require('./model1')(db);
exports.model2 = require('./model2')(db);
This ensured that the database part was handled in just one module, and the app would just call the functions each model .js file exported, like save(), findById(), etc. (whatever you do inside those functions is up to you to implement).
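To make that edit concrete, here is a rough sketch of what one of those model files could look like; the file name model1.js, the collection name and the exported functions are only illustrative placeholders, not the OP's actual code:

// model1.js - exports a factory that receives the shared db object (a sketch, not the OP's code)
var ObjectID = require('mongodb').ObjectID;

module.exports = function(db) {
  var collection = db.collection('model1');

  return {
    // save a document and pass the result to the callback
    save: function(doc, callback) {
      collection.insert(doc, {w: 1}, callback);
    },

    // look up a document by its _id
    findById: function(id, callback) {
      collection.findOne({_id: new ObjectID(id)}, callback);
    }
  };
};

index.js then wires it up exactly as shown above, with exports.model1 = require('./model1')(db).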
how to connect to the database?
In order to connect using the MongoDB native driver you need to do something like the following:
var util = require('util');
var mongodb = require('mongodb');
var client = mongodb.MongoClient;

var auth = {
  user: 'username',
  pass: 'password',
  host: 'hostname',
  port: 1337,
  name: 'databaseName'
};

var uri = util.format('mongodb://%s:%s@%s:%d/%s',
  auth.user, auth.pass, auth.host, auth.port, auth.name);

/** Connect to the Mongo database at the URI using the client */
client.connect(uri, { auto_reconnect: true }, function (err, database) {
  if (err) throw err;
  else if (!database) console.log('Unknown error connecting to database');
  else {
    console.log('Connected to MongoDB database server at:');
    console.log('\n\t%s\n', uri);
    // Create or access collections, etc. here using the database object
  }
});
A basic connection is set up like this. This is all I can give you based on just the basic description of what you want. Post up some code you've got so far to get more specific help.
Should I create a connection in each of this files and use them?
No.
So how do I create a single connection pool and use it in all the collections.js files?
You can create a single file with code like the above, let's call it dbmanager.js, that connects to the database. Export functions like createUser, deleteUser, etc. which operate on your database, like so:
module.exports = {
  createUser: function () { ; },
  deleteUser: function () { ; }
};
which you could then require from another file like so:
var dbman = require('./dbmanager');
dbman.createUser(userData); // using connection established in `dbmanager.js`
EDIT: Because we're dealing with JavaScript and a single thread, the native driver indeed automatically handles connection pooling for you. You can look for this in the StackOverflow links below for more confirmation of this. The OP does state this in the question as well. This means that client.connect should be called only once by an instance of your server. After the database object is successfully retrieved from a call to client.connect, that database object should be reused throughout the entire instance of your app. This is easily accomplished by using the module pattern that Node.JS provides.
My suggestion is to create a module or set of modules which serves as a single point of contact for interacting with the database. In my apps I usually have a single module which depends on the native driver, calling require('mongodb'). All other modules in my app will not directly access the database, but instead all manipulations must be coordinated by this database module.
This encapsulates all of the code dealing with the native driver into a single module or set of modules. The OP seems to think there is a problem with the simple code example I've posted, describing a problem with a "single large closure" in my example. This is all pretty basic stuff, so I'm adding clarification as to the basic architecture at work here, but I still do not feel the need to change any code.
The OP also seems to think that multiple connections could possibly be made here. This is not possible with this setup. If you created a module like I suggest above, then the first time require('./dbmanager') is called it will execute the code in the file dbmanager.js and return the module.exports object. The exports object is cached and is also returned on each subsequent call to require('./dbmanager'); however, the code in dbmanager.js will only be executed on the first require.
If you don't want to create a module like this then the other option would be to export only the database passed to the callback for client.connect and use it directly in different places throughout your app. I recommend against this however, regardless of the OPs concerns.
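For concreteness, here is a rough sketch of the module approach recommended above; the connection details mirror the earlier snippet, and createUser, deleteUser and the users collection name are only illustrative:

// dbmanager.js - connects once, then exposes functions that reuse the same database object
var mongodb = require('mongodb');
var client = mongodb.MongoClient;

var db = null; // shared database object, set once connect succeeds

client.connect('mongodb://localhost:27017/databaseName', { auto_reconnect: true }, function (err, database) {
  if (err) throw err;
  db = database;
  // note: requests arriving before this callback fires would find db === null;
  // the next answer addresses that race
});

module.exports = {
  createUser: function (userData, callback) {
    // reuses the single connection pool established above
    db.collection('users').insert(userData, {w: 1}, callback);
  },
  deleteUser: function (userId, callback) {
    db.collection('users').remove({_id: userId}, {w: 1}, callback);
  }
};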
Similar, possibly duplicate Stackoverflow questions, among others:
How to manage mongodb connections in nodejs webapp
Node.JS and MongoDB, reusing the DB object
Node.JS - What is the right way to deal with MongoDB connections
As the accepted answer says, you should create only one connection for all incoming requests and reuse it, but that answer is missing a solution that will create and cache the connection. I wrote Express middleware to achieve this: express-mongo-db. At first sight this task looks trivial, and most people use this kind of code:
var db;

function createConnection(req, res, next) {
  if (db) { req.db = db; return next(); }
  client.connect(uri, { auto_reconnect: true }, function (err, database) {
    req.db = db = database;
    next();
  });
}

app.use(createConnection);
But this code leads to a connection leak when multiple requests arrive at the same time and db is still undefined: each of them calls connect and opens its own pool. express-mongo-db solves this by holding incoming requests and calling connect only once, when the module is required (not when the first request arrives).
Hope you find it useful.
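For illustration, here is a rough sketch of the kind of queueing pattern that avoids the race described above (this is not the express-mongo-db source, just a minimal hand-rolled version of the same idea; client, uri and app are assumed from the snippet above):

var db = null;
var waiting = [];      // callbacks for requests that arrived before the connection was ready
var connecting = false;

function withDb(req, res, next) {
  if (db) { req.db = db; return next(); }

  // queue this request until the single connect call finishes
  waiting.push(function (err) {
    if (err) { return next(err); }
    req.db = db;
    next();
  });

  if (connecting) { return; }  // an earlier request already kicked off connect
  connecting = true;

  client.connect(uri, { auto_reconnect: true }, function (err, database) {
    if (!err) { db = database; }
    var queued = waiting;
    waiting = [];
    queued.forEach(function (cb) { cb(err); });
  });
}

app.use(withDb);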
I just thought I would add my own method of connecting to MongoDB, for others who are interested or are having problems with the other methods.
This method assumes you don't need authentication (I use this on localhost); authentication is still easy to implement, as noted after the code.
var MongoClient = require('mongodb').MongoClient;
var Server = require('mongodb').Server;

var client = new MongoClient(new Server('localhost', 27017, {
  socketOptions: { connectTimeoutMS: 500 },
  poolSize: 5,
  auto_reconnect: true
}, {
  numberOfRetries: 3,
  retryMilliseconds: 500
}));

client.open(function(err, client) {
  if (err) {
    console.log("Connection Failed Via Client Object.");
  } else {
    var db = client.db("theDbName");
    if (db) {
      console.log("Connected Via Client Object . . .");
      db.logout(function(err, result) {
        if (!err) {
          console.log("Logged out successfully");
        }
        client.close();
        console.log("Connection closed");
      });
    }
  }
});
Credit goes to Brad Davley, who goes over this method in his book (pages 231-232).
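On the authentication point: one minimal sketch of adding it is to put the credentials in the connection URI (user, pass and theDbName below are placeholders, not values from the book):

// Same idea, but with credentials in the URI instead of an unauthenticated localhost connection
var MongoClient = require('mongodb').MongoClient;

MongoClient.connect('mongodb://user:pass@localhost:27017/theDbName', function(err, db) {
  if (err) {
    console.log("Authenticated connection failed.");
  } else {
    console.log("Connected and authenticated . . .");
    // use db here, then close when done
    db.close();
  }
});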

Transactional SQL with Sails.js

So I have been playing with NodeJS/Express for a little while now, and I would really like to try to rewrite a relatively large side project using a full JavaScript stack, just to see how it will work. Sails.js seems to be a pretty good choice for a NodeJS backend for a REST API with support for web sockets, which is exactly what I am looking for; however, there is one more issue I am looking to resolve, and that is transactional SQL within NodeJS.
Most data layers/ORMs I have seen on the NodeJS side of things don't seem to support transactions when dealing with MySQL. The ORM provided with Sails.js (Waterline) also does not seem to support transactions, which is weird because I have seen places where it is mentioned that it did, though those comments are quite old. Knex.js has support for transactions, so I was wondering if it is easy to replace the ORM in Sails.js with it (or if Sails.js assumes Waterline in the core framework).
I was also wondering if there is an ORM built on top of Knex.js besides Bookshelf, as I am not a fan of Backbone's Model/Collection system.
You can still write SQL queries directly using Model.query(). Since this is an asynchronous function, you'll have to use promises or async to reserialize it. For instance, using the MySQL adapter, async, and a model called User:
async.auto({
  transaction: function(next) {
    User.query('BEGIN', next);
  },
  user: ['transaction', function(next) {
    User.findOne(req.param('id')).exec(next);
  }],
  // other queries in the transaction
  // ...
}, function(err, results) {
  if (err) {
    // roll back; next here is assumed to come from the enclosing controller action
    User.query('ROLLBACK', next);
    return next(err);
  }
  User.query('COMMIT', next);
  // final tasks
  res.json(results.serialize);
});
We're working on native support for transactions at the ORM level:
https://github.com/balderdashy/waterline/issues/62
Associations will likely come first, but transactions are next. We just finished GROUP BY and aggregations (SUM, AVG, etc.)
Transactions in SailsJS turned out to be much trickier than anticipated. The goal is to let the ORM adapter know that two very disparate controller actions on models are to be sent through a single MySQL connection.
The natural way to do it is to write a new adapter that accepts additional info to indicate that a query belongs to a transaction. Doing that requires a change in Waterline (the Sails ORM abstraction module) itself.
Check whether this helps: https://www.npmjs.com/package/sails-mysql-transactions

Using node ddp-client to insert into a meteor collection from Node

I'm trying to stream some syslog data into Meteor collections via node.js. It's working fine, but the Meteor client polling cycle of ~10 sec is too long for my tastes; I'd like it to be ~1 second.
Client-side collection inserts via console are fast and all clients update instantly, as it's using DDP. But a direct MongoDB insert from the server side is subject to the polling cycle of the client(s).
So it appears that for now I'm relegated to using DDP to insert updates from my node daemon.
In the ddp-client package example, I'm able to see messages I've subscribed to, but I don't see how to actually send new messages into the Meteor collection via DDP and node.js, thereby updating all of the clients at once...
Any examples or guidance? I'd greatly appreciate it - as a newcomer to node and Meteor, I'm quickly hitting my limits.
OK, I got it working after looking closely at some code and realizing I was totally over-thinking things. The protocol is actually pretty straightforward, RPC-ish stuff.
I'm happy to report that it absolutely worked around the server-side insert delay (manual Mongo inserts were taking several seconds to poll/update the clients).
If you go through DDP, you get all the real-time(ish) goodness that you've come to know and love with Meteor :)
For posterity and to hopefully help drive some other folks to interesting use cases, here's the setup.
Use Case
I am spooling some custom syslog data into a node.js daemon. This daemon then parses and inserts the data into Mongo. The idea was to come up with a real-timey browser based reporting project for my first Meteor experiment.
All of it worked well, but because I was inserting into Mongo outside of Meteor proper, the clients had to poll every ~10 seconds. In another SO post @TimDog suggested I look at DDP for this, and his suggestion looks to have worked perfectly.
I've tested it on my system, and I can now instantly update all Meteor clients via a node.js async application.
Setup
The basic idea here is to use the DDP "call" method. It takes a list of parameters. On the Meteor server side, you export a Meteor method to consume these and do your MongoDB inserts. It's actually really simple:
Step 1: npm install ddp
Step 2: Go to your Meteor server code and do something like this, inside of Meteor.methods:
Meteor.methods({
  'push': function(k, v) { // k, v will be passed in from the DDP client.
    console.log("got a push request");
    var d = {};
    d[k] = parseInt(v);
    Counts.insert(d, function(err, result) { // Now, simply use your Collection object to insert.
      if (!err) {
        return result;
      } else {
        return err;
      }
    });
  }
});
Now all we need to do is call this remote method from our node.js server, using the client library. Here's an example call, which is essentially a direct copy from the example.js calls, tweaked a bit to hook our new 'push' method that we've just exported:
ddpclient.call('push', ['hits', '1111'], function(err, result) {
  console.log('called function, result: ' + result);
});
Running this code inserts via the Meteor server, which in turn instantly updates the clients that are connected to us :)
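For completeness, the ddpclient used above has to be created and connected first. A rough sketch, assuming the ddp npm package and a Meteor server on localhost:3000 (adjust host and port to your setup):

var DDPClient = require('ddp');

var ddpclient = new DDPClient({
  host: 'localhost', // where the Meteor server is running
  port: 3000
});

ddpclient.connect(function(err) {
  if (err) { return console.log('DDP connection failed: ' + err); }
  // once connected, remote Meteor methods such as 'push' can be called
  ddpclient.call('push', ['hits', '1111'], function(err, result) {
    console.log('called function, result: ' + result);
  });
});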
I'm sure my code above isn't perfect, so please chime in with suggestions. I'm pretty new to this whole ecosystem, so there's a lot of opportunity to learn here. But I do hope that this helps save some folks a bit of time. Now, back to focusing on making my templates shine with all this real-time data :)
According to this screencast it's possible to simply call the Meteor methods declared by the collection. In your case the code would look like this:
ddpclient.call('/counts/insert', [{hits: 1111}], function(err, result) {
  console.log('called function, result: ' + result);
});

How to create a database and expose it over HTTP in order to receive data from a sensor

First of all, I am very much a newbie to all the technologies mentioned in this post.
I am working on something where I have sensors which send their readings via HTTP POST. Each sensor sends its sensed value periodically over HTTP as XML.
I found a link here that explains how to create REST API.
Now the link above is quite clear until the point where the author installs MongoDB. After that point things get complex, and the author doesn't explain what is happening in the code.
What I am not able to figure out is:
How to create a database in Node.js using MongoDB and expose this database over HTTP, so the sensors can send their readings.
How can I access this database from my URIs?
How can I access the date and time the data was added to the database?
I would really appreciate any help.
I am no node.js expert but the same rules apply across the board for this kind of stuff.
You must understand that your database will not be accessible directly. Instead it will be accessed from node.js. You can add a --rest option to MongoDB's startup, which will start a self-contained RESTlet within the mongod program, but that is probably not a great idea here.
As far as I can see you're just confused about the layers, which is common in this scenario, so to explain:
Your sensors will POST data (I would probably change that to JSON format; it is more expressive and smaller than XML) out to your node.js server running on, e.g., 81.187.205.13.
It will POST to whatever route your REST function that deals with this data is mounted on, e.g. /someawesomecontroller/notsuchagoodfunction.
That function (as described by the tutorial you linked) will then pick up this POST, parse it and use the driver within node.js to insert into MongoDB. You can see the author of that tutorial doing this in the later parts, e.g.:
exports.findById = function(req, res) {
  var id = req.params.id;
  console.log('Retrieving wine: ' + id);
  db.collection('wines', function(err, collection) {
    collection.findOne({'_id': new BSON.ObjectID(id)}, function(err, item) {
      res.send(item);
    });
  });
};
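To cover the insert side of the flow described above, and the question about knowing when a reading arrived, here is a rough sketch of a POST handler in the same style as the tutorial; the function name addReading, the readings collection and the addedAt field are only illustrative placeholders:

exports.addReading = function(req, res) {
  var reading = req.body;       // the parsed sensor payload (JSON, via body-parsing middleware)
  reading.addedAt = new Date(); // record when the data was added, so it can be queried later

  db.collection('readings', function(err, collection) {
    collection.insert(reading, {w: 1}, function(err, result) {
      if (err) {
        res.send({'error': 'An error has occurred'});
      } else {
        res.send(result);
      }
    });
  });
};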
So really now all you need are some tutorials on how the MongoDB driver in node.js works; here is a nice starting place: Do you know any tutorial for mongoDB in nodeJS?
Hope it helps,
