Where do I need to call dropCollection using Node.js with MongoDB? - node.js

I was building a server with Node.js that needs to get data from MongoDB. I retrieve the data after require('../db.js'). Somebody said the MongoDB connection doesn't need to be closed in Node.js because Node.js is a single process.
My question: do I need to call dropCollection to close the collection after invoking the db function many times? If so, how and where should I do that? Thanks.

You don't need to drop the collection after invoking db functions; simply call db.close(), though even that is not required. But if you do want to drop a collection, you can do it as follows:
var dropRestaurants = function(db, callback) {
    db.collection('restaurants').drop(function(err, response) {
        console.log(response);
        callback();
    });
};
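A minimal usage sketch, assuming a local MongoDB instance and the 2.x node driver; the database name 'test' is only an illustration:
var MongoClient = require('mongodb').MongoClient;

MongoClient.connect('mongodb://localhost:27017/test', function(err, db) {
    if (err) throw err;
    dropRestaurants(db, function() {
        db.close(); // close the connection once you are done with it
    });
});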

Related

How do I ensure a db call is completed before using db.close() [in node.js to access MongoDB]

var storedArticleArray = db.collection('storedArticle').find(query).toArray();
console.dir(storedArticleArray);
db.close();
How can I ensure that console.dir(storedArticleArray) displays its argument only after the database has completed the query and stored the result in storedArticleArray?
Also, how do I ensure that db.close() does not close the connection before the query has completed?
Does this work:
var storedArticleArray = db.collection('').find(query).toArray(function() {
    console.dir(storedArticleArray);
    db.close();
});
You must use Node.js callback functions when you query your MongoDB database, because they're asynchronous operations.
Following your example you can use:
var storedArticleArray = [];
db.collection('storedArticle').find(query).toArray(function(error, data) {
    if (error) { /* handle the error */ }
    storedArticleArray = data;
    console.dir(storedArticleArray);
    db.close();
});
The callback function will be executed once the query has completed and returned its data (or an error, which you must always handle). Inside the callback you can be sure it is safe to close the db connection.

Node Js MongoDB Query Against returned array

I have a MongoDB Relationships collection that stores the user_id and the followee_id (the person the user is following). If I query against the user_id I can find all the individuals the user is following. Next I need to query the Users collection against all of the returned followee ids to get their personal information. This is where I am confused. How would I accomplish this?
NOTE: I know I could embed the followees in the individual user's document and use an $in operator, but I do not want to go this route. I want to maintain as much flexibility as I can.
You can use an $in query without denormalizing the followees on the user. You just need to do a little bit of data manipulation:
Relationship.find({user_id: user_id}, function(error, relationships) {
    var followee_ids = relationships.map(function(relationship) {
        return relationship.followee_id;
    });
    User.find({_id: { $in: followee_ids }}, function(error, users) {
        // voila
    });
});
If I understood your problem correctly, you need to query each of the "individuals the user is following". That means running multiple queries against the database, one per followee, and collecting the data.
Because queries in Node.js (I assume you are using Mongoose) are asynchronous, you need to structure your code asynchronously for this task.
If you are not familiar with the async module in Node.js, now is a good time to learn it; see npm async for the docs.
Here is a sample of how your query could look:
/* followee_id_arr is the array of followee_id values from the previous query */
function query(followee_id_arr, callback) {
    var async = require('async');
    var allResults = [];
    async.eachSeries(followee_id_arr, function(f_id, done) {
        db.userCollection.findOne({_id: f_id}, {_id: 1, personalData: 1}, function(err, data) {
            if (err) { return done(err); /* handle the error */ }
            allResults.push(data);
            done();
        });
    }, function(err) {
        callback(err, allResults);
    });
}
You can even run all the queries in parallel (for better performance) by using async.map, as sketched below.
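A minimal sketch of the parallel variant, assuming the same db handle, collection and field names as the sample above:
var async = require('async');

function queryParallel(followee_id_arr, callback) {
    // Fires one findOne per followee id in parallel and collects the results
    async.map(followee_id_arr, function(f_id, done) {
        db.userCollection.findOne({_id: f_id}, {_id: 1, personalData: 1}, done);
    }, function(err, allResults) {
        // allResults is in the same order as followee_id_arr
        callback(err, allResults);
    });
}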

How to insert csv data into MongoDB with Node.js

Hi, I'm developing an app with Node.js, Express and MongoDB, and I need to take user data from a csv file and upload it to my database; the db has a schema designed with Mongoose.
But I don't know how to do this. What is the best approach to read the csv file, check for duplicates against the db, and insert the user (one column in the csv) if it is not already there?
Are there modules for this, or do I need to build it from scratch? I'm pretty new to Node.js and could use a few pointers here.
Thanks.
The app has an Angular frontend so the user can upload the file. Maybe I should read the csv on the frontend, transform it into an array for Node, and then insert it?
Use one of the several node.js csv libraries like this one, and then you can probably just run an upsert on the user name.
An upsert is an update query with the upsert flag set to true: {upsert: true}. This will insert a new record only if the search returns zero results. So your query may look something like this:
db.collection.update({username: userName}, newDocumentObj, {upsert: true})
Where userName is the current username you're working with and newDocumentObj is the json document that may need to be inserted.
However, if the query does return a result, it performs an update on those records.
EDIT:
I've decided that an upsert is not appropriate for this but I'm going to leave the description.
You're probably going to need to do two queries here, a find and a conditional insert. For this find query I'd use the toArray() function (instead of a stream) since you are expecting 0 or 1 results. Check if you got a result on the username and if not insert the data.
Read about node's mongodb library here.
EDIT in response to your comment:
It looks like you're reading data from a local csv file, so you should be able to structure your program like this:
function connect(callback) {
    var connStr = 'mongodb://' + host + ':' + port + '/' + schema; // command line args, may or may not be needed; hard code if not
    MongoClient.connect(connStr, function(err, db) {
        if (err) {
            callback(err, null);
        } else {
            var colObj = db.collection(collection); // command line arg, hard code if not needed
            callback(null, colObj);
        }
    });
}

connect(function(err, colObj) {
    if (err) {
        console.log('Error:', err.stack);
        process.exit(0);
    } else {
        console.log('Connected');
        doWork(colObj, function(err) {
            if (err) {
                console.log(err.stack);
                process.exit(0);
            }
        });
    }
});

function doWork(colObj, callback) {
    csv().from('/path/to/file.csv').on('data', function(data) {
        // mongo query (colObj.find) for data.username, or however the data is structured
        // inside the callback for colObj.find, check for results; if there are none, insert data with colObj.insert
        // call doWork's callback inside the insert callback, or in the else branch of the find check
    });
}
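For concreteness, here is a hedged sketch of how the commented steps inside that 'data' handler might be filled in; the username field on the csv row and the error handling are assumptions, not something from the original question:
function doWork(colObj, callback) {
    csv().from('/path/to/file.csv').on('data', function(data) {
        // Look for an existing user with the same username (field name is an assumption)
        colObj.find({username: data.username}).toArray(function(err, results) {
            if (err) { return callback(err); }
            if (results.length === 0) {
                // No duplicate found, so insert the csv row
                colObj.insert(data, function(err) {
                    if (err) { return callback(err); }
                });
            }
        });
    });
}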

NodeJS with arangojs and sync: Everything after .sync() ignored?

I want to use NodeJS to read 60k records from a MySQL database and write them to an ArangoDB database. I will later use ArangoDB's aggregation features etc. to process my dataset.
Coming from PHP, where a script usually runs synchronous, and because I believe it makes sense here, my initial (naive) try was to make my NodeJS script run sync too. However, it doesn't work as expected:
I print to console, call a function via .sync() to connect to ArangoDB server and print all existing databases, then print to console again. But everything below the sync call to my ArangoDB function is completely ignored (does not print to console again, nor does it seem to execute anything else here).
What am I overlooking? Does .done() in the function called via .sync() cause trouble?
var mysql = require('node-mysql');
var arango = require('arangojs');
//var sync = require('node-sync'); // Wrong one!
var sync = require('sync');

function test_arango_query() {
    var db = arango.Connection("http://localhost:8529");
    db.database.list().done(function(res) {
        console.log("Databases: %j", res);
    });
    return "something?";
}

sync(function() {
    console.log("sync?");
    var result = test_arango_query.sync();
    console.log("done."); // DOES NOT PRINT, NEVER EXECUTED?!
    return result;
}, function(err, result) {
    if (err) console.error(err);
    console.log(result);
});
Your function test_arango_query doesn't use a callback, and sync only works with functions that take a callback. It needs to know when the data is ready in order to return it from .sync(); if your function never calls the callback, sync can't ever return a result.
Update your function to call a callback when you want it to return:
function test_arango_query(callback) {
    var db = arango.Connection("http://localhost:8529");
    db.database.list().done(function(res) {
        console.log("Databases: %j", res);
        callback('something');
    });
}

How to handle Node/MongoDB connection management?

I'm using the node-mongodb-native to connect to a local MongoDB instance. I'm having a little trouble wrapping my head around how to handle the connections. I've attempted to abstract the MongoDB stuff into a custom Database module:
Database.js
var mongo = require('mongodb');

var Database = function() { return this; };

Database.prototype.doStuff = function doStuff(callback) {
    mongo.connect('mongodb://127.0.0.1:27017/testdb', function(err, conn) {
        conn.collection('test', function(err, coll) {
            coll.find({}, function(err, cursor) {
                cursor.toArray(function(err, items) {
                    conn.close();
                    return callback(err, items);
                });
            });
        });
    });
};

// Testing
new Database().doStuff(function(err, items) {
    console.log(err, items);
});
Is a new connection required for each method? That seems like it would get expensive awfully quick. I imagined that perhaps the connection would be established in the constructor and subsequent calls would leverage the existing connection.
This next question may be more of a design question, but considering how connection setup and tear-down may be expensive operations, I'm considering adding a Database object that is global to my application that can be leveraged to make calls to the database. Does this seem reasonable?
Please note that the code above was roughly taken from here. Thanks for your help.
You don't need a new connection for each method; you can open it once and use it for subsequent calls. The same applies to the individual collection variables: you can cache the result of a single call to collection(), so you only need those callbacks once and can leave them out everywhere else. A sketch of this approach follows.
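A minimal sketch of that idea, reusing the Database module from the question; the connection string and the 'test' collection come from the question's code, while the connect()/doStuff() split is only an illustration:
var mongo = require('mongodb');

var Database = function() {
    this.collection = null;
    return this;
};

// Connect once and cache the collection handle
Database.prototype.connect = function connect(callback) {
    var self = this;
    mongo.connect('mongodb://127.0.0.1:27017/testdb', function(err, conn) {
        if (err) { return callback(err); }
        self.collection = conn.collection('test');
        callback(null);
    });
};

// Subsequent calls reuse the cached collection; no connect/close per call
Database.prototype.doStuff = function doStuff(callback) {
    this.collection.find({}).toArray(callback);
};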
