Node.js MongoDB query against a returned array

I have a MongoDB Relationships collection that stores the user_id and the followee_id (the person the user is following). If I query against the user_id I can find all the individuals the user is following. Next I need to query the Users collection against all of the returned followee ids to get their personal information. This is where I'm confused. How would I accomplish this?
NOTE: I know I could embed the followees in the individual user's document and use an $in operator, but I do not want to go that route; I want to maintain as much flexibility as I can.

You can use an $in query without denormalizing the followees on the user. You just need to do a little bit of data manipulation:
Relationship.find({user_id: user_id}, function(error, relationships) {
  var followee_ids = relationships.map(function(relationship) {
    return relationship.followee_id;
  });
  User.find({_id: {$in: followee_ids}}, function(error, users) {
    // voila
  });
});
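If your Mongoose version returns promises from its queries, the same two-step lookup reads a little flatter with async/await; a minimal sketch under that assumption, reusing the Relationship and User models from above:
// Hypothetical helper: fetch the profiles of everyone a given user follows.
async function getFollowees(userId) {
  const relationships = await Relationship.find({user_id: userId});
  const followeeIds = relationships.map(function(r) { return r.followee_id; });
  return User.find({_id: {$in: followeeIds}});
}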

If I understood your problem correctly (I think so), you need to query each of the "individuals the user is following".
That means running a separate query against the database for each one to get its data.
Because queries in Node.js (I assume you are using Mongoose) are asynchronous, you need to structure your code asynchronously for this task.
If you are not familiar with the async module in Node.js, it's about time to get to know it.
See npm async for docs.
Here is a sample of how your query could look:
/* followee_id_arr: array of followee_id values from the previous query */
function query(followee_id_arr, callback) {
  var async = require('async');
  var allResults = [];
  async.eachSeries(followee_id_arr, function (f_id, done) {
    db.userCollection.findOne({_id: f_id}, {_id: 1, personalData: 1}, function(err, data) {
      if (err) { return done(err); }
      allResults.push(data);
      done();
    });
  }, function(err) {
    callback(err, allResults);
  });
}
You can even run all the queries in parallel (for better performance) by using async.map, as sketched below.
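A minimal async.map sketch of the same lookup, under the same assumptions about db.userCollection and the projection used above:
// Runs one findOne per followee id in parallel; results arrive in input order.
function queryParallel(followee_id_arr, callback) {
  var async = require('async');
  async.map(followee_id_arr, function (f_id, done) {
    db.userCollection.findOne({_id: f_id}, {_id: 1, personalData: 1}, done);
  }, callback); // callback(err, arrayOfUserDocs)
}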

Related

How to Re-Use MongoDB Queries

Suppose I have a MongoDB query such as:
db.collection('users').updateOne({name:name},{$set:{data:data}})
I want to use this query across many functions, reusing it instead of writing the same query over and over again.
You can re-use a query only if you pass the condition and update data into a function. Check the code below; it may help you get the same result.
This will be your model function.
userModel.updateUser = function (condition, updateData, callback) {
  userModel.update(condition, updateData).exec(callback);
};
You can call this function as below.
userModel.updateUser({_id: userId, name: name}, {$set: {data: data}}, (err, result) => {
  console.log(err, result);
});
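If you prefer promises over callbacks, the same helper can simply return the executed query; a sketch, assuming a Mongoose version whose queries return promises from exec():
userModel.updateUser = function (condition, updateData) {
  return userModel.update(condition, updateData).exec();
};
// Usage:
userModel.updateUser({_id: userId, name: name}, {$set: {data: data}})
  .then((result) => console.log(result))
  .catch((err) => console.error(err));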

Postgres promise multiple queries - Node.js

After reading https://stackoverflow.com/a/14797359/4158593 about Node.js being single-threaded, and how it takes the first parameter of an async function, processes it, and then uses the callback to respond when everything is ready: what confused me is what happens if I have multiple queries that need to be executed all at once, and whether I can tell Node.js to block other requests by adding them to a queue.
To do that, I realised that I need to wrap my queries in another callback, and promises do that pretty well.
const psqlClient = psqlPool.connect();
return psqlClient.query(`SELECT username FROM usernames WHERE username=$1`, ['me'])
  .then((data) => {
    if(!data.rows[0].username) {
      psqlClient.query(`INSERT INTO usernames (username) VALUES ('me')`);
    }
    else { ... }
  });
This code is used during sign-up, to check that a username isn't taken before inserting it. So it is very important that Node.js puts other requests into a queue and runs the select and insert as a single unit. Otherwise this code might allow two requests with the same username, sent at the same time, to both select a username that has already been taken, and two identical usernames would be inserted.
Questions
Does the code above execute the queries all at once?
If 1 is correct, and I were to change the code like this:
const psqlClient = psqlPool.connect();
return psqlClient.query(`SELECT username FROM usernames WHERE username=$1`, ['me'], function(err, reply) {
  if(!reply.rows[0].username) {
    psqlClient.query(`INSERT INTO usernames (username) VALUES ('me')`);
  }
});
would that affect the behaviour?
If 1 is wrong, how should this be solved? I am going to need this pattern (mainly a select followed by an insert/update) for things like making sure that my XML sitemaps don't contain more than 50000 URLs, by storing the count for each file in my db, which happens dynamically.
The only thing that can guarantee data integrity in your case is a single SELECT->INSERT query, which has been discussed here many times.
Some examples:
Is SELECT or INSERT in a function prone to race conditions?
Get Id from a conditional INSERT
You should be able to find more of that here ;)
I also touched on this subject in a SELECT ⇒ INSERT example within pg-promise.
There is, however, an alternative: make any repeated insert generate a conflict, in which case you can re-run your select to get the new record. But that is not always a suitable solution.
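For illustration, a minimal single-round-trip sketch with pg-promise, assuming Postgres 9.5+ (for ON CONFLICT) and a unique constraint on usernames.username; table and column names are taken from the question:
// Inserts the username if absent and returns the row either way, in one statement.
db.one(
  `WITH ins AS (
     INSERT INTO usernames(username) VALUES($1)
     ON CONFLICT (username) DO NOTHING
     RETURNING username
   )
   SELECT username FROM ins
   UNION ALL
   SELECT username FROM usernames WHERE username = $1
   LIMIT 1`, ['me'])
  .then(function(row) { console.log('username in place:', row.username); });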
Here's a reference from the creator of node-postgres: https://github.com/brianc/node-postgres/issues/83#issuecomment-212657287. Basically, queries are queued, but don't rely on that in production where you have many requests.
However, you can use BEGIN and COMMIT:
var Client = require('pg').Client;
var client = new Client(/*your connection info goes here*/);
client.connect();

var rollback = function(client) {
  // Terminating a client connection will automatically roll back any
  // uncommitted transactions, so while it's not technically mandatory
  // to call ROLLBACK, it is cleaner and more correct.
  client.query('ROLLBACK', function() {
    client.end();
  });
};

client.query('BEGIN', function(err, result) {
  if(err) return rollback(client);
  client.query('UPDATE account SET money = money + 100 WHERE id = $1', [1], function(err, result) {
    if(err) return rollback(client);
    client.query('UPDATE account SET money = money - 100 WHERE id = $1', [2], function(err, result) {
      if(err) return rollback(client);
      // Disconnect after a successful commit.
      client.query('COMMIT', client.end.bind(client));
    });
  });
});
Check out: https://github.com/brianc/node-postgres/wiki/Transactions
However, this doesn't lock the table. Here's a list of solutions: Update where race conditions Postgres (read committed). A unique-constraint variant is sketched below.
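A minimal sketch of that variant, assuming usernames.username carries a UNIQUE constraint: attempt the insert and treat a duplicate-key error (Postgres SQLSTATE 23505) as "username taken".
// node-postgres sketch: let the unique constraint arbitrate the race.
client.query('INSERT INTO usernames(username) VALUES($1)', ['me'], function(err, result) {
  if (err && err.code === '23505') {
    // Unique violation: another request inserted this username first.
    console.log('username already taken');
  } else if (err) {
    console.error(err);
  } else {
    console.log('username registered');
  }
});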

How to insert CSV data into MongoDB with Node.js

Hi, I'm developing an app with Node.js, Express, and MongoDB. I need to take user data from a CSV file and upload it to my database; the db has a schema designed with Mongoose.
But I don't know how to do this. What is the best approach to read the CSV file, check for duplicates against the db, and, if the user (one column in the CSV) is not there, insert it?
Are there modules to do this, or do I need to build it from scratch? I'm pretty new to Node.js.
I need a few pointers here.
Thanks.
The app has an Angular frontend so the user can upload the file. Maybe I should read the CSV on the frontend and transform it into an array for Node, then insert it?
Use one of the several Node.js CSV libraries like this one, and then you can probably just run an upsert on the user name.
An upsert is an update query with the upsert flag set to true: {upsert: true}. This will insert a new record only if the search returns zero results. So your query may look something like this:
db.collection.update({username: userName}, newDocumentObj, {upsert: true})
where userName is the current username you're working with and newDocumentObj is the JSON document that may need to be inserted.
However, if the query does return a result, it performs an update on those records.
EDIT:
I've decided that an upsert is not appropriate here, but I'm going to leave the description.
You're probably going to need two queries: a find and a conditional insert. For the find query I'd use the toArray() function (instead of a stream) since you are expecting 0 or 1 results. Check whether you got a result for the username and, if not, insert the data.
Read about node's mongodb library here.
EDIT in response to your comment:
It looks like you're reading data from a local CSV file, so you should be able to structure your program like this:
var MongoClient = require('mongodb').MongoClient;

function connect(callback) {
  var connStr = 'mongodb://' + host + ':' + port + '/' + schema; // command line args, may or may not be needed, hard code if not I guess
  MongoClient.connect(connStr, function(err, db) {
    if(err) {
      callback(err, null);
    } else {
      var colObj = db.collection(collection); // command line arg, hard code if not needed
      callback(null, colObj);
    }
  });
}
connect(function(err, colObj) {
  if(err) {
    console.log('Error:', err.stack);
    process.exit(0);
  } else {
    console.log('Connected');
    doWork(colObj, function(err) {
      if(err) {
        console.log(err.stack);
        process.exit(0);
      }
    });
  }
});
function doWork(colObj, callback) {
  csv().from('/path/to/file.csv').on('data', function(data) {
    // Find by username (or however the data is structured);
    // insert only when no matching record exists
    colObj.find({username: data.username}).toArray(function(err, docs) {
      if(err) return callback(err);
      if(docs.length === 0) colObj.insert(data);
    });
  });
}

node.js and express: passing sqlite data to one of my views

In my app.js I have the following to try to retrieve data from a sqlite database and pass it to one of my views:
app.get("/dynamic", function(req, res) {
var db = new sqlite3.Database(mainDatabase)
var posts = []
db.serialize(function() {
db.each("SELECT * FROM blog_posts", function(err, row) {
posts.push({title: row.post_title, date: row.post_date, text: row.post_text})
})
})
res.render("dynamic", {title: "Dynamic", posts: posts})
})
Can someone tell me what I am doing wrong here? The posts array seems to stay empty no matter what.
EDIT
I was following a tutorial that explained that, though the plugin is async, this method is not asynchronous.
Here is a quote from the tutorial:
Despite the callbacks and asynchronous nature of Node.js, these
transactions will run in series, allowing us to create, insert, and
query knowing that the statement prior will run before the current one
does. However, sqlite3 provides a "parallel" wrapper with the same
interface, but runs all the transactions in parallel. It just all
depends on your current circumstances.
The db calls are likely asynchronous, which means you are rendering before they return with their data.
You need to figure out how to get one callback from your query, and render your template in that callback.
It looks like you want a second "complete" callback passed to db.each() (thanks, Jonathan Lonowski, for the tip!):
var posts = [];
db.serialize(function() {
  db.each("SELECT * FROM blog_posts", function(err, row) {
    posts.push({title: row.post_title, date: row.post_date, text: row.post_text});
  }, function() {
    // All done fetching records, render response
    res.render("dynamic", {title: "Dynamic", posts: posts});
  });
});
The idea is to render in the last callback of any asynchronous code; that way you have everything you need.
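If you don't need row-by-row processing, sqlite3's db.all collects every row and fires a single callback, which removes the manual accumulation entirely; a sketch of the same route under that assumption:
app.get("/dynamic", function(req, res) {
  var db = new sqlite3.Database(mainDatabase);
  db.all("SELECT * FROM blog_posts", function(err, rows) {
    if (err) return res.status(500).send(err.message);
    var posts = rows.map(function(row) {
      return {title: row.post_title, date: row.post_date, text: row.post_text};
    });
    res.render("dynamic", {title: "Dynamic", posts: posts});
  });
});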

In Node.js, how can I programmatically retrieve many hashes from a Redis db using a set as an index?

I have a whole bunch of fields for each user in my Redis database, and I want to be able to retrieve all their records and display them.
The way I do it is to store a set of all user ids. When I want all the records, I recursively iterate the set, grabbing each record by the user ids in the set and adding it to a global array, then finally returning this global array. Anyway, I don't particularly like this method and would like to hear some suggestions of alternatives; I feel there must be better functionality in Node.js or Redis for this very problem. Maybe there is a way to do away with the set entirely, but looking around I couldn't see anything obvious.
This is an example of my pseudo-ish (pretty complete) Node.js code; note the set size is not a problem, as it will rarely be > 15.
Register Function:
var register = function(username, passwordhash, email){
  // Get new ID by incrementing idcounter
  redis.incr('db:users:idcounter', function(err, userid){
    // Setup user hash with user information, using new userid as key
    redis.hmset('db:user:'+userid, {
      'username': username,
      'passwordhash': passwordhash,
      'email': email
    }, function(err, reply){
      // Add userid to complete list of all users
      redis.sadd('db:users:all', userid);
    });
  });
}
Records retrieval function:
var getRecords = function(fcallback){
  // Grab a list of all the ids
  redis.smembers('db:users:all', function(err, allusersids){
    // Empty the returned (global) array
    completeArray = [];
    // Start the recursive function on the allusersids array
    recursive_getNextUserHash(allusersids, fcallback);
  });
}
Recursive function used to retrieve individual records:
// Global complete array (so the recursive function has access)
var completeArray = [];
// Recursive method for filling up our completeArray
var recursive_getNextUserHash = function(userArray, callback){
  // If userArray is empty we have cycled the entire list:
  // call the callback and pass it the completeArray, which
  // is now full of our usernames + emails
  if(userArray.length == 0){
    callback.apply(this, [completeArray]);
    return;
  }
  // If there are still more items, start by popping the next user
  var userid = userArray.pop();
  // Grab this user's information
  redis.hvals('db:user:'+userid, function(err, fields){
    // Add the user's information to the global array
    completeArray.push({username: fields[0], email: fields[2]});
    // Now move on to the next user
    recursive_getNextUserHash(userArray, callback);
  });
}
Usage would be something like this:
register('bob', 'ASDADSFASDSA', 'bob@example.com');
register('bill', 'DDDASDADSAD', 'bill@example.com');
getRecords(function(records){
  for(var i = 0; i < records.length; i++){
    console.log("u:" + records[i]['username'] + ', @:' + records[i]['email']);
  }
});
Summary: what is a good way to retrieve many fields of hashes using Node.js and Redis? After writing this question, I started to wonder if this is just the way you do it in Redis: you make many round trips. Regardless of whether that is the case, there must be a way to avoid the horrible recursion!
Assuming you are using https://github.com/mranney/node_redis - have a look at Multi and Exec. You can send all of your commands in a single request and wait for all the responses at once. No need for recursion.
For anyone else having a similar question, here is the syntax I ended up using:
redis.smembers('db:users:all', function(err, reply){
  var multi = redis.multi();
  for(var i = 0; i < reply.length; i++){
    multi.hmget('db:user:' + reply[i], ['username', 'email']);
  }
  multi.exec(function(err, replies){
    for(var j = 0; j < replies.length; j++){
      console.log("-->" + replies[j]);
    }
  });
});
