Saving reference to a mongoose document, after findOneAndUpdate - node.js

I feel like I'm running into something completely simple, yet I can't figure it out; I'll be glad if you can help me.
I'm using mongoose + socket.io as the CRUD layer between client and server. Since I'm using sockets, each client's socket has its own private scope, in which I'd like to store a reference to the mongoose document that I once found for this user, so I can reuse it later without making db find calls.
One client is creating a Room:
var currentroom;
client.on('roomcreate', function (data) {
    currentroom = new Room({
        Roomid: id,
        UsersMeta: [UserMeta],
        ///other stuff///
    });
    currentroom.save(function (err, room) {
        if (err) console.log(err);
        else console.log('success');
    });
});
Then, whenever I want, on another call from the creator I can simply do
currentroom.Roomnaid = data;
currentroom.save();
And it's working fine. The problem is that I do not understand how to get the same reference not when creating a Room, but when searching for one. At the moment I'm using this for the search:
Room.findOneAndUpdate({ Roomid: roomid }, { $push: { UsersMeta: UserMeta } }, { new: false }, function (err, room) {
    if (err) console.log(err);
    console.log('room output:');
    console.log(room);
    client.emit('others', room);
});
The thing is that, in one call, I want to:
1: find a doc in the db,
2: send it to the user (in its pre-update state),
3: update the found document,
4: save a reference (to the current, updated doc).
With findOneAndUpdate I can do all of that except saving a current reference.
So how should I approach it?

Like this:
Room.findOne({ Roomid: roomid }, function (err, oldRoom) {
    // make changes to oldRoom
    // then save it like this
    oldRoom.save(function (err, newRoom) {
        // newRoom is the updated document
        // now you have a reference to both the old and new docs
    });
});

At the end of the road I found out that I was going about it wrong. The idea of storing a reference (instance) of a doc that is accessed by different users is obviously bad, as it leads to data conflicts between those instances. The approach with a separate find and save can also cause race conditions and conflicts, since it is not an atomic operation; it's far better to use findOneAnd* and let mongoose/MongoDB handle the query sequencing itself, making sure that no conflicts occur.
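For the record, the whole four-step flow from the question can be collapsed into one atomic call. This is just a sketch, assuming the Room model, UserMeta, and client socket from the snippets above ('roomjoin' is a hypothetical event name):

```javascript
client.on('roomjoin', function (roomid) { // hypothetical event name
    // One atomic operation: find the doc, push the user's meta, and get
    // the document back. With { new: false } the callback receives the
    // pre-update state, which is what the joining client should see.
    Room.findOneAndUpdate(
        { Roomid: roomid },
        { $push: { UsersMeta: UserMeta } },
        { new: false },
        function (err, room) {
            if (err) return console.log(err);
            client.emit('others', room);
            // Don't cache `room` for later writes; issue another
            // findOneAndUpdate each time so MongoDB stays the single
            // source of truth and no instance conflicts can occur.
        }
    );
});
```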

Related

Node Js MongoDB Query Against returned array

I have a mongodb Relationships collection that stores the user_id and the followee_id (the person the user is following). If I query against the user_id I can find all the individuals the user is following. Next I need to query the Users collection against all of the returned followee ids to get their personal information. This is where I'm confused. How would I accomplish this?
NOTE: I know I can embed the followees in the individual user's document and use an $in operator, but I do not want to go this route. I want to maintain the most flexibility I can.
You can use an $in query without denormalizing the followees on the user. You just need to do a little bit of data manipulation:
Relationship.find({ user_id: user_id }, function (error, relationships) {
    var followee_ids = relationships.map(function (relationship) {
        return relationship.followee_id;
    });
    User.find({ _id: { $in: followee_ids } }, function (error, users) {
        // voila
    });
});
If I got your problem right (I think so), you need to query each of the "individuals the user is following".
That means querying the database multiple times, once per followee, and collecting the data.
Because queries in node.js (I assume you are using mongoose) are asynchronous, you need to make your code more asynchronous for this task.
If you are not familiar with the async module in node.js, it's about time to get to know it; see npm async for docs.
I made you a sample of how the query could look:
/* array of followee_id from the last query */
function query(followee_id_arr, callback) {
    var async = require('async');
    var allResults = [];
    async.eachSeries(followee_id_arr, function (f_id, done) {
        db.userCollection.findOne({ _id: f_id }, { _id: 1, personalData: 1 }, function (err, data) {
            if (err) { return done(err); /* handle error */ }
            allResults.push(data);
            done();
        });
    }, function (err) {
        callback(err, allResults);
    });
}
You can even run all the queries in parallel (for better performance) by using async.map.
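For comparison, the same fan-out can be done with the built-in Promise.all, which, like async.map, keeps the results in input order. findUser below is a stand-in for a promisified db.userCollection.findOne so the sketch is self-contained:

```javascript
// Stand-in for db.userCollection.findOne, returning a promise of a fake doc.
function findUser(f_id) {
    return Promise.resolve({ _id: f_id, personalData: 'data-for-' + f_id });
}

// Fire all the lookups at once; Promise.all resolves with an array whose
// order matches followee_id_arr, so results[i] belongs to followee_id_arr[i].
function queryAll(followee_id_arr) {
    return Promise.all(followee_id_arr.map(findUser));
}
```

In a real app you would swap findUser for the actual (promisified) findOne call; the ordering guarantee stays the same.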

How do you save multiple records in SailsJS in one action?

I am wondering what the best practice would be for saving multiple records in an action that makes changes, particularly add and remove, to multiple records. The end goal is to have one function have access to all of the changed data in the action. Currently I am nesting the saves in order for the innermost saves to have access to the data in all the updated records. Here is an example of how I am saving:
record1.save(function (error, firstRecord) {
    record2.save(function (error2, secondRecord) {
        record3.save(function (error3, thirdRecord) {
            res.send({ recordOne: firstRecord, recordTwo: secondRecord, recordThree: thirdRecord });
        });
    });
});
With this structure of saving, recordOne, recordTwo, and recordThree display the expected values on the server. However, checking localhost/1337/modelName reveals that the models did not properly update and have incorrect data.
You could use the built-in promise engine, Bluebird, to do that.
var Promise = require('bluebird');

Promise.all([record1.save(), record2.save(), record3.save()])
    .spread(function (record1_saved, record2_saved, record3_saved) {
        res.send({ recordOne: record1_saved, recordTwo: record2_saved, recordThree: record3_saved });
    })
    .catch(function (err) {
        res.badRequest({
            error: err.message
        });
    });
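If you'd rather not pull in Bluebird, plain ES2015 promises can do the same thing with Promise.all. The records below are stand-ins whose save() resolves with the saved record, roughly the shape a promisified Waterline save would give you:

```javascript
// Stand-in records: save() returns a promise of the saved record.
function makeRecord(name) {
    return {
        save: function () {
            return Promise.resolve({ name: name, saved: true });
        }
    };
}
var record1 = makeRecord('one');
var record2 = makeRecord('two');
var record3 = makeRecord('three');

// All three saves run concurrently; the results array preserves order.
function saveAll() {
    return Promise.all([record1.save(), record2.save(), record3.save()])
        .then(function (results) {
            return { recordOne: results[0], recordTwo: results[1], recordThree: results[2] };
        });
}
```

The object saveAll() resolves with is exactly the payload the original code passed to res.send.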

how insert csv data to mongodb with nodejs

Hi, I'm developing an app with nodeJS, express and a mongoDB, and I need to take user data from a csv file and upload it to my database; the db has a schema designed with mongoose.
But I don't know how to do this. What is the best approach to read the csv file, check for duplicates against the db, and insert the user (one column in the csv) if it is not there yet?
Are there modules to do this, or do I need to build it from scratch? I'm pretty new to nodeJS and could use a few pieces of advice here.
Thanks.
This app has an angular frontend so the user can upload the file; maybe I should read the csv in the front end, transform it into an array for node, and then insert it?
Use one of the several node.js csv libraries like this one, and then you can probably just run an upsert on the user name.
An upsert is an update query with the upsert flag set to true: {upsert: true}. This will insert a new record only if the search returns zero results. So your query may look something like this:
db.collection.update({username: userName}, newDocumentObj, {upsert: true})
Where userName is the current username you're working with and newDocumentObj is the json document that may need to be inserted.
However, if the query does return a result, it performs an update on those records.
EDIT:
I've decided that an upsert is not appropriate for this but I'm going to leave the description.
You're probably going to need to do two queries here, a find and a conditional insert. For this find query I'd use the toArray() function (instead of a stream) since you are expecting 0 or 1 results. Check if you got a result on the username and if not insert the data.
Read about node's mongodb library here.
EDIT in response to your comment:
It looks like you're reading data from a local csv file, so you should be able to structure your program like:
function connect(callback) {
    var connStr = 'mongodb://' + host + ':' + port + '/' + schema; // command line args, may or may not be needed; hard code if not
    MongoClient.connect(connStr, function (err, db) {
        if (err) {
            callback(err, null);
        } else {
            var colObj = db.collection(collection); // command line arg, hard code if not needed
            callback(null, colObj);
        }
    });
}
connect(function (err, colObj) {
    if (err) {
        console.log('Error:', err.stack);
        process.exit(0);
    } else {
        console.log('Connected');
        doWork(colObj, function (err) {
            if (err) {
                console.log(err.stack);
                process.exit(0);
            }
        });
    }
});
function doWork(colObj, callback) {
    csv().from('/path/to/file.csv').on('data', function (data) {
        // mongo query (colObj.find) for data.username, or however the data is structured;
        // inside the callback for colObj.find, check for results: if there are none,
        // insert the data with colObj.insert and call doWork's callback from the
        // insert callback, otherwise call it from the else branch of the check
    });
}
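The read-then-conditionally-insert logic can be sketched without a database. parseCsv is a deliberately naive stand-in for a real csv library (it won't handle quoted fields), and the existing Set stands in for the result of the find query against the users collection:

```javascript
// Naive CSV parser: fine for this sketch; use a real csv library in practice.
function parseCsv(text) {
    var lines = text.trim().split('\n');
    var headers = lines[0].split(',');
    return lines.slice(1).map(function (line) {
        var cells = line.split(',');
        var row = {};
        headers.forEach(function (h, i) { row[h] = cells[i]; });
        return row;
    });
}

// Keep only rows whose username is not already in the db
// (existing is a Set standing in for the find query's results).
function rowsToInsert(rows, existing) {
    return rows.filter(function (row) {
        return !existing.has(row.username);
    });
}
```

Each row that survives rowsToInsert would then be handed to colObj.insert.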

socket.io - getting more than one field for a socket?

I have the following code when a user disconnects. I want to emit a signal with the room name and user name.
client.get('nickname', function (err, name) {
    client.get('room', function (err2, room) {
        io.sockets.in(room).emit('disconnect', name);
    });
});
My question is, is there any way to avoid wrapping the .get calls like this? For my application, they are going to add up quickly. Can I get more than one value from one get() command? Or am I just handling this all wrong?
If you need to get a lot of values, take a look at a flow control library like async. For example, here's how you might get several values from the client in parallel:
var async = require('async');
async.parallel([
    client.get.bind(this, 'nickname'),
    client.get.bind(this, 'room'),
    client.get.bind(this, 'anotherValue')
], function (err, results) {
    // here, `err` is any error returned from any of the three calls to `get`,
    // and results is an array:
    // results[0] is the value of 'nickname',
    // results[1] is the value of 'room',
    // results[2] is the value of 'anotherValue'
});
If you had all the attributes of a user in an object/array, and all the attributes of a room in an object/array, you'd still only need these 2 nested calls. You're doing it right.

How to handle Node/MongoDB connection management?

I'm using the node-mongodb-native to connect to a local MongoDB instance. I'm having a little trouble wrapping my head around how to handle the connections. I've attempted to abstract the MongoDB stuff into a custom Database module:
Database.js
var mongo = require('mongodb');

var Database = function () { return this; };

Database.prototype.doStuff = function doStuff(callback) {
    mongo.connect('mongodb://127.0.0.1:27017/testdb', function (err, conn) {
        conn.collection('test', function (err, coll) {
            coll.find({}, function (err, cursor) {
                cursor.toArray(function (err, items) {
                    conn.close();
                    return callback(err, items);
                });
            });
        });
    });
};

// Testing
new Database().doStuff(function (err, items) {
    console.log(err, items);
});
Is a new connection required for each method? That seems like it would get expensive awfully quick. I imagined that perhaps the connection would be established in the constructor and subsequent calls would leverage the existing connection.
This next question may be more of a design question, but considering how connection setup and tear-down may be expensive operations, I'm considering adding a Database object that is global to my application that can be leveraged to make calls to the database. Does this seem reasonable?
Please note that the code above was roughly taken from here. Thanks for your help.
You don't need a new connection for each method - you can open it once and use it for subsequent calls. The same applies to the individual collection variables: you can cache the result of a single call to collection() and reuse it, so you only need those callbacks once instead of repeating them everywhere.
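The open-once idea can be sketched as a tiny module that memoizes the connection promise. connectOnce is a stand-in for the real mongo.connect so the pattern is runnable on its own:

```javascript
var connectCalls = 0;

// Stand-in for mongo.connect: pretend to open a connection.
function connectOnce() {
    connectCalls += 1;
    return Promise.resolve({
        id: connectCalls,
        collection: function (name) { return { name: name }; }
    });
}

var cachedConn = null;

// Every caller shares the same pending/opened connection promise,
// so the database is only dialed once per process.
function getConnection() {
    if (!cachedConn) {
        cachedConn = connectOnce();
    }
    return cachedConn;
}
```

Caching the promise (rather than the connection object) also means concurrent callers during startup all wait on the same in-flight connect instead of opening their own.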
