socket.io - getting more than one field for a socket? - node.js

I have the following code when a user disconnects. I want to emit a signal with the room name and user name.
client.get('nickname', function(err, name) {
    client.get('room', function(err2, room) {
        io.sockets.in(room).emit('disconnect', name);
    });
});
My question is, is there any way to avoid wrapping the .get calls like this? For my application, they are going to add up quickly. Can I get more than one value from one get() command? Or am I just handling this all wrong?

If you need to get a lot of values, take a look at a flow control library like async. For example, here's how you might get several values from the client in parallel:
var async = require('async');
async.parallel([
    client.get.bind(client, 'nickname'),
    client.get.bind(client, 'room'),
    client.get.bind(client, 'anotherValue')
], function(err, results) {
    // here, `err` is any error returned from any of the three calls to `get`,
    // and results is an array:
    // results[0] is the value of 'nickname',
    // results[1] is the value of 'room',
    // results[2] is the value of 'anotherValue'
});
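Applied to the disconnect handler from the question, that might look like the following (a sketch; it assumes client and io are in scope exactly as in the question's code):
async.parallel([
    client.get.bind(client, 'nickname'),
    client.get.bind(client, 'room')
], function(err, results) {
    if (err) { return console.error(err); }
    var name = results[0];
    var room = results[1];
    io.sockets.in(room).emit('disconnect', name);
});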

If you had all the attributes of a user in an object/array, and all the attributes of a room in an object/array, you'd still only need these 2 nested calls. You're doing it right.
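For example, if every per-user attribute lived under a single key, one get would retrieve them all (a minimal sketch; it assumes the socket store accepts an object value, which socket.io's set/get did at the time):
// on join: one set for everything about this user
client.set('user', { nickname: name, room: room }, function() { /* stored */ });

// on disconnect: one get instead of two nested ones
client.get('user', function(err, user) {
    io.sockets.in(user.room).emit('disconnect', user.nickname);
});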

Related

Using Mongodb variables out of its functions

So I'm making a web application and I'm trying to send variables to an EJS file, but when they are used outside of the Mongo functions they come out as undefined, because it's a different scope for some reason. It's hard to explain, so let me try to show you.
router.get("/", function(req, res){
var bookCount;
var userCount;
Books.count({}, function(err, stats){
if(err){
console.log("Books count failed to load.");
}else{
bookCount = stats;
}
});
User.count({}, function(err, count){
if(err){
console.log("User count failed to load.")
}else{
userCount = count;
console.log(userCount);
}
});
console.log(userCount);
//Get All books from DB
Books.find({}, function(err, allbooks){
if(err){
console.log("Problem getting all books");
}else{
res.render("index", {allbooks: allbooks, bookCount: bookCount, userCount: userCount});
}
});
});
So in the User.count and Books.count calls I'm finding the number of documents in a collection, which works, and the number is stored in the variables declared at the very top.
After assigning the number, I did console.log(userCount) inside the callback, which outputs the correct number, which is 3. If I were to do console.log(userCount) outside of the User.count function it would return undefined, which refers to the declaration at the very top.
What is really weird is that Books.find() has the correct userCount even though it's a totally different function. The whole goal I'm trying to accomplish is doing res.render("index", {userCount: userCount}); outside of Books.find(). I can do it, but of course for some reason it passes undefined instead of 3. I hope this made a shred of sense.
I seem to have found a solution, but if anyone knows a different way I would love to know. Basically all you need to do is move the User.count call outside of the router.get() function. I'm not completely sure about the logic of that, but it works...
This is a classic asynchronous-operation problem: Your methods (Books.count, Books.find, User.count) are called immediately, but the callback functions you pass to them are not. userCount is undefined in your log because console.log is called before the assignment in the callback function is made. Your code is similar to:
var userCount;
setTimeout(function() {
    userCount = 3;
}, 1000);
console.log(userCount); // undefined
User.count takes time to execute before calling back with the result, just like setTimeout takes the specified time to execute before calling its callback. The problem is that JS doesn't pause and wait for the timeout to complete before moving on and calling console.log below it: it calls setTimeout, calls console.log immediately after, and then the callback function is called one second later.
To render a complete view, you need to be sure you have all of the data before you call res.render. To do so you need to wait for all of the methods to call back before calling res.render. But wait, I just told you that JS doesn't pause and wait, so how can this be accomplished? Promise is the answer. Multiple promises, actually.
It looks like you are using Mongoose models. Mongoose has been written so that if you don't pass a callback function to your methods, they return a promise.
Books.count({}) // returns a promise
JS promises have a method then which takes a callback function that is called when the promise has been resolved with the value of the asynchronous method call.
Books.count({}) // takes some time
    .then(function(bookCount) { // called when Books.count is done
        // use the bookCount here
    })
The problem is, you want to wait for multiple operations to complete, and multiple promises, before continuing. Luckily JS has a utility just for this purpose:
Promise.all([ // wait for all of these operations to finish before calling the callback
    Books.count({}),
    User.count({}),
    Books.find({})
])
.then(function(array) { // all done!
    // the results are in an array
    bookCount = array[0];
    userCount = array[1];
    allBooks = array[2];
})
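Putting it together inside the route, a sketch (assuming, as above, that Books and User are Mongoose models, so calling count/find without a callback returns a promise):
router.get("/", function(req, res){
    Promise.all([
        Books.count({}),
        User.count({}),
        Books.find({})
    ])
    .then(function(results){
        // render only once every count/find has resolved
        res.render("index", {
            bookCount: results[0],
            userCount: results[1],
            allbooks: results[2]
        });
    })
    .catch(function(err){
        console.log("Failed to load counts or books:", err);
        res.status(500).send("Something went wrong");
    });
});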

Iterate through Array, update/create Objects asynchronously, when everything is done call callback

I have a problem, but I have no idea how one would go about this.
I'm using loopback, but I think I would have faced the same problem in mongodb sooner or later. Let me explain what I am doing:
I fetch entries from another REST service, then I prepare the entries for my API response (the entries are not ready yet, because they don't have an id from my database).
Before I send the response I want to check whether each entry exists in the database (determined by source_id):
if it doesn't, create it;
if it does, use it and update it to the newer version.
Then send the response with the entries (the entries now have database ids assigned to them).
This seems okay and easy to implement, but it's not, as far as my knowledge goes. I will try to explain further in code:
//This will not work since there are many async calls, and fixedResults will be empty at the end
var fixedResults = [];
//results is an array of entries
results.forEach(function(item) {
    Entry.findOne({where: {source_id: item.source_id}}, function(err, res) {
        //Did we find it in database?
        if(res === null) {
            //Create object, another async call here
            fixedResults.push(newObj);
        } else {
            //Update object, another async call here
            fixedResults.push(updatedObj);
        }
    });
});
callback(null, fixedResults);
Note: I left some of the code out, but I think it's pretty self-explanatory if you read through it.
So I want to iterate through all the objects, create or update them in the database, and then, when all are updated/created, use them. How would I do this?
You can use promises. A promise represents the eventual result of an asynchronous operation and lets you attach callbacks that run once that operation has completed. Here's an example of chaining together promises: https://coderwall.com/p/ijy61g.
The q library is a good one - https://github.com/kriskowal/q
This question, how to use q.js promises to work with multiple asynchronous operations, gives a nice code example of how you might build these up.
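A minimal sketch of how that could look here with q (Q.ninvoke wraps the callback-style findOne in a promise; createOrUpdate is a hypothetical promise-returning helper standing in for the create/update calls):
var Q = require('q');

var promises = results.map(function (item) {
    return Q.ninvoke(Entry, 'findOne', {where: {source_id: item.source_id}})
        .then(function (found) {
            // create the entry when found is null, otherwise update it;
            // createOrUpdate resolves with the saved object
            return createOrUpdate(found, item);
        });
});

Q.all(promises)
    .then(function (fixedResults) {
        callback(null, fixedResults);
    })
    .fail(function (err) {
        callback(err);
    });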
This pattern is generically called an 'async map'
var fixedResults = [];
var outstanding = 0;
//results is an array of entries
results.forEach(function(item, i) {
    outstanding++; // count this item before its async work starts
    Entry.findOne({where: {source_id: item.source_id}}, function(err, res) {
        //Did we find it in database?
        if(res === null) {
            //Create object, another async call here
            DoCreateObject(function (err, result) {
                if (err) { return callback(err); }
                fixedResults[i] = result;
                if (--outstanding === 0) callback(null, fixedResults);
            });
        } else {
            //Update object, another async call here
            DoOtherCall(function (err, result) {
                if (err) { return callback(err); }
                fixedResults[i] = result;
                if (--outstanding === 0) callback(null, fixedResults);
            });
        }
    });
});
// note: no final callback(null, fixedResults) down here — it is only called
// from inside the handlers, once the last outstanding operation finishes
You could use async.map for this. For each element in the array, the iterator function does whatever you want to do to that element, then runs its callback with the result (instead of fixedResults.push), triggering the map callback when all are done. Each iteration and database call would then run in parallel.
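A sketch of that suggestion (assuming the async library is installed; createOrUpdate is again a hypothetical helper wrapping the create/update calls, this time callback-style):
var async = require('async');

async.map(results, function (item, done) {
    Entry.findOne({where: {source_id: item.source_id}}, function (err, found) {
        if (err) { return done(err); }
        // create the entry when found is null, otherwise update it,
        // then call done(err, savedObject)
        createOrUpdate(found, item, done);
    });
}, function (err, fixedResults) {
    // fixedResults is in the same order as results
    callback(err, fixedResults);
});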
Mongo has a feature called upsert.
http://docs.mongodb.org/manual/reference/method/db.collection.update/
It does exactly what you ask for without needing the checks. You can fire all of the requests async and just validate that the result comes back as true. No need for additional processing.
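For example (a sketch against the classic MongoDB node driver, matching the linked docs; with LoopBack you would go through the model's own upsert/updateOrCreate methods instead):
// update the entry matching source_id, or insert it if it does not exist yet
collection.update(
    { source_id: item.source_id },
    { $set: { title: item.title, body: item.body } }, // hypothetical fields
    { upsert: true },
    function (err, result) {
        // result reports whether an existing document was updated or a new one inserted
    }
);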

node.js for loop execution in a synchronous manner

I have to implement a program in node.js that looks like the following code snippet. It has an array through which I have to traverse and match the values with database table entries. I need to wait till the loop ends and send the result back to the calling function:
var arr = [];
arr = [one, two, three, four, five];
for(var j = 0; j < arr.length; j++) {
    var str = "/^" + arr[j] + "/";
    // consider collection to be a variable pointing to a database table
    collection.find({value: str}).toArray(function getResult(err, result) {
        //do something in case a match is found in the database...
    });
}
However, since collection.find (to which str="/^"+arr[j]+"/" is passed as a regex in order to find partial matches) executes asynchronously, the loop finishes before the queries return, and I am unable to traverse the array and get the required output.
I am also having a hard time sending the result back to the calling function, as I have no idea when the loop will finish executing.
Try using async.each. This will let you iterate over an array and execute asynchronous functions. async is a great library that has solutions and helpers for many common asynchronous patterns and problems.
https://github.com/caolan/async#each
Something like this:
var async = require('async');

var arr = [];
arr = [one, two, three, four, five];
async.each(arr, function (item, callback) {
    var str = "/^" + item + "/";
    // consider collection to be a variable pointing to a database table
    collection.find({value: str}).toArray(function getResult(err, result) {
        if (err) { return callback(err); }
        // do something in case a match is found in the database...
        // whatever logic you want to do on result should go here, then execute callback
        // to indicate that this iteration is complete
        callback(null);
    });
}, function (error) {
    // At this point, the each loop is done and you can continue processing here
    // Be sure to check for errors!
});
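Since the question also asks about sending the results back to the calling function, a variant with async.map may fit better (a sketch; done stands in for whatever callback hands the data back to the caller):
async.map(arr, function (item, callback) {
    var str = "/^" + item + "/";
    collection.find({value: str}).toArray(function (err, result) {
        // pass this iteration's matches to async.map
        callback(err, result);
    });
}, function (err, allResults) {
    // allResults[i] holds the matches for arr[i]; hand everything back at once
    done(err, allResults);
});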

In Node.js how can I programmatically retrieve many hash's from a redis db using a set as indices

I have a whole bunch of fields for each user in my redis database, and I want to be able to retrieve all their records and display them.
The way I do it is to store a set of all user ids. When I want all their records, I recursively iterate the set, grabbing each record using the user ids in the set and adding it to a global array, then finally returning this global array. I don't particularly like this method and would like to hear suggestions for alternatives; I feel there must be better functionality in node.js or redis for this very problem. Maybe there is a way to do away with the set entirely, but looking around I couldn't see anything obvious.
This is an example of my pseudo-ish (pretty complete) node.js code; note that the set size is not a problem, as it will rarely be > 15.
Register Function:
var register = function(username, passwordhash, email){
    // Get new ID by incrementing idcounter
    redis.incr('db:users:idcounter', function(err, userid){
        // Setup user hash with user information, using new userid as key
        redis.hmset('db:user:'+userid, {
            'username': username,
            'passwordhash': passwordhash,
            'email': email
        }, function(err, reply){
            // Add userid to complete list of all users
            redis.sadd('db:users:all', userid);
        });
    });
}
Records retrieval function:
var getRecords = function(fcallback){
    // Grab a list of all the id's
    redis.smembers('db:users:all', function(err, allusersids){
        // Empty the returned (global) array
        completeArray = [];
        // Start the recursive function on the allusersids array.
        recursive_getNextUserHash(allusersids, fcallback);
    });
}
Recursive function used to retrieve individual records:
// Global complete array (so the recursive function has access)
var completeArray = [];
// recursive method for filling up our completeArray
var recursive_getNextUserHash = function(userArray, callback){
    // If userArray.length==0 this means we have cycled the entire list;
    // call the callback, and pass it the completeArray which
    // is now full of our usernames + emails
    if(userArray.length==0){
        callback.apply(this, [completeArray]);
        return;
    }
    // If still more items, start by popping the next user
    var userid = userArray.pop();
    // grab this user's information
    redis.hvals('db:user:'+userid, function(err, fields){
        // Add user's information to global array
        completeArray.push({username: fields[0], email: fields[2]});
        // Now move on to the next user
        recursive_getNextUserHash(userArray, callback);
    });
}
Use would be something like this:
register('bob', 'ASDADSFASDSA', 'bob@example.com');
register('bill', 'DDDASDADSAD', 'bill@example.com');
getRecords(function(records){
    for(var i=0;i<records.length;i++){
        console.log("u:"+records[i]['username']+',@:'+records[i]['email']);
    }
});
Summary: What is a good way to retrieve many fields of hashes using node.js and redis? After writing this question, I started to wonder if this is just the way you do it in redis (you make many round trips); regardless, there must be a way to avoid the horrible recursion!
Assuming you are using https://github.com/mranney/node_redis - have a look at Multi and Exec. You can send all of your commands in a single request and wait for all the responses at once. No need for recursion.
For anyone else having a similar question, here is the syntax I ended up using:
redis.smembers('db:users:all', function(err, reply){
    var multi = redisClient.multi();
    for(var i=0;i<reply.length;i++){
        multi.hmget('db:user:'+reply[i], ['username', 'email']);
    }
    multi.exec(function(err, replies){
        for(var j=0;j<replies.length;j++){
            console.log("-->"+replies[j]);
        }
    });
});
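To get back the same record shape the recursive getRecords produced, the replies can be mapped into objects before handing them to the caller (a sketch; it assumes redis is the node_redis client and that each hmget reply comes back as a [username, email] array in the order the commands were queued):
var getRecords = function(fcallback){
    redis.smembers('db:users:all', function(err, userids){
        var multi = redis.multi();
        userids.forEach(function(userid){
            multi.hmget('db:user:'+userid, ['username', 'email']);
        });
        multi.exec(function(err, replies){
            var records = replies.map(function(reply){
                // each reply is [username, email]
                return { username: reply[0], email: reply[1] };
            });
            fcallback(records); // same shape the recursive version produced
        });
    });
}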

Trying to understand how the node.js programming model works

I've been reading about node.js recently (like many others). I find it interesting for some use cases, but am struggling a bit to understand the inner workings, specifically the interaction between closure functions and the process flow of the code.
Let's say I have a function which accepts a key-value array. The function must check that the values follow certain data-quality guidelines (for example, some keys must have a value, other keys must have numbers as values, etc.) before storing the data somewhere (for the purpose of this question let's assume the data validation has to be done in the application itself).
In "regular" developments models I'd write something like this:
resultName = validateName(data.name)
resultAddress = validateAddress(data.address)
resultID = validateID(data.id)
if (resultName && resultAddress && resultID) {
store(data)
else {
sendErrorToUser(data)
}
Get the results of the validations, and either explain the error(s) to the user or store the data and return some kind of confirmation. The flow is very clear.
The way I understand node.js, the way to do this would be to delegate the validations to separate functions (to avoid waiting for each validation to finish), and supply two callback functions to each function that validates a chunk of data:
* a callback to call when validation is successful
* a callback to call when validation fails
It's easy to return to the user with a "please wait" message right away, but I have to wait for all validations to clear (or fail) before storing the data or explaining the problem to the user. As a simple way to figure out whether all the validations are done, I thought of using a variable that counts the number of functions that have called their callback, and emitting a "validation complete" event to store the validated data (or get back to the user with any errors). Alternatively, I could emit an event after each validation completes and, in that event's handler, check whether all validations are complete before emitting the "store" / "error" events.
My question is -- am I approaching this correctly? Or is there a more suitable way to do these kinds of things with node.js (or similar event-based systems).
Thank you!
Alon
Are your validations asynchronous? If they are not, you can use the code you posted, the "regular" one.
If the validations are asynchronous (checking the uniqueness of an email, for instance), you need to provide callbacks:
var validateUniqueEmail = function (data, callback) {
    db.find({email: data.email}, function (err, result) {
        callback(err, result === null);
    });
};

var validateAndStore = function (data, callback) {
    asyncValidation(data, function (err, is_valid) {
        if (err) {
            callback(err, null);
        } else if (!is_valid) {
            callback('Email is not unique', null);
        } else {
            db.store(data, callback);
        }
    });
}
The code above can be simplified a lot by using one of the validator or ORM modules that already exist, for example the mongolia validator module.
Let's go. Basically, what you want to do is something along the lines of:
function validate(data, cb){
    var allOk = true;
    for(var key in data){
        allOk = allOk && validate[key](data[key]); // validator depends on the key
    }
    if (allOk) cb(null, data); else cb(new Error("bleh"));
}
This could be done the following way (note how we pass the failed keys as the first (error) argument to the callback):
function validate(data, cb){
    var status = {true: [], false: []},
        total = Object.keys(data).length,
        done = 0;
    for (var key in data)
        (function(key){
            validate[key](data[key], function(ok){
                status[ok].push(key);
                if (++done == total){
                    status[false].length ? cb(status[false]) : cb(null);
                }
            });
        })(key);
}
Which you can use this way:
validate(data, function(failures){
    if (failures){
        // tell the user the input does not validate on the keys stored in failures
    } else {
        // all ok
        store(data);
    }
});
Correct me if I'm wrong, but I think what you're asking is how to handle the response from multiple asynchronous calls.
Here's how I do it (using your validation example):
var result = {};

function isAllDataAvailable() {
    return result.name !== undefined
        && result.address !== undefined
        && result.id !== undefined;
}

function callback(error) {
    if (error) {
        showError(error);
        // terminate here (?)
        return;
    }
    if (isAllDataAvailable()) {
        showOutput();
    }
}

validateName(data, callback);
validateAddress(data, callback);
validateEmail(data, callback);
The key here is the result object, which starts out as empty. As each field gets validated, it gets added to the result object (by the validation functions, which I've left out in the above snippet). I've used a single callback method, but you could have multiple, say callbackName, callbackAddress, etc. The validation results are processed only if and when the result object has been fully populated, which is checked in isAllDataAvailable.
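For completeness, one of the left-out validation functions might look roughly like this (a sketch; the actual check is a stand-in, and process.nextTick merely simulates the asynchronous hop a real lookup would have):
function validateName(data, callback) {
    process.nextTick(function () {
        if (typeof data.name === 'string' && data.name.length > 0) {
            result.name = data.name; // record the validated value
            callback();              // no error
        } else {
            callback(new Error('name is missing or empty'));
        }
    });
}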
Hope this helps.
Consider using: https://github.com/asaf/nodejs-model
It will make your life much easier when dealing with validators.
