I have to implement a program in Node.js along the lines of the following snippet. It has an array that I need to traverse, matching each value against entries in a database table. I need to wait until the loop ends and then send the result back to the calling function:
var arr = [];
arr = [one, two, three, four, five];
for (var j = 0; j < arr.length; j++) {
    var str = "/^" + arr[j] + "/";
    // consider collection to be a variable pointing to a database table
    collection.find({value: str}).toArray(function getResult(err, result) {
        // do something in case a match is found in the database...
    });
}
However, since collection.find() executes asynchronously (str = "/^"+arr[j]+"/" is actually a regex passed to MongoDB's find function in order to get a partial match), the loop moves on before each query completes, so I am unable to traverse the array and get the required output.
I am also having a hard time sending the result back to the calling function, because I have no idea when the loop will finish executing.
Try using async.each. It lets you iterate over an array and run an asynchronous function for each item. Async is a great library that has solutions and helpers for many common asynchronous patterns and problems.
https://github.com/caolan/async#each
Something like this:
var arr = [];
arr = [one, two, three, four, five];
async.each(arr, function (item, callback) {
    // build a RegExp object rather than a string, so MongoDB treats it as a partial match
    var str = new RegExp("^" + item);
    // consider collection to be a variable pointing to a database table
    collection.find({value: str}).toArray(function getResult(err, result) {
        if (err) { return callback(err); }
        // do something in case a match is found in the database...
        // whatever logic you want to run on result goes here, then execute callback
        // to indicate that this iteration is complete
        callback(null);
    });
}, function (error) {
    // At this point, the each loop is done and you can continue processing here
    // Be sure to check for errors!
});
Related
I've got a problem with Redis and Node.js. I have to loop through a list of phone numbers and check whether each number is present in my Redis database. Here is my code:
function getContactList(contacts, callback) {
    var contactList = {};
    for (var i = 0; i < contacts.length; i++) {
        var phoneNumber = contacts[i];
        if (utils.isValidNumber(phoneNumber)) {
            db.client().get(phoneNumber).then(function(reply) {
                console.log("before");
                contactList[phoneNumber] = reply;
            });
        }
    }
    console.log("after");
    callback(contactList);
}
The "after" console log appears before the "before" console log, and the callback always return an empty contactList. This is because requests to redis are asynchronous if I understood well. But the thing is I don't know how to make it works.
How can I do ?
You have two main issues.
Your phoneNumber variable will not be what you want it to be. That can be fixed by switching to a .forEach() or .map() iteration of your array, because that creates a new function scope for the current value on each iteration.
You have to create a way to know when all the async operations are done. There are lots of duplicate questions/answers that show how to do that. You probably want to use Promise.all().
I'd suggest this solution that leverages the promises you already have:
function getContactList(contacts) {
    var contactList = {};
    return Promise.all(contacts.filter(utils.isValidNumber).map(function(phoneNumber) {
        return db.client().get(phoneNumber).then(function(reply) {
            // build custom object
            contactList[phoneNumber] = reply;
        });
    })).then(function() {
        // make contactList be the resolve value
        return contactList;
    });
}
getContactList(contacts).then(function(contactList) {
    // use the contactList here
}, function(err) {
    // process errors here
});
Here's how this works:
Call contacts.filter(utils.isValidNumber) to filter the array to only valid numbers.
Call .map() to iterate through that filtered array
return db.client().get(phoneNumber) from the .map() callback to create an array of promises.
After getting the data for the phone number, add that data to your custom contactList object (this is essentially a side effect of the .map() loop).
Use Promise.all() on the returned array of promises to know when they are all done.
Make the contactList object we built up be the resolve value of the returned promise.
Then, to call it just use the returned promise with .then() to get the final result. No need to add a callback argument when you already have a promise that you can just return.
The simplest solution may be to use MGET with the whole list of phone numbers and put your callback logic in the .then() handler.
You could also put the promises in an array and use Promise.all().
At some point you might want your function to return a promise rather than take a callback, just to stay consistent.
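As a rough sketch of the MGET approach, assuming db.client() exposes a promise-returning mget(), just like the get() call used in the question:
function getContactList(contacts) {
    var numbers = contacts.filter(utils.isValidNumber);
    // one round trip instead of one GET per number; mget is assumed to
    // return a promise here, like get() does in the question
    return db.client().mget(numbers).then(function(replies) {
        var contactList = {};
        // replies come back in the same order as the keys that were requested
        numbers.forEach(function(phoneNumber, i) {
            contactList[phoneNumber] = replies[i];
        });
        return contactList;
    });
}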
Consider refactoring your NodeJS code to use Promises.
Bluebird is an excellent choice: http://bluebirdjs.com/docs/working-with-callbacks.html
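A minimal sketch of what that refactor could look like with Bluebird's Promise.map, assuming db.client().get() already returns a promise as it does in the question:
var Promise = require('bluebird');

function getContactList(contacts) {
    var numbers = contacts.filter(utils.isValidNumber);
    var contactList = {};
    // Promise.map runs the mapper for every number and resolves once all of them finish
    return Promise.map(numbers, function(phoneNumber) {
        return db.client().get(phoneNumber).then(function(reply) {
            contactList[phoneNumber] = reply;
        });
    }).then(function() {
        return contactList;
    });
}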
You put async code inside a for loop (a synchronous construct), so each iteration of the for loop does not wait for the db.client(...) call to finish.
Take a look at this Stack Overflow answer, which explains how to write asynchronous loops:
Here
I have an OrientDB database. I want to use Node.js with RESTful calls to create a large number of records. I need to get the #rid of each one for some later processing.
My pseudocode is:
for each record
write.to.db(record)
when the async of write.to.db() finishes
process based on #rid
carryon()
I have landed in serious callback hell from this. The version that came closest used tail recursion in the .then() function to write the next record to the db. However, I couldn't carry on with the rest of the processing.
A final constraint is that I am behind a corporate proxy and cannot use any other packages without going through the network administrator, so using the native nodejs packages is essential.
Any suggestions?
With a completion callback, the general design pattern for this type of problem makes use of a local function for doing each write:
var records = ....; // array of records to write
var index = 0;

function writeNext(r) {
    write.to.db(r, function(err) {
        if (err) {
            // error handling
        } else {
            ++index;
            if (index < records.length) {
                writeNext(records[index]);
            } else {
                // all records have been written; carry on here
            }
        }
    });
}

writeNext(records[0]);
The key here is that you can't use synchronous iterators like .forEach() because they won't iterate one at a time and wait for completion. Instead, you do your own iteration.
If your write function returns a promise, you can use the .reduce() pattern that is common for iterating an array.
var records = ...; // some array of records to write

records.reduce(function(p, r) {
    return p.then(function() {
        return write.to.db(r);
    });
}, Promise.resolve()).then(function() {
    // all done here
}, function(err) {
    // error here
});
This solution chains promises together, waiting for each one to resolve before executing the next save.
It's kind of hard to tell which function would be best for your scenario without more detail, but I almost always use asyncjs for this kind of thing.
From what you say, one way to do it would be with async.map:
var recordsToCreate = [...];

function functionThatCallsTheApi(record, cb){
    // do the api call, then call cb(null, rid)
}

async.map(recordsToCreate, functionThatCallsTheApi, function(err, results){
    // here, err will be set if anything failed in any function
    // results will be an array of the rids
});
You can also check out the other helpers to enable throttling, which is probably a good idea.
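For example, async.mapLimit takes the same arguments plus a concurrency limit; the limit of 5 below is just an illustrative value:
async.mapLimit(recordsToCreate, 5, functionThatCallsTheApi, function(err, results){
    // err and results behave exactly as with async.map, but at most
    // 5 API calls are in flight at any one time
});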
I have to run a mongoose query repeatedly within a for loop and, once that completes, render a page in Express. Sample code is given below. Since mongoose runs asynchronously, how can I make the 'commands/new' page render only once the 'commands' array has been populated within the for loop?
...
...
var commands = [];
for (var index = 0; index < ids.length; index++) {
    mongoose.model('Command').find({_id: ids[index]}, function (err, command) {
        // do some biz logic with the 'command' object
        // and add it to the 'commands' array
        commands[index] = command;
    });
}
res.render('commands/new', {
    commands: commands
});
...
...
Your basic for loop here does not wait for the callbacks of the asynchronous methods you are calling to complete before moving on to the next iteration, so simply use something that does. The node async library fits the bill here, and in fact offers even better methods for array iteration:
var commands = [];

async.each(ids, function(id, callback) {
    mongoose.model("Command").findById(id, function(err, command) {
        if (command)
            commands.push(command);
        callback(err);
    });
}, function(err) {
    // code to run on completion or err; this is where it is safe to
    // res.render('commands/new', { commands: commands });
});
So async.each, or possibly a variant like async.eachLimit (which only runs a set number of tasks in parallel), will be your better loop iteration control here.
NB The .findById() method of Mongoose also helps shorten coding here.
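If you do want to cap concurrency, a sketch with async.eachLimit looks almost identical; the limit of 10 is just an example value:
async.eachLimit(ids, 10, function(id, callback) {
    mongoose.model("Command").findById(id, function(err, command) {
        if (command)
            commands.push(command);
        callback(err);
    });
}, function(err) {
    // at most 10 queries run in parallel; render here once all are done
});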
I have a problem, and I have no idea how to go about solving it.
I'm using LoopBack, but I think I would have faced the same problem with MongoDB sooner or later. Let me explain what I am doing:
I fetch entries from another REST service, then I prepare the entries for my API response (the entries are not ready yet, because they don't have an id from my database).
Before I send the response I want to check whether each entry exists in the database (determined by source_id). If it doesn't:
Create it. If it does:
Use it and update it to the newer version.
Send the response with the entries (the entries now have database ids assigned to them).
This seems okay and easy to implement, but it's not, as far as my knowledge goes. I will try to explain further in code:
// This will not work since there are many async calls, and fixedResults will be empty at the end
var fixedResults = [];
// results is an array of entries
results.forEach(function(item) {
    Entry.findOne({where: {source_id: item.source_id}}, function(err, res) {
        // Did we find it in the database?
        if (res === null) {
            // Create object, another async call here
            fixedResults.push(newObj);
        } else {
            // Update object, another async call here
            fixedResults.push(updatedObj);
        }
    });
});
callback(null, fixedResults);
Note: I left some of the code out, but I think it's pretty self-explanatory if you read through it.
So I want to iterate through all the objects, create or update them in the database, and then, when all of them are updated/created, use them. How would I do this?
You can use promises. They let you register callbacks that are invoked after some asynchronous operation has completed. Here's an example of chaining promises together: https://coderwall.com/p/ijy61g.
The q library is a good one: https://github.com/kriskowal/q
This question, how to use q.js promises to work with multiple asynchronous operations, gives a nice code example of how you might build these up.
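As a rough sketch of how that could look with q, assuming Entry.findOne follows the usual Node callback convention (createEntry and updateEntry below are hypothetical helpers standing in for your create/update logic):
var Q = require('q');

var promises = results.map(function(item) {
    // Q.ninvoke wraps a node-style method call in a promise
    return Q.ninvoke(Entry, 'findOne', {where: {source_id: item.source_id}})
        .then(function(res) {
            // create or update, returning (a promise for) the finished entry
            return res === null ? createEntry(item) : updateEntry(res, item);
        });
});

Q.all(promises).then(function(fixedResults) {
    callback(null, fixedResults);
}, function(err) {
    callback(err);
});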
This pattern is generically called an 'async map'
var fixedResults = [];
var outstanding = 0;
// results is an array of entries
results.forEach(function(item, i) {
    outstanding++; // count this operation before kicking it off
    Entry.findOne({where: {source_id: item.source_id}}, function(err, res) {
        // Did we find it in the database?
        if (res === null) {
            // Create object, another async call here
            DoCreateObject(function (err, result) {
                if (err) { return callback(err); }
                fixedResults[i] = result;
                if (--outstanding === 0) { callback(null, fixedResults); }
            });
        } else {
            // Update object, another async call here
            DoOtherCall(function (err, result) {
                if (err) { return callback(err); }
                fixedResults[i] = result;
                if (--outstanding === 0) { callback(null, fixedResults); }
            });
        }
    });
});
You could use async.map for this. For each element in the array, the iterator function does whatever you want to do with that element and then calls its callback with the result (instead of fixedResults.push), which triggers the map callback once all of them are done. Each iteration and database call would then run in parallel.
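A sketch of that approach, reusing DoCreateObject and DoOtherCall as placeholders for the create and update calls from the code above:
async.map(results, function(item, callback) {
    Entry.findOne({where: {source_id: item.source_id}}, function(err, res) {
        if (err) { return callback(err); }
        if (res === null) {
            DoCreateObject(callback);  // create, handing the new object to async.map
        } else {
            DoOtherCall(callback);     // update, handing the updated object to async.map
        }
    });
}, function(err, fixedResults) {
    // fixedResults is in the same order as results
    callback(err, fixedResults);
});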
Mongo has a feature called upsert.
http://docs.mongodb.org/manual/reference/method/db.collection.update/
It does exactly what you ask for without needing the checks. You can fire all the requests async and just validate that the result comes back as true. No need for additional processing.
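A sketch of what that looks like with the MongoDB driver's update() and the upsert option; the collection name and the use of $set here are illustrative, not taken from the question:
collection.update(
    { source_id: item.source_id },  // match on source_id
    { $set: item },                 // write the newest version of the entry
    { upsert: true },               // insert a new document if nothing matched
    function(err, result) {
        // result indicates whether an insert or an update happened
    }
);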
Using Node.js and the node-postgres module to communicate with a database, I'm attempting to write a function that accepts an array of queries and callbacks and executes them all asynchronously using the same database connection. The function accepts a two-dimensional array and calling it looks like this:
perform_queries_async([
    ['SELECT COUNT(id) as count FROM ideas', function(result) {
        console.log("FUNCTION 1");
    }],
    ["INSERT INTO ideas (name) VALUES ('test')", function(result) {
        console.log("FUNCTION 2");
    }]
]);
And the function iterates over the array, creating a query for each sub-array, like so:
function perform_queries_async(queries) {
    var client = new pg.Client(process.env.DATABASE_URL);

    for (var i = 0; i < queries.length; i++) {
        var q = queries[i];
        client.query(q[0], function(err, result) {
            if (err) {
                console.log(err);
            } else {
                q[1](result);
            }
        });
    }

    client.on('drain', function() {
        console.log("drained");
        client.end();
    });

    client.connect();
}
When I ran the above code, I expected to see output like this:
FUNCTION 1
FUNCTION 2
drained
However, the output bizarrely appears like so:
FUNCTION 2
drained
FUNCTION 2
Not only is the second function getting called for both requests, it also seems as though the drain code is getting called before the client's queue of queries is finished running...yet the second query still runs perfectly fine even though the client.end() code ostensibly killed the client once the event is called.
I've been tearing my hair out about this for hours. I tried hardcoding in my sample array (thus removing the for loop), and my code worked as expected, which leads me to believe that there is some problem with my loop that I'm not seeing.
Any ideas on why this might be happening would be greatly appreciated.
The simplest way to properly capture the value of the q variable in a closure in modern JavaScript is to use forEach:
queries.forEach(function(q) {
    client.query(q[0], function(err, result) {
        if (err) {
            console.log(err);
        } else {
            q[1](result);
        }
    });
});
If you don't capture the value, your code sees the last value that q had, because the callback function executes later, in the context of the containing function.
forEach, by using a callback function, isolates and captures the value of q so it can be properly evaluated by the inner callback.
A victim of the famous JavaScript closure/loop gotcha. See my (and other) answers here:
I am trying to open 10 websocket connections with nodejs, but somehow my loop doesnt work
Basically, at the time your callback is executed, q is set to the last element of the input array. The way around it is to dynamically generate the closure.
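One way to "dynamically generate the closure" while keeping the original for loop is an immediately invoked function expression (IIFE) around the loop body:
for (var i = 0; i < queries.length; i++) {
    (function(q) {  // q is now captured separately for each iteration
        client.query(q[0], function(err, result) {
            if (err) {
                console.log(err);
            } else {
                q[1](result);
            }
        });
    })(queries[i]);
}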
It would be good to implement this using the async module. It will help you reuse the code and make it more readable. I just love the auto function provided by the async module.
Ref: https://github.com/caolan/async
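As a loose sketch only, here is how the two queries from the question could be expressed with async.auto; the task names are illustrative:
async.auto({
    count: function(cb) {
        client.query('SELECT COUNT(id) as count FROM ideas', cb);
    },
    insert: function(cb) {
        client.query("INSERT INTO ideas (name) VALUES ('test')", cb);
    }
}, function(err, results) {
    // results.count and results.insert hold each query's result;
    // check err before using them
});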