I'm trying to use async eachSeries in order to compute the report count for every category. Categories and Reports are stored in separate collections, so I first fetch the available categories and then run a count query for each of them.
This is my code:
Category.find({}, {_id: 0, name: 1}, function (err, foundCategories) {
  async.eachSeries(foundCategories,
    function (item, callback) {
      Report.count({category: item.name}, function (err, count) {
        var name = item.name;
        console.log(count);
        return callback(null, {name: count});
      });
    },
    function (err, results) {
      if (err)
        response.send(err);
      response.send(JSON.stringify(results));
    });
});
The problem is that I'm receiving nothing in the response, even though the console.log outputs actual numbers. What am I doing wrong?
The API of eachSeries does not provide any results to the final callback - only an error in the failure case. In the success case, it's just a pure control flow "eachSeries is done" indicator, but does not provide a mechanism for passing values from the worker function. mapSeries does provide the functionality you need.
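For example, a rough (untested) sketch using async.mapSeries, assuming the same Category, Report and response objects from the question:

Category.find({}, {_id: 0, name: 1}, function (err, foundCategories) {
  if (err) return response.send(err);
  async.mapSeries(foundCategories,
    function (item, callback) {
      // Count the reports for this category and hand the value to mapSeries
      Report.count({category: item.name}, function (err, count) {
        callback(err, {name: item.name, count: count});
      });
    },
    function (err, results) {
      // results is an array of {name, count} objects, in the same order as foundCategories
      if (err) return response.send(err);
      response.send(JSON.stringify(results));
    });
});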
Similar to Peter's answer, async.waterfall gives you waterfall execution of your functions, passing each return value on to the next async function in the chain.
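A rough sketch of that approach, again assuming the Category/Report models and response object from the question (the mapSeries step could equally be a plain loop):

async.waterfall([
  function (callback) {
    // Step 1: fetch the categories; the rows are passed on to the next step
    Category.find({}, {_id: 0, name: 1}, callback);
  },
  function (foundCategories, callback) {
    // Step 2: count the reports for each category received from step 1
    async.mapSeries(foundCategories, function (item, cb) {
      Report.count({category: item.name}, function (err, count) {
        cb(err, {name: item.name, count: count});
      });
    }, callback);
  }
], function (err, results) {
  if (err) return response.send(err);
  response.send(JSON.stringify(results));
});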
Related
I'm making a DNS lookup API using Node.js and the Express.js framework; when it receives a POST request, it should return the addresses for the different record types.
app.post('/', (req, res) => {
  // Request format
  // const l = {
  //   lookup: 'twitter.com',
  //   recordTypes: ['A', 'TXT']
  // };

  // Using destructuring to fetch properties
  const { lookup, recordTypes } = req.body;
  console.log(lookup, recordTypes);

  // For each record type
  recordTypes.forEach(function(type) {
    // setTimeout to get something async
    setTimeout(function() {
      dns.resolve(lookup.toLowerCase(), type, (err, addresses) => {
        console.log(type);
        if (err) {
          return console.log(`\nType(${type}):\n`, err);
        }
        result = result + JSON.stringify({ type: `${type}`, response: { addresses } });
        console.log(result);
      });
    }, 2000);
  });
  res.send(result);
});
It logs the correct values in the console, but the response comes back as an empty string. I used setTimeout to mimic the asynchronous nature of the request, but it just does not work.
Please assume that I have declared things like result etc., because that part is working. Also, please don't redirect me to the Node.js documentation; I have already read it and that's not the problem here. The problem is that I need to collect every record type's result into an array and send that back as the response.
Here's what I have tried:
Tried to push the response for each record type into the result array
Tried to use a for...of loop instead of forEach
Please help!
The way I'm reading your code is that for each item in the array you correctly use callbacks to do each individual bit of processing.
However, remember that forEach itself is not asynchronous. Thus you are setting up a bunch of tasks that will complete sometime, then returning undefined... then your results start to trickle in.
There are a couple of ways to do this correctly. As you are using callbacks here, I will stick with that style. You want a callback to fire when all items in the array have been completely processed. The async module does this very well, providing a lot of high-quality methods that act on arrays and give you a callback when they are all done.
Your function will look something like:
let result = '';

async.each(recordTypes,
  (type, done) => {
    dns.resolve(lookup.toLowerCase(), type, (err, addresses) => {
      result = result + JSON.stringify({ type: `${type}`, response: { addresses } });
      done(err);
    });
  },
  (allOverError) => {
    // Runs once every record type has been processed (or an error occurred)
    res.send(result);
  }
);
Notice there are two function parameters here: the first one is called for every item in the list, and the last is called when every item in the list has been completely processed.
There are other ways too, promises or the async/await keywords (confusing because of the name of the async module), but callbacks are good.
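For completeness, here is a rough promise-based sketch of the same route, assuming the Express app and dns module from the question and using util.promisify (untested):

const util = require('util');
const dns = require('dns');
const resolve = util.promisify(dns.resolve);

app.post('/', async (req, res) => {
  const { lookup, recordTypes } = req.body;
  try {
    // Run one resolve per record type and wait for all of them to settle
    const results = await Promise.all(
      recordTypes.map(type =>
        resolve(lookup.toLowerCase(), type).then(addresses => ({ type, addresses }))
      )
    );
    res.send(JSON.stringify(results));
  } catch (err) {
    res.status(500).send(err.message);
  }
});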
My goal is to collect the values read from a Redis hash. I am using the redis package for Node.js.
My code is the following:
getFromHash(ids) {
  const resultArray = [];
  ids.forEach((id) => {
    common.redisMaster.hget('mykey', id, (err, res) => {
      resultArray.push(res);
    });
  });
  console.log(resultArray);
},
The array logged at the end of the function is empty, even though res is not empty inside the callback. What could I do to fill this array?
You need to use some control flow, either the async library or Promises (as described in the redis docs).
Put your console.log inside the callback that receives the results from the redis call; then you will see the values printed. Use one of the control-flow patterns for your .forEach as well, since that loop is synchronous and does not wait for the hget callbacks.
If you modify your code to something like this, it will work nicely:
var getFromHash = function getFromHash(ids) {
  const resultArray = [];
  ids.forEach((id) => {
    common.redisMaster.hget('mykey', id, (err, res) => {
      resultArray.push(res);
      if (resultArray.length === ids.length) {
        // All done.
        console.log('getFromHash complete: ', resultArray);
      }
    });
  });
};
In your original code you're printing the result array before any of the hget calls have returned.
Another approach will be to create an array of promises and then do a Promise.all on it.
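A rough sketch of that variant, assuming the hget callback API is wrapped with util.promisify (the redis client used in the question is callback-based):

const { promisify } = require('util');
const hgetAsync = promisify(common.redisMaster.hget).bind(common.redisMaster);

function getFromHash(ids) {
  // One promise per id; Promise.all resolves once every hget has returned
  return Promise.all(ids.map(id => hgetAsync('mykey', id)))
    .then(resultArray => {
      console.log('getFromHash complete: ', resultArray);
      return resultArray;
    });
}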
You'll see this kind of behavior a lot with Node, remember it uses asynchronous calls for almost all i/o. When you're coming from a language where most function calls are synchronous you get tripped up by this kind of problem a lot!
I have a problem, but I have no idea how one would go about solving it.
I'm using loopback, but I think I would have faced the same problem with plain MongoDB sooner or later. Let me explain what I am doing:
I fetch entries from another REST service, then I prepare the entries for my API response (the entries are not ready yet, because they don't have an id from my database).
Before I send the response I want to check whether each entry exists in the database (determined by source_id):
If it doesn't, create it.
If it does, use it and update it to the newer version.
Then send the response with the entries (which now have database ids assigned to them).
This seems okay and easy to implement, but it isn't, as far as my knowledge goes. I will try to explain further in code:
//This will not work since there are many async calls, and fixedResults will be empty at the end
var fixedResults = [];
//results is an array of entries
results.forEach(function(item) {
  Entry.findOne({where: {source_id: item.source_id}}, function(err, res) {
    //Did we find it in database?
    if (res === null) {
      //Create object, another async call here
      fixedResults.push(newObj);
    } else {
      //Update object, another async call here
      fixedResults.push(updatedObj);
    }
  });
});
callback(null, fixedResults);
Note: I left some of the code out, but I think it's pretty self-explanatory if you read through it.
So I want to iterate through all the objects, create or update each of them in the database, and then, when all of them have been updated/created, use them. How would I do this?
You can use promises. A promise lets you register callbacks that are invoked once some other asynchronous operation has completed. Here's an example of chaining promises together: https://coderwall.com/p/ijy61g.
The q library is a good one - https://github.com/kriskowal/q
This question how to use q.js promises to work with multiple asynchronous operations gives a nice code example of how you might build these up.
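As a rough illustration with q (a sketch only; createObject and updateObject stand in for the create/update calls you left out of the question):

var Q = require('q');

var promises = results.map(function (item) {
  // Q.ninvoke wraps the node-style findOne callback in a promise
  return Q.ninvoke(Entry, 'findOne', {where: {source_id: item.source_id}})
    .then(function (res) {
      // Create or update, returning (a promise for) the fixed object
      return res === null ? createObject(item) : updateObject(item, res);
    });
});

Q.all(promises).then(function (fixedResults) {
  callback(null, fixedResults);
}, function (err) {
  callback(err);
});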
This pattern is generically called an 'async map'
var fixedResults = [];
var outstanding = results.length;
//results is array of entries
results.forEach(function(item, i) {
  Entry.findOne({where: {source_id: item.source_id}}, function(err, res) {
    if (err) return callback(err);
    //Did we find it in database?
    if (res === null) {
      //Create object, another async call here
      DoCreateObject(function (err, result) {
        if (err) return callback(err);
        fixedResults[i] = result;
        if (--outstanding === 0) callback(null, fixedResults);
      });
    } else {
      //Update object, another async call here
      DoOtherCall(function (err, result) {
        if (err) return callback(err);
        fixedResults[i] = result;
        if (--outstanding === 0) callback(null, fixedResults);
      });
    }
  });
});
You could use async.map for this. For each element in the array, the iterator function does whatever you want to do to that element and then calls its callback with the result (instead of fixedResults.push); the map callback is triggered once all of them are done. Each iteration and database call would then run in parallel.
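Roughly like this (a sketch; upsertEntry is a hypothetical helper standing in for the create/update call you left out, calling its callback with the fixed object):

async.map(results, function (item, done) {
  Entry.findOne({where: {source_id: item.source_id}}, function (err, res) {
    if (err) return done(err);
    // Create or update depending on whether res is null, then call done(err, fixedObject)
    upsertEntry(item, res, done);
  });
}, function (err, fixedResults) {
  // fixedResults is in the same order as results
  callback(err, fixedResults);
});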
Mongo has an update option called upsert.
http://docs.mongodb.org/manual/reference/method/db.collection.update/
It does exactly what you ask for without needing the existence checks. You can fire all the requests async and just validate that the result comes back as true. No need for additional processing.
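In plain MongoDB shell/driver syntax it would look roughly like this (the collection and field names are just placeholders; loopback models expose a similar upsert/updateOrCreate method):

// Update the entry matching source_id, or insert it if it does not exist yet
db.entries.update(
  { source_id: item.source_id },
  { $set: { name: item.name, payload: item.payload } },
  { upsert: true }
);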
I have to implement a program in Node.js that looks like the following code snippet. It has an array that I have to traverse, matching the values against database table entries. I need to wait until the loop ends and then send the result back to the calling function:
var arr = [];
arr = [one, two, three, four, five];
for (var j = 0; j < arr.length; j++) {
  var str = "/^" + arr[j] + "/";
  // consider collection to be a variable pointing to a database table
  collection.find({value: str}).toArray(function getResult(err, result) {
    //do something in case a match is found in the database...
  });
}
However, since the find calls execute asynchronously, the loop finishes before any of the callbacks run (the str value here is meant to be a regex passed to MongoDB's find in order to match partial strings), so I am unable to traverse the array and get the required output.
Also, I am having a hard time sending the result back to the calling function, because I have no idea when the loop will finish executing.
Try using async.each. This will let you iterate over an array and execute an asynchronous function for each item. Async is a great library that has solutions and helpers for many common asynchronous patterns and problems.
https://github.com/caolan/async#each
Something like this:
var arr = [];
arr = [one, two, three, four, five];

async.each(arr, function (item, callback) {
  var str = "/^" + item + "/";
  // consider collection to be a variable pointing to a database table
  collection.find({value: str}).toArray(function getResult(err, result) {
    if (err) { return callback(err); }
    // do something in case a match is found in the database...
    // whatever logic you want to run on result should go here, then execute callback
    // to indicate that this iteration is complete
    callback(null);
  });
}, function (error) {
  // At this point, the each loop is done and you can continue processing here
  // Be sure to check for errors!
});
I have two problems implementing a RESTful service using Node.js, the node-postgres lib and a PostgreSQL DB, and both are due to the async nature of JS.
A) I need to pass an extra argument to a callback in client.query(query, callback) call
I am inside the callback of a query, going through an array of rows recently fetched from the DB, and I want to launch a subsequent query for each of them:
var query = client.query('SELECT * FROM event', queryAllEventsHandler);

function queryAllEventsHandler(err, result) {
  allEvents = result.rows;
  /* allEvents is a JSON array with the following format
     [ {"id_event":1, "name":"name of the event"},
       {"id_event":1, "name":"name of the event"} ]
  */
  for (var i = 0; i < allEvents.length; i++) {
    client.query('SELECT * FROM days where id_event = $1', [allEvents[i].id_event], function (err, result) {
      //I want to have a reference to variable i here
    });
  }
}
In the above example I want to do something like:
client.query('SELECT * FROM days where id_event = $1',[allEvents[i].id_event], function( AN_EXTRA_ARG, err, result)
Where AN_EXTRA_ARG is an extra argument or a closure variable available in the callback function... How can I achieve this? Should I create a closure over the value of i and pass it as the callback's arg? How? :|
B) "Synchronizing" queries
I need to launch various queries and create a custom JSON object from all of them. Since every query and its callback are asynchronous (waiting for no one), I was looking for a way to "tame" it, and among other things I found a solution that had occurred to me in the first place but seemed a bit bad/lousy:
Keeping a query count is really the way to go, as @jslatts suggests in Synchronous database queries with Node.js?
With regards to question A, you could create a function that handles both of your queries and only calls back when the last query has executed, returning both results to the callback.
for (var i = 0; i < allEvents.length; i++) {
  query(client, allEvents[i], function(result1, result2) {
    //do something
  });
}

function query(client, event, callback) {
  client.query('SELECT * FROM days where id_event = $1', [event.id_event], function (err1, result1) {
    client.query('SELECT * FROM days where id_event = $1', [event.id_event], function (err2, result2) {
      callback(result1, result2);
    });
  });
}
I don't like answering my own question, but this might be of interest to someone... Regarding part A of my question: you can bind a custom object to this in your callback function.
As you know, the keyword this refers to the global object (Window in a browser) inside a plain function call (unless it's a method call). Using the bind function you can change what this references to your own object...
So what I did was create a named function queryCallback
function queryCallback(err, result) {
  // this === the global object (default)
}
changed the anonymous callback function to the named one queryCallback:
client.query('SELECT * ... where id_event = $1', [allEvents[i].id_event], queryCallback.bind({"position": i}));
Now, note the queryCallback.bind({"position": i}) part.
What bind(my_custom_this, [other args]) does is bind a custom object (in my case {"position": i}) to this inside the function on which bind was called; err and result are then supplied by client.query when it invokes the bound callback...
Now we have this scenario:
function queryCallback(err, result) {
  // this === {"position": i}
}
Bind explained: http://fitzgeraldnick.com/weblog/26/
A) I personally like lodash's (or underscore's, if you prefer) partial() for this. It takes a function and a number of arguments and returns a new function with the provided arguments already applied and the remaining arguments still open. It's very much like the functional concept of partial application (a close relative of currying).
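For instance, a quick sketch reusing the loop from your question (dayQueryHandler is just a made-up name for the callback):

var _ = require('lodash');

function dayQueryHandler(position, err, result) {
  // 'position' is pre-filled by _.partial; err and result come from client.query
  console.log('rows for event at index', position, result.rows);
}

for (var i = 0; i < allEvents.length; i++) {
  client.query(
    'SELECT * FROM days WHERE id_event = $1',
    [allEvents[i].id_event],
    _.partial(dayQueryHandler, i)
  );
}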
B) For combining multiple asynchronous results I highly recommend async. The syntax takes a little getting used to, but it makes things like this very easy. Quick sample:
async.parallel({
  one: function(callback){
    db.fetch(options, callback);
  },
  two: function(callback){
    db.fetch(options, callback);
  }
},
function(err, results){
  // this callback will get called when either parallel call gives an error
  // or when both have called their callback
  if (err) {
    // handle error
    return;
  }
  // get the results from results.one and results.two
});
== Added in edit ==
Actually, lodash also provides a nicer (imho), albeit slightly more expensive (due to the extra function calls), solution for your problem A):
_(allEvents).each(function(event, index, array) {
  client.query('SELECT * FROM days where id_event = $1', [event.id_event], function (err, result) {
    // Just use 'index' here, which doesn't change during the each
  });
});
For B), your options include async or a Promise library (such as Q, when.js, Bluebird, etc.).