Node Async with functions - node.js

I'm trying to use the async module with two functions; here is the code. Something is wrong: the console.dir calls inside the tasks work, but the one in the final function(err, results) callback doesn't. Can anyone help me?
The last step I want to do is render oftArrayFullInfo and oftNextInfo.
async.parallel({
    one: function(callback){
        auxFunctions.foofunc1(foo1, foo1, function(oftArrayFullInfo){
            console.log("****** ASYNC 1 ARRAY");
            console.dir(oftArrayFullInfo);
            callback(oftArrayFullInfo);
        });
    },
    two: function(callback){
        auxFunctions.foofunc2(foo1, foo1, function(oftNextInfo){
            console.log("****** ASYNC 2 ARRAY");
            console.dir(oftNextInfo);
            callback(oftNextInfo);
        });
    }
},
function(err, results){
    console.log("****** RENDER 1");
    console.dir(results.one);
    console.log("****** RENDER 2");
    console.dir(results.two);
    //res.render('myView', {title: 'Job Info', oftArrayFullInfo}, {title: 'Next Jobs Info', oftNextInfo});
});

Your main issue is that you're not calling the callbacks correctly.
async callbacks, and asynchronous callbacks in Node.js in general, take two arguments: the first is used to pass any error (or null if there isn't one), and the second is used to pass the result.
You are calling them with only the first argument:
callback(oftArrayFullInfo);
This will make async think that the function has failed, which will cause results in the final callback to be undefined. When you subsequently try to access results.one, an error will be thrown.
To fix this, you should call the callbacks properly:
callback(null, oftArrayFullInfo)
callback(null, oftNextInfo)
And, as suggested already, you should uncomment the res.render().
Eventually, you should also make your auxiliary functions (auxFunctions.foofunc1 and auxFunctions.foofunc2) adhere to the same calling convention.
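Putting both fixes together, a minimal sketch of the corrected call might look like this (it assumes foo1, auxFunctions, and res from the question are in scope):
async.parallel({
    one: function(callback){
        auxFunctions.foofunc1(foo1, foo1, function(oftArrayFullInfo){
            // error-first convention: null for "no error", result second
            callback(null, oftArrayFullInfo);
        });
    },
    two: function(callback){
        auxFunctions.foofunc2(foo1, foo1, function(oftNextInfo){
            callback(null, oftNextInfo);
        });
    }
},
function(err, results){
    if (err) { return console.error(err); }
    // results.one and results.two now hold the two arrays
    res.render('myView', {
        title: 'Job Info',
        oftArrayFullInfo: results.one,
        oftNextInfo: results.two
    });
});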

OK, you did everything right, except that when you send data via res.render() to your template you have to send it as key/value pairs in a single object. So instead of
res.render('myView', {title: 'Job Info', oftArrayFullInfo}, {title: 'Next Jobs Info', oftNextInfo});
you should use
res.render('myView', {title: 'Job Info', arrayFullInfo: oftArrayFullInfo, nextInfo: oftNextInfo});
Now you can access these values with the key names arrayFullInfo and nextInfo in your template.

Related

Values getting overridden in callbacks inside callback in setInterval method, Node.js

I have a situation where I have an array of employee data and I need to process something in parallel for every employee. To implement this I broke the task into four methods; every method has a callback calling the next and returning its callback. I am using
async.eachSeries
to start the process for each element of the employee array.
In the last method I have to set a setInterval to repeat the same task if the required response has not been received. This interval is not cleared until the repeating task has run 5 times (if the desired value is not received within 5 attempts, the interval is cleared after the 5th time).
Now, the problem is that the data I am processing inside setInterval is getting overridden by the values of the last employee.
So I am not able to keep track of the processing for each element of the employee array, and the details of processing for the last employee are getting mixed up.
Along the way, the four methods I use to perform the task save data to Redis and MongoDB and call outside APIs that return their responses in callbacks.
Can anyone suggest a better way of doing this? I also feel the problem happens because I am not returning any callback from the setInterval() method, but since that method is itself asynchronous I am unsure how to handle the situation.
EmployeeArray
async.eachSeries() used to process EmployeeArray
For each element I have four callback methods:
async.eachSeries() {
    Callback1() {
        Callback2() {
            Callback3() {
                Callback4() {
                    setInterval() // no callback inside this from my side
                }
            }
        }
    }
}
As far as I know, the async each function does parallel processing. You can also use async.waterfall to make your code cleaner. Try something like this:
async.each(openFiles, function(file, callback1) {
    async.waterfall([
        function(callback) {
            callback(null, 'one', 'two');
        },
        function(arg1, arg2, callback) {
            // arg1 now equals 'one' and arg2 now equals 'two'
            callback(null, 'three');
        },
        function(arg1, callback) {
            // arg1 now equals 'three'
            callback(null, 'done');
        }
    ], function (err, result) {
        callback1(err);
    });
}, function(err){
    // if you get here without an error, your data has been processed
});
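The question also mentions not knowing how to hand a callback back out of the setInterval step. One option (a minimal sketch, not from the answer above; checkStatus is a hypothetical stand-in for the real last step) is to give each employee its own interval and attempt counter, so state isn't shared between iterations and the callback fires exactly once:
function retryForEmployee(employee, callback) {
    var attempts = 0;
    var finished = false;
    var timer = setInterval(function() {
        attempts++;
        // hypothetical check; replace with the real last-step logic
        checkStatus(employee, function(err, done) {
            if (finished) { return; }
            if (err || done || attempts >= 5) {
                finished = true;
                clearInterval(timer);   // stop after success, error, or the 5th try
                callback(err || null);  // hand control back to eachSeries
            }
        });
    }, 1000);
}

async.eachSeries(EmployeeArray, retryForEmployee, function(err) {
    // all employees processed (or an error occurred)
});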

How to properly get result array in async eachSeries for Node?

I'm trying to use async eachSeries in order to compute the report count for every category. Categories and reports are stored in separate collections, so I first get the available categories and then perform a count query for each.
This is my code:
Category.find({}, {_id: 0, name: 1}, function (err, foundCategories) {
    async.eachSeries(foundCategories,
        function (item, callback) {
            Report.count({category: item.name}, function (err, count) {
                var name = item.name;
                console.log(count);
                return callback(null, {name: count});
            });
        },
        function (err, results) {
            if (err)
                response.send(err);
            response.send(JSON.stringify(results));
        });
});
The problem is that I'm receiving nothing; the console.log outputs actual numbers there, so what am I doing wrong?
The API of eachSeries does not provide any results to the final callback - only an error in the failure case. In the success case, it's just a pure control flow "eachSeries is done" indicator, but does not provide a mechanism for passing values from the worker function. mapSeries does provide the functionality you need.
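For illustration, a minimal sketch of the question's code rewritten with async.mapSeries (keeping the Category, Report, and response objects from the question; note that {name: count} in the original would use the literal key "name", so the sketch builds the key explicitly):
Category.find({}, {_id: 0, name: 1}, function (err, foundCategories) {
    async.mapSeries(foundCategories, function (item, callback) {
        Report.count({category: item.name}, function (err, count) {
            // build {"<category name>": count} explicitly
            var entry = {};
            entry[item.name] = count;
            callback(err, entry);
        });
    }, function (err, results) {
        if (err) { return response.send(err); }
        // results is an array with one entry per category, in order
        response.send(JSON.stringify(results));
    });
});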
Similar to Peter's answer, async.waterfall provides waterfall execution of your functions, passing each return value on to the next async function in the chain.

Waiting until async call is done to proceed in Node

I'm trying to build a bot cluster, and I'm having some trouble here.
I'd like to call the following function, called BuildABot.
http://pastebin.com/zwP7rZay
Now after this call is finished, I'd like to call requestItems, which takes in the steamOffers object as a parameter.
http://pastebin.com/jsJ4fhwG
However, the object is null, because the call hasn't finished.
Is there any way to halt the call until buildABot is done?
There are various ways to handle your requirement; I will mention two options, though I am sure you can find more. You may find many more examples all over the web.
Option 1: Using a callback function -
Pass a callback function to the async function. When the async function finishes, it will call the callback, so the code in the callback will be executed only when the async call is over.
Related topic - how-to-write-asynchronous-functions-for-node-js
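For example, a minimal sketch using the names from the question (buildABot, requestItems, steamOffers); the exact signature of buildABot is an assumption, since the pastebin code isn't reproduced here:
// assumption: buildABot accepts a completion callback and hands it
// the steamOffers object once the bots are ready
buildABot(function (err, steamOffers) {
    if (err) { return console.error(err); }
    // this only runs after buildABot has finished, so steamOffers is populated
    requestItems(steamOffers);
});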
Option 2: If you have more complicated logic and you want to perform one part after another you can use the async waterfall.
Code example:
async.waterfall([
    function(callback){
        callback(null, 'one', 'two');
    },
    function(arg1, arg2, callback){
        // arg1 now equals 'one' and arg2 now equals 'two'
        callback(null, 'three');
    },
    function(arg1, callback){
        // arg1 now equals 'three'
        callback(null, 'done');
    }
], function (err, result) {
    // result now equals 'done'
});
See the async module site for more information.

async.waterfall jumping to end before functions complete

Hope you guys are well.
I am diving into Node.js and having to relearn a lot of the ways in which I used to code - and having to retrain myself to asynchronous ways... I am writing a server-side script (like a Java POJO) that will run either from the command line or be triggered by an event.
I wanted the output (return value) of one function to be the input of the next, so I decided to use async.waterfall - as I read, this executes the functions in order, using the output of one as the input of the next...
The idea of the script is to walk through a given folder structure, create an array of sub-folders, and pass that array to the next function, then do the same for each path in that array. I wanted to use underscore.js and its _.each() function as it seemed a good way to iterate through the array in sequence. But this is where I get stuck, as the code seems to fall through all the functions to the end before the work is complete. So my logic is a little off somewhere..
I use a 'walk' function to go into a folder and return all of its sub-folders. The idea is that the script will run and then call 'process.exit()' at the end of the waterfall.
The code is:
async.waterfall([
    function(callback){ /* Get List of Artists from MusicFolder */
        console.log('first->IN');
        walk(musicFolder, function(err, foldersFound) {
            if (err) { return err; }
            _.each(foldersFound, function(folderPath){
                console.log('Folder: ' + folderPath);
            });
            console.log('first->OUT');
            callback(null, foldersFound);
        });
    },
    function(artistsFound, callback){ /* Get List of Albums from Artist Folders */
        var eachLoop = null;
        console.log('second->IN');
        _.each(artistsFound, function(artistPath){
            console.log('second->eachPath:Start:' + artistPath);
            walk(artistPath, function(err, albumsFound) {
                console.log('second->Walk:Found');
                console.log(albumsFound);
                if (err) { console.log(err); }
                _.each(albumsFound, function(albumPath){
                    eachLoop++;
                    console.log('second->Walk:each:' + eachLoop);
                });
                console.log('second->Walk:End');
            });
            console.log('second->eachPath:End:' + artistPath);
        });
        console.log('second->OUT');
        callback(null, albumsFound);
    },
    function(paths, callback){
        console.log('third->IN');
        console.log('third->OUT');
        callback(null, paths);
    }
], function (err, result) {
    console.log('last->IN');
    console.log(result);
    console.log('last->OUT');
    // process.exit();
});
I have commented out the 'process.exit()' in the example.
If I uncomment the 'process.exit()' I get the following output:
first->IN
Folder: /music/Adele
Folder: /music/Alex Clare
first->OUT
second->IN
second->eachPath:Start:/music/Adele
second->eachPath:End:/music/Adele
second->eachPath:Start:/music/Alex Clare
second->eachPath:End:/music/Alex Clare
second->OUT
third->IN
third->OUT
last->IN
null
last->OUT
What I can see is it does not enter the 'walk' function in the second waterfall function, but skips the 'walk' altogether even though the 'walk' is inside the _.each() iteration.
If I comment out the 'process.exit()' command in the last function I get the following:
first->IN
Folder: /music/Adele
Folder: /music/Alex Clare
first->OUT
second->IN
second->eachPath:Start:/music/Adele
second->eachPath:End:/music/Adele
second->eachPath:Start:/music/Alex Clare
second->eachPath:End:/music/Alex Clare
second->OUT
third->IN
third->OUT
last->IN
null
last->OUT
second->Walk:Found
[ '/music/Alex Clare/The Lateness of the Hour' ]
second->Walk:each:1
second->Walk:End
second->Walk:Found
[ '/music/Adele/19',
'/music/Adele/21',
'/music/Adele/Live At The Royal Albert Hall' ]
second->Walk:each:2
second->Walk:each:3
second->Walk:each:4
second->Walk:End
I'll admit this is frustrating. Any help would be greatly appreciated as I have been rewriting this over and over for the past week in various 'async' forms and they all jump out of the functions too early - so everything is out of order.
Thanks for your help or thoughts in advance :)
Mark
It seems that your walk function is asynchronous, and you want to fire parallel asynchronous jobs, combine the results, and move down the waterfall. So what you can do is combine async.waterfall with async.parallel. Your second function may look like this:
function(artistsFound, callback) {
    // some code
    var jobs = [];
    artistsFound.forEach(function(artistPath) {
        jobs.push(function(clb) {
            walk(artistPath, function(err, albumsFound) {
                // some code
                clb(err, albumsFound);
            });
        });
    });
    // some code
    async.parallel(jobs, callback);
}
Side note: you don't have to use underscore.js to simply loop over an array. Modern JavaScript has a built-in .forEach function.
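As an alternative sketch (my variation, not part of the answer above), async.map can build and run those jobs for you, since it invokes the worker for every element in parallel and collects the results in order:
function(artistsFound, callback) {
    // walk every artist folder in parallel and collect each albumsFound array
    async.map(artistsFound, function(artistPath, clb) {
        walk(artistPath, function(err, albumsFound) {
            clb(err, albumsFound);
        });
    }, callback); // callback receives (err, arrayOfAlbumsFound)
}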

How to know when finished

I'm pretty new to Node.js, so I'm wondering how to know when all elements are processed in, let's say:
["one", "two", "three"].forEach(function(item){
processItem(item, function(result){
console.log(result);
});
});
...now if I want to do something that can only be done when all items are processed, how would I do that?
You can use the async module. A simple example:
async.map(['one','two','three'], processItem, function(err, results){
    // results[0] -> processItem('one');
    // results[1] -> processItem('two');
    // results[2] -> processItem('three');
});
The callback function of async.map will be called when all items are processed. However, you should be careful with processItem; it should look something like this:
function processItem(item, callback){
    // database call or something:
    db.call(myquery, function(){
        callback(); // Call when the async event is complete!
    });
}
forEach is blocking, see this post:
JavaScript, Node.js: is Array.forEach asynchronous?
so to call a function when all items are done processing, it can be done inline:
["one", "two", "three"].forEach(function(item){
processItem(item, function(result){
console.log(result);
});
});
console.log('finished');
If there is a high I/O-bound load for each item to be processed, then take a look at the module Mustafa recommends. There is also a pattern referenced in the post linked above.
Although the other answers are correct, since Node.js now supports ES6, in my opinion using the built-in Promise library is more stable and tidy.
You don't even need to require anything: ECMA took the Promises/A+ specification and implemented it in native JavaScript.
Promise.all(["one", "two","three"].map(processItem))
.then(function (results) {
// here we got the results in the same order of array
} .catch(function (err) {
// do something with error if your function throws
}
As JavaScript is a fairly problematic language when it comes to debugging (dynamic typing, asynchronous flow), sticking with promises instead of callbacks will save you time in the end.
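One caveat worth adding: Promise.all expects each mapped call to return a promise, so a callback-style processItem like the one discussed above would first need a small wrapper, for example:
// wrap the callback-style processItem from the question in a promise
function processItemAsync(item) {
    return new Promise(function (resolve) {
        processItem(item, function (result) {
            resolve(result); // fulfil once this item has been processed
        });
    });
}

Promise.all(["one", "two", "three"].map(processItemAsync))
    .then(function (results) {
        console.log('finished', results);
    });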
