async.whilst - pausing between calls - node.js

I have a function which I need to call a number of times, and instead of using a for loop I'm using async.whilst. But what I need is that the next call to the function is not made before the previous call completes, which is not what's happening with async.whilst. Is there a way to implement this? (I'm using setTimeout to pause between each call, but it is not very clean.)
Many thanks, C

I'd use the forever construct. Assuming your function's name is myFunction and it accepts a callback as a parameter:
var count = 0;
var limit = 10; // number of times the function should run

async.forever(
    function(next) {
        myFunction(function () {
            count++;
            if (count < limit) {
                next();
            } else {
                next(true); // a non-null "error" value stops the loop
            }
        });
    },
    function(ended) {
        // the iteration has ended
    }
);
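To try the snippet end to end, myFunction can be any function that finishes asynchronously and then invokes its callback; here is a hypothetical stand-in (the name and delay are just for illustration):
var async = require('async');

function myFunction(done) {
    // simulate an asynchronous operation that takes some time
    setTimeout(function () {
        console.log('call finished at', new Date().toISOString());
        done();
    }, 500);
}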

Related

how can I call another function after executing multiple functions in a loop in node js

Please find the code below:
function get_btc(address) {
    address_transaction(address, user_id, coin_key, deposite_txn_fee, function(callback) {
        for (var j = 0; j < callback.response.data.txs.length; j++) {
            let user_id = callback.user_id;
            //some code//
        }
    });
}

get_label_info(function(err, data) {
    for (var i = 0; i < data.length; i++) {
        let address = data[i].address;
        deposite_model.get_coin_info(function(err, data1) {
            var coin_name = data1[0].coin_code;
            const return_functions = get_switch(coin_name);
            if (return_functions) {
                obj[return_functions](address);
            }
        })
    }
});

function all_completed() {
    console.log('all functions has been completed');
}
With the help of the above code, I want to execute the all_completed function when all the functions have completely finished.
At the initial start the get_label_info function is executed, then control moves on to the get_btc function.
Please help me with how I can run the all_completed function after all the other functions have completed.
I'll assume you are using ES6, and that you know what a Promise is in that context. In that case, wrap each of your callback-based calls in a Promise that resolves when the callback completes. Then, in your loop, push all of those Promises into an array variable. Finally, call Promise.all with that array as an argument, and call then on the result to encapsulate the code you want to run after they all complete (resolve).
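A minimal sketch of that idea, reusing the names from the question (the extra completion callback passed to the per-coin handler, e.g. get_btc, is an assumption; those handlers would need to call it once their own work finishes):
function getLabelInfoAsync() {
    // wrap the callback-style get_label_info in a Promise
    return new Promise(function (resolve, reject) {
        get_label_info(function (err, data) {
            if (err) return reject(err);
            resolve(data);
        });
    });
}

getLabelInfoAsync().then(function (data) {
    // one Promise per row; each resolves when that row's work is done
    var promises = data.map(function (row) {
        return new Promise(function (resolve, reject) {
            deposite_model.get_coin_info(function (err, data1) {
                if (err) return reject(err);
                var coin_name = data1[0].coin_code;
                var return_functions = get_switch(coin_name);
                if (return_functions) {
                    // hypothetical: the handler takes a completion callback
                    obj[return_functions](row.address, resolve);
                } else {
                    resolve();
                }
            });
        });
    });
    return Promise.all(promises);
}).then(function () {
    all_completed();
});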

Can a Node.js stream be made into a coroutine?

Is there a way to make a Node.js stream work as a coroutine?
Example
A Fibonacci number stream.
fibonacci.on('data', cb);
//The callback (cb) is like
function cb(data)
{
//something done with data here ...
}
Expectation
function* fibonacciGenerator()
{
fibonacci.on('data', cb);
//Don't know what has to be done further...
};
var fibGen = fibonacciGenerator();
fibGen.next().value(cb);
fibGen.next().value(cb);
fibGen.next().value(cb);
.
.
.
Take the desired numbers from the generator. The Fibonacci number series is just an example; in reality the stream could be of anything: a file, a MongoDB query result, etc.
Maybe something like this:
Make the 'stream.on' function a generator.
Place yield inside the callback function.
Obtain the generator object.
Call next and take the next value from the stream.
Is it at least possible? If yes, how, and if not, why? Maybe a dumb question :)
If you don't want to use a transpiler (e.g. Babel) or wait until async/await makes it to Node.js, you can implement it yourself using generators and promises.
The downside is that your code must live inside a generator.
First, you can make a helper that receives a stream and returns a function that, when called, returns a promise for the next "event" of the stream (e.g. data).
function streamToPromises(stream) {
    return function() {
        if (stream.isPaused()) {
            stream.resume();
        }
        return new Promise(function(resolve) {
            stream.once('data', function() {
                resolve.apply(stream, arguments);
                stream.pause();
            });
        });
    };
}
It pauses the stream when you're not using it, and resumes it when you ask it for the next value.
Next, you have a helper that receives a generator as an argument, and every time it yields a promise, it resolves it and passes its result back to the generator.
function run(fn) {
    var gen = fn();
    var promise = gen.next().value;
    var tick = function() {
        promise.then(function() {
            // feed the resolved value back into the generator
            var next = gen.next.apply(gen, arguments);
            if (next.done) {
                return; // the generator has finished, stop ticking
            }
            promise = next.value;
            tick();
        }).catch(function(err) {
            // TODO: Handle error.
        });
    };
    tick();
}
Finally, you would do your own logic inside a generator, and run it with the run helper, like this:
run(function*() {
    var nextFib = streamToPromises(fibonacci);
    var n;
    n = yield nextFib();
    console.log(n);
    n = yield nextFib();
    console.log(n);
});
Your own generator will yield promises, pausing its execution and passing the control to the run function.
The run function will resolve the promise and pass its value back to your own generator.
That's the gist of it. You'd need to modify streamToPromises to check for other events as well (e.g. end or error).
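For example, a variant of streamToPromises that also settles on end and error might look roughly like this (listener cleanup between calls is glossed over here):
function streamToPromises(stream) {
    return function() {
        if (stream.isPaused()) {
            stream.resume();
        }
        return new Promise(function(resolve, reject) {
            // whichever of these events fires first settles the promise
            stream.once('data', function(chunk) {
                stream.pause();
                resolve(chunk);
            });
            stream.once('end', function() {
                resolve(null); // null signals "no more data" to the caller
            });
            stream.once('error', reject);
        });
    };
}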
Alternatively, you can go the other way around and wrap a generator in a Readable stream:
const { Readable } = require('stream');

class FibonacciGeneratorReader extends Readable {
    _isDone = false;
    _fibCount = null;
    _gen = function *() {
        let prev = 0, curr = 1, count = 1;
        while (this._fibCount === -1 || count++ < this._fibCount) {
            yield curr;
            [prev, curr] = [curr, prev + curr];
        }
        return curr;
    }.bind(this)();

    constructor(fibCount) {
        super({
            objectMode: true,
            read: size => {
                if (this._isDone) {
                    this.push(null);
                } else {
                    let fib = this._gen.next();
                    this._isDone = fib.done;
                    this.push(fib.value.toString() + '\n');
                }
            }
        });
        this._fibCount = fibCount || -1;
    }
}
new FibonacciGeneratorReader(10).pipe(process.stdout);
Output should be:
1
1
2
3
5
8
13
21
34
55

Wait for all queries to finish and fill at the same time asynchronously

I want to fill each object of the result of a query with other queries, and I want to do it all asynchronously.
Here is an example of the way I do it currently:
var q = knex.select().from('sector');

q.then(function (sectores) {
    var i = -1;
    (function getDetalles(sectores) {
        i++;
        if (i < sectores.length) {
            knex.select().from('sector_detalle')
                .where('sector_id', sectores[i].id)
                .then(function (detalles) {
                    // this is what I want to do asynchronously
                    sectores[i].sector_detalles = detalles;
                    console.log(sectores[i]);
                    getDetalles(sectores);
                });
        } else {
            res.send({sucess: true, rows: sectores});
        }
    })(sectores);
});
I did some research and found this: wait for all promises to finish in nodejs with bluebird.
It is close to what I want, but I don't know how to implement it.
I think you're looking for the map method that works on a promise for an array, and will invoke an asynchronous (promise-returning) callback for each of the items in it:
knex.select().from('sector').map(function(sector) {
    return knex.select().from('sector_detalle')
        .where('sector_id', sector.id)
        .then(function(detalles) {
            sector.sector_detalles = detalles;
            // console.log(sector);
            return sector;
        });
}).then(function(sectores) {
    res.send({sucess: true, rows: sectores});
});

How can I stop async.queue after the first failure?

I want to stop executing my async.queue after the first task error occurs. I need to perform several similar actions in parallel with a concurrency restriction, but stop all the actions after the first error. How can I do that, or what should I use instead?
Assume you fired 5 parallel functions, each taking 5 seconds, and in the 3rd second function 1 failed. How can you then stop the execution of the rest?
It depends on what those functions do; you may poll using setInterval. However, if your question is how to stop further tasks from being processed by the queue, you may do this:
q.push(tasks, function (err) {
    if (err && !called) {
        // q.kill() prevents the remaining queued tasks from being processed;
        // note, however, that tasks that have already started will run anyway.
        q.kill();
        // This prevents the final callback from being called twice.
        called = true;
        // This is the main process callback, the final callback.
        main(err, results);
    }
});
Here is a full working example:
var async = require('async');

/*
 This function is the actual work you are trying to do.
 Note that if, for example, you are running child processes here,
 q.kill() will not stop the execution of those processes; you actually
 need to keep track of the spawned processes and kill them yourself
 when you call q.kill() in the 'pushCb' function. In the case of a
 plain long-running function, you may poll using setInterval.
*/
function worker(task, wcb) {
    setTimeout(function workerTimeout() {
        if (task === 11 || task === 12 || task === 3) {
            return wcb('error in processing ' + task);
        }
        wcb(null, task + ' got processed');
    }, Math.floor(Math.random() * 100));
}
/*
 This function pushes the tasks to async.queue, which then
 hands them to your worker function.
*/
function process(tasks, concurrency, pcb) {
    var results = [], called = false;

    var q = async.queue(function qWorker(task, qcb) {
        worker(task, function wcb(err, data) {
            if (err) {
                return qcb(err); // propagate the error to qcb
            }
            results.push(data);
            qcb();
        });
    }, concurrency);

    /*
     The trick is in this callback. Note that checking q.tasks.length
     does not work; q.kill(), introduced in async 0.7.0, just sets the
     drain function to null and the tasks length to zero.
    */
    q.push(tasks, function qcb(err) {
        if (err && !called) {
            q.kill();
            called = true;
            pcb(err, results);
        }
    });

    q.drain = function drainCb() {
        pcb(null, results);
    };
}
var tasks = [];
var concurrency = 10;

for (var i = 1; i <= 20; i += 1) {
    tasks.push(i);
}

process(tasks, concurrency, function pcb(err, results) {
    console.log(results);
    if (err) {
        return console.log(err);
    }
    console.log('done');
});
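As for the setInterval polling mentioned in the comments above, for genuinely long-running work the worker could periodically check a shared abort flag; a rough sketch (the aborted flag and timings are hypothetical):
var aborted = false; // set to true from the queue's error handler

function longWorker(task, wcb) {
    var elapsed = 0;
    var timer = setInterval(function () {
        if (aborted) {
            clearInterval(timer);
            return wcb(new Error('aborted before ' + task + ' finished'));
        }
        elapsed += 100;
        if (elapsed >= 5000) { // pretend the real work takes ~5 seconds
            clearInterval(timer);
            wcb(null, task + ' got processed');
        }
    }, 100);
}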
The async documentation on the GitHub page is either outdated or incorrect: while inspecting the queue object returned by the async.queue() method, I do not see the kill() method.
Nevertheless, there is a way around it. The queue object has a tasks property, which is an array; simply assigning it a reference to an empty array did the trick for me.
queue.push( someTasks, function ( err ) {
if ( err ) queue.tasks = [];
});

nodejs Async's whilst

Greetings all,
I want to call a function repeatedly, but I want each call to run only when the previous call has completed. Does Async's whilst fit what I need? Or do the calls happen in parallel?
Thanks!
Gary
Whilst will do what you need; it runs each iteration in series. Before each run it calls the "test" function to make sure it should run again.
Their example:
var count = 0;
async.whilst(
    function () { return count < 5; },
    function (callback) {
        count++;
        setTimeout(callback, 1000);
    },
    function (err) {
        // 5 seconds have passed
    }
);
As Chad noted, Async's whilst will do the job.
You may want to consider Async's until (the inverse of whilst). Both do the same job; the key difference is:
async.whilst will call the function each time the test passes
async.until will call the function each time the test fails
(see the comparison sketch below)
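For comparison, the whilst example above rewritten with async.until (following the same async v2 calling convention, where the test is synchronous):
var count = 0;
async.until(
    function () { return count >= 5; }, // stop once this returns true
    function (callback) {
        count++;
        setTimeout(callback, 1000);
    },
    function (err) {
        // 5 seconds have passed
    }
);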
Async's whilst should do the trick for you. Please note which version of async you are using before referring to the code in the accepted answer. As one of the comments on the accepted answer suggests, the signature of this loop has changed slightly, which might not be very easy to spot.
Async v2: Accepted Answer
Async v3: https://caolan.github.io/async/v3/docs.html#whilst
Their Example:
var count = 0;
async.whilst(
    function test(cb) { cb(null, count < 5); },
    function iter(callback) {
        count++;
        setTimeout(function() {
            callback(null, count);
        }, 1000);
    },
    function (err, n) {
        // 5 seconds have passed, n = 5
    }
);
