I'm pretty sure I am over-complicating this, so some help is greatly appreciated!
I have a class that contains several slow methods. So I am using promises in order to use the methods sequentially without using callbacks.
One of the methods returns an array of data; it is necessary to loop through this array and execute a different method on each value. In order to do this, I am using Promise.all(). The problem is that this loop does not complete before the original promise chain moves onto the next .then() in the list.
Class:
class worker {
    slowFunctionOne(list) {
        return new Promise(function(resolve, reject) {
            for ( var i in list ) {
                list[i] = ( list[i] * 2 );
            }
            setTimeout(() => resolve(list), 1000);
        });
    }

    slowFunctionTwo(number) {
        return new Promise(function(resolve, reject) {
            number = ( number / 2 );
            setTimeout(() => resolve(number), 1000);
        });
    }
}

module.exports = worker;
Main:
const worker = require('./worker');
var w = new worker();
var list = [2,4,6,8,10];
var promises = [];
w.slowFunctionOne(list)
    .then(function(value) {
        console.log("After first method:",value);
        for ( i in value ) {
            promises.push(w.slowFunctionTwo(value[i]));
        }
        var output = [];
        Promise.all(promises)
            .then(function(number) {
                console.log("After second method:",number);
                output = number;
            });
        return output;
    })
    .then(function(val) {
        console.log("Finally:",val);
    });
Output:
After first method: [ 4, 8, 12, 16, 20 ]
Finally: []
After second method: [ 2, 4, 6, 8, 10 ]
So you can see that slowFunctionOne() runs and returns an array. It then iterates through that array and executes slowFunctionTwo() on each value. My intention is that all of the modified values from slowFunctionTwo() are put into the "output" variable once Promise.all() completes; finally, the chain should reach the last .then() and log "output".
As you can see, the code inside the last .then() is executing before Promise.all() is complete.
So how do I make the final .then() wait until everything inside the nested promise chain is complete? I suspect that async/await is the right answer, but I can't figure out how to utilize that for the nested portion without screwing up the top level chain.
Your code does not wait for Promise.all before your top-level Promise chain continues on via return output.
The solution here is: don't use output at all. You need to return Promise.all(...).then(...) (instead of return output) so that your final top-level then waits for the Promise.all's then to finish first.
return Promise.all(promises)
    .then(function(number) {
        console.log("After second method:", number);
        return number;
    });
By doing return number inside the return Promise.all(...).then(...), you ensure that number is passed in as the val argument to the top-level then handler.
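Putting it together, the main script could look roughly like this (a minimal sketch that reuses the worker class from the question):

const worker = require('./worker');
const w = new worker();

w.slowFunctionOne([2, 4, 6, 8, 10])
    .then(function(value) {
        console.log("After first method:", value);
        // build one promise per item and return the Promise.all chain
        return Promise.all(value.map(v => w.slowFunctionTwo(v)))
            .then(function(number) {
                console.log("After second method:", number);
                return number;
            });
    })
    .then(function(val) {
        // runs only after Promise.all has resolved
        console.log("Finally:", val);
    });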
Related
I have a function with multiple forEach loops:
async insertKpbDocument(jsonFile) {
    jsonFile.doc.annotations.forEach((annotation) => {
        annotation.entities.forEach(async (entity) => {
            await this.addVertex(entity);
        });
        annotation.relations.forEach(async (relation) => {
            await this.addRelation(relation);
        });
    });
    return jsonFile;
}
I need to make sure that the async code in the forEach loop calling the this.addVertex function is really done before executing the next one.
But when I log variables, it seems that the this.addRelation function is called before the first loop is really over.
So I tried adding await before every loop, like so:
await jsonFile.doc.annotations.forEach(async (annotation) => {
    await annotation.entities.forEach(async (entity) => {
        await this.addVertex(entity);
    });
    await annotation.relations.forEach(async (relation) => {
        await this.addRelation(relation);
    });
});
But I get the same behavior.
Maybe the log function has some latency? Any ideas?
As we've discussed, await does not pause a .forEach() loop and does not make the 2nd item of the iteration wait for the first item to be processed. So, if you're really trying to do asynchronous sequencing of items, you can't really accomplish it with a .forEach() loop.
For this type of problem, async/await works really well with a plain for loop because await does pause the execution of the actual for statement, giving you the sequencing of asynchronous operations that it appears you want. Plus, it even works with nested for loops because they are all in the same function scope.
To show you how much simpler this can be using for/of and await, it could be done like this:
async insertKpbDocument(jsonFile) {
    for (let annotation of jsonFile.doc.annotations) {
        for (let entity of annotation.entities) {
            await this.addVertex(entity);
        }
        for (let relation of annotation.relations) {
            await this.addRelation(relation);
        }
    }
    return jsonFile;
}
You get to write synchronous-like code that is actually sequencing asynchronous operations.
If you are really avoiding any for loop, and your real requirement is only that all calls to addVertex() come before any calls to addRelation(), then you can use .map() instead of .forEach(), collect an array of promises, and use Promise.all() to wait on the whole array:
insertKpbDocument(jsonFile) {
    return Promise.all(jsonFile.doc.annotations.map(async annotation => {
        await Promise.all(annotation.entities.map(entity => this.addVertex(entity)));
        await Promise.all(annotation.relations.map(relation => this.addRelation(relation)));
    })).then(() => jsonFile);
}
To be clear about how this works: it runs all the addVertex() calls for one annotation in parallel, waits for them all to finish, then runs all the addRelation() calls for that annotation in parallel and waits for those to finish. It also runs all the annotations themselves in parallel. So this isn't much actual sequencing except within an annotation, but you accepted an answer that has this same sequencing and said it works, so I show a slightly simpler version of it for completeness.
If you really need to sequence each individual addVertex() call so you don't call the next one until the previous one is done and you're still not going to use a for loop, then you can use the .reduce() promise pattern put into a helper function to manually sequence asynchronous access to an array:
// helper function to sequence asynchronous iteration of an array
// fn returns a promise and is passed an array item as an argument
function sequence(array, fn) {
    return array.reduce((p, item) => {
        return p.then(() => {
            return fn(item);
        });
    }, Promise.resolve());
}

insertKpbDocument(jsonFile) {
    return sequence(jsonFile.doc.annotations, async (annotation) => {
        await sequence(annotation.entities, entity => this.addVertex(entity));
        await sequence(annotation.relations, relation => this.addRelation(relation));
    }).then(() => jsonFile);
}
This will completely sequence everything, in this order:
addVertex(annotation1.entity1)
addVertex(annotation1.entity2)
....
addRelation(annotation1.relation1)
addRelation(annotation1.relation2)
....
addVertex(annotation2.entity1)
....
where it waits for each operation to finish before going on to the next one.
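As a rough, self-contained illustration of that ordering, with hypothetical stand-ins for addVertex()/addRelation() that just delay and log, the sequence() helper could be exercised like this:

// hypothetical stand-in for a real async operation: wait a bit, then log the label
const fakeOp = (label) =>
    new Promise(resolve => setTimeout(() => {
        console.log(label);
        resolve(label);
    }, 50));

const doc = {
    annotations: [
        { entities: ['e1', 'e2'], relations: ['r1'] },
        { entities: ['e3'], relations: ['r2'] }
    ]
};

sequence(doc.annotations, async (annotation) => {
    await sequence(annotation.entities, entity => fakeOp('addVertex ' + entity));
    await sequence(annotation.relations, relation => fakeOp('addRelation ' + relation));
}).then(() => console.log('all done, strictly in order'));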
forEach returns void, so awaiting it will not do much. You can use map to return all the promises you currently create in the forEach, and use Promise.all to await them all:
async insertKpbDocument(jsonFile: { doc: { annotations: Array<{ entities: Array<{}>, relations: Array<{}> }> } }) {
    await Promise.all(jsonFile.doc.annotations.map(async (annotation) => {
        await Promise.all(annotation.entities.map(async (entity) => {
            await this.addVertex(entity);
        }));
        await Promise.all(annotation.relations.map(async (relation) => {
            await this.addRelation(relation);
        }));
    }));
    return jsonFile;
}
I understand you can run all the addVertex calls concurrently. By combining reduce with map, split into two different sets of promises, you can do it. My idea:
const first = jsonFile.doc.annotations.reduce((acc, annotation) => {
    // arrow functions keep `this` bound correctly inside addVertex/addRelation
    acc = acc.concat(annotation.entities.map(entity => this.addVertex(entity)));
    return acc;
}, []);
await Promise.all(first);

const second = jsonFile.doc.annotations.reduce((acc, annotation) => {
    acc = acc.concat(annotation.relations.map(relation => this.addRelation(relation)));
    return acc;
}, []);
await Promise.all(second);
You end up with more loops, but I think it does what you need.
forEach executes the callback against each element in the array and does not wait for anything. Using await is basically sugar for writing promise.then() and nesting everything that follows in the then() callback. But forEach doesn't return a promise, so await arr.forEach() is meaningless. The only reason it isn't a compile error is because the async/await spec says you can await anything, and if it isn't a promise you just get its value... forEach just gives you void.
If you want something to happen in sequence you can await in a for loop:
for (let i = 0; i < jsonFile.doc.annotations.length; i++) {
    const annotation = jsonFile.doc.annotations[i];
    for (let j = 0; j < annotation.entities.length; j++) {
        const entity = annotation.entities[j];
        await this.addVertex(entity);
    }
}
// code here executes after all vertices have been added, in order
Edit: while I was typing this, a couple of other answers and comments appeared... you don't want to use a for loop, and you can use Promise.all, but there may still be some confusion, so I'll leave the above explanation in case it helps.
async/await does not work within forEach.
A simple solution: Replace .forEach() with for(.. of ..) instead.
Details in this similar question.
If the no-iterator linting rule is enabled, you will get a linting warning/error for using for(.. of ..). There is plenty of discussion and many opinions on this topic.
IMHO, this is a scenario where we can suppress the warning with eslint-disable-next-line, or disable the rule for the method/class.
Example:
const insertKpbDocument = async (jsonFile) => {
    for (const annotation of jsonFile.doc.annotations) {
        // eslint-disable-next-line no-iterator
        for (const entity of annotation.entities) {
            await this.addVertex(entity)
        }
        // eslint-disable-next-line no-iterator
        for (const relation of annotation.relations) {
            await this.addRelation(relation)
        }
    }
    return jsonFile
}
The code is very readable and works as expected. To get similar functionality with .forEach(), we would need some promise/observable acrobatics that I think are a waste of effort.
How to chain and share prior results with Promises
I'm new to promises. I am trying to use promises to send queries to a MySQL DB. After some queries I will use the result from the query, do some calculations, and then use the output as parameters of the next query. It looks like the following:
firstQuery().then(secondQuery).then(thirdQuery).then(fourthQuery)...
Say, in the fourthQuery, I need to use results coming from firstQuery and secondQuery, and will have some additional calculations. How should I do that?
I know I can get the result from the previous promise by passing a parameter to the function:
then(thirdQuery).then(cal("I can only get output from thirdQuery here")).then(fourthQuery("pass output from cal"))
In this case, I don't see any advantage of promises over callbacks, because I can always write a function to simplify the repeated callbacks.
If you can rewrite firstQuery, secondQuery, etc., you can do something like this:
function firstQuery(allResult = {}) {
    return doTheQuery()
        .then(result => {
            allResult.first = result;
            return allResult;   // pass the accumulator along the chain
        });
}

function secondQuery(allResult = {}) {
    return doTheQuery()
        .then(result => {
            allResult.second = result;
            return allResult;
        });
}

function thirdQuery(allResult = {}) {
    return doTheQuery()
        .then(result => {
            allResult.third = result;
            return allResult;
        });
}

function fourthQuery(allResult = {}) {
    return doTheQuery(allResult.first, allResult.second)
        .then(result => {
            allResult.fourth = result;
            return allResult;
        });
}
then you can write:
firstQuery()
.then(secondQuery)
.then(thirdQuery)
.then(fourthQuery)
.then ...
The final result will be an object with the values of all the queries in its {first, second, third, fourth} properties.
Of course, if you want just the fourth query result:
firstQuery()
.then(secondQuery)
.then(thirdQuery)
.then(fourthQuery)
.then(result => result.fourth)
.then ...
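For completeness, doTheQuery above stands in for whatever actually runs your MySQL query and returns a promise. A purely hypothetical stub, just for trying out the pattern, could be:

// hypothetical stand-in: resolves with a fake row set after a short delay
function doTheQuery(/* optional values from earlier results */) {
    return new Promise(resolve => {
        setTimeout(() => resolve({ rows: [] }), 100);
    });
}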
In this quite common case, you can store the result of each then in variables outside the promise chain; they will then be accessible in the remaining then blocks.
For example:
function first() { ... }
function second() { ... }
function third() { ... }
function fourth(firstValue, secondValue) { ... }

var firstResponse, secondResponse;

first()
    .then(function(_firstResponse) {
        firstResponse = _firstResponse;
        return second();
    })
    .then(function(_secondResponse) {
        secondResponse = _secondResponse;
        return third();
    })
    .then(function() {
        return fourth(firstResponse, secondResponse);
    });
Usually you should think of promises as values and dependencies rather than a way to control the execution flow. One solid option is organizing your code something like this:
var firstResult = firstQuery();
var secondResult = firstResult.then(secondQuery);
var thirdResult = secondResult.then(thirdQuery);
var fourthResult = Promise.join(firstResult, secondResult, fourthQuery);
Basically, the key here is knowing about the join method of the Bluebird library (other libraries offer something equivalent). An additional benefit is that I find this kind of code much less error-prone than code that mixes raw variables and promise variables.
Also note that this is possible because it's totally fine to call then on the same promise multiple times.
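If you are not using Bluebird, the same idea works with plain Promise.all (a sketch, assuming each query function returns a promise):

var firstResult = firstQuery();
var secondResult = firstResult.then(secondQuery);
var thirdResult = secondResult.then(thirdQuery);
// wait for the first two results, then feed both into fourthQuery
var fourthResult = Promise.all([firstResult, secondResult])
    .then(function (results) {
        return fourthQuery(results[0], results[1]);
    });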
I want to do some preparation work, and my other work should start only after all of it is done, so I run the preparation tasks through Q.all. But some of the tasks are async, and that's where I'm stuck.
Maybe my code will make it clearer. In this simple example I want to do this:
call foo2 for each item in the array
in foo2, wait 10 * a ms (standing in for some real work), then update res.
I want console.log(res) to run only after every foo2 call has finished, meaning all the waiting is over and every item has been added to res. So in my example, res should end up as 6.
Here is the code
var Q = require("q");
var res = 0;
function foo(a) {
    res += a;
}

function foo2(a) {
    // a simple simulation of my situation, not exactly what I am doing;
    // in short, changing this method to be synchronous would be difficult
    return Q.delay(10 * a).then(function() {
        res += a;
    });
}
// Q.all([1, 2, 3].map(foo)).done(); // yes, this is what I want; this logs 6
// however, in my real situation the work is an async function such as foo2, not a sync method
Q.all([1, 2, 3].map(function(a) {
    return foo2(a);
})).done();
console.log(res); // I want 6 instead of 0
You are mixing sync and async styles of programming.
In this case, your console.log statement runs before any promise has had time to fulfill (before res was modified by them), as it is not inside a promise block.
See below how console.log runs only after the promises have resolved:
var Q = require("q"),
    res = 0;

function foo(a) { res += a; }

function foo2(a) {
    return Q
        .delay(10 * a)
        .then(function() { res += a; });
}

Q.all( [1, 2, 3].map(function(a) { return foo2(a); }) )
    .then(function(){ console.log(res) })
    .done();
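For what it's worth, the same idea also works without Q, using native promises and async/await (a small sketch with a hypothetical delay helper):

let res = 0;
const delay = ms => new Promise(resolve => setTimeout(resolve, ms));

async function foo2(a) {
    await delay(10 * a); // stand-in for the real async work
    res += a;
}

Promise.all([1, 2, 3].map(foo2))
    .then(() => console.log(res)); // 6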
var a = [0,1,2,3];
How do I pass the value of getVal to prod and tst?
function startFunc(){
    var deferred = Q.resolve();
    var a = [0,1,2,3];
    a.forEach(function (num, i) {
        var getVal = num;
        deferred = deferred
            .then(function(num){
                var def = Q.defer();
                console.log("\n\niteration:"+i + " a: "+getVal);
                return def.resolve(getVal);
            }).then(prod)
            .then(tst)
            .then(compare)
            .catch(function(error){
                console.log(error);
            });
    });
    return deferred.promise;
}
Here is a Node.js fiddle link. Go to the link and press shift+enter to execute.
https://tonicdev.com/pratikgala/5637ca07a6dfbf0c0043d7f9
When this is executed I want to pass the value of getVal to prod as a promise.
How do I do that? When I run this function, getVal is not returned to prod.
You can significantly simplify your loop, stop using the deferred anti-pattern and get the val passed to prod() like this:
function startFunc() {
    var a = [0, 1, 2, 3];
    return a.reduce(function(p, val, i) {
        return p.then(function() {
            return prod(val);
        }).then(tst).then(compare).catch(function (error) {
            console.log(error);
        });
    }, Q());
}

startFunc().then(function(finalVal) {
    // completed successfully here
}, function(err) {
    // error here
});
This is designed to iterate through the items in the array, passing each one to prod() and running a chain of promises on each item, one after the other, so that the whole promise chain for the first item in the array finishes before the next item starts processing. The final value will be a promise that resolves to whatever the last step in the .reduce() loop returns. If you wish to accumulate an array of return values, that can be done too; see the sketch below.
FYI, you can read about avoiding the deferred anti-pattern here.
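For example, collecting the compare() result for every item into an array could look roughly like this (a sketch building on the same reduce pattern; prod, tst and compare are the functions from the question):

function startFuncCollect() {
    var a = [0, 1, 2, 3];
    var results = [];
    return a.reduce(function(p, val) {
        return p.then(function() {
            return prod(val);
        }).then(tst).then(compare).then(function(res) {
            results.push(res); // remember this item's final value
        });
    }, Q()).then(function() {
        return results; // resolves with one entry per item, in order
    });
}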
I have seen Chaining an arbitrary number of promises in Q ; my question is different.
How can I make a variable number of calls, each of which returns asynchronously, in order?
The scenario is a set of HTTP requests, the number and type of which is determined by the results of the first HTTP request.
I'd like to do this simply.
I have also seen this answer which suggests something like this:
var q = require('q'),
    itemsToProcess = ["one", "two", "three", "four", "five"];

function getDeferredResult(prevResult) {
    return (function (someResult) {
        var deferred = q.defer();

        // any async function (setTimeout for now will do, $.ajax() later)
        setTimeout(function () {
            var nextResult = (someResult || "Initial_Blank_Value ") + ".." + itemsToProcess[0];
            itemsToProcess = itemsToProcess.splice(1);
            console.log("tick", nextResult, "Array:", itemsToProcess);
            deferred.resolve(nextResult);
        }, 600);

        return deferred.promise;
    }(prevResult));
}

var chain = q.resolve("start");
for (var i = itemsToProcess.length; i > 0; i--) {
    chain = chain.then(getDeferredResult);
}
...but it seems awkward to loop through the itemsToProcess in that way. Or to define a new function called "loop" that abstracts the recursion. What's a better way?
There's a nice clean way to do this with [].reduce.
var chain = itemsToProcess.reduce(function (previous, item) {
    return previous.then(function (previousValue) {
        // do what you want with previous value
        // return your async operation
        return Q.delay(100);
    });
}, Q.resolve(/* set the first "previousValue" here */));

chain.then(function (lastResult) {
    // ...
});
reduce iterates through the array, passing in the returned value of the previous iteration. In this case you're returning promises, and so each time you are chaining a then. You provide an initial promise (as you did with q.resolve("start")) to kick things off.
At first it can take a while to wrap your head around what's going on here, but if you take a moment to work through it, it's an easy pattern to use anywhere, without having to set up any machinery.
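As a concrete (hypothetical) version of the same pattern, using the itemsToProcess array from the question and Q.delay standing in for the real async call:

var Q = require('q'),
    itemsToProcess = ["one", "two", "three", "four", "five"];

var chain = itemsToProcess.reduce(function (previous, item) {
    return previous.then(function (previousValue) {
        console.log("previous gave:", previousValue, "- now processing:", item);
        // stand-in for the real async work ($.ajax() later)
        return Q.delay(100).then(function () { return item; });
    });
}, Q.resolve("start"));

chain.then(function (lastResult) {
    console.log("done; the last result was:", lastResult);
});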
I like this way better:
var q = require('q'),
    itemsToProcess = ["one", "two", "three", "four", "five"];

function getDeferredResult(a) {
    return (function (items) {
        var deferred;

        // end
        if (items.length === 0) {
            return q.resolve(true);
        }

        deferred = q.defer();

        // any async function (setTimeout for now will do, $.ajax() later)
        setTimeout(function () {
            var a = items[0];
            console.log(a);
            // pop one item off the array of workitems
            deferred.resolve(items.splice(1));
        }, 600);

        return deferred.promise.then(getDeferredResult);
    }(a));
}

q.resolve(itemsToProcess)
    .then(getDeferredResult);
The key here is to call .then() on deferred.promise and to resolve the deferred with a spliced version of the array of work items. That .then() runs after the initial deferred promise resolves, which happens inside the setTimeout callback. In a more realistic scenario, the deferred promise would be resolved in the HTTP client callback.
The initial q.resolve(itemsToProcess) kicks things off by passing in the work items to the first call of the work fn.
I added this in hopes it would help others.
Here is a concept of a state machine defined with Q.
Suppose you have the HTTP function defined, so it returns a Q promise object:
var Q_http = function (url, options) {
return Q.when($.ajax(url, options));
}
You can define a recursive function nextState as following:
var states = [...]; // an array of states in the system.

// this is a state machine to control what url to get data from
// at the current state
function nextState(current) {
    if (is_terminal_state(current))
        return Q(true);
    return Q_http(current.url, current.data).then(function (result) {
        var next = process(current, result);
        return nextState(next);
    });
}
Where process(current, result) is a function that works out the next state from the current state and the result of the HTTP call.
When you use it, use it like:
nextState(initial).then(function () {
    // all requests are successful.
}, function (reason) {
    // for some unexpected reason the request sequence fails in the middle.
});
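To make the sketch concrete, is_terminal_state and process are left up to you; purely hypothetical versions might look like this:

// hypothetical helpers, purely illustrative
function is_terminal_state(state) {
    // stop when the previous response gave us no further request to make
    return !state || !state.url;
}

function process(current, result) {
    // derive the next request (state) from the server's response
    return { url: result.nextUrl, data: result.nextData };
}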
I propose another solution, which looks easier to understand to me.
You do the same as you would when chaining promises directly:
promise.then(doSomethingFunction).then(doAnotherThingFunction);
If we put that into a loop, we get this:
var functionToCall = function(arg1, arg2, resultFromPreviousPromise) {
    // ... do the work for this step here, optionally returning a promise
};

var chain = Q.when();
for (...) {
    chain = chain.then(functionToCall.bind(this, arg1, arg2));
}

chain.then(function() {
    console.log("whole chain resolved");
});
We use function currying (partial application via bind) to supply multiple arguments. In our example, functionToCall.bind(this, arg1, arg2) returns a function that takes one argument: functionToCall(resultFromPreviousPromise). You do not have to use the result from the previous promise if you don't need it.
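As a concrete (hypothetical) version that iterates over an array of items and binds each one into the chain:

var Q = require('q');

var functionToCall = function (item, resultFromPreviousPromise) {
    console.log("previous step gave:", resultFromPreviousPromise, "- now handling:", item);
    return Q.delay(100).then(function () { return item; });
};

var items = ["a", "b", "c"];
var chain = Q.when();
items.forEach(function (item) {
    // bind fixes `item` as the first argument; the previous result arrives last
    chain = chain.then(functionToCall.bind(null, item));
});

chain.then(function () {
    console.log("whole chain resolved");
});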