I have restructured my code to promises, and built a wonderful long flat promise chain, consisting of multiple .then() callbacks. In the end I want to return some composite value, and need to access multiple intermediate promise results. However, the resolution values from the middle of the sequence are not in scope in the last callback. How do I access them?
function getExample() {
    return promiseA(…).then(function(resultA) {
        // Some processing
        return promiseB(…);
    }).then(function(resultB) {
        // More processing
        return // How do I gain access to resultA here?
    });
}
Break the chain
When you need to access the intermediate values in your chain, you should split your chain apart into the single pieces that you need. Instead of attaching one callback and somehow trying to use its parameter multiple times, attach multiple callbacks to the same promise - wherever you need the result value. Don't forget, a promise just represents (proxies) a future value! Besides deriving one promise from another in a linear chain, use the promise combinators that your library gives you to build the result value.
This will result in a very straightforward control flow, clear composition of functionalities and therefore easy modularisation.
function getExample() {
    var a = promiseA(…);
    var b = a.then(function(resultA) {
        // some processing
        return promiseB(…);
    });
    return Promise.all([a, b]).then(function([resultA, resultB]) {
        // more processing
        return // something using both resultA and resultB
    });
}
Instead of the parameter destructuring in the callback after Promise.all, which only became available with ES6, in ES5 the then call would be replaced by a nifty helper method that many promise libraries (Q, Bluebird, when, …) provided: .spread(function(resultA, resultB) { ….
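For illustration, a minimal sketch of the same chain using Bluebird's .spread() instead of destructuring, reusing the a and b promises from the example above:
return Promise.all([a, b]).spread(function(resultA, resultB) {
    // more processing
    return // something using both resultA and resultB
});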
Bluebird also features a dedicated join function to replace that Promise.all+spread combination with a simpler (and more efficient) construct:
…
return Promise.join(a, b, function(resultA, resultB) { … });
ECMAScript Harmony
Of course, this problem was recognized by the language designers as well. They did a lot of work and the async functions proposal finally made it into
ECMAScript 8
You don't need a single then invocation or callback function anymore, as in an asynchronous function (that returns a promise when being called) you can simply wait for promises to resolve directly. It also features arbitrary control structures like conditions, loops and try-catch-clauses, but for the sake of convenience we don't need them here:
async function getExample() {
    var resultA = await promiseA(…);
    // some processing
    var resultB = await promiseB(…);
    // more processing
    return // something using both resultA and resultB
}
ECMAScript 6
While we were waiting for ES8, we already used a very similar kind of syntax. ES6 came with generator functions, which allow breaking the execution apart into pieces at arbitrarily placed yield keywords. Those slices can be run after each other - independently, even asynchronously - and that's just what we do when we want to wait for a promise resolution before running the next step.
There are dedicated libraries (like co or task.js), but also many promise libraries have helper functions (Q, Bluebird, when, …) that do this async step-by-step execution for you when you give them a generator function that yields promises.
var getExample = Promise.coroutine(function* () {
    //                   ^^^^^^^^^ Bluebird syntax
    var resultA = yield promiseA(…);
    // some processing
    var resultB = yield promiseB(…);
    // more processing
    return // something using both resultA and resultB
});
This has worked in Node.js since version 4.0; a few browsers (or their dev editions) also supported generator syntax relatively early.
ECMAScript 5
However, if you want/need to be backward-compatible you cannot use those without a transpiler. Both generator functions and async functions are supported by the current tooling, see for example the documentation of Babel on generators and async functions.
And then, there are also many other compile-to-JS languages that are dedicated to easing asynchronous programming. They usually use a syntax similar to await (e.g. Iced CoffeeScript), but there are also others that feature a Haskell-like do-notation (e.g. LatteJs, monadic, PureScript or LispyScript).
Synchronous inspection
Assign promises for later-needed values to variables and then get their values via synchronous inspection. The example uses Bluebird's .value() method, but many libraries provide a similar method.
function getExample() {
    var a = promiseA(…);
    return a.then(function() {
        // some processing
        return promiseB(…);
    }).then(function(resultB) {
        // a is guaranteed to be fulfilled here so we can just retrieve its
        // value synchronously
        var aValue = a.value();
    });
}
This can be used for as many values as you like:
function getExample() {
    var a = promiseA(…);
    var b = a.then(function() {
        return promiseB(…);
    });
    var c = b.then(function() {
        return promiseC(…);
    });
    var d = c.then(function() {
        return promiseD(…);
    });
    return d.then(function() {
        return a.value() + b.value() + c.value() + d.value();
    });
}
Nesting (and) closures
Using closures for maintaining the scope of variables (in our case, the success callback function parameters) is the natural JavaScript solution. With promises, we can arbitrarily nest and flatten .then() callbacks - they are semantically equivalent, except for the scope of the inner one.
function getExample() {
    return promiseA(…).then(function(resultA) {
        // some processing
        return promiseB(…).then(function(resultB) {
            // more processing
            return // something using both resultA and resultB;
        });
    });
}
Of course, this is building an indentation pyramid. If the indentation is getting too large, you can still apply the old tools to counter the pyramid of doom: modularize, use extra named functions, and flatten the promise chain as soon as you don't need a variable any more.
In theory, you can always avoid more than two levels of nesting (by making all closures explicit); in practice, use as many as are reasonable.
function getExample() {
    // preprocessing
    return promiseA(…).then(makeAhandler(…));
}

function makeAhandler(…) {
    return function(resultA) {
        // some processing
        return promiseB(…).then(makeBhandler(resultA, …));
    };
}

function makeBhandler(resultA, …) {
    return function(resultB) {
        // more processing
        return // anything that uses the variables in scope
    };
}
You can also use helper functions for this kind of partial application, like _.partial from Underscore/lodash or the native .bind() method, to further decrease indentation:
function getExample() {
    // preprocessing
    return promiseA(…).then(handlerA);
}

function handlerA(resultA) {
    // some processing
    return promiseB(…).then(handlerB.bind(null, resultA));
}

function handlerB(resultA, resultB) {
    // more processing
    return // anything that uses resultA and resultB
}
Explicit pass-through
Similar to nesting the callbacks, this technique relies on closures. Yet, the chain stays flat - instead of passing only the latest result, some state object is passed for every step. These state objects accumulate the results of the previous actions, handing down all values that will be needed later again plus the result of the current task.
function getExample() {
    return promiseA(…).then(function(resultA) {
        // some processing
        return promiseB(…).then(b => [resultA, b]); // function(b) { return [resultA, b] }
    }).then(function([resultA, resultB]) {
        // more processing
        return // something using both resultA and resultB
    });
}
Here, that little arrow b => [resultA, b] is the function that closes over resultA and passes an array of both results to the next step, which uses parameter destructuring syntax to break it up into single variables again.
Before destructuring became available with ES6, a nifty helper method called .spread() was provided by many promise libraries (Q, Bluebird, when, …). It takes a function with multiple parameters - one for each array element - to be used as .spread(function(resultA, resultB) { ….
Of course, that closure needed here can be further simplified by some helper functions, e.g.
function addTo(x) {
    // imagine complex `arguments` fiddling or anything that helps usability
    // but you get the idea with this simple one:
    return res => [x, res];
}

…

return promiseB(…).then(addTo(resultA));
Alternatively, you can employ Promise.all to produce the promise for the array:
function getExample() {
    return promiseA(…).then(function(resultA) {
        // some processing
        return Promise.all([resultA, promiseB(…)]); // resultA will implicitly be wrapped
                                                    // as if passed to Promise.resolve()
    }).then(function([resultA, resultB]) {
        // more processing
        return // something using both resultA and resultB
    });
}
And you might not only use arrays, but arbitrarily complex objects. For example, with _.extend or Object.assign in a different helper function:
function augment(obj, name) {
    return function (res) { var r = Object.assign({}, obj); r[name] = res; return r; };
}

function getExample() {
    return promiseA(…).then(function(resultA) {
        // some processing
        return promiseB(…).then(augment({resultA}, "resultB"));
    }).then(function(obj) {
        // more processing
        return // something using both obj.resultA and obj.resultB
    });
}
While this pattern guarantees a flat chain and explicit state objects can improve clarity, it will become tedious for a long chain. Especially when you need the state only sporadically, you still have to pass it through every step. With this fixed interface, the single callbacks in the chain are rather tightly coupled and inflexible to change. It makes factoring out single steps harder, and callbacks cannot be supplied directly from other modules - they always need to be wrapped in boilerplate code that cares about the state. Abstract helper functions like the above can ease the pain a bit, but it will always be present.
Mutable contextual state
The trivial (but inelegant and rather error-prone) solution is to just use higher-scope variables (to which all callbacks in the chain have access) and write result values to them when you get them:
function getExample() {
    var resultA;
    return promiseA(…).then(function(_resultA) {
        resultA = _resultA;
        // some processing
        return promiseB(…);
    }).then(function(resultB) {
        // more processing
        return // something using both resultA and resultB
    });
}
Instead of many variables one might also use an (initially empty) object, on which the results are stored as dynamically created properties.
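For illustration, a minimal sketch of that variant, following the same pseudocode conventions as the examples above:
function getExample() {
    var results = {}; // initially empty state object
    return promiseA(…).then(function(resultA) {
        results.a = resultA; // dynamically created property
        // some processing
        return promiseB(…);
    }).then(function(resultB) {
        // more processing
        return // something using both results.a and resultB
    });
}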
This solution has several drawbacks:
Mutable state is ugly, and global variables are evil.
This pattern doesn't work across function boundaries; modularising the functions is harder, as their declarations must not leave the shared scope.
The scope of the variables does not prevent accessing them before they are initialized. This is especially likely for complex promise constructions (loops, branching, exceptions) where race conditions might happen. Passing state explicitly, a declarative design that promises encourage, forces a cleaner coding style which can prevent this.
One must choose the scope for those shared variables correctly. It needs to be local to the executed function to prevent race conditions between multiple parallel invocations, as would be the case if, for example, state was stored on an instance.
The Bluebird library encourages the use of an object that is passed along, using their bind() method to assign a context object to a promise chain. It will be accessible from each callback function via the otherwise unusable this keyword. While object properties are more prone to undetected typos than variables, the pattern is quite clever:
function getExample() {
    return promiseA(…)
    .bind({}) // Bluebird only!
    .then(function(resultA) {
        this.resultA = resultA;
        // some processing
        return promiseB(…);
    }).then(function(resultB) {
        // more processing
        return // something using both this.resultA and resultB
    }).bind(); // don't forget to unbind the object if you don't want the
               // caller to access it
}
This approach can be easily simulated in promise libraries that do not support .bind, although in a somewhat more verbose way, and it cannot be used in an expression:
function getExample() {
    var ctx = {};
    return promiseA(…)
    .then(function(resultA) {
        this.resultA = resultA;
        // some processing
        return promiseB(…);
    }.bind(ctx)).then(function(resultB) {
        // more processing
        return // something using both this.resultA and resultB
    }.bind(ctx));
}
A less harsh spin on "Mutable contextual state"
Using a locally scoped object to collect the intermediate results in a promise chain is a reasonable approach to the question you posed. Consider the following snippet:
function getExample() {
    // locally scoped
    const results = {};
    return promiseA(paramsA).then(function(resultA) {
        results.a = resultA;
        return promiseB(paramsB);
    }).then(function(resultB) {
        results.b = resultB;
        return promiseC(paramsC);
    }).then(function(resultC) {
        // Resolve with composite of all promises
        return Promise.resolve(results.a + results.b + resultC);
    }).catch(function(error) {
        return Promise.reject(error);
    });
}
Global variables are bad, so this solution uses a locally scoped variable which causes no harm. It is only accessible within the function.
Mutable state is ugly, but this does not mutate state in an ugly manner. The ugly mutable state traditionally refers to modifying the state of function arguments or global variables, but this approach simply modifies the state of a locally scoped variable that exists for the sole purpose of aggregating promise results...a variable that will die a simple death once the promise resolves.
Intermediate promises are not prevented from accessing the state of the results object, but this does not introduce some scary scenario where one of the promises in the chain will go rogue and sabotage your results. The responsibility of setting the values in each step of the promise is confined to this function and the overall result will either be correct or incorrect...it will not be some bug that will crop up years later in production (unless you intend it to!)
This does not introduce a race condition scenario that would arise from parallel invocation because a new instance of the results variable is created for every invocation of the getExample function.
Example is available on jsfiddle
Node 7.4 now supports async/await calls with the harmony flag.
Try this:
async function getExample() {
    let response = await returnPromise();
    let response2 = await returnPromise2();
    console.log(response, response2)
}

getExample()
and run the file with:
node --harmony-async-await getExample.js
Simple as can be!
Another answer, using babel-node version <6
Using async - await
npm install -g babel#5.6.14
example.js:
async function getExample() {
    let response = await returnPromise();
    let response2 = await returnPromise2();
    console.log(response, response2)
}

getExample()
Then, run babel-node example.js and voila!
These days I have also run into some questions like yours. In the end I found a good solution for it; it's simple and easy to read. I hope this can help you.
According to how-to-chain-javascript-promises
ok, let's look at the code:
const firstPromise = () => {
    return new Promise((resolve, reject) => {
        setTimeout(() => {
            console.log('first promise is completed');
            resolve({data: '123'});
        }, 2000);
    });
};

const secondPromise = (someStuff) => {
    return new Promise((resolve, reject) => {
        setTimeout(() => {
            console.log('second promise is completed');
            resolve({newData: `${someStuff.data} some more data`});
        }, 2000);
    });
};

const thirdPromise = (someStuff) => {
    return new Promise((resolve, reject) => {
        setTimeout(() => {
            console.log('third promise is completed');
            resolve({result: someStuff});
        }, 2000);
    });
};

firstPromise()
    .then(secondPromise)
    .then(thirdPromise)
    .then(data => {
        console.log(data);
    });
I am not going to use this pattern in my own code since I'm not a big fan of using global variables. However, in a pinch it will work.
User is a promisified Mongoose model.
var globalVar = '';

User.findAsync({}).then(function(users) {
    globalVar = users;
}).then(function() {
    console.log(globalVar);
});
Another answer, using sequential executor nsynjs:
function getExample() {
    var response1 = returnPromise1().data;
    // promise1 is resolved at this point, '.data' has the result from resolve(result)
    var response2 = returnPromise2().data;
    // promise2 is resolved at this point, '.data' has the result from resolve(result)
    console.log(response1, response2);
}

nsynjs.run(getExample, {}, function() {
    console.log('all done');
})
Update: added working example
function synchronousCode() {
    var urls = [
        "https://ajax.googleapis.com/ajax/libs/jquery/1.7.0/jquery.min.js",
        "https://ajax.googleapis.com/ajax/libs/jquery/1.8.0/jquery.min.js",
        "https://ajax.googleapis.com/ajax/libs/jquery/1.9.0/jquery.min.js"
    ];
    for (var i = 0; i < urls.length; i++) {
        var len = window.fetch(urls[i]).data.text().data.length;
        //                              ^           ^
        //                              |           +-- 2-nd promise result
        //                              |               assigned to 'data'
        //                              |
        //                              +-- 1-st promise result assigned to 'data'
        //
        console.log('URL #' + i + ' : ' + urls[i] + ", length: " + len);
    }
}

nsynjs.run(synchronousCode, {}, function() {
    console.log('all done');
})
<script src="https://rawgit.com/amaksr/nsynjs/master/nsynjs.js"></script>
When using bluebird, you can use .bind method to share variables in promise chain:
somethingAsync().bind({})
    .spread(function (aValue, bValue) {
        this.aValue = aValue;
        this.bValue = bValue;
        return somethingElseAsync(aValue, bValue);
    })
    .then(function (cValue) {
        return this.aValue + this.bValue + cValue;
    });
please check this link for further information:
http://bluebirdjs.com/docs/api/promise.bind.html
function getExample() {
    var retA, retB;
    return promiseA(…).then(function(resultA) {
        retA = resultA;
        // Some processing
        return promiseB(…);
    }).then(function(resultB) {
        // More processing
        // retA is the value of promiseA
        return // How do I gain access to resultA here?
    });
}
easy way :D
I think you can use RSVP.hash.
Something like below:
const mainPromise = () => {
    const promise1 = new Promise((resolve, reject) => {
        setTimeout(() => {
            console.log('first promise is completed');
            resolve({data: '123'});
        }, 2000);
    });

    const promise2 = new Promise((resolve, reject) => {
        setTimeout(() => {
            console.log('second promise is completed');
            resolve({data: '456'});
        }, 2000);
    });

    return RSVP.hash({
        prom1: promise1,
        prom2: promise2
    });
};

mainPromise()
    .then(data => {
        console.log(data.prom1);
        console.log(data.prom2);
    });
Solution:
You can put intermediate values in scope in any later 'then' function explicitly, by using 'bind'. It is a nice solution that doesn't require changing how Promises work, and only requires a line or two of code to propagate the values just like errors are already propagated.
Here is a complete example:
// Get info asynchronously from a server
function pGetServerInfo()
{
    // then value: "server info"
} // pGetServerInfo

// Write into a file asynchronously
function pWriteFile(path, string)
{
    // no then value
} // pWriteFile

// The heart of the solution: Write formatted info into a log file asynchronously,
// using the pGetServerInfo and pWriteFile operations
function pLogInfo(localInfo)
{
    var scope = {localInfo: localInfo};       // Create an explicit scope object
    var thenFunc = p2.bind(scope);            // Create a temporary function with this scope
    return (pGetServerInfo().then(thenFunc)); // Do the next 'then' in the chain
} // pLogInfo

// Scope of this 'then' function is {localInfo: localInfo}
function p2(serverInfo)
{
    // Do the final 'then' in the chain: Writes "local info, server info"
    return pWriteFile('log', this.localInfo + ',' + serverInfo);
} // p2
This solution can be invoked as follows:
pLogInfo("local info").then().catch(err);
(Note: a more complex and complete version of this solution has been tested, but not this example version, so it could have a bug.)
What I have learned about promises is to use them only as return values and to avoid referencing them where possible. The async/await syntax is particularly practical for that. Today all the latest browsers and Node support it: https://caniuse.com/#feat=async-functions . It is simple behavior and the code reads like synchronous code; forget about callbacks...
The case where I do need to reference a promise is when creation and resolution happen at independent/unrelated places. Then, instead of an artificial association and probably an event listener just to resolve the "distant" promise, I prefer to expose the promise as a Deferred, which the following code implements in valid ES5:
/**
 * Promise-like object that allows resolving its promise from outside code. Example:
 *
 * ```
 * class Api {
 *   fooReady = new Deferred<Data>()
 *   private knower() {
 *     inOtherMoment(data => {
 *       this.fooReady.resolve(data)
 *     })
 *   }
 * }
 * ```
 */
var Deferred = /** @class */ (function () {
    function Deferred(callback) {
        var instance = this;
        this.resolve = null;
        this.reject = null;
        this.status = 'pending';
        this.promise = new Promise(function (resolve, reject) {
            instance.resolve = function () { this.status = 'resolved'; resolve.apply(this, arguments); };
            instance.reject = function () { this.status = 'rejected'; reject.apply(this, arguments); };
        });
        if (typeof callback === 'function') {
            callback.call(this, this.resolve, this.reject);
        }
    }
    Deferred.prototype.then = function (resolve) {
        return this.promise.then(resolve);
    };
    Deferred.prototype.catch = function (r) {
        return this.promise.catch(r);
    };
    return Deferred;
}());
Transpiled from a TypeScript project of mine:
https://github.com/cancerberoSgx/misc-utils-of-mine/blob/2927c2477839f7b36247d054e7e50abe8a41358b/misc-utils-of-mine-generic/src/promise.ts#L31
For more complex cases I often use this guy's small promise utilities, which have no dependencies and are tested and typed. p-map has been useful several times. I think he has covered most use cases:
https://github.com/sindresorhus?utf8=%E2%9C%93&tab=repositories&q=promise&type=source&language=
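For illustration only, a rough usage sketch of p-map; the urls array and fetchUrl mapper are made-up names, and the concurrency option limits how many mappers run at once:
import pMap from 'p-map';

async function example() {
    // hypothetical input and mapper, just to show the shape of the API
    const urls = ['https://example.com/a', 'https://example.com/b'];
    const fetchUrl = url => fetch(url).then(res => res.text());

    const results = await pMap(urls, fetchUrl, { concurrency: 2 });
    console.log(results); // resolved values, in input order
}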
I need to use bluebird in my code and I have no idea how to use it. My code contains nested loops. When the user logs in, my code will run. It will begin to look for any files under the user, and if there are files, it will loop through them to get the names of the files, since the name is stored in a dictionary. Once it gets a name, it will store it in an array. Once all the names are stored, the array will be passed along in res.render().
Here is my code:
router.post('/login', function(req, res) {
    var username = req.body.username;
    var password = req.body.password;
    Parse.User.logIn(username, password, {
        success: function(user) {
            var Files = Parse.Object.extend("File");
            var object = [];
            var query = new Parse.Query(Files);
            query.equalTo("user", Parse.User.current());
            var temp;
            query.find({
                success: function(results) {
                    for (var i = 0; i < results.length; i++) {
                        var file = results[i].toJSON();
                        for (var k in file) {
                            if (k === "javaFile") {
                                for (var t in file[k]) {
                                    if (t === "name") {
                                        temp = file[k][t];
                                        var getname = temp.split("-").pop();
                                        object[i] = getname;
                                    }
                                }
                            }
                        }
                    }
                }
            });
            console.log(object);
            res.render('filename', {title: 'File Name', FIles: object});
            console.log(object);
        },
        error: function(user, error) {
            console.log("Invalid username/password");
            res.render('logins');
        }
    })
});
EDIT: The code doesn't work, because on the first and second console.log(object) I get an empty array. I am supposed to get one item in that array, because I have one file saved.
JavaScript code is all parsed from top to bottom, but it doesn't necessarily execute in that order with asynchronous code. The problem is that you have the log statements inside of the success callback of your login function, but it's NOT inside of the query's success callback.
You have a few options:
Move the console.log statements inside of the inner success callback so that while they may be parsed at load time, they do not execute until both callbacks have been invoked.
Promisify functions that traditionally rely on and invoke callback functions, and hang then handlers off of the returned value to chain the promises together.
The first option is not using promises at all, but relying solely on callbacks. To flatten your code you will want to promisify the functions and then chain them.
I'm not familiar with the syntax you're using there with the success and error callbacks, nor am I familiar with Parse. Typically you would do something like:
query.find(someArgsHere, function(success, err) {
});
But then you would have to nest another callback inside of that, and another callback inside of that. To "flatten" the pyramid, we make the function return a promise instead, and then we can chain the promises. Assuming that Parse.User.logIn is a callback-style function (as is Parse.Query.find), you might do something like:
var Promise = require('bluebird');
var login = Promise.promisify(Parse.User.logIn);
var find = Promise.promisify(Parse.Query.find);

var outerOutput = [];

return login(yourArgsHere)
    .then(function(user) {
        return find(user.someValue);
    })
    .then(function(results) {
        var innerOutput = [];
        // do something with innerOutput or outerOutput and render it
    });
This should look familiar to synchronous code that you might be used to, except instead of saving the returned value into a variable and then passing that variable to your next function call, you use "then" handlers to chain the promises together. You could either create the entire output variable inside of the second then handler, or you can declare the variable output prior to even starting this promise chain, and then it will be in scope for all of those functions. I have shown you both options above, but obviously you don't need to define both of those variables and assign them values. Just pick the option that suits your needs.
You can also use Bluebird's promisifyAll() function to wrap an entire library with equivalent promise-returning functions. They will all have the same name of the functions in the library suffixed with Async. So assuming the Parse library contains callback-style functions named someFunctionName() and someOtherFunc() you could do this:
var Parse = Promise.promisifyAll(require("Parse"));

var promiseyFunction = function() {
    return Parse.someFunctionNameAsync()
        .then(function(result) {
            return Parse.someOtherFuncAsync(result.someProperty);
        })
        .then(function(otherFuncResult) {
            var something;
            // do stuff to assign a value to something
            return something;
        });
}
I have a few pointers. ... Btw tho, are you trying to use Parse's Promises?
You can get rid of those inner nested loops and a few other changes:
Use some syntax like this to be more elegant:
// Example to filter/retrieve only valid file objs (with dashes in name)
var matchedFiles = results.filter(function _hasJavaFile(item) {
    return item && item.javaFile && item.javaFile.name // NOT NULL
        && item.javaFile.name.indexOf('-') > -1;       // and has a dash
});

// You could use a map function like this to get the files into an array of just their names
var fileNames = matchedFiles.map(function _getJavaFile(item) {
    return item && item.javaFile && item.javaFile.name // NOT NULL
        && item.javaFile.name.split('-')[0];           // RETURN first part of name
});
And here is an example of using Parse's native promises (add the code above at line 4/5 below; note the then() function, that's effectively now your 'callback' handler):
var GameScore = Parse.Object.extend("GameScore");
var query = new Parse.Query(GameScore);
query.select("score", "playerName");
query.find().then(function(results) {
    // each of results will only have the selected fields available.
});
I have a recursive function that asynchronously builds a tree on the server side, and I would like to 'observe' it and have the calling method in Meteor rerun every time there is a change.
I have made a simplified example that builds a tree with a recursive readdir call (in the real application there is a computation that may take several minutes per node, and its results depend on the nodes already explored).
in server/methods.js
var fs = Meteor.npmRequire('fs')
var path = Meteor.npmRequire('path')

var tree = function (dir, r) {
    try {
        fs.readdir(dir, function (error, files) {
            if (files && files.length)
                for (var i = 0; i < files.length; i++) {
                    r[i] = { name: files[i], children: [] }
                    tree(path.resolve(dir, files[i]), r[i].children)
                }
        })
    } catch (e) { console.log("exception", e) }
}

Meteor.methods({
    'build_tree': function () {
        var r = []
        tree("/tmp/", r)
        return r // Wrong !
    }
})
in client/client.js
Meteor.call('build_tree', function (error, result) {
    console.log(error, result)
})
I have already used futures in other parts of the code based on https://www.discovermeteor.com/patterns/5828399.
But in this case I am somehow lost due to
the recursive nature of the server-side code
the fact I want the client-side to update automatically every time the server-side data structure is updated
The only workaround that comes to my mind is to insert progressively the asynchronous results in a 'flat' Mongo collection and reactively rebuild it as a tree on the client side.
I managed to do this by
counting the number of times an asynchronous computation was started or finished
resolving the future only when those numbers are equal
relaunching the function every time an asynchronous computation ends (in case it returned to launch more asynchronous computations or resolve the future)
Future = Meteor.npmRequire('fibers/future')
FS = Meteor.npmRequire('fs')
Path = Meteor.npmRequire('path')

const all_files = []
const future = new Future()
const to_process = [dir]

let started = 0
let ended = 0

const tree = function () {
    while (to_process.length) {
        let dir = to_process.pop()
        started++
        FS.readdir(dir, function (error, files) {
            if (error) {
                if (error.code == 'ENOTDIR') all_files.push(dir)
            }
            else if (files && files.length) {
                for (let i = 0, leni = files.length; i < leni; i++) {
                    let f = Path.resolve(dir, files[i])
                    to_process.push(f)
                }
            }
            ended++
            tree()
        })
    }
    if (!to_process.length && started == ended)
        future['return']()
}

tree()
future.wait()
It doesn't have the "progressive update" feeling that you get by updating the database and letting the reactivity manage it since all computations are waiting for that final Future['return']() but the code is simpler and self-contained.
That would be indeed very complicated. First of all, as your tree code runs async, you need to either provide a success callback, resolve a promise, return a future or something else, so that you can control when the Meteor method returns. Then you need to use Futures to defer the return of the method until you have your result.
But even then I don't see how the server is supposed to know that something has changed.
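As a rough sketch of that first part, using the same fibers/future API as the code above (the callback-taking tree variant is hypothetical; the original tree would have to be changed to signal completion):
Meteor.methods({
    'build_tree': function () {
        var Future = Meteor.npmRequire('fibers/future')
        var future = new Future()
        var r = []
        tree("/tmp/", r, function () { // hypothetical: tree calls back once fully done
            future['return'](r)        // resolve the future with the finished tree
        })
        return future.wait()           // defer the method's return until then
    }
})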
The only workaround that comes to my mind is to insert progressively the asynchronous results in a 'flat' Mongo collection and reactively rebuild it as a tree on the client side.
This is actually a workable straightforward solution.
I have seen Chaining an arbitrary number of promises in Q ; my question is different.
How can I make a variable number of calls, each of which returns asynchronously, in order?
The scenario is a set of HTTP requests, the number and type of which is determined by the results of the first HTTP request.
I'd like to do this simply.
I have also seen this answer which suggests something like this:
var q = require('q'),
    itemsToProcess = ["one", "two", "three", "four", "five"];

function getDeferredResult(prevResult) {
    return (function (someResult) {
        var deferred = q.defer();
        // any async function (setTimeout for now will do, $.ajax() later)
        setTimeout(function () {
            var nextResult = (someResult || "Initial_Blank_Value ") + ".." + itemsToProcess[0];
            itemsToProcess = itemsToProcess.splice(1);
            console.log("tick", nextResult, "Array:", itemsToProcess);
            deferred.resolve(nextResult);
        }, 600);
        return deferred.promise;
    }(prevResult));
}

var chain = q.resolve("start");
for (var i = itemsToProcess.length; i > 0; i--) {
    chain = chain.then(getDeferredResult);
}
...but it seems awkward to loop through the itemsToProcess in that way. Or to define a new function called "loop" that abstracts the recursion. What's a better way?
There's a nice clean way to do this with [].reduce.
var chain = itemsToProcess.reduce(function (previous, item) {
    return previous.then(function (previousValue) {
        // do what you want with previous value
        // return your async operation
        return Q.delay(100);
    })
}, Q.resolve(/* set the first "previousValue" here */));

chain.then(function (lastResult) {
    // ...
});
reduce iterates through the array, passing in the returned value of the previous iteration. In this case you're returning promises, and so each time you are chaining a then. You provide an initial promise (as you did with q.resolve("start")) to kick things off.
At first it can take a while to wrap your head around what's going on here but if you take a moment to work through it then it's an easy pattern to use anywhere, without having to set up any machinery.
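For a concrete (made-up) run with the itemsToProcess array from the question, the same pattern could look like this, with Q.delay standing in for the real async operation:
var Q = require('q');
var itemsToProcess = ["one", "two", "three", "four", "five"];

var chain = itemsToProcess.reduce(function (previous, item) {
    return previous.then(function (previousValue) {
        // pretend this is an HTTP request; here we just append the item after a delay
        return Q.delay(100).then(function () {
            return previousValue + ".." + item;
        });
    });
}, Q.resolve("start"));

chain.then(function (lastResult) {
    console.log(lastResult); // "start..one..two..three..four..five"
});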
I like this way better:
var q = require('q'),
    itemsToProcess = ["one", "two", "three", "four", "five"];

function getDeferredResult(a) {
    return (function (items) {
        var deferred;

        // end
        if (items.length === 0) {
            return q.resolve(true);
        }

        deferred = q.defer();

        // any async function (setTimeout for now will do, $.ajax() later)
        setTimeout(function () {
            var a = items[0];
            console.log(a);
            // pop one item off the array of workitems
            deferred.resolve(items.splice(1));
        }, 600);

        return deferred.promise.then(getDeferredResult);
    }(a));
}

q.resolve(itemsToProcess)
    .then(getDeferredResult);
The key here is to call .then() on the deferred.promise with a spliced version of the array of workitems. This then gets run after the initial deferred promise resolves, which is in the fn for the setTimeout. In a more realistic scenario, the deferred promise would get resolved in the http client callback.
The initial q.resolve(itemsToProcess) kicks things off by passing in the work items to the first call of the work fn.
I added this in hopes it would help others.
Here is a concept of a state machine defined with Q.
Suppose you have the HTTP function defined, so it returns a Q promise object:
var Q_http = function (url, options) {
    return Q.when($.ajax(url, options));
}
You can define a recursive function nextState as following:
var states = [...]; // an array of states in the system.

// this is a state machine to control what url to get data from
// at the current state
function nextState(current) {
    if (is_terminal_state(current))
        return Q(true);
    return Q_http(current.url, current.data).then(function (result) {
        var next = process(current, result);
        return nextState(next);
    });
}
Where function process(current, result) is a function to find out what the next step would be according to the current state and the result from the HTTP call.
When you use it, use it like:
nextState(initial).then(function () {
    // all requests are successful.
}, function (reason) {
    // for some unexpected reason the request sequence fails in the middle.
});
I propose another solution, which looks easier to understand to me.
You do the same as you would when chaining promises directly:
promise.then(doSomethingFunction).then(doAnotherThingFunction);
If we put that into a loop, we get this:
var chain = Q.when();
for (...) {
    chain = chain.then(functionToCall.bind(this, arg1, arg2));
};

chain.then(function() {
    console.log("whole chain resolved");
});

var functionToCall = function(arg1, arg2, resultFromPreviousPromise) {
}
We use function currying to use multiple arguments. In our example, functionToCall.bind(this, arg1, arg2) will return a function with one argument: functionToCall(resultFromPreviousPromise).
You do not need to use the result from the previous promise.
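For illustration, a small sketch of that currying step (functionToCall and the argument names are just the placeholders from above; Q.delay stands in for the real async work):
var functionToCall = function (arg1, arg2, resultFromPreviousPromise) {
    // arg1 and arg2 were fixed up front; the last parameter is filled in by the chain
    console.log(arg1, arg2, resultFromPreviousPromise);
    return Q.delay(100); // return the next async operation
};

var curried = functionToCall.bind(this, "first", "second");
// curried(resultFromPreviousPromise) now only takes the chained value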