Multiple await calls with Node.js (version 7 or above, not C#) [duplicate]

As far as I understand, in ES7/ES2016, putting multiple awaits in code works similarly to chaining .then() with promises, meaning that they will execute one after the other rather than in parallel. So, for example, we have this code:
await someCall();
await anotherCall();
Do I understand it correctly that anotherCall() will be called only when someCall() is completed? What is the most elegant way of calling them in parallel?
I want to use it in Node, so maybe there's a solution with async library?
EDIT: I'm not satisfied with the solution provided in this question: Slowdown due to non-parallel awaiting of promises in async generators, because it uses generators and I'm asking about a more general use case.

You can await on Promise.all():
await Promise.all([someCall(), anotherCall()]);
To store the results:
let [someResult, anotherResult] = await Promise.all([someCall(), anotherCall()]);
Note that Promise.all fails fast, which means that as soon as one of the promises supplied to it rejects, the entire thing rejects.
const happy = (v, ms) => new Promise((resolve) => setTimeout(() => resolve(v), ms))
const sad = (v, ms) => new Promise((_, reject) => setTimeout(() => reject(v), ms))
Promise.all([happy('happy', 100), sad('sad', 50)])
.then(console.log).catch(console.log) // 'sad'
If, instead, you want to wait for all the promises to either fulfill or reject, then you can use Promise.allSettled. Note that Internet Explorer does not natively support this method.
const happy = (v, ms) => new Promise((resolve) => setTimeout(() => resolve(v), ms))
const sad = (v, ms) => new Promise((_, reject) => setTimeout(() => reject(v), ms))
Promise.allSettled([happy('happy', 100), sad('sad', 50)])
.then(console.log) // [{ "status":"fulfilled", "value":"happy" }, { "status":"rejected", "reason":"sad" }]
Note: With Promise.all, actions that managed to finish before the rejection happened are not rolled back, so you may need to handle that situation yourself. For example, if you have 5 actions - 4 quick, 1 slow - and the slow one rejects, the 4 quick ones may already have executed, and you may need to roll them back. In that situation, consider using Promise.allSettled instead, since it reports exactly which actions failed and which did not.
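For example, a minimal sketch of using the Promise.allSettled report to roll back whatever did complete (doAction and undoAction are hypothetical helpers standing in for your real operations):
const results = await Promise.allSettled(actions.map(action => doAction(action)));
if (results.some(r => r.status === 'rejected')) {
  // undo only the actions that actually completed
  await Promise.allSettled(
    results
      .map((r, i) => ({ r, action: actions[i] }))
      .filter(({ r }) => r.status === 'fulfilled')
      .map(({ action }) => undoAction(action))
  );
}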

TL;DR
Use Promise.all for the parallel function calls; the pattern of storing promises and awaiting them one by one does not behave correctly when an error occurs.
First, execute all the asynchronous calls at once and obtain all the Promise objects. Second, use await on the Promise objects. This way, while you wait for the first Promise to resolve, the other asynchronous calls are still progressing. Overall, you will only wait for as long as the slowest asynchronous call. For example:
// Begin first call and store promise without waiting
const someResult = someCall();
// Begin second call and store promise without waiting
const anotherResult = anotherCall();
// Now we await for both results, whose async processes have already been started
const finalResult = [await someResult, await anotherResult];
// At this point all calls have been resolved
// Now when accessing someResult / anotherResult,
// you will have a value instead of a promise
JSbin example: http://jsbin.com/xerifanima/edit?js,console
Caveat: It doesn't matter if the await calls are on the same line or on different lines, so long as the first await call happens after all of the asynchronous calls. See JohnnyHK's comment.
Update: as @Bergi's answer points out, this approach has different error-handling timing: it does NOT throw the error as soon as it occurs, but only after all the promises have executed.
I compared the result with @jonny's tip, [result1, result2] = await Promise.all([async1(), async2()]); check the following code snippet:
const correctAsync500ms = () => {
return new Promise(resolve => {
setTimeout(resolve, 500, 'correct500msResult');
});
};
const correctAsync100ms = () => {
return new Promise(resolve => {
setTimeout(resolve, 100, 'correct100msResult');
});
};
const rejectAsync100ms = () => {
return new Promise((resolve, reject) => {
setTimeout(reject, 100, 'reject100msError');
});
};
const asyncInArray = async (fun1, fun2) => {
const label = 'test async functions in array';
try {
console.time(label);
const p1 = fun1();
const p2 = fun2();
const result = [await p1, await p2];
console.timeEnd(label);
} catch (e) {
console.error('error is', e);
console.timeEnd(label);
}
};
const asyncInPromiseAll = async (fun1, fun2) => {
const label = 'test async functions with Promise.all';
try {
console.time(label);
let [value1, value2] = await Promise.all([fun1(), fun2()]);
console.timeEnd(label);
} catch (e) {
console.error('error is', e);
console.timeEnd(label);
}
};
(async () => {
console.group('async functions without error');
console.log('async functions without error: start')
await asyncInArray(correctAsync500ms, correctAsync100ms);
await asyncInPromiseAll(correctAsync500ms, correctAsync100ms);
console.groupEnd();
console.group('async functions with error');
console.log('async functions with error: start')
await asyncInArray(correctAsync500ms, rejectAsync100ms);
await asyncInPromiseAll(correctAsync500ms, rejectAsync100ms);
console.groupEnd();
})();

Update:
The original answer makes it difficult (and in some cases impossible) to correctly handle promise rejections. The correct solution is to use Promise.all:
const [someResult, anotherResult] = await Promise.all([someCall(), anotherCall()]);
Original answer:
Just make sure you call both functions before you await either one:
// Call both functions
const somePromise = someCall();
const anotherPromise = anotherCall();
// Await both promises
const someResult = await somePromise;
const anotherResult = await anotherPromise;
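To see why the update recommends Promise.all, here is a sketch of what can happen when the second promise rejects while you are still awaiting the first (the timings are hypothetical):
const somePromise = someCall();       // suppose this takes 2 seconds
const anotherPromise = anotherCall(); // suppose this rejects after 1 second
try {
  const someResult = await somePromise;       // we sit here for 2 seconds...
  const anotherResult = await anotherPromise; // ...but anotherPromise rejected at 1s
} catch (e) {
  // anotherPromise spent a full second rejected with no handler attached;
  // depending on the environment, that may be reported (or even fatal) as an
  // unhandled rejection before this catch ever sees it.
}
Promise.all attaches to both promises immediately, so the rejection is always handled the moment it happens.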

There is another way without Promise.all() to do it in parallel:
First, we have 2 functions to print numbers:
function printNumber1() {
return new Promise((resolve,reject) => {
setTimeout(() => {
console.log("Number1 is done");
resolve(10);
},1000);
});
}
function printNumber2() {
return new Promise((resolve,reject) => {
setTimeout(() => {
console.log("Number2 is done");
resolve(20);
},500);
});
}
This is sequential:
async function oneByOne() {
const number1 = await printNumber1();
const number2 = await printNumber2();
}
//Output: Number1 is done, Number2 is done
This is parallel:
async function inParallel() {
const promise1 = printNumber1();
const promise2 = printNumber2();
const number1 = await promise1;
const number2 = await promise2;
}
//Output: Number2 is done, Number1 is done

I've created a gist testing some different ways of resolving promises, with results. It may be helpful to see the options that work.
Edit: Gist content as per Jin Lee's comment
// Simple gist to test parallel promise resolution when using async / await
function promiseWait(time) {
return new Promise((resolve, reject) => {
setTimeout(() => {
resolve(true);
}, time);
});
}
async function test() {
return [
await promiseWait(1000),
await promiseWait(5000),
await promiseWait(9000),
await promiseWait(3000),
]
}
async function test2() {
return {
'aa': await promiseWait(1000),
'bb': await promiseWait(5000),
'cc': await promiseWait(9000),
'dd': await promiseWait(3000),
}
}
async function test3() {
return await {
'aa': promiseWait(1000),
'bb': promiseWait(5000),
'cc': promiseWait(9000),
'dd': promiseWait(3000),
}
}
async function test4() {
const p1 = promiseWait(1000);
const p2 = promiseWait(5000);
const p3 = promiseWait(9000);
const p4 = promiseWait(3000);
return {
'aa': await p1,
'bb': await p2,
'cc': await p3,
'dd': await p4,
};
}
async function test5() {
return await Promise.all([
await promiseWait(1000),
await promiseWait(5000),
await promiseWait(9000),
await promiseWait(3000),
]);
}
async function test6() {
return await Promise.all([
promiseWait(1000),
promiseWait(5000),
promiseWait(9000),
promiseWait(3000),
]);
}
async function test7() {
const p1 = promiseWait(1000);
const p2 = promiseWait(5000);
const p3 = promiseWait(9000);
return {
'aa': await p1,
'bb': await p2,
'cc': await p3,
'dd': await promiseWait(3000),
};
}
let start = Date.now();
test().then((res) => {
  console.log('Test Done, elapsed', (Date.now() - start) / 1000, res);
  start = Date.now();
  test2().then((res) => {
    console.log('Test2 Done, elapsed', (Date.now() - start) / 1000, res);
    start = Date.now();
    test3().then((res) => {
      console.log('Test3 Done, elapsed', (Date.now() - start) / 1000, res);
      start = Date.now();
      test4().then((res) => {
        console.log('Test4 Done, elapsed', (Date.now() - start) / 1000, res);
        start = Date.now();
        test5().then((res) => {
          console.log('Test5 Done, elapsed', (Date.now() - start) / 1000, res);
          start = Date.now();
          test6().then((res) => {
            console.log('Test6 Done, elapsed', (Date.now() - start) / 1000, res);
          });
          start = Date.now();
          test7().then((res) => {
            console.log('Test7 Done, elapsed', (Date.now() - start) / 1000, res);
          });
        });
      });
    });
  });
});
/*
Test Done, elapsed 18.006 [ true, true, true, true ]
Test2 Done, elapsed 18.009 { aa: true, bb: true, cc: true, dd: true }
Test3 Done, elapsed 0 { aa: Promise { <pending> },
bb: Promise { <pending> },
cc: Promise { <pending> },
dd: Promise { <pending> } }
Test4 Done, elapsed 9 { aa: true, bb: true, cc: true, dd: true }
Test5 Done, elapsed 18.008 [ true, true, true, true ]
Test6 Done, elapsed 9.003 [ true, true, true, true ]
Test7 Done, elapsed 12.007 { aa: true, bb: true, cc: true, dd: true }
*/

In my case, I have several tasks I want to execute in parallel, but I need to do something different with the result of each task.
function wait(ms, data) {
console.log('Starting task:', data, ms);
return new Promise(resolve => setTimeout(resolve, ms, data));
}
var tasks = [
async () => {
var result = await wait(1000, 'moose');
// do something with result
console.log(result);
},
async () => {
var result = await wait(500, 'taco');
// do something with result
console.log(result);
},
async () => {
var result = await wait(5000, 'burp');
// do something with result
console.log(result);
}
]
await Promise.all(tasks.map(p => p()));
console.log('done');
And the output:
Starting task: moose 1000
Starting task: taco 500
Starting task: burp 5000
taco
moose
burp
done

await Promise.all([someCall(), anotherCall()]); as already mentioned will act as a thread fence (very common in parallel code, e.g. CUDA): it lets all the promises inside it run without blocking each other, but prevents execution from continuing until ALL of them are resolved.
Another approach worth sharing is the Node.js async library, which also lets you easily control the amount of concurrency. That is usually desirable when the task is directly linked to limited resources such as API calls, I/O operations, etc.
// create a queue object with concurrency 2
var q = async.queue(function(task, callback) {
console.log('Hello ' + task.name);
callback();
}, 2);
// assign a callback
q.drain = function() {
console.log('All items have been processed');
};
// add some items to the queue
q.push({name: 'foo'}, function(err) {
console.log('Finished processing foo');
});
q.push({name: 'bar'}, function (err) {
console.log('Finished processing bar');
});
// add some items to the queue (batch-wise)
q.push([{name: 'baz'},{name: 'bay'},{name: 'bax'}], function(err) {
console.log('Finished processing item');
});
// add some items to the front of the queue
q.unshift({name: 'bar'}, function (err) {
console.log('Finished processing bar');
});
Credits to the author of the Medium article (read more).
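If you'd rather not pull in a library, here is a minimal dependency-free sketch of the same idea: run at most limit tasks at a time, where each task is a function returning a promise (runWithLimit is a made-up name):
async function runWithLimit(tasks, limit) {
  const results = new Array(tasks.length);
  let next = 0;
  // Spawn `limit` workers; each keeps pulling the next unclaimed task.
  // No locking is needed: JavaScript is single-threaded and there is no
  // await between reading and incrementing `next`.
  const workers = Array.from({ length: Math.min(limit, tasks.length) }, async () => {
    while (next < tasks.length) {
      const i = next++;
      results[i] = await tasks[i]();
    }
  });
  await Promise.all(workers);
  return results;
}
// e.g. runWithLimit(urls.map(url => () => fetch(url)), 2).then(console.log);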

You can call multiple asynchronous functions without awaiting them. This will execute them in parallel. While doing so, save the returned promises in variables, and await them at some point either individually or using Promise.all() and process the results.
You can also wrap the function calls with try...catch to handle failures of individual asynchronous actions and provide fallback logic.
Here's an example:
Observe the logs: the logs printed at the beginning of each asynchronous function's execution appear immediately, even though the first function takes 5 seconds to resolve.
function someLongFunc () {
return new Promise((resolve, reject)=> {
console.log('Executing function 1')
setTimeout(resolve, 5000)
})
}
function anotherLongFunc () {
return new Promise((resolve, reject)=> {
console.log('Executing function 2')
setTimeout(resolve, 5000)
})
}
async function main () {
let someLongFuncPromise, anotherLongFuncPromise
const start = Date.now()
try {
someLongFuncPromise = someLongFunc()
}
catch (ex) {
console.error('something went wrong during func 1')
}
try {
anotherLongFuncPromise = anotherLongFunc()
}
catch (ex) {
console.error('something went wrong during func 2')
}
await someLongFuncPromise
await anotherLongFuncPromise
const totalTime = Date.now() - start
console.log('Execution completed in ', totalTime)
}
main()
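One caveat on the try...catch blocks above: they only catch errors thrown synchronously while creating the promises; a rejection surfaces later, at the await, which here sits outside any try. To handle individual failures you could instead attach a catch to each promise up front - a sketch, with a hypothetical null fallback:
someLongFuncPromise = someLongFunc().catch((ex) => {
  console.error('something went wrong during func 1')
  return null // fallback value so the later await doesn't throw
})
anotherLongFuncPromise = anotherLongFunc().catch((ex) => {
  console.error('something went wrong during func 2')
  return null
})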

// A generic test function that can be configured
// with an arbitrary delay and to either resolve or reject
const test = (delay, resolveSuccessfully) => new Promise((resolve, reject) => setTimeout(() => {
console.log(`Done ${ delay }`);
resolveSuccessfully ? resolve(`Resolved ${ delay }`) : reject(`Reject ${ delay }`)
}, delay));
// Our async handler function
const handler = async () => {
// Promise 1 runs first, but resolves last
const p1 = test(10000, true);
// Promise 2 run second, and also resolves
const p2 = test(5000, true);
// Promise 3 runs last, but completes first (with a rejection)
// Note the catch to trap the error immediately
const p3 = test(1000, false).catch(e => console.log(e));
// Await all in parallel
const r = await Promise.all([p1, p2, p3]);
// Display the results
console.log(r);
};
// Run the handler
handler();
/*
Done 1000
Reject 1000
Done 5000
Done 10000
*/
Whilst setting p1, p2 and p3 is not strictly running them in parallel, they do not hold up any execution and you can trap contextual errors with a catch.

This can be accomplished with Promise.allSettled(), which is similar to Promise.all() but without the fail-fast behavior.
async function Promise1() {
throw "Failure!";
}
async function Promise2() {
return "Success!";
}
const [Promise1Result, Promise2Result] = await Promise.allSettled([Promise1(), Promise2()]);
console.log(Promise1Result); // {status: "rejected", reason: "Failure!"}
console.log(Promise2Result); // {status: "fulfilled", value: "Success!"}
Note: This is a bleeding edge feature with limited browser support, so I strongly recommend including a polyfill for this function.
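If a polyfill isn't handy, the behavior is simple enough to sketch by hand on top of Promise.all - a minimal sketch:
const allSettledSketch = promises =>
  Promise.all(promises.map(p =>
    Promise.resolve(p).then(
      value => ({ status: 'fulfilled', value }),
      reason => ({ status: 'rejected', reason })
    )
  ));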

I created a helper function waitAll; maybe it can make this sweeter. It only works in Node.js for now, not in browser Chrome.
//const parallel = async (...items) => {
const waitAll = async (...items) => {
//this function does not start executing the functions;
//their execution was already started before this code runs.
//Instead it collects the results of their execution.
const temp = [];
for (const item of items) {
//this is not temp.push(await item())
//awaiting here happens in series (not in parallel), but
//that doesn't affect the parallel execution of those functions,
//because they were started earlier
temp.push(await item);
}
return temp;
};
//the async functions start executing in parallel before being passed
//into the waitAll function
//const finalResult = await waitAll(someResult(), anotherResult());
//const finalResult = await parallel(someResult(), anotherResult());
//or
const [result1, result2] = await waitAll(someResult(), anotherResult());
//const [result1, result2] = await parallel(someResult(), anotherResult());

I vote for:
await Promise.all([someCall(), anotherCall()]);
Be aware of the moment you call the functions - it may cause an unexpected result:
// Supposing anotherCall() will trigger a request to create a new User
if (callFirst) {
await someCall();
} else {
await Promise.all([someCall(), anotherCall()]); // --> create new User here
}
But the following always triggers a request to create a new User:
// Supposing anotherCall() will trigger a request to create a new User
const someResult = someCall();
const anotherResult = anotherCall(); // ->> This always creates new User
if (callFirst) {
await someCall();
} else {
const finalResult = [await someResult, await anotherResult]
}
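One way to keep the parallel branch without always firing both calls is to defer creation behind functions, so nothing starts until the branch that needs it runs (a sketch reusing someCall/anotherCall):
// Nothing runs yet; these are just functions
const makeSome = () => someCall();
const makeAnother = () => anotherCall();
if (callFirst) {
  await makeSome(); // anotherCall() is never triggered on this branch
} else {
  const finalResult = await Promise.all([makeSome(), makeAnother()]); // both start here, in parallel
}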


How to use Await Inside Array.map for API's response [duplicate]

Consider the following code that reads an array of files in a serial/sequential manner. readFiles returns a promise, which is resolved only once all files have been read in sequence.
var readFile = function(file) {
... // Returns a promise.
};
var readFiles = function(files) {
return new Promise((resolve, reject) => {
var readSequential = function(index) {
if (index >= files.length) {
resolve();
} else {
readFile(files[index]).then(function() {
readSequential(index + 1);
}).catch(reject);
}
};
readSequential(0); // Start with the first file!
});
};
The above code works, but I don't like having to do recursion for things to occur sequentially. Is there a simpler way that this code can be re-written so that I don't have to use my weird readSequential function?
Originally I tried to use Promise.all, but that caused all of the readFile calls to happen concurrently, which is not what I want:
var readFiles = function(files) {
return Promise.all(files.map(function(file) {
return readFile(file);
}));
};
Update 2017: I would use an async function if the environment supports it:
async function readFiles(files) {
for(const file of files) {
await readFile(file);
}
};
If you'd like, you can defer reading the files until you need them using an async generator (if your environment supports it):
async function* readFiles(files) {
for(const file of files) {
yield await readFile(file);
}
};
Update: On second thought, I might use a for loop instead:
var readFiles = function(files) {
  var p = Promise.resolve(); // Q() in q
  files.forEach(file =>
    p = p.then(() => readFile(file))
  );
  return p;
};
Or more compactly, with reduce:
var readFiles = function(files) {
return files.reduce((p, file) => {
return p.then(() => readFile(file));
}, Promise.resolve()); // initial
};
In other promise libraries (like when and Bluebird) you have utility methods for this.
For example, Bluebird would be:
var Promise = require("bluebird");
var fs = Promise.promisifyAll(require("fs"));
var readAll = Promise.resolve(files).map(fs.readFileAsync,{concurrency: 1 });
// if the order matters, you can use Promise.each instead and omit concurrency param
readAll.then(function(allFileContents){
// do stuff to read files.
});
Although there is really no reason not to use async/await today.
Here is how I prefer to run tasks in series.
function runSerial() {
var that = this;
// task1 is a function that returns a promise (and immediately starts executing)
// task2 is a function that returns a promise (and immediately starts executing)
return Promise.resolve()
.then(function() {
return that.task1();
})
.then(function() {
return that.task2();
})
.then(function() {
console.log(" ---- done ----");
});
}
What about cases with more tasks? Like, 10?
function runSerial(tasks) {
var result = Promise.resolve();
tasks.forEach(task => {
result = result.then(() => task());
});
return result;
}
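Usage would then look like this (step1, step2 and step3 being hypothetical functions that each return a promise):
runSerial([() => step1(), () => step2(), () => step3()])
  .then(() => console.log(' ---- done ----'));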
This question is old, but we live in a world of ES6 and functional JavaScript, so let's see how we can improve.
Because promises begin executing immediately upon creation, we can't just create an array of promises; they would all fire off in parallel.
Instead, we need to create an array of functions that each return a promise. Each function will then be executed sequentially, which then starts the promise inside.
We can solve this a few ways, but my favorite way is to use reduce.
It gets a little tricky using reduce in combination with promises, so I have broken down the one liner into some smaller digestible bites below.
The essence of this function is to use reduce starting with an initial value of Promise.resolve([]), or a promise containing an empty array.
This promise will then be passed into the reduce method as promise. This is the key to chaining each promise together sequentially. The next promise to execute is func and when the then fires, the results are concatenated and that promise is then returned, executing the reduce cycle with the next promise function.
Once all promises have executed, the returned promise will contain an array of all the results of each promise.
ES6 Example (one liner)
/*
 * serial executes Promises sequentially.
 * @param {funcs} An array of funcs that return promises.
 * @example
 * const urls = ['/url1', '/url2', '/url3']
 * serial(urls.map(url => () => $.ajax(url)))
 *   .then(console.log.bind(console))
 */
const serial = funcs =>
  funcs.reduce((promise, func) =>
    promise.then(result => func().then(Array.prototype.concat.bind(result))), Promise.resolve([]))
ES6 Example (broken down)
// broken down for easier understanding
const concat = list => Array.prototype.concat.bind(list)
const promiseConcat = f => x => f().then(concat(x))
const promiseReduce = (acc, x) => acc.then(promiseConcat(x))
/*
 * serial executes Promises sequentially.
 * @param {funcs} An array of funcs that return promises.
 * @example
 * const urls = ['/url1', '/url2', '/url3']
 * serial(urls.map(url => () => $.ajax(url)))
 *   .then(console.log.bind(console))
 */
const serial = funcs => funcs.reduce(promiseReduce, Promise.resolve([]))
Usage:
// first take your work
const urls = ['/url1', '/url2', '/url3', '/url4']
// next convert each item to a function that returns a promise
const funcs = urls.map(url => () => $.ajax(url))
// execute them serially
serial(funcs)
.then(console.log.bind(console))
To do this simply in ES6:
function(files) {
// Create a new empty promise (don't do that with real people ;)
var sequence = Promise.resolve();
// Loop over each file, and add on a promise to the
// end of the 'sequence' promise.
files.forEach(file => {
// Chain one computation onto the sequence
sequence =
sequence
.then(() => performComputation(file))
.then(result => doSomething(result));
// Resolves for each file, one at a time.
})
// This will resolve after the entire chain is resolved
return sequence;
}
Addition example
const addTwo = async () => 2;
const addThree = async (inValue) => new Promise((resolve) => setTimeout(() => resolve(inValue + 3), 2000));
const addFour = (inValue) => new Promise((res) => setTimeout(() => res(inValue + 4), 1000));
const addFive = async (inValue) => inValue + 5;
// Function which handles promises from above
async function sequenceAddition() {
let sum = await [addTwo, addThree, addFour, addFive].reduce(
(promise, currPromise) => promise.then((val) => currPromise(val)),
Promise.resolve()
);
console.log('sum:', sum); // 2 + 3 + 4 + 5 = 14
}
// Run function. See console for result.
sequenceAddition();
General syntax to use reduce()
function sequence(tasks, fn) {
return tasks.reduce((promise, task) => promise.then(() => fn(task)), Promise.resolve());
}
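For example, assuming readFile returns a promise:
sequence(files, readFile).then(() => console.log('all files read in order'));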
UPDATE
items-promise is a ready to use NPM package doing the same.
I've had to run a lot of sequential tasks and used these answers to forge a function that would take care of handling any sequential task...
function one_by_one(objects_array, iterator, callback) {
var start_promise = objects_array.reduce(function (prom, object) {
return prom.then(function () {
return iterator(object);
});
}, Promise.resolve()); // initial
if(callback){
start_promise.then(callback);
}else{
return start_promise;
}
}
The function takes 2 arguments + 1 optional. First argument is the array on which we will be working. The second argument is the task itself, a function that returns a promise, the next task will be started only when this promise resolves. The third argument is a callback to run when all tasks have been done. If no callback is passed, then the function returns the promise it created so we can handle the end.
Here's an example of usage:
var filenames = ['1.jpg','2.jpg','3.jpg'];
var resize_task = function(filename){
//return promise of async resizing with filename
};
one_by_one(filenames,resize_task );
Hope it saves someone some time...
With async/await (if your environment supports it)
function downloadFile(fileUrl) { ... } // This function returns a Promise
async function main()
{
var filesList = [...];
for (const file of filesList) {
await downloadFile(file);
}
}
(you must use a for loop, not forEach, because await doesn't pause a forEach loop)
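To see why, note that forEach fires all the async callbacks immediately and never waits for them, so this logs 'done' before any download has finished:
filesList.forEach(async (file) => {
  await downloadFile(file); // pauses only this callback, not the loop
});
console.log('done'); // runs before any download completes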
Without Async/Await (using Promise)
function downloadFile(fileUrl) { ... } // This function returns a Promise
function downloadRecursion(filesList, index)
{
index = index || 0;
if (index < filesList.length)
{
downloadFile(filesList[index]).then(function()
{
index++;
downloadRecursion(filesList, index); // self invocation - recursion!
});
}
else
{
return Promise.resolve();
}
}
function main()
{
var filesList = [...];
downloadRecursion(filesList);
}
My preferred solution:
function processArray(arr, fn) {
return arr.reduce(
(p, v) => p.then((a) => fn(v).then(r => a.concat([r]))),
Promise.resolve([])
);
}
It's not fundamentally different from others published here but:
Applies the function to items in series
Resolves to an array of results
Doesn't require async/await (support is still quite limited, circa 2017)
Uses arrow functions; nice and concise
Example usage:
const numbers = [0, 4, 20, 100];
const multiplyBy3 = (x) => new Promise(res => res(x * 3));
// Prints [ 0, 12, 60, 300 ]
processArray(numbers, multiplyBy3).then(console.log);
Tested on reasonably current Chrome (v59) and Node.js (v8.1.2).
First, you need to understand that a promise is executed at the time of creation.
So for example if you have a code:
["a","b","c"].map(x => returnsPromise(x))
You need to change it to:
["a","b","c"].map(x => () => returnsPromise(x))
Then we need to sequentially chain promises:
["a", "b", "c"].map(x => () => returnsPromise(x))
.reduce(
(before, after) => before.then(_ => after()),
Promise.resolve()
)
Executing after() makes sure that the promise is created (and executed) only when its time comes.
The nicest solution that I was able to figure out was with Bluebird promises. You can just do Promise.resolve(files).each(fs.readFileAsync);, which guarantees that the promises are resolved sequentially, in order.
With async/await of ES2017 (and maybe some features of ES2018), this can be reduced to this form:
function readFile(file) {
... // Returns a promise.
}
async function readFiles(files) {
for (const file of files) {
await readFile(file)
}
}
I haven't seen another answer express that simplicity. The OP said parallel execution of readFile was not desired. However, with IO like this it really makes sense not to block on a single file read, while keeping the loop execution synchronous (you don't want to do the next step until all files have been read). Since I just learned about this and am a bit excited about it, I'll share that approach: parallel asynchronous execution of readFile with overall synchronous execution of readFiles.
async function readFiles(files) {
await Promise.all(files.map(readFile))
}
Isn't that a thing of beauty?
This is a slight variation of another answer above. Using native Promises:
function inSequence(tasks) {
return tasks.reduce((p, task) => p.then(task), Promise.resolve())
}
Explanation
If you have these tasks [t1, t2, t3], then the above is equivalent to Promise.resolve().then(t1).then(t2).then(t3). It's the behavior of reduce.
How to use
First, you need to construct a list of tasks! A task is a function that accepts no arguments. If you need to pass arguments to your function, then use bind or other methods to create a task. For example:
var tasks = files.map(file => processFile.bind(null, file))
inSequence(tasks).then(...)
I created this simple method on the Promise object:
Create and add a Promise.sequence method to the Promise object
Promise.sequence = function (chain) {
  var results = [];
  var entries = chain;
  if (entries.entries) entries = entries.entries();
  return new Promise(function (yes, no) {
    var next = function () {
      var entry = entries.next();
      if (entry.done) yes(results);
      else {
        // entry.value is an [index, task] pair; run the task, record its
        // result, then move on to the next entry
        entry.value[1]().then(function (result) {
          results.push(result);
          next();
        }, function () {
          no(results);
        });
      }
    };
    next();
  });
};
Usage:
var todo = [];
todo.push(firstTask); // each entry is a function that returns a promise
if (someCriterium) todo.push(optionalTask);
todo.push(lastTask);
// Invoking them
Promise.sequence(todo)
.then(function(results) {}, function(results) {});
The best thing about this extension to the Promise object is that it is consistent with the style of promises. Promise.all and Promise.sequence are invoked the same way, but have different semantics.
Caution
Sequential running of promises is not usually a very good way to use promises. It's usually better to use Promise.all, and let the browser run the code as fast as possible. However, there are real use cases for it - for example when writing a mobile app using javascript.
My answer is based on https://stackoverflow.com/a/31070150/7542429.
Promise.series = function series(arrayOfPromises) {
var results = [];
return arrayOfPromises.reduce(function(seriesPromise, promise) {
return seriesPromise.then(function() {
return promise
.then(function(result) {
results.push(result);
});
});
}, Promise.resolve())
.then(function() {
return results;
});
};
This solution returns the results as an array like Promise.all().
Usage:
Promise.series([array of promises])
.then(function(results) {
// do stuff with results here
});
Use Array.prototype.reduce, and remember to wrap your promises in a function otherwise they will already be running!
// array of Promise providers
const providers = [
function(){
return Promise.resolve(1);
},
function(){
return Promise.resolve(2);
},
function(){
return Promise.resolve(3);
}
]
const inSeries = function(providers){
const seed = Promise.resolve(null);
return providers.reduce(function(a,b){
return a.then(b);
}, seed);
};
nice and easy...
you should be able to re-use the same seed for performance, etc.
It's important to guard against empty arrays or arrays with only 1 element when using reduce, so this technique is your best bet:
const providers = [
function(v){
return Promise.resolve(v+1);
},
function(v){
return Promise.resolve(v+2);
},
function(v){
return Promise.resolve(v+3);
}
]
const inSeries = function(providers, initialVal){
if(providers.length < 1){
return Promise.resolve(null)
}
return providers.reduce((a,b) => a.then(b), providers.shift()(initialVal));
};
and then call it like:
inSeries(providers, 1).then(v => {
console.log(v); // 7
});
Using modern ES:
const series = async (tasks) => {
const results = [];
for (const task of tasks) {
const result = await task;
results.push(result);
}
return results;
};
//...
const readFiles = await series(files.map(readFile));
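Note that files.map(readFile) creates - and therefore starts - every promise up front, so only the awaiting is sequential. If the calls themselves must run one after another, pass thunks instead and invoke them inside the loop (a variant sketch):
const seriesOfTasks = async (tasks) => {
  const results = [];
  for (const task of tasks) {
    results.push(await task()); // the next call starts only after this one settles
  }
  return results;
};
//...
const readFiles = await seriesOfTasks(files.map(file => () => readFile(file)));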
Most of the answers don't include the results of ALL promises individually, so in case someone is looking for this particular behaviour, this is a possible solution, using recursion.
It follows the style of Promise.all:
Returns the array of results in the .then() callback.
If some promise fails, it's returned immediately in the .catch() callback.
const promiseEach = (arrayOfTasks) => {
let results = []
return new Promise((resolve, reject) => {
const resolveNext = (arrayOfTasks) => {
// If all tasks are already resolved, return the final array of results
if (arrayOfTasks.length === 0) return resolve(results)
// Extract first promise and solve it
const first = arrayOfTasks.shift()
first().then((res) => {
results.push(res)
resolveNext(arrayOfTasks)
}).catch((err) => {
reject(err)
})
}
resolveNext(arrayOfTasks)
})
}
// Lets try it 😎
const promise = (time, shouldThrowError) => new Promise((resolve, reject) => {
const timeInMs = time * 1000
setTimeout(()=>{
console.log(`Waited ${time} secs`)
if (shouldThrowError) reject(new Error('Promise failed'))
resolve(time)
}, timeInMs)
})
const tasks = [() => promise(1), () => promise(2)]
promiseEach(tasks)
.then((res) => {
console.log(res) // [1, 2]
})
// Oops some promise failed
.catch((error) => {
console.log(error)
})
Note about the tasks array declaration:
In this case it is not possible to use the following notation, the way Promise.all would:
const tasks = [promise(1), promise(2)]
And we have to use:
const tasks = [() => promise(1), () => promise(2)]
The reason is that JavaScript starts executing a promise immediately after it's declared. Methods like Promise.all just check whether the state of each promise is fulfilled or rejected; they don't start the execution themselves. Using () => promise() we defer the execution until it's called.
You can use this function that takes a list of promise factories:
function executeSequentially(promiseFactories) {
var result = Promise.resolve();
promiseFactories.forEach(function (promiseFactory) {
result = result.then(promiseFactory);
});
return result;
}
A promise factory is just a simple function that returns a Promise:
function myPromiseFactory() {
return somethingThatCreatesAPromise();
}
It works because a promise factory doesn't create the promise until it's asked to. It works the same way as a then function – in fact, it's the same thing!
You don't want to operate over an array of promises at all. Per the Promise spec, as soon as a promise is created, it begins executing. So what you really want is an array of promise factories...
If you want to learn more on Promises, you should check this link:
https://pouchdb.com/2015/05/18/we-have-a-problem-with-promises.html
If you want, you can use reduce to build a sequential promise chain, for example:
[2, 3, 4, 5, 6, 7, 8, 9].reduce((promises, page) => {
  return promises.then((previousPage) => {
    console.log(previousPage);
    return Promise.resolve(previousPage + 1);
  });
}, Promise.resolve(1));
It will always run sequentially.
I really liked @joelnet's answer, but to me that style of coding is a little bit tough to digest, so I spent a couple of days trying to figure out how I would express the same solution in a more readable manner. This is my take, just with a different syntax and some comments.
// first take your work
const urls = ['/url1', '/url2', '/url3', '/url4']
// next convert each item to a function that returns a promise
const functions = urls.map((url) => {
// For every url we return a new function
return () => {
return new Promise((resolve) => {
// random wait in milliseconds
const randomWait = parseInt((Math.random() * 1000),10)
console.log('waiting to resolve in ms', randomWait)
setTimeout(()=>resolve({randomWait, url}),randomWait)
})
}
})
const promiseReduce = (acc, next) => {
// we wait for the accumulator to resolve it's promise
return acc.then((accResult) => {
// and then we return a new promise that will become
// the new value for the accumulator
return next().then((nextResult) => {
// that eventually will resolve to a new array containing
// the value of the two promises
return accResult.concat(nextResult)
})
})
};
// the accumulator will always be a promise that resolves to an array
const accumulator = Promise.resolve([])
// we call reduce with the reduce function and the accumulator initial value
functions.reduce(promiseReduce, accumulator)
.then((result) => {
// let's display the final value here
console.log('=== The final result ===')
console.log(result)
})
As Bergi noticed, I think the best and clearest solution is to use Bluebird.each, code below:
const BlueBird = require('bluebird');
BlueBird.each(files, fs.readFileAsync);
I find myself coming back to this question many times and the answers aren't exactly giving me what I need, so putting this here for anyone that needs this too.
The code below does sequential promise execution (one after another), where each round consists of multiple calls:
async function sequence(list, cb) {
const result = [];
await list.reduce(async (promise, item) => promise
.then(() => cb(item))
.then((res) => result.push(res)
), Promise.resolve());
return result;
}
Showcase:
<script src="https://cdnjs.cloudflare.com/ajax/libs/axios/0.15.3/axios.min.js"></script>
<script src="https://unpkg.com/#babel/standalone#7/babel.min.js"></script>
<script type="text/babel">
function sleep(ms) {
return new Promise(resolve => setTimeout(resolve, ms));
}
async function readFile(url, index) {
console.log('Running index: ', index);
// First action
const firstTime = await axios.get(url);
console.log('First API response: ', firstTime.data.activity);
// Second action
await sleep(1000);
// Third action
const secondTime = await axios.get(url);
console.log('Second API response: ', secondTime.data.activity);
// Fourth action
await sleep(1000);
return secondTime.data;
}
async function sequence(urls, fn) {
const result = [];
await urls.reduce(async (promise, url, index) => promise.then(() => fn(url, index)).then((res) => result.push(res)), Promise.resolve());
return result;
}
const urls = [
'https://www.boredapi.com/api/activity',
'https://www.boredapi.com/api/activity',
'https://www.boredapi.com/api/activity',
];
(async function init() {
const result = await sequence(urls, readFile);
console.log('result', result);
})()
</script>
I use the following code to extend the Promise object. It handles rejection of the promises and returns an array of results
Code
/*
Runs tasks in sequence and resolves a promise upon finish
tasks: an array of functions that return a promise upon call.
parameters: an array of arrays corresponding to the parameters to be passed on each function call.
context: Object to use as context to call each function. (The 'this' keyword that may be used inside the function definition)
*/
Promise.sequence = function(tasks, parameters = [], context = null) {
return new Promise((resolve, reject)=>{
var nextTask = tasks.splice(0,1)[0].apply(context, parameters[0]); //Dequeue and call the first task
var output = new Array(tasks.length + 1);
var errorFlag = false;
tasks.forEach((task, index) => {
nextTask = nextTask.then(r => {
output[index] = r;
return task.apply(context, parameters[index+1]);
}, e=>{
output[index] = e;
errorFlag = true;
return task.apply(context, parameters[index+1]);
});
});
// Last task
nextTask.then(r=>{
output[output.length - 1] = r;
if (errorFlag) reject(output); else resolve(output);
})
.catch(e=>{
output[output.length - 1] = e;
reject(output);
});
});
};
Example
function functionThatReturnsAPromise(n) {
return new Promise((resolve, reject)=>{
//Emulating real life delays, like a web request
setTimeout(()=>{
resolve(n);
}, 1000);
});
}
var arrayOfArguments = [['a'],['b'],['c'],['d']];
var arrayOfFunctions = (new Array(4)).fill(functionThatReturnsAPromise);
Promise.sequence(arrayOfFunctions, arrayOfArguments)
.then(console.log)
.catch(console.error);
Your approach is not bad, but it does have two issues: it swallows errors and it employs the Explicit Promise Construction Antipattern.
You can solve both of these issues, and make the code cleaner, while still employing the same general strategy:
var Q = require("q");
var readFile = function(file) {
... // Returns a promise.
};
var readFiles = function(files) {
var readSequential = function(index) {
if (index < files.length) {
return readFile(files[index]).then(function() {
return readSequential(index + 1);
});
}
};
// using Promise.resolve() here in case files.length is 0
return Promise.resolve(readSequential(0)); // Start!
};
This is my sequentially implementation that I use in various projects:
const files = [file1, file2, file3];
const fileContents = await sequentially(readFile, files);
// somewhere else in the code:
export const sequentially = async <T, P>(
toPromise: (element: T) => Promise<P>,
elements: T[]
): Promise<P[]> => {
const results: P[] = [];
await elements.reduce(async (sequence, element) => {
await sequence;
results.push(await toPromise(element));
}, Promise.resolve());
return results;
};
Here is my Angular/TypeScript approach, using RxJS:
Given an array of URL strings, convert it into an Observable using the from function.
Use pipe to wrap the Ajax request, immediate response logic, any desired delay, and error handling.
Inside of the pipe, use concatMap to serialize the requests. Otherwise, using Javascript forEach or map would make the requests at the same time.
Use RxJS ajax to make the call, and also to add any desired delay after each call returns.
Working example: https://stackblitz.com/edit/rxjs-bnrkix?file=index.ts
The code looks like this (I left in some extras so you can choose what to keep or discard):
import { ajax } from 'rxjs/ajax';
import { catchError, concatMap, delay, from, of, map, Observable } from 'rxjs';
const urls = [
'https://randomuser.me/api/',
'https://randomuser.me/api/',
'https://randomuser.me/api/',
];
const delayAfterCall = 500;
from(urls)
.pipe(
concatMap((url: string) => {
return ajax.getJSON(url).pipe(
map((response) => {
console.log('Done! Received:', response);
return response;
}),
catchError((error) => {
console.error('Error: ', error);
return of(error);
}),
delay(delayAfterCall)
);
})
)
.subscribe((response) => {
console.log('received email:', response.results[0].email);
});
On the basis of the question's title, "Resolve promises one after another (i.e. in sequence)?", we might understand that the OP is more interested in the sequential handling of promises on settlement than sequential calls per se.
This answer is offered :
to demonstrate that sequential calls are not necessary for sequential handling of responses.
to expose viable alternative patterns to this page's visitors - including the OP if he is still interested over a year later.
despite the OP's assertion that he does not want to make calls concurrently, which may genuinely be the case but equally may be an assumption based on the desire for sequential handling of responses as the title implies.
If concurrent calls are genuinely not wanted then see Benjamin Gruenbaum's answer which covers sequential calls (etc) comprehensively.
If however, you are interested (for improved performance) in patterns which allow concurrent calls followed by sequential handling of responses, then please read on.
It's tempting to think you have to use Promise.all(arr.map(fn)).then(fn) (as I have done many times) or a Promise lib's fancy sugar (notably Bluebird's), however (with credit to this article) an arr.map(fn).reduce(fn) pattern will do the job, with the advantages that it :
works with any promise lib - even pre-compliant versions of jQuery - only .then() is used.
affords the flexibility to skip-over-error or stop-on-error, whichever you want with a one line mod.
Here it is, written for Q.
var readFiles = function(files) {
return files.map(readFile) //Make calls in parallel.
.reduce(function(sequence, filePromise) {
return sequence.then(function() {
return filePromise;
}).then(function(file) {
//Do stuff with file ... in the correct sequence!
}, function(error) {
console.log(error); //optional
return sequence;//skip-over-error. To stop-on-error, `return error` (jQuery), or `throw error` (Promises/A+).
});
}, Q()).then(function() {
// all done.
});
};
Note: only that one fragment, Q(), is specific to Q. For jQuery you need to ensure that readFile() returns a jQuery promise. With A+ libs, foreign promises will be assimilated.
The key here is the reduction's sequence promise, which sequences the handling of the readFile promises but not their creation.
And once you have absorbed that, it's maybe slightly mind-blowing when you realise that the .map() stage isn't actually necessary! The whole job, parallel calls plus serial handling in the correct order, can be achieved with reduce() alone, plus the added advantage of further flexibility to :
convert from parallel async calls to serial async calls by simply moving one line - potentially useful during development.
Here it is, for Q again.
var readFiles = function(files) {
return files.reduce(function(sequence, f) {
var filePromise = readFile(f);//Make calls in parallel. To call sequentially, move this line down one.
return sequence.then(function() {
return filePromise;
}).then(function(file) {
//Do stuff with file ... in the correct sequence!
}, function(error) {
console.log(error); //optional
return sequence;//Skip over any errors. To stop-on-error, `return error` (jQuery), or `throw error` (Promises/A+).
});
}, Q()).then(function() {
// all done.
});
};
That's the basic pattern. If you wanted also to deliver data (eg the files or some transform of them) to the caller, you would need a mild variant.
If someone else needs a guaranteed way of resolving Promises STRICTLY sequentially when performing CRUD operations, you can also use the following code as a basis.
As long as you add 'return' before calling each function describing a Promise, and use this example as a basis, the next .then() function call will CONSISTENTLY start after the completion of the previous one:
getRidOfOlderShoutsPromise = () => {
return readShoutsPromise('BEFORE')
.then(() => {
return deleteOlderShoutsPromise();
})
.then(() => {
return readShoutsPromise('AFTER')
})
.catch(err => console.log(err.message));
}
deleteOlderShoutsPromise = () => {
return new Promise ( (resolve, reject) => {
console.log("in deleteOlderShouts");
let d = new Date();
let TwoMinuteAgo = d - 1000 * 90 ;
All_Shouts.deleteMany({ dateTime: {$lt: TwoMinuteAgo}}, function(err) {
if (err) reject();
console.log("DELETED OLDs at "+d);
resolve();
});
});
}
readShoutsPromise = (tex) => {
return new Promise( (resolve, reject) => {
console.log("in readShoutsPromise -"+tex);
All_Shouts
.find({})
.sort([['dateTime', 'ascending']])
.exec(function (err, data){
if (err) reject();
let d = new Date();
console.log("shouts "+tex+" delete PROMISE = "+data.length +"; date ="+d);
resolve(data);
});
});
}
The Array push and pop methods can be used to sequence promises. You can also push new promises when you need additional data. This is the code I will use in a React infinite loader to load a sequence of pages.
var promises = [Promise.resolve()];
function methodThatReturnsAPromise(page) {
return new Promise((resolve, reject) => {
setTimeout(() => {
console.log(`Resolve-${page}! ${new Date()} `);
resolve();
}, 1000);
});
}
function pushPromise(page) {
promises.push(promises.pop().then(function () {
return methodThatReturnsAPromise(page)
}));
}
pushPromise(1);
pushPromise(2);
pushPromise(3);
(function() {
  function sleep(ms) {
    return new Promise(function(resolve) {
      setTimeout(function() {
        return resolve();
      }, ms);
    });
  }
  function serial(arr, index, results) {
    if (index == arr.length) {
      return Promise.resolve(results);
    }
    return new Promise(function(resolve, reject) {
      if (!index) {
        index = 0;
        results = [];
      }
      return arr[index]()
        .then(function(d) {
          return resolve(d);
        })
        .catch(function(err) {
          return reject(err);
        });
    })
      .then(function(result) {
        console.log("here");
        results.push(result);
        return serial(arr, index + 1, results);
      })
      .catch(function(err) {
        throw err;
      });
  }
  const a = [5000, 5000, 5000];
  serial(a.map(x => () => sleep(x)));
})();
Here the key is how you call the sleep function: you need to pass an array of functions which themselves return a promise, instead of an array of promises.

How to fix MongoError: Cannot use a session that has ended

I'm trying to read data from a MongoDB Atlas collection using Node.js. When I try to read the contents of my collection I get the error MongoError: Cannot use a session that has ended. Here is my code
client.connect(err => {
const collection = client
.db("sample_airbnb")
.collection("listingsAndReviews");
const test = collection.find({}).toArray((err, result) => {
if (err) throw err;
});
client.close();
});
I'm able to query for a specific document, but I'm not sure how to return all documents of a collection. I've searched for this error, I can't find much on it. Thanks
In your code, it doesn't wait for the find() to complete its execution and goes on to the client.close() statement. So by the time it tries to read data from the db, the connection has already ended. I faced this same problem and solved it like this:
// connect to your cluster
const client = await MongoClient.connect('yourMongoURL', {
useNewUrlParser: true,
useUnifiedTopology: true,
});
// specify the DB's name
const db = client.db('nameOfYourDB');
// execute find query
const items = await db.collection('items').find({}).toArray();
console.log(items);
// close connection
client.close();
EDIT: this whole thing should be in an async function.
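For example, wrapped in an async IIFE so the awaits are legal (same placeholder names as above):
(async () => {
  const client = await MongoClient.connect('yourMongoURL', {
    useNewUrlParser: true,
    useUnifiedTopology: true,
  });
  try {
    const items = await client.db('nameOfYourDB').collection('items').find({}).toArray();
    console.log(items);
  } finally {
    // close the connection whether the query succeeded or threw
    await client.close();
  }
})().catch(console.error);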
Ran into the same issue when I updated the MongoClient from 3.3.2 to the latest version (3.5.2 as of this writing). Either pin version 3.3.2 by changing package.json to "mongodb": "3.3.2", or just use an async/await wrapper.
If the issue still persists, remove node_modules and install again.
One option is to use a Promise chain. collection.find({}).toArray() can either receive a callback function or return a promise, so you can chain calls with .then().
collection.find({}).toArray() // returns the 1st promise
  .then( items => {
    console.log('All items', items);
    return collection.find({ name: /^S/ }).toArray(); //return another promise
  })
  .then( items => {
    console.log("All items with field 'name' beginning with 'S'", items);
    client.close(); // Last promise in the chain closes the database
  });
Of course, this daisy chaining makes the code sequential. This is useful when the next call in the chain relates to the previous one, like getting a user id in the first one, then looking up user details in the next.
Several unrelated queries should be executed in parallel (async) and when all the results are back, dispose of the database connection.
You could do this by tracking each call in an array or counter, for example.
const totalQueries = 3;
let completedQueries = 0;
collection.find({}).toArray()
  .then( items => {
    console.log('All items', items);
    dispose(); // Increments the counter and closes the connection if total reached
  });
collection.find({ name: /^S/ }).toArray()
  .then( items => {
    console.log("All items with field 'name' beginning with 'S'", items);
    dispose(); // Increments the counter and closes the connection if total reached
  });
collection.find({ age: 55 }).toArray()
  .then( items => {
    console.log("All items with field 'age' with value '55'", items);
    dispose(); // Increments the counter and closes the connection if total reached
  });
function dispose(){
  if (++completedQueries >= totalQueries){
    client.close();
  }
}
You have 3 queries. As each one invokes dispose() the counter increments. When they've all invoked dispose(), the last one will also close the connection.
Async/Await should make it even easier, because it unwraps the Promise result that would otherwise go to the then callback.
async function test(){
const allItems = await collection.find({}).toArray();
const namesBeginningWithS = await collection.find({ name: /^S/ }).toArray();
const fiftyFiveYearOlds = await collection.find({ age: 55 }).toArray();
client.close();
}
test();
Below is an example of how Async/Await can end up making async code behave sequentially and run inefficiently by waiting for one async function to complete before invoking the next one, when the ideal scenario is to invoke them all immediately and only wait until they all are complete.
let counter = 0;
function doSomethingAsync(id, start) {
return new Promise(resolve => {
setTimeout(() => {
counter++;
const stop = new Date();
const runningTime = getSeconds(start, stop);
resolve(`result${id} completed in ${runningTime} seconds`);
}, 2000);
});
}
function getSeconds(start, stop) {
return (stop - start) / 1000;
}
async function test() {
console.log('Awaiting 3 Async calls');
console.log(`Counter before execution: ${counter}`);
const start = new Date();
let callStart = new Date();
const result1 = await doSomethingAsync(1, callStart);
callStart = new Date();
const result2 = await doSomethingAsync(2, callStart);
callStart = new Date();
const result3 = await doSomethingAsync(3, callStart);
const stop = new Date();
console.log(result1, result2, result3);
console.log(`Counter after all ran: ${counter}`);
console.log(`Total time to run: ${getSeconds(start, stop)}`);
}
test();
Note: Awaiting like in the example above makes the calls sequential again. If each takes 2 seconds to run, the function will take 6 seconds to complete.
Combining the best of all worlds, you would want to use Async/Await while running all calls immediately. Fortunately, Promise has a method to do this, so test() can be written like this:
async function test(){
let [allItems, namesBeginningWithS, fiftyFiveYearOlds] = await Promise.all([
collection.find({}).toArray(),
collection.find({ name: /^S/ }).toArray(),
collection.find({ age: 55 }).toArray()
]);
client.close();
}
Here's a working example to demonstrate the difference in performance:
let counter = 0;
function doSomethingAsync(id, start) {
return new Promise(resolve => {
setTimeout(() => {
counter++;
const stop = new Date();
const runningTime = getSeconds(start, stop);
resolve(`result${id} completed in ${runningTime} seconds`);
}, 2000);
});
}
function getSeconds(start, stop) {
return (stop - start) / 1000;
}
async function test() {
console.log('Awaiting 3 Async calls');
console.log(`Counter before execution: ${counter}`);
const start = new Date();
const [result1, result2, result3] = await Promise.all([
doSomethingAsync(1, new Date()),
doSomethingAsync(2, new Date()),
doSomethingAsync(3, new Date())
]);
const stop = new Date();
console.log(result1, result2, result3);
console.log(`Counter after all ran: ${counter}`);
console.log(`Total time to run: ${getSeconds(start, stop)}`);
}
test();
Other people have touched on this, but I just want to highlight that .toArray() is executed asynchronously, so you need to make sure that it has finished before closing the session.
This won't work:
const randomUser = await db.collection('user').aggregate([ { $sample: { size: 1 } } ]);
console.log(randomUser.toArray());
await client.close();
This will:
const randomUser = await db.collection('user').aggregate([ { $sample: { size: 1 } } ]).toArray();
console.log(randomUser);
await client.close();
client.connect(err => {
const collection = client
.db("sample_airbnb")
.collection("listingsAndReviews");
const test = collection.find({}).toArray((err, result) => {
if (err) throw err;
client.close();
});
});

How do I ensure a Lambda function waits for call to an async function with await?

I'm trying to write a Lambda function which accepts an image file via a web form and writes it as a new commit to a repository using CodeCommit. For some reason, my Lambda function seems to be exiting before the call to createCommit, even though I'm using await in a similar way to my previous calls in the function.
I've tried rewriting the function that wraps createCommit to just use promises, but that doesn't seem to work either. I'm wondering if there is some quirk of lambda that I don't know about or if I'm using async/await incorrectly (I just recently learned how to use them)
this is my main lambda event handler:
exports.handler = async (event) => {
const [file, lastCommitId] = await Promise.all([getFile(event), getLastCommitId()]);
return await createCommit(file, lastCommitId)
.then(result => returnResponse(result, 200))
.catch(err => returnResponse(err, 500));
};
this is my wrapper function for createCommit
async function createCommit(file, parentCommitId) {
const fileContent = await file.content.toString('base64');
const params = {
"authorName": "admin",
branchName,
"commitMessage": "add image",
"email": "n/a",
"parentCommitId": parentCommitId,
"putFiles": [
{
fileContent,
"fileMode": "NORMAL",
"filePath": `src/images/${file.filename}`
}
],
repositoryName
};
console.log("creating commit against last commit id " + parentCommitId);
const result = await codecommit.createCommit(params).promise();
console.log(JSON.stringify(result));
return result;
}
I expect the lambda function to wait until the call to createCommit finishes, but it simply prints out the console.log starting with "creating commit against last commit..." and exits.
You should not use await and .then together.
Change your code as shown below, and use try/catch if you want to handle the exception or failure case.
exports.handler = async (event) => {
const [file, lastCommitId] = await Promise.all([getFile(event), getLastCommitId()]);
return await createCommit(file, lastCommitId);
};
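If you still want the 200/500 mapping from the original handler, the try/catch form preserves it (returnResponse being the asker's own helper):
exports.handler = async (event) => {
  const [file, lastCommitId] = await Promise.all([getFile(event), getLastCommitId()]);
  try {
    const result = await createCommit(file, lastCommitId);
    return returnResponse(result, 200);
  } catch (err) {
    return returnResponse(err, 500);
  }
};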
See the example below to understand the result better.
function resolveAfter2Seconds() {
return new Promise(resolve => {
setTimeout(() => {
resolve('resolved');
}, 2000);
});
}
async function asyncCall() {
console.log('calling');
var result = await resolveAfter2Seconds().then(x=>console.log('inside then ', x));
console.log('after await ',result);
}
asyncCall();
And without .then(). Note that in the example above, result is undefined because the .then() callback only logs and returns nothing; here, result is 'resolved':
function resolveAfter2Seconds() {
return new Promise(resolve => {
setTimeout(() => {
resolve('resolved');
}, 2000);
});
}
async function asyncCall() {
console.log('calling');
var result = await resolveAfter2Seconds();
console.log('after await ',result);
}
asyncCall();
So it turns out I was using async/await correctly; I just had a 3-second timeout configured on the Lambda function, so it was exiting before it could get a response from the createCommit call.

How to setTimeout on async await call node

How can I add a setTimeout to my async await function call?
I have
request = await getProduct(productids[i]);
where
const getProduct = async productid => {
return requestPromise(url + productid);
};
I've tried
request = await setTimeout((getProduct(productids[i])), 5000);
and got the error TypeError: "callback" argument must be a function, which makes sense. The request is inside a loop, which is making me hit the rate limit on an API call.
exports.getProducts = async (req, res) => {
let request;
for (let i = 0; i <= productids.length - 1; i++) {
request = await getProduct(productids[i]);
//I want to wait 5 seconds before making another call in this loop!
}
};
You can use a simple little function that returns a promise that resolves after a delay:
function delay(t, val) {
return new Promise(function(resolve) {
setTimeout(function() {
resolve(val);
}, t);
});
}
// or a more condensed version
const delay = (t, val) => new Promise(resolve => setTimeout(resolve, t, val));
And then await that inside your loop:
exports.getProducts = async (req, res) => {
let request;
for (let id of productids) {
request = await getProduct(id);
await delay(5000);
}
};
Note: I also switched your for loop to use for/of, which is not required but is a bit cleaner than what you had.
Or, in modern versions of Node.js (v15 and later), you can use timersPromises.setTimeout(), a built-in timer that returns a promise:
const setTimeoutP = require('timers/promises').setTimeout;
exports.getProducts = async (req, res) => {
let request;
for (let id of productids) {
request = await getProduct(id);
await setTimeoutP(5000);
}
};
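Note that the promise-based setTimeout takes its arguments in the opposite order from the callback-based one: the delay comes first, and an optional second argument becomes the resolved value. For example:
const result = await setTimeoutP(1000, 'done'); // resolves with 'done' after ~1 second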
Actually, I have a pretty standard chunk of code that I use to do that:
function PromiseTimeout(delayms) {
return new Promise(function (resolve, reject) {
setTimeout(resolve, delayms);
});
}
Usage:
await PromiseTimeout(1000);
If you're using Bluebird promises, then it's built in as Promise.delay.
More to your problem: have you checked the API docs? Some APIs tell you how long to wait before the next request, or allow downloading data in larger batches.
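For example, here is a hedged sketch that waits as long as a Retry-After response header asks before the next request (the header name and its meaning are assumptions; check your API's actual contract). It uses the global fetch available in Node 18+:
const delay = ms => new Promise(resolve => setTimeout(resolve, ms));

async function getProductThrottled(url) {
  const res = await fetch(url);
  const retryAfter = res.headers.get('retry-after'); // seconds, if the API provides it
  if (retryAfter) {
    await delay(Number(retryAfter) * 1000);
  }
  return res.json();
}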
As of Node v15 you can use the Timers Promises API:
const timersPromises = require('timers/promises');
async function test() {
await timersPromises.setTimeout(1000);
}
test();
Note that this API was experimental when introduced in v15; it has since been stabilized in later Node versions.
Since Node 15, there is the new Timers Promises API, which lets you avoid building the wrapper yourself:
import {
setTimeout,
setImmediate,
setInterval,
} from 'timers/promises';
console.log('before')
await setTimeout(1000)
console.log('after 1 sec')
So for your issue, you could write it with an async iterator:
import {
setTimeout
} from 'timers/promises'
async function getProducts () {
const productids = [1, 2, 3]
for await (const product of processData(productids)) {
console.log(product)
}
}
async function * processData (productids) {
while (productids.length > 0) {
const id = productids.pop()
const product = { id }
yield product
await setTimeout(5000)
}
}
getProducts()
I have done an API delay test as below. It is possible to delay by busy-waiting instead of setTimeout, but be aware that this blocks the event loop for the whole duration, so no timers or I/O callbacks can run in the meantime:
function sleep(ms) {
  const wakeUpTime = Date.now() + ms;
  while (Date.now() < wakeUpTime) {} // busy-wait: blocks the event loop
}
const callAPI = async () => {
  // ... execute API logic
  sleep(2000); // synchronous pause; no await needed since sleep returns nothing
  // ... execute API logic
};
await callAPI();
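For example, this small sketch shows the blocking effect; the 100 ms timer cannot fire until the busy-wait returns:
setTimeout(() => console.log('timer fired'), 100); // scheduled for 100 ms
sleep(2000); // busy-waits, blocking the event loop for 2 seconds
console.log('done busy-waiting'); // logs first; 'timer fired' only appears afterwards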

Timeout in async/await

I'm using Node.js and TypeScript with async/await.
This is my test case:
async function doSomethingInSeries() {
const res1 = await callApi();
const res2 = await persistInDB(res1);
const res3 = await doHeavyComputation(res1);
return 'simple';
}
I'd like to set a timeout for the overall function. I.e. if res1 takes 2 seconds, res2 takes 0.5 seconds, and res3 takes 5 seconds, I'd like a timeout that throws an error after 3 seconds overall.
A normal setTimeout call is a problem because the error is thrown outside the async function's scope:
async function doSomethingInSeries() {
const timerId = setTimeout(function() {
throw new Error('timeout');
});
const res1 = await callApi();
const res2 = await persistInDB(res1);
const res3 = await doHeavyComputation(res1);
clearTimeout(timerId);
return 'simple';
}
And I cannot catch it with a normal Promise .catch():
doSomethingInSeries().catch(function(err) {
  // errors in res1, res2, res3 will be caught here,
  // but the error thrown from setTimeout will not!
});
Any ideas on how to resolve?
You can use Promise.race to make a timeout:
Promise.race([
doSomethingInSeries(),
new Promise((_, reject) => setTimeout(() => reject(new Error('timeout')), 11.5e3))
]).catch(function(err) {
// errors in res1, res2, res3 and the timeout will be caught here
})
You cannot use setTimeout without wrapping it in a promise.
Ok I found this way:
async function _doSomethingInSeries() {
const res1 = await callApi();
const res2 = await persistInDB(res1);
const res3 = await doHeavyComputation(res1);
return 'simple';
}
async function doSomethingInSeries(): Promise<any> {
let timeoutId;
const delay = new Promise(function(resolve, reject){
timeoutId = setTimeout(function(){
reject(new Error('timeout'));
}, 1000);
});
// overall timeout
return Promise.race([delay, _doSomethingInSeries()])
.then( (res) => {
clearTimeout(timeoutId);
return res;
});
}
Any errors?
The thing that smells a bit to me is that using Promises as the asynchronicity strategy makes us allocate many more objects than some other strategies would, but this is off-topic.
The problem with @Bergi's answer is that doSomethingInSeries continues executing even after you have already rejected the promise. It is much better to cancel it.
LATEST ANSWER
You can try using AbortController for that. Check the old answer below to see how to use it; the API is similar.
Keep in mind that the task is not cancelled immediately, so the continuation (await, then, or catch) is not called exactly at the timeout.
To guarantee that, you can combine this approach with @Bergi's.
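For illustration, a minimal sketch of the AbortController variant (this assumes doSomethingInSeries is rewritten to take a signal and check it between steps; callApi, persistInDB, and doHeavyComputation are the asker's functions):
const controller = new AbortController();
const timerId = setTimeout(() => controller.abort(), 5000);

async function doSomethingInSeries(signal) {
  if (signal.aborted) throw new Error('timeout');
  const res1 = await callApi();
  if (signal.aborted) throw new Error('timeout');
  const res2 = await persistInDB(res1);
  if (signal.aborted) throw new Error('timeout');
  const res3 = await doHeavyComputation(res1);
  return 'simple';
}

try {
  console.log(await doSomethingInSeries(controller.signal));
} finally {
  clearTimeout(timerId); // avoid aborting after the work has completed
}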
OLD ANSWER
This is how it should look like:
const doSomethingInSeries = async (cancellationToken) => {
cancellationToken.throwIfCancelled();
const res1 = await callApi();
cancellationToken.throwIfCancelled();
const res2 = await persistInDB(res1);
cancellationToken.throwIfCancelled();
const res3 = await doHeavyComputation(res1);
cancellationToken.throwIfCancelled();
return 'simple';
}
Here is simple implementation:
const makeCancellationToken = (tag) => {
let cancelled = false;
return {
isCancelled: () => cancelled,
cancel: () => {
cancelled = true;
},
throwIfCancelled: () => {
if (cancelled) {
const error = new Error(`${tag ?? 'Task'} cancelled`);
error.cancelled = true;
throw error;
}
}
}
}
And finally usage:
const cancellationToken = makeCancellationToken('doSomething')
setTimeout(cancellationToken.cancel, 5000);
try {
await doSomethingInSeries(cancellationToken);
} catch (error) {
if (error.cancelled) {
// handle cancellation
}
}
Keep in mind that the task is not cancelled immediately, so the continuation (await, then, or catch) is not called exactly after 5 seconds.
To guarantee that, you can combine this approach with @Bergi's, as sketched below:
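A minimal sketch of that combination, reusing makeCancellationToken from above: the race rejects the caller promptly at the timeout, while the token stops the task at its next checkpoint.
const cancellationToken = makeCancellationToken('doSomething');
const timeout = new Promise((_, reject) =>
  setTimeout(() => {
    cancellationToken.cancel(); // stop the task at its next checkpoint
    reject(new Error('timeout')); // reject the caller promptly
  }, 5000)
);

try {
  await Promise.race([doSomethingInSeries(cancellationToken), timeout]);
} catch (error) {
  if (error.cancelled || error.message === 'timeout') {
    // handle timeout / cancellation
  }
}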

Resources