For example, I am writing a random generator with crypto.randomBytes(...) along with other async functions. To avoid falling into callback hell, I thought I could use the sync version of crypto.randomBytes. My doubt is: if I do that, will my node program stop each time I execute the code? Then I thought that if there were a list of async functions whose run time is very short, these could work as synchronous functions, and developing with this list of functions would be easy.
Using the mz module you can make crypto.randomBytes() return a promise. Using await (available in Node 7.x using the --harmony flag) you can use it like this:
let crypto = require('mz/crypto');
async function x() {
let bytes = await crypto.randomBytes(4);
console.log(bytes);
}
x();
The above is nonblocking even though it looks like it's blocking.
For a better demonstration consider this example:
function timeout(time) {
return new Promise(res => setTimeout(res, time));
}
async function x() {
for (let i = 0; i < 10; i++) {
console.log('x', i);
await timeout(2000);
}
}
async function y() {
for (let i = 0; i < 10; i++) {
console.log('y', i);
await timeout(3000);
}
}
x();
y();
And note that those two functions take a lot of time to execute but they don't block each other.
Run it with Node 7.x using:
node --harmony script-name.js
Or with Node 8.x with:
node script-name.js
I show you these examples to demonstrate that it's not a choice between async with callback hell and sync with nice code. You can actually run async code in a very elegant manner using the new async functions and the await operator available in ES2017 - it's worth reading about them, because not a lot of people know about these features.
They're asynchronous, learn to deal with it.
Promises now, and in the future, ES2017's async and await will make your life a lot easier.
Bluebird's promisifyAll is extremely useful when dealing with any standard Node.js callback API. It adds functions suffixed with Async that return a promise instead of requiring a callback.
const Promise = require('bluebird')
const crypto = Promise.promisifyAll(require('crypto'))
function randomString() {
return crypto.randomBytesAsync(4).then(bytes => {
console.log('got bytes', bytes)
return bytes.toString('hex')
})
}
randomString()
.then(string => console.log('string is', string))
.catch(error => console.error(error))
Related
Building a node.js CLI application. Users should choose some tasks to run and, based on that, the tasks should run and then spinners (using the ora package) should show success and stop spinning.
The issue here is that the spinner succeeds while the tasks are still going on, which means it doesn't wait.
Tried the typical async/await approach, with an async function awaiting each function under its condition. Didn't work.
Tried using Promise.all. Didn't work.
Tried using waterfall. Same.
Here's the code of the task runner. I create an array of functions and pass it to waterfall (the async-waterfall package) or to the Promise.all() method.
const runner = async () => {
let tasks = [];
spinner.start('Running tasks');
if (syncOptions.includes('taskOne')) {
tasks.push(taskOne);
}
if (syncOptions.includes('taskTwo')) {
tasks.push(taskTwo);
}
if (syncOptions.includes('taskThree')) {
tasks.push(taskThree);
}
if (syncOptions.includes('taskFour')) {
tasks.push(taskFour);
}
// Option One
waterfall(tasks, () => {
spinner.succeed('Done');
});
// Option Two
Promise.all(tasks).then(() => {
spinner.succeed('Done');
});
};
Here's an example of one of the functions:
const os = require('os');
const fs = require('fs');
const homedir = os.homedir();
const outputDir = `${homedir}/output`;
const file = `${homedir}/.file`;
const targetFile = `${outputDir}/.file`;
module.exports = async () => {
await fs.writeFileSync(targetFile, fs.readFileSync(file));
};
I tried searching concepts. Talked to the best 5 people I know who can write JS properly. No clue. What am I doing wrong?
You don't show us all your code, but the first warning sign is that it doesn't appear you are actually running taskOne(), taskTwo(), etc...
You are pushing what look like functions into an array with code like:
tasks.push(taskFour);
And, then attempting to do:
Promise.all(tasks).then(...)
That won't do anything useful because the tasks themselves are never executed. To use Promise.all(), you need to pass it an array of promises, not an array of functions.
So, you would use:
tasks.push(taskFour());
and then:
Promise.all(tasks).then(...);
And, all this assumes that taskOne(), taskTwo(), etc... are functions that return a promise that resolves/rejects when their asynchronous operation is complete.
In addition, you also need to either await Promise.all(...) or return Promise.all() so that the caller will be able to know when they are all done. Since this is the last line of your function, I'd generally just use return Promise.all(...) and this will let the caller get the resolved results from all the tasks (if those are relevant).
Also, this doesn't make much sense:
module.exports = async () => {
await fs.writeFileSync(targetFile, fs.readFileSync(file));
};
You're using two synchronous file operations. They are not asynchronous and do not use promises so there's no reason to put them in an async function or to use await with them. You're mixing two models incorrectly. If you want them to be synchronous, then you can just do this:
module.exports = () => {
fs.writeFileSync(targetFile, fs.readFileSync(file));
};
If you want them to be asynchronous and return a promise, then you can do this:
module.exports = async () => {
return fs.promises.writeFile(targetFile, await fs.promises.readFile(file));
};
Your implementation was attempting to be half and half. Pick one architecture or the other (synchronous or asynchronous) and be consistent in the implementation.
FYI, the fs module now has multiple versions of fs.copyFile() so you could also use that and let it do the copying for you. If this file was large, copyFile() would likely use less memory in doing so.
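For illustration, here is a minimal sketch of the question's task module rewritten around the promise-based copy (fs.promises.copyFile, available in recent Node versions); the paths are the ones from the question's code:
const fs = require('fs');
const os = require('os');

const homedir = os.homedir();
const file = `${homedir}/.file`;
const targetFile = `${homedir}/output/.file`;

module.exports = async () => {
  // let Node do the copy itself; avoids buffering the whole file in JS
  await fs.promises.copyFile(file, targetFile);
};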
As for your use of waterfall(), it is probably not necessary here, and waterfall uses a very different calling model than Promise.all(), so you certainly can't pass it the same array you give Promise.all(). Also, waterfall() runs your functions in sequence (one after the other), and you pass it an array of functions that have their own calling convention, as sketched below.
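For reference, a hedged sketch of that convention, assuming the async library's waterfall (the standalone async-waterfall package follows the same pattern): each task receives the previous task's results plus a callback, and any error short-circuits to the final handler:
const waterfall = require('async/waterfall');

waterfall([
  cb => cb(null, 1),                // pass 1 to the next task
  (n, cb) => cb(null, n + 1),       // receives 1, passes 2
  (n, cb) => cb(null, `got ${n}`)   // receives 2
], (err, result) => {
  if (err) return console.error(err);
  console.log(result);              // "got 2"
});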
So, assuming that taskOne, taskTwo, etc... are functions that return a promise that resolves/rejects when their asynchronous operations are done, then you would do this:
const runner = () => {
let tasks = [];
spinner.start('Running tasks');
if (syncOptions.includes('taskOne')) {
tasks.push(taskOne());
}
if (syncOptions.includes('taskTwo')) {
tasks.push(taskTwo());
}
if (syncOptions.includes('taskThree')) {
tasks.push(taskThree());
}
if (syncOptions.includes('taskFour')) {
tasks.push(taskFour());
}
return Promise.all(tasks).then(() => {
spinner.succeed('Done');
});
};
This would run the tasks in parallel.
If you want to run the tasks in sequence (one after the other), then you would do this:
const runner = async () => {
spinner.start('Running tasks');
if (syncOptions.includes('taskOne')) {
await taskOne();
}
if (syncOptions.includes('taskTwo')) {
await taskTwo();
}
if (syncOptions.includes('taskThree')) {
await taskThree();
}
if (syncOptions.includes('taskFour')) {
await taskFour();
}
spinner.succeed('Done');
};
I am looking for an answer on what to use in my Node.js app.
I have code which handles my generic DB access to MSSQL. This code is written using async functions, and I call it with a promise, and all works fine.
As my app is getting bigger and the code larger, I am planning to move some of the logic into functions and then call them.
So my question is: is there a drawback to using a mix of async/await and promises, or does it really not matter?
Async/await makes it easier to write more readable code, as I have to read from and write to multiple DBs before I return something, and I need the results of some of these.
So the question is: what is the better approach?
The DB layer uses async/await; that's set and can't change.
For the logic layer, async/await would let me await the function call, whereas if I go with promises for the logic then I am stuck with promises on the function call.
So I hope someone can give me more insight into whether one has more advantages than the other, besides being able to write cleaner code.
async/await and promises are closely related. async functions return promises, and await is syntactic sugar for waiting for a promise to be resolved.
The only drawback from having a mix of promises and async functions might be readability and maintainability of the code, but you can certainly use the return value of async functions as promises as well as await for regular functions that return a promise.
Whether you choose one vs the other mostly depends on availability (does your node.js / browser support async?) and on your aesthetic preference, but a good rule of thumb (based on my own preference at the time of writing) could be:
If you need to run asynchronous code in series: consider using async/await:
return asyncFunction()
.then(result => f1(result))
.then(result2 => f2(result2));
vs
const result = await asyncFunction();
const result2 = await f1(result);
return await f2(result2);
If you need nested promises: use async/await:
return asyncFunction()
.then(result => {
return f1(result)
.then(result2 => f2(result, result2));
})
vs
const result = await asyncFunction();
const result2 = await f1(result);
return await f2(result, result2);
If you need to run it in parallel: use promises.
return Promise.all(arrayOfIDs.map(id => asyncFn(id)))
It has been suggested you can use await within an expression to await multiple tasks like so:
*Note: this still awaits in sequence from left to right, which is OK if you don't expect errors. Otherwise the behaviour differs because of the fail-fast behaviour of Promise.all().
const [r1, r2, r3] = [await task1, await task2, await task3];
(async function() {
function t1(t) {
console.time(`task ${t}`);
console.log(`start task ${t}`);
return new Promise((resolve, reject) => {
setTimeout(() => {
console.timeEnd(`task ${t}`);
resolve();
}, t);
})
}
console.log('Create Promises');
const task1 = t1(100);
const task2 = t1(200);
const task3 = t1(10);
console.log('Await for each task');
const [r1, r2, r3] = [await task1, await task2, await task3];
console.log('Done');
}())
But as with Promise.all, the parallel promises need to be properly handled in case of an error. You can read more about that here.
Be careful not to confuse the previous code with the following:
let [r1, r2] = [await t1(100), await t1(200)];
function t1(t) {
console.time(`task ${t}`);
console.log(`start task ${t}`);
return new Promise((resolve, reject) => {
setTimeout(() => {
console.timeEnd(`task ${t}`);
resolve();
}, t);
})
}
console.log('Promise');
Promise.all([t1(100), t1(200), t1(10)]).then(async() => {
console.log('Await');
let [r1, r2, r3] = [await t1(100), await t1(200), await t1(10)]
});
These two methods are not equivalent. Read more about the difference.
In the end, Promise.all is a cleaner approach that scales better to an arbitrary number of tasks.
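To make that concrete, here is a minimal sketch of Promise.all over an arbitrary task list, assuming tasks is an array of promise-returning functions:
async function runAll(tasks) {
  // start every task at once, then wait; rejects as soon as any task fails
  const results = await Promise.all(tasks.map(task => task()));
  return results;
}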
Actually it depends on your Node version, but if you can use async/await then your code will be more readable and easier to maintain.
When you define a function as async it returns a native Promise, and when you call it using await, execution waits for that promise to settle, just as a Promise.then handler would.
Note:
Put your await calls inside a try/catch, because if the promise rejects, control jumps to the catch block, where you can handle the error.
try {
  let res1 = await yourAsyncFunction(parameters);
  let res2 = await yourPromiseFunction(parameters);
  await yourAsyncOrPromiseFunction(parameters);
}
catch (ex) {
  // your error handler goes here
  // the error comes from whichever called function rejected its promise
  // this method breaks your call chain
}
You can also handle the 'catch' like this:
let result = await yourAsyncFunction(parameters).catch((error) => { /* your error handler goes here */ });
This method does not produce an exception, so execution goes on.
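A hedged sketch of that consequence, inside some async function (fetchData is a hypothetical promise-returning function): on rejection, result is undefined and execution continues, so a guard is usually needed:
const result = await fetchData().catch(error => console.error(error));
if (result === undefined) return; // the rejection was already handled above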
I do not think there is any performance difference between async/await and promises, beyond the native Promise implementation itself.
I would suggest using the bluebird module instead of the native Promise built into Node.
At this point the only reason to use promises directly is to start multiple asynchronous jobs in parallel with Promise.all(); otherwise you're usually better off with async/await or Observables.
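That said, the two compose cleanly; a minimal sketch (getUser and getOrders are hypothetical promise-returning functions) where only the parallel fan-out itself touches the Promise API:
async function loadDashboard(id) {
  // run both requests in parallel, then await both results
  const [user, orders] = await Promise.all([getUser(id), getOrders(id)]);
  return { user, orders };
}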
It depends on which approach you are comfortable with; both promises and async/await are good. But if you want to write asynchronous code using a synchronous code structure, you should use the async/await approach. Like the following example, a function returns a user in both the promise and the async/await style.
If we use a promise:
function getFirstUser() {
return getUsers().then(function(users) {
return users[0].name;
}).catch(function(err) {
return {
name: 'default user'
};
});
}
If we use async/await:
async function getFirstUser() {
try {
let users = await getUsers();
return users[0].name;
} catch (err) {
return {
name: 'default user'
};
}
}
Here in the promise approach we need a thenable structure to follow, while in the async/await approach we use 'await' to hold the execution of the asynchronous function.
You can check out this link for more clarity: https://medium.com/@bluepnume/learn-about-promises-before-you-start-using-async-await-eb148164a9c8
Yesterday I made a tentative decision to switch from using Promises to using Async/Await, independent of nodejs, based on the difficulty in accessing previous values in the Promise chain. I did come up with a compact solution using 'bind' to save values inside the 'then' functions, but Async seemed much nicer (and it was) in allowing direct access to local variables and arguments. And the more obvious advantage of Async/Await is, of course, the elimination of the distracting explicit 'then' functions in favor of a linear notation that looks much like ordinary function calls.
However, my reading today uncovered problems with Async/Await, which derail my decision. I think I'll stick with Promises (possibly using a macro preprocessor to make the 'then' functions look simpler) until Async/Await gets fixed, a few years from now.
Here are the problems I found. I'd love to find out that I am wrong, that these have easy solutions.
Requires an outer try/catch or a final Promise.catch(), otherwise errors and exceptions are lost (see the sketch after this list).
A final await requires either a Promise.then() or an extra outer async function.
Iteration can only be properly done with for/of, not with other iterators.
Await can only wait for one Promise at a time, not parallel Promises like Promise chains with Promise.all.
Await doesn't support Promise.race(), should it be needed.
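As a minimal sketch of the first problem, using nothing beyond standard Node: a rejection inside an async function is lost unless some caller attaches a handler:
async function mightFail() {
  throw new Error('boom');
}

mightFail();                       // rejection escapes: an unhandled-rejection warning, or a crash in newer Node
mightFail().catch(console.error);  // an explicit outer handler is required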
Below is code:
var fs = require('fs')
for(let i=0;i<6551200;i++){
fs.appendFile('file',i,function(err){
})
}
When I run this code, after a few seconds, it shows:
FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory
and yet there is nothing in the file!
My questions are:
Why is there nothing in the file?
What causes the out-of-memory error?
How can I write to a file asynchronously in a for loop, no matter how many writes there are?
Thanks in advance.
Bottom line here is that fs.appendFile() is an asynchronous call and you simply are not "awaiting" that call to complete on each loop iteration. This has a number of consequences, including but not limited to:
The callbacks keep getting allocated before they are resolved, which results in the "heap out of memory" eventually being reached.
You are contending for a file handle, since the function you are employing actually opens/writes/closes the given file, and if you don't wait for each turn to do so, the operations will simply clash.
So the simple solution here is to "wait", and some modern syntax sugar makes that easy:
const fs = require('mz/fs');
const x = 6551200;
(async function() {
try {
const fd = await fs.open('file','w');
for (let i = 0; i < x; i++) {
await fs.write(fd, `${i}\n`);
}
await fs.close(fd);
} catch(e) {
console.error(e)
} finally {
process.exit();
}
})()
That will of course take a while, but it's not going to "blow up" your system whilst it does its work.
The very first simplified thing is to just get hold of the mz library, which already wraps common nodejs libraries with modernized versions of each function supporting promises. This will help clean up the syntax a lot as opposed to using callbacks.
The next thing to realize is what was mentioned about fs.appendFile(): it is "opening/writing/closing" all in one call. That's not great, so what you would typically do is simply open, then write the bytes in a loop, and when that is complete, close the file handle.
That "sugar" comes in modern versions, and though "possible" with plain promise chaining, it's still not really that manageable. So if you don't actually have a nodejs environment that supports that async/await sugar or the tools to "transpile" such code, then you might alternately consider using the asyncjs libary with plain callbacks:
const Async = require('async');
const fs = require('fs');
const x = 6551200;
let i = 0;
fs.open('file','w',(err,fd) => {
if (err) throw err;
Async.whilst(
() => i < x,
callback => fs.write(fd,`${i}\n`,err => {
i++;
callback(err)
}),
err => {
if (err) throw err;
fs.closeSync(fd);
process.exit();
}
);
});
The same basic principle applies, as we are "waiting" for each callback to complete before continuing. The whilst() helper here allows iteration until the test condition is met, and of course does not start the next iteration until data is passed to the callback of the iterator itself.
There are other ways to approach this, but those are probably the two most sane for a "large loop" of iterations. Common approaches such as "chaining" via .reduce() are really more suited to a "reasonably" sized array of data you already have, and building arrays of such sizes here has inherent problems of its own.
For instance, the following "works" (on my machine at least), but it really consumes a lot of resources to do it:
const fs = require('mz/fs');
const x = 6551200;
fs.open('file','w')
.then( fd =>
[ ...Array(x)].reduce(
(p,e,i) => p.then( () => fs.write(fd,`${i}\n`) )
, Promise.resolve()
)
.then(() => fs.close(fd))
)
.catch(e => console.error(e) )
.then(() => process.exit());
So that's really not that practical: essentially building such a large chain in memory and then allowing it to resolve. You could put some "governance" on this, but the two main approaches as shown are a lot more straightforward.
For that case, then, you either have the async/await sugar available, as it is within current LTS versions of Node (LTS 8.x), or I would stick with the other tried and true "async helpers" for callbacks where you are restricted to a version without that support.
You can of course "promisify" any function with the last few releases of nodejs right "out of the box" as it were, since Promise has been a global for some time:
const fs = require('fs');
await new Promise((resolve, reject) => fs.open('file', 'w', (err, fd) => {
  if (err) return reject(err);
  resolve(fd);
}));
So there really is no need to import libraries just to do that, but the mz library given as an example here does all of that for you. So it's really up to personal preference on bringing in additional dependencies.
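For completeness, since Node 8 the standard library also ships util.promisify, which does the same wrapping without any extra dependency; a minimal sketch:
const { promisify } = require('util');
const fs = require('fs');

const openAsync = promisify(fs.open);

(async () => {
  const fd = await openAsync('file', 'w'); // same effect as the manual wrapper above
  // ... write in a loop, then close fd
})();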
JavaScript is single threaded, which means your code can execute only one function at a time. Asynchronous calls don't run in parallel with your code; their callbacks are queued to be executed later.
So in your code, you are queuing 6551200 calls, which of course crashes your app before "appendFile" starts working on any of them.
You can achieve what you want by splitting your loop into smaller loops, using async and await, or using iterators.
If what you are trying to achieve is as simple as your code suggests, you can use the following:
const fs = require("fs");
function SomeTask(i = 0){
  fs.appendFile('file', `${i}`, function(err){
    // err in the write function
    if (err) console.log("Error", err);
    // check if you want to continue (loop); note i + 1, otherwise it never advances
    if (i < 6551200) return SomeTask(i + 1);
    // on finish
    console.log("done");
  });
}
SomeTask();
In the above code, you write a single line, and when that is done, you call the next one.
This function is just for basic usage; it needs a refactor and the use of JavaScript iterators for advanced usage. Check out Iterators and generators on the MDN web docs.
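As a hedged sketch of that kind of refactor: a generator supplies the lines while a small driver keeps exactly one appendFile pending at a time (the recursive call happens inside the asynchronous callback, so the stack does not grow):
const fs = require('fs');

function* lines(max) {
  for (let i = 0; i < max; i++) yield `${i}\n`;
}

function drain(iterator) {
  const { value, done } = iterator.next();
  if (done) return console.log('done');
  fs.appendFile('file', value, err => {
    if (err) throw err;
    drain(iterator); // continue only after the previous write has finished
  });
}

drain(lines(6551200));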
1 - The file is empty because none of the fs.appendFile calls ever finished; the Node.js process broke before they could.
2 - The Node.js heap memory is limited and stores each pending callback until it returns, not only the "i" variable.
3 - You could try to use promises to do that.
"use strict";
const Bluebird = require('bluebird');
const fs = Bluebird.promisifyAll(require('fs'));
let promises = [];
for (let i = 0; i < 6551200; i++){
  promises.push(fs.appendFileAsync('file', i + '\n'));
}
Bluebird.all(promises)
.then(data => {
console.log(data, 'End.');
})
.catch(e => console.error(e));
But no logic can avoid a heap memory error for a loop this big. You could increase the Node.js heap memory or, the more reasonable way, process chunks of data on an interval:
'use strict';
const fs = require('fs');
let total = 6551200;
let interval = setInterval(() => {
fs.appendFile('file', total + '\n', () => {});
total--;
if (total < 1) {
clearInterval(interval);
}
}, 1);
Can I make a synchronous method asynchronous by using a promise?
For example, reading a file synchronously (yes, there is fs.readFile which takes a callback):
// Synchronous read
var data = fs.readFileSync('input.txt');
Should I do this:
function readFileAsync(){
return new Promise((resolve, reject) => {
try {
resolve(fs.readFileSync('input.txt'));
} catch(err) {
reject(err);
}
})
}
or use async/await:
async function readFileAsync(){
  try {
    let result = await fs.readFileSync('input.txt');
    return result;
  } catch(err) {
    return err;
  }
}
TL;DR: NO, purely synchronous functions are not promisifiable in a way that avoids blocking.
No. For a method to be promisifiable it needs to be already asynchronous, i.e. return immediately, and also use a callback upon finishing.
For example:
function loop1000() {
for (let i = 0; i < 1000; ++i) {}
}
Is not promisifiable because it does not return immediately and does not use callbacks. But
function loop1000(callback) {
process.nextTick(() => {
for (let i = 0; i < 1000; ++i) { }
callback();
});
}
Is promisifiable as
function loop1000promisified() {
return new Promise((resolve, reject) => loop1000(resolve));
}
BUT all those approaches are going to block on the loop anyway. The original version blocks immediately, and the one using process.nextTick() will block on the next processor tick, making the application unresponsive for the duration of the loop.
If you wanted to make loop1000() asynchronous friendly you could rewrite it as:
function loop1000(callback) {
const segmentDuration = 10;
const loopEnd = 1000;
let i = 0;
function computeSegment() {
for (let segment = 0;
segment < segmentDuration && i < loopEnd;
++segment, ++i) { }
if (i == loopEnd) {
callback();
return;
}
process.nextTick(computeSegment);
}
computeSegment();
}
So instead of one long blocking period there would be several smaller ones. Then the promisified version loop1000promisified() could make some sense.
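A short usage sketch under that assumption: the chunked loop1000 promisifies exactly as shown above, and awaiting it lets other work interleave between segments:
(async () => {
  await loop1000promisified(); // yields to the event loop between segments
  console.log('loop finished');
})();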
disclaimer: code typed directly on SO w/o any test.
Can I make a synchronous method asynchronous by using a promise?
No.
Can I make a synchronous method asynchronous at all?
No. That's why promises don't help here. You need to use the natively asynchronous counterpart, i.e. fs.readFile instead of fs.readFileSync in your case.
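A minimal sketch of that counterpart; with recent Node you can use the promise-based fs API directly instead of wrapping the callback form yourself:
const fs = require('fs');

async function readInput() {
  // truly asynchronous: the IO overlaps with other work
  return fs.promises.readFile('input.txt');
}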
Regarding your alternatives, you probably should do neither. But if you absolutely need a synchronous function that returns a fulfilled or rejected promise (instead of throwing exceptions), you can do
function readFileSync(){
return new Promise(resolve => {
resolve(fs.readFileSync('input.txt'))
});
}
or
async function readFileSync() {
return fs.readFileSync('input.txt');
}
I would re-phrase the other two answers from "No" to "Not Really".
First a point of clarification: In NodeJS, everything is asynchronous, except your code. Specifically, one bit of your code will never run in parallel with another bit of your code -- but the NodeJS runtime may manage other tasks (namely IO) at the same time your code is being executed.
The beauty of functions like fs.readFile is that the IO happens in parallel with your code. For example:
fs.readFile("some/file",
function(err,data){console.log("done reading file (or failed)")});
do.some("work");
The second line of code will be executed while NodeJS is busily reading the file into memory. The problem with fs.readFileSync is that when you call it, NodeJS stops evaluating your code (all of it!) until the IO is done (i.e. the file has been read into memory, in this case). So if you mean to ask "can you take a blocking (presumably IO) function and make it non-blocking using promises?", the answer is definitely "no".
Can you use promises to control the order in which a blocking function is called? Of course. Promises are just a fancy way of declaring the order in which callbacks are called -- but everything you can do with a promise, you can also do with setImmediate() (albeit with a lot less clarity and a lot more effort).
I would disagree slightly with the others who say you should never promisify your function. There ARE cases when you want to promisify a function, for example a legacy code base that uses native processes and similar, where no callbacks and no promises were used, but you can assume the function is async and will execute within a certain time.
Instead of writing a ton of setTimeout() callbacks you want to use promises.
This is how I do it for testing purposes. Check the Ph helper library below, especially the promisify function, and see how it is used to set up the mocha test in the before function.
// Initial state
var foo = 1;
var xml = "";
// Promise helper library
var Ph = (function(){
return {
delay: function (milis){
var milis = milis || 200;
return function(){
return new Promise(function(resolve, reject){
setTimeout(function(){
resolve();
}, milis)
})
}
},
promisify: function(syncFunc){
  // return a thunk (like delay above) so the work is deferred until
  // the promise chain actually reaches this step
  return function(){
    return new Promise(function(resolve, reject){
      syncFunc();
      resolve();
    });
  };
}
}
}());
// 'Synchronous' functions to promisify
function setXML(){
console.log("setting XML");
xml = "<bar>";
}
function setVars(){
console.log("setting Vars");
foo = 2;
}
// Test setup
before(function(done) {
this.timeout(0);
Promise.resolve()
.then(Ph.promisify(setXML))
.then(Ph.delay(3000))
.then(Ph.promisify(setVars))
.then(Ph.delay(3000))
.then(function(){
done();
})
});
// Test assertions
describe("Async setup", function(done){
it("should have XML set", function(done){
expect(xml).to.be.not.equal("");
done();
});
it("should have foo not equal 1.", function(done){
expect(foo).to.be.not.equal(1);
done();
});
it("should have foo equal to 2.", function(done){
expect(foo).to.be.equal(2);
done();
});
});
To make it work in IE, I use Promise CDN:
<script src="https://cdnjs.cloudflare.com/ajax/libs/es6-promise/4.1.1/es6-promise.auto.min.js"></script>
I am trying to do a for loop in a promise, but unfortunately the output is not what I am expecting:
My code
var ahaha = function mytestfunction(numb){
return new Promise(function(resolve, reject) {
console.log(numb);
return resolve('test');
})
.then(function(x) {
z='firststep' + x ;
console.log(z);
return z;
})
.then(function(z) {
console.log(z + 'secondstep');
return z
})
.then(function(z) {
console.log(z + 'thirdstep')
});
};
var promises = [];
for (var i = 1; i <= 2; i++) {
promises.push(ahaha(i));
}
Promise.all(promises)
.then(function(data){ console.log("everything is finished")});
What it returns is :
1
2
firststeptest
firststeptest
firststeptestsecondstep
firststeptestsecondstep
firststeptestthirdstep
firststeptestthirdstep
everything is finished
But i want it to return
1
firststeptest
firststeptestsecondstep
firststeptestthirdstep
2
firststeptest
firststeptestsecondstep
firststeptestthirdstep
everything is finished
I don't understand why the promises are not chained one after the other.
Note that I succeeded in doing this operation using async.waterfall, but I also want to know how to do it with promises.
Thanks a lot
Promise.all() is purposely used when you want things to run in parallel and it will tell you when all are done and they may each finish in any order.
There are lots of different ways to sequence things using promises. If you just have two function calls like your code shows, you can just do them manually:
ahaha(1).then(result => ahaha(2)).then(data => {
console.log("everything finished");
});
Or, a common pattern using .reduce():
[1,2].reduce((p, val) => {
return p.then(() => ahaha(val));
}, Promise.resolve()).then(data => {
// everything done here
});
Or, my favorite using the Bluebird promise library:
Promise.mapSeries([1,2], ahaha).then(result => {
// everything done here
});
There are many other schemes, which you can see in these other answers:
How to synchronize a sequence of promises?
JavaScript: Perform a chain of promises synchronously
ES6 Promises - something like async.each?
How can I execute shell commands in sequence?
Can Promise load multi urls in order?
Promise.all is used to run the promises in parallel, not sequentially.
Using the popular Bluebird library, you could have used its reduce, but there's no equivalent function in standard ES6. You can, however, do this (note that the array must hold promise-returning functions, e.g. () => ahaha(1), not promises that have already started running):
tasks.reduce(
  (p, next) => p.then(next),
  Promise.resolve()
).then(data => { console.log("everything is finished"); });