Trigger the execution of a function if any condition is met - node.js

I'm writing an HTTP API with expressjs in Node.js and here is what I'm trying to achieve:
I have a recurring task that I would like to run approximately every minute. This task is implemented as an async function named task.
In reaction to a call to my API, I would like that task to be run immediately as well.
Two executions of the task function must not be concurrent. Each execution should run to completion before another execution is started.
The code looks like this:
// only a single execution of this function is allowed at a time
// which is not the case with the current code
async function task(reason: string) {
console.log("do thing because %s...", reason);
await sleep(1000);
console.log("done");
}
// call task regularly
setIntervalAsync(async () => {
await task("ticker");
}, 5000) // normally 1min
// call task immediately
app.get("/task", async (req, res) => {
await task("trigger");
res.send("ok");
});
I've put a full working sample project at https://github.com/piec/question.js
In Go this would be easy to do, but I don't know how to do it with Node.js.
Ideas I have considered or tried:
I could apparently put task in a critical section, using a mutex from the async-mutex library, but I'm not too fond of adding mutexes to JS code (a sketch of what that would look like is at the end of this question).
Many people seem to use message-queue libraries with worker processes (bee-queue, bullmq, ...), but this usually adds a dependency on an external service such as Redis. Also, if I'm correct, the code would be a bit more complex because I would need a main entrypoint plus an entrypoint for the worker processes, and you can't share objects with workers as easily as in a "normal" single-process situation.
I have tried an RxJS Subject in order to make a producer/consumer channel, but I was not able to limit the execution of task to one at a time (task is async).
Thank you!
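For reference, here is roughly what the async-mutex idea from the list above would look like (a sketch assuming the library's runExclusive API; not tested against my repo):
const { Mutex } = require("async-mutex");
const taskMutex = new Mutex();

async function task(reason) {
    // runExclusive queues callers so only one body runs at a time
    await taskMutex.runExclusive(async () => {
        console.log("do thing because %s...", reason);
        await sleep(1000);
        console.log("done");
    });
}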

You can make your own serialized asynchronous queue and run the tasks through that.
This queue uses a flag to keep track of whether it's in the middle of running an asynchronous operation already. If so, it just adds the task to the queue and will run it when the current operation is done. If not, it runs it now. Adding it to the queue returns a promise so the caller can know when the task finally got to run.
If the tasks are asynchronous, they are required to return a promise that is linked to the asynchronous activity. You can mix in non-asynchronous tasks too and they will also be serialized.
class SerializedAsyncQueue {
constructor() {
this.tasks = [];
this.inProcess = false;
}
// adds a promise-returning function and its args to the queue
// returns a promise that resolves when the function finally gets to run
add(fn, ...args) {
let d = new Deferred();
this.tasks.push({ fn, args, deferred: d });
this.check();
return d.promise;
}
check() {
if (!this.inProcess && this.tasks.length) {
// run next task
this.inProcess = true;
const nextTask = this.tasks.shift();
Promise.resolve(nextTask.fn(...nextTask.args)).then(val => {
this.inProcess = false;
nextTask.deferred.resolve(val);
this.check();
}).catch(err => {
console.log(err);
this.inProcess = false;
nextTask.deferred.reject(err);
this.check();
});
}
}
}
const Deferred = function() {
if (!(this instanceof Deferred)) {
return new Deferred();
}
const p = this.promise = new Promise((resolve, reject) => {
this.resolve = resolve;
this.reject = reject;
});
this.then = p.then.bind(p);
this.catch = p.catch.bind(p);
if (p.finally) {
this.finally = p.finally.bind(p);
}
}
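// Aside (not part of the original answer): newer runtimes (ES2024, Node 22+)
// ship a built-in equivalent of this Deferred helper:
//   const { promise, resolve, reject } = Promise.withResolvers();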
let queue = new SerializedAsyncQueue();
// utility function
const sleep = function(t) {
return new Promise(resolve => {
setTimeout(resolve, t);
});
}
// only a single execution of this function is allowed at a time
// so it is run only via the queue that makes sure it is serialized
async function task(reason: string) {
async function runIt() {
console.log("do thing because %s...", reason);
await sleep(1000);
console.log("done");
}
return queue.add(runIt);
}
// call task regularly
setIntervalAsync(async () => {
await task("ticker");
}, 5000) // normally 1min
// call task immediately
app.get("/task", async (req, res) => {
await task("trigger");
res.send("ok");
});

Here's a version using an RxJS Subject that is almost working. How to finish it depends on your use case.
async function task(reason: string) {
console.log("do thing because %s...", reason);
await sleep(1000);
console.log("done");
}
const run = new Subject<string>();
const effect$ = run.pipe(
// Limit one task at a time
concatMap(task),
share()
);
const effectSub = effect$.subscribe();
interval(5000).subscribe(_ =>
run.next("ticker")
);
// call task immediately
app.get("/task", async (req, res) => {
effect$.pipe(
take(1)
).subscribe(_ =>
res.send("ok")
);
run.next("trigger");
});
The issue here is that res.send("ok") is linked to the effect$ stream's next emission. This may not be the one generated by the run.next you're about to call.
There are many ways to fix this. For example, you can tag each emission with an ID and then wait for the corresponding emission before using res.send("ok").
There are better ways too if calls distinguish themselves naturally.
A Clunky ID Version
Generating an ID randomly is a bad idea, but it gets the general thrust across. You can generate unique IDs however you like. They can be integrated directly into the task somehow or can be kept 100% separate the way they are here (task itself has no knowledge that it's been assigned an ID before being run).
interface IdTask {
taskId: number,
reason: string
}
interface IdResponse {
taskId: number,
response: any
}
async function task(reason: string) {
console.log("do thing because %s...", reason);
await sleep(1000);
console.log("done");
}
const run = new Subject<IdTask>();
const effect$: Observable<IdResponse> = run.pipe(
// concatMap only allows one observable at a time to run
concatMap((eTask: IdTask) => from(task(eTask.reason)).pipe(
map((response:any) => ({
taskId: eTask.taskId,
response
})as IdResponse)
)),
share()
);
const effectSub = effect$.subscribe({
next: v => console.log("This is a shared task emission: ", v)
});
interval(5000).subscribe(num =>
run.next({
taskId: num,
reason: "ticker"
})
);
// call task immediately
app.get("/task", async (req, res) => {
const randomId = Math.random();
effect$.pipe(
filter(({taskId}) => taskId == randomId),
take(1)
).subscribe(_ =>
res.send("ok")
);
run.next({
taskId: randomId,
reason: "trigger"
});
});

Related

Resolve promises one after another (i.e. in sequence)?

Consider the following code that reads an array of files in a serial/sequential manner. readFiles returns a promise, which is resolved only once all files have been read in sequence.
var readFile = function(file) {
... // Returns a promise.
};
var readFiles = function(files) {
return new Promise((resolve, reject) => {
var readSequential = function(index) {
if (index >= files.length) {
resolve();
} else {
readFile(files[index]).then(function() {
readSequential(index + 1);
}).catch(reject);
}
};
readSequential(0); // Start with the first file!
});
};
The above code works, but I don't like having to do recursion for things to occur sequentially. Is there a simpler way that this code can be re-written so that I don't have to use my weird readSequential function?
Originally I tried to use Promise.all, but that caused all of the readFile calls to happen concurrently, which is not what I want:
var readFiles = function(files) {
return Promise.all(files.map(function(file) {
return readFile(file);
}));
};
Update 2017: I would use an async function if the environment supports it:
async function readFiles(files) {
for(const file of files) {
await readFile(file);
}
};
If you'd like, you can defer reading the files until you need them using an async generator (if your environment supports it):
async function* readFiles(files) {
for(const file of files) {
yield await readFile(file);
}
};
Update: On second thought, I might use a for loop instead:
var readFiles = function(files) {
var p = Promise.resolve(); // Q() in q
files.forEach(file => {
    p = p.then(() => readFile(file));
});
return p;
};
Or more compactly, with reduce:
var readFiles = function(files) {
return files.reduce((p, file) => {
return p.then(() => readFile(file));
}, Promise.resolve()); // initial
};
In other promise libraries (like when and Bluebird) you have utility methods for this.
For example, Bluebird would be:
var Promise = require("bluebird");
var fs = Promise.promisifyAll(require("fs"));
var readAll = Promise.resolve(files).map(fs.readFileAsync,{concurrency: 1 });
// if the order matters, you can use Promise.each instead and omit concurrency param
readAll.then(function(allFileContents){
// do stuff to read files.
});
Although there is really no reason not to use async await today.
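To make that concrete, here is a minimal async/await helper that runs sequentially and also collects the results in order (mapSeries is just an illustrative name):
async function mapSeries(items, fn) {
    const results = [];
    for (const item of items) {
        // each call starts only after the previous one has resolved
        results.push(await fn(item));
    }
    return results;
}
// usage: const contents = await mapSeries(files, readFile);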
Here is how I prefer to run tasks in series.
function runSerial() {
var that = this;
// task1 is a function that returns a promise (and immediately starts executing)
// task2 is a function that returns a promise (and immediately starts executing)
return Promise.resolve()
.then(function() {
return that.task1();
})
.then(function() {
return that.task2();
})
.then(function() {
console.log(" ---- done ----");
});
}
What about cases with more tasks? Like, 10?
function runSerial(tasks) {
var result = Promise.resolve();
tasks.forEach(task => {
result = result.then(() => task());
});
return result;
}
This question is old, but we live in a world of ES6 and functional JavaScript, so let's see how we can improve.
Because promises start executing immediately upon creation, we can't just create an array of promises; they would all fire off in parallel.
Instead, we need to create an array of functions that each return a promise. Each function is then executed sequentially, which starts the promise inside.
We can solve this a few ways, but my favorite way is to use reduce.
It gets a little tricky using reduce in combination with promises, so I have broken down the one liner into some smaller digestible bites below.
The essence of this function is to use reduce starting with an initial value of Promise.resolve([]), or a promise containing an empty array.
This promise is then passed into the reduce callback as promise. This is the key to chaining each promise together sequentially: the next promise to execute comes from func, and when its then fires, the results are concatenated and the resulting promise is returned, continuing the reduce cycle with the next promise function.
Once all promises have executed, the returned promise will contain an array of all the results of each promise.
ES6 Example (one liner)
/*
* serial executes Promises sequentially.
* @param {funcs} An array of funcs that return promises.
* @example
* const urls = ['/url1', '/url2', '/url3']
* serial(urls.map(url => () => $.ajax(url)))
* .then(console.log.bind(console))
*/
const serial = funcs =>
funcs.reduce((promise, func) =>
promise.then(result => func().then(Array.prototype.concat.bind(result))), Promise.resolve([]))
ES6 Example (broken down)
// broken down to for easier understanding
const concat = list => Array.prototype.concat.bind(list)
const promiseConcat = f => x => f().then(concat(x))
const promiseReduce = (acc, x) => acc.then(promiseConcat(x))
/*
* serial executes Promises sequentially.
* @param {funcs} An array of funcs that return promises.
* @example
* const urls = ['/url1', '/url2', '/url3']
* serial(urls.map(url => () => $.ajax(url)))
* .then(console.log.bind(console))
*/
const serial = funcs => funcs.reduce(promiseReduce, Promise.resolve([]))
Usage:
// first take your work
const urls = ['/url1', '/url2', '/url3', '/url4']
// next convert each item to a function that returns a promise
const funcs = urls.map(url => () => $.ajax(url))
// execute them serially
serial(funcs)
.then(console.log.bind(console))
To do this simply in ES6:
function readFiles(files) {
// Create a new empty promise (don't do that with real people ;)
var sequence = Promise.resolve();
// Loop over each file, and add on a promise to the
// end of the 'sequence' promise.
files.forEach(file => {
// Chain one computation onto the sequence
sequence =
sequence
.then(() => performComputation(file))
.then(result => doSomething(result));
// Resolves for each file, one at a time.
})
// This will resolve after the entire chain is resolved
return sequence;
}
Addition example
const addTwo = async () => 2;
const addThree = async (inValue) => new Promise((resolve) => setTimeout(() => resolve(inValue + 3), 2000));
const addFour = (inValue) => new Promise((res) => setTimeout(() => res(inValue + 4), 1000));
const addFive = async (inValue) => inValue + 5;
// Function which handles promises from above
async function sequenceAddition() {
let sum = await [addTwo, addThree, addFour, addFive].reduce(
(promise, currPromise) => promise.then((val) => currPromise(val)),
Promise.resolve()
);
console.log('sum:', sum); // 2 + 3 + 4 + 5 = 14
}
// Run function. See console for result.
sequenceAddition();
General syntax to use reduce()
function sequence(tasks, fn) {
return tasks.reduce((promise, task) => promise.then(() => fn(task)), Promise.resolve());
}
UPDATE
items-promise is a ready-to-use NPM package that does the same.
I've had to run a lot of sequential tasks and used these answers to forge a function that would take care of handling any sequential task...
function one_by_one(objects_array, iterator, callback) {
var start_promise = objects_array.reduce(function (prom, object) {
return prom.then(function () {
return iterator(object);
});
}, Promise.resolve()); // initial
if(callback){
start_promise.then(callback);
}else{
return start_promise;
}
}
The function takes 2 arguments + 1 optional. First argument is the array on which we will be working. The second argument is the task itself, a function that returns a promise, the next task will be started only when this promise resolves. The third argument is a callback to run when all tasks have been done. If no callback is passed, then the function returns the promise it created so we can handle the end.
Here's an example of usage:
var filenames = ['1.jpg','2.jpg','3.jpg'];
var resize_task = function(filename){
//return promise of async resizing with filename
};
one_by_one(filenames,resize_task );
Hope it saves someone some time...
With Async/Await (if you have the support of ES7)
function downloadFile(fileUrl) { ... } // This function return a Promise
async function main()
{
var filesList = [...];
for (const file of filesList) {
await downloadFile(file);
}
}
(you must use a for loop, not forEach, because async/await does not work as expected inside a forEach callback)
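To illustrate why: forEach ignores the promise returned by an async callback, so every iteration fires immediately and nothing waits. A sketch of the broken version:
// broken: all downloads start at once and completion cannot be awaited
filesList.forEach(async (file) => {
    await downloadFile(file);
});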
Without Async/Await (using Promise)
function downloadFile(fileUrl) { ... } // This function return a Promise
function downloadRecursion(filesList, index)
{
index = index || 0;
if (index < filesList.length)
{
downloadFile(filesList[index]).then(function()
{
index++;
downloadRecursion(filesList, index); // self invocation - recursion!
});
}
else
{
return Promise.resolve();
}
}
function main()
{
var filesList = [...];
downloadRecursion(filesList);
}
My preferred solution:
function processArray(arr, fn) {
return arr.reduce(
(p, v) => p.then((a) => fn(v).then(r => a.concat([r]))),
Promise.resolve([])
);
}
It's not fundamentally different from others published here but:
Applies the function to items in series
Resolves to an array of results
Doesn't require async/await (support is still quite limited, circa 2017)
Uses arrow functions; nice and concise
Example usage:
const numbers = [0, 4, 20, 100];
const multiplyBy3 = (x) => new Promise(res => res(x * 3));
// Prints [ 0, 12, 60, 300 ]
processArray(numbers, multiplyBy3).then(console.log);
Tested on reasonably current Chrome (v59) and Node.js (v8.1.2).
First, you need to understand that a promise is executed at the time of creation.
So for example if you have a code:
["a","b","c"].map(x => returnsPromise(x))
You need to change it to:
["a","b","c"].map(x => () => returnsPromise(x))
Then we need to sequentially chain promises:
["a", "b", "c"].map(x => () => returnsPromise(x))
.reduce(
(before, after) => before.then(_ => after()),
Promise.resolve()
)
Executing after() makes sure the promise is created (and starts executing) only when its turn comes.
The nicest solution that I was able to figure out was with Bluebird promises. You can just do Promise.resolve(files).each(fs.readFileAsync);, which guarantees that the promises are resolved sequentially, in order.
With async/await of ES2017 (and maybe some features of ES2018), this can be reduced to this form:
function readFile(file) {
... // Returns a promise.
}
async function readFiles(files) {
for (const file of files) {
await readFile(file)
}
}
I haven't seen another answer express that simplicity. The OP said parallel execution of readFile was not desired. However, with IO like this it really makes sense not to block on a single file read while still not moving on until all files have been read. Since I just learned about this and am a bit excited about it, I'll share that approach: parallel asynchronous execution of readFile, with readFiles as a whole still waiting for everything to complete.
async function readFiles(files) {
await Promise.all(files.map(readFile))
}
Isn't that a thing of beauty?
This is a slight variation of another answer above. Using native Promises:
function inSequence(tasks) {
return tasks.reduce((p, task) => p.then(task), Promise.resolve())
}
Explanation
If you have these tasks [t1, t2, t3], then the above is equivalent to Promise.resolve().then(t1).then(t2).then(t3). It's the behavior of reduce.
How to use
First, you need to construct a list of tasks! A task is a function that accepts no arguments. If you need to pass arguments to your function, then use bind or other methods to create a task. For example:
var tasks = files.map(file => processFile.bind(null, file))
inSequence(tasks).then(...)
I created this simple method on the Promise object:
Create and add a Promise.sequence method to the Promise object
Promise.sequence = function (chain) {
    var results = [];
    var entries = chain;
    if (entries.entries) entries = entries.entries();
    return new Promise(function (yes, no) {
        var next = function () {
            var entry = entries.next();
            if (entry.done) yes(results);
            else {
                // run the task, record its result, then move on to the next entry
                entry.value[1]().then(function (result) {
                    results.push(result);
                    next();
                }, function () {
                    no(results);
                });
            }
        };
        next();
    });
};
Usage:
var todo = [];
todo.push(firstPromise);
if (someCriterium) todo.push(optionalPromise);
todo.push(lastPromise);
// Invoking them
Promise.sequence(todo)
.then(function(results) {}, function(results) {});
The best thing about this extension to the Promise object is that it is consistent with the style of promises. Promise.all and Promise.sequence are invoked the same way, but have different semantics.
Caution
Sequential running of promises is not usually a very good way to use promises. It's usually better to use Promise.all, and let the browser run the code as fast as possible. However, there are real use cases for it - for example when writing a mobile app using JavaScript.
My answer is based on https://stackoverflow.com/a/31070150/7542429.
Promise.series = function series(arrayOfPromises) {
var results = [];
return arrayOfPromises.reduce(function(seriesPromise, promise) {
return seriesPromise.then(function() {
return promise
.then(function(result) {
results.push(result);
});
});
}, Promise.resolve())
.then(function() {
return results;
});
};
This solution returns the results as an array like Promise.all().
Usage:
Promise.series([array of promises])
.then(function(results) {
// do stuff with results here
});
Use Array.prototype.reduce, and remember to wrap your promises in a function, otherwise they will already be running!
// array of Promise providers
const providers = [
function(){
return Promise.resolve(1);
},
function(){
return Promise.resolve(2);
},
function(){
return Promise.resolve(3);
}
]
const inSeries = function(providers){
const seed = Promise.resolve(null);
return providers.reduce(function(a,b){
return a.then(b);
}, seed);
};
Nice and easy...
You should be able to reuse the same resolved seed promise across calls, for performance, etc.
It's important to guard against empty arrays or arrays with only 1 element when using reduce, so this technique is your best bet:
const providers = [
function(v){
return Promise.resolve(v+1);
},
function(v){
return Promise.resolve(v+2);
},
function(v){
return Promise.resolve(v+3);
}
]
const inSeries = function(providers, initialVal){
if(providers.length < 1){
return Promise.resolve(null)
}
return providers.reduce((a,b) => a.then(b), providers.shift()(initialVal));
};
and then call it like:
inSeries(providers, 1).then(v => {
console.log(v); // 7
});
Using modern ES:
const series = async (tasks) => {
const results = [];
for (const task of tasks) {
const result = await task;
results.push(result);
}
return results;
};
//...
const readFiles = await series(files.map(readFile));
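One caveat worth spelling out: files.map(readFile) starts every read immediately, so the loop above only awaits the already-running promises in order. If you want truly sequential execution, pass thunks and invoke them inside the loop, e.g. this variant (a sketch):
const seriesLazy = async (tasks) => {
    const results = [];
    for (const task of tasks) {
        results.push(await task()); // the thunk is invoked here, one at a time
    }
    return results;
};
const readFiles = await seriesLazy(files.map((file) => () => readFile(file)));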
Most of the answers don't include the results of ALL promises individually, so in case someone is looking for this particular behaviour, here is a possible solution using recursion.
It follows the style of Promise.all:
Returns the array of results in the .then() callback.
If some promise fails, it's reported immediately in the .catch() callback.
const promiseEach = (arrayOfTasks) => {
let results = []
return new Promise((resolve, reject) => {
const resolveNext = (arrayOfTasks) => {
// If all tasks are already resolved, return the final array of results
if (arrayOfTasks.length === 0) return resolve(results)
// Extract first promise and solve it
const first = arrayOfTasks.shift()
first().then((res) => {
results.push(res)
resolveNext(arrayOfTasks)
}).catch((err) => {
reject(err)
})
}
resolveNext(arrayOfTasks)
})
}
// Lets try it 😎
const promise = (time, shouldThrowError) => new Promise((resolve, reject) => {
const timeInMs = time * 1000
setTimeout(()=>{
console.log(`Waited ${time} secs`)
if (shouldThrowError) return reject(new Error('Promise failed'))
resolve(time)
}, timeInMs)
})
const tasks = [() => promise(1), () => promise(2)]
promiseEach(tasks)
.then((res) => {
console.log(res) // [1, 2]
})
// Oops some promise failed
.catch((error) => {
console.log(error)
})
Note about the tasks array declaration:
In this case it is not possible to use the notation that Promise.all would use:
const tasks = [promise(1), promise(2)]
And we have to use:
const tasks = [() => promise(1), () => promise(2)]
The reason is that JavaScript starts executing a promise immediately after it is declared. Methods like Promise.all just check whether the state of each one is fulfilled or rejected, but don't start the execution themselves. Using () => promise() we defer the execution until it's called.
You can use this function that gets promiseFactories List:
function executeSequentially(promiseFactories) {
var result = Promise.resolve();
promiseFactories.forEach(function (promiseFactory) {
result = result.then(promiseFactory);
});
return result;
}
Promise Factory is just simple function that returns a Promise:
function myPromiseFactory() {
return somethingThatCreatesAPromise();
}
It works because a promise factory doesn't create the promise until it's asked to. It works the same way as a then function – in fact, it's the same thing!
You don't want to operate over an array of promises at all. Per the Promise spec, as soon as a promise is created, it begins executing. So what you really want is an array of promise factories...
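To make the distinction concrete (fetch used purely for illustration):
// eager: both requests are already in flight the moment the array is built
const promises = [fetch('/a'), fetch('/b')];
// lazy: nothing runs until each factory is invoked
const factories = [() => fetch('/a'), () => fetch('/b')];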
If you want to learn more on Promises, you should check this link:
https://pouchdb.com/2015/05/18/we-have-a-problem-with-promises.html
If you want, you can use reduce to make a sequential promise chain, for example:
[2,3,4,5,6,7,8,9].reduce((promises, nextPage) => {
    return promises.then((page) => {
        console.log(page);
        return Promise.resolve(page + 1);
    });
}, Promise.resolve(1));
It will always run them sequentially.
I really liked @joelnet's answer, but to me that style of coding is a little bit tough to digest, so I spent a couple of days figuring out how I would express the same solution in a more readable manner. This is my take, just with a different syntax and some comments.
// first take your work
const urls = ['/url1', '/url2', '/url3', '/url4']
// next convert each item to a function that returns a promise
const functions = urls.map((url) => {
// For every url we return a new function
return () => {
return new Promise((resolve) => {
// random wait in milliseconds
const randomWait = parseInt((Math.random() * 1000),10)
console.log('waiting to resolve in ms', randomWait)
setTimeout(()=>resolve({randomWait, url}),randomWait)
})
}
})
const promiseReduce = (acc, next) => {
// we wait for the accumulator to resolve its promise
return acc.then((accResult) => {
// and then we return a new promise that will become
// the new value for the accumulator
return next().then((nextResult) => {
// that eventually will resolve to a new array containing
// the value of the two promises
return accResult.concat(nextResult)
})
})
};
// the accumulator will always be a promise that resolves to an array
const accumulator = Promise.resolve([])
// we call reduce with the reduce function and the accumulator initial value
functions.reduce(promiseReduce, accumulator)
.then((result) => {
// let's display the final value here
console.log('=== The final result ===')
console.log(result)
})
As Bergi noticed, I think the best and clearest solution is to use Bluebird's each, code below:
const BlueBird = require('bluebird');
BlueBird.each(files, fs.readFileAsync);
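One detail worth knowing: Bluebird's each resolves to the original array of items. If you want the iterator's results instead, Bluebird's mapSeries does that:
BlueBird.mapSeries(files, fs.readFileAsync).then((contents) => {
    // contents holds each file's data, in order
});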
I find myself coming back to this question many times and the answers aren't exactly giving me what I need, so putting this here for anyone that needs this too.
The code below runs promises sequentially (one after another), where each round can consist of multiple calls:
async function sequence(list, cb) {
const result = [];
await list.reduce(async (promise, item) => promise
.then(() => cb(item))
.then((res) => result.push(res)
), Promise.resolve());
return result;
}
Showcase:
<script src="https://cdnjs.cloudflare.com/ajax/libs/axios/0.15.3/axios.min.js"></script>
<script src="https://unpkg.com/#babel/standalone#7/babel.min.js"></script>
<script type="text/babel">
function sleep(ms) {
return new Promise(resolve => setTimeout(resolve, ms));
}
async function readFile(url, index) {
console.log('Running index: ', index);
// First action
const firstTime = await axios.get(url);
console.log('First API response: ', firstTime.data.activity);
// Second action
await sleep(1000);
// Third action
const secondTime = await axios.get(url);
console.log('Second API response: ', secondTime.data.activity);
// Fourth action
await sleep(1000);
return secondTime.data;
}
async function sequence(urls, fn) {
const result = [];
await urls.reduce(async (promise, url, index) => promise.then(() => fn(url, index)).then((res) => result.push(res)), Promise.resolve());
return result;
}
const urls = [
'https://www.boredapi.com/api/activity',
'https://www.boredapi.com/api/activity',
'https://www.boredapi.com/api/activity',
];
(async function init() {
const result = await sequence(urls, readFile);
console.log('result', result);
})()
</script>
I use the following code to extend the Promise object. It handles rejection of the promises and returns an array of results
Code
/*
Runs tasks in sequence and resolves a promise upon finish
tasks: an array of functions that return a promise upon call.
parameters: an array of arrays corresponding to the parameters to be passed on each function call.
context: Object to use as context to call each function. (The 'this' keyword that may be used inside the function definition)
*/
Promise.sequence = function(tasks, parameters = [], context = null) {
return new Promise((resolve, reject)=>{
var nextTask = tasks.splice(0,1)[0].apply(context, parameters[0]); //Dequeue and call the first task
var output = new Array(tasks.length + 1);
var errorFlag = false;
tasks.forEach((task, index) => {
nextTask = nextTask.then(r => {
output[index] = r;
return task.apply(context, parameters[index+1]);
}, e=>{
output[index] = e;
errorFlag = true;
return task.apply(context, parameters[index+1]);
});
});
// Last task
nextTask.then(r=>{
output[output.length - 1] = r;
if (errorFlag) reject(output); else resolve(output);
})
.catch(e=>{
output[output.length - 1] = e;
reject(output);
});
});
};
Example
function functionThatReturnsAPromise(n) {
return new Promise((resolve, reject)=>{
//Emulating real life delays, like a web request
setTimeout(()=>{
resolve(n);
}, 1000);
});
}
var arrayOfArguments = [['a'],['b'],['c'],['d']];
var arrayOfFunctions = (new Array(4)).fill(functionThatReturnsAPromise);
Promise.sequence(arrayOfFunctions, arrayOfArguments)
.then(console.log)
.catch(console.error);
Your approach is not bad, but it does have two issues: it swallows errors and it employs the Explicit Promise Construction Antipattern.
You can solve both of these issues, and make the code cleaner, while still employing the same general strategy:
var Q = require("q");
var readFile = function(file) {
... // Returns a promise.
};
var readFiles = function(files) {
var readSequential = function(index) {
if (index < files.length) {
return readFile(files[index]).then(function() {
return readSequential(index + 1);
});
}
};
// using Promise.resolve() here in case files.length is 0
return Promise.resolve(readSequential(0)); // Start!
};
This is my sequentially implementation that I use in various projects:
const files = [file1, file2, file3];
const fileContents = await sequentially(readFile, files);
// somewhere else in the code:
export const sequentially = async <T, P>(
toPromise: (element: T) => Promise<P>,
elements: T[]
): Promise<P[]> => {
const results: P[] = [];
await elements.reduce(async (sequence, element) => {
await sequence;
results.push(await toPromise(element));
}, Promise.resolve());
return results;
};
Here is my Angular/TypeScript approach, using RxJS:
Given an array of URL strings, convert it into an Observable using the from function.
Use pipe to wrap the Ajax request, immediate response logic, any desired delay, and error handling.
Inside of the pipe, use concatMap to serialize the requests. Otherwise, using JavaScript forEach or map would make the requests at the same time.
Use RxJS ajax to make the call, and also to add any desired delay after each call returns.
Working example: https://stackblitz.com/edit/rxjs-bnrkix?file=index.ts
The code looks like this (I left in some extras so you can choose what to keep or discard):
import { ajax } from 'rxjs/ajax';
import { catchError, concatMap, delay, from, of, map, Observable } from 'rxjs';
const urls = [
'https://randomuser.me/api/',
'https://randomuser.me/api/',
'https://randomuser.me/api/',
];
const delayAfterCall = 500;
from(urls)
.pipe(
concatMap((url: string) => {
return ajax.getJSON(url).pipe(
map((response) => {
console.log('Done! Received:', response);
return response;
}),
catchError((error) => {
console.error('Error: ', error);
return of(error);
}),
delay(delayAfterCall)
);
})
)
.subscribe((response) => {
console.log('received email:', response.results[0].email);
});
On the basis of the question's title, "Resolve promises one after another (i.e. in sequence)?", we might understand that the OP is more interested in the sequential handling of promises on settlement than sequential calls per se.
This answer is offered:
to demonstrate that sequential calls are not necessary for sequential handling of responses.
to expose viable alternative patterns to this page's visitors - including the OP if he is still interested over a year later.
despite the OP's assertion that he does not want to make calls concurrently, which may genuinely be the case but equally may be an assumption based on the desire for sequential handling of responses as the title implies.
If concurrent calls are genuinely not wanted then see Benjamin Gruenbaum's answer which covers sequential calls (etc) comprehensively.
If however, you are interested (for improved performance) in patterns which allow concurrent calls followed by sequential handling of responses, then please read on.
It's tempting to think you have to use Promise.all(arr.map(fn)).then(fn) (as I have done many times) or a Promise lib's fancy sugar (notably Bluebird's), however (with credit to this article) an arr.map(fn).reduce(fn) pattern will do the job, with the advantages that it:
works with any promise lib - even pre-compliant versions of jQuery - only .then() is used.
affords the flexibility to skip-over-error or stop-on-error, whichever you want with a one line mod.
Here it is, written for Q.
var readFiles = function(files) {
return files.map(readFile) //Make calls in parallel.
.reduce(function(sequence, filePromise) {
return sequence.then(function() {
return filePromise;
}).then(function(file) {
//Do stuff with file ... in the correct sequence!
}, function(error) {
console.log(error); //optional
return sequence;//skip-over-error. To stop-on-error, `return error` (jQuery), or `throw error` (Promises/A+).
});
}, Q()).then(function() {
// all done.
});
};
Note: only that one fragment, Q(), is specific to Q. For jQuery you need to ensure that readFile() returns a jQuery promise. With A+ libs, foreign promises will be assimilated.
The key here is the reduction's sequence promise, which sequences the handling of the readFile promises but not their creation.
And once you have absorbed that, it's maybe slightly mind-blowing when you realise that the .map() stage isn't actually necessary! The whole job, parallel calls plus serial handling in the correct order, can be achieved with reduce() alone, plus the added advantage of further flexibility to:
convert from parallel async calls to serial async calls by simply moving one line - potentially useful during development.
Here it is, for Q again.
var readFiles = function(files) {
return files.reduce(function(sequence, f) {
var filePromise = readFile(f);//Make calls in parallel. To call sequentially, move this line down one.
return sequence.then(function() {
return filePromise;
}).then(function(file) {
//Do stuff with file ... in the correct sequence!
}, function(error) {
console.log(error); //optional
return sequence;//Skip over any errors. To stop-on-error, `return error` (jQuery), or `throw error` (Promises/A+).
});
}, Q()).then(function() {
// all done.
});
};
That's the basic pattern. If you wanted also to deliver data (eg the files or some transform of them) to the caller, you would need a mild variant.
If someone else needs a guaranteed, STRICTLY sequential way of resolving Promises when performing CRUD operations, you can use the following code as a basis.
As long as you add 'return' before calling each promise-returning function and use this example as a basis, the next .then() call will CONSISTENTLY start after the completion of the previous one:
getRidOfOlderShoutsPromise = () => {
return readShoutsPromise('BEFORE')
.then(() => {
return deleteOlderShoutsPromise();
})
.then(() => {
return readShoutsPromise('AFTER')
})
.catch(err => console.log(err.message));
}
deleteOlderShoutsPromise = () => {
return new Promise ( (resolve, reject) => {
console.log("in deleteOlderShouts");
let d = new Date();
let TwoMinuteAgo = d - 1000 * 90 ;
All_Shouts.deleteMany({ dateTime: {$lt: TwoMinuteAgo}}, function(err) {
if (err) reject();
console.log("DELETED OLDs at "+d);
resolve();
});
});
}
readShoutsPromise = (tex) => {
return new Promise( (resolve, reject) => {
console.log("in readShoutsPromise -"+tex);
All_Shouts
.find({})
.sort([['dateTime', 'ascending']])
.exec(function (err, data){
if (err) reject();
let d = new Date();
console.log("shouts "+tex+" delete PROMISE = "+data.length +"; date ="+d);
resolve(data);
});
});
}
The Array push and pop methods can be used to sequence promises. You can also push new promises when you need additional data. This is the code I will use in a React infinite loader to load a sequence of pages.
var promises = [Promise.resolve()];
function methodThatReturnsAPromise(page) {
return new Promise((resolve, reject) => {
setTimeout(() => {
console.log(`Resolve-${page}! ${new Date()} `);
resolve();
}, 1000);
});
}
function pushPromise(page) {
promises.push(promises.pop().then(function () {
return methodThatReturnsAPromise(page)
}));
}
pushPromise(1);
pushPromise(2);
pushPromise(3);
(function() {
function sleep(ms) {
return new Promise(function(resolve) {
setTimeout(function() {
return resolve();
}, ms);
});
}
function serial(arr, index, results) {
if (index == arr.length) {
return Promise.resolve(results);
}
return new Promise(function(resolve, reject) {
if (!index) {
index = 0;
results = [];
}
return arr[index]()
.then(function(d) {
return resolve(d);
})
.catch(function(err) {
return reject(err);
});
})
.then(function(result) {
console.log("here");
results.push(result);
return serial(arr, index + 1, results);
})
.catch(function(err) {
throw err;
});
}
const a = [5000, 5000, 5000];
serial(a.map(x => () => sleep(x)));
})();
Here the key is how you call the sleep function: you need to pass an array of functions that each return a promise, instead of an array of promises.

Is there any programmatic way to break infinite loop in NodeJS?

Ultimately speaking: is there a practical method (perhaps by inserting some JS constructs into the code) to break or halt long-running JS code during execution? For example, can it be interrupted by some process.* constructs or similar? Or some other way? A valid solution may even involve killing and/or restarting the Node.js process. Thank you!
EDIT:
I need to execute some particular user code on the server, using the Function constructor (à la eval; security concerns aside). I cannot insert any extra code inside it, only enclose it. What I need is a way to break the user code after 5 minutes if it has not finished by then. For example:
usercode = 'Some code from the user';
pre_code = 'some controlling code for breaking the user code';
post_code = 'another controlling code';
fcode = pre_code + usercode + post_code;
<preparations for breaking usercode>
(new Function(fcode))(); // This MUST exit in 5 minutes
Edit:
Answering your edit: I see the intention now. If it is running in Node.js, you can use worker_threads for that: https://nodejs.org/api/worker_threads.html#worker_threads_worker_workerdata.
For example:
// main.js
const { Worker } = require("worker_threads");

const runCode = (code) => {
    const worker = new Worker("./code-executor.js", { workerData: { code } });
    const promise = new Promise((resolve, reject) => {
        // hard limit: terminate the worker after 5 minutes
        setTimeout(() => worker.terminate(), 60000 * 5);
        worker.on("error", () => {
            return reject(new SomeCustomError())
        });
        worker.on("message", (message) => {
            if (message.success) return resolve(message.result);
            return reject(new Error(message.error));
        });
    });
    promise.finally(() => { worker.terminate() });
    return promise;
}
// code-executor.js
const { workerData, parentPort } = require("worker_threads");
const { code } = workerData;
Promise.resolve()
    .then(() => (new Function(code))())
    .then((result) => {
        parentPort.postMessage({
            success: true,
            result
        })
    })
    .catch((error) => {
        parentPort.postMessage({
            success: false,
            error: error.message
        })
    });
If it's in browser https://developer.mozilla.org/en-US/docs/Web/API/Worker
The WebAPI is not exactly the same but the logic should be similar
Original
Killing a process. Also read: https://nodejs.org/api/process.html#process_signal_events
process.kill(pid, "SIGINT")
"Killing" a long running function, you gotta hack a bit. There's no elegant solution. Inject a controller which can be mutated outside of the long running function. To stop it from the outside, set controller.isStopped = true
export const STOP_EXECUTION = Symbol();
function longRunning(controller){
... codes
// add stopping point
if(controller.isStopped) throw STOP_EXECUTION;
... codes
// add stopping point
if(controller.isStopped) throw STOP_EXECUTION;
... codes
}
// catch it by
try {
    longRunning(controller);
} catch (e) {
    switch (true) {
        case e === STOP_EXECUTION: ...; break; // the longRunning function was stopped from the outside
        default: ...; // the longRunning function threw for a reason other than being stopped
    }
}
Gist: https://gist.github.com/Kelerchian/3824ca4ce1be390d34c5147db671cc9b

How to await in the main during start up in node?

I want to initialize a value from an async call, then keep using that value, but I don't know how to await it before Node loads the other code here.
console.log('---- initialize the value during startup ----');
const p1 = (async () => {
const res = await requestAsync.get('https://nodejs.org/dist/latest-v8.x/docs/api/util.html');
return res.headers;
})();
const v2 = p1.then(v => (v));
console.log(v2);
console.log('---------- more work after v2 is resolved --------------');
I get
---- initialize the value during startup ----
Promise { <pending> }
---------- more work after v2 is resolved --------------
Until node.js has top-level await (which it currently doesn't), the usual way of doing things is to just use .then() and put everything that is dependent upon the asynchronous result inside the .then() handler:
requestAsync.get('https://nodejs.org/dist/latest-v8.x/docs/api/util.html').then(result => {
// put all the rest of your initialization code that
// needs this async result here
}).catch(err => {
// handle errors here
});
If you want to (just for code structure reasons), you can move everything into a function:
requestAsync.get('https://nodejs.org/dist/latest-v8.x/docs/api/util.html')
.then(startup)
.catch(handleErr);
function startup(result) {
// put your startup code here that needs result
}
function handleErr(err) {
// handle error here
}
Keep in mind that any other startup code will continue because module loading does not block waiting for this async result.
If you have multiple asynchronous operations you want to do in a row, then you can chain them:
requestAsync.get().then(f1).then(f2).catch(err => {....});
Or, you can wrap code in a top level function so you can use await:
async function startup() {
let result1 = await f1(...);
let result2 = await f2(...);
let result3 = await f3(...);
}
startup().then(finalResult => {
// all async operation done here with finalResult
}).catch(err => {
// error starting up
});
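For what it's worth, Node.js has since gained top-level await in ES modules (v14.8+), so in an .mjs file (or a project with "type": "module") the same startup sequence needs no wrapper at all:
// startup.mjs - top-level await, using the same f1/f2/f3 from above
const result1 = await f1(...);
const result2 = await f2(...);
const result3 = await f3(...);
// anything below this line runs only after all three have resolved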
I'm not sure I understand what you mean, but maybe this is what you want?
async function main() {
const response = await requestAsync.get('https://n...');
const headers = response.headers;
console.log('some other stuff here');
}
main();
Putting everything in a "main" function allows you to wait for you initial request before doing anything else.
Be careful though: this is blocking, so if the request takes a while, nothing will happen in your code until it's done.
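One small addition: since main() returns a promise, it's worth calling it with a catch so a failed request doesn't become an unhandled rejection:
main().catch(err => console.error('startup failed:', err));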

kafka-node asynchronous consumer handler

That's how my consumer is initialised:
const client = new kafka.Client(config.ZK_HOST)
const consumer = new kafka.Consumer(client, [{ topic: config.KAFKA_TOPIC, offset: 0}],
{
autoCommit: false
})
The consumer then subscribes like this:
consumer.on('message', message => applyMessage(message))
The thing is applyMessage talks to the database using knex, the code looks something like:
async function applyMessage(message: kafka.Message) {
const usersCount = await db('users').count()
// just assume we ABSOLUTELY need to calculate a number of users,
// so we need previous state
await db('users').insert(inferUserFromMessage(message))
}
The code above causes applyMessage to execute in parallel for all the messages in Kafka. So in the code above, given that there are no users in the database yet, usersCount will ALWAYS be 0, even for the second message from Kafka, where it should already be 1, since the first call to applyMessage inserts a user.
How do I "synchronise" the code in a way that all the applyMessage functions run sequentially?
You'll need to implement some sort of Mutex. Basically, a class which queues up tasks to execute sequentially. Example:
var Mutex = function() {
this.queue = [];
this.locked = false;
};
Mutex.prototype.enqueue = function(task) {
this.queue.push(task);
if (!this.locked) {
this.dequeue();
}
};
Mutex.prototype.dequeue = function() {
this.locked = true;
const task = this.queue.shift();
if (task) {
this.execute(task);
} else {
this.locked = false;
}
};
Mutex.prototype.execute = async function(task) {
try { await task(); } catch (err) { }
this.dequeue();
}
In order for this to work, your applyMessage function (whichever handles Kafka messages) needs to return a Promise - notice also the async has moved from the parent function to the returned Promise function:
function applyMessage(message: kafka.Message) {
return new Promise(async function(resolve,reject) {
try {
const usersCount = await db('users').count()
// just assume we ABSOLUTELY need to calculate a number of users,
// so we need previous state
await db('users').insert(inferUserFromMessage(message))
resolve();
} catch (err) {
reject(err);
}
});
}
Finally, each invocation of applyMessage needs to be added to the Mutex queue instead of called directly:
var mutex = new Mutex();
consumer.on('message', message => mutex.enqueue(function() { return applyMessage(message); }))
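A lighter-weight alternative to a full Mutex class (my sketch, not from the answer above): keep a single promise chain and append each message's work to it, which serializes the handlers in a few lines:
let chain = Promise.resolve();
consumer.on('message', (message) => {
    // each handler starts only after the previous one settles
    chain = chain
        .then(() => applyMessage(message))
        .catch((err) => console.error(err)); // keep the chain alive on errors
});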

async await with setInterval

function first(){
console.log('first')
}
function second(){
console.log('second')
}
let interval = async ()=>{
await setInterval(first,2000)
await setInterval(second,2000)
}
interval();
Imagine that I have this code above.
When I run it, first() and second() will be scheduled at the same time. How do I call second() only after first() returns some data, i.e. once first() is done?
first() in my code will be working with a large amount of data, and if these two functions run at the same time, it will be hard on the server.
How do I call second() each time first() returns some data?
As mentioned above, setInterval does not play well with promises unless you stop it. If you clear the interval, you can use it like this (note that condition is passed as a function here, so it is re-evaluated on every tick):
async function waitUntil(condition) {
return await new Promise(resolve => {
const interval = setInterval(() => {
if (condition()) {
resolve('foo');
clearInterval(interval);
};
}, 1000);
});
}
Later you can use it like
const bar = await waitUntil(() => someConditionHere)
You have a few problems:
Promises may only ever resolve once; setInterval() is meant to call its callback multiple times, and Promises do not support this case well.
Neither setInterval() nor the more appropriate setTimeout() returns a Promise; therefore, awaiting them is pointless in this context.
You're looking for a function that returns a Promise which resolves after some time (using setTimeout(), probably, not setInterval()).
Luckily, creating such a function is rather trivial:
async function delay(ms) {
// return await for better async stack trace support in case of errors.
return await new Promise(resolve => setTimeout(resolve, ms));
}
With this new delay function, you can implement your desired flow:
function first(){
console.log('first')
}
function second(){
console.log('second')
}
let run = async ()=>{
await delay(2000);
first();
await delay(2000)
second();
}
run();
setInterval doesn't play well with promises because it triggers a callback multiple times, while a promise resolves once.
It seems that it's setTimeout that fits the case. It should be promisified in order to be used with async..await:
async () => {
await new Promise(resolve => setTimeout(() => resolve(first()), 2000));
await new Promise(resolve => setTimeout(() => resolve(second()), 2000));
}
An await expression causes the async function to pause until the Promise is settled, so you can use the promise's result directly.
In my case, I wanted to initiate an HTTP request every 1s:
let intervalid
async function testFunction() {
intervalid = setInterval(() => {
// I use axios like: axios.get('/user?ID=12345').then
new Promise(function(resolve, reject){
resolve('something')
}).then(res => {
if (condition) {
// do something
} else {
clearInterval(intervalid)
}
})
}, 1000)
}
// you can use this function like
testFunction()
// or stop the setInterval in any place by
clearInterval(intervalid)
You could use an IIFE (immediately invoked function expression). This way you escape the issue of myInterval's setInterval callback not accepting a Promise as a return type.
There are cases where you need setInterval, because you want to call some function an unknown number of times, with some interval in between.
When I faced this problem, this turned out to be the most straightforward solution for me. I hope it helps someone :)
For me the use case was that I wanted to send logs to CloudWatch but try not to face the Throttle exception for sending more than 5 logs per second. So I needed to keep my logs and send them as a batch in an interval of 1 second. The solution I'm posting here is what I ended up using.
async function myAsyncFunc(): Promise<string> {
return new Promise<string>((resolve) => {
resolve("hello world");
});
}
function myInterval(): void {
setInterval(() => {
void (async () => {
await myAsyncFunc();
})();
}, 5_000);
}
// then call like so
myInterval();
I looked through all the answers but still didn't find one that works exactly as the OP asked. This is what I used for the same purpose:
async function waitInterval(callback, ms) {
return new Promise(resolve => {
let iteration = 0;
const interval = setInterval(async () => {
if (await callback(iteration, interval)) {
resolve();
clearInterval(interval);
}
iteration++;
}, ms);
});
}
function first(i) {
console.log(`first: ${i}`);
// If the condition below is true the timer finishes
return i === 5;
}
function second(i) {
console.log(`second: ${i}`);
// If the condition below is true the timer finishes
return i === 5;
}
(async () => {
console.log('start');
await waitInterval(first, 1000);
await waitInterval(second, 1000);
console.log('finish');
})()
In my example, I also pass the interval iteration count and the timer itself, in case the caller needs to do something with them. However, it's not necessary.
In my case, I needed to iterate through a list of images, pausing in between each, and then a longer pause at the end before re-looping through.
I accomplished this by combining several techniques from above, calling my function recursively and awaiting a timeout.
If at any point another trigger changes my animationPaused:boolean, my recursive function will exit.
// assumes a promisified timeout helper, like the delay() shown earlier:
// const timeout = (ms) => new Promise((res) => setTimeout(res, ms));
const loopThroughImages = async() => {
for (let i=0; i<numberOfImages; i++){
if (animationPaused) {
return;
}
this.updateImage(i);
await timeout(700);
}
await timeout(1000);
loopThroughImages();
}
loopThroughImages();
Async/await does not make promises synchronous.
To my knowledge, it's just a different syntax for return Promise and .then().
Here I rewrote the async function and left both versions, so you can see what it really does and compare.
It's in fact a cascade of Promises.
// by the way no need for async there. the callback does not return a promise, so no need for await.
function waitInterval(callback, ms) {
return new Promise(resolve => {
let iteration = 0;
const interval = setInterval(async () => {
if (callback(iteration, interval)) {
resolve();
clearInterval(interval);
}
iteration++;
}, ms);
});
}
function first(i) {
console.log(`first: ${i}`);
// If the condition below is true the timer finishes
return i === 5;
}
function second(i) {
console.log(`second: ${i}`);
// If the condition below is true the timer finishes
return i === 5;
}
// async function with async/await, this code ...
(async () => {
console.log('start');
await waitInterval(first, 1000);
await waitInterval(second, 1000);
console.log('finish');
})() //... returns a pending Promise and ...
console.log('i do not wait');
// ... is kinda identical to this code.
// still asynchronous but return Promise statements with then cascade.
(() => {
console.log('start again');
return waitInterval(first, 1000).then(() => {
return waitInterval(second, 1000).then(() => {
console.log('finish again');
});
});
})(); // returns a pending Promise...
console.log('i do not wait either');
You can see the two async functions both execute at the same time.
So using promises around intervals here is not very useful; it's still just intervals, and the promises change nothing while making things confusing...
Since the code calls callbacks repeatedly from an interval, this is, I think, a cleaner way:
function first(i) {
console.log(`first: ${i}`);
// If the condition below is true the timer finishes
return i === 5;
}
function second(i) {
console.log(`second: ${i}`);
// If the condition below is true the timer finishes
return i === 5;
}
function executeThroughTime(...callbacks){
console.log('start');
let callbackIndex = 0; // to track current callback.
let timerIndex = 0; // index given to callbacks
let interval = setInterval(() =>{
if (callbacks[callbackIndex](timerIndex++)){ // callback return true when it finishes.
timerIndex = 0; // resets for next callback
if (++callbackIndex>=callbacks.length){ // if no next callback finish.
clearInterval(interval);
console.log('finish');
}
}
},1000)
}
executeThroughTime(first,second);
console.log('and i still do not wait ;)');
Also, this solution executes a callback every second.
If the callbacks are async requests that take more than one second to resolve, and I can't afford for them to overlap, then instead of making iterative calls on a fixed interval, I would have each request's resolution schedule the next request (through a timer if I don't want to hammer the server).
Here the "recursive" task is called lTask; it does pretty much the same as before, except that, since I no longer have an interval, I need a new timer each iteration.
// slow internet request simulation. with a Promise, could be a callback.
function simulateAsync1(i) {
console.log(`first pending: ${i}`);
return new Promise((resolve) =>{
setTimeout(() => resolve('got that first big data'), Math.floor(Math.random()*1000)+ 1000);//simulate request that last between 1 and 2 sec.
}).then((result) =>{
console.log(`first solved: ${i} ->`, result);
return i==2;
});
}
// slow internet request simulation. with a Promise, could be a callback.
function simulateAsync2(i) {
console.log(`second pending: ${i}`);
return new Promise((resolve) =>{
setTimeout(() => resolve('got that second big data'), Math.floor(Math.random()*1000) + 1000);//simulate request that last between 1 and 2 sec.
}).then((result) =>{ // promise is resolved
console.log(`second solved: ${i} ->`,result);
return i==4; // return a promise
});
}
function executeThroughTime(...asyncCallbacks){
console.log('start');
let callbackIndex = 0;
let timerIndex = 0;
let lPreviousTime = Date.now();
let lTask = () => { // timeout callback.
asyncCallbacks[callbackIndex](timerIndex++).then((result) => { // the setTimeout for the next task is set when the promise is solved.
console.log('result',result)
if (result) { // current callback is done.
timerIndex = 0;
if (++callbackIndex>=asyncCallbacks.length){//are all callbacks done ?
console.log('finish');
return;// its over
}
}
console.log('time elapsed since previous call',Date.now() - lPreviousTime);
lPreviousTime = Date.now();
//console.log('"wait" 1 sec (but not realy)');
setTimeout(lTask,1000);//redo task after 1 sec.
//console.log('i do not wait');
});
}
lTask();// no need to set a timer for first call.
}
executeThroughTime(simulateAsync1,simulateAsync2);
console.log('i do not wait');
The next step would be to pair this with a FIFO: fill it with web-request tasks and empty it one task at a time...
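A rough sketch of that idea (illustrative names only): an interval acts as the producer filling a FIFO with request thunks, while a single async consumer drains it one task at a time, so requests never overlap:
const fifo = [];
let n = 0;
// producer: enqueue a request thunk every second
setInterval(() => fifo.push(() => simulateAsync1(n++)), 1000);
// consumer: drain the queue, one task at a time
(async function drain() {
    for (;;) {
        const task = fifo.shift();
        if (task) await task();
        else await new Promise((r) => setTimeout(r, 100)); // idle briefly
    }
})();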
