Kill an unresolved promise (or ignore and move on) - node.js

Using Node's child_process.exec, I'm running an FFmpeg conversion inside a promise that takes a while to complete. Each time the user clicks "next", it starts the FFmpeg command on a new file:
function doFFMpeg(path) {
    return new Promise((resolve, reject) => {
        exec('ffmpeg (long running command)', (error, stdout, stderr) => {
            if (error) {
                reject();
            }
        }).on('exit', (code) => { // exit code 0 means success, 1 means failure
            if (code) {
                reject();
            } else {
                resolve();
            }
        });
    });
}
The problem is that if the user moves on to the next video before the promise settles, I need to scrap the running process and move on to converting the next video.
How do I either:
A) (Ideally) Cancel the current promised exec process*
B) Let the current promised exec process complete, but just ignore that promise while I start a new one.
*I realize that promise cancellation is not yet in the ECMAScript standard, but I'd like to know of a workaround -- preferably without using a third-party module/library.
Attempt:
let myChildProcess;

function doFFMpeg(path) {
    myChildProcess.kill();
    return new Promise((resolve, reject) => {
        myChildProcess = exec('ffmpeg (long running command)', (error, stdout, stderr) => {
            if (error) {
                reject();
            }
        }).on('exit', (code) => { // exit code 0 means success, 1 means failure
            if (code) {
                reject();
            } else {
                resolve();
            }
        });
    });
}

Assuming exec() does indeed return an object with a .kill() method (child_process.exec() returns a ChildProcess, which does), the attempt looks pretty close to what you want. You just have to accept promise rejection in lieu of cancellation, which is unavailable in native Promises. Rejecting rather than cancelling is typically inconsequential, and often preferable.
As I understand it, killing the process will cause exec()'s callback to fire with an error, and the 'exit' handler to fire with a null code and a signal name. If so, you don't need a separate code path to reject the promise when the process is killed; reject() will be called anyway.
Your doFFMpeg() just needs some safety around calling myChildProcess.kill().
Something like this should do it:
const doFFMpeg = (function () {
    let myChildProcess = null;
    return function (path) {
        if (myChildProcess) {
            myChildProcess.kill();
        }
        return new Promise((resolve, reject) => {
            myChildProcess = exec('ffmpeg (long running command)', (error, stdout, stderr) => {
                if (error) {
                    reject(error);
                }
                myChildProcess = null;
            });
            myChildProcess.on('exit', (code, signal) => {
                if (code === 0) { // exit code 0 means success
                    resolve();
                } else { // non-zero code means failure; a null code means the process was killed by a signal
                    reject(new Error(signal ? `killed with ${signal}` : `bad exit: ${code}`));
                }
                myChildProcess = null;
            });
        });
    };
})();
I'm not sure that exec()'s callback is necessary (unless you need to process stdout/stderr or need details of the error). It's possible that just the .on('exit') handler will suffice, or perhaps .on('close') and .on('error') handlers.
If the caller needs to handle "kill errors" differently from other errors, then there's a little more work to do. You will need to ensure that, on kill, the promise is rejected with a detectable error (e.g. a custom error class, or an Error monkeypatched with a custom property).
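For example, here is a minimal sketch of that idea; the KilledError class and its killed property are illustrative names, not anything from the original code:

class KilledError extends Error {
    constructor() {
        super('process was killed');
        this.killed = true; // detectable flag for the caller
    }
}

// In the 'exit' handler, reject with the custom error on kill:
// reject(signal ? new KilledError() : new Error(`bad exit: ${code}`));

// The caller can then branch on the flag:
doFFMpeg(path).catch((err) => {
    if (err.killed) {
        // superseded by a newer conversion; safe to ignore
    } else {
        // a genuine conversion failure
    }
});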

If I understand correctly, you want to execute ffmpeg conversions in a chain, one after the other, and kill the active one upon moving to the next if it hasn't finished yet.
Assuming child_process.exec() is used, you could keep track of the child process in a variable outside the function, and when doFFMpeg() is invoked, it should kill() any process still running before instantiating the new promise.
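On recent versions of Node, child_process also accepts an AbortSignal, which gives you roughly the same behaviour without hand-rolling the kill logic. A sketch, with the ffmpeg command left as a placeholder as in the question:

const { exec } = require('child_process');

let controller = null;

function doFFMpeg(path) {
    if (controller) {
        controller.abort(); // kills the previous ffmpeg; its promise rejects with an AbortError
    }
    controller = new AbortController();
    return new Promise((resolve, reject) => {
        exec('ffmpeg (long running command)', { signal: controller.signal },
            (error) => (error ? reject(error) : resolve()));
    });
}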

Related

Node.js - Calling asynchronous functions whenever a toggle is changed

In my website, I have a toggle button that determines whether or not a given user will receive messages from a given source. Whenever this toggle is changed, an asynchronous function needs to be called. However, if the toggle is changed and then quickly changed again, my program must wait for the previous asynchronous call to finish. Here are the two functions that make the asynchronous calls:
_enable() {
    let params = determineSubscriptionParams(this.endpoint, this.level);
    SNS_CLIENT.subscribe(params, (err, data) => {
        if (err) {
            throw err;
        } else {
            // do stuff here
        }
    });
}

_disable() {
    let params = {
        SubscriptionArn: this.subscriptionArn
    };
    SNS_CLIENT.unsubscribe(params, (err, data) => {
        if (err) {
            throw err;
        } else {
            // do stuff here
        }
    });
}
Both of these functions are members of a class, and subscribe and unsubscribe are the asynchronous calls.
You are looking for a lock around critical sections of code. If you don't want to implement your own, you could use async-lock to accomplish this.
This will make a second request made in quick succession wait to enter the critical part of the code until the first request has released the lock.
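A minimal sketch of how that could look here, assuming async-lock's acquire(key, fn) API; the lock key name is arbitrary, and the SNS calls are promisified with the AWS SDK v2 .promise() helper:

const AsyncLock = require('async-lock');
const lock = new AsyncLock();

_enable() {
    // acquire() queues this callback until any current holder of 'subscription' releases it
    lock.acquire('subscription', async () => {
        let params = determineSubscriptionParams(this.endpoint, this.level);
        await SNS_CLIENT.subscribe(params).promise();
        // do stuff here
    });
}

_disable() {
    lock.acquire('subscription', async () => {
        await SNS_CLIENT.unsubscribe({ SubscriptionArn: this.subscriptionArn }).promise();
        // do stuff here
    });
}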

Waiting in a while loop on an async function (Node.js/ES6)

I'm writing a Windows Node.js server app (using ES6 btw).
The first thing I want to do - in the top-level code - is sit in a while loop, calling an async function which searches for a particular registry key/value. This function is 'proven' - it returns the value data if found, or else throws:
async GetRegValue(): Promise<string> { ... }
I need to sit in a while loop until the registry item exists, and then grab the value data. (With a delay between retries).
I think I know how to wait for an async call to complete (one way or the other) before progressing with the rest of the start-up, but I can't figure out how to sit in a loop waiting for it to succeed.
Any advice please on how to achieve this?
(I'm fairly new to TypeScript, and still struggling to get my head round all the async/await scenarios!)
Thanks
EDIT
Thanks guys. I know I was 'vague' about my code - I didn't want to post my real/pseudo code attempts, since they have probably all overlooked the points you can hopefully help me understand.
So I just kept it as a textual description... I'll try though:
async GetRegValue(): Promise<string> {
    const val: RegistryItem = await this.GetKeyValue(this.KEY_SW, this.VAL_CONN);
    return val.value;
}

private async GetKeyValue(key: string, name: string): Promise<RegistryItem> {
    return await new Promise((resolve, reject) => {
        new this.Registry({
            hive: this.Hive, key
        }).get(name, (err, items) => {
            if (err) {
                reject(new Error('Registry get failed'));
            } else {
                resolve(items);
            }
        });
    })
    .catch(err => { throw err });
}
So I want to do something like:
let keyObtained = false;
let val;
while (keyObtained == false)
{
    // Call GetRegValue until val returned, in which case break from loop
    // If exception then pause (e.g. ~100ms), then loop again
}
// Don't execute here till while loop has exited
// Then use 'val' for the subsequent statements
As I say, GetRegValue() works fine in other places I use it, but here I'm trying to pause further execution (and retry) until it does come back with a value
You can probably just use recursion. Here is an example of how you can keep calling the GetRegValue function until it resolves, using the retryReg function below.
If the catch case is hit, it will just call GetRegValue over and over until it resolves successfully.
You should add a counter in the catch() so that after some number of attempts you give up.
Keep in mind I mocked the whole GetRegValue function, but given what you stated this should still work for you.
let test = 0;

function GetRegValue() {
    return new Promise((resolve, reject) => {
        setTimeout(function () {
            test++;
            if (test === 4) {
                return resolve({
                    reg: "reg value"
                });
            }
            reject({
                msg: "not ready"
            });
        }, 1000);
    });
}

function retryReg() {
    GetRegValue()
        .then(registryObj => {
            console.log(`got registry obj: ${JSON.stringify(registryObj)}`);
        })
        .catch(fail => {
            console.log(`registry object is not ready: ${JSON.stringify(fail)}`);
            retryReg();
        });
}

retryReg();
I don't see why you need this line:
.catch(err => { throw err });
The loop condition of while isn't much use in this case, as you don't really need a state variable or expression to determine if the loop should continue:
let val;
while (true)
{
    try {
        val = await GetRegValue(/* args */);
        break;
    } catch (x) {
        console.log(x); // or something better
    }
    await delay(100);
}
If the assignment to val succeeds, we make it to the break statement and so leave the loop successfully. Otherwise we jump to the catch block, log the error, wait 100 ms and try again.
It might be better to use a for loop and so set a sensible limit on how many times to retry.
Note that delay is available in an npm package of the same name. It's roughly the same as:
await new Promise(res => setTimeout(res, 100));
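A sketch of the capped variant suggested above, assuming the same GetRegValue and an inline delay helper; maxAttempts is an arbitrary illustrative limit:

const delay = (ms) => new Promise(res => setTimeout(res, ms));

async function getRegValueWithRetry(maxAttempts = 50) {
    for (let attempt = 1; attempt <= maxAttempts; attempt++) {
        try {
            return await GetRegValue(); // success: return the value immediately
        } catch (x) {
            console.log(`attempt ${attempt} failed, retrying...`);
        }
        await delay(100);
    }
    throw new Error(`registry value not found after ${maxAttempts} attempts`);
}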

Next.js: What to do if a child process doesn't have any data to return?

I'm using Next.js to pipe system calls. The code looks something like this (not exactly this, but this is simple enough to illustrate what I'm doing):
export async function find(file, searchTerm) {
    const cat = execFile('cat', [file], { maxBuffer: 1024 * 1024 });
    const grep = execFile('grep', [searchTerm], { maxBuffer: 1024 * 1024 });
    cat.stdout.pipe(grep.stdin);
    return new Promise((resolve, reject) => {
        grep.stdout.on('data', async (d) => {
            setTimeout(() => resolve(d.toString().trim()), 100);
        });
    });
}
Notice there are two processes:
cat
grep
cat.stdout is piped to grep.stdin, and when grep.stdout receives data, the Promise resolves.
All fine and good. The problem is that if grep doesn't find searchTerm in the output of cat, the callback for grep.stdout.on('data', ...) is never invoked and the whole chain just hangs.
In production I have an abstraction that lets me chain together an arbitrary number of processes (started with execFile as above).
Is there a way to detect if any process in the chain returns "nothing", and to just send "nothing" (e.g. the empty string) along the pipe?
Thanks.
You can use the 'exit' event here.
The 'exit' event is emitted after the child process ends. If the process exited, code is the final exit code of the process, otherwise null. If the process terminated due to receipt of a signal, signal is the string name of the signal, otherwise null. One of the two will always be non-null.
So you can reject the promise when this event occurs.
More info: https://nodejs.org/api/child_process.html#child_process_event_exit
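A sketch of that idea applied to the find() function above. Note that grep in particular exits with code 1 when it finds no match, so for this specific chain you may prefer to resolve with an empty string in that case rather than reject:

return new Promise((resolve, reject) => {
    grep.stdout.on('data', (d) => {
        resolve(d.toString().trim());
    });
    grep.on('exit', (code) => {
        if (code === 1) {
            resolve(''); // grep found nothing: send "nothing" along the pipe
        } else if (code !== 0) {
            reject(new Error(`grep exited with code ${code}`));
        }
        // code 0: a match was found and the 'data' handler has already resolved
    });
});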
You can call the process.exit() function in the callback, like this:
const { execFile } = require('child_process');

async function find(file, searchTerm) {
    const cat = execFile('cat', [file], { maxBuffer: 1024 * 1024 });
    const grep = execFile('grep', [searchTerm], { maxBuffer: 1024 * 1024 }, (error, stdout, stderr) => {
        // grep gives an empty string when there is no match
        if (stdout === '') {
            console.log('empty');
            process.exit();
        }
    });
    cat.stdout.pipe(grep.stdin);
    return new Promise((resolve, reject) => {
        grep.stdout.on('data', async (d) => {
            setTimeout(() => resolve(d.toString().trim()), 100);
        });
    })
    .then(d => console.log('success'));
}
What I ended up doing is this: for every process in the chain, I added the following event listener:
p.on('close', (code) => {
    if (code > 0) {
        reject(`child process PID ${p.pid} exited with code ${code}`);
    }
});
If the exit code of any of the processes isn't 0 (0 means no errors), reject the Promise.

nodejs concurrency synchronous execution

I have two WebSockets receiving data asynchronously; every time I get a message from one of the sockets, I execute some code in CompareData.
The problem is that CompareData should be executed synchronously, or (better) only if it is not already running.
This is my code:
function CompareData(data) {
    console.log('data ', data);
    AsyncFunction();
};

ws1 = new WebSocket(WS1_URL);
ws2 = new WebSocket(WS2_URL);

ws1.on('message', (data) => {
    CompareData(data);
});
ws2.on('message', (data) => {
    CompareData(data);
});
Can you help me, please? I'm very new to Node.js.
Node.js is single threaded, so you don't really get true concurrency issues in Node programs as you might in other languages. In your example, at most one WebSocket callback into CompareData can be executing at any given time.
You should not make synchronous calls in Node.js, but you can make the calls sequential. The example below might be helpful.
var messages = [];
var inProgress = false;

function CompareData(data) {
    return new Promise((resolve, reject) => {
        // do some stuff and resolve
        setTimeout(() => {
            resolve(data);
        }, 1000);
    });
};

const start = async () => {
    if (!inProgress) {
        if (messages.length !== 0) {
            inProgress = true;
            try {
                const data = await CompareData(messages.shift());
                console.log(data);
            } catch (error) {
                console.log(error);
            }
            inProgress = false;
            await start();
        } else {
            console.log('Process Done');
        }
    }
}

const handler = (data) => {
    messages.push(data);
    start();
}

handler(1);
handler(2);
handler(3);
handler(4);

// ws1 = new WebSocket(WS1_URL);
// ws2 = new WebSocket(WS2_URL);
// ws1.on('message', handler);
// ws2.on('message', handler);
You could use a mutex to ensure that two async CompareData operations are not executed at the same time, for example node-mutex or mutexify.
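If you go that route, a minimal sketch with mutexify, assuming its callback-style lock(fn) API (fn receives a release function) and a CompareData reworked to return a promise:

const mutexify = require('mutexify');
const lock = mutexify();

function compareDataExclusive(data) {
    // callers queue up here; only one runs the critical section at a time
    lock((release) => {
        CompareData(data)
            .catch(err => console.error(err))
            .then(() => release()); // free the lock for the next caller
    });
}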
My suggestions are:
First of all, you need to know when CompareData has finished. Reorganize your code to use promises or callbacks. If you're using third-party async functions, I'm almost sure they provide some feedback on completion; this is a must-have in the async world.
Add an inProgress = false flag somewhere to serve as a simple lock. As someone posted, JS is single-threaded and you're guaranteed that your code won't get interrupted in the middle of a synchronous operation. Thanks to that, you can use really simple locks instead of the complicated OS-based mutexes known from multithreaded languages.
In ws.on(...), check whether inProgress is set. If not, set it and run CompareData (see the sketch after this list).
In CompareData's completion callback, or on promise resolution, set inProgress back to false, so that you're no longer ignoring incoming data.
If you can simply discard the data, there is no need to complicate this scenario with extra queues, mutexes, etc.
If you need to process it all, then queue incoming data and process the next piece after the completion callback fires.
This is basically what Rahul suggests, but he uses features that are not established in the current version of the standard, so don't use it if you're not transpiling your code.
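A minimal sketch of that flag-based lock, assuming CompareData has been reworked to return a promise; this is the discard variant, and the queued variant would push to an array instead of returning:

let inProgress = false;

function onMessage(data) {
    if (inProgress) {
        return; // busy: discard (or queue) this message
    }
    inProgress = true;
    CompareData(data)
        .catch(err => console.error(err))
        .then(() => { inProgress = false; }); // release the lock either way
}

ws1.on('message', onMessage);
ws2.on('message', onMessage);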

What is a sensible way to structure my control flow (promises and looping)?

I'm not sure of how to adequately achieve my desired control flow using promises/bluebird.
Essentially I have a database with X 'tasks' stored and each needs to be loaded and executed sequentially. I don't want to run more than one task concurrently and the entire code must continue executing indefinitely.
I have achieved this with the following code so far:
export default function syncLoop() {
    getNextTaskRunner().then((taskRunner) => {
        if (taskRunner) {
            taskRunner.startTask()
                .then(syncLoop)
                .catch((error) => {
                    throw new Error(error);
                });
        } else {
            syncLoop();
        }
    });
}
getNextTaskRunner() simply loads and resolves with the next task from the database (determined by timestamps), or it resolves with null (no task available).
taskRunner.startTask() resolves with null when the full task has completed.
I've been advised that the way it is structured (recursion with promises) could lead to stack issues after it has been running for some time.
What I've thought about doing is to restructure it to something like:
let running = false;

setInterval(() => {
    if (!running) {
        running = true;
        getNextTaskRunner().then((taskRunner) => {
            if (taskRunner) {
                taskRunner.startTask()
                    .then(() => {
                        running = false;
                    })
                    .catch((error) => {
                        log.error(error);
                    });
            } else {
                running = false;
            }
        });
    }
}, 5000);
Or as yet another possibility, using event emitters in some form?
task.on('complete', nextTask);
Thoughts and advice will be greatly appreciated!
What stack issues? The way you've written your code is perfectly fine as long as getNextTaskRunner is truly async (i.e. it gives control back to the event loop at some point, e.g. because it does async IO). In that case there is no unbounded recursion in your code: each .then(syncLoop) continuation starts on a fresh stack. Whoever told you that is mistaken.
Though you might want to add a setTimeout somewhere so you won't flood your db with requests. It will also protect you if getNextTaskRunner ever stops being truly async (for example due to in-memory caching):
export default function syncLoop() {
    setTimeout(() => {
        getNextTaskRunner().then((taskRunner) => {
            if (taskRunner) {
                taskRunner.startTask()
                    .then(syncLoop)
                    .catch((error) => {
                        throw new Error(error);
                    });
            } else {
                syncLoop();
            }
        });
    }, 2000);
}
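For what it's worth, the same loop also reads naturally with async/await; a sketch under the same assumptions about getNextTaskRunner(), startTask(), and the log object from your second snippet:

export default async function syncLoop() {
    while (true) {
        await new Promise((res) => setTimeout(res, 2000)); // throttle db polling
        try {
            const taskRunner = await getNextTaskRunner();
            if (taskRunner) {
                await taskRunner.startTask();
            }
        } catch (error) {
            log.error(error); // swallow and keep looping: the loop must run indefinitely
        }
    }
}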
