setTimeout blocks Promise inside a child process - node.js

I encountered a weird issue with setTimeout inside a promise in a child process.
These are my files:
index.js:
const {spawnSync} = require('child_process');
const {resolve} = require('path');
const timeoutPromiseModule = resolve(__dirname, '.', 'timeout-promise');
const {stdout} = spawnSync('node', [timeoutPromiseModule]);
console.log(stdout.toString());
timeout-promise.js:
Promise.race([
  Promise.resolve(),
  new Promise((resolve, reject) => {
    setTimeout(() => { reject('too long') }, 10000);
  })
])
  .then(x => console.log('resolved'))
  .catch(e => console.log('rejected'));
When I run node index.js I expected the output to print immediately, but what actually happens is that the output hangs until setTimeout's callback is called in the child process.
What's causing this and how can this be resolved?
I'm guessing it has something to do with the child process's event loop preventing the child process from closing until its queue is empty?
I uploaded the code to GitHub for your convenience:
https://github.com/alexkubica/promise-race-settimeout-blocking-inside-child-process

The reason for this is that spawnSync will not return until the child process has fully closed, as stated in the documentation:
The child_process.spawnSync() method is generally identical to
child_process.spawn() with the exception that the function will not
return until the child process has fully closed. [...]
Note that a Node script will only exit when there are no more pending tasks in the event loop's queue, which in this case happens only after the timeout has fired.
You can switch to spawn to see the immediately resolved promise output:
const {spawn} = require('child_process');

const res = spawn('node', [timeoutPromiseModule]);
res.stdout.on('data', (data) => {
  console.log(`stdout: ${data}`);
});
res.on('close', (code) => {
  console.log(`child process exited with code ${code}`);
});
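Alternatively, if you want to keep spawnSync, you can stop the timer itself from keeping the child alive by unref()-ing it: a Node timer that has been unref()'d will not hold the event loop open on its own. A sketch of timeout-promise.js with that change (everything else stays the same):
Promise.race([
  Promise.resolve(),
  new Promise((resolve, reject) => {
    // unref() tells Node this timer alone should not keep the process
    // alive, so the child exits as soon as the race settles
    const timer = setTimeout(() => { reject('too long') }, 10000);
    timer.unref();
  })
])
  .then(x => console.log('resolved'))
  .catch(e => console.log('rejected'));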

Related

node: await with child process message handler

I'm having trouble wrapping my head around some async/await code I'm working on. Is there a way to make a child process's message handler async? Here is what my child process file looks like:
// child.ts
import { writeImage } from './generate-images'
const slowFunction = async (imageAttributes) => {
  console.log("inside child slowFunction....")
  await writeImage(imageAttributes, 0, true)
}
process.on('message', (msg) => {
  console.log('starting child....')
  slowFunction(msg)
  console.log('exiting child')
  process.exit()
})
I am calling it via fork in a big loop in the parent process because I need to perform this slow function a few thousand times. This is a dumbed-down version of what I'm calling inside that loop in the parent:
// parent.ts
const child = child_process.fork(path.join(__dirname, 'child.ts'))
child.send(Array.from(imageAttributes)[i])
child.on('exit', function () {
  console.log(`child exiting`)
  // do some cleanup
})
The problem is that all my forks keep exiting before slowFunction finishes because it's an async function, but I can't add await slowFunction(msg) because the process.on('message', ...) handler is not async.
Any ideas?
You need to make the process.on('message', ...) handler an async function and await the slow function inside it; no callback is async by default.
process.on('message', async (msg) => {
  console.log('starting child....')
  await slowFunction(msg)
  console.log('exiting child')
  process.exit()
})
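On the parent side, if the big loop should wait for each child in turn, you can wrap one fork/send/exit cycle in a promise and await it. A sketch; runChild is a hypothetical helper name, not the poster's actual code:
const child_process = require('child_process')
const path = require('path')

// hypothetical helper: wraps one fork/send/exit cycle in a promise so
// the parent loop can await each child before starting the next
const runChild = (attrs) =>
  new Promise((resolve, reject) => {
    const child = child_process.fork(path.join(__dirname, 'child.ts'))
    child.send(attrs)
    child.on('exit', resolve)
    child.on('error', reject)
  })

// usage inside the loop:
// await runChild(Array.from(imageAttributes)[i])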

How to pipe text into command with promisified node exec

I am using node to execute a jar file that usually takes a CSV file as an input path.
I would like to avoid writing the CSV file and instead pipe the CSV as a string into the process, if possible.
I have this working with execSync, but I would prefer to use exec wrapped with promisify.
The problem is that exec does not have the input option that execSync has, so I can't pipe data into it. How do you get around this? Or is the best practice to wrap execSync in a Promise?
import {execSync} from 'child_process';
export const runJar = async (input: string, cwd: string) => {
  const out = execSync(`java -jar model.jar`, {
    cwd,
    input,
  })
  return out.toString('utf-8');
};
Here is a minimalistic example of using a child process's stdio:
https://nodejs.org/dist/latest-v14.x/docs/api/child_process.html#child_process_child_process_exec_command_options_callback
const child_process = require("child_process");
const fs = require("fs");
// exec returns a child process instance
// https://nodejs.org/dist/latest-v14.x/docs/api/child_process.html#child_process_class_childprocess
const child = child_process.exec("cat");
// write to child process stdin
child.stdin.write("Hello World");
// to read/parse your csv file
//fs.createReadStream("./file.csv").pipe(child.stdin);
// listen on child process stdout
child.stdout.on("data", (chunk) => {
console.log(chunk);
child.kill();
});
To promisify this, you can listen for the close event on the child process and resolve or reject the promise based on the exit code:
child.on("close", (code) => {
if (code != 0) {
reject();
} else {
resolve();
}
});
Putting this together:
const readParseCSV = function (file = "./file.csv") {
  return new Promise((resolve, reject) => {
    const child = child_process.exec("java -jar model.jar");
    fs.createReadStream(file).pipe(child.stdin);
    let response = "";
    // listen on child process stdout
    child.stdout.on("data", (chunk) => {
      response += chunk;
    });
    child.on("close", (code) => {
      if (code != 0) {
        reject();
      } else {
        resolve(response);
      }
    });
  });
};
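A hedged usage sketch, assuming model.jar reads the CSV from stdin and writes its result to stdout:
readParseCSV("./file.csv")
  .then((result) => console.log(result))
  // reject() is called with no reason, so the handler takes no argument
  .catch(() => console.error("child process exited with a non-zero code"));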
I'm not sure whether this works the same way on Windows as it does on Linux.

FileStream Promise resolving early

I am experiencing a rather weird problem in nodeJS, and I cannot quite figure out why.
Consider the following code:
(async () => {
  console.log("1");
  await new Promise((resolve, reject) => {
    setTimeout(() => {
      console.log("2");
      resolve();
    }, 1000);
  });
  console.log("3");
  process.exit();
})();
This code does exactly what it is supposed to do: it prints 1, 2 and 3, in that order, pausing for approximately one second after the 1. Perfect. Now let's see the following example:
const fs = require ("fs");
(async () => {
const stream = fs.createWriteStream ("file.txt");
stream.write ("Test");
console.log ("1");
await new Promise ((resolve, reject) => {
stream.on ("finish", () => {
console.log ("2");
resolve ();
});
});
console.log ("3");
process.exit ();
})();
From my understanding, this code should either complete or, in case the finish event never gets fired, hang forever. What happens is the exact opposite: it prints 1, then quits. Shouldn't it at least print another 3 before quitting, since that is the end of the script?
Important: I know that the promise will not resolve, because .end() is not called on the stream. I want to know why the script finishes anyway.
Can anyone explain this behaviour to me?
The best explanation is probably to write this without the async/await keywords and for you to understand that these don't do anything "magical"; they are simply "sugar" for a different way of resolving a Promise as opposed to .then().
const fs = require ("mz/fs");
const stream = fs.createWriteStream("file.txt");
stream.write("Test");
console.log("1");
new Promise ((resolve, reject) => {
stream.on ("finish", () => {
console.log("2");
resolve();
});
}).then(() => {
console.log("2");
process.exit();
});
That's the exact same thing, right? So where's the catch?
The thing you are really missing is that there is "nothing" that says an open file handle "must" be explicitly closed before the program can exit. As such, there is "nothing to wait for", and the program completes without ever "branching" into the part that is still awaiting the Promise to resolve().
The reason it only logs "1" is that the remaining branch "is" waiting for the Promise to resolve, but the program finishes before it ever gets there.
Of course that all changes when you actually call stream.end() immediately after the write, or ideally after "awaiting" any write requests that may be pending:
const fs = require ("mz/fs");
(async () => {
const stream = fs.createWriteStream ("file.txt");
await stream.write ("Test"); // await here before continuing
stream.end()
console.log ("1");
await new Promise ((resolve, reject) => {
stream.on ("finish", () => {
console.log ("2");
//resolve ();
});
});
console.log ("3");
//process.exit ();
})();
That of course will log each output in the listing, as you should well know.
So if you were expecting to see the "3" in the log, the reason it does not appear is the await combined with the fact that we never close the stream. Again, this is probably best demonstrated by getting rid of the await:
const fs = require ("mz/fs");
(async () => {
const stream = fs.createWriteStream ("file.txt");
await stream.write ("Test");
stream.end()
console.log ("1");
new Promise ((resolve, reject) => { // remove await - execution hoisted
stream.on ("finish", () => {
console.log ("2");
//resolve ();
});
});
console.log ("3");
//process.exit ();
})();
Then you "should" see:
1
3
2
At least on most systems, unless you have "extreme" lag; generally the "finish" event should fire before the next line is reached after "awaiting" the write.
NOTE: The mz library is used here just to demonstrate an await on the write() method without wrapping a callback. Generally speaking, the callback execution should resolve just the same.
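For reference, the same fix works with the plain fs module: call stream.end() and wait for the "finish" event; on Node 11.13+, events.once() turns that wait into a one-line promise. A minimal sketch:
const fs = require("fs");
const { once } = require("events");

(async () => {
  const stream = fs.createWriteStream("file.txt");
  stream.write("Test");
  console.log("1");
  stream.end(); // without end(), "finish" never fires
  await once(stream, "finish"); // resolves when "finish" is emitted
  console.log("2");
  console.log("3");
})();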

Why does my forked child process exit immediately after I fork it?

I'm just trying to fork a simple child process and have the IPC channel stay open but it keeps exiting immediately for some reason.
In parent.js:
var child = require('child_process').fork('./child.js');
child.on('hi', function() {
  console.log("Hi");
});
child.on('exit', function() {
  console.log("Exited");
});
child.send('hello');
In child.js:
process.on('hello', function() {
  process.send('hi');
});
I get "Exited" printed to the console immediately, and never get a 'Hi'. Then if I continue to try to send to the child process I get a channel closed error.
Is there something I am doing wrong?
Two things: IPC messages are always delivered via the 'message' event on both sides, not via an event named after their content, and you need to keep both processes open, as the child will otherwise close immediately and so will the parent. You can do so with something like this:
parent.js
var child = require('child_process').fork('./child.js');
child.on('message', function () {
  console.log("Hi");
});
child.on('exit', function () {
  console.log("Exited");
});
setTimeout(() => {
  child.send('hello');
}, 1000);
process.stdin.resume();
child.js
process.on('message', function () {
  console.log("sending hi");
  process.send('hi');
});
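A leaner variant is possible because an open IPC channel with a 'message' listener keeps both event loops alive on its own, so neither the setTimeout nor process.stdin.resume() is strictly required. A sketch (child.js stays as above); the parent just has to disconnect when it is done:
var child = require('child_process').fork('./child.js');
child.on('message', function () {
  console.log("Hi");
  child.disconnect(); // close the IPC channel so both processes can exit
});
child.on('exit', function () {
  console.log("Exited");
});
child.send('hello');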

Node.js child process with detached option

I am creating an Electron desktop app, and I have code that uses spawn() with the option detached: true. My purpose is to let the child process keep running even after the parent process has terminated.
const spawn = require('child_process').spawn;
const ls = spawn('ls', ['-lh', '/usr'], { detached: true });
ls.stdout.on('data', (data) => {
  console.log(`stdout: ${data}`);
  fs.writeFileSync('path-to-test.txt', 'stdout');
});
ls.stderr.on('data', (data) => {
  console.log(`stderr: ${data}`);
  fs.writeFileSync('path-to-test.txt', 'stderr');
});
ls.on('close', (code) => {
  console.log(`child process exited with code ${code}`);
});
Here are the squirrel events (see https://github.com/electron/grunt-electron-installer#handling-squirrel-events):
switch (squirrelCommand) {
  case '--squirrel-install':
  case '--squirrel-updated':
    app.quit();
    return true;
  case '--squirrel-uninstall':
    app.quit();
    return true;
  case '--squirrel-obsolete':
    return true;
}
I tested the above code outside of the squirrel events, and it works well while the parent process is alive. But after I put this code inside a squirrel event like --squirrel-uninstall (where the parent process may terminate before or during the child process run), it can only run commands; any code inside it (like the fs calls) doesn't work any more.
My question is: squirrel events aside, can logic like the fs calls inside the child process keep working after the parent Node process terminates?
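For reference, the pattern the Node.js child_process documentation gives for a child that should outlive its parent combines detached: true with stdio: 'ignore' and unref(); piping stdout/stderr back to the parent, as in the snippet above, keeps the two processes tied together. A sketch, where long-task.js is a hypothetical script name:
const { spawn } = require('child_process');

const subprocess = spawn('node', ['long-task.js'], {
  detached: true,  // make the child the leader of its own process group
  stdio: 'ignore'  // hold no pipes between parent and child
});
// remove the child from the parent's reference count so the parent
// can exit without waiting for it
subprocess.unref();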
