NodeJS child_process: close event not fired when specifying stdio: "inherit"

I am trying to execute some shell commands from Node.js, using the node:child_process module.
I use the spawn function so I can forward the output of the child process to the console of the main process.
To keep the formatting of the child process output I passed the option stdio: "inherit" (as described in this question: preserve color when executing child_process.spawn).
But with this option the child process events (exit, disconnect, close, ...) no longer fire. If I remove the option the events work, but I lose the formatting. Is there a way to keep the formatting and still be notified when the child process closes?
The (relevant) code:
const { spawn } = require("node:child_process");

let child = spawn("yarn", args, {
  stdio: "inherit",
  shell: true,
});

child.on("close", (code) => {
  console.log(`child process exited with code ${code}`);
});

stdio: 'inherit' means you're also forwarding your STDIN to the child process; if the child process reads STDIN, it will never exit, and your close listener will never be called. In particular, that's the case for the Node.js REPL (yarn node).
Depending on your needs, you may want to either:
- stop the child process yourself with child.kill(). The close listener is then called; note that code is null and the second argument (signal) is 'SIGTERM' (documentation);
- not forward STDIN but still forward STDOUT and STDERR: call spawn with stdio: ['ignore', 'inherit', 'inherit'] (documentation). When the child process exits by itself and its streams are released, the close listener will be called (see the sketch below).
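A minimal sketch of the second option, keeping the child's formatting on STDOUT/STDERR while leaving STDIN alone (args is assumed to be defined elsewhere, as in the question):

const { spawn } = require("node:child_process");

// 'ignore' stdin so the child cannot block waiting on input;
// 'inherit' stdout/stderr so colors and formatting are preserved.
let child = spawn("yarn", args, {
  stdio: ["ignore", "inherit", "inherit"],
  shell: true,
});

child.on("close", (code, signal) => {
  console.log(`child process closed with code ${code}, signal ${signal}`);
});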

Related

How do I spawn two processes from Node.js and pipe them together?

I want to be able to spawn two Node.js child processes but have the stdout from one be piped to the stdin of another. The equivalent of:
curl "https://someurl.com" | jq .
My Node's stdout will go to either a terminal or to a file, depending on whether the user pipes the output or not.
You can spawn a child process with Node.js's child_process built-in module. We need two processes, so we'll call it twice:
const cp = require('child_process')
const curl = cp.spawn('curl', ['https://someurl.com'], { stdio: ['inherit', 'pipe', 'inherit'] })
const jq = cp.spawn('jq', ['.'], { stdio: ['pipe', 'inherit', 'inherit'] })
The first parameter is the executable to run, the second is the array of arguments to pass it, and the third is options. We need to tell it where the process's stdin, stdout and stderr are to be routed: 'inherit' means "use the host Node.js application's stdio", and 'pipe' means "we'll handle it programmatically".
So in this case curl's output and jq's input are left to be dealt with programmatically which we do with an additional line of code:
curl.stdout.pipe(jq.stdin)
which means "plumb curl's stdout into jq's stdin".
It's as simple as that.
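Putting it together, a self-contained sketch; the 'error' handlers are an addition here, not part of the answer above, but without them a missing curl or jq binary fails silently:

const cp = require('child_process')

const curl = cp.spawn('curl', ['https://someurl.com'], { stdio: ['inherit', 'pipe', 'inherit'] })
const jq = cp.spawn('jq', ['.'], { stdio: ['pipe', 'inherit', 'inherit'] })

// Plumb curl's stdout into jq's stdin.
curl.stdout.pipe(jq.stdin)

// 'error' fires if the executable itself cannot be spawned.
curl.on('error', (err) => console.error('curl failed to start:', err.message))
jq.on('error', (err) => console.error('jq failed to start:', err.message))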

Where does the buffer come into picture when using node.js exec function instead of spawn function?

As I read the child_process module documentation of Node.js, I understand the difference between exec and spawn. The most crucial difference, also highlighted in a similar StackOverflow question about spawn vs exec:
The main difference is that spawn is more suitable for long-running processes with huge output. That's because spawn streams input/output with the child process. On the other hand, exec buffers output in a small buffer (200 KB by default in older Node.js versions; 1 MB in current ones).
However, I noticed, thanks to TS IntelliSense, that both exec and spawn return a similar object of type ChildProcess. So I could technically write this for the exec function, using stdout as a stream, and it works:
function cmdAsync(cmd, options) {
  return new Promise((resolve) => {
    const proc = exec(cmd, options);
    proc.stdout?.pipe(process.stdout);
    proc.on('exit', resolve);
  });
}

cmdAsync('node server/main.mjs');
And without any buffering delay, I could see the logs generated by the server/main.mjs file being piped into the parent process's stdout stream.
So my question is: where exactly does the buffering happen, and how does the streaming behavior of exec differ from spawn's? Also, can I rely on this feature even though it is undocumented?
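For what it's worth, the buffering the documentation refers to happens inside exec itself: exec spawns the child and, in addition to exposing the live streams, accumulates everything written to stdout/stderr into an internal buffer for its callback, capped by maxBuffer. A rough sketch of that distinction (the maxBuffer value is deliberately tiny for illustration):

const { exec } = require('node:child_process');

const proc = exec('node server/main.mjs', { maxBuffer: 1024 }, (err, stdout) => {
  // The callback receives the accumulated output. If the child writes
  // more than maxBuffer bytes, exec terminates it and passes an error here.
  if (err) return console.error('exec error:', err.message);
  console.log('buffered output length:', stdout.length);
});

// The streams are live regardless: chunks arrive here as they are produced,
// exactly as they would with spawn.
proc.stdout.on('data', (chunk) => process.stdout.write(chunk));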

Stderr of child process not received when child process exits

We have a VS Code extension built in Node that runs like so:
Our extension is a Node process; it calls child_process.spawn(...) with the pipe option for stdio. The child process's stdin and stdout are used to pass messages between the two processes.
Due to a bug in the child process binary, we see an issue where a SIGSEGV error is raised by the binary. When we invoke the binary on the command line and hit this error, we see the stack trace dumped to stderr.
In the VS Code extension/Node process, the handlers for the child process exiting are hit, but the handlers for the stderr output are not.
In other, non-crashing scenarios, stderr output is correctly received.
The implementation of spawning and handling the child process is reasonably complex, but it boils down to something like this:
childProcess: child_process.ChildProcess;

startChildProcess() {
  this.childProcess = child_process.spawn(binary, args, {
    cwd: root,
    env: { ...process.env, ...environment },
    // pipe stdin, stdout and stderr (unless the parent knows better)
    stdio: ['pipe', 'pipe', 'pipe'],
  });
  // set up stdin and stdout pipes here...
  this.childProcess.on('exit', (code, signal) => {
    // do something interesting when the process exits
    // THIS CODE BLOCK IS HIT
  });
  this.childProcess.stderr?.addListener('data', (errOutput) => {
    const errOutputStr = errOutput.toString();
    process.stderr.write(errOutputStr);
    // THIS CODE BLOCK IS NOT HIT WHEN THE ABOVE IS HIT
  });
}
As annotated in the example, the stderr handler is not hit in the case of a SIGSEGV from the child process.
What can I do to ensure the errors are output? VS Code, the extension process, and the child process are all running on macOS.
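One thing worth ruling out (an assumption, not a confirmed diagnosis): 'exit' fires when the process itself terminates, possibly before the stderr pipe has been drained, whereas 'close' fires only after all stdio streams have ended. Tearing down state in the 'exit' handler can therefore discard stderr data that is still in flight. A sketch of the safer ordering:

// 'exit' can fire while buffered stderr is still being delivered.
this.childProcess.on('exit', (code, signal) => {
  // note the crash here, but don't destroy the streams yet
});

// 'close' fires after stdin/stdout/stderr have all ended, so every
// pending 'data' event on stderr has been emitted by this point.
this.childProcess.on('close', (code, signal) => {
  // safe to clean up here
});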

What are the ways to flush a Linux command's stdout as a Node.js child process?

Ending the stdin stream (child.stdin.end()) works, as does unbuffering the command with stdbuf.
But I imagine there's a proper way to stream results from external commands.
How do we tell a command we're ready to consume output while we are still providing input?
Example:
const { spawn } = require('child_process');
const child = spawn('uniq');
child.stdout.pipe(process.stdout);
child.stdin.write('a\nb\nb\nc\n', 'utf8');
// No output, child is still running.
(uniq is just an example here. It's the same with most Linux commands.)
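A sketch of the stdbuf approach mentioned above (stdbuf is part of GNU coreutils, so this assumes Linux). The underlying issue is that the child's libc block-buffers stdout when it is connected to a pipe instead of a TTY; stdbuf -oL switches it back to line buffering:

const { spawn } = require('child_process');

// Run uniq with line-buffered stdout instead of the default pipe buffering.
const child = spawn('stdbuf', ['-oL', 'uniq']);
child.stdout.pipe(process.stdout);

child.stdin.write('a\nb\nb\nc\n', 'utf8');
// 'a' and 'b' are printed immediately: uniq emits each line as soon as it
// sees a different following line, and -oL flushes stdout at every newline.
// 'c' still only appears once stdin ends, because uniq must see EOF to know
// the last group is complete:
setTimeout(() => child.stdin.end(), 2000);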

Node-webkit execute external command?

How can I execute a system command in node-webkit (or Node.js) as an external process running in parallel with the current script?
I'm trying to use child_process, but when my script is interrupted the subprocess exits with it. I need a simple way to execute a bash command (with or without capturing its output) that keeps running after my script is interrupted.
Thanks.
Use the detached option in the spawn options:
If the detached option is set, the child process will be made the leader of a new process group. This makes it possible for the child to continue running after the parent exits.
By default, the parent will wait for the detached child to exit. To prevent the parent from waiting for a given child, use the child.unref() method, and the parent's event loop will not include the child in its reference count.
Example of detaching a long-running process and redirecting its output to a file:
var fs = require('fs'),
    spawn = require('child_process').spawn,
    out = fs.openSync('./out.log', 'a'),
    err = fs.openSync('./out.log', 'a');

var child = spawn('prg', [], {
  detached: true,
  stdio: ['ignore', out, err]
});

child.unref();
