I'm having trouble stopping a node.js process that spawns a child process. If I run the .js process from Terminal, I can stop it using Ctrl+C. But if I spawn it from a NodeJS app, I cannot kill it using kill("SIGINT") -- it just keeps going and continues to report stdout.
Here's the setup. I have a script, let's call it docker.js, and it does this:
// docker.js
const child_process = require("child_process");
child_process.spawn("docker-compose", ["up", "-d", ...args], { stdio: 'inherit' });
The docker-compose up command does a lot of things and runs for a while, sometimes for several minutes.
If I run ./docker.js from Terminal, I can consistently break out at any point by pressing Ctrl+C.
If I spawn docker.js from inside a different NodeJS app (in my case an Electron app), using spawn() or fork():
// DockerApp.js
const { spawn, fork } = require("child_process");
const dir = `path/to/dockerjs/file/`;
// Tried this with and without `detached: true`
const child = spawn("node", [`./docker.js`, ...args], { cwd: dir, env, detached: true });
// Also tried this, which uses Electron's packaged node process
const child = fork(`./docker.js`, args, { cwd: dir, env, silent: true });
And I listen for stdout and stderr and close:
child.stdout.on("data", data => {
console.log(`stdout: ${data}`);
});
child.stderr.on("data", data => {
console.log(`stderr: ${data}`);
});
child.on("close", code => {
console.log(`child process exited with code ${code}`);
});
Everything works fine (I see expected output and eventually "close" after completion), but if I try to stop the process before completion like this:
child.kill("SIGINT"); // Equivalent to Ctrl+C in terminal
The child process just keeps running, and I keep getting docker-compose output through stdout.
I've tried for a while, but I cannot figure out how to stop docker.js when it is spawned as a child process from a NodeJS/Electron app. I thought Ctrl+C from Terminal and child.kill("SIGINT") would have the same behavior, but they don't.
Can anyone explain what is going on here? And how can I reliably kill this docker.js child process from my NodeJS app?
Try something like this in the child process:
process.on('SIGINT', () => {
console.log('Received SIGINT');
process.exit(0);
});
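Building on that, here is a minimal sketch of what docker.js itself might look like with such a handler. The composeChild variable and the use of process.argv for the arguments are assumptions for illustration, not part of the original snippet:
// docker.js (sketch only)
const child_process = require("child_process");

// Keep a reference to the docker-compose process so the SIGINT handler can reach it
const composeChild = child_process.spawn(
  "docker-compose",
  ["up", "-d", ...process.argv.slice(2)],
  { stdio: "inherit" }
);

process.on("SIGINT", () => {
  console.log("Received SIGINT");
  // Forward the signal to docker-compose, then exit once it has shut down
  composeChild.kill("SIGINT");
  composeChild.on("close", () => process.exit(0));
});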
Related
I start a spawn child process this way:
const { spawn } = require('child_process')

// Note: naming this variable `process` shadows the global `process` object
let process = spawn(apiPath, {
  detached: true
})
process.unref()
process.stdout.on('data', data => { /* do something */ })
When I start the process I need to keep it attached because I want to read its output. But just before closing my Node process (the parent), I want to detach all unfinished child processes so that they keep running in the background. However, as the documentation says:
When using the detached option to start a long-running process, the process will not stay running in the background after the parent exits unless it is provided with a stdio configuration that is not connected to the parent.
But with the stdio: 'ignore' option I can't read the stdout, which is a problem.
I tried to manually close the pipes before closing the parent process, but it was unsuccessful:
// Trigger just before the main process end
process.stdin.end()
process.stderr.unpipe()
process.stdout.unpipe()
After many tests I found at least one way to solve this problem: destroying all the pipes before leaving the main process.
One tricky point is that the child process has to handle the pipe destruction correctly; if it doesn't, it could get an error and close anyway. In this example the Node child process seems to have no problem with this, but it could be different in other scenarios.
main.js
const { spawn } = require('child_process')
console.log('Start Main')
let child = spawn('node', ['child.js'], { detached: true })
child.unref() // With this, the main process can end after the child is fully disconnected
child.stdout.on('data', data => {
console.log(`Got data : ${data}`)
})
// In a real case this should be triggered just before the end of the main process
setTimeout(() => {
console.log('Disconnect the child')
child.stderr.unpipe()
child.stderr.destroy()
child.stdout.unpipe()
child.stdout.destroy()
child.stdin.end()
child.stdin.destroy()
}, 5000)
child.js
console.log('Start Child')
setInterval(function() {
process.stdout.write('hello from child')
}, 1000)
output
Start Main
Got data : Start Child
Got data : hello from child
Got data : hello from child
Got data : hello from child
Got data : hello from child
Disconnect the child
The objective is to launch Chrome or Chromium from Node.js using child_process, and return immediately, similar to how the Windows START command launches a completely separate process and the calling process can exit immediately.
The { shell: true } option for child_process.execFile() almost does the job, in that it separates the node process from the Chrome process; I can exit the main Node.js process with Ctrl+C, and the launched browser remains open. Without that option, the two remain tied together and ^C in node closes Chrome.exe.
What I need, however, is for node to exit completely after launching Chrome. There are apparently no adverse effects of pressing ^C. So if ^C can exit node, why won't it exit immediately? I suspect that until the chrome process object is destroyed, node can't exit in good conscience.
What is interesting: if the same Chrome.exe happens to be running already, the "new" Chrome I am launching opens a new tab or window in that existing Chrome and exits. In that case the Node.js script exits immediately.
const child_process = require('child_process');
let ex = "C:\\PROGRA~2\\Google\\Chrome\\APPLIC~1\\chrome.exe";
let chrome = child_process.execFile(ex, [
// tried various Chromium switches here but nothing helped
], {
shell: true, // this spawns a separate process but node won't exit
} , function(err, data) {
console.log(err)
console.log(data.toString());
});
chrome.stdout.on('data', function (data) {
console.log('stdout: ' + data);
});
chrome.stderr.on('data', function (data) {
console.log('stderr: ' + data);
});
chrome.on('exit', function (code) {
console.log('child process exited with code ' + code);
// chrome.kill();
});
Expected: since Node.js can be killed with ^C, why does it even continue running and blocking? I would expect it to exit after it has launched Chrome.exe.
However, in actuality, Node.js blocks until I exit Chrome or press ^C.
I also tried without the callback function and the .stdout, .stderr and .on hooks; they don't seem to help or hurt. Node always blocks until I press ^C or until the child process, albeit separate, exits.
Posting r3wt's comment as an answer here: use process.exit(0) to exit your script without an error. It doesn't exit immediately because there are still running EventEmitters.
const path = require('path');
const spawn = require('child_process').spawn;
const chrome = spawn(
  path.join(__dirname, '/path_to_chrome'),
  [ /* your chrome switches here */ ],
  {
    shell: true,     // use to run in a shell if you need
    detached: true,  // important
    stdio: 'ignore'  // important
  }
);
chrome.unref(); // this one is important too
To achieve both (the ability to exit immediately and keeping the subprocess running), follow the instructions below.
Use subProcess.unref(), which is explained here.
It prevents the parent from waiting for a given subprocess to exit.
One more thing is the detached option together with stdio: 'ignore', which allows you to keep your subprocess alive even after you stop the Node.js process.
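A minimal sketch of that combination (the executable path here is a placeholder, not taken from the answer above):
const { spawn } = require('child_process');

// Placeholder command; substitute the real executable and arguments
const subProcess = spawn('/path/to/executable', [], {
  detached: true,  // run in its own process group so it can outlive the parent
  stdio: 'ignore'  // no pipes connecting it to the parent
});

subProcess.unref(); // the parent's event loop stops waiting for it, so node can exit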
I am trying to get a node script to run another command with child_process, which starts up a server, but this server has 'live reload' functionality.
As far as I understand, as soon as anything changes, the child process restarts, and my parent process continues as if the child died/exited.
I want my parent process to re-connect to a newly forked/spawned child.
Is there any way to do this?
const child_process = require('child_process');
const command = 'server --listen 7070';
const process = child_process.exec(command);
process.stdout.on('data', (data) => {
console.log(data);
});
process.on('exit', (event) => {
console.log('The child process exited')
});
In this example, the server keeps running, but the stdout processing stops after the child process restarts itself for the first time.
What I want to do is run a command line script in a separate process, without waiting for the result, when an endpoint in my Express app is hit.
Right now I am using child_process's spawn function and it is working, but if the Node server were to quit, the child script would quit as well. I need the child script to run to completion even if the server quits.
I don’t need access to stdout or anything from the child script. I just need a way to basically “fire and forget”
Is there any way to do this with spawn that I may be missing? Or is there another way I should be going about this?
Thanks in advance for any guidance!
What you want here is options.detached of spawn. Setting this option will allow the sub-process to continue even after the main process calling spawn has terminated.
Quoting the documentation:
On Windows, setting options.detached to true makes it possible for the child process to continue running after the parent exits. The child will have its own console window. Once enabled for a child process, it cannot be disabled.
On non-Windows platforms, if options.detached is set to true, the child process will be made the leader of a new process group and session. Note that child processes may continue running after the parent exits regardless of whether they are detached or not. See setsid(2) for more information.
Basically this means that whatever you "launch" keeps running until it actually terminates itself. Being 'detached', there is nothing that "ties" the sub-process to the execution of the parent from which it was spawned.
Example:
listing of sub.js:
(async function() {
try {
await new Promise((resolve,reject) => {
let i = 0;
let ival = setInterval(() => {
i++;
console.log('Run ',i);
if (i === 5) {
clearInterval(ival);
resolve();
}
}, 2000);
});
} catch(e) {
console.error(e);
} finally {
process.exit();
}
})();
listing of main.js
const fs = require('fs');
const { spawn } = require('child_process');
(async function() {
try {
const out = fs.openSync('./out.log', 'a');
const err = fs.openSync('./out.log', 'a');
console.log('spawn sub');
const sub = spawn(process.argv[0], ['sub.js'], {
detached: true, // this removes ties to the parent
stdio: [ 'ignore', out, err ]
});
sub.unref();
console.log('waiting..');
await new Promise((resolve,reject) =>
setTimeout(() => resolve(), 3000)
);
console.log('exiting main..');
} catch(e) {
console.error(e);
} finally {
process.exit();
}
})();
The basics are that the sub.js listing is going to produce output every 2 seconds for 5 iterations. main.js is going to "spawn" this process as detached, then wait for 3 seconds and terminate itself.
Though it's not really needed, for demonstration purposes we are setting up the spawned sub-process to redirect its output (both stdout and stderr) to a file named out.log in the same directory.
What you see here is that the main listing does its job and spawns the new process, then terminates after 3 seconds. At that point the sub-process will only have produced one line of output, but it will continue to run and write output to the redirected file for another 7 seconds, despite the main process being terminated.
Consider the following code:
import {spawn, exec} from 'child_process';
var child = spawn('su',
[process.env.USER, '-c', 'while (true); do sleep 0.3; echo "tick"; done'],
{stdio: ['ignore', 'pipe', 'pipe']}
);
child.stdout.pipe(process.stdout);
child.stderr.pipe(process.stderr);
setTimeout(() => {
child.kill();
}, 1000);
Here I'm trying to run a script which itself runs another child process (in this example, su will spawn a bash process) and then shut it all down. However, I can't make it work as I expect.
Calling child.kill() kills just the parent process of su and not its child bash.
What can be done to make it work? Calling exec(`pkill -TERM -P ${child.pid}`) instead of child.kill()? As far as I understand, this will kill the whole process tree whose parent is child.pid.
Yet, there is some oddity when combining the two methods:
setTimeout(() => {
child.kill();
exec(`pkill -TERM -P ${child.pid}`);
}, 1000);
This code continues writing tick into the console even after the process has been killed.
Why is this happening? Can somebody explain, please?
I was facing the exact same problem. I found the solution in How to kill child processes that spawn their own child processes in Node.js.
Here is the working form of your code:
const {spawn, exec} = require('child_process');
var child = spawn('./test.sh',
[],
{stdio: ['ignore', 'pipe', 'pipe'], detached: true} // <---- this
);
child.stdout.pipe(process.stdout);
child.stderr.pipe(process.stderr);
setTimeout(() => {
process.kill(-child.pid); // <---- and this
// child.kill();
}, 1000);
When I ran your original code, the terminal prevented me from running su from a script, so I changed the test command to ./test.sh, which does the same thing:
#!/bin/bash
(while (true); do sleep 0.3; echo "tick"; done)
So the lines that do the magic are detached:true and process.kill(-child.pid).
Quoted from the original site:
We can start child processes with the {detached: true} option so that those processes are not attached to the main process; instead they go into a new process group. Then, using the process.kill(-pid) method from the main process, we can kill all processes that are in the same group as the child with that pid.