I start a child process with spawn this way:
const { spawn } = require('child_process')

let process = spawn(apiPath, { // note: this shadows the global `process` inside the module
  detached: true
})
process.unref()
process.stdout.on('data', data => { /* do something */ })
When I start the process I need to keep it attached because I want to read its output. But just before closing my Node process (the parent) I want to detach all unfinished child processes so they keep running in the background. However, as the documentation says:
When using the detached option to start a long-running process, the process will not stay running in the background after the parent exits unless it is provided with a stdio configuration that is not connected to the parent.
But with the option stdio: 'ignore' I can't read the stdout, which is a problem.
I tried to manually close the pipes before closing the parent process, but it was unsuccessful:
// Triggered just before the main process ends
process.stdin.end()
process.stderr.unpipe()
process.stdout.unpipe()
After many tests I found at least one way to solve this problem: destroying all the pipes before leaving the main process.
One tricky point is that the child process has to handle the destruction of the pipes correctly; if it doesn't, it could get an error and close anyway. In this example the Node child process seems to have no problem with this, but it could be different in other scenarios (see the error-handling sketch after the output below).
main.js
const { spawn } = require('child_process')

console.log('Start Main')

let child = spawn('node', ['child.js'], { detached: true })
child.unref() // With this, the main process can end once the child is fully disconnected

child.stdout.on('data', data => {
  console.log(`Got data : ${data}`)
})

// In a real case this would be triggered just before the end of the main process
setTimeout(() => {
  console.log('Disconnect the child')
  child.stderr.unpipe()
  child.stderr.destroy()
  child.stdout.unpipe()
  child.stdout.destroy()
  child.stdin.end()
  child.stdin.destroy()
}, 5000)
child.js
console.log('Start Child')
setInterval(function() {
  process.stdout.write('hello from child')
}, 1000)
output
Start Main
Got data : Start Child
Got data : hello from child
Got data : hello from child
Got data : hello from child
Got data : hello from child
Disconnect the child
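A sketch of one way a Node child like child.js could guard against the pipes being torn down, in case it does run into errors (this is an addition of mine, not part of the original example): writes to a destroyed pipe surface as 'error' events with code EPIPE on process.stdout / process.stderr, and installing a handler keeps them from crashing the child.
// In child.js: ignore broken-pipe errors once the parent has destroyed the pipes,
// so the child keeps running instead of crashing on its next write
process.stdout.on('error', err => {
  if (err.code !== 'EPIPE') throw err
})
process.stderr.on('error', err => {
  if (err.code !== 'EPIPE') throw err
})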
I have an architecture with one parent that spawns two children (one in C++, the other in Python). The parent spawns them with the following class:
import * as child from 'child_process';
import * as readline from 'readline';

export class subProcess {
  protected cmd: string;
  protected args: string[];
  protected process: child.ChildProcess;

  constructor(cmd: string, args: string[]) {
    this.cmd = cmd;
    this.args = args;
    this.process = null;
  }

  spawn(): void {
    this.process = child.spawn(this.cmd, this.args);

    const rlout = readline.createInterface({
      input: this.process.stdout,
    });
    rlout.on('line', line => this.logger.info(line));

    const rlerr = readline.createInterface({
      input: this.process.stderr,
    });
    rlerr.on('line', line => this.logger.error(line));

    this.process.on('exit', (code: number) => {
      this.logger.info(`exit code: ${code}`);
    });
  }
When I interrupt the parent with Ctrl-C, the SIGINT signal is caught in the parent process so that it can first disconnect and kill the children gracefully:
process.on('SIGINT', () => {
  this.bus.disconnect();
});
disconnect is a function that sends an "exit_process" command to the children via ZeroMQ. This command works perfectly fine in normal operation. But the problem is that when I press Ctrl-C, the SIGINT is caught by the parent and it executes the disconnect function (as expected), but it seems that it also propagates SIGINT to the children. Indeed, the "exit_process" command sent via ZeroMQ reaches its timeout (which means the children never received/answered it), whereas the children emit a return code via the exit event.
The point is that I can't detach and/or unref the children, or manage signals in the children, for project reasons. And I expected the parent to catch the SIGINT without propagating it to the children.
One more point: I tried to add the following in the subProcess class, but it did not work:
this.process.on('SIGINT', () => {
  console.log('SIGINT received. Do nothing');
});
Your SIGINT is being passed to the entire process group — see this section on Wikipedia. However, you're probably not seeing any output because of how the child process pipes are established.
When spawning a new child process, you can provide stdio options:
this.process = child.spawn(this.cmd, this.args, {stdio: 'inherit'});
The above causes the parent's process.stdin, process.stdout, process.stderr to be inherited by the child process. If you use this approach, you will see that your child is receiving the SIGINT.
The default behaviour is to create separate streams, which is why you are not seeing your console.log. You could also listen to the child's stdout stream:
this.process.stdout.on('data', data => console.log(data.toString()));
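For instance, a minimal sketch (hypothetical file names, separate from the class above) that makes the signal delivery visible through the piped stdout could look like this:
// parent.js: spawn with the default (piped) stdio and forward the child's output
const { spawn } = require('child_process');
const child = spawn('node', ['child.js']);
child.stdout.on('data', data => process.stdout.write(`child says: ${data}`));
child.on('exit', code => {
  console.log(`child exited with code ${code}`);
  process.exit(0);
});
process.on('SIGINT', () => {
  // keep the parent alive so the child's last output can still be read after Ctrl-C
  console.log('parent caught SIGINT');
});

// child.js: report the signal on stdout so the parent can observe it
process.on('SIGINT', () => {
  console.log('child received SIGINT');
  process.exit(0);
});
setInterval(() => {}, 1000); // keep the child's event loop busy
Pressing Ctrl-C in the terminal delivers SIGINT to the whole foreground process group, so the "child received SIGINT" line shows up in the parent's output even though the parent never forwarded the signal itself.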
What I want to do is when an endpoint in my Express app is hit, I want to run a command line script - without waiting for the result - in a separate process.
Right now I am using the child_process’s spawn function and it is working, but if the Node server were to quit, the child script would quit as well. I need to have the child script run to completion even if the server quits.
I don’t need access to stdout or anything from the child script. I just need a way to basically “fire and forget”
Is there any way to do this with spawn that I may be missing? Or is there another way I should be going about this?
Thanks in advance for any guidance!
What you want here is options.detached of spawn. Setting this option will allow the sub-process to continue even after the main process calling spawn has terminated.
Quoting the documentation:
On Windows, setting options.detached to true makes it possible for the child process to continue running after the parent exits. The child will have its own console window. Once enabled for a child process, it cannot be disabled.
On non-Windows platforms, if options.detached is set to true, the child process will be made the leader of a new process group and session. Note that child processes may continue running after the parent exits regardless of whether they are detached or not. See setsid(2) for more information.
Basically this means that whatever you "launch" keeps running until it actually terminates itself. Being 'detached', there is nothing that "ties" the sub-process to the execution of the parent from which it was spawned.
Example:
listing of sub.js:
(async function() {
  try {
    await new Promise((resolve, reject) => {
      let i = 0;
      let ival = setInterval(() => {
        i++;
        console.log('Run ', i);
        if (i === 5) {
          clearInterval(ival);
          resolve();
        }
      }, 2000);
    });
  } catch (e) {
    console.error(e);
  } finally {
    process.exit();
  }
})();
listing of main.js
const fs = require('fs');
const { spawn } = require('child_process');

(async function() {
  try {
    const out = fs.openSync('./out.log', 'a');
    const err = fs.openSync('./out.log', 'a');

    console.log('spawn sub');

    const sub = spawn(process.argv[0], ['sub.js'], {
      detached: true, // this removes ties to the parent
      stdio: [ 'ignore', out, err ]
    });
    sub.unref();

    console.log('waiting..');
    await new Promise((resolve, reject) =>
      setTimeout(() => resolve(), 3000)
    );

    console.log('exiting main..');
  } catch (e) {
    console.error(e);
  } finally {
    process.exit();
  }
})();
The basics there are that the sub.js listing is going to output every 2 seconds for 5 iterations. The main.js is going to "spawn" this process as detached, then wait for 3 seconds and terminate itself.
Though it's not really needed, for demonstration purposes we are setting up the spawned sub-process to redirect its output (both stdout and stderr) to a file named out.log in the same directory.
What you see here is that the main listing does its job and spawns the new process, then terminates after 3 seconds. At this time the sub-process will only have output 1 line, but it will continue to run and produce output to the redirected file for another 7 seconds, despite the main process being terminated.
My objective is to have some code execute after a detached, unreferenced, child process is spawned from a NodeJS app. Here is the code that I have:
var child_options = {
    cwd       : prj
  , env       : {
      PATH: cmd_directory
    }
  , detatched : true
  , stdio     : 'ignore'
};

//Spawn a child process with myapp with the options and command line params
child = spawn('myapp', params_array, child_options, function(err, stdout, stderr){
  if (err) {
    console.log("\t\tProblem executing myapp =>\n\t\t" + err);
  } else {
    console.log("\t\tLaunched myapp successfully!")
  }
});

//Handle the child processes exiting. Maybe send an email?
child.on('exit', function(data) {
  fs.writeFile(path.resolve("/Users/me/Desktop/myapp-child.log"), "Finished with child process!");
});
//Let the child process run in its own session without parent
child.unref();
So the function inside the exit handler does not seem to get executed when the child process finishes. Is there any way at all to have code execute after the child process exits even when it's detached and when calling the .unref() method?
Note that if I change the 'stdio' key value in the child_options object from 'ignore' to 'inherit' then the exit handler does execute.
Any ideas?
UPDATE PART 1
So, I still can not figure this one out. I went back to the NodeJS docs on spawn, and noticed the example about spawning "long-running processes". In one example, they redirect the child process' output to files instead of just using 'ignore' for the 'stdio' option. So I changed the 'stdio' key within the child_options object as in the following, but alas I am still not able to execute the code within the 'close' or 'exit' event:
var out_log = fs.openSync(path.resolve(os.tmpdir(), "stdout.log"), 'a'),
    err_log = fs.openSync(path.resolve(os.tmpdir(), "stderr.log"), 'a');

var child_options = {
    cwd       : prj
  , env       : {
      PATH: cmd_directory
    }
  , detatched : true
  , stdio     : ['ignore', out_log, err_log]
};
So, the stdout.log file does get the stdout from the child process—so I know it gets redirected. However, the code in the close or exit event still does not execute. Then I thought I would be able to detect when the writing to the out_log file was finished, in which case I would be able to execute code at that point. However, I cannot figure out how to do that. Any suggestions?
You can add a listener for the 'close' event, i.e. replace 'exit' with 'close'. It worked on my side even with 'ignore' stdio. Also, the parameter passed to the callback is the exit code (a number) or null.
According to the Node.js documentation, the difference between the exit and close events is:
The 'close' event is emitted when the stdio streams of a child process
have been closed. This is distinct from the 'exit' event, since
multiple processes might share the same stdio streams.
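For example (a sketch reusing the spawn call, variables, and fs/path requires from the question; the only meaningful change is the event name):
// Same spawn call as in the question, but listening for 'close' instead of 'exit'
child = spawn('myapp', params_array, child_options);

child.on('close', function(code) {
  // code is the numeric exit code, or null if the child was terminated by a signal
  fs.writeFile(path.resolve("/Users/me/Desktop/myapp-child.log"),
               "Finished with child process! Exit code: " + code,
               function(err) { if (err) console.log(err); });
});

child.unref();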
Hope it helps.
I'm trying to fork a node child process with
child_process.fork("child.js")
and have it stay alive after the parent exits. I've tried using the detached option like so:
child_process.fork("child.js", [], {detached:true});
Which works when using spawn, but when detached is true using fork it just fails silently, not even executing the child.js.
I've also tried
var p = child_process.fork("child.js")
p.disconnect();
p.unref();
But child still dies when the parent does.
Any help or insight would be greatly appreciated.
EDIT:
Node Version: v5.3.0
Platform: Windows 8.1
Code:
//Parent
var child_process = require("child_process");
var p;
try {
  console.log(1)
  p = child_process.fork("./child.js")
  console.log(2)
} catch (e) {
  console.log(e)
}

p.on('error', console.log.bind(console))
p.disconnect();
p.unref();

//To keep process alive
setTimeout(() => {
  console.log(1);
}, 100000);
--
//Child
var fs = require("fs");
console.log(3);
fs.writeFileSync("test.txt", new Date().toString());
setTimeout(() => {
  console.log(1);
}, 100000);
I'm assuming you're executing your parent file from the command line, which is probably why it "appears" that the forked child is not executing. In reality, when the parent process exits, the terminal stops waiting and prints a new prompt, waiting for your next command. This makes it seem like the child isn't executing, but trust me, it is. Also, there is no "detached" option for child_process.fork.
Add some console.log() statements to your child process and you should see output printing in your terminal even after the parent has exited. If you don't, it's because your child is prematurely exiting due to an error. Run your child process directly to debug it, before calling it from the parent.
Check out this quick example:
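A minimal sketch along those lines (hypothetical file names; the forked child simply keeps logging after the parent has returned):
// parent.js
var child_process = require("child_process");
var p = child_process.fork("./child.js");
p.disconnect(); // close the IPC channel so it no longer keeps the parent alive
p.unref();      // stop the child from keeping the parent's event loop running
console.log("parent done");

// child.js (its stdio is inherited by default, so it still prints to the terminal)
setInterval(() => {
  console.log("child still alive");
}, 1000);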
Hope this helps.
Consider the following code:
import {spawn, exec} from 'child_process';
var child = spawn('su',
  [process.env.USER, '-c', 'while (true); do sleep 0.3; echo "tick"; done'],
  {stdio: ['ignore', 'pipe', 'pipe']}
);

child.stdout.pipe(process.stdout);
child.stderr.pipe(process.stderr);

setTimeout(() => {
  child.kill();
}, 1000);
Here I'm trying to run a particular script which itself starts another child process (in this example, su will spawn a bash process), and then shut it all down after a timeout. However, I can't make it work as I expect.
Calling child.kill() kills just the su process and not its bash child.
What can be done to make it work — calling exec(`pkill -TERM -P ${child.pid}`) instead of child.kill()? As far as I understand, this will kill the whole process tree with parent child.pid.
Yet there is some oddity when combining the two methods:
setTimeout(() => {
  child.kill();
  exec(`pkill -TERM -P ${child.pid}`);
}, 1000);
This code continues writing tick into the console even after the process has been killed.
Why is this happening? Can somebody explain, please?
I was facing the exact same problem. I found the solution in How to kill child processes that spawn their own child processes in Node.js.
Here is the working form of your code:
const {spawn, exec} = require('child_process');

var child = spawn('./test.sh',
  [],
  {stdio: ['ignore', 'pipe', 'pipe'], detached: true} // <---- this
);

child.stdout.pipe(process.stdout);
child.stderr.pipe(process.stderr);

setTimeout(() => {
  process.kill(-child.pid); // <---- and this
  // child.kill();
}, 1000);
When I ran your original code, the terminal prevented me from running su from a script, so I modified the test to use ./test.sh, which does the same thing:
(while (true); do sleep 0.3; echo "tick"; done)
So the lines that do the magic are detached:true and process.kill(-child.pid).
Quoted from the original site:
We can start child processes with {detached: true} option so those processes will not be attached to main process but they will go to a new group of processes. Then using process.kill(-pid) method on main process we can kill all processes that are in the same group of a child process with the same pid group.
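As a small follow-up (a sketch of my own, not from the original site), the group kill can be wrapped so it degrades gracefully when the child was not started detached; process.kill throws if no process group with that id exists:
// Signal the child's whole process group, falling back to the child alone.
// Assumes POSIX semantics: a negative pid targets the process group with that id,
// which only exists when the child was spawned with { detached: true }.
function killTree(child, signal) {
  signal = signal || 'SIGTERM';
  try {
    process.kill(-child.pid, signal);
  } catch (err) {
    child.kill(signal); // no such group: just signal the direct child
  }
}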