Node.js: execute a command in the background and forget about it

I have a script.js that runs an endless loop, and I need it to launch another script in the background as a service (a WebSocket service, actually):
var exec = require('child_process').exec;
exec('node bgService.js &');
Both scripts now run fine. But when I hit Ctrl+C on script.js, bgService.js is killed along with it, which is not what I want.
How do I run something in the background and forget about it?

You can do it using child_process.spawn with the detached option:
var spawn = require('child_process').spawn;
spawn('node', ['bgService.js'], {
  detached: true
});
This makes the child process the leader of a new process group, so it will keep running after the parent process exits.
But by default the parent will still wait for the detached child to exit, and it will also keep listening on the child's stdio. To completely detach the child process from the parent you should:
detach the child's stdio from the parent process, redirecting it to a file or to /dev/null
remove the child process from the parent's event loop reference count using the unref() method
Here is an example of doing it:
var spawn = require('child_process').spawn;
spawn('node', ['bgService.js'], {
  stdio: 'ignore', // piping all stdio to /dev/null
  detached: true
}).unref();
If you don't want to lose the child's stdout/stderr output, you can redirect it to a log file:
var fs = require('fs'),
    spawn = require('child_process').spawn,
    out = fs.openSync('./out.log', 'a'),
    err = fs.openSync('./out.log', 'a');

spawn('node', ['bgService.js'], {
  stdio: [ 'ignore', out, err ], // piping stdout and stderr to out.log
  detached: true
}).unref();
For more information, see the child_process.spawn documentation.
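As a quick sanity check (a sketch of my own, not part of the original answer), you can log the detached child's PID before the parent exits and confirm afterwards with ps that the service is still alive:
var spawn = require('child_process').spawn;

var child = spawn('node', ['bgService.js'], {
  stdio: 'ignore',
  detached: true
});

// child.pid lets you check on (or stop) the service later,
// e.g. `ps -p <pid>` or `kill <pid>` after script.js has exited.
console.log('bgService.js running with PID', child.pid);

child.unref();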

Short answer: (tl;dr)
spawn('command', ['arg', ...],
  { stdio: 'ignore', detached: true }).unref()
unref() is required to prevent the parent from waiting on the child. See the child_process docs for details.

Related

How to get full stdout from a node.js child_process that spawns another process recursively?

I'm trying to spawn a child process like this:
const { spawn } = require('child_process')
const child = spawn("python run.py", {
detached: true,
shell: true,
})
The problem is that run.py then starts another process, child.py.
I noticed that when I listen to the child's stdout, it prints all the log messages from the python run.py process, but not from python child.py, which is called internally by run.py.
So my question is: how do you capture all the STDOUT from child processes recursively?
The intermediate run.py script determines where the stdio from child.py goes.
Here's some JS to run the python script, setting specific stdio options:
const { spawn } = require('node:child_process')
console.log('python.js')
const child = spawn("./run.py", {
stdio: [ 'inherit', 'inherit', 'inherit', 'pipe' ]
})
child.stdio[3].on('data', (data) => {
console.log('fd3: %s', data)
})
The stdio option 'inherit' is used to pass through the node process's stdin, stdout, and stderr. This setup is not specific to your question, but I've included it as an example of what run.py will do with its child.
The 4th element, 'pipe', opens an extra channel as file descriptor 3 (after descriptors 0/1/2 are used for stdin/stdout/stderr).
The intermediate run.py script then chooses where stdio goes for its child.py processes.
#!/usr/local/bin/python3
from subprocess import Popen
from os import fdopen
print('run.py')
# `Popen` defaults to `inherit`, so stdout/stderr from child.py will
# be written to the `run.py` process's stdout/stderr
Popen(['./child.py'])
# Second child, the file descriptor created in node will be used for stdout
pipe = fdopen(3)
Popen(['./child.py'], stdout=pipe)
The child script outputs as normal.
#!/usr/local/bin/python3
print('child.py')
When run:
$ node python.js
python.js
run.py
child.py
fd3: child.py
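If you only need the combined text rather than a separate channel, an alternative sketch (my assumption, not part of the answer above) is to pipe run.py's stdout/stderr in node; since Popen children inherit their parent's stdio by default, child.py's output arrives on the same pipe:
const { spawn } = require('node:child_process')

// Pipe run.py's stdout/stderr back to node; child.py inherits run.py's
// stdio by default, so its output shows up here as well.
const child = spawn('./run.py', {
  stdio: ['ignore', 'pipe', 'pipe']
})
child.stdout.on('data', (data) => process.stdout.write(`out: ${data}`))
child.stderr.on('data', (data) => process.stderr.write(`err: ${data}`))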

Execute bash script via Node.js and include command line parameters

I'm trying to execute a bash script from a Node application, which I have previously done successfully via:
var spawn = require('child_process').spawn;
spawn('bash', [pathToScript], {
  stdio: 'ignore',
  detached: true
}).unref();
It's important that I do it this way, because the script needs to continue to execute, even if/when the application is stopped.
Now, the script I need to execute requires an input value to be provided on the command line, i.e.:
./myScript.sh hello
But I cannot figure out how to pass this into the spawn call. I have tried the following, with no luck:
var spawn = require('child_process').spawn;
spawn('bash', [pathToScript + '' + params], {
  stdio: 'ignore',
  detached: true
}).unref();
The second parameter to spawn is an array of arguments to pass to the command. So I think you almost have it, but instead of concatenating the params onto the path, pass them in as array elements:
var spawn = require('child_process').spawn;
var params = [pathToScript, 'run', '-silent'];
spawn('bash', params, {
  stdio: 'ignore',
  detached: true
}).unref();
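For the specific script in the question, a minimal sketch (using the question's pathToScript variable and its 'hello' argument) would be:
var spawn = require('child_process').spawn;

// Each command-line argument goes in as its own array element.
spawn('bash', [pathToScript, 'hello'], {
  stdio: 'ignore',
  detached: true
}).unref();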

Node Spawn process outliving the main node process

I want to start a process that will live on its own and continue running even if the node application that started it dies.
To do so I am trying to use child_process, but I have not managed to keep the process alive after the node process dies.
Here is my code:
const cp = require('child_process');
const process = cp.spawn('long_running_process', ['arg1'], {
  stdio: 'ignore',
  detached: true
});
process.unref();
This code follows the child_process documentation available here:
https://nodejs.org/api/child_process.html#child_process_options_detached
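As a side note (my observation, not part of the original question): the const above shadows Node's global process object. For reference, the same pattern with a different variable name:
const cp = require('child_process');

const child = cp.spawn('long_running_process', ['arg1'], {
  stdio: 'ignore',  // detach stdio so nothing ties the child to this terminal
  detached: true    // make the child the leader of a new process group
});

child.unref();      // let the node process exit without waiting for the child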

Redirect stdout/stderr from a child_process to /dev/null or something similar

I am creating some child processes with Node.js (require('child_process')), and I want to ensure that the stdout/stderr from each child process does not go to the terminal, because I want only the output from the parent process to be logged. Is there a way to redirect the stdout/stderr streams of the child processes to /dev/null or some other place that is not the terminal?
https://nodejs.org/api/child_process.html
perhaps it's just:
var n = cp.fork('child.js', [], {
  stdio: ['ignore', 'ignore', 'ignore']
});
I just tried that, and that didn't seem to work.
Now I tried this:
var stdout, stderr;
if (os.platform() === 'win32') {
  stdout = fs.openSync('NUL', 'a');
  stderr = fs.openSync('NUL', 'a');
} else {
  stdout = fs.openSync('/dev/null', 'a');
  stderr = fs.openSync('/dev/null', 'a');
}
and then this option:
stdio: ['ignore', stdout, stderr],
but that didn't do it either, although it seems like using the detached: true option might make this work.
Solution:
To throw away the stdout and stderr of a forked child process:
set up a pipe, i.e. use silent: true when forking,
and redirect the stdout and stderr pipes on the parent process into /dev/null.
Explanation:
The Node.js documentation states:
For convenience, options.stdio may be one of the following strings:
'pipe' - equivalent to ['pipe', 'pipe', 'pipe'] (the default)
'ignore' - equivalent to ['ignore', 'ignore', 'ignore']
'inherit' - equivalent to [process.stdin, process.stdout, process.stderr] or [0,1,2]
Apparently child_process.fork() does NOT support 'ignore'; only child_process.spawn() does.
fork does support a silent option that lets you choose between pipe and inherit.
When forking a child process:
If silent is true, then stdio is 'pipe'.
If silent is false, then stdio is 'inherit'.
silent (Boolean): If true, stdin, stdout, and stderr of the child will be piped to the parent; otherwise they will be inherited from the parent.
See the 'pipe' and 'inherit' options for child_process.spawn()'s stdio for more details.
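Here is a minimal sketch of that solution (assuming a child.js that writes to stdout/stderr):
var cp = require('child_process'),
    fs = require('fs');

// silent: true pipes the child's stdio back to the parent...
var child = cp.fork('./child.js', [], { silent: true });

// ...and the parent redirects those pipes into /dev/null,
// so nothing from the child reaches the terminal.
var devNull = fs.createWriteStream('/dev/null');
child.stdout.pipe(devNull);
child.stderr.pipe(devNull);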

Running a shell command from Node.js without buffering output

I'm trying to launch a shell command from Node.js, without redirecting that command's input and output -- just like shelling out to a command using a shell script, or using Ruby's system command. If the child process wants to write to STDOUT, I want that to go straight to the console (or get redirected, if my Node app's output was redirected).
Node doesn't seem to have any straightforward way to do this. It looks like the only way to run another process is with child_process, which always redirects the child process's input and output to pipes. I can write code to accept data from those pipes and write it to my process's STDOUT and STDERR, but if I do that, the APIs force me to sacrifice some flexibility.
I want two features:
Shell syntax. I want to be able to pipe output between commands, or run Windows batch files.
Unlimited output. If I'm shelling out to a compiler and it wants to generate megabytes of compiler warnings, I want them all to scroll across the screen (until the user gets sick of it and hits Ctrl+C).
It looks like Node wants to force me to choose between those two features.
If I want an unlimited amount of output, I can use child_process.spawn and then do child.stdout.on('data', function(data) { process.stdout.write(data); }); and the same thing for stderr, and it'll happily pipe data until the cows come home. Unfortunately, spawn doesn't support shell syntax.
If I want shell syntax, I can use child_process.exec. But exec insists on buffering the child process's STDOUT and STDERR for me and giving them to me all at the end, and it limits the size of those buffers (configurable, 200K by default). I can still hook the on('data') events, if I want to see the output as it's generated, but exec will still add the data to its buffers too. When the amount of data exceeds the predefined buffer size, exec will terminate the child process.
(There's also child_process.execFile, which is the worst of both worlds from a flexibility standpoint: no shell syntax, but you still have to cap the amount of output you expect.)
Am I missing something? Is there any way to just shell out to a child process in Node, and not redirect its input and output? Something that supports shell syntax and doesn't crap out after a predefined amount of output, just like is available in shell scripts, Ruby, etc.?
You can inherit the parent's stdin/stdout/stderr streams via the spawn stdio option, so you don't need to pipe them manually:
var spawn = require('child_process').spawn;
spawn('ls', [], { stdio: 'inherit' });
Use a shell for shell syntax; for sh/bash it's the -c parameter, which reads the script from a string:
var spawn = require('child_process').spawn;
var shellSyntaxCommand = 'ls -l | grep test | wc -c';
spawn('sh', ['-c', shellSyntaxCommand], { stdio: 'inherit' });
To summarise:
var spawn = require('child_process').spawn;
function shspawn(command) {
  spawn('sh', ['-c', command], { stdio: 'inherit' });
}
shspawn('ls -l | grep test | wc -c');
You can replace exec with spawn and get shell syntax simply with:
const { spawn } = require('child_process');
const cmd = 'ls -l | grep test | wc -c';
const p = spawn(cmd, [], { shell: true });
p.stdout.on('data', (data) => {
  console.log(data.toString());
});
The magic is just {shell: true}.
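If, as in the original question, you also want the output to go straight to the console without any buffering or manual piping, the two ideas above combine naturally; a sketch:
const { spawn } = require('child_process');

// shell: true gives you shell syntax; stdio: 'inherit' sends the child's
// output directly to this process's stdout/stderr with no buffering.
spawn('ls -l | grep test | wc -c', [], { shell: true, stdio: 'inherit' });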
I haven't used it, but I've seen this library: https://github.com/polotek/procstreams
It lets you do this. The .out() call automatically pipes to the parent process's stdin/stdout.
var $p = require('procstreams');
$p('cat lines.txt').pipe('wc -l').out();
It doesn't support shell syntax, but that's pretty trivial to handle, I think.
var command_str = "cat lines.txt | wc -l";
var cmds = command_str.split(/\s?\|\s?/);
var cmd = $p(cmds.shift());
while(cmds.length) cmd = cmd.pipe(cmds.shift());
cmd
  .out()
  .on('exit', function() {
    // Do whatever
  });
There's an example in the node docs for the child_process module:
Example of detaching a long-running process and redirecting its output to a file:
var fs = require('fs'),
    spawn = require('child_process').spawn,
    out = fs.openSync('./out.log', 'a'),
    err = fs.openSync('./out.log', 'a');

var child = spawn('prg', [], {
  detached: true,
  stdio: [ 'ignore', out, err ]
});

child.unref();
