Node CLI freezes after spawning a child_process - node.js

The CLI stops receiving keyboard input after execution, including 'ctrl-c' and 'ctrl-z', so you have to exit the program manually. It gave me a lot of trouble; please take a look:
var { exec, spawn } = require("child_process");

let cmd = (cmdline, consolelog = true) => {
  return new Promise((resolve, reject) => {
    let cmdarray = cmdline.split(" ");
    let result = "";
    let error = "";
    let child = spawn(cmdarray.shift(), cmdarray);
    process.stdin.pipe(child.stdin);
    child.stdout.setEncoding("utf8");
    child.stderr.setEncoding("utf8");
    child.stderr.on("data", data => {
      if (consolelog) process.stdout.write(data.toString());
      error = data.toString();
    });
    child.stdout.on("data", data => {
      if (consolelog) process.stdout.write(data.toString());
      result = data.toString();
    });
    child.on("close", code => {
      if (consolelog) process.stdout.write(`Exit code: ${code}\n`);
      code == 0 ? resolve(result) : reject(error);
    });
  });
};
OS: macOS & Ubuntu 19.04
Test case:
cmd("echo hi");
Edit:
Normal circumstances: put the code inside myprogram.js and run node myprogram.js to execute the script. It works perfectly, and you can also try different commands. HOWEVER, if you run the following in the node REPL:
$ node
> let cmd = require(PATH_TO_CMD_FUNCTION)
> cmd("echo hi");
the node CLI will freeze and stop listening to your keyboard input.
Edit 2:
Found out: you need to pass {stdio: "inherit"}.
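For reference, a minimal sketch of what that fix might look like in the spawner (note the assumption here: with stdio: "inherit" the child's output can no longer be captured, since child.stdout and child.stderr are null, so this version just resolves with the exit code):

var { spawn } = require("child_process");

let cmd = cmdline =>
  new Promise((resolve, reject) => {
    let cmdarray = cmdline.split(" ");
    // "inherit" hands the parent's stdin/stdout/stderr directly to the child,
    // so nothing stays piped into child.stdin after the child exits.
    let child = spawn(cmdarray.shift(), cmdarray, { stdio: "inherit" });
    child.on("close", code => (code == 0 ? resolve(code) : reject(code)));
  });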

UPDATED ANSWER:
I trimmed down your spawner a little to be succinct and to eliminate any other possibilities. There is one common test case I could find that reproduces the stated issue regarding signals, keyboard shortcuts, and trapped input.
If you spawn the 'sh' command, you will not be able to escape from the spawned process by means of conventional signal keyboard shortcuts. This is because node.js "traps" the input and forwards it directly to the spawned process.
Most processes allow killing via signalling through keyboard shortcuts such as CTRL-C. 'sh', however, does not, and so is a perfect example.
The only ways to exit are to use the 'exit' command, close the window (which may leave the spawned process running in the background), reboot your machine, etc. You can also send a signal internally or by some other means, but not via stdin or its equivalent.
Your CTRL-C input, in other words, is "normally working" not because it is killing your node app, but because it is being forwarded to the spawned process and killing it.
The spawned process will continue to trap your input if it is immune.
require("child_process").spawn("sh", {
shell: true,
encoding: 'utf8',
stdio: [0,1,2]
});
This may not be the best example for your specific program, but it illustrates the principle, which is the closest I can come since I cannot replicate the problem with the given test case (I have tried it on my phone, my laptop, and my cloud server, with three different versions of node and two different versions of Ubuntu).
In any case, it sounds like your stdin is not being "let go" by the spawned process. You may need to "reassign" it to the original process.stdin.
As stated here:
https://node.readthedocs.io/en/latest/api/child_process/
Also, note that node establishes signal handlers for 'SIGINT' and
'SIGTERM', so it will not terminate due to receipt of those signals;
it will exit.
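One way to "let go" of stdin when the child closes, keeping the piping approach from your question, is to unpipe and pause it in the close handler. This is a sketch of the idea, not something I have verified inside the node REPL:

child.on("close", code => {
  // Detach the parent's stdin from the finished child and stop reading it,
  // so the terminal (or REPL) gets keyboard input back.
  process.stdin.unpipe(child.stdin);
  process.stdin.pause();
  if (consolelog) process.stdout.write(`Exit code: ${code}\n`);
  code == 0 ? resolve(result) : reject(error);
});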
PREVIOUSLY:
It looks like your cmd function may only be passing one argument (the command itself) to the child because of the split and shift. When given a single string (with the shell option), spawn expects the whole command in that string, so it may only be getting "echo" without "hi", and it isn't exiting because it hangs on "echo". You may need to append a newline ("\n") as well.
It may also help to nest the command in an sh invocation that then executes it, so it runs in a shell.
Like this:
var { exec, spawn } = require("child_process");

let cmd = (cmdline, consolelog = true) => {
  return new Promise((resolve, reject) => {
    let result = "";
    let error = "";
    // This. Note the shell option.
    let child = spawn(cmdline, { shell: true });
    process.stdin.pipe(child.stdin);
    child.stdout.setEncoding("utf8");
    child.stderr.setEncoding("utf8");
    child.stderr.on("data", data => {
      if (consolelog) process.stdout.write(data.toString());
      error = data.toString();
    });
    child.stdout.on("data", data => {
      if (consolelog) process.stdout.write(data.toString());
      result = data.toString();
    });
    child.on("close", code => {
      if (consolelog) process.stdout.write(`Exit code: ${code}\n`);
      code == 0 ? resolve(result) : reject(error);
    });
  });
};
cmd("echo hello");
Output:
hello
Exit code: 0

Related

Can I “listen” for a specific output with child_process?

So far I have gotten my script to execute a Windows .bat file with child_process. My issue is that it opens in the background with no way to "connect" to it to see what happens and debug. Is there a way to "listen" for a certain output? For example, if the .bat outputs "Done!" in the shell at some point, is there a way to make my node.js script detect that keyword and run further commands when it does?
Thanks!
Some clarification: the .bat outputs "Done!" and stays running; it doesn't stop. All I want to do is detect that "Done!" so that I can send a message to the user that the server has started successfully.
My current code:
exec('D:\\servers\\game_server_1\\start.bat', {shell: true, cwd: 'D:\\servers\\game_server_1'});
Well, if you're trying to do a one-and-done type of NodeJS script, you can just spawn a process that launches with the given command and exits when all commands have completed. This gives you a one-and-done streaming interface that you can monitor. The stdout event returns a data buffer containing the output of the command you ran, unless it's something like START to launch a program, in which case it returns null. You could also issue a KILL command after the START your_program.exe:
const spawn = require('child_process').spawn;
const child = spawn('cmd.exe', ['/c', 'commands.bat']);

let DONE = 0;
const done = () => {
  console.log("log it");
  DONE++;
};

child.stdout.on('data', function (data) {
  console.log('stdout: ' + data);
  // it's important to add some type of counter to
  // prevent any logic from running twice, since
  // this will run twice for any given command
  if (data.toString().includes("DONE") && DONE === 0) {
    done();
  }
});

child.stderr.on('data', function (data) {
  console.log('stderr: ' + data);
});

child.on('exit', function (code) {
  console.log('child process exited with code ' + code);
});
Keep in mind that when you run a command to launch a program and the program launches, the data buffer will be null in the stdout event listener. The error event will only fire if there was an issue launching the program.
YOUR .BAT:
ECHO starting batch script
REM example launching of a program
START "" https://localhost:3000
REM issue a command after your program launch
ECHO DONE
EXIT
You could also issue an ECHO DONE command right after the command that launches the program, listen for it, and parse it out of stdout.
You could use a Regular expression.
const { spawn } = require('child_process');
const child = spawn(...);

child.stdout.on('data', function (data) {
  console.log('stdout: ' + data);
  // Now use a regular expression to detect a done event
  // For example
  data.toString().match(/Done!/);
});
// Error handling etc. here
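To build on that, here's a sketch of how you might act on the match only once (the guard variable and the log message are just placeholders):

let notified = false;
child.stdout.on('data', function (data) {
  // .test() returns true as soon as "Done!" appears anywhere in the chunk
  if (!notified && /Done!/.test(data.toString())) {
    notified = true;
    console.log('Server is up'); // e.g. notify the user here
  }
});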

Nodejs exec child process stdout not getting all the chunks

I'm trying to send messages from my child process to my main process, but some chunks are not being sent, possibly because the file is too big.
main process:
let response = ''
let error = ''
await new Promise(resolve => {
  const p = exec(command)
  p.stdout.on('data', data => {
    // this gets triggered many times because the html string is big and gets split up
    response += data
  })
  p.stderr.on('data', data => {
    error += data
  })
  p.on('exit', resolve)
})
console.log(response)
console.log(response)
child process:
// only fetch 1 page, then quit
const bigHtmlString = await fetchHtmlString(url)
process.stdout.write(bigHtmlString)
I know the child process works because when I run it directly, I can see the end of the file in the console. But when I run the main process, I cannot see the end of the file. It's quite big, so I'm not sure exactly which chunks are missing.
Edit: there's also a new unknown problem. When I add a wait at the end of my child process, it doesn't wait; it closes. So I'm guessing it crashes somehow? I'm not seeing any error even with p.on('error', console.log).
example:
const bigHtmlString = await fetchHtmlString(url)
process.stdout.write(bigHtmlString)
// this never gets executed, the process closes. The wait works if I launch the child process directly
await new Promise(resolve => setTimeout(resolve, 1000000))
process.stdout.write(...) returns true or false depending on whether the data was handled immediately or had to be queued because the stream is backed up. If it returns false, you can listen for the 'drain' event to make sure it finishes.
Something like this:
const bigHtmlString = await fetchHtmlString(url);
const wrote = process.stdout.write(bigHtmlString);
if (!wrote) {
  // this effectively means "wait for this
  // event to fire", but it doesn't block everything
  process.stdout.on('drain', ...doSomethingHere)
}
My suggestion from the comments resolved the issue so I'm posting it as an answer.
I would suggest using spawn instead of exec. The latter buffers the output and only hands it over when the process ends, and it has a maxBuffer limit beyond which the child is terminated and the output truncated, while spawn streams the output, which is better for huge output like in your case.
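For illustration, a sketch of that change applied to the main-process code above (assuming the command string from the question is fine to run through a shell via the shell option):

const { spawn } = require('child_process')

let response = ''
let error = ''
await new Promise(resolve => {
  // spawn streams stdout/stderr instead of buffering the whole output
  // the way exec does, so large output is not lost or truncated
  const p = spawn(command, { shell: true })
  p.stdout.on('data', data => { response += data })
  p.stderr.on('data', data => { error += data })
  p.on('exit', resolve)
})
console.log(response)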

Execute script from Node in a separate process

What I want to do is when an endpoint in my Express app is hit, I want to run a command line script - without waiting for the result - in a separate process.
Right now I am using the child_process’s spawn function and it is working, but if the Node server were to quit, the child script would quit as well. I need to have the child script run to completion even if the server quits.
I don’t need access to stdout or anything from the child script. I just need a way to basically “fire and forget”
Is there any way to do this with spawn that I may be missing? Or is there another way I should be going about this?
Thanks in advance for any guidance!
What you want here is options.detached of spawn. Setting this option will allow the sub-process to continue even after the main process calling spawn has terminated.
Quoting the documentation:
On Windows, setting options.detached to true makes it possible for the child process to continue running after the parent exits. The child will have its own console window. Once enabled for a child process, it cannot be disabled.
On non-Windows platforms, if options.detached is set to true, the child process will be made the leader of a new process group and session. Note that child processes may continue running after the parent exits regardless of whether they are detached or not. See setsid(2) for more information.
Basically this means what you "launch" keeps running until it actually terminates itself. As 'detached', there is nothing that "ties" the sub-process to the execution of the parent from which it was spawned.
Example:
listing of sub.js:
(async function() {

  try {
    await new Promise((resolve, reject) => {
      let i = 0;
      let ival = setInterval(() => {
        i++;
        console.log('Run ', i);
        if (i === 5) {
          clearInterval(ival);
          resolve();
        }
      }, 2000);
    });
  } catch (e) {
    console.error(e);
  } finally {
    process.exit();
  }

})();
listing of main.js
const fs = require('fs');
const { spawn } = require('child_process');

(async function() {

  try {
    const out = fs.openSync('./out.log', 'a');
    const err = fs.openSync('./out.log', 'a');

    console.log('spawn sub');

    const sub = spawn(process.argv[0], ['sub.js'], {
      detached: true, // this removes ties to the parent
      stdio: [ 'ignore', out, err ]
    });
    sub.unref();

    console.log('waiting..');
    await new Promise((resolve, reject) =>
      setTimeout(() => resolve(), 3000)
    );

    console.log('exiting main..');
  } catch (e) {
    console.error(e);
  } finally {
    process.exit();
  }

})();
The basics there are that the sub.js listing is going to output every 2 seconds for 5 iterations. The main.js is going to "spawn" this process as detached, then wait for 3 seconds and terminate itself.
Though it's not really needed, for demonstration purposes we are setting up the spawned sub-process to redirect its output ( both stdout and stderr ) to a file named out.log in the same directory.
What you see here is that the main listing does its job and spawns the new process, then terminates after 3 seconds. At that point the sub-process will only have output one line, but it will continue to run and produce output to the redirected file for another 7 seconds, despite the main process being terminated.
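If you then look at out.log once sub.js has finished (contents approximate), all five iterations should be there even though main.js exited after about three seconds:

Run  1
Run  2
Run  3
Run  4
Run  5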

Electron kill child_process.exec

I have an electron app that uses child_process.exec to run long running tasks.
I am struggling to manage when the user exits the app during those tasks.
If they exit my app or hit close, the child processes continue to run until they finish; however, the Electron app window has already closed and exited.
Is there a way to notify the user that there are process still running and when they have finished then close the app window?
All I have in my main.js is the standard code:
// Quit when all windows are closed.
app.on('window-all-closed', function() {
// On OS X it is common for applications and their menu bar
// to stay active until the user quits explicitly with Cmd + Q
if (process.platform != 'darwin') {
app.quit();
}
});
Should I be adding a check somewhere?
Thanks for your help
EDITED
I cannot seem to get the PID of the child_process until it has finished. This is my child_process code
var loader = child_process.exec(cmd, function(error, stdout, stderr) {
  console.log(loader.pid)
  if (error) {
    console.log(error.message);
  }
  console.log('Loaded: ', value);
});
Should I be trying to get it in a different way?
So after everyone's great comments I was able to update my code with a number of additions to get it working, so I am posting my updates for everyone else.
1) Change from child_process.exec to child_process.spawn
var loader = child_process.spawn('program', options, { detached: true })
2) Use the Electron ipcRenderer to communicate from my module to the main.js script. This allows me to send the PIDs to main.js
ipcRenderer.send('pid-message', loader.pid);

ipcMain.on('pid-message', function(event, arg) {
  console.log('Main:', arg);
  pids.push(arg);
});
3) Add those PIDs to an array
4) In my main.js I added the following code to kill any PIDs that exist in the array before exiting the app.
// App close handler
app.on('before-quit', function() {
  pids.forEach(function(pid) {
    // A simple pid lookup
    ps.kill(pid, function(err) {
      if (err) {
        throw new Error(err);
      } else {
        console.log('Process %s has been killed!', pid);
      }
    });
  });
});
Thanks for everyone's help.
ChildProcess emits an exit event when the process has finished. If you keep track of the current processes in an array and have them remove themselves after the exit event fires, you should be able to just forEach over the remaining ones, calling ChildProcess.kill() on each when you exit your app.
This may not be 100% working code/not the best way of doing things, as I'm not in a position to test it right now, but it should be enough to set you down the right path.
var processes = [];

// Adding a process
var newProcess = child_process.exec("mycommand");
processes.push(newProcess);
newProcess.on("exit", function () {
  processes.splice(processes.indexOf(newProcess), 1);
});

// App close handler
app.on('window-all-closed', function() {
  if (process.platform != 'darwin') {
    processes.forEach(function(proc) {
      proc.kill();
    });
    app.quit();
  }
});
EDIT: As shreik mentioned in a comment, you could also just store the PIDs in the array instead of the ChildProcess objects, then use process.kill(pid) to kill them. Might be a little more efficient!
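A sketch of that variant (hypothetical; it keeps only the PIDs and relies on process.kill(pid) at shutdown):

var pids = [];

var newProcess = child_process.exec("mycommand");
pids.push(newProcess.pid);
newProcess.on("exit", function () {
  pids.splice(pids.indexOf(newProcess.pid), 1);
});

app.on('window-all-closed', function() {
  if (process.platform != 'darwin') {
    // process.kill sends SIGTERM by default
    pids.forEach(function(pid) {
      process.kill(pid);
    });
    app.quit();
  }
});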
Another solution, if you want to keep using exec():
In order to kill a child process run by exec(), take a look at the module ps-tree. Its readme explains what is happening:
in UNIX, a process may terminate by using the exit call, and its
parent process may wait for that event by using the wait system call.
the wait system call returns the process identifier of a terminated
child, so that the parent can tell which of the possibly many children has
terminated. If the parent terminates, however, all its children have
assigned as their new parent the init process. Thus, the children
still have a parent to collect their status and execution statistics.
(from "operating system concepts")
SOLUTION: use ps-tree to get all processes that a child_process may have started, so that they can all be killed.
exec() actually works like this:
function exec (cmd, cb) {
  spawn('sh', ['-c', cmd]);
  ...
}
So check the example and adapt it to your needs:
var cp = require('child_process'),
    psTree = require('ps-tree');

var child = cp.exec("node -e 'while (true);'", function () { /*...*/ });

psTree(child.pid, function (err, children) {
  cp.spawn('kill', ['-9'].concat(children.map(function (p) { return p.PID })));
});
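For the Electron case above, a sketch of how this might be wired into the quit handler (hypothetical; it assumes the exec() children are collected in an array named children):

var cp = require('child_process'),
    psTree = require('ps-tree');

app.on('before-quit', function() {
  children.forEach(function(child) {
    // look up everything the child started, then kill the whole tree
    psTree(child.pid, function (err, kids) {
      cp.spawn('kill', ['-9', String(child.pid)].concat(
        kids.map(function (p) { return p.PID; })
      ));
    });
  });
});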

Node.js spawning a child process interactively with separate stdout and stderr streams

Consider the following C program (test.c):
#include <stdio.h>

int main() {
  printf("string out 1\n");
  fprintf(stderr, "string err 1\n");
  getchar();
  printf("string out 2\n");
  fprintf(stderr, "string err 2\n");
  fclose(stdout);
}
Which should print a line to stdout, a line to stderr, then wait for user input, then another line to stdout and another line to stderr. Very basic!
When compiled and run on the command line, the output of the program once complete (after user input is received for getchar()) is:
$ ./test
string out 1
string err 1
string out 2
string err 2
When trying to spawn this program as a child process using nodejs with the following code:
var TEST_EXEC = './test';
var spawn = require('child_process').spawn;
var test = spawn(TEST_EXEC);

test.stdout.on('data', function (data) {
  console.log('stdout: ' + data);
});

test.stderr.on('data', function (data) {
  console.log('stderr: ' + data);
});

// Simulate entering data for getchar() after 1 second
setTimeout(function() {
  test.stdin.write('\n');
}, 1000);
The output appears like this:
$ nodejs test.js
stderr: string err 1
stdout: string out 1
string out 2
stderr: string err 2
Very different from the output seen when running ./test in the terminal. This is because the ./test program isn't attached to a terminal when spawned by nodejs. The C stdout stream is line-buffered when writing to a terminal, so the buffer is flushed as soon as a \n is reached, but when spawned this way by node it writes to a pipe and is fully buffered, so nothing is flushed until the program exits. This could be resolved by either flushing stdout after every print, or making the stdout stream unbuffered so it flushes everything immediately.
Assuming that test.c source isn't available or modifiable, neither of the two flushing options mentioned can be implemented.
I then started looking at emulating an interactive terminal. There's pty.js (pseudo terminal), which does a good job, for example:
var spawn = require('pty.js').spawn;
var test = spawn(TEST_EXEC);

test.on('data', function (data) {
  console.log('data: ' + data);
});

// Simulate entering data for getchar() after 1 second
setTimeout(function() {
  test.write('\n');
}, 1000);
Which outputs:
$ nodejs test.js
data: string out 1
string err 1
data:
data: string out 2
string err 2
However both stdout and stderr are merged together (as you would see when running the program in a terminal) and I can't think of a way to separate the data from the streams.
So the question..
Is there any way using nodejs to achieve the output as seen when running ./test without modifying the test.c code? Either by terminal emulation or process spawning or any other method?
Cheers!
I tried the answer by user568109 but this does not work, which makes sense since the pipe only copies the data between streams. Hence, it only gets to process.stdout when the buffer is flushed...
The following appears to work:
var TEST_EXEC = './test';
var spawn = require('child_process').spawn;
var test = spawn(TEST_EXEC, [], { stdio: 'inherit' });

// the following is unfortunately not working
//test.stdout.on('data', function (data) {
//  console.log('stdout: ' + data);
//});
Note that this effectively shares stdio with the node process. Not sure if you can live with that.
I was just revisiting this since there is now a 'shell' option available for the spawn command as of node version 5.7.0. Unfortunately there doesn't seem to be an option to spawn an interactive shell (I also tried shell: '/bin/sh -i', but no joy).
However, I just found this, which suggests using 'stdbuf' to change the buffering options of the program you want to run. Setting them all to 0 produces unbuffered output for all streams, and they're still kept separate.
Here's the updated javascript:
var TEST_EXEC = './test';
var spawn = require('child_process').spawn;
var test = spawn('stdbuf', ['-i0', '-o0', '-e0', TEST_EXEC]);

test.stdout.on('data', function (data) {
  console.log('stdout: ' + data);
});

test.stderr.on('data', function (data) {
  console.log('stderr: ' + data);
});

// Simulate entering data for getchar() after 1 second
setTimeout(function() {
  test.stdin.write('\n');
}, 1000);
Looks like this isn't pre-installed on macOS and of course isn't available on Windows; there may be similar alternatives, though.
You can do this:
var TEST_EXEC = 'test';
var spawn = require('child_process').spawn;
var test = spawn(TEST_EXEC);
test.stdin.pipe(process.stdin);
test.stdout.pipe(process.stdout);
test.stderr.pipe(process.stderr);
When you use events on stdout and stderr to print the output with console.log, you will get jumbled output because of the asynchronous execution of the handlers. Output within a single stream stays ordered, but it can still get interleaved among stdin, stdout and stderr.
