Getting output of child process created with inherit - node.js

I am spawning a child process in Node where for the opts I'm using {stdio: 'inherit'}. I need to do this because the child process has to accept some input from the user.
However, I also need to capture the child's output, and with 'inherit' the parent receives nothing because the child writes directly to the parent's streams. I thought about attaching a quick process.stdout.on('data', ...) listener and detaching it after the child exits, but I'd prefer not to do that. I assume there is some Node stream solution? Something that would intercept the output first and then pass it along to my parent process?
TL;DR: I need the output of a child process while still doing stdio: 'inherit' in Node. Perhaps something that pipes the output of the child process to a Buffer/string?
Thank you, and if answering, please provide a working code example.

Figured it out! Inherit only stdin (so the child can read the user's input directly) and pipe stdout/stderr back to the parent:
const child_process = require('child_process');
const child = child_process.spawn('python3', ['./eater.py'], {
  // stdin is inherited so the child can prompt the user directly;
  // stdout and stderr are piped so the parent can capture them
  stdio: ['inherit', 'pipe', 'pipe'],
});
const output = [];
child.stdout.on('data', (d) => {
  console.log(d.toString()); // echo to the terminal so the user still sees the prompts
  output.push(d.toString()); // and keep a copy
});
child.stdout.on('end', () => {
  console.log('Finished');
  console.log({ output });
});
and the Python script (./eater.py) could be something like:
print("Some prompt")
auth_code = input('Provide some code to us\n')
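A variation, assuming you only need the collected text once the child exits: pipe the child's stdout to the parent's terminal for live display and collect the raw chunks into a single Buffer at the same time.
const child_process = require('child_process');
const child = child_process.spawn('python3', ['./eater.py'], {
  stdio: ['inherit', 'pipe', 'inherit'],
});
const chunks = [];
child.stdout.on('data', (chunk) => chunks.push(chunk)); // capture the raw bytes
child.stdout.pipe(process.stdout); // still show the output to the user live
child.on('close', () => {
  const output = Buffer.concat(chunks).toString('utf8');
  console.log({ output });
});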

Related

Use node script to start child process, capture child stdout, and kill child on close

I would like to start a node script called myScript.js that starts a child process npm start, stores the stdout of npm start into a global variable let myVar, and makes sure that if the main program myScript.js exits for any reason, the child process is killed as well. Nothing from the child's stdout should appear in the terminal window after Ctrl-C or similar.
My current solution does not kill on close:
const childProcess = require('child_process');
let myVar = '';
const child = childProcess.spawn('npm', ['start'], {
  detached: false
});
process.on('exit', function () {
  child.stdin.pause();
  child.kill();
});
child.stdout.on('data', (data) => {
  myVar = `${data}`;
});
Can this be accomplished?
Small change, but I think that might look something like this:
const childProcess = require('child_process');
const child = childProcess.spawn('npm', ['start'], { shell: true });
let myVar = '';
child.stdout.setEncoding('utf8');
child.stdout.on('data', function (data) {
  myVar = data; // already a string thanks to setEncoding('utf8')
});
child.on('close', function (exitcode) {
  // on close of the child process, use the captured output or call a function
});
process.on('exit', function () {
  // I don't think pausing stdin is strictly necessary
  child.kill();
});
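One caveat (my addition, not part of the original answer): in my experience the 'exit' event may not fire when the parent is terminated by an unhandled signal, so to reliably cover Ctrl-C and kill you can forward the common termination signals into a normal exit:
// forward termination signals into process.exit(), which does fire 'exit'
['SIGINT', 'SIGTERM'].forEach((signal) => {
  process.on(signal, () => {
    process.exit(); // triggers the process.on('exit') handler that kills the child
  });
});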
Further reading
Node's child_process documentation, including event names
GeeksforGeeks article on child_process.spawn
This StackOverflow thread on getting and using the output of a child_process' pipe
This StackOverflow thread on the difference between the close and exit events

Wait for all child processes to finish before continuing

I would like to know if it is possible to wait for all child processes created using the spawn function to finish before continuing execution.
I have a code looking like this:
const spawn = window.require('child_process').spawn;
let processes = [];
let thing = [];
// paths.length = 2
paths.forEach((path) => {
  const pythonProcess = spawn("public/savefile.py", ['-d', '-j', '-p', path, tempfile]);
  pythonProcess.on('exit', () => {
    fs.readFile(tempfile, 'utf8', (err, data) => {
      thing.push(...)
    });
  });
  processes.push(pythonProcess);
});
console.log(processes); // Here we have 2 child processes
console.log(thing); // empty array.. the python processes didn't finish yet
return thing; // of course it doesn't work. I want to wait for all the processes to have finished their callbacks before continuing
As you can guess, I would like to know how I could run all the python scripts at the same time and wait for all of them to finish before continuing my js code.
I'm running node 10.15.3
Thank you
Use forEach to push a Promise for each child into an array of Promises, then await Promise.all().
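A minimal sketch of that suggestion, reusing paths and tempfile from the question (note that with a single shared tempfile the two children can race each other; in practice you would likely want one temp file per path):
const { spawn } = require('child_process');
const fs = require('fs');

// wrap each child in a Promise that resolves once its exit callback has run
const jobs = paths.map((path) => new Promise((resolve, reject) => {
  const pythonProcess = spawn('public/savefile.py', ['-d', '-j', '-p', path, tempfile]);
  pythonProcess.on('exit', () => {
    fs.readFile(tempfile, 'utf8', (err, data) => {
      if (err) return reject(err);
      resolve(data);
    });
  });
}));

// continue only after every child (and its readFile callback) has finished
Promise.all(jobs).then((thing) => {
  console.log(thing); // all outputs, in the same order as paths
});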
Have you tried spawnSync?
Is generally identical to spawn with the exception that the function
will not return until the child process has fully closed.
import { spawnSync } from "child_process";
spawnSync('ls', ['-la']);
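Note that spawnSync blocks the event loop, so with this approach the scripts run one after another rather than in parallel. A minimal sketch, again reusing paths and tempfile from the question:
const { spawnSync } = require('child_process');
const fs = require('fs');

const thing = [];
for (const path of paths) {
  // blocks until this child has fully exited
  spawnSync('public/savefile.py', ['-d', '-j', '-p', path, tempfile]);
  thing.push(fs.readFileSync(tempfile, 'utf8'));
}
console.log(thing); // safe now: every child has finished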

How to send input to a child process created with spawn in Node.js

I'm running Windows 10, and I have a program, let's call it program, that can be run from the command line. When run, it responds to commands the user enters. The user enters a command, presses the return key, and the program prints a response. I did not make this program and do not have the source, so I cannot modify it.
I want to run this program from within Node.js, and have my Node.js program act as the user, sending it commands and getting the responses. I spawn my program like this:
var spawn = require('child_process').spawn;
var child = spawn('program');
child.stdout.on('data', function(data) {
  console.log(`stdout: ${data}`);
});
Then I attempt to send it a command, for example, help.
child.stdin.write("help\n");
And nothing happens. If I manually run the program, type help, and press the return key, I get output. I want Node.js to run the program, send it input, and receive the output exactly as a human user would. I assumed that stdin.write() would send the program a command as if the user typed it in the console. However, as the program does not respond, I assume this is not the case. How can I send the program input?
I've seen many similar questions, but unfortunately the solutions their authors report as "working" did not work for me.
Sending input data to child process in node.js
I've seen this question and answer and tried everything in it with no success. I've tried ending the command with \r\n instead of \n. I've also tried adding the line child.stdin.end() after writing. Neither of these worked.
How to pass STDIN to node.js child process
This person, in their self-answer, says that they got theirs to work almost exactly as I'm doing it, but mine does not work.
Nodejs Child Process: write to stdin from an already initialised process
This person, in their self-answer, says they got it to work by writing their input to a file and then piping that file to stdin. This sounds overly complicated to send a simple string.
This worked for me, when running from Win10 CMD or Git Bash:
console.log('Running child process...');
const spawn = require('child_process').spawn;
const child = spawn('node');
// Also worked, from Git Bash:
//const child = spawn('cat');
child.stdout.on('data', (data) => {
  console.log(`stdout: "${data}"`);
});
child.stdin.write("console.log('Hello!');\n");
child.stdin.end(); // EOF
child.on('close', (code) => {
  console.log(`Child process exited with code ${code}.`);
});
Result:
D:\Martin\dev\node>node test11.js
Running child process...
stdout: "Hello!
"
Child process exited with code 0.
I also tried running aws configure like this; at first it didn't work because I sent only a single line, but when I sent four lines for the four expected input values, it worked.
Maybe your program expects special properties for stdin, like being a real terminal, and therefore doesn't take your input?
Or did you forget to send the EOF using child.stdin.end();? (If you remove that call from my example, the child waits for input forever.)
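A sketch of that aws configure case (the four values are hypothetical placeholders; aws configure prompts for access key ID, secret access key, region, and output format, in that order):
const { spawn } = require('child_process');
const child = spawn('aws', ['configure']);
child.stdout.on('data', (data) => process.stdout.write(`${data}`));
// one line per expected prompt
child.stdin.write('AKIA_EXAMPLE_KEY_ID\n');    // AWS Access Key ID (placeholder)
child.stdin.write('exampleSecretAccessKey\n'); // AWS Secret Access Key (placeholder)
child.stdin.write('us-east-1\n');              // Default region name
child.stdin.write('json\n');                   // Default output format
child.stdin.end(); // EOF, so the child doesn't wait for more input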
Here is what worked for me. I used child_process exec to create a child process. Inside this child process Promise, I handle the I/O part of the cmd given as a parameter. It's not perfect, but it's working.
Sample function call where you don't need any human input:
executeCLI("cat ~/index.html");
Sample function call where you interact with the AWS CLI:
executeCLI("aws configure --profile dev")
Code for custom executeCLI function.
var { exec } = require('child_process');
async function executeCLI(cmd) {
  console.log("About to execute this: ", cmd);
  var child = exec(cmd);
  return new Promise((resolve, reject) => {
    child.stdout.on('data', (data) => {
      console.log(`${data}`);
      // forward the human's keystrokes to the child so prompts can be answered interactively
      process.stdin.pipe(child.stdin);
    });
    // 'close' reports an exit code, not an error object
    child.on('close', function (code) {
      if (code !== 0) {
        console.log("Error executing cmd, exit code: ", code);
        reject(code);
      } else {
        resolve();
      }
    });
  });
}
Extract the user's code from the browser and save it into a file on your system using the fs module. Let that file be 'program.cpp', and save the user's input data in a text file ('input.txt').
Just as we can compile C++ code in our terminal using g++, we can use child_process to access the system toolchain and run the user's code.
execFile can be used to execute our program:
var { execFile } = require('child_process');
execFile("g++", ['program.cpp'], (err, stdout, stderr) => {
  if (err) {
    console.log("compilation error: ", err);
  } else {
    // run the compiled binary, feeding it input.txt via shell redirection
    execFile('./a.out', ['<', 'input.txt'], { shell: true }, (err, stdout, stderr) => {
      console.log("output: ", stdout);
    });
  }
});
In this code we simply require child_process and use its execFile function.
First we compile the code in program.cpp, which produces a.out as the default output file.
Then we run a.out, redirecting the contents of input.txt to its stdin.
You can then view the generated output in your terminal and pass it back to the user.
For more details you can check: Child Processes
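As an alternative to shell redirection (my addition, not part of the original answer), the synchronous variants of these functions accept an input option whose value is written directly to the child's stdin, so no shell is needed:
const { execFileSync } = require('child_process');
const fs = require('fs');

// compile, then run the binary with the contents of input.txt on its stdin
execFileSync('g++', ['program.cpp']);
const stdout = execFileSync('./a.out', {
  input: fs.readFileSync('input.txt'), // passed as stdin to the spawned process
  encoding: 'utf8',
});
console.log('output: ', stdout);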

Retrieve shell error output from child_process.spawn

I'm using child_process.spawn and need to capture the shell error that occurs when the command fails. According to this question, I should be able to do:
var child_process = require('child_process');
var python = child_process.spawn(
  'python', ["script.py", "someParam"]
);
python.on('error', function(error) {
  console.log("Error: bad command", error);
});
When I replace 'python', ["script.py", "someParam"] with banana, like in the linked question, it works, and the error is visible. But in my case, using python with arguments, the 'error' event is never called.
How can I capture shell errors from python?
According to the Node.js docs for the ChildProcess error event, it is only fired in a few situations:
The process could not be spawned, or
The process could not be killed, or
Sending a message to the child process failed for whatever reason.
To capture the shell error output, you can additionally listen to data events on the stdout and stderr of your spawned process:
python.stdout.on('data', function(data) {
  console.log(data.toString());
});
python.stderr.on('data', function(data) {
  console.error(data.toString());
});
To capture the shell error code, you can attach a listener to the exit event:
python.on('exit', function(code) {
  console.log("Exited with code " + code);
});
This thread is a little bit old, but I encountered this today while working on my test casing library. I realize the accepted answer already has a solution, but for me it is not clearly explained. Anyway, in case someone needs it, here is what you need to do.
The thing I realized is that if the Python interpreter encounters an error while executing your code, or rather, if your code has an error in it, it writes the error to the standard error stream and exits. So what you need to do in your Node.js code is listen to the stderr stream of the spawned process. In addition, everything passed to the print() function is written to the 'stdout' stream of the process.
So here is an example code:
const { spawn } = require('child_process');
const proc = spawn('python', ['main.py', '-c']);
proc.stderr.on('data', (data) => {
  // here data is of type Buffer
  console.log(data.toString());
});
proc.stdout.on('data', (data) => {
  // also a Buffer
  console.log(data.toString());
});
What happens should already be clear if you read the first part of my answer. One other thing you could do, instead of writing the data to the console, is redirect it to another stream; this can be really useful if you want to write output data to a file, for example. This is how you could do it:
const fs = require('fs');
const path = require('path');
const { spawn } = require('child_process');

const outputFile = path.join(__dirname, 'output.txt');
const errorFile = path.join(__dirname, 'error.txt');
const outputStream = fs.createWriteStream(outputFile, {
  encoding: "utf8",
  autoClose: true
});
const errorStream = fs.createWriteStream(errorFile, {
  encoding: "utf8",
  autoClose: true
});

const proc = spawn('python', ['main.py', '-c']);
proc.stdout.pipe(outputStream);
proc.stderr.pipe(errorStream);
What is happening here is that, using the pipe function, we send all data from the stdout and stderr of the process to the file streams. You also do not have to worry about the files existing; they will be created for you.

Piping data from child to parent in nodejs

I have a nodejs parent process that starts up another nodejs child process. The child process executes some logic and then returns output to the parent. The output is large and I'm trying to use pipes to communicate, as suggested in the documentation for the child.send() method (which works fine, BTW).
I would like someone to suggest how to properly build this communication channel. I want to be able to send data from parent to child and also from child to parent. I've started it a bit, but it is incomplete (it sends a message only from parent to child) and throws an error.
Parent File Code:
var child_process = require('child_process');
var opts = {
  stdio: [process.stdin, process.stdout, process.stderr, 'pipe']
};
var child = child_process.spawn('node', ['./b.js'], opts);
require('streamifier').createReadStream('test 2').pipe(child.stdio[3]);
Child file code:
var fs = require('fs');
// read from fd 3 (the extra 'pipe' set up by the parent)
var readable = fs.createReadStream(null, {fd: 3});
var chunks = [];
readable.on('data', function(chunk) {
  chunks.push(chunk);
});
readable.on('end', function() {
  console.log(chunks.join().toString());
});
The above code prints expected output ("test 2") along with the following error:
events.js:85
throw er; // Unhandled 'error' event
^
Error: shutdown ENOTCONN
at exports._errnoException (util.js:746:11)
at Socket.onSocketFinish (net.js:232:26)
at Socket.emit (events.js:129:20)
at finishMaybe (_stream_writable.js:484:14)
at afterWrite (_stream_writable.js:362:3)
at _stream_writable.js:349:9
at process._tickCallback (node.js:355:11)
at Function.Module.runMain (module.js:503:11)
at startup (node.js:129:16)
at node.js:814:3
Full Answer:
Parent's code:
var child_process = require('child_process');
var opts = {
  stdio: [process.stdin, process.stdout, process.stderr, 'pipe', 'pipe']
};
var child = child_process.spawn('node', ['./b.js'], opts);
child.stdio[3].write('First message.\n', 'utf8', function() {
  child.stdio[3].write('Second message.\n', 'utf8', function() {
  });
});
child.stdio[4].pipe(process.stdout);
Child's code:
var fs = require('fs');
// read from fd 3, write back on fd 4
var readable = fs.createReadStream(null, {fd: 3});
readable.pipe(process.stdout);
fs.createWriteStream(null, {fd: 4}).write('Sending a message back.');
Your code works, but by using the streamifier package to create a read stream from a string, your communication channel is automatically closed after that string is transmitted, which is the reason you get an ENOTCONN error.
To be able to send multiple messages over the stream, consider using .write on it. You can call this as often as you like:
child.stdio[3].write('First message.\n');
child.stdio[3].write('Second message.\n');
If you want to use this method to send multiple discrete messages (which I believe is the case based on your remark of using child.send() before), it's a good idea to use some separator symbol to be able to split the messages when the stream is read in the child. In the above example, I used newlines for that. A useful package for helping with this splitting is event-stream.
Now, in order to create another communication channel from the child to the parent, just add another 'pipe' to your stdio.
You can write to it in the child:
fs.createWriteStream(null, {fd: 4}).write('Sending a message back.');
And read from it in the parent:
child.stdio[4].pipe(process.stdout);
This will print 'Sending a message back.' to the console.
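On the message-splitting suggestion above: instead of the event-stream package, a dependency-free sketch using Node's built-in readline module to split incoming messages on newlines in the child:
var fs = require('fs');
var readline = require('readline');

// one 'line' event per newline-terminated message arriving on fd 3
var rl = readline.createInterface({
  input: fs.createReadStream(null, {fd: 3})
});
rl.on('line', function(message) {
  console.log('Got message:', message);
});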
I was running into the same issue and used the {end: false} option to fix the error. Unfortunately, the accepted answer works only when handling discrete writes of small amounts of data. If you have a lot of data (rather than just short messages), you need to handle flow control, and using .write() is not the best way to do that. For scenarios like this (large data transfers), it's better to use the .pipe() function, as in your original code, and let it handle flow control.
The error is thrown because, on ending, the readable stream in your parent process tries to end and close the writable input pipe of your child process. You should use the {end: false} option in the parent process's pipe call:
Original Code:
require('streamifier').createReadStream('test 2').pipe(child.stdio[3]);
Suggested Modification:
require('streamifier').createReadStream('test 2').pipe(child.stdio[3], {end:false});
See details here in the Node.js documentation: https://nodejs.org/dist/latest-v5.x/docs/api/stream.html#stream_readable_pipe_destination_options
Hope this helps someone else facing this problem.
You can do this with fork()
I just solved this one for myself... fork() is the higher-level version of spawn(), and it's generally recommended to use fork() instead of spawn().
If you use the {silent: true} option, the child's stdio will be piped to the parent process:
const path = require('path');
const cp = require('child_process');

// ee is assumed to be an EventEmitter created elsewhere
const n = cp.fork(<path>, args, {
  cwd: path.resolve(__dirname),
  detached: true,
  silent: true, // pipe the child's stdio to the parent instead of inheriting it
});

n.stdout.setEncoding('utf8');

// here we can listen to the stream of data coming from the child process:
n.stdout.on('data', (data) => {
  ee.emit('data', data);
});

// you can also listen to other events emitted by the child process
n.on('error', function (err) {
  console.error(err.stack);
  ee.emit('error', err);
});

n.on('message', function (msg) {
  ee.emit('message', msg);
});

// note: 'exit' reports an exit code and signal, not an error object
n.once('exit', function (code, signal) {
  ee.emit('exit', code, signal);
});
