How to send input to a child process created with spawn? (Node.js)

I'm running Windows 10, and I have a program, let's call it program, that can be run from the command line. When run, it responds to commands the user enters. The user enters a command, presses the return key, and the program prints a response. I did not make this program and do not have the source, so I cannot modify it.
I want to run this program from within Node.js, and have my Node.js program act as the user, sending it commands and getting the responses. I spawn my program like this:
var spawn = require('child_process').spawn;
var child = spawn('program');
child.stdout.on('data', function(data) {
console.log(`stdout: ${data}`);
});
Then I attempt to send it a command, for example, help.
child.stdin.write("help\n");
And nothing happens. If I manually run the program, type help, and press the return key, I get output. I want Node.js to run the program, send it input, and receive the output exactly as a human user would. I assumed that stdin.write() would send the program a command as if the user typed it in the console. However, as the program does not respond, I assume this is not the case. How can I send the program input?
I've seen many similar questions, but unfortunately the solutions their authors report as "working" did not work for me.
Sending input data to child process in node.js
I've seen this question and answer and tried everything in it with no success. I've tried ending the command with \r\n instead of \n. I've also tried adding the line child.stdin.end() after writing. Neither of these worked.
How to pass STDIN to node.js child process
This person, in their self-answer, says that they got theirs to work almost exactly as I'm doing it, but mine does not work.
Nodejs Child Process: write to stdin from an already initialised process
This person, in their self-answer, says they got it to work by writing their input to a file and then piping that file to stdin. This sounds overly complicated to send a simple string.

This worked for me, when running from Win10 CMD or Git Bash:
console.log('Running child process...');
const spawn = require('child_process').spawn;
const child = spawn('node');
// Also worked, from Git Bash:
//const child = spawn('cat');
child.stdout.on('data', (data) => {
console.log(`stdout: "${data}"`);
});
child.stdin.write("console.log('Hello!');\n");
child.stdin.end(); // EOF
child.on('close', (code) => {
console.log(`Child process exited with code ${code}.`);
});
Result:
D:\Martin\dev\node>node test11.js
Running child process...
stdout: "Hello!
"
Child process exited with code 0.
I also tried running aws configure like this; at first it didn't work because I sent only a single line, but when I sent four lines for the four expected input values, it worked.
Maybe your program expects special properties for stdin, like being a real terminal, and therefore doesn't take your input?
Or did you forget to send the EOF using child.stdin.end();? (If you remove that call from my example, the child waits for input forever.)

Here is what worked for me. I used child_process.exec to create a child process, and inside the Promise that wraps it I handle the I/O of the cmd given as a parameter. It's not perfect, but it works.
Sample function call where you don't need any human input:
executeCLI("cat ~/index.html");
Sample function call where you interact with the AWS CLI:
executeCLI("aws configure --profile dev")
Code for the custom executeCLI function:
var { exec } = require('child_process');
async function executeCLI(cmd) {
  console.log("About to execute this: ", cmd);
  var child = exec(cmd);
  // Forward our stdin to the child once, so the user can answer its prompts.
  process.stdin.pipe(child.stdin);
  return new Promise((resolve, reject) => {
    child.stdout.on('data', (data) => {
      console.log(`${data}`);
    });
    // Note: 'close' fires with (exit code, signal), not an error object.
    child.on('close', (code) => {
      process.stdin.unpipe(child.stdin);
      if (code !== 0) {
        console.log("Error executing cmd, exit code: ", code);
        reject(code);
      } else {
        resolve(code);
      }
    });
  });
}

Extract the user's code from the browser and save it to a file on your system using the fs module. Let that file be 'program.cpp', and save the user's input data in a text file such as 'input.txt'.
Just as we can compile our C++ code in a terminal using g++, we can use child_process to access the system shell and run the user's code.
execFile can be used for executing our program:
var { execFile } = require('child_process');
execFile("g++", ['program.cpp'], (err, stdout, stderr) => {
if (err) {
console.log("compilation error: ",err);
} else{
execFile ('./a.out' ,['<', 'input.txt'], {shell: true}, (err, stdout, stderr) => {
console.log("output: ", stdout);
})
}
})
In this code we simply require child_process and use its execFile function.
First we compile the code in program.cpp, which creates a.out as the default output file.
Then we run a.out, redirecting the contents of input.txt to its stdin (the shell: true option is what makes the < redirection work).
Hence you can view the generated output in your terminal and pass it back to the user.
For more details you can check: Child Processes

Related

How do i send a large text file from node to python as a child process

I want to send the text file I upload to my node.js app to python in order to do some machine learning and be able to use the pandas and NumPy libraries.
I have never worked with child processes before, but I found a tutorial and came up with the following code:
app.post("/upload-txt", uploads.single("txt"), (req, res) => {
//convert csvfile to jsonArray
//const fileName = req.file.originalname;
let contents = readTextFile.readSync(req.file.path);
var largeDataSet = [];
// spawn new child process to call the python script
const python = spawn("python", ["script4.py", contents]);
// collect data from script
python.stdout.on("data", function (data) {
console.log("Pipe data from python script ...");
largeDataSet.push(data);
});
// in close event we are sure that stream is from child process is closed
python.on("close", (code) => {
console.log(`child process close all stdio with code ${code}`);
// send data to browser
res.send(largeDataSet.join(""));
});
});
This works fine for a limited number of lines in the text file, but as soon as the file gets even remotely larger it does not. How do I fix this?
It appears that your data is sometimes too long for the command line in whatever OS you're using. So, instead of passing the data itself on the command line, pass the filename and modify the Python script to read the filename from its arguments, load the data itself (which won't be limited by command-line length limits), and then process that data.
app.post("/upload-txt", uploads.single("txt"), (req, res) => {
//convert csvfile to jsonArray
var largeDataSet = [];
// spawn new child process to call the python script
// pass the python script the uploaded file as an argument
const python = spawn("python", ["script4.py", req.file.path]);
// collect data from script
python.stdout.on("data", function (data) {
console.log("Pipe data from python script ...");
largeDataSet.push(data);
});
// in close event we are sure that stream is from child process is closed
python.on("close", (code) => {
console.log(`child process close all stdio with code ${code}`);
// send data to browser
res.send(largeDataSet.join(""));
});
python.on("error", (err) => {
console.log(err);
res.sendStatus(500);
});
});

Send text to a .bat file that is run from NodeJS

I want to start a .bat file from within NodeJS. With that, I would then be able to send messages to that running bat file.
My code:
const childprocess = require("child_process")
const mybat = childprocess.exec("start cmd /c my.bat", () => {
console.log("bat file has finished")
})
// some time later in another function
mybat.send("text to send")
// within the bat, it would use the new message "text to send" as if you typed and sent a message in the cmd terminal
// ...
mybat.send("a")
// sending any key to complete a PAUSE command which will close the cmd
The .send() isn't a working function, but hopefully it demonstrates what I'm trying to accomplish. Everything except the send calls works fine.
The following code uses @rauschma/stringio to asynchronously write to the stdin of a child process running a shell command:
const {streamWrite, streamEnd, onExit} = require('@rauschma/stringio');
const {spawn} = require('child_process');
async function main() {
const sink = spawn('cmd.exe', ['/c', 'my.bat'],
{stdio: ['pipe', process.stdout, process.stderr]}); // (A)
writeToWritable(sink.stdin); // (B)
await onExit(sink);
console.log('bat file has finished');
}
main();
async function writeToWritable(writable) {
...
await streamWrite(writable, 'text to send\n');
...
await streamWrite(writable, 'a');
...
await streamEnd(writable);
}
We spawn a separate process, called sink, for the shell command. writeToWritable writes to sink.stdin. It does so asynchronously and pauses via await, to avoid requiring too much buffering.
Observations:
In line A, we tell spawn() to let us access stdin via sink.stdin ('pipe'). stdout and stderr are forwarded to process.stdout and process.stderr, as previously.
We don’t await in line B for the writing to finish. Instead, we await until the child process sink is done.

Node: How to stream STDERR content from child_process.spawn()?

I'm using Node's child_process.spawn() to run shell commands, with the following code:
const process = require('child_process').spawn('whoami');
// this works...
process.stdout.on('data', function(buf) {
console.log('HERE IS SOME STDOUT CONTENT "%s"', String(buf));
});
// this never works...
process.stderr.on('data', function(buf) {
console.log('HERE IS SOME STDERR CONTENT "%s"', String(buf));
});
// this works, but doesn't let me stream STDERR content...
process.on('error', (err) => {
console.log('there was an error: ' + err);
});
Running the above valid command whoami lets me read the STDOUT data inside the function passed to process.stdout.on.
If I change my command to something invalid (that produces some STDERR content) like...
const process = require('child_process').spawn('whoami BADARGUMENTTOBREAKTHINGS');
...the whoami command outputs an error message to STDERR (in a normal shell), but in Node my function inside process.stderr.on is never executed. I never see the HERE IS SOME STDERR CONTENT message.
I've also tried some other invalid shell commands like cd folderthatdoesntexist and ls filenamethatdoesntexist that should all produce STDERR content.
After typing my question out, I figured out that it was as simple as the fact that you can't have spaces in the 1st argument to .spawn() ... you have to pass the rest of the command in as the 2nd argument with an array, i.e.
const process = require('child_process').spawn('whoami', ['BADARGUMENTTOBREAKTHINGS']);
This was unexpected because .exec() accepts a whole command line with spaces in its 1st argument (it passes the string to a shell for parsing).
Hopefully this can help someone else having the same issue in the future.

How to get the output of a spawned child_process in Node.JS?

First of all, I'm a complete noob and started using Node.JS yesterday (it was also my first time using Linux in years) so please be nice and explicit
I'm currently making a Node.JS program which has to, among other things, launch shell commands (mainly : mount an usb drive).
I'm currently using
var spawn = require('child_process').spawnSync;
function shspawn(command) {
spawn('sh', ['-c', command], { stdio: 'inherit' });
}
shspawn('echo Hello world');
shspawn('mkdir newdir');
etc. which is a really comfortable way to do it for me.
The problem is that I'd like to store the output of, for example, a "ls" command in a variable, in a way like
var result = shspawn('ls -l')
I've read some examples online but they rarely use spawn and when they do, it doesn't work for me (I guess I may do something wrong, but again I'm a noob in Node)
If you guys have a better idea than using child_process.spawnSync I'm open to any idea, but I'd like, as far as possible, to keep my program package-free :)
EDIT: I need it to work synchronously! That's why I've started using spawnSync. I will be using commands like dd that take time and need to be fully finished before the program moves on to another command.
You can do it like below.
var spawn = require('child_process').spawn;
// Create a child process
var child = spawn('ls' , ['-l']);
child.stdout.on('data',
function (data) {
console.log('ls command output: ' + data);
});
child.stderr.on('data', function (data) {
//throw errors
console.log('stderr: ' + data);
});
child.on('close', function (code) {
console.log('child process exited with code ' + code);
});
Update: with spawnSync
var spawn = require('child_process').spawnSync;
var child = spawn('ls' , ['-l','/usr']);
console.log('stdout here: \n' + child.stdout);

Retrieve shell error output from child_process.spawn

I'm using child_process.spawn and need to capture the shell error that occurs when the command fails. According to this question, I should be able to do:
var child_process = require('child_process');
var python = child_process.spawn(
'python', ["script.py", "someParam"]
);
python.on('error', function(error) {
console.log("Error: bad command", error);
});
When I replace 'python', ["script.py", "someParam"] with banana, like in the linked question, it works, and the error is visible. But in my case, using python with arguments, the 'error' event is never called.
How can I capture shell errors from python?
According to the Node.js docs for the ChildProcess error event, it is only fired in a few situations:
The process could not be spawned, or
The process could not be killed, or
Sending a message to the child process failed for whatever reason.
To capture the shell error output, you can additionally listen to data events on the stdout and stderr of your spawned process:
python.stdout.on('data', function(data) {
console.log(data.toString());
});
python.stderr.on('data', function(data) {
console.error(data.toString());
});
To capture the shell error code, you can attach a listener to the exit event:
python.on('exit', function(code) {
console.log("Exited with code " + code);
});
This thread is a little bit old, but I encountered this today while working on my test-casing library. I realize that the accepted answer has a solution already, but, for me, it is not clearly explained. Anyway, in case someone needs it, here is what you need to do.
The thing I realized is that if the Python interpreter encounters an error while executing your code, or should I say, if your code has an error in it, it writes the error to the standard error stream and exits. So what you need to do in your Node.js code is listen to the stderr stream of the spawned process. In addition, all data passed to the print() function is written to the 'stdout' stream of the process.
So here is an example code:
const { spawn } = require('child_process');
const proc = spawn('python',['main.py','-c']);
proc.stderr.on('data',(data)=>{
//Here data is of type buffer
console.log(data.toString())
})
proc.stdout.on('data',(data)=>{
//Also buffer
console.log(data.toString());
})
What happens should already be clear if you read the first part of my answer. One other thing you could do, instead of writing data to the console, is redirect it to another stream; this can be really useful if you want to write the output data to a file, for example. This is how you could do it:
const fs = require('fs');
const path = require('path');
const { spawn } = require('child_process');
const outputFile = path.join(__dirname,'output.txt');
const errorFile = path.join(__dirname,'error.txt');
const outputStream = fs.createWriteStream(outputFile, {
encoding: "utf8",
autoClose: true
});
const errorStream = fs.createWriteStream(errorFile, {
encoding: "utf8",
autoClose: true
});
const proc = spawn('python',['main.py','-c']);
proc.stdout.pipe(outputStream);
proc.stderr.pipe(errorStream);
What is happening here is that, using the pipe function, we send all data from the stdout and stderr of the process to the file streams. You also do not have to worry about the files existing; createWriteStream creates them for you.