Piping data from child to parent in Node.js

I have a Node.js parent process that starts up another Node.js child process. The child process executes some logic and then returns output to the parent. The output is large, and I'm trying to use pipes to communicate, as suggested in the documentation for the child.send() method (which works fine, BTW).
I would like someone to suggest how to properly build this communication channel. I want to be able to send data from parent to child and also from child to parent. I've started on it, but it is incomplete (it only sends a message from parent to child) and it throws an error.
Parent file code:
var child_process = require('child_process');

var opts = {
  stdio: [process.stdin, process.stdout, process.stderr, 'pipe']
};

var child = child_process.spawn('node', ['./b.js'], opts);

require('streamifier').createReadStream('test 2').pipe(child.stdio[3]);
Child file code:
var fs = require('fs');

// read from it
var readable = fs.createReadStream(null, {fd: 3});

var chunks = [];
readable.on('data', function(chunk) {
  chunks.push(chunk);
});
readable.on('end', function() {
  console.log(chunks.join().toString());
});
The above code prints expected output ("test 2") along with the following error:
events.js:85
throw er; // Unhandled 'error' event
^
Error: shutdown ENOTCONN
at exports._errnoException (util.js:746:11)
at Socket.onSocketFinish (net.js:232:26)
at Socket.emit (events.js:129:20)
at finishMaybe (_stream_writable.js:484:14)
at afterWrite (_stream_writable.js:362:3)
at _stream_writable.js:349:9
at process._tickCallback (node.js:355:11)
at Function.Module.runMain (module.js:503:11)
at startup (node.js:129:16)
at node.js:814:3
Full Answer:
Parent's code:
var child_process = require('child_process');

var opts = {
  stdio: [process.stdin, process.stdout, process.stderr, 'pipe', 'pipe']
};

var child = child_process.spawn('node', ['./b.js'], opts);

child.stdio[3].write('First message.\n', 'utf8', function() {
  child.stdio[3].write('Second message.\n', 'utf8', function() {});
});

child.stdio[4].pipe(process.stdout);
Child's code:
var fs = require('fs');

// read from fd 3
var readable = fs.createReadStream(null, {fd: 3});
readable.pipe(process.stdout);

// send a message back to the parent on fd 4
fs.createWriteStream(null, {fd: 4}).write('Sending a message back.');

Your code works, but by using the streamifier package to create a read stream from a string, your communication channel is automatically closed after that string is transmitted, which is the reason you get an ENOTCONN error.
To be able to send multiple messages over the stream, consider using .write on it. You can call this as often as you like:
child.stdio[3].write('First message.\n');
child.stdio[3].write('Second message.\n');
If you want to use this method to send multiple discrete messages (which I believe is the case based on your remark of using child.send() before), it's a good idea to use some separator symbol to be able to split the messages when the stream is read in the child. In the above example, I used newlines for that. A useful package for helping with this splitting is event-stream.
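For example, here is a minimal sketch of the child side using Node's built-in readline module (instead of the event-stream package) to split the incoming stream on newlines; it assumes the same fd 3 layout as the code above:

var fs = require('fs');
var readline = require('readline');

// Treat fd 3 as a stream of newline-delimited messages.
var rl = readline.createInterface({
  input: fs.createReadStream(null, {fd: 3})
});

rl.on('line', function(message) {
  // Each 'line' event is one complete message from the parent.
  console.log('Received: ' + message);
});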
Now, in order to create another communication channel from the child back to the parent, just add another 'pipe' to your stdio.
You can write to it in the child:
fs.createWriteStream(null, {fd: 4}).write('Sending a message back.');
And read from it in the parent:
child.stdio[4].pipe(process.stdout);
This will print 'Sending a message back.' to the console.

I was running into the same issue and used the {end: false} option to fix the error. Unfortunately, the accepted answer only works for discrete writes of small amounts of data. If you have a lot of data (rather than just short messages), you need to handle flow control, and .write() is not the best tool for that. For scenarios like this (large data transfers), it's better to use .pipe(), as in your original code, so flow control is handled for you.
The error is thrown because the readable stream in your parent process tries to end and close the writable input pipe of your child process when it finishes. You should use the {end: false} option in the parent process's pipe call:
Original Code:
require('streamifier').createReadStream('test 2').pipe(child.stdio[3]);
Suggested Modification:
require('streamifier').createReadStream('test 2').pipe(child.stdio[3], {end:false});
See details here from the NodeJs documentation: https://nodejs.org/dist/latest-v5.x/docs/api/stream.html#stream_readable_pipe_destination_options
Hope this helps someone else facing this problem.
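For the large-data case, here is a sketch of what the piping approach can look like, assuming the same stdio layout as in the question ('large-input.dat' is just a placeholder file name):

// Parent file (child spawned with the extra 'pipe' at index 3, as above):
var fs = require('fs');
fs.createReadStream('large-input.dat')           // placeholder file name
  .pipe(child.stdio[3], {end: false});           // keep the channel open for later writes

// Child file (with fs required as before): consuming fd 3 with pipe()
// lets Node handle backpressure for you.
fs.createReadStream(null, {fd: 3}).pipe(process.stdout);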

You can do this with fork()
I just solved this one for myself... fork() is the higher-level version of spawn(), and it's generally recommended to use fork() instead of spawn().
If you use the {silent: true} option, the child's stdio is piped back to the parent process instead of being inherited:
const path = require('path');
const cp = require('child_process');
const EventEmitter = require('events');

const ee = new EventEmitter();   // re-emit child events to the rest of the app

const n = cp.fork(<path>, args, {
  cwd: path.resolve(__dirname),
  detached: true,
  silent: true,   // pipe the child's stdio to the parent instead of inheriting it
});

n.stdout.setEncoding('utf8');

// here we can listen to the stream of data coming from the child process:
n.stdout.on('data', (data) => {
  ee.emit('data', data);
});

// you can also listen to other events emitted by the child process
n.on('error', function (err) {
  console.error(err.stack);
  ee.emit('error', err);
});

n.on('message', function (msg) {
  ee.emit('message', msg);
});

// note: 'exit' reports a code and signal, not an error object
// (ChildProcess does not emit 'uncaughtException'; that is a process-level event inside the child)
n.once('exit', function (code, signal) {
  ee.emit('exit', code, signal);
});
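For completeness, here is a sketch of what the forked child script (the file passed as <path>) might contain; the specific messages are made up for illustration:

// Child script - a minimal sketch, not part of the original answer.
process.on('message', function (msg) {
  // Messages sent from the parent with n.send(...) arrive here over the IPC
  // channel that fork() sets up automatically.
  console.log('parent says: ' + JSON.stringify(msg));
});

// With {silent: true} in the parent, this write shows up there on n.stdout.
process.stdout.write('some data for the parent\n');

// Reply to the parent over the IPC channel.
process.send({ ready: true });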

Related

Node.js child process isn't receiving stdin unless I close the stdin stream

I'm building a Discord bot in Node.js that wraps a Terraria server, so that server users can restart the server and perform similar actions. I've managed to finish half the job, but I can't seem to create a command that executes commands on the Terraria server. I've set it to write the command to the stdin of the child process, and some basic debugging verifies that it does, but apparently nothing happens.
In the Node.js docs for child process stdin, it says "Note that if a child process waits to read all of its input, the child will not continue until this stream has been closed via end()." This seems likely to be the problem, as calling the end() function on it does actually send the command as expected. That said, it seems hard to believe that I'm unable to continuously send commands to stdin without having to close it.
Is this actually the problem, and if so what are my options for solving it? My code may be found below.
const discordjs = require("discord.js");
const child_process = require("child_process");
const tokens = require("./tokens");

const client = new discordjs.Client();

const terrariaServerPath = "C:\\Program Files (x86)\\Steam\\steamapps\\common\\Terraria\\TerrariaServer.exe"
const terrariaArgs = ['-port', '7777', "-maxplayers", "8", "-world", "test.wld"]

var child = child_process.spawn(terrariaServerPath, terrariaArgs);

client.on('ready', () => {
  console.log(`Logged in as ${client.user.tag}!`);
});

client.on('disconnect', () => {
  client.destroy();
});

client.on('message', msg => {
  if (msg.channel.name === 'terraria') {
    var msgSplit = msg.content.split(" ");
    if (msgSplit[0] === "!restart") {
      child.kill();
      child = child_process.spawn(terrariaServerPath, terrariaArgs);
      registerStdio();
      msg.reply("restarting server")
    }
    if (msgSplit[0] === "!exec") {
      msg.reply(msgSplit[1]);
      child.stdin.write(msgSplit[1] + "\n");
      child.stdin.end();
    }
  }
});

client.login(tokens.discord_token);

var registerStdio = function () {
  child.stdout.on('data', (data) => {
    console.log(`${data}`);
  });
  child.stderr.on('data', (data) => {
    console.error(`${data}`);
  });
}

registerStdio();
I was able to solve the problem by using the library node-pty. As near as I can tell, the problem was that the child process was not reading the stdin itself and I was unable to flush it. Node-pty creates a virtual terminal object which can be written to instead of stdin. This object does not buffer writes and so any input is immediately sent to the program.
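For reference, a rough sketch of what that looks like with node-pty, reusing terrariaServerPath and terrariaArgs from the code above (the exact options shown are illustrative, not taken from the original post):

const pty = require('node-pty');

// Spawn the server inside a pseudo-terminal instead of plain pipes.
const term = pty.spawn(terrariaServerPath, terrariaArgs, {
  name: 'xterm-color',
  cols: 120,
  rows: 30,
  cwd: process.cwd(),
  env: process.env
});

// Output arrives as it is produced; nothing has to be closed first.
term.onData((data) => {
  console.log(data);
});

// Commands can be written repeatedly; '\r' acts as pressing Enter.
term.write('help\r');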

Promise resolving to child stream stdout and rejecting child stream stderr

I'd like to build a promise that spawns a child process using require('child_process').spawn. The process streams its output to stdout and its errors to stderr.
I would like the promise to:
reject(child.stderr stream (or its data)) if child.stderr emits any data.
resolve(child.stdout stream) only if no error is emitted.
I'm doing this because I want to chain the promise to:
a then that processes the child.stdout stream (upload the stream to an S3 bucket).
a catch that can process the child.stderr stream, allowing me to properly handle errors.
Is it feasible to combine promises and process streams like this ?
I was thinking of working around stderr, but I'm unsure what happens to stdout in the meantime if a lot of data comes into it and I don't process it fast enough.
As I see it, the issue is that you don't know whether you ever got data on stderr until the entire process is done as it could put data there at any time.
So, you have to wait for the entire process to be done before calling resolve() or reject(). And, if you then want the entire data to be sent to either one of those, you'd have to buffer them. You could call reject() as soon as you got data on stderr, but you aren't guaranteed to have all the data yet because it's a stream.
So, if you don't want to buffer, you're better off just letting the caller see the streams directly.
If you are OK with buffering the data, you can do that yourself. Based on the spawn example in the Node.js docs, you could add promise support like this:
const spawn = require('child_process').spawn;

function runIt(cmd, args) {
  return new Promise(function(resolve, reject) {
    const ls = spawn(cmd, args);

    // Edit thomas.g: My child process generates binary data so I use buffers instead, see my comments inside the code
    // Edit thomas.g: let stdoutData = Buffer.alloc(0)
    let stdoutData = "";
    let stderrData = "";

    ls.stdout.on('data', (data) => {
      // Edit thomas.g: stdoutData = Buffer.concat([stdoutData, data]);
      stdoutData += data;
    });

    ls.stderr.on('data', (data) => {
      stderrData += data;
    });

    ls.on('close', (code) => {
      if (stderrData) {
        reject(stderrData);
      } else {
        resolve(stdoutData);
      }
    });

    ls.on('error', (err) => {
      reject(err);
    });
  });
}

// usage
runIt('ls', ['-lh', '/usr']).then(function(stdoutData) {
  // process stdout data here
}, function(err) {
  // process stderr data here, or an error object (if some other type of error)
});
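If you would rather not buffer at all (closer to what the question asks for), one alternative is to hand the caller the stdout stream immediately and use a separate promise only to signal success or failure. A sketch of that idea (runStreaming is a made-up name, not an established API):

const spawn = require('child_process').spawn;

function runStreaming(cmd, args) {
  const child = spawn(cmd, args);
  const done = new Promise(function(resolve, reject) {
    let stderrData = '';
    child.stderr.on('data', (data) => { stderrData += data; });
    child.on('error', reject);
    child.on('close', (code) => {
      if (code !== 0 || stderrData) {
        reject(new Error(stderrData || ('exited with code ' + code)));
      } else {
        resolve();
      }
    });
  });
  return { stdout: child.stdout, done: done };
}

// usage: start consuming stdout right away, watch the promise for failure
const run = runStreaming('ls', ['-lh', '/usr']);
run.stdout.pipe(process.stdout);   // e.g. pipe into an S3 upload stream instead
run.done.catch(function(err) { console.error(err.message); });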

Not receiving stdout from nodejs spawned process

I'm trying to have nodejs interact with adventure, an old text based game. The idea is to open adventure as a child process and then play the game by writing to its stdin and placing an event listener on stdout.
When the game starts, it prints an initial:
Welcome to Adventure!! Would you like instructions?
So to illustrate my problem, I have a nodejs+express instance with:
var childProcess = require('child_process');
var spawn = childProcess.spawn;

var child = spawn('adventure');
console.log("spawned: " + child.pid);

child.stdout.on('data', function(data) {
  console.log("Child data: " + data);
});
child.on('error', function () {
  console.log("Failed to start child.");
});
child.on('close', function (code) {
  console.log('Child process exited with code ' + code);
});
child.stdout.on('end', function () {
  console.log('Finished collecting data chunks.');
});
But when I start the server, the text from the game doesn't reach the event listener:
spawned: 24250
That's all the output I get. The child.stdout.on('data') event listener is never called. Why isn't that initial line from the game being picked up?
If I append the following line to the above block of JavaScript, the program output appears all at once. So adventure runs, and I can now force it to trigger the child.stdout 'data' listener... but this also ends the child process, which defeats the purpose of reading from and writing to it.
...
child.stdout.on('end', function () {
  console.log('Finished collecting data chunks.');
});
child.stdin.end();
Now the output is:
spawned: 28778
Child data:
Welcome to Adventure!! Would you like instructions?
user closed input stream, quitting...
Finished collecting data chunks.
Child process exited with code 0
I'm sure its a trivial oversight on my part, I appreciate any help figuring this out though.
After going through the Node.js documentation a few more times, I convinced myself I was either missing something pretty big, or the spawn command wasn't working correctly. So I created a GitHub issue.
The answer was posted immediately: the child process buffers its output when it isn't attached to a terminal, so you have to disable that buffering if you want prompt access to it.
So to achieve what I was originally looking for:
var childProcess = require('child_process');
var spawn = childProcess.spawn;

var child = spawn('unbuffer', ['adventure']);
console.log("spawned: " + child.pid);

child.stdout.on('data', function(data) {
  console.log("Child data: " + data);
});
child.on('error', function () {
  console.log("Failed to start child.");
});
child.on('close', function (code) {
  console.log('Child process exited with code ' + code);
});
child.stdout.on('end', function () {
  console.log('Finished collecting data chunks.');
});
The functional difference is the use of unbuffer, a command that disables the child's output buffering.
Why isn't that initial line from the game being picked up?
I had the same problem on a project that called a compiled C++ program from node.js. I realized the problem was in the compiled C++ program: I wasn't flushing the stdout stream. Adding fflush(stdout); after printing a line solved the issue. Hopefully you still have access to the source of the game!
The data passed is a buffer type, not a string. Therefore, you need a decoder to read that buffer and then do the logging.
Here's how to do that.
var StringDecoder = require('string_decoder').StringDecoder;
var decoder = new StringDecoder('utf8');

child.stdout.on('data', function (data) {
  var message = decoder.write(data);
  console.log(message.trim());
});

Retrieve shell error output from child_process.spawn

I'm using child_process.spawn and need to capture the shell error that occurs when the command fails. According to this question, I should be able to do:
var child_process = require('child_process');

var python = child_process.spawn(
  'python', ["script.py", "someParam"]
);

python.on('error', function(error) {
  console.log("Error: bad command", error);
});
When I replace 'python', ["script.py", "someParam"] with banana, like in the linked question, it works, and the error is visible. But in my case, using python with arguments, the 'error' event is never called.
How can I capture shell errors from python?
According to the Node.js docs for the ChildProcess error event, it is only fired in a few situations:
The process could not be spawned, or
The process could not be killed, or
Sending a message to the child process failed for whatever reason.
To capture the shell error output, you can additionally listen to data events on the stdout and stderr of your spawned process:
python.stdout.on('data', function(data) {
  console.log(data.toString());
});

python.stderr.on('data', function(data) {
  console.error(data.toString());
});
To capture the shell error code, you can attach a listener to the exit event:
python.on('exit', function(code) {
  console.log("Exited with code " + code);
});
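Putting those pieces together, a minimal self-contained sketch (the script name and arguments are carried over from the question):

var child_process = require('child_process');

var python = child_process.spawn('python', ["script.py", "someParam"]);

python.on('error', function(error) {
  // Fires only if the process could not be spawned or killed, or if
  // sending a message to it failed.
  console.log("Error: bad command", error);
});

python.stderr.on('data', function(data) {
  // Anything the script writes to stderr (tracebacks, etc.) shows up here.
  console.error(data.toString());
});

python.on('exit', function(code) {
  console.log("Exited with code " + code);
});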
This thread is a little old, but I ran into the same thing today while working on my test-casing library. I realize the accepted answer already has a solution, but for me it was not clearly explained. Anyway, in case someone needs it, here is what you need to do.
What I realized is that if the Python interpreter encounters an error while executing your code (or rather, if your code has an error in it), it writes the error to the standard error stream and exits. So what you need to do in your Node.js code is listen to the stderr stream of the spawned process. In addition, everything passed to the print() function is written to the process's stdout stream.
So here is an example code:
const { spawn } = require('child_process');

const proc = spawn('python', ['main.py', '-c']);

proc.stderr.on('data', (data) => {
  // Here data is of type Buffer
  console.log(data.toString());
});

proc.stdout.on('data', (data) => {
  // Also a Buffer
  console.log(data.toString());
});
What happens should already be clear if you read the first part of my answer. One other thing you could do, instead of writing the data to the console, is redirect it to another stream. This can be really useful if you want to write the output to a file, for example. This is how you could do it:
const fs = require('fs');
const path = require('path');
const { spawn } = require('child_process');

const outputFile = path.join(__dirname, 'output.txt');
const errorFile = path.join(__dirname, 'error.txt');

const outputStream = fs.createWriteStream(outputFile, {
  encoding: "utf8",
  autoClose: true
});

const errorStream = fs.createWriteStream(errorFile, {
  encoding: "utf8",
  autoClose: true
});

const proc = spawn('python', ['main.py', '-c']);

proc.stdout.pipe(outputStream);
proc.stderr.pipe(errorStream);
What is happening here is that, using the pipe function, we send all the data from the process's stdout and stderr to the file streams. You also don't have to worry about the files existing; createWriteStream will create them for you.

Node.js Error Sending Stream to Mplayer, Sending File Works Fine

I'm trying to spawn mplayer as a child process, sometimes with a file to play and other times with a stream to play. The file works fine, but when I create a stream from the file, I get this error:
events.js:85
throw er; // Unhandled 'error' event
^
Error: read ECONNRESET
at exports._errnoException (util.js:746:11)
at Pipe.onread (net.js:550:26)
It's not a file permissions problem because I ran it as sudo and still had the same issue.
To test a potential problem with streaming, I created a write stream from the read stream and that worked without a problem.
I'm not really sure what to do next. I'd appreciate any advice or help.
Here's the code:
var fs = require('fs');
var spawn = require('child_process').spawn;

var file = "/usr/local/apps/ha6/web/voice/ga.wav";
var file2 = "/usr/local/apps/ha6/web/voice/ga2.wav";

function filePlay() {
  var mplayer = spawn("mplayer", ["-slave", file], {stdio: ['pipe', 'ignore', 'ignore']});
  mplayer.on("exit", function () {
    console.log("exit");
  });
}

function streamPlay() {
  var str = fs.createReadStream(file).on("error", function (error) {
    console.log("Error creating read stream:" + error);
  });
  var mplayer = spawn("mplayer", ["-slave "], {stdio: ['pipe', 'ignore', 'ignore']}).on("error", function (error) {
    console.log("Spawn error " + error);
  });
  str.pipe(mplayer.stdin);
}

function testPiping() {
  var str = fs.createReadStream(file).on("error", function (error) {
    console.log("Error creating read stream:" + error);
  });
  var str2 = fs.createWriteStream(file2).on("error", function(error) {
    console.log("Error creating write stream:" + error);
  });
  str.pipe(str2);
  console.log("Pipe a success!");
}

filePlay();    // works fine
testPiping();  // works fine
streamPlay();  // ECONNRESET error
I would guess that the reason the error looks so bad is that there is an underlying problem with net.Socket.on, as seen here, and that the code was later revised here. So the joyent/node code appears to be where the underlying error originates. My suggestion is a catch-all error trap that "swallows" the error and writes it to the log.
What is TCP RST? Read about it here. Sometimes a router or your ISP wants to validate that an open TCP connection is still listening, so they inject a reset. Your app's stack is expected to respond in a way that satisfies them so the session isn't dropped.
Shorter answer: try upgrading your joyent/node version to the latest.
From this answer, "Clean and correct solution: Technically, in node, whenever you emit an 'error' event and no one listens to it, it will throw. To make it not throw, put a listener on it and handle it yourself. That way you can log the error with more information.", the suggestion is to listen for (trap) the error so that it doesn't throw an exception. The joyent code is throwing the error; if you don't catch (trap) it, it rises to the point of bringing the process down.
Try
stream.on("error", function(e) { console.log(e); });
...somewhere before the call. You're basically separating the error handling from the stream itself. If the child process isn't running yet, or has closed before the parent writes to it, then there's no binding. <- Guess
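Applied to the streamPlay() function from the question, that might look like the sketch below; it only traps and logs the error rather than fixing its underlying cause:

function streamPlay() {
  var str = fs.createReadStream(file).on("error", function (error) {
    console.log("Error creating read stream: " + error);
  });
  var mplayer = spawn("mplayer", ["-slave"], {stdio: ['pipe', 'ignore', 'ignore']}).on("error", function (error) {
    console.log("Spawn error " + error);
  });
  // Trap errors emitted on the pipe into mplayer's stdin (e.g. ECONNRESET,
  // EPIPE) so they are logged instead of crashing the process.
  mplayer.stdin.on("error", function (error) {
    console.log("stdin pipe error: " + error);
  });
  str.pipe(mplayer.stdin);
}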
