I'll illustrate the problem with the following code.
child.js :
process.stdin.resume();
process.stdin.on('data', function(data) {
process.stdout.write(data + '\n');
process.stdout.write('world\n');
process.stdout.write('greatings, earthlings!\n');
});
parent.js :
var spawn = require('child_process').spawn;
var child = spawn('node', ['child.js']);
child.stdin.write('hello');
child.stdout.on('data', function(data) {
process.stdout.write('child echoed with: ' + data);
});
And in the Windows cmd, I run
node parent.js
it outputs:
child echoed with: hello
child echoed with: world
greatings, earthlings!
Here I bound a data event handler on the child's stdout, so I expected everything to be echoed back in the pattern 'child echoed with: data'.
As you can see, the third line of the output is not in that pattern. Why?
I assumed one write triggers one data event (is that true?).
So I tried to mark where each data event callback ends.
I changed the way I bind data event in parent.js with:
child.stdout.on('data', function(data) {
process.stdout.write('child echoed with: ' + data);
process.stdout.write('callback ends\n');
});
And I get this output:
child echoed with: hello
callback ends
child echoed with: world
greatings, earthlings!
callback ends
It turns out that with three writes, only two data events were fired?
Why is this happening?
A stream is just a stream of bytes, so it is not safe to assume there will be any correlation between the number of write calls and the number of data events. For standard implementations and usage, chances are that they will be close to the same, but it depends on how quickly you are writing the data, and how much data you are writing. For instance, one call to write could trigger a data event for each byte written, or some number of write calls could be buffered before data is emitted.
To be clear, what is happening in your case is this:
process.stdout.write(data + '\n');
// Emits 'data' event with "hello\n"
process.stdout.write('world\n');
// Doesn't emit 'data' because it is buffered somewhere.
process.stdout.write('greatings, earthlings!\n');
// Emits 'data' event with "world\ngreatings, earthlings!\n"
So that second event has two lines. Which means when you run
process.stdout.write('child echoed with: ' + data)
you are printing exactly what you see:
child echoed with: world\ngreatings, earthlings!\n
which renders as
child echoed with: world
greatings, earthlings!
Related
I have a C program (I didn't write it) that prints some data to the terminal. I launch the program as a child process in Node with the spawn function.
const child_process = spawn('./myProgram', ['--arg']);
After that, I code the event to get the printed data:
child_process.stdout.on('data', function(data) {
console.log(data);
});
When I run the program I can't see the output data from my C program in my nodejs terminal. If I initialize the child process with stdio set to inherit, it works.
const child_process = spawn('./myProgram', ['--arg'], {stdio :'inherit'});
The key point here is that I need to process that data in my nodejs app. I suppose the way the C program prints the data is not the standard one, so my nodejs program does not get it.
The program was outputting to stderr instead of stdout. It was fixed by adding the event handler to stderr:
child_process.stderr.on('data', function(data) {
console.log(data);
});
@tadman got the answer.
Take the following code in nodejs-:
console.log("Hello world");
process.stdin.on('connect', function() {
});
This prints Hello World and then Node exits. But when I replace the connect event with 'data' event, the Node runtime does not exit.
Why is that? What is so special about the EventEmitter's data event? Does it open a socket connection? So in the on() method, is there code like the following:
function on(event, callback) {
if(event === 'data') {
//open socket
//do work
}
else {
//do non-socket work
}
}
Is there a clear answer to why adding a listener to the data event "magically" opens a socket?
The Node.js event loop has several phases of processing; in your case the relevant one is the poll phase, which processes, for example, incoming data (process.stdin.on('data', cb)). As long as a callback is registered for an event that can still occur, the event loop is not empty and Node will not exit.
process.stdin is a Readable Stream, which has the following events:
close
data
end
error
readable
So there is no connect event.
process.stdin.on('readable', function() {
console.log('Readable');
});
The code above will print Readable and exit, because after the event fires the stream is not in flowing mode, so the event loop is empty and Node exits. The data event, however, switches the stream into flowing mode;
while the stream is in flowing mode and readableState.reading is true, Node is prevented from exiting.
if you do
process.stdin.on('data', function(data) {
console.log(data.toString());
});
If you type anything in the console while this is running, it will work like echo.
https://github.com/nodejs/node/blob/master/lib/_stream_readable.js#L774-L795
You can read a full explanation of how the event loop works here: https://nodejs.org/en/docs/guides/event-loop-timers-and-nexttick/
If we assume index.js contains
process.stdin.on('data', console.log)
and you do node index.js, then the process waits for input on stdin.
If you send some data on stdin via e.g. echo woohoo | node index.js, then the buffer is written out and the process exits.
I was trying to write a node.js script that only takes input from stdin if it's piped (as opposed to waiting for keyboard input). Therefore I need to determine whether the stdin being piped in is null.
First I tried using the readable event:
var s = process.stdin;
s.on('readable', function () {
console.log('Event "readable" is fired!');
var chunk = s.read();
console.log(chunk);
if (chunk===null) s.pause();
});
And the result is as expected:
$ node test.js
Event "readable" is fired!
null
$
Then I tried to do the same thing using data event, because I like to use flowing mode:
var s = process.stdin;
s.on('data', function (chunk) {
console.log('Event "data" is fired!');
console.log(chunk);
if (chunk===null) s.pause();
});
but this time it waited for keyboard input before the null check and got stuck there. I was wondering why it does that? Does that mean that in order to do a null check, I need to pause the stream first, wait for readable to fire, do the null check, and then resume the stream, just to prevent node.js from waiting for keyboard input? This seems awkward to me. Is there a way to avoid using the readable event?
Use tty.isatty(fd) from the Node core library. It takes a file descriptor, and tty.isatty(0) will return false if stdin is a pipe.
I spawn a child process like this:
var child = require('child_process');
var proc = child.spawn('python', ['my_script.py', '-p', 'example']);
I also set some data handling:
proc.stdin.setEncoding('utf8');
proc.stdout.setEncoding('utf8');
proc.stderr.setEncoding('utf8');
proc.stdout.on('data', function (data) {
console.log('out: ' + data);
});
proc.stderr.on('data', function (data) {
console.log('err: ' + data);
});
proc.on('close', function (code) {
console.log('subprocess exited with status ' + code);
proc.stdin.end();
});
My Python script reads lines from stdin and for each line does some operations and prints to stdout. It works fine in the shell (I write a line and I get the output immediately) but when I do this in Node:
for (var i = 0; i < 10; i++) {
proc.stdin.write('THIS IS A TEST\n');
}
I get nothing.
I got to fix it calling proc.stdin.end() but that also terminates the child process (which I want to stay in background, streaming data).
I also triggered a flush filling the buffer with lots of writes, but that's not really an option.
Is there any way to manually flush the stream?
You are not flushing the output from Python after the print statement. I had a similar problem and @Alfe answered my question. Take a look at this:
Stream child process output in flowing mode
I'm using node-tail to read a file in linux and send it down to a socket.
node.js sending data read from a text file
var io = require('socket.io');
Tail = require('tail').Tail;
tail = new Tail("/tmp/test.txt");
io.sockets.on('connection', function (socket) {
tail.on("line", function(data) {
socket.emit('Message', { test: data });
});
});
Receiving side
var socket = io.connect();
socket.on('Message', function (data) {
console.log(data.test);
});
This works but when I try to modify this part
tail = new Tail("/tmp/test.txt");
to this
tail = new Tail("/tmp/FIFOFILE");
I can't get any data from it.
Is there anyway to read a named pipe in linux? or a package that can read a named pipe?
I can get it to work in a silly way:
// app.js
process.stdin.resume();
process.stdin.on('data', function(chunk) {
console.log('D', chunk);
});
And start like this:
node app.js < /tmp/FIFOFILE
If I create a readable stream for the named pipe, it ends after having read the first piece of data written to the named pipe. Not sure why stdin is special.
The OS will send an EOF when the last process finishes writing to the FIFO. If only one process is writing to the FIFO then you get an EOF when that process finishes writing its stuff. This EOF triggers Node to close the stream.
The trick to avoiding this is given by @JoshuaWalsh in this answer, namely: you open the pipe yourself FOR READING AND WRITING, even though you have no intention of ever writing to it. The OS then sees that there is always at least one process writing to the FIFO, so you never get the EOF.
So... just add in something like:
let fifoHandle;
fs.open(fifoPath, fs.constants.O_RDWR, function (err, fd) {
  if (err) throw err;
  fifoHandle = fd;
  console.log('FIFO open');
});
You don't ever have to do anything with fifoHandle - just make sure the descriptor stays open, so the OS always sees a writer on the FIFO.
In fact, in my case I was using createReadStream, and I found that simply adding fs.constants.O_RDWR to this was enough (even though I have no intention of ever writing to the fifo):
let fifo = fs.createReadStream(fifoPath, { flags: fs.constants.O_RDWR });
fifo.on('data', function (data) {
  console.log('Got data: ' + data.toString());
});