Consider the following example
process.stdin.resume();
process.stdin.on("data", function(data) {
console.log("received " + data)
})
process.stdin.write("foo\n")
process.stdin.write("bar\n")
When I type something in the terminal, I get
received something
Why doesn't it work the same way for foo and bar, which I sent earlier using stdin.write?
E.g. how can I trigger this event (stdin.on("data")) from code? I expected process.stdin.write to do this, but I'm just getting the same output back.
It's a Readable Stream that gets its input from the stdin file descriptor. I don't think you can write to that descriptor (but you could connect it to another writable descriptor).
However, the easiest solution in your case is to just simulate the 'data' events. Every stream is an EventEmitter, so the following will work:
process.stdin.resume();
process.stdin.on("data", function(data) {
console.log("received " + data)
});
process.stdin.emit('data', 'abc');
Related
I'm just starting to learn about streams in node. I have a string in memory and I want to put it in a stream that applies a transformation and pipe it through to process.stdout. Here is my attempt to do it:
var through = require('through');
var stream = through(function write(data) {
this.push(data.toUpperCase());
});
stream.push('asdf');
stream.pipe(process.stdout);
stream.end();
It does not work. When I run the script via node on the CLI, nothing is sent to stdout and no errors are thrown. A few questions I have:
If you have a value in memory that you want to put into a stream, what is the best way to do it?
What is the difference between push and queue?
Does it matter if I call end() before or after calling pipe()?
Is end() equivalent to push(null)?
Thanks!
Just use the vanilla stream API
var Transform = require("stream").Transform;
// create a new Transform stream
var stream = new Transform({
decodeStrings: false,
encoding: "ascii"
});
// implement the _transform method
stream._transform = function _transform(str, enc, done) {
this.push(str.toUpperCase() + "\n");
done();
};
// connect to stdout
stream.pipe(process.stdout);
// write some stuff to the stream
stream.write("hello!");
stream.write("world!");
// output
// HELLO!
// WORLD!
Or you can build your own stream constructor. This is really the way the stream API is intended to be used:
var Transform = require("stream").Transform;
function MyStream() {
// call Transform constructor with `this` context
// {decodeStrings: false} keeps data as `string` type instead of `Buffer`
// {encoding: "ascii"} sets the encoding for our strings
Transform.call(this, {decodeStrings: false, encoding: "ascii"});
// our function to do "work"
function _transform(str, encoding, done) {
this.push(str.toUpperCase() + "\n");
done();
}
// export our function
this._transform = _transform;
}
// extend the Transform.prototype to your constructor
MyStream.prototype = Object.create(Transform.prototype, {
constructor: {
value: MyStream
}
});
Now use it like this
// instantiate
var a = new MyStream();
// pipe to a destination
a.pipe(process.stdout);
// write data
a.write("hello!");
a.write("world!");
Output
HELLO!
WORLD!
Some other notes about .push vs .write.
.write(str) adds data to the writable buffer. It is meant to be called externally. If you think of a stream like a duplex file handle, it's just like fwrite, only buffered.
.push(str) adds data to the readable buffer. It is only intended to be called from within our stream.
.push(str) can be called many times. Watch what happens if we change our function to
function _transform(str, encoding, done) {
this.push(str.toUpperCase());
this.push(str.toUpperCase());
this.push(str.toUpperCase() + "\n");
done();
}
Output
HELLO!HELLO!HELLO!
WORLD!WORLD!WORLD!
First, you want to use write(), not push(). write() puts data in to the stream, push() pushes data out of the stream; you only use push() when implementing your own Readable, Duplex, or Transform streams.
Second, you'll only want to write() data to the stream after you've set up the pipe() (or added some event listeners). If you write to a stream with nothing wired to the other end, the data you've written will be lost. As #naomik pointed out, this isn't true in general since a Writable stream will buffer write()s. In your example you do need to write() after pipe() though. Otherwise, the process will end before writing anything to STDOUT. This is possibly due to how the through module is implemented, but I don't know that for sure.
So, with that in mind, you can make a couple simple changes to your example to get it to work:
var through = require('through');
var stream = through(function write(data) {
this.push(data.toUpperCase());
});
stream.pipe(process.stdout);
stream.write('asdf');
stream.end();
Now, for your questions:
The easiest way to get data from memory into a writable stream is to simply write() it, just like we're doing with stream.write('asdf') in your example.
As far as I know, the stream doesn't have a queue() function; did you mean write()? Like I said above, write() is used to put data into a stream, and push() is used to push data out of the stream. Only call push() in your own stream implementations.
Only call end() after all your data has been written to your stream. end() basically says: "Ok, I'm done now. Please finish what you're doing and close the stream."
push(null) is pretty much equivalent to end(). That being said, don't call push(null) unless you're doing it inside your own stream implementation (as stated above). It's almost always more appropriate to call end().
Based on the examples for stream (http://nodejs.org/api/stream.html#stream_readable_pipe_destination_options)
and through (https://www.npmjs.org/package/through)
it doesn't look like you are using your stream correctly... What happens if you use write(...) instead of push(...)?
I was trying to write a node.js script that only takes input from stdin if it's piped (as opposed to waiting for keyboard input). Therefore I need to determine whether the stdin being piped in is null.
First I tried using the readable event:
var s = process.stdin;
s.on('readable', function () {
console.log('Event "readable" is fired!');
var chunk = s.read();
console.log(chunk);
if (chunk===null) s.pause();
});
And the result is as expected:
$ node test.js
Event "readable" is fired!
null
$
Then I tried to do the same thing using data event, because I like to use flowing mode:
var s = process.stdin;
s.on('data', function (chunk) {
console.log('Event "data" is fired!');
console.log(chunk);
if (chunk===null) s.pause();
});
but this time it waited for keyboard input before the null check, and got stuck there. I was wondering why it does that. Does that mean that in order to do a null check, I need to pause the stream first, wait for readable to fire, do the null check, and then resume the stream, just to prevent node.js from waiting for keyboard input? This seems awkward to me. Is there a way to avoid using the readable event?
Use tty.isatty() from the node core library, passing it stdin's file descriptor (0). That function will return false if stdin is a pipe.
I'll illustrate the problem with the following code.
child.js :
process.stdin.resume();
process.stdin.on('data', function(data) {
process.stdout.write(data + '\n');
process.stdout.write('world\n');
process.stdout.write('greatings, earthlings!\n');
});
parent.js :
var spawn = require('child_process').spawn;
var child = spawn('node', ['child.js']);
child.stdin.write('hello');
child.stdout.on('data', function(data) {
process.stdout.write('child echoed with: ' + data);
});
And in the Windows cmd, I run
node parent.js
it outputs:
child echoed with: hello
child echoed with: world
greatings, earthlings!
Here I bound a data event listener on the child's stdout, so I expect everything to be echoed back in the pattern 'child echoed with: data'.
As you can see, on the third line of the outputs, it's not in that pattern. So why?
I assume one write triggers one data event (is that true?).
So I tried to mark where each data event callback ends.
I changed the way I bind data event in parent.js with:
child.stdout.on('data', function(data) {
process.stdout.write('child echoed with: ' + data);
process.stdout.write('callback ends\n');
});
And I get this output:
child echoed with: hello
callback ends
child echoed with: world
greatings, earthlings!
callback ends
It turns out that with three writes, only two data events got fired?
So why is this happening?
A stream is just a stream of bytes, so it is not safe to assume there will be any correlation between the number of write calls and the number of data events. For standard implementations and usage, chances are that they will be close to the same, but it depends on how quickly you are writing the data, and how much data you are writing. For instance, one call to write could trigger a data event for each byte written, or some number of write calls could be buffered before data is emitted.
To be clear, what is happening in your case is this:
process.stdout.write(data + '\n');
// Emits 'data' event with "hello\n"
process.stdout.write('world\n');
// Doesn't emit 'data' because it is buffered somewhere.
process.stdout.write('greatings, earthlings!\n');
// Emits 'data' event with "world\ngreatings, earthlings!\n"
So that second event has two lines. Which means when you run
process.stdout.write('child echoed with: ' + data)
you are printing exactly what you see:
child echoed with: world\ngreatings, earthlings!\n
which renders as
child echoed with: world
greatings, earthlings!
The readable event is not triggered on process.stdin
test.js
var self = process.stdin, data;
self.on('readable', function() {
var chunk = this.read();
if (chunk === null) {
handleArguments();
} else {
data += chunk;
}
});
self.on('end', function() {
console.log("end event", data);
});
Then when I run node test.js and start typing in the console, the readable event is not triggered at all.
Please tell me how to attach readable listener to process.stdin stream.
If you are trying to capture what you type into the console, try these steps:
process.stdin.resume();
process.stdin starts in a paused state. You need to resume it to bring it to a ready state.
Then listen for the 'data' event to capture the data typed in the console:
process.stdin.on('data', function(data) {
console.log(data.toString());
});
I am not sure if this helps your actual problem. Hope it at least gives you some insight.
Additional Info:
The readable event was introduced in Node.js v0.9.4, so check that the Node version you are using is at least 0.9.4.
Note from node api docs:
The 'data' event emits either a Buffer (by default) or a string if setEncoding() was used.
Note that adding a 'data' event listener will switch the Readable stream into "old mode", where data is emitted as soon as it is available, rather than waiting for you to call read() to consume it.
I'm using node-tail to read a file in linux and send it down to a socket.
node.js sending data read from a text file
var io = require('socket.io');
Tail = require('tail').Tail;
tail = new Tail("/tmp/test.txt");
io.sockets.on('connection', function (socket) {
tail.on("line", function(data) {
socket.emit('Message', { test: data });
});
});
Receiving side
var socket = io.connect();
socket.on('Message', function (data) {
console.log(data.test);
});
This works but when I try to modify this part
tail = new Tail("/tmp/test.txt");
to this
tail = new Tail("/tmp/FIFOFILE");
I can't get any data from it.
Is there any way to read a named pipe in Linux, or a package that can read a named pipe?
I can get it to work in a silly way:
// app.js
process.stdin.resume();
process.stdin.on('data', function(chunk) {
console.log('D', chunk);
});
And start like this:
node app.js < /tmp/FIFOFILE
If I create a readable stream for the named pipe, it ends after having read the first piece of data written to the named pipe. Not sure why stdin is special.
The OS will send an EOF when the last process finishes writing to the FIFO. If only one process is writing to the FIFO then you get an EOF when that process finishes writing its stuff. This EOF triggers Node to close the stream.
The trick to avoiding this is given by #JoshuaWalsh in this answer, namely: you open the pipe yourself FOR READING AND WRITING - even though you have no intention of ever writing to it. This means that the OS sees that there is always at least one process writing to the file and so you never get the EOF.
So... just add in something like:
const fs = require('fs');
let fifoHandle = fs.openSync(fifoPath, fs.constants.O_RDWR); // FIFO open
You don't ever have to do anything with fifoHandle - just make sure it sticks around and doesn't get garbage collected.
In fact... in my case I was using createReadStream, and I found that simply adding fs.constants.O_RDWR to it was enough (even though I have no intention of ever writing to the fifo):
let fifo = fs.createReadStream(fifoPath, {flags: fs.constants.O_RDWR});
fifo.on('data', function (data) {
  console.log('Got data: ' + data.toString());
});