nodejs stdin readable event not triggered - node.js

The 'readable' event is not triggered on process.stdin.
test.js
var self = process.stdin, data;
self.on('readable', function() {
  var chunk = this.read();
  if (chunk === null) {
    handleArguments();
  } else {
    data += chunk;
  }
});
self.on('end', function() {
  console.log("end event", data);
});
Then when I run node test.js and start typing in the console, the readable event is not triggered at all.
Please tell me how to attach a readable listener to the process.stdin stream.

If you are trying to capture what you type into the console, try these steps:
process.stdin.resume();
process.stdin starts in a paused state. You need to resume it to bring it into a ready state.
Listen for the 'data' event to capture the data typed into the console.
process.stdin.on('data', function(data) {
  console.log(data.toString());
});
I am not sure if this solves your actual problem; hope it at least gives you some insight.
Additional Info:
The readable event was introduced in Node.js v0.9.4, so check that the Node version you are using (node --version) is at least 0.9.4.
Note from node api docs:
The 'data' event emits either a Buffer (by default) or a string if setEncoding() was used.
Note that adding a 'data' event listener will switch the Readable stream into "old mode", where data is emitted as soon as it is available, rather than waiting for you to call read() to consume it.
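If you do want to stick with the 'readable' event, here is a minimal sketch (my own addition, assuming Node >= 0.9.4) of the usual pattern: call read() in a loop inside the handler, since it returns null once the internal buffer is drained, and end your input with Ctrl+D to fire 'end'.
var data = '';
process.stdin.setEncoding('utf8');
process.stdin.on('readable', function() {
  var chunk;
  // read() returns null when the internal buffer is empty
  while ((chunk = process.stdin.read()) !== null) {
    data += chunk;
  }
});
process.stdin.on('end', function() {
  console.log('end event', data);
});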

Related

Why does the data event only stop the NodeJS run time from exiting?

Take the following code in Node.js:
console.log("Hello world");
process.stdin.on('connect', function() {
});
This prints Hello world and then Node exits. But when I replace the connect event with the 'data' event, the Node runtime does not exit.
Why is that? What is so special about the EventEmitter's data event? Does it open a socket connection? So in the on() method, is there code like the following:
function on(event, callback) {
  if (event === 'data') {
    // open socket
    // do work
  } else {
    // do non-socket work
  }
}
Is there a clear answer to why adding a listener to the data event "magically" opens a socket?
The Node.js event loop has a couple of processing phases; in your case the relevant one is the poll phase, which processes, for example, incoming data (process.stdin.on('data', cb)). As long as there is a callback that can handle such an event and the event can still occur, the event loop is not empty and Node will not exit.
process.stdin is a Readable Stream, which has the following events:
close
data
end
error
readable
So there is nothing like a connect event.
process.stdin.on('readable', function() {
  console.log('Readable');
});
The code above will print Readable and exit: after the event fires, the stream is not in the flowing state, so the event loop is empty and Node exits. The 'data' event, however, sets the stream state to flowing; if the stream state is flowing and readableState.reading is true, that prevents Node from exiting.
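As a quick illustration of that (a sketch of my own, not from the original answer): merely switching stdin into flowing mode keeps the event loop non-empty, even without a 'data' listener.
console.log("Hello world");
process.stdin.resume(); // switch stdin to flowing mode; the process no longer exits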
If you do
process.stdin.on('data', function(data) {
  console.log(data.toString());
});
and write anything in the console while this is running, it will work like an echo.
https://github.com/nodejs/node/blob/master/lib/_stream_readable.js#L774-L795
You can read a full explanation of how the event loop works here: https://nodejs.org/en/docs/guides/event-loop-timers-and-nexttick/
If we assume index.js contains
process.stdin.on('data', console.log)
and you do node index.js, then the process waits for input on stdin.
If you send some data on stdin via e.g. echo woohoo | node index.js then the buffer will be written and the process exits.

NodeJS streams and premature end

Assuming a Readable Stream in NodeJS and a Data (on('data', ...)) event handler tied to it that is relatively slow, is it possible for the End event to fire before the last Data handler(s) has finished, and if so, will it prematurely terminate that handler? Or, will all Data events get dispatched and run?
In my case, I am working with large files and want to commit to a DB every data chunk. I am worried that I may lose the last record or two (or more) if End is fired before the last DB calls in the handler actually complete.
The 'end' event fires after the last 'data' event, but it may happen before the last 'data' handler has finished. It is also possible that before one 'data' handler has finished, the next one starts. It depends on what you have in your code, but a later 'data' callback may finish before an earlier one, which can cause errors and problems in your code.
Example of how to cause problems (for your own tests):
var fs = require('fs');
var rr = fs.createReadStream('somebigfile.jpg');
var i = 0;
rr.on('data', function(chunk) {
  i++;
  var s = i;
  console.log('readable:' + s);
  setTimeout(function() {
    console.log('timeout:' + s);
  }, 50 - i * 10);
});
rr.on('end', function() {
  console.log('end');
});
It will print to your console when each 'data' event handler starts, and some milliseconds later when it finishes. The finishes may come in a different order.
Solution:
Readable Streams have two modes: 'flowing mode' and 'paused mode'. When you add a 'data' event handler, you automatically switch the stream to flowing mode.
From documentation :
When in flowing mode, data is read from the underlying system and
provided to your program as fast as possible
In this mode, events will not wait for your slow actions to finish. What you need is 'paused mode'.
From documentation:
In paused mode, you must explicitly call stream.read() to get chunks
of data out. Streams start out in paused mode.
In other words: you request a chunk of data, you get it, you work with it, and when you are ready you ask for a new chunk. In this mode you control when you get your data.
How to change to 'paused mode':
It is the default mode for this stream, but when you register a 'data' event handler it switches to 'flowing mode'. Therefore, do not use readstream.on('data', ...).
Instead, use readstream.on('readable', function(){...}); when it fires, the stream is ready to give you a chunk of data. To get the chunk, use var chunk = readstream.read();
Example from docs:
var fs = require('fs');
var rr = fs.createReadStream('foo.txt');
rr.on('readable', function() {
  console.log('readable:', rr.read());
});
rr.on('end', function() {
  console.log('end');
});
Please read the documentation for more details, because there are more cases in which a stream is automatically switched to 'flowing mode'.
Work with slow handlers and flowing mode:
If you want or need to work in 'flowing mode', there is also a solution: you can pause and resume the stream. When you get a chunk from the 'data' event, pause the stream, and when you have finished your work, resume it.
Example from documentation:
var readable = getReadableStreamSomehow();
readable.on('data', function(chunk) {
  console.log('got %d bytes of data', chunk.length);
  readable.pause();
  console.log('there will be no more data for 1 second');
  setTimeout(function() {
    console.log('now data will start flowing again');
    readable.resume();
  }, 1000);
});
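Applied to the original question (committing each chunk to a DB), a minimal sketch of that pause/resume pattern could look like the following, where saveChunkToDb is a hypothetical async function standing in for your actual DB call:
var fs = require('fs');
var rr = fs.createReadStream('somebigfile.jpg');
rr.on('data', function(chunk) {
  rr.pause(); // stop the flow while the slow DB write runs
  saveChunkToDb(chunk, function(err) { // hypothetical async DB call
    if (err) throw err;
    rr.resume(); // ask for the next chunk only once this one is stored
  });
});
rr.on('end', function() {
  console.log('all chunks handed off to the DB');
});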

How do I prevent node.js from waiting for keyboard input?

I was trying to write a node.js script that only takes input from stdin if it's piped (as opposed to waiting for input from the keyboard). Therefore I need to determine whether the stdin piped in is null.
First I tried using the readable event:
var s = process.stdin;
s.on('readable', function () {
  console.log('Event "readable" is fired!');
  var chunk = s.read();
  console.log(chunk);
  if (chunk === null) s.pause();
});
And the result is as expected:
$ node test.js
Event "readable" is fired!
null
$
Then I tried to do the same thing using the data event, because I like to use flowing mode:
var s = process.stdin;
s.on('data', function (chunk) {
  console.log('Event "data" is fired!');
  console.log(chunk);
  if (chunk === null) s.pause();
});
but this time it waited for keyboard input before the null check, and got stuck there. I was wondering why it does that. Does that mean that, in order to do a null check, I need to pause the stream first, wait for readable to fire, do the null check, and then resume the stream, just to keep Node.js from waiting for keyboard input? That seems awkward to me. Is there a way to avoid using the readable event?
Use tty.isatty() from the node core library. That function will return false if stdin is a pipe.
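A minimal sketch of that check (my own addition; tty.isatty() takes a file descriptor, 0 is stdin, and process.stdin.isTTY gives the same answer):
var tty = require('tty');
if (tty.isatty(0)) {
  // stdin is a terminal: nothing was piped in, so don't wait for keyboard input
  console.log('no piped input');
} else {
  // stdin is a pipe or a redirected file: safe to consume it
  process.stdin.on('data', function (chunk) {
    process.stdout.write(chunk);
  });
}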

NodeJS sockets initialized as unpaused?

A net.Socket object in NodeJS is a Readable Stream, however one note in the docs got me concerned:
For the Net.Socket 'data' event, the docs say
Note that the data will be lost if there is no listener when a Socket emits a 'data' event.
That seems to imply a Socket is returned to the calling script in "flowing-mode" and already un-paused? However, for a generic Readable Stream, the documentation for the 'data' event says
If you attach a data event listener, then it will switch the stream into flowing mode, and data will be passed to your handler as soon as it is available.
That "If" seems to imply if you wait a bit to bind to the 'data' event, the stream will wait for you, and if you intentionally want to miss the 'data' events, the example in the resume() method seems to indicate you must call the resume() method to start the flow of data.
My concern is that when working with a net.Server, when you receive a net.Socket as part of a 'connection' event, is it imperative that you start handling the 'data' events right away since it's already opened? Meaning if I do:
var s = new net.Server();
s.on('connection', function(socket) {
  // Do some lengthy setup process here, blocking execution for a few seconds...
  socket.on('data', function(d) { console.log(d); });
});
s.listen(8080);
Meaning, if I don't bind to the 'data' event right away, could I lose data? And is this a more robust way to handle incoming connections if there is a lengthy setup required for each one?
var s = new net.Server();
s.on('connection', function(socket) {
  socket.pause(); // Not ready for you yet!
  // Do some lengthy setup process here, blocking execution for a few seconds...
  socket.on('data', function(d) { console.log(d); });
  socket.resume(); // Okay, go!
});
s.listen(8080);
Anyone have experience working with listening on raw socket streams to know if this data loss is an issue?
I'm hoping this is an instance where the Net.Socket documentation wasn't updated since v0.10, since the stream documentation has a section that mentions 'data' events started emitting right away in versions prior to 0.10. Were TCP sockets properly updated to not start emitting 'data' packets right away, and the documentation not updated appropriately?
Yes, this is a flaw in the docs. Here is an example:
var net = require('net')
var server = net.createServer(onConnection)
function onConnection (socket) {
  console.log('onConnection')
  setTimeout(startReading, 1000)
  function startReading () {
    socket.on('data', read)
    socket.on('end', stopReading)
  }
  function stopReading () {
    socket.removeListener('data', read)
    socket.removeListener('end', stopReading)
  }
}
function read (data) {
  console.log('Received: ' + data.toString('utf8'))
}
server.listen(1234, onListening)
function onListening () {
  console.log('onListening')
  net.connect(1234, onConnect)
}
function onConnect () {
  console.log('onConnect')
  this.write('1')
  this.write('2')
  this.write('3')
  this.write('4')
  this.write('5')
  this.write('6')
}
All the data is received. If you explicitly resume() the socket (without a 'data' listener attached), you will lose it.
Also, if you do your "lengthy" setup in a blocking manner (which you shouldn't), you can't lose any I/O, as it has no chance to be processed, so no events will be emitted.
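If the per-connection setup is asynchronous rather than blocking, a cautious pattern (a sketch of my own, assuming a hypothetical async doLengthySetup helper) is to pause() the socket, do the setup, then attach the listener and resume(); data that arrives in the meantime stays in the stream's internal buffer:
var net = require('net');
var server = net.createServer(function (socket) {
  socket.pause(); // buffer incoming data while setup runs
  doLengthySetup(function (err) { // hypothetical async setup
    if (err) return socket.destroy();
    socket.on('data', function (d) { console.log(d); });
    socket.resume(); // deliver anything buffered, then keep flowing
  });
});
server.listen(8080);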

How to trigger event stdin.on("data", [callback]) in the code?

Consider the following example
process.stdin.resume();
process.stdin.on("data", function(data) {
  console.log("received " + data);
});
process.stdin.write("foo\n");
process.stdin.write("bar\n");
When I type something in the terminal, I get
received something
Why doesn't it work the same way for foo and bar, which I sent earlier using stdin.write? How can I trigger this event (stdin.on("data")) from code? I expected process.stdin.write to do this, but I'm just getting the same output back.
It's a Readable Stream that gets its input from the stdin file descriptor. I don't think you can write into that descriptor (but you could connect it to another writable descriptor).
However, the easiest solution in your case is to simply simulate the 'data' events. Every stream is an EventEmitter, so the following will work:
process.stdin.resume();
process.stdin.on("data", function(data) {
  console.log("received " + data);
});
process.stdin.emit('data', 'abc');
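One caveat (my addition, not part of the original answer): a real 'data' event from stdin carries a Buffer unless setEncoding() was used, so if downstream code expects that, emit a Buffer instead of a string:
// Emit a Buffer to mimic what stdin would actually deliver
// (use new Buffer('abc\n') on very old Node versions)
process.stdin.emit('data', Buffer.from('abc\n'));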
