Node - piping to process.stdout doesn't drain automatically

Consider this Readable stream:
import * as stream from 'stream';

class ReadableStream extends stream.Readable {
  i = 0;

  constructor() {
    super({ objectMode: true, highWaterMark: 128 });
  }

  _read(size: number) {
    // Keep pushing objects until the internal buffer reports backpressure.
    while (this.push({ key: this.i++ })) {}
  }
}
Piping to process.stdout doesn't drain it automatically. Nothing happens, and the program exits.
new ReadableStream().pipe(process.stdout);
Now, let's pipe it to this Writable stream instead:
class WritableStream extends stream.Writable {
  constructor() {
    super({ objectMode: true, highWaterMark: 128 });
  }

  _write(chunk: any, encoding: string, callback: (error?: Error | null) => void) {
    console.log(chunk);
    callback();
  }
}
new ReadableStream().pipe(new WritableStream());
The console is instantly filled with the pushed objects, and it keeps going indefinitely.
Why don't process.stdout or streams created with fs.createWriteStream automatically request data?

process.stdout is not an object mode stream and does not work properly when you pipe an object mode stream to it. If you change your ReadableStream to not be an object mode stream, then the .pipe() will work properly.
In fact, if you attach an event handler for the error event such as:
new ReadableStream().pipe(process.stdout).on('error', err => {
  console.log(err);
});
Then, you will get this:
TypeError [ERR_INVALID_ARG_TYPE]: The "chunk" argument must be one of type string or Buffer. Received type object
at validChunk (_stream_writable.js:268:10)
at WriteStream.Writable.write (_stream_writable.js:303:21)
at ReadableStream.ondata (_stream_readable.js:727:22)
at ReadableStream.emit (events.js:210:5)
at ReadableStream.Readable.read (_stream_readable.js:525:10)
at flow (_stream_readable.js:1000:34)
at resume_ (_stream_readable.js:981:3)
at processTicksAndRejections (internal/process/task_queues.js:80:21) {
code: 'ERR_INVALID_ARG_TYPE'
}
Which shows that stdout is not expecting to get an object.
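For illustration, here is a minimal sketch (not from the original answer) of the same readable rewritten without object mode, pushing strings instead of objects; piping it to process.stdout then works, because stdout receives string/Buffer chunks. The class name and the five-line cutoff are just for the example:
const { Readable } = require('stream');

class LineStream extends Readable {
  constructor() {
    super(); // default stream: not object mode, so chunks must be strings or Buffers
    this.i = 0;
  }

  _read(size) {
    if (this.i >= 5) {
      this.push(null); // end the stream so the example terminates
      return;
    }
    while (this.i < 5 && this.push(`key: ${this.i++}\n`)) {}
  }
}

new LineStream().pipe(process.stdout);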

Related

Create an ongoing stream from buffer and append to the stream

I am receiving a base64-encoded string on my Node.js server in chunks, and I want to convert it to a stream that can be read by another process, but I can't figure out how to do it. Currently, I have this code.
const stream = Readable.from(Buffer.from(data, 'base64'));
But this creates a new stream instance each time, whereas what I would like is to keep appending to one open stream until no more data is received from my front end. How do I create an appending stream that I can add to and that can be read by another process?
--- Additional information ---
The client connects to the Node.js server via WebSocket. I read the "data" from the payload when a WebSocket message is received.
socket.on('message', async function (res) {
  try {
    let payload = JSON.parse(res);
    let payloadType = payload['type'];
    let data = payload['data'];
    // ... (rest of the handler pushes `data` into the stream)
  } catch (err) {
    console.error(err);
  }
});
--- Edit ---
I am getting this error message after pushing to the stream.
Error [ERR_METHOD_NOT_IMPLEMENTED]: The _read() method is not implemented
at Readable._read (internal/streams/readable.js:642:9)
at Readable.read (internal/streams/readable.js:481:10)
at maybeReadMore_ (internal/streams/readable.js:629:12)
at processTicksAndRejections (internal/process/task_queues.js:82:21) {
code: 'ERR_METHOD_NOT_IMPLEMENTED'
}
This is the code where I am reading it from, and connected to the stream:
const getAudioStream = async function* () {
  for await (const chunk of micStream) {
    if (chunk.length <= SAMPLE_RATE) {
      yield {
        AudioEvent: {
          AudioChunk: encodePCMChunk(chunk),
        },
      };
    }
  }
};
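There is no accepted answer here, but one common approach is to keep a single long-lived PassThrough stream and write each decoded chunk into it as WebSocket messages arrive; a PassThrough already implements _read() for you, which also avoids the ERR_METHOD_NOT_IMPLEMENTED error above. A minimal sketch, assuming a ws-style socket and the payload shape from the question (the 'audio' type check is just illustrative):
const { PassThrough } = require('stream');

// One long-lived stream that the consumer (e.g. the audio pipeline above) can read from.
const audioStream = new PassThrough();

socket.on('message', function (res) {
  const payload = JSON.parse(res);
  if (payload['type'] === 'audio') { // illustrative type check
    // Append the decoded chunk to the open stream.
    audioStream.write(Buffer.from(payload['data'], 'base64'));
  }
});

// End the stream when no more data will arrive.
socket.on('close', () => audioStream.end());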

Throwing an error from a Node.js Transform stream

I need to throw an error in a Transform stream.
Normally, I'd do this with the callback function on _transform(). I can't in my situation because I need to throw the error even if no data is currently flowing through my stream. That is, if no data is flowing, _transform() isn't getting called, and there's no callback I can call.
Currently, I'm emitting an error. Something like this:
import { Transform } from 'stream';

export default class MyTransformStream extends Transform {
  constructor(opts) {
    super(opts);
    setTimeout(() => {
      this.emit('error', new Error('Some error!'));
    }, 10_000);
  }

  _transform(chunk, encoding, callback) {
    this.push(chunk);
    callback();
  }
}
This seems to work fine. However, the documentation has a nice warning about it:
Avoid overriding public methods such as write(), end(), cork(), uncork(), read() and destroy(), or emitting internal events such as 'error', 'data', 'end', 'finish' and 'close' through .emit(). Doing so can break current and future stream invariants leading to behavior and/or compatibility issues with other streams, stream utilities, and user expectations.
Unfortunately, the documentation doesn't seem to suggest what to do instead.
What's the right way to throw this error outside of a _transform() call?
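One supported alternative is destroy(err): it puts the stream into an errored state through the public API instead of emitting 'error' directly, though it also tears the stream down, so it fits cases where the error is fatal. A minimal sketch of the same class using it:
import { Transform } from 'stream';

export default class MyTransformStream extends Transform {
  constructor(opts) {
    super(opts);
    setTimeout(() => {
      // destroy(err) errors the stream without calling emit('error') by hand.
      this.destroy(new Error('Some error!'));
    }, 10_000);
  }

  _transform(chunk, encoding, callback) {
    this.push(chunk);
    callback();
  }
}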

Can a Transform stream in Node.js read strings, but write in object mode?

I am trying to make a Transform stream that writes in object mode but keeps reading strings. Is it possible? The documentation says that for a Duplex stream I can set readableObjectMode and writableObjectMode separately, but somehow it is not working for me.
When I pass my object to the callback in _flush, I get the error: Invalid non-string/buffer chunk
Am I doing something wrong, or does it not work in Transform streams?
Here is my code:
class stream extends Transform {
  private logs: { name: string, errors: any[] };

  constructor() {
    super({ writableObjectMode: true });
    this.logs = { name: this.tableName, errors: [] };
  }

  _transform(chunk, encoding, callback) {
    // stuff here
    callback();
  }

  _flush(callback) {
    // here I get the error
    callback(undefined, this.logs);
  }
}
I found the answer: I need to set { readableObjectMode: true } instead, because it is the readable side of my Transform stream that should be in object mode, not the writable side.
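For reference, a minimal sketch of the corrected constructor (the class name and the placeholder table name are illustrative, not the original code):
const { Transform } = require('stream');

class LogTransform extends Transform {
  constructor() {
    // The readable side (what _transform/_flush push out) carries objects,
    // while the writable side keeps accepting strings/Buffers.
    super({ readableObjectMode: true });
    this.logs = { name: 'my_table', errors: [] }; // placeholder for this.tableName
  }

  _transform(chunk, encoding, callback) {
    // process the incoming string chunk here
    callback();
  }

  _flush(callback) {
    callback(undefined, this.logs); // now pushes the object without the chunk error
  }
}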

How do I write into a writable stream conditionally only if it is open?

I have this function in my module which writes to a child process's stdin stream. But sometimes I get this error:
events.js:160
throw er; // Unhandled 'error' event
^
Error: write EPIPE
at exports._errnoException (util.js:1020:11)
at WriteWrap.afterWrite (net.js:800:14)
I think it's happening because sometimes the writable stdin stream is closed before I write to it. Basically, I want to check whether it's closed or not: if it's open I'll write to it, otherwise I won't.
Relevant Code
/**
 * Write the stdin into the child process
 * @param proc Child process reference
 * @param stdin stdin string
 */
export function writeToStdin(proc: ChildProcess, stdin: string) {
  if (stdin) {
    proc.stdin.write(stdin + '\r\n');
    proc.stdin.end();
  }
}
Is there an API to check it as I couldn't find any?
You can try the finished API from the stream module:
const fs = require('fs');
const { finished } = require('stream');

const rs = fs.createReadStream('archive.tar');

finished(rs, (err) => {
  if (err) {
    console.error('Stream failed', err);
  } else {
    console.log('Stream is done reading');
  }
});

rs.resume(); // drain the stream
The Node.js documentation describes the writable property on Writable stream objects, which seems to be what you're looking for: https://nodejs.org/api/stream.html#writablewritable
Is true if it is safe to call writable.write(), which means the stream has not been destroyed, errored or ended.
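Applied to the original function, a minimal sketch (assuming it is acceptable to simply skip the write when the pipe has already closed; the spawn('cat') usage is just a hypothetical example):
const { spawn } = require('child_process');

function writeToStdin(proc, stdin) {
  // writable is false once the stream has been destroyed, errored or ended.
  if (stdin && proc.stdin && proc.stdin.writable) {
    proc.stdin.write(stdin + '\r\n');
    proc.stdin.end();
  }
}

// Hypothetical usage:
const proc = spawn('cat');
writeToStdin(proc, 'hello');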

NodeJS Stream splitting

I have an infinite data stream from a forked process. I want this stream to be processed by a module and sometimes I want to duplicate the data from this stream to be processed by a different module (e.g. monitoring a data stream but if anything interesting happens I want to log the next n bytes to file for further investigation).
So let's suppose the following scenario:
I start the program and start consuming the readable stream.
Two seconds later, I want to process the same data for one second with a different stream reader.
Once the time is up, I want to close the second consumer, but the original consumer must stay untouched.
Here is a code snippet for this:
var stream = process.stdout;
stream.pipe(detector); // Using the first consumer

function startAnotherConsumer() {
  stream2 = new PassThrough();
  stream.pipe(stream2);
  // use stream2 somewhere else
}

function stopAnotherConsumer() {
  stream.unpipe(stream2);
}
My problem here is that unpiping stream2 doesn't close it. If I call stream2.end() after the unpipe command, it crashes with this error:
events.js:160
throw er; // Unhandled 'error' event
^
Error: write after end
at writeAfterEnd (_stream_writable.js:192:12)
at PassThrough.Writable.write (_stream_writable.js:243:5)
at Socket.ondata (_stream_readable.js:555:20)
at emitOne (events.js:101:20)
at Socket.emit (events.js:188:7)
at readableAddChunk (_stream_readable.js:176:18)
at Socket.Readable.push (_stream_readable.js:134:10)
at Pipe.onread (net.js:548:20)
I even tried pausing the source stream to let the second stream's buffer flush, but it didn't work either:
function stopAnotherConsumer() {
  stream.pause();
  stream2.once('unpipe', function () {
    stream.resume();
    stream2.end();
  });
  stream.unpipe(stream2);
}
Same error as before here (write after end).
How can I solve this problem? My original intent is to duplicate the streamed data from a given point in time, then close the second stream after a while.
Note: I tried to use this answer to make it work.
As there were no answers, I'm posting my (patchwork) solution. If anyone has a better one, don't hold it back.
A new Stream:
const Writable = require('stream').Writable;
const Transform = require('stream').Transform;

class DuplicatorStream extends Transform {
  constructor(options) {
    super(options);
    this.otherStream = null;
  }

  attachStream(stream) {
    // The negation must wrap the whole instanceof check.
    if (!(stream instanceof Writable)) {
      throw new Error('DuplicatorStream argument is not a writable stream!');
    }
    if (this.otherStream) {
      throw new Error('A stream is already attached!');
    }
    this.otherStream = stream;
    this.emit('attach', stream);
  }

  detachStream() {
    if (!this.otherStream) {
      throw new Error('No stream to detach!');
    }
    let stream = this.otherStream;
    this.otherStream = null;
    this.emit('detach', stream);
  }

  _transform(chunk, encoding, callback) {
    // Duplicate the chunk to the attached stream (if any), then pass it along unchanged.
    if (this.otherStream) {
      this.otherStream.write(chunk);
    }
    callback(null, chunk);
  }
}

module.exports = DuplicatorStream;
And the usage:
const { PassThrough } = require('stream');

var stream = process.stdout;
var stream2;

var duplicatorStream = new DuplicatorStream();
stream.pipe(duplicatorStream);   // Inserting my duplicator stream in the chain
duplicatorStream.pipe(detector); // Using the first consumer

function startAnotherConsumer() {
  stream2 = new PassThrough();
  duplicatorStream.attachStream(stream2);
  // use stream2 somewhere else
}

function stopAnotherConsumer() {
  duplicatorStream.once('detach', function () {
    stream2.end();
  });
  duplicatorStream.detachStream();
}
