Is it possible to use a different buffer with Node Stream?

I have a class that extends Writable from Stream.
From the following link: https://nodejs.org/api/stream.html#stream_streams_under_the_hood
I understand that this inherently uses a buffer, but I want it to use a ring buffer instead. Is it easy to implement? Or am I on the wrong path here? I'm a Node noob.
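As far as I know, Node doesn't expose a way to swap out the Writable's internal queue, but you can get the effect you're after by implementing _write() so the bytes land in a ring buffer of your own, overwriting the oldest data once it wraps. A minimal sketch, assuming a fixed-size Buffer as the backing store (the class name is illustrative):

const { Writable } = require("stream");

class RingBufferWritable extends Writable {
  constructor(size) {
    super();
    this.ring = Buffer.alloc(size); // fixed-size backing store
    this.head = 0;                  // next position to write to
    this.filled = 0;                // bytes currently held (<= size)
  }
  _write(chunk, encoding, callback) {
    let src = chunk;
    // a chunk larger than the whole ring: only its tail can survive
    if (src.length > this.ring.length) {
      src = src.slice(src.length - this.ring.length);
    }
    const firstPart = Math.min(src.length, this.ring.length - this.head);
    src.copy(this.ring, this.head, 0, firstPart); // fill up to the end
    src.copy(this.ring, 0, firstPart);            // wrap to the start
    this.head = (this.head + src.length) % this.ring.length;
    this.filled = Math.min(this.filled + src.length, this.ring.length);
    callback();
  }
}

Piping a source into new RingBufferWritable(64 * 1024) would then always keep the most recent 64 KiB of whatever flowed through. Note the stream machinery still keeps its own small internal queue (tunable with highWaterMark); the ring buffer is just where the data ends up.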

Related

How does a buffer work in Node.js?

I'm new to Node.js and am trying to broadcast streaming video, but I have no idea how to do this. I want to know how buffering works in a Node.js application.
Buffers are instances of the Buffer class in Node, which is designed to handle raw binary data. Each buffer corresponds to some raw memory allocated outside V8. Buffers act somewhat like arrays of integers, but aren't resizable and have a whole bunch of methods specifically for binary data. In addition, the "integers" in a buffer each represent a byte and so are limited to values from 0 to 255 (2^8 - 1), inclusive.
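A quick illustration (the values are arbitrary):

const buf = Buffer.from("hé", "utf8"); // <Buffer 68 c3 a9> - 2 characters, 3 bytes
console.log(buf.length);     // 3 (bytes, not characters)
console.log(buf[1]);         // 195 (0xc3, the first byte of "é")
buf[0] = 72;                 // overwrite "h" (0x68) with "H" (72 = 0x48)
console.log(buf.toString()); // "Hé"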
Buffering in a streaming application works something like this:
Data is processed as a stream of chunks instead of all at once. The chunks are collected in a buffer, and once the buffer is full they are passed on from one point to another (to the client requesting the data).
It's something like streaming movies online: we don't have to wait for all of the data to arrive, but instead receive it in chunks and start using it even before it has all arrived.
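A minimal sketch of that idea for video, assuming a local movie.mp4 and ignoring range requests and error handling:

const fs = require("fs");
const http = require("http");

http.createServer((req, res) => {
  res.writeHead(200, { "Content-Type": "video/mp4" });
  // the file is sent chunk by chunk; the client can start playing
  // before the whole file has been read
  fs.createReadStream("movie.mp4").pipe(res);
}).listen(8080);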

How to read a specific chunk from a file in Node.js?

How can I read a buffer by selecting a start position and an end position in a file while streaming?
The read-chunk module solves my problem.
You could create a custom readable stream with the Readable interface. But maybe that defeats the whole purpose of a stream... Do you know upfront which positions you need to read, or are they random? Do you need scan patterns?
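For the record, if you do know the positions upfront, core fs can already do this: fs.createReadStream() accepts start and end byte offsets (both inclusive). A quick sketch with a made-up filename and offsets:

const fs = require("fs");

// stream only bytes 100 through 199 of the file (start and end are inclusive)
const stream = fs.createReadStream("data.bin", { start: 100, end: 199 });
stream.on("data", (chunk) => console.log("got", chunk.length, "bytes"));
stream.on("end", () => console.log("done"));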

Node.js read and write stream to the same file at the same time

TL;DR
I'm browsing through a number of solutions on npm and github looking for something that would allow me to read and write to the same file in two different places at the same time. So far I'm having trouble actually finding anything like this. Is there a module of some sort that will allow that?
Background
In essence my requirement is that in a large file I need to, in the following order:
read
transform
write
Ideally the usage would be something like:
const fd = fs.openSync(file, "r+");
const read = createReadStreamSomehowFrom(fd);
const write = createWriteStreamSomehowFrom(fd);
read
  .pipe(new Transform({ transform(chunk, encoding, callback) {...} }))
  .pipe(write);
I could do that with the standard fs.create[Read/Write]Stream, but there's no way to control the flow of both streams, and if my write position goes beyond the read position then I'm reading something I just wrote...
The use case is the same as perl -p -i -e, read and write to the same file (meaning the same inode) asynchronously and replace the contents without loading everything into memory.
I would expect this to be a real-world use case, yet all the implementations I found actually load the whole file into memory and then save it. Am I missing a known module here, or is there a need to actually write something like this?
Hmm... a tough one it seems. :)
So here it is, for the record - I found no such module and actually discussed this with some people responsible for a nice in-file replacing module. Seeing no way to solve this, I decided to write it from scratch, and here it is:
signicode/rw-stream repo on github
rw-stream at npm
The module works on a simple principle: no byte can be written until it has been consumed by the readable stream. It's fairly simple underneath (a couple of fs.read/fs.write operations while keeping an eye on the read and write positions).
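To illustrate the principle only (a simplified synchronous sketch, not the actual rw-stream code; the function name and the overflow check are mine):

const fs = require("fs");

function transformInPlace(file, transform) {
  const fd = fs.openSync(file, "r+");
  const chunk = Buffer.alloc(64 * 1024);
  let readPos = 0;  // next byte to consume
  let writePos = 0; // next byte to overwrite; never passes readPos

  let bytesRead;
  while ((bytesRead = fs.readSync(fd, chunk, 0, chunk.length, readPos)) > 0) {
    readPos += bytesRead; // consume first...
    const out = transform(chunk.slice(0, bytesRead));
    // ...then write, but never beyond what has already been read
    if (writePos + out.length > readPos) {
      throw new Error("transform grew the data past the read position");
    }
    writePos += fs.writeSync(fd, out, 0, out.length, writePos);
  }
  fs.ftruncateSync(fd, writePos); // drop any leftover tail
  fs.closeSync(fd);
}

For example, transformInPlace("data.txt", (buf) => Buffer.from(buf.toString().toUpperCase())) works for ASCII data (a multi-byte character split across a chunk boundary would trip up this naive transform). rw-stream itself does this asynchronously with proper streams, but the invariant is the same.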
If you find this useful then I'm happy. :)

How to synchronously read from a ReadStream in node

I am trying to read UTF-8 text from a file in a memory- and time-efficient way. There are two ways to read directly from a file synchronously:
fs.readFileSync will read the entire file and return a buffer containing the file's entire contents
fs.readSync will read a set amount of bytes from a file and return a buffer containing just those contents
I initially just used fs.readFileSync because it's easiest, but I'd like to be able to efficiently handle potentially large files by only reading in chunks of text at a time. So I started using fs.readSync instead. But then I realized that fs.readSync doesn't handle UTF-8 decoding. UTF-8 is simple, so I could whip up some logic to manually decode it, but Node already has services for that, so I'd like to avoid that if possible.
I noticed fs.createReadStream, which returns a ReadStream that can be used for exactly this purpose, but unfortunately it seems to only be available in an asynchronous mode of operation.
Is there a way to read from a ReadStream in a synchronous way? I have a massive stack built on top of this already, and I'd rather not have to refactor it to be asynchronous.
I discovered the string_decoder module, which handles all that UTF-8 decoding logic I was worried I'd have to write. At this point, it seems like a no-brainer to use this on top of fs.readSync to get the synchronous behavior I was looking for.
You basically just keep feeding bytes to it, and as it is able to successfully decode complete characters, it returns them, holding back any incomplete multi-byte sequence until the next chunk. The Node documentation does a decent job of describing how it works.
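A sketch of the combination, assuming a file named large.txt (wrapping it in a generator is just one way to package it):

const fs = require("fs");
const { StringDecoder } = require("string_decoder");

function* readTextSync(file, chunkSize = 64 * 1024) {
  const fd = fs.openSync(file, "r");
  const decoder = new StringDecoder("utf8");
  const buf = Buffer.alloc(chunkSize);
  try {
    let bytesRead;
    while ((bytesRead = fs.readSync(fd, buf, 0, chunkSize, null)) > 0) {
      // write() holds back incomplete multi-byte sequences
      // until the next chunk arrives, so no character is split
      yield decoder.write(buf.slice(0, bytesRead));
    }
    yield decoder.end(); // flush whatever is left
  } finally {
    fs.closeSync(fd);
  }
}

for (const text of readTextSync("large.txt")) {
  process.stdout.write(text);
}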

Write stream into buffer object

I have a stream that is being read from an audio source and I'm trying to store it into a Buffer. From the documentation that I've read, you are able to pipe the stream into one using fs.createWriteStream(~buffer~) instead of a file path.
I'm doing this currently as:
const fs = require("fs");

const outputBuffer = Buffer.alloc(150000);
const stream = fs.createWriteStream(outputBuffer);
but when I run it, it throws an error saying that the path must be a string without null bytes for the file system call.
If I'm misunderstanding the docs or missing something obvious please let me know!
The first parameter to fs.createWriteStream() is the path of the file to write to. That is why you receive that particular error.
There is no way to read from a stream directly into an existing Buffer. There was a node EP to support this, but it more or less died off because there are some potential gotchas with it.
For now you will need to either copy the bytes manually or if you don't want node to allocate extra Buffers, you will need to manually call fs.open(), fs.read() (this is the method that allows you to pass in your Buffer instance, along with an offset), and fs.close().
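If the source is a stream rather than a file (as with the audio source here), "copying the bytes manually" can look like this sketch (the class name and the size are made up):

const { Writable } = require("stream");

class BufferWriter extends Writable {
  constructor(size) {
    super();
    this.buffer = Buffer.alloc(size);
    this.offset = 0;
  }
  _write(chunk, encoding, callback) {
    if (this.offset + chunk.length > this.buffer.length) {
      return callback(new Error("buffer is full"));
    }
    // copy the chunk's bytes into the preallocated Buffer
    this.offset += chunk.copy(this.buffer, this.offset);
    callback();
  }
}

const writer = new BufferWriter(150000);
// audioStream.pipe(writer).on("finish", () => {
//   const data = writer.buffer.slice(0, writer.offset);
// });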
