Does a WriteStream have to be closed? - node.js

I used var writeStream = fs.createWriteStream('...') to open a WriteStream, but there is no writeStream.close() function. How is this handled in node?

Call stream.end()
http://nodejs.org/api/stream.html#stream_stream_end

The stream has to be closed after you finish writing to the file.
Use
stream.end()
You can't call stream.write('xxx') after stream.end().
Until you end/close the writable stream, you (or anyone else) can write to the file at any time, and Node.js will keep the writable stream's reference in RAM.

Related

Node - Is it possible that a stream closes or stops writing and how to reopen?

I could not find much information in the documentation. Using fs, is it possible that a stream opened with fs.createWriteStream closes unexpectedly or stops writing to the file before stream.end() is called?
One scenario that comes to mind is that an error occurs and the OS closes the stream. Is this possible, and in that case would fs reopen the stream automatically?
I think we can use something along the lines of
let stream = fs.createWriteStream('file.log', {flags: 'a'});
stream.on('close', () => {
  stream = fs.createWriteStream('file.log', {flags: 'a'});
});
But I wonder if this approach is prone to memory leaks or other issues. Thanks!

Node.js: closing a file after writing

I'm currently getting a writable stream to a file using writer = fs.createWriteStream(url), doing a number of writes using writer.write(), and at the end I do writer.end(). I notice that when I do writer.end(), the file size is still zero, and remains at zero until the program terminates, at which point it reaches its correct size and the contents are visible and correct.
So it seems that writer.end() isn't closing the file.
The spec says "If autoClose is set to true (default behavior) on 'error' or 'finish' the file descriptor will be closed automatically."
I'm not sure what 'error' or 'finish' refer to here. Events presumably: but how do they relate to my call on writer.end()? Is there something I have to do to cause these events?
I would try getting a file descriptor directly and calling fd.close() explicitly to close it after writing, but I don't see a method to get a writeable stream given a file descriptor.
Any advice please?
When you call .write, Node does not write immediately to the file; it buffers the chunks until highWaterMark bytes are reached, at which point it tries to flush the contents to disk.
That is why it's important to check .write's return value: if it returns false, you need to wait until the drain event is emitted. If you don't, you can exhaust the application's memory, see:
why does attempting to write a large file cause js heap to run out of memory
The same applies to .end. It won't close the file immediately; first it will flush the buffer, and only after everything has been written into the file will it close the fd.
So once you call .end you'll have to wait until the finish event has been emitted.
The 'finish' event is emitted after the stream.end() method has been
called, and all data has been flushed to the underlying system.
const { once } = require('events');
const fs = require('fs');

const writer = fs.createWriteStream('/tmp/some-file');

// using top-level await; wrap in an async IIFE if you're running an older version
for (let i = 0; i < 10; i++) {
  if (!writer.write('a'))
    await once(writer, 'drain');
}

writer.end();
await once(writer, 'finish');
console.log('File is closed and all data has been flushed');

Possible to delete data read from readable stream?

I have some ffmpeg processes running in order and all writing into one stream (fs.createWriteStream).
Is it possible to delete the data read through fs.createReadStream from the file?
I want to run the script 24/7 and want the stream to act like a buffer.
Thanks in advance!
You can actually "append data" to a running ffmpeg instance - or any other writable stream. To make this possible, pass this option to pipe:
myFile.pipe(ffmpegRunner, {end: false});
This tells pipe not to notify ffmpeg that the file has ended. You can then switch files once the first one ends:
myFile.on("end", () => {
  myFile.unpipe(ffmpegRunner);
  anotherFile.pipe(ffmpegRunner, {end: false});
});
You can do that even before the stream ends I guess.

Create a Read Stream remotely without writing it node.js

I'm trying to create a Read Stream from a remote file without writing it to disc.
var file = fs.createWriteStream('Video.mp4');
var request = http.get('http://url.tld/video.mp4', function(response){
response.pipe(file);
});
Can I create a Read Stream directly from an HTTP response without writing it to disc? Maybe by buffering it in chunks and converting it to a readable stream?
Seems like you can use the request module (note that it has since been deprecated).
Have a look at 7zark7's answer here: https://stackoverflow.com/a/14552721/7189461

Piping to a stream that needs to be both readable and writable

I'd like to be able to pipe from the src stream to the something stream. I understand that something needs to be a writable stream, as it's being piped to; however, it also needs to be a readable stream, so that it can pipe to somethingElse. What should something return in order to make this work?
example.task('taskOne', function() {
  return example
    .src('pathName')
    .pipe(something())
    .pipe(somethingElse());
});
Solved!
I made use of the node module through2, which solves this exact issue.
