Buffer entire file in memory with Node.js

I have a relatively small file (some hundreds of kilobytes) that I want to be in memory for direct access for the entire execution of the code.
I don't know exactly the internals of Node.js, so I'm asking whether an fs open is enough, or whether I have to read the whole file and copy it into a Buffer.

Basically, you need to use the readFile or readFileSync function from the fs module. They return the complete content of the given file, but differ in their behavior (asynchronous versus synchronous).
If blocking Node.js (e.g. on startup of your application) is not an issue, you can go with the synchronous version, which is as easy as:
var fs = require('fs');
var data = fs.readFileSync('/etc/passwd');
If you need to go asynchronous, the code looks like this:
var fs = require('fs');
fs.readFile('/etc/passwd', function (err, data) {
// ...
});
Please note that in either case you can give an options object as the second parameter, e.g. to specify the encoding to use (if you omit the encoding, a raw Buffer is returned):
var fs = require('fs');
fs.readFile('/etc/passwd', { encoding: 'utf8' }, function (err, data) {
// ...
});
Valid encodings are utf8, ascii, utf16le, ucs2, base64 and hex. There is also a binary encoding, but it is deprecated and should not be used any longer. You can find more details on how to deal with encodings and buffers in the appropriate documentation.
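For illustration, here is a minimal sketch (assuming /etc/passwd exists, as in the examples above) of the difference the encoding makes:
var fs = require('fs');

var buf = fs.readFileSync('/etc/passwd');         // no encoding: returns a Buffer
var txt = fs.readFileSync('/etc/passwd', 'utf8'); // encoding given: returns a string

console.log(Buffer.isBuffer(buf)); // true
console.log(typeof txt);           // 'string'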

As easy as
var buffer = fs.readFileSync(filename);

With Node 0.12, it's possible to do this synchronously now:
var fs = require('fs');
var path = require('path');
// Buffer mydata
var BUFFER = bufferFile('../public/mydata');
function bufferFile(relPath) {
  return fs.readFileSync(path.join(__dirname, relPath)); // zzzz....
}
fs is the file system module. readFileSync() returns a Buffer, or a string if you ask for one.
fs resolves relative paths against the current working directory, which may not be where your script lives; joining with __dirname via the path module makes the path relative to this file instead.
To load as a string, specify the encoding:
return fs.readFileSync(path.join(__dirname, relPath), { encoding: 'utf8' });

Related

Is there a more elegant way to read and then write *the same file* with Node.js streams

I want to read a file, transform it with through2, and then write it back into the same file. My code looks like:
const rm = require('rimraf')
const through2 = require('through2')
const fs = require('graceful-fs')
// source file path
const replacementPath = `./static/projects/${destPath}/index.html`
// temp file path
const tempfilePath = `./static/projects/${destPath}/tempfile.html`
// read source file then write into temp file
await promiseReplace(replacementPath, tempfilePath)
// del the source file
rm.sync(replacementPath)
// rename the temp file name to source file name
fs.renameSync(tempfilePath, replacementPath)
// del the temp file
rm.sync(tempfilePath)
// promiseify readStream and writeStream
function promiseReplace (readfile, writefile) {
  return new Promise((res, rej) => {
    fs.createReadStream(readfile)
      .pipe(through2.obj(function (chunk, encoding, done) {
        const replaced = chunk.toString().replace(/id="wrap"/g, 'dududud')
        done(null, replaced)
      }))
      .pipe(fs.createWriteStream(writefile))
      .on('finish', () => {
        console.log('replace done')
        res()
      })
      .on('error', (err) => {
        console.log(err)
        rej(err)
      })
  })
}
The above code works, but I want to know whether I can make it more elegant.
I also tried some temp-file libraries like node-temp.
Unfortunately, it cannot stream reads and writes into the same file either, and I opened an issue about this.
So if anyone knows a better way to do this, please tell me. Thank you very much.
You can make the code more elegant by getting rid of unnecessary dependencies and using the newer simplified constructor for streams.
const fs = require('fs');
const util = require('util');
const stream = require('stream');
const tempWrite = require('temp-write');
const rename = util.promisify(fs.rename);
const goat2llama = async (filePath) => {
  const str = fs.createReadStream(filePath, 'utf8')
    .pipe(new stream.Transform({
      decodeStrings: false,
      transform(chunk, encoding, done) {
        done(null, chunk.replace(/goat/g, 'llama'));
      }
    }));
  const tempPath = await tempWrite(str);
  await rename(tempPath, filePath);
};
Tests
AVA tests to prove that it works:
import fs from 'fs';
import path from 'path';
import util from 'util';
import test from 'ava';
import mkdirtemp from 'mkdirtemp';
import goat2llama from '.';
const writeFile = util.promisify(fs.writeFile);
const readFile = util.promisify(fs.readFile);
const fixture = async (content) => {
  const dir = await mkdirtemp();
  const fixturePath = path.join(dir, 'fixture.txt');
  await writeFile(fixturePath, content);
  return fixturePath;
};

test('goat2llama()', async (t) => {
  const filePath = await fixture('I like goats and frogs, but goats the best');
  await goat2llama(filePath);
  t.is(await readFile(filePath, 'utf8'), 'I like llamas and frogs, but llamas the best');
});
A few things about the changes:
Through2 is not really needed anymore. It used to be a pain to set up passthrough or transform streams properly, but that is not the case anymore thanks to the simplified construction API.
You probably don't need graceful-fs, either. Unless you are doing a lot of concurrent disk I/O, EMFILE is not usually a problem, especially these days as Node has gotten smarter about file descriptors. But that library does help with temporary errors caused by antivirus software on Windows, if that is a problem for you.
You definitely do not need rimraf for this. You only need fs.rename(). It is similar to mv on the command line, with a few nuances that make it distinct, but the differences are not super important here. The point is there will be nothing at the temporary path after you rename the file that was there.
I used temp-write because it generates a secure random filepath for you and puts it in the OS temp directory (which automatically gets cleaned up now and then), plus it handles converting the stream to a Promise for you and takes care of some edge cases around errors. Disclosure: I wrote the streams implementation in temp-write. :)
Overall, this is a decent improvement. However, there remains the boundary problem discussed in the comments. Luckily, you are not the first person to encounter this problem! I wouldn't call the actual solution particularly elegant, certainly not if you implement it yourself. But replacestream is here to help you.
const fs = require('fs');
const util = require('util');
const tempWrite = require('temp-write');
const replaceStream = require('replacestream');
const rename = util.promisify(fs.rename);
const goat2llama = async (filePath) => {
  const str = fs.createReadStream(filePath, 'utf8')
    .pipe(replaceStream('goat', 'llama'));
  const tempPath = await tempWrite(str);
  await rename(tempPath, filePath);
};
Also...
I do not like temp files
Indeed, temp files are often bad. However, in this case, the temp file is managed by a well-designed library and stored in a secure, out-of-the-way location. There is virtually no chance of conflicting with other processes. And even if the rename() fails somehow, the file will be cleaned up by the OS.
That said, you can avoid temp files altogether by using fs.readFile() and fs.writeFile() instead of streaming. The former also makes text replacement much easier since you do not have to worry about chunk boundaries. You have to choose one approach or the other; for very big files, however, streaming may be the only option, aside from manually chunking the file.
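For completeness, here is a minimal sketch of that no-temp-file variant, mirroring the goat2llama() above but using promisified fs.readFile()/fs.writeFile() (my own sketch, not a drop-in from any library):
const fs = require('fs');
const util = require('util');

const readFile = util.promisify(fs.readFile);
const writeFile = util.promisify(fs.writeFile);

// Read the whole file, do the replacement in one go (no chunk boundaries to worry about),
// then write it back over the original path.
const goat2llama = async (filePath) => {
  const content = await readFile(filePath, 'utf8');
  await writeFile(filePath, content.replace(/goat/g, 'llama'));
};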
Streams are useless in this situation, because they give you chunks of the file that can break up the string you're searching for. You could use streams, merge all the chunks to get the full content, and then replace the string you need, but that is longer code that only provokes one question: why read the file in chunks if you don't use them as chunks?
The shortest way to achieve what you want is:
let fileContent = fs.readFileSync('file_name.html', 'utf8')
let replaced = fileContent.replace(/id="wrap"/g, 'dududud')
fs.writeFileSync('file_name.html', replaced)
All these functions are synchronous, so you don't have to promisify them.

Node writeFileSync encoding options for images

I'm using fs.writeFileSync(file, data[, options]) to save a file returned from http.get(options[, callback])
This works fine for text files, but images, PDFs, etc. end up corrupted. From the searching around that I've done, it's apparently because fs.writeFileSync(file, data[, options]) defaults to UTF-8.
I've tried setting the encoding to 'binary', as well as the MIME type and the extension, to no avail. It feels like something really obvious that I'm overlooking; can anyone point me in the right direction?
Thank you in advance
Update
I'm running this through Electron. I didn't think it was worth mentioning, as Electron is just running Node, but I'm not a Node or Electron expert so I'm not sure.
Create a Buffer from the image data and set its encoding to binary. Then pass that data into a stream.PassThrough and pipe that into a stream.Writable.
var fs = require('fs');
var stream = require('stream');
var imgStream = new stream.PassThrough();
imgStream.end(Buffer.from(data, 'binary'));
var wStream = fs.createWriteStream('./<dest>.<ext>');
imgStream.once('end', () => {
  console.log('Image Written');
});
imgStream.once('error', (err) => {
  console.log(err);
});
imgStream.pipe(wStream);
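For context, here is a minimal sketch of how the data variable might be obtained from http.get in the first place (the URL and destination file name are just placeholders):
var http = require('http');
var fs = require('fs');

http.get('http://example.com/image.jpg', function (res) {
  res.setEncoding('binary'); // collect the response body as a binary string
  var data = '';
  res.on('data', function (chunk) { data += chunk; });
  res.on('end', function () {
    // data can now be wrapped in a Buffer as shown above
    fs.writeFileSync('./image.jpg', Buffer.from(data, 'binary'));
  });
});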

NodeJS require('./path/to/image/image.jpg') as base64

Is there a way to tell require that if a file name ends with .jpg then it should return a base64-encoded version of it?
var image = require('./logo.jpg');
console.log(image); // data:image/jpg;base64,/9j/4AAQSkZJRgABAgA...
I worry about the "why", but here is "how":
var Module = require('module');
var fs = require('fs');
Module._extensions['.jpg'] = function (module, fn) {
  var base64 = fs.readFileSync(fn).toString('base64');
  module._compile('module.exports="data:image/jpg;base64,' + base64 + '"', fn);
};
var image = require('./logo.jpg');
There are some serious issues with this mechanism: for one, the data for each image that you load this way will be kept in memory until your app stops (so it's not useful for loading lots of images), and because of that caching mechanism (which also applies to regular use of require()), you can only load an image into the cache once. Requiring an image a second time, after its file has changed, will still yield the first (cached) version, unless you manually start cleaning the module cache.
In other words: you don't really want this.
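If you just want the data URI without hooking into require, a more conventional sketch (same readFileSync/base64 approach as above, with a hypothetical helper name) would be:
var fs = require('fs');

// Read the image and build the data URI by hand, without touching the module system.
function imageToDataUri(imagePath) {
  var base64 = fs.readFileSync(imagePath).toString('base64');
  return 'data:image/jpg;base64,' + base64;
}

var image = imageToDataUri('./logo.jpg');
console.log(image); // data:image/jpg;base64,/9j/4AAQ...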
You can use fs.createReadStream("/path/to/file")

Writing long strings to file (node js)

I have a string which is 169 million chars long, which I need to write to a file and then read from another process.
I have read about WriteStream and ReadStream, but how do I write the string to a file when it has no method 'pipe'?
Creating a write stream is a good idea. You can use it like this:
var fs = require('fs');
var wstream = fs.createWriteStream('myOutput.txt');
wstream.write('Hello world!\n');
wstream.write('Another line\n');
wstream.end();
You can call write as many times as you need, with parts of that 169-million-character string. Once you have finished writing the file, you can create a read stream to read chunks of the file.
However, 169 million chars is not that much; I would say you could read and write it all at once and keep the whole file in memory.
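For the reading side in the other process, a minimal sketch (assuming the myOutput.txt produced above) could look like this:
var fs = require('fs');

var rstream = fs.createReadStream('myOutput.txt', { encoding: 'utf8' });
rstream.on('data', function (chunk) {
  // each chunk is a piece of the long string; process it here
  console.log('got ' + chunk.length + ' characters');
});
rstream.on('end', function () {
  console.log('done reading');
});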
Update: As requested in comment, I update with an example to pipe the stream to zip on-the-fly:
var zlib = require('zlib');
var gzip = zlib.createGzip();
var fs = require('fs');
var out = fs.createWriteStream('input.txt.gz');
gzip.pipe(out);
gzip.write('Hello world!\n');
gzip.write('Another line\n');
gzip.end();
This will create a .gz file which, when decompressed, yields a single file with the same name (without the .gz at the end).
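To read it back (e.g. from the other process), a minimal sketch, assuming the input.txt.gz produced above:
var zlib = require('zlib');
var fs = require('fs');

fs.createReadStream('input.txt.gz')
  .pipe(zlib.createGunzip())  // decompress on the fly
  .pipe(process.stdout);      // or pipe into any other writable stream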
This might solve your problem
var fs = require('fs');
var request = require('request');
var stream = request('http://i.imgur.com/dmetFjf.jpg');
var writeStream = fs.createWriteStream('./testimg.jpg');
stream.pipe(writeStream);
Follow the link for more details
http://neethack.com/2013/12/understand-node-stream-what-i-learned-when-fixing-aws-sdk-bug/
If you're looking to write what's called a blocking process, e.g. something that will prevent you from doing anything else, approaching that process asynchronously is the best solution (and why Node.js is good at solving these types of problems). With that said, avoid methods named fs.*Sync, as those are synchronous. fs.writeFile is what I believe you're looking for. Read the docs.
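For illustration, a minimal sketch of that asynchronous approach (assuming longString already holds the data to persist):
var fs = require('fs');

// longString is assumed to contain the 169-million-character string
fs.writeFile('myOutput.txt', longString, 'utf8', function (err) {
  if (err) throw err;
  console.log('File written');
});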

nodejs: each line in separate file

I want to split a file: each line into a separate file. The initial file is really big. I ended up with the code below:
var fs = require('fs');
var split = require('split');

var fileCounter = -1;

function getWritable() {
  fileCounter++;
  var writable = fs.createWriteStream('data/part' + fileCounter + '.txt', {flags: 'w'});
  return writable;
}

var readable = fs.createReadStream(file).pipe(split());
readable.on('data', function (line) {
  var flag = getWritable().write(line, function () {
    readable.resume();
  });
  if (!flag) {
    readable.pause();
  }
});
It works, but it is ugly. Is there a more nodish way to do that? Maybe with piping and without pause/resume.
NB: it's not a question about lines/files/etc. The question is about streams; I'm just using this problem to illustrate it.
You can use Node's built-in readline module.
var fs = require('fs');
var readline = require('readline');
var fileCounter = -1;
var file = "foo.txt";
readline.createInterface({
  input: fs.createReadStream(file),
  terminal: false
}).on('line', function (line) {
  fileCounter++;
  var writable = fs.createWriteStream('data/part' + fileCounter + '.txt', {flags: 'w'});
  writable.write(line);
  writable.end(); // close each per-line file once its line has been written
});
Note that this will lose the last line of the file if there is no newline at the end, so make sure your last line of data is followed by a newline.
Also note that the docs indicate that it is Stability index 2, meaning:
Stability: 2 - Unstable The API is in the process of settling, but has
not yet had sufficient real-world testing to be considered stable.
Backwards-compatibility will be maintained if reasonable.
How about the following? Did you try it? The pause and resume logic isn't really needed here.
var split = require('split');
var fs = require('fs');
var fileCounter = -1;
var readable = fs.createReadStream(file).pipe(split());
readable.on('data', function (line) {
  fileCounter++;
  var writable = fs.createWriteStream('data/part' + fileCounter + '.txt', {flags: 'w'});
  writable.write(line);
  writable.end(); // end() flushes the write and closes the file
});
Piping dynamically would be hard...
EDIT: You could create a writable (and therefore pipe()able) object that would, on each 'data' event, do the "create file, open it, write the data, close it" dance, but it:
wouldn't be reusable
wouldn't follow the KISS principle
would require special and specific logic for file naming (it would accept a string pattern with a placeholder for the number as a constructor argument, etc.)
I really don't recommend that path, or you're going to spend ages implementing a not-really-reusable module. Though it would make a good Writable implementation exercise; see the sketch below.
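For illustration, here is a minimal sketch of such a Writable using the simplified constructor (the file-name pattern and the foo.txt input are just placeholders):
var fs = require('fs');
var split = require('split');
var stream = require('stream');

var fileCounter = 0;

// A pipe()able Writable whose write() puts each incoming line into its own numbered file.
var lineToFile = new stream.Writable({
  decodeStrings: false, // split() emits strings; keep them as strings
  write: function (line, encoding, done) {
    fs.writeFile('data/part' + fileCounter++ + '.txt', line, done);
  }
});

fs.createReadStream('foo.txt').pipe(split()).pipe(lineToFile);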

Resources