How to convert file to buffer in node.js?

I am writing a Sails.js app with an API that accepts a file and encrypts it.
var file = req.body('myFile');
var fileBuffer = convertToBuffer(file);
How do I convert a file to buffer?

That looks like you've got a string which represents the body of your file.
You just have to make a new buffer with it.
var fileBuffer = Buffer.from(file)
If your encoding is NOT utf8, you can specify an alternate encoding as an optional second argument.
var fileBuffer = Buffer.from(file, 'base64')
If the file is actually on disk, this is even easier, since, by default, the fs.readFile operation returns a buffer.
fs.readFile(file, function (err, buffer) {
  // buffer is a Buffer with the file's contents
});
If you're on a really old version of Node, Buffer.from doesn't exist and you have to use the memory-unsafe new Buffer() constructor. Please consider upgrading your Node instance so it supports Buffer.from.
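In a Sails app specifically, the uploaded file usually isn't available through req.body at all; it is streamed to a temp file on disk. A minimal sketch, assuming the field is named myFile (the error handling is just illustrative):
req.file('myFile').upload(function (err, uploadedFiles) {
  if (err) return res.serverError(err);
  // Sails (via skipper) writes the upload to a temp path, exposed as .fd
  fs.readFile(uploadedFiles[0].fd, function (err, fileBuffer) {
    if (err) return res.serverError(err);
    // fileBuffer is a Buffer; pass it to your encryption code
  });
});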

Related

Upload text file to Google Cloud storage with ASCII encoding with node

A legacy 3rd party I'm working with requires that we provide them with text files using ANSI (~ASCII) character encoding.
The content to be saved to the file is large so I'm using streams. If using the fs library I can do something like this:
const file = fs.createWriteStream(filePath, {encoding: 'ascii'});
data.pipe(file).on('error', handleError).on('finish', handleSuccess);
I'm trying to do the equivalent in the GCS node client:
const storage = new Storage();
const gcsFile = storage.bucket(bucket).file(fileName, {});
const fileStream = gcsFile.createWriteStream();
data.pipe(fileStream).on('error', handleError).on('finish', handleSuccess);
However, the createWriteStream method has no such option for specifying the character encoding.
Is there a way to explicitly stream data using ASCII character encoding to GCS?
I think you can pass options.metadata to the createWriteStream. Does something like the following work for you?
const gcsFile = gcsBucket.file(fileName);
fs.createReadStream('myLocalFile.txt')
  .pipe(gcsFile.createWriteStream({
    metadata: {
      contentEncoding: 'ascii'
    }
  }))
  .on('error', handleError)
  .on('finish', handleSuccess);
See the client ref docs here: https://googleapis.dev/nodejs/storage/latest/global.html#CreateWriteStreamOptions
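If the metadata alone doesn't change the bytes you send, another option is to transcode the stream before piping it to GCS. A minimal sketch using a built-in Transform (this assumes the source data is UTF-8 text; Node's 'ascii' encoding is lossy, so non-ASCII characters are mangled rather than transliterated, and multi-byte characters split across chunk boundaries would need string_decoder):
const { Transform } = require('stream');

const toAscii = new Transform({
  transform(chunk, encoding, callback) {
    // decode the chunk as UTF-8 text, then re-encode the string as ASCII bytes
    callback(null, Buffer.from(chunk.toString('utf8'), 'ascii'));
  }
});

data.pipe(toAscii).pipe(gcsFile.createWriteStream()).on('error', handleError).on('finish', handleSuccess);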

Encoding a file to base64 in Node.js

I used the code below to encode a file to base64.
var bitmap = fs.readFileSync(file);
return new Buffer(bitmap).toString('base64');
I figured out that the issue is with the “” and ‘’ characters in the file; it's fine with straight quotes (").
When the file contains It’s, node encodes the characters, but when I decode, I see it come back as
It’s
Here's the javascript I'm using to decode:
fs.writeFile(reportPath, body.buffer, {encoding: 'base64'})
So, once the file is encoded and decoded, it becomes unusable because of these funky characters: It’s
Can anyone shed some light on this?
This should work.
Sample script:
const fs = require('fs')
const filepath = './testfile'
// write "it's" into the file
fs.writeFileSync(filepath, "it's")
// read the file back
const file_buffer = fs.readFileSync(filepath);
// encode the contents into base64
const contents_in_base64 = file_buffer.toString('base64');
// write into a new file, specifying base64 as the encoding (decodes)
fs.writeFileSync('./fileB64', contents_in_base64, { encoding: 'base64' })
// the file fileB64 should now contain "it's"
I suspect your original file does not have utf-8 encoding. Look at your decoding code:
fs.writeFile(reportPath, body.buffer, {encoding: 'base64'})
I am guessing your content comes from an http request of some sort, so it is possible that the content is not utf-8 encoded. Take a look at this: https://www.w3.org/International/articles/http-charset/index: if no charset is specified, a text/* Content-Type defaults to ISO-8859-1.
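To see that base64 itself is not the culprit, here is a small round-trip demonstration; the bytes survive intact, and the mojibake only appears when they are read back with the wrong charset:
const original = Buffer.from('It’s', 'utf8');    // the curly quote is bytes e2 80 99
const roundTrip = Buffer.from(original.toString('base64'), 'base64');
console.log(roundTrip.equals(original));          // true: the bytes are unchanged
console.log(roundTrip.toString('latin1'));        // "Itâ..." mojibake: same bytes, wrong charset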
Here is the code that helped.
var bitmap = fs.readFileSync(file);
// Remove the non-standard characters
var tmp = bitmap.toString().replace(/[“”‘’]/g, '');
// Create a buffer from the string and return the result
return Buffer.from(tmp).toString('base64');
You can provide base64 encoding to the readFileSync function itself.
const fileDataBase64 = fs.readFileSync(filePath, 'base64')

Nodejs, store bin files as BYTEA into pgsql (corrupted files)

For some reason I need to store some files (mostly images or PDFs) in my database (PG 9.2.20).
Those files are uploaded by users, and when I download them back, they are corrupted.
I'm working with Node.js.
The column type I store the file in is BYTEA.
This is how I store them:
const { files, fields } = await asyncBusboy(ctx.req);
const fileName = files[0].filename;
const mimeType = files[0].mimeType;
const bufferedFile = fs.readFileSync(files[0].path, { encoding: 'hex' });
const fileData = `\\x${bufferedFile}`;
//Just a basic insert into with knex.raw
const fileId = await storageModel.create(fields.name, fields.description, fileName, mimeType, fileData, ctx.user);
And this is how I retrieve my file:
const file = await storageModel.find(ctx.params.fileId, ctx.user);
ctx.body = Buffer.from(file.file_bin, 'hex');
ctx.set('Content-disposition', `attachment; filename=${file.file_name}`);
The file is corrupted, and of course, if I look closely, the uploaded file and the one I downloaded are different.
See hex screenshot, there is some additional data at the start of the downloaded one : http://imgur.com/a/kTRAB
After some more testing, I can tell the problem lies in the Koa part, when I put the buffer into ctx.body: it gets corrupted (???)
EDIT : I was working with Swagger UI : https://github.com/swagger-api/swagger-ui/issues/1605
You should not treat bytea as a regular text string. Pass in a Buffer directly and let the driver escape it for you correctly.
Not sure which driver you are using, but for example...
pg-promise does it automatically, see the example
node-postgres is supposed to do it automatically, which it does mostly, but I know there were issues with arrays, recently fixed here.
massive.js - based on pg-promise since v3.0, so the same story - it just works.
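For example, with node-postgres the insert reduces to a parameterized query. A minimal sketch (the files table and its column names are assumptions, not from the question):
const { Pool } = require('pg');
const fs = require('fs');
const pool = new Pool();

async function storeFile(name, mimeType, path) {
  // a plain Buffer: no hex string, no \x prefix
  const data = fs.readFileSync(path);
  // the driver serializes the Buffer into bytea itself
  await pool.query(
    'INSERT INTO files (file_name, mime_type, file_bin) VALUES ($1, $2, $3)',
    [name, mimeType, data]
  );
}
On the way out, node-postgres also hands bytea columns back as Buffers, so file.file_bin can go into ctx.body without the Buffer.from(..., 'hex') step.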

Save zip file represented as a string

I'm downloading a zip file from the internet. I receive it using an XHR request (using node-webkit), which means the content of the zip arrives as a string in xhr.responseText. I now want to save this file to disk; however, I can't seem to get it saved as a non-corrupted zip archive.
I have basically tried fs.writeFile, fs.write, and fs.createWriteStream, but I can't seem to get it right.
I am using a node module named AdmZip which accepts a file buffer that can then be saved as a zip archive. So, I guess, this could be one way to go, but how do I make a buffer out of the string that I receive?
btw: I can't use the http module to receive the file from the internet due to a bug in node.js, therefore I'm using the XHR request.
So, I found a solution: first and foremost, set xhr.responseType = 'arraybuffer', and then turn the response into a Uint8Array. From there I converted the Uint8Array into a nodejs buffer, which I could then save.
var arrayBuffer = xhr.response,
    byteArray = new Uint8Array(arrayBuffer);
// copy the bytes into a Node buffer, one at a time
var buffer = new Buffer(byteArray.length);
for (var i = 0; i < byteArray.length; i++) {
    buffer.writeUInt8(byteArray[i], i);
}
fs.writeFileSync(fname, buffer);
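On current Node versions the byte-by-byte copy isn't needed, since Buffer.from accepts a typed array directly:
var buffer = Buffer.from(new Uint8Array(xhr.response)); // copies the bytes
fs.writeFileSync(fname, buffer);
(Passing the ArrayBuffer itself, as in Buffer.from(xhr.response), also works, but the resulting buffer shares the ArrayBuffer's memory instead of copying it.)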

Converting binary strings to Buffers in Node.js

I have a web service that takes a base64-encoded string representing an image, creates a thumbnail of that image using the imagemagick library, and then stores both of them in MongoDB. I am doing this with the following code (approximately):
var buf = new Buffer(req.body.data, "base64"); //original image
im.resize({ srcData: buf, width: 256 }, function(err, stdout, stderr) {
this.thumbnail = new Buffer(stdout, "binary");
//store buf and stdout in mongo
});
You will notice that I am creating a Buffer object using the "binary" encoding, which the docs say not to do:
'binary' - A way of encoding raw binary data into strings by using
only the first 8 bits of each character. This encoding method is
deprecated and should be avoided in favor of Buffer objects where
possible. This encoding will be removed in future versions of Node.
First off, I'm not sure what they are saying there. I'm trying to create a Buffer object, and they seem to imply I should already have one.
Secondly, the source of the problem appears to be that the imagemagick resize method returns a string containing binary data. Doing typeof(stdout) returns "string", and printing it to the screen certainly appears to show a bunch of non-character data.
So what do I do here? I can't change how imagemagick works. Is there another way of doing what I'm trying to do?
That's how I do the same thing, successfully storing images in MongoDB.
// original ---> base64
var thumbnail = Buffer.from(req.body.data).toString('base64');
// you can store this string value in a mongoose model property, and save it to mongodb
// base64 ---> image
var buffer = Buffer.from(thumbnail, "base64");
I am not sure if storing images as base64 is the best way to do it, though.
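For the mongoose part, the base64 string fits in an ordinary String field. A rough sketch (the Image model and its fields are made up for illustration):
const mongoose = require('mongoose');

const Image = mongoose.model('Image', new mongoose.Schema({
  name: String,
  data: String // base64-encoded image content
}));

// store (inside an async function, with a connection already open)
await Image.create({ name: 'thumb.png', data: thumbnail });

// retrieve and rebuild the binary image
const doc = await Image.findOne({ name: 'thumb.png' });
const imageBuffer = Buffer.from(doc.data, 'base64');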
Please try this, as your base64 string might not be pre-handled (it may still contain the data-URL prefix):
var imgRawData = req.body.images[0].replace(
  /^data:image\/png;base64,|^data:image\/jpeg;base64,|^data:image\/jpg;base64,|^data:image\/bmp;base64,/, "");
var yourBuffer = Buffer.from(imgRawData, "base64");
Then, save yourBuffer into MongoDB.
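If the uploads aren't limited to those four formats, one pattern covers any image data-URL prefix (an untested generalization of the same idea):
var imgRawData = req.body.images[0].replace(/^data:image\/[a-z+.-]+;base64,/, "");
var yourBuffer = Buffer.from(imgRawData, "base64");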
