Cannot read File from fs with FileReader - node.js

Hi, I am trying to read a file, and I am having trouble with the FileReader readAsArrayBuffer function in Node.js.
var FileReader = require("filereader");
var fs = require("fs");
var forge = require("node-forge");

let p12_path = __dirname + "/file.p12";
var p12xxx = fs.readFileSync(p12_path, "utf-8");
var reader = new FileReader();
reader.readAsArrayBuffer(p12xxx); // The problem is here
reader.onloadend = function() {
    var arrayBuffer = reader.result;
    var arrayUint8 = new Uint8Array(arrayBuffer);
    var p12B64 = forge.util.binary.base64.encode(arrayUint8);
    var p12Der = forge.util.decode64(p12B64);
    var p12Asn1 = forge.asn1.fromDer(p12Der);
    ............
}
The error:
Error: cannot read as File: "0�6�\.............

You are reading a .p12 (PKCS#12) file, which is a binary format and should not have an encoding specified. As per the fs docs, "If the encoding option is specified then this function returns a string", but because it is mostly a binary file, reading it as UTF-8 produces invalid characters. When you exclude the encoding, readFileSync gives you a Buffer object instead, which is what you most likely want.
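A minimal sketch of that approach, assuming forge here is the node-forge package (forge.asn1.fromDer accepts a binary string, so the FileReader round-trip can be skipped entirely):

var fs = require("fs");
var forge = require("node-forge");

// No encoding argument: readFileSync returns a raw Buffer
var p12Buffer = fs.readFileSync(__dirname + "/file.p12");

// Convert the Buffer to a binary string and parse it directly
var p12Der = p12Buffer.toString("binary");
var p12Asn1 = forge.asn1.fromDer(p12Der);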

According to the npm filereader docs, the reader needs to be given valid UTF-8 content, but fs.readFileSync(p12_path, "utf-8") cannot produce that from a binary file.
The printed "0�6�\............. shows the file is obviously not UTF-8 and is therefore not readable this way.

Related

Cannot read cyrillic symbols from a .csv file

I need to read some .csv file, get data in .json format and work with it.
I'm using the npm package convert-csv-to-json. As a result, Cyrillic symbols aren't displayed properly:
const csvToJson = require('convert-csv-to-json');
let json = csvToJson.fieldDelimiter(',').getJsonFromCsv("input.csv");
console.log(json);
Result: (screenshot of garbled Cyrillic output)
If I try to decode the file:
const csvToJson = require('convert-csv-to-json');
let json = csvToJson.asciiEncoding().fieldDelimiter(',').getJsonFromCsv("input.csv");
console.log(json);
the result is: (screenshot, still garbled)
When I open the .csv file using AkelPad or Notepad++ it displays as it should, and the detected encoding is Windows-1251 (ANSI, Cyrillic).
Is there a way to read the file with the proper encoding, or to decode the resulting string?
Try using UTF-8 encoding instead of ASCII.
That is, change
let json = csvToJson.asciiEncoding().fieldDelimiter(',').getJsonFromCsv("input.csv");
to
let json = csvToJson.utf8Encoding().fieldDelimiter(',').getJsonFromCsv("input.csv");
Here is code that solves the problem:
const fs = require('fs');
const iconv = require('iconv-lite');
const Papa = require('papaparse');

// read the csv file into a raw Buffer (no encoding specified)
const buffer = fs.readFileSync("input.csv");
// decode the buffer from Windows-1251 into a JS string
const dataString = iconv.decode(buffer, 'win1251');
// parse the string into an array of objects
const config = {
    header: true
};
const parsedOutput = Papa.parse(dataString, config);
console.log('parsedOutput: ', parsedOutput);
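If you need JSON text rather than the parsed objects, the rows live on the result's data property (this assumes the standard PapaParse result shape of { data, errors, meta }):

// parsedOutput.data is an array of row objects keyed by the CSV headers
const json = JSON.stringify(parsedOutput.data, null, 2);
fs.writeFileSync("output.json", json);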

How to convert a node gd image to a stream that I can pipe?

I'm using node-gd to process images, but I'd like to do a few things before saving them to the disk. Right now I save the file with the .savePng() and .saveJpeg() functions.
I'd like to convert it to a stream which can be piped to an FS stream.
I tried the module streamifier because it sounds like it would do what I need, but when running the code below, the exported image is unreadable (though the same size as when exporting via node-gd).
Here is what I attempted to do:
var gd = require("node-gd");
var fs = require("fs");
const streamifier = require('streamifier');
var inputImage = gd.createFromPng('input.png');
var writeStream = fs.createWriteStream('output.png');
var pngstream = inputImage.pngPtr();
streamifier.createReadStream(pngstream).pipe(writeStream);
Is there something I'm missing?
The png pointer first needs to be converted to a Buffer, like so:
var pngstream = Buffer.from(inputImage.pngPtr(), 'binary');
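Putting that fix into the original snippet (a sketch, assuming the same input.png and output.png paths):

var gd = require("node-gd");
var fs = require("fs");
const streamifier = require('streamifier');

var inputImage = gd.createFromPng('input.png');
var writeStream = fs.createWriteStream('output.png');

// pngPtr() returns a binary string; wrap it in a Buffer before streaming
var pngBuffer = Buffer.from(inputImage.pngPtr(), 'binary');
streamifier.createReadStream(pngBuffer).pipe(writeStream);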

Access variable from another file

I have ex1.js and ex2.js, and in ex2.js I want to get a variable that is defined in ex1.js.
I can read the whole file, but I want to get exactly this variable's value.
var fs = require('fs');
var readMe = fs.readFileSync('path', 'utf8');
console.log(readMe);
As long as they are both JavaScript files, this is how it is generally done:
// in ex1.js
exports.var_ex1 = 15;
// in ex2.js
var var_ex1 = require('./ex1').var_ex1;
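An equivalent pattern uses module.exports, which is convenient when exporting several values at once (a sketch with hypothetical names):

// ex1.js
var value = 15;
var other = 'hello';
module.exports = { value: value, other: other };

// ex2.js
var ex1 = require('./ex1');
console.log(ex1.value); // 15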

How to give a file name as input in baby parser

I am trying to use baby parser for parsing a CSV file, but I am getting the output below when I give it a file name.
The file and code are in the same directory.
my code:
var Papa = require('babyparse');
var fs = require('fs');
var file = 'test.csv';
Papa.parse(file, {
    step: function(row) {
        console.log("Row: ", row.data);
    }
});
Output:
Row: [ [ 'test.csv' ] ]
In the browser, file must be a File object: http://papaparse.com/docs#local-files. In Node.js, you should use the fs API to load the content of the file and then pass it to PapaParse: https://nodejs.org/api/fs.html#fs_fs_readfilesync_filename_options
var Papa = require('babyparse');
var fs = require('fs');

var file = 'test.csv';
var content = fs.readFileSync(file, { encoding: 'binary' });
Papa.parse(content, {
    step: function(row) {
        console.log("Row: ", row.data);
    }
});
The encoding option is important; setting it to binary works for any text/CSV file, and you could also set it to utf8 if your file is in Unicode.
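For large files, PapaParse (which Baby Parse was later merged into) can also parse a Node readable stream directly, so the whole file never has to be held in memory. A sketch, assuming the papaparse package:

const fs = require('fs');
const Papa = require('papaparse');

// Pass a readable stream instead of the file contents
Papa.parse(fs.createReadStream('test.csv'), {
    step: function(row) {
        console.log("Row: ", row.data);
    }
});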

Converting a string from utf8 to latin1 in NodeJS

I'm using a Latin-1 encoded DB and can't change it to UTF-8, which means I run into issues with certain application data. I'm using Tesseract to OCR a document (Tesseract encodes in UTF-8) and tried to use iconv-lite; however, it creates a buffer, and I need to convert that buffer into a string. But again, buffer-to-string conversion does not allow "latin1" encoding.
I've read a bunch of questions and answers; however, all I get is advice about setting the client encoding and the like.
Any ideas?
Since Node.js v7.1.0, you can use the transcode function from the buffer module:
https://nodejs.org/api/buffer.html#buffer_buffer_transcode_source_fromenc_toenc
For example:
const buffer = require('buffer');
const latin1Buffer = buffer.transcode(Buffer.from(utf8String), "utf8", "latin1");
const latin1String = latin1Buffer.toString("latin1");
You can create a buffer from the UTF-8 string you have, and then decode that buffer to Latin-1 using iconv-lite, like this:
var iconv = require('iconv-lite');
var buff = Buffer.from(tesseract_string, 'utf8');
var DB_str = iconv.decode(buff, 'ISO-8859-1');
I've found a way to convert a text file in any encoding to UTF-8:
const fs = require('fs');
const charsetDetector = require('node-icu-charset-detector');
const iconvlite = require('iconv-lite');

/* Having different encodings
 * on text files in a git repo,
 * but needing to always serve
 * standard 'utf-8'
 */
function getFileContentsInUTF8(file_path) {
    // read raw bytes, detect their charset, then decode to a JS (UTF-8) string
    var content = fs.readFileSync(file_path);
    var original_charset = charsetDetector.detectCharset(content);
    var jsString = iconvlite.decode(content, original_charset.toString());
    return jsString;
}
It's also in a gist here: https://gist.github.com/jacargentina/be454c13fa19003cf9f48175e82304d5
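Usage is then a one-liner (legacy.txt is a hypothetical example path):

// Returns the file's contents as a normal JS string, whatever the original encoding was
const text = getFileContentsInUTF8('legacy.txt');
console.log(text);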
Maybe you can try this approach too, where content would be your database buffer data (in Latin-1 encoding).
