Write CSV from array with ANSI from Node

I'd like to write an array of objects to an ANSI (windows-1252) encoded CSV. I'm using the fast-csv and iconv-lite packages. Is there a way to do this without going through a buffer or intermediate streams? My code (which writes an ASCII CSV at present) is as follows:
csv
  .writeToStream(
    fs.createWriteStream(filename, { encoding: "ascii" }), objectArray, { headers: true })
  .on("finish", function() {
    console.log("done!");
  });

If you're using iconv-lite, you can use an encodeStream like so:
iconv.encodeStream("win1252").pipe(fs.createWriteStream(filename))
in place of the usual createWriteStream invocation. Note that pipe() returns the destination stream, so keep a reference to the encoder itself and hand that to fast-csv, as in the sketch below.
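A minimal sketch of the full wiring (assuming the filename and objectArray variables from the question):
var fs = require('fs');
var csv = require('fast-csv');
var iconv = require('iconv-lite');

// The encoder transforms what fast-csv writes into windows-1252 bytes
var encoder = iconv.encodeStream('win1252');
encoder.pipe(fs.createWriteStream(filename));

// fast-csv writes to the encoder, which streams encoded bytes to the file
csv
  .writeToStream(encoder, objectArray, { headers: true })
  .on('finish', function() {
    console.log('done!');
  });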

Related

Why are the line breaks different between CSV (Macintosh) and CSV when parsing with the node module csv-parser?

I'm using the node module csv-parser for streaming CSV parsing. It works fine when uploading a CSV (comma-separated values) file, but when we upload a CSV (Macintosh) file a problem occurs with the line breaks. A CSV generated on Windows contains line breaks like \r\n, but CSV (Mac) contains only \r, as that is the Mac format. What configuration is needed to make it work for both file types?
Here's the code snippet where the streams hooking is done.
// Create a read stream for the passed file path and abort if the file is not found
let readStream: fs.ReadStream;
try {
  readStream = fs.createReadStream(filePath);
} catch (error) {
  console.log('Skipped order batch file processing. File not found.');
  resolve();
  return;
}

// Create the CSV transform
let csvStream: Transform;
if (file.mapping) {
  csvStream = csv({ headers: false });
} else {
  csvStream = csv();
}

readStream
  .pipe(csvStream);
csv-parser has a newline option whose default value is "\n"; setting it to "\r" made it work:
csvStream = csv({ headers: false, newline:"\r" });
How can I set the newline value conditionally, so that for CSV (Mac) it is "\r", for CSV (Windows) "\r\n", and for Linux "\n"?
Note: I need to detect this while reading the file.
Your help would be really appreciated!
Thanks!
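One way to handle the detection, sketched under the assumption that filePath is available as in the snippet above: read the first few kilobytes of the file synchronously, sniff which line ending appears first, and pass the result to csv-parser. The detectNewline helper is hypothetical, not part of csv-parser:
const fs = require('fs');
const csv = require('csv-parser');

// Hypothetical helper: sample the first 4 KB of the file and guess its
// line-ending style before wiring up csv-parser
function detectNewline(filePath) {
  const buf = Buffer.alloc(4096);
  const fd = fs.openSync(filePath, 'r');
  const bytesRead = fs.readSync(fd, buf, 0, buf.length, 0);
  fs.closeSync(fd);
  const sample = buf.toString('utf8', 0, bytesRead);
  if (sample.includes('\r\n')) return '\r\n'; // CSV (Windows)
  if (sample.includes('\r')) return '\r';     // CSV (Macintosh)
  return '\n';                                // Linux / modern macOS
}

const csvStream = csv({ headers: false, newline: detectNewline(filePath) });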

How do I read a csv file line by line, modify each line, and write the result to another file?

I recently used the event-stream library for Node.js to parse a huge csv file, saving the results to a database.
How do I solve the task of not just reading a file, but modifying each line and writing the result to a new file?
Is it some combination of the through and map methods, or a duplex stream? Any help is highly appreciated.
If you use event-stream for reading, you can use its split() method to process the csv line by line, then modify each line and write it to a new writable stream.
var fs = require('fs');
var es = require('event-stream');

const newCsv = fs.createWriteStream('new.csv');

fs.createReadStream('old.csv')
  .pipe(es.split()) // split() strips the line breaks, so re-add them when writing
  .pipe(
    es.mapSync(function(line) {
      // modify the line any way you want, then write it out
      newCsv.write(line + '\n');
    })
  )
  .on('end', function() {
    newCsv.end(); // close the output only once the input is fully processed
  });
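Since the question mentions through and map: es.mapSync also forwards whatever you return, so an alternative is to keep everything in a single pipeline and let the final pipe() handle the write stream. A minimal sketch, where toUpperCase() is just a placeholder transform:
var fs = require('fs');
var es = require('event-stream');

fs.createReadStream('old.csv')
  .pipe(es.split())
  .pipe(es.mapSync(function(line) {
    return line.toUpperCase() + '\n'; // placeholder transform; re-add the newline
  }))
  .pipe(fs.createWriteStream('new.csv'));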

In node.js: How to convert jpg images to binary data?

And conversely, how can I convert the binary data back to an image? The image data saved in the backend is stored as binary.
Try this:
var fs = require("fs");
fs.readFile('image.jpg', function(err, data) {
if (err) throw err;
// Encode to base64
var encodedImage = new Buffer(data, 'binary').toString('base64');
// Decode from base64
var decodedImage = new Buffer(encodedImage, 'base64').toString('binary');
});
Hope it will be useful for you.
You can also do it using fs.createReadStream instead of reading the whole file into a Buffer at once; note that it is the new Buffer() constructor that is deprecated (use Buffer.from() instead).
Find more info about the differences in https://medium.com/tensult/stream-and-buffer-concepts-in-node-js-87d565e151a0
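A minimal sketch of the stream-based approach (the filenames are placeholders): collect the chunks into a single Buffer, then round-trip through base64:
const fs = require('fs');

const chunks = [];
fs.createReadStream('image.jpg')
  .on('data', function(chunk) {
    chunks.push(chunk); // accumulate the raw binary chunks
  })
  .on('end', function() {
    const data = Buffer.concat(chunks);             // the complete binary image
    const encoded = data.toString('base64');        // binary -> base64 string
    const decoded = Buffer.from(encoded, 'base64'); // base64 -> binary Buffer
    fs.writeFileSync('copy.jpg', decoded);          // write the image back out
  });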
If you want a solution for reading files (you can read images too, of course) and converting them to binary, I wrote a small snippet in NodeJS; have a look, and I hope it helps you out. It is all about reading a file into binary, but you can certainly convert the string to an array or byte array. If you get stuck, please let me know in the comments below.
Here is a simple yet robust snippet you can try.
params format:
getBinary({
  path: '<file_relative_path>',
  padlength: '<prepending_padding_length>', (Default: 4)
  debug: false, (Default: true)
  limit: 10, (Default: Full_File_Length)
  putSpacing: Boolean (Default: false)
})
Params description:
1. path: specifies the relative path of the file to be read.
2. padlength: the file is read as numbers (e.g. hex(f): 1111, hex(0): 0), so if you need a uniform-length binary string you will need to pad them, e.g. hex(0) becomes 0000 when padlength is 4.
3. limit: limits how much of the read buffer is rendered.
4. putSpacing: if true, a space is inserted after every padlength digits.
or
getBinary('<file_relative_path>');
Get it here: https://computopedia.com/how-to-convert-image-to-binary-nodejs/
Gist: https://gist.github.com/shankha96/cffe620776066078289ea1f8b15956e0
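For reference, a rough approximation of the idea described above (my own sketch, not the code from the gist; the option names merely mirror the description): read the file, then render each hex digit as a zero-padded binary string.
const fs = require('fs');

function getBinary(path, { padlength = 4, limit, putSpacing = false } = {}) {
  let buf = fs.readFileSync(path);
  if (limit) buf = buf.subarray(0, limit); // render at most `limit` bytes
  const parts = [];
  for (const digit of buf.toString('hex')) {
    // e.g. hex digit 'f' -> '1111', '0' -> '0000' when padlength is 4
    parts.push(parseInt(digit, 16).toString(2).padStart(padlength, '0'));
  }
  return parts.join(putSpacing ? ' ' : '');
}

console.log(getBinary('image.jpg', { limit: 10, putSpacing: true }));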

Node.js import csv with blank fields

I'm trying to import & parse a CSV file using the csv-parse package, but I'm having difficulty requiring the csv file in the first place.
When I do input = require('../../path-to-my-csv-file')
I get an error due to consecutive commas, because some fields are blank:
e","17110","CTSZ16","Slitzerâ„¢ 16pc Cutlery Set in Wood Block",,"Spice up
^
SyntaxError: Unexpected token ,
How do I import the CSV file into the node environment to begin with?
Package examples are Here
To solve your first problem, reading CSV with empty entries:
Use the 'fast-csv' node package. It will parse csv with empty entries.
To answer your second question, how to import a CSV into node:
You don't really "import" csv files into node. You should fs.open the file or use fs.createReadStream to read the csv file at the appropriate location.
Below is a script that uses fs.createReadStream to parse a CSV called 'test.csv' that is one directory up from the script that is running it.
The first section sets up our program and makes basic declarations of the objects we're going to use to store our parsed list.
var csv = require('fast-csv') // require fast-csv module
var fs = require('fs') // require the fs, filesystem module
var uniqueindex = 0 // just an index for our array
var dataJSON = {} // our JSON object, (make it an array if you wish)
This next section declares a stream that will intercept data as it's read from our CSV file and do stuff to it. In this case we're intercepting the data and storing it in a JSON object and then saving that JSON object once the stream is done. It's basically a filter that intercepts data and can do what it wants with it.
var csvStream = csv()                    // use the fast-csv module to create a csv parser
  .on('data', function(data) {           // when we get data, perform function(data)
    dataJSON[uniqueindex] = data;        // store our data in the JSON object dataJSON
    uniqueindex++;                       // the index of the data item in our array
  })
  .on('end', function() {                // when the data stream ends, perform function()
    console.log(dataJSON);               // log our whole object to the console
    fs.writeFile('../test.json',         // use the fs module to write a file
      JSON.stringify(dataJSON, null, 4), // turn our JSON object into a string that can be written
      function(err) {                    // runs once we're done saving the file; err will be null if there is no error
        if (err) throw err;              // if there's an error while saving the file, throw it
        console.log('data saved as JSON yay!');
      });
  });
This section creates what is called a "readStream" from our csv file. The path to the file is relative. A stream is just a way of reading a file. It's pretty powerful, though, because the data from a stream can be piped into another stream.
So we'll create a stream that reads the data from our CSV file, and then we'll pipe it into our pre-defined csvStream filter from section 2.
var stream = fs.createReadStream('../test.csv')
stream.pipe(csvStream)
This will create a file called 'test.json' one directory up from the place where our csv parsing script is. test.json will contain the parsed CSV list inside a JSON object. The order in which the code appears here is how it should appear in a script you make.

How to convert filetype using GraphicsMagick for node.js

How do you "Convert" a file from one type to another using gm? (e.g .png > .jpg)
I found this but it doesn't seem that the node.js version has the same method:
You'll need to change the format of the output file.
var fs = require('fs');
var gm = require('gm');

var writeStream = fs.createWriteStream("output.jpg");
gm("img.png").setFormat("jpg").write(writeStream, function(error) {
  console.log("Finished saving", error);
});
http://aheckmann.github.io/gm/docs.html#setformat
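Alternatively, since GraphicsMagick infers the output format from the extension of the destination filename, writing straight to a .jpg path should also convert; a short sketch (filenames are placeholders):
var gm = require('gm');

// gm shells out to `gm convert img.png output.jpg`, and the .jpg
// extension tells GraphicsMagick which format to produce
gm('img.png').write('output.jpg', function(err) {
  if (err) throw err;
  console.log('Converted img.png to output.jpg');
});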
