Node.js - ZLIB Gunzip returns empty file

I'm just testing zlib in Node.js but quickly ran into strange results. Here is my script (inspired by the example in the Node.js manual, http://nodejs.org/api/zlib.html#zlib_examples):
var zlib = require('zlib'),
    fs = require('fs'),
    inp1 = fs.createReadStream('file.txt'),
    out1 = fs.createWriteStream('file.txt.gz'),
    inp2 = fs.createReadStream('file.txt.gz'),
    out2 = fs.createWriteStream('output.txt');
inp1.pipe(zlib.createGzip()).pipe(out1); /* Compress to a .gz file */
inp2.pipe(zlib.createGunzip()).pipe(out2); /* Uncompress the .gz file */
Before executing the script, I created a file called file.txt and filled it with some sample text (say, a Lorem Ipsum).
The script successfully creates the .gz file, which I can unzip from the Finder (I'm on Mac OS X), but the uncompressed output.txt file is empty.
Why? Do you have any idea?

Node streams are asynchronous, so both of your pipelines run at the same time. That means that when you initially open inp2, file.txt.gz is still empty, because the other write stream hasn't written anything to it yet.
var zlib = require('zlib') ,
fs = require('fs');
var src = 'file.txt',
zip = 'file.txt.gz',
dst = 'output.txt';
var inp1 = fs.createReadStream(src);
var out1 = fs.createWriteStream(zip);
inp1.pipe(zlib.createGzip()).pipe(out1);
out1.on('close', function(){
var inp2 = fs.createReadStream(zip);
var out2 = fs.createWriteStream(dst);
inp2.pipe(zlib.createGunzip()).pipe(out2);
});

Related

Cannot read File from fs with FileReader

Hi, I am trying to read a file and I am having trouble with the FileReader readAsArrayBuffer function in Node.js.
var fs = require("fs");
var forge = require("node-forge");
var FileReader = require("filereader");
let p12_path = __dirname + "/file.p12";
var p12xxx = fs.readFileSync(p12_path, "utf-8");
var reader = new FileReader();
reader.readAsArrayBuffer(p12xxx);//The problem is here
reader.onloadend = function() {
var arrayBuffer = reader.result;
var arrayUint8 = new Uint8Array(arrayBuffer);
var p12B64 = forge.util.binary.base64.encode(arrayUint8);
var p12Der = forge.util.decode64(p12B64);
var p12Asn1 = forge.asn1.fromDer(p12Der);
............
}
The error:
Error: cannot read as File: "0�6�\.............
You are reading a PKCS#12 file, which is not a text-based format, so no encoding should be specified. As per the fs docs, "If the encoding option is specified then this function returns a string", but because it is mostly a binary file it contains byte sequences that are invalid UTF-8. When you omit the encoding it gives you a Buffer object instead, which is what you most likely want.
According to the npm filereader docs, a file read with fs.readFileSync(p12_path, "utf-8"); needs to actually be UTF-8 encoded, otherwise it cannot be read.
The printed "0�6�\............. shows the file is obviously not UTF-8 and is therefore not readable that way.

How to convert a node gd image to a stream that I can pipe?

I'm using node-gd to process images, but I'd like to do a few things before saving them to the disk. Right now I save the file with the .savePng() and .saveJpeg() functions.
I'd like to convert it to a stream which can be piped to an FS stream.
I tried the module streamifier because it sounds like it would do what I need, but when running the code below, the exported image is unreadable (though the same size as exporting via node-gd).
Here is what I attempted to do:
var gd = require("node-gd");
var fs = require("fs");
const streamifier = require('streamifier');
var inputImage = gd.createFromPng('input.png');
var writeStream = fs.createWriteStream('output.png');
var pngstream = inputImage.pngPtr();
streamifier.createReadStream(pngstream).pipe(writeStream);
Is there something I'm missing?
The PNG pointer first needs to be converted to a Buffer, like so:
var pngstream = Buffer.from(inputImage.pngPtr(), 'binary');

How can I read a JSON file inside a gzip?

There are gzip archives containing JSON files. I need to take each file in turn, do something with its contents, and write the result into another gzip. I realized that I need to use the standard library's createReadStream and zlib.
Well, following the example from https://nodejs.org/api/zlib.html#zlib_examples the following process could be done for a single gzipped file:
var unzip = zlib.createUnzip();
var fs = require('fs');
var inp = fs.createReadStream('input.json.gz');
var out = fs.createWriteStream('output.json');
inp.pipe(unzip).pipe(out);
However, if there are multiple files within a gzip, I am not sure how one would go about it. I could not find documentation for that; the only way I found to unzip multiple files from a gzip file in Node is if they were tar'd first. A process for unzipping tar.gz in Node can be found here. Following that example, one could do something like this:
var unzip = zlib.createUnzip();
var fs = require('fs');
var tar = require('tar-fs');
var inp = fs.createReadStream('input.tar.gz');
var out = './output'; // output directory
inp.pipe(unzip).pipe(tar.extract(out));

Node.js piping gzipped streams in CSV module

I have a gzipped CSV file that I would like to read, perform some transformations, and write back somewhere gzipped. I am using the node-csv module for CSV transformations.
A simplified version of the code looks like this:
// dependencies
var fs = require('fs'),
zlib = require('zlib'),
csv = require('csv'); // http://www.adaltas.com/projects/node-csv/
// filenames
var sourceFileName = process.argv[2] || 'foo.csv.gz',
    targetFileName = process.argv[3] || 'bar.csv.gz';
// streams
var reader = fs.createReadStream(sourceFileName),
writer = fs.createWriteStream(__dirname + '\\' + targetFileName),
gunzip = zlib.createGunzip(),
gzip = zlib.createGzip();
csv()
.from.stream( reader.pipe(gunzip) )
.to.stream( gzip.pipe(writer) ) // <-- the output stream
.transform( function(row) {
// some operation here
return row;
});
The problem is that this code effectively writes a file with the specified name, although not gzipped, i.e. if the .gz extension is removed, the file can be opened as a regular CSV.
The question then is, how can the csv().to.stream() be passed an output stream that gzips the data and pipes it to a writer?
Thanks!
You're piping the csv output straight to the writer, because .pipe returns its argument for chaining.
You need to change:
.to.stream( gzip.pipe(writer) ) // <-- the output stream
To:
.to.stream( gzip ) // <-- the output stream
. . .
gzip.pipe(writer);

How to pipe one readable stream into two writable streams at once in Node.js?

The goal is to:
Create a file read stream.
Pipe it to gzip (zlib.createGzip())
Then pipe the read stream of zlib output to:
1) HTTP response object
2) and writable file stream to save the gzipped output.
Right now I can get as far as step 3.1:
var gzip = zlib.createGzip(),
sourceFileStream = fs.createReadStream(sourceFilePath),
targetFileStream = fs.createWriteStream(targetFilePath);
response.setHeader('Content-Encoding', 'gzip');
sourceFileStream.pipe(gzip).pipe(response);
... which works fine, but I need to also save the gzipped data to a file so that I don't need to regzip every time and be able to directly stream the gzipped data as a response.
So how do I pipe one readable stream into two writable streams at once in Node?
Would sourceFileStream.pipe(gzip).pipe(response).pipe(targetFileStream); work in Node 0.8.x?
Pipe chaining/splitting doesn't work the way you're trying to do it here, sending the output of the first pipe to two different subsequent steps:
sourceFileStream.pipe(gzip).pipe(response);
However, you can pipe the same readable stream into two writable streams, e.g.:
var fs = require('fs');
var source = fs.createReadStream('source.txt');
var dest1 = fs.createWriteStream('dest1.txt');
var dest2 = fs.createWriteStream('dest2.txt');
source.pipe(dest1);
source.pipe(dest2);
I found that zlib returns a readable stream which can later be piped into multiple other streams. So I did the following to solve the problem:
var sourceFileStream = fs.createReadStream(sourceFile);
// Even though we could chain like
// sourceFileStream.pipe(zlib.createGzip()).pipe(response);
// we need a stream with a gzipped data to pipe to two
// other streams.
var gzip = sourceFileStream.pipe(zlib.createGzip());
// This will pipe the gzipped data to response object
// and automatically close the response object.
gzip.pipe(response);
// Then I can pipe the gzipped data to a file.
gzip.pipe(fs.createWriteStream(targetFilePath));
You can use the "readable-stream-clone" package:
const fs = require("fs");
const ReadableStreamClone = require("readable-stream-clone");
const readStream = fs.createReadStream('text.txt');
const readStream1 = new ReadableStreamClone(readStream);
const readStream2 = new ReadableStreamClone(readStream);
const writeStream1 = fs.createWriteStream('sample1.txt');
const writeStream2 = fs.createWriteStream('sample2.txt');
readStream1.pipe(writeStream1);
readStream2.pipe(writeStream2);
