I have a gzip archive that contains JSON files. I need to take each file in turn, process it, and write the result to another gzip. I realized that I need to use the standard library's createReadStream and the zlib module.
Well, following the example from https://nodejs.org/api/zlib.html#zlib_examples, the following could be done for a single gzipped file:
var fs = require('fs');
var zlib = require('zlib');

var unzip = zlib.createUnzip();
var inp = fs.createReadStream('input.json.gz');
var out = fs.createWriteStream('output.json');
inp.pipe(unzip).pipe(out);
However, if there are multiple files within a gzip, I am not sure how one would go about that. I could not find documentation for it; the only way I found to unzip multiple files from a gzip in Node is if they were tar'd first. A process for unzipping a tar.gz in Node can be found here. Following that example, one could do something like this:
var fs = require('fs');
var zlib = require('zlib');
var tar = require('tar-fs');

var unzip = zlib.createUnzip();
var inp = fs.createReadStream('input.tar.gz');
var out = './output'; // output directory
inp.pipe(unzip).pipe(tar.extract(out));
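If the goal is to process each entry (for example, to parse each JSON file) rather than extract everything to disk, the tar-stream package (a lower-level sibling of tar-fs) exposes entries one at a time. A minimal sketch, assuming the archive is a gzipped tar of JSON files:
var fs = require('fs');
var zlib = require('zlib');
var tar = require('tar-stream');

var extract = tar.extract();

extract.on('entry', function (header, stream, next) {
  var chunks = [];
  stream.on('data', function (chunk) { chunks.push(chunk); });
  stream.on('end', function () {
    // each entry is assumed to be a JSON file
    var data = JSON.parse(Buffer.concat(chunks).toString('utf8'));
    console.log(header.name, Object.keys(data));
    next(); // signal that we're ready for the next entry
  });
});

fs.createReadStream('input.tar.gz')
  .pipe(zlib.createGunzip())
  .pipe(extract);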
Related
I'm trying to create a zip file (or any compressed format) containing several files.
I thought it would work with the adm-zip module,
but I found out that adm-zip buffers files in memory when putting them into the zip.
That takes a lot of memory when the files are very large,
and as a result my server stopped working.
Below is what I'd done:
var AdmZip = require('adm-zip');

var zip = new AdmZip();
zip.addLocalFile('../largeFile', 'dir1'); // put largeFile into /dir1 of the zip
zip.addLocalFile('../largeFile2', 'dir1');
zip.addLocalFile('../largeFile3', 'dir1/dir2');
zip.writeZip(/* target file name */ `./${threadId}.zip`);
Is there any solution to solve this situation?
To solve the memory issue, the best practice is to use streams rather than loading whole files into memory. For example:
import { createReadStream, createWriteStream } from 'fs'
import { createGzip } from 'zlib'

const [, , src, dest] = process.argv

const srcStream = createReadStream(src)
const gzipStream = createGzip()
const destStream = createWriteStream(dest)

srcStream
  .pipe(gzipStream)
  .pipe(destStream)
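Note that the example above produces a single gzipped file rather than a multi-file zip archive. If you specifically need a .zip with several entries, a streaming third-party library such as archiver avoids buffering whole files. A minimal sketch, assuming archiver is installed and reusing the paths from the question:
import { createWriteStream } from 'fs'
import archiver from 'archiver'

const output = createWriteStream('./archive.zip')
const archive = archiver('zip', { zlib: { level: 9 } })

archive.on('error', err => { throw err })
output.on('close', () => console.log(`${archive.pointer()} bytes written`))

archive.pipe(output)

// files are streamed from disk, not buffered in memory
archive.file('../largeFile', { name: 'dir1/largeFile' })
archive.file('../largeFile2', { name: 'dir1/largeFile2' })
archive.file('../largeFile3', { name: 'dir1/dir2/largeFile3' })

archive.finalize()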
I'm using node-gd to process images, but I'd like to do a few things before saving them to the disk. Right now I save the file with the .savePng() and .saveJpeg() functions.
I'd like to convert it to a stream which can be piped to an FS stream.
I tried the module streamifier because it sounds like it would do what I need, but when running the code below, the exported image is unreadable (though the same size as exporting via node-gd).
Here is what I attempted to do:
var gd = require("node-gd");
var fs = require("fs");
const streamifier = require('streamifier');
var inputImage = gd.createFromPng('input.png');
var writeStream = fs.createWriteStream('output.png');
var pngstream = inputImage.pngPtr();
streamifier.createReadStream(pngstream).pipe(writeStream);
Is there something I'm missing?
The PNG pointer needs to be converted to a Buffer first, like so:
var pngstream = Buffer.from(inputImage.pngPtr(), 'binary');
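Putting it together with the question's code, a corrected sketch (same file names as above, and assuming the synchronous node-gd API used in the question) might look like this:
var gd = require('node-gd');
var fs = require('fs');
var streamifier = require('streamifier');

var inputImage = gd.createFromPng('input.png');
var writeStream = fs.createWriteStream('output.png');

// pngPtr() returns a binary string, so wrap it in a Buffer before streaming
var pngBuffer = Buffer.from(inputImage.pngPtr(), 'binary');
streamifier.createReadStream(pngBuffer).pipe(writeStream);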
I have a string which is 169 million chars long, which I need to write to a file and then read from another process.
I have read about WriteStream and ReadStream, but how do I write the string to a file when it has no method 'pipe'?
Creating a write stream is a good idea. You can use it like this:
var fs = require('fs');
var wstream = fs.createWriteStream('myOutput.txt');
wstream.write('Hello world!\n');
wstream.write('Another line\n');
wstream.end();
You can call write as many times as you need, with parts of that 169-million-char string. Once you have finished writing the file, you can create a read stream to read chunks of the file.
That said, 169 million chars is not that much (on the order of a few hundred megabytes), so you could probably read and write it all at once and keep the whole file in memory.
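For the reading side in the other process, a minimal sketch:
var fs = require('fs');

var rstream = fs.createReadStream('myOutput.txt', { encoding: 'utf8' });

rstream.on('data', function (chunk) {
  // process each chunk of the string here
  console.log('read ' + chunk.length + ' chars');
});

rstream.on('end', function () {
  console.log('Finished reading the file');
});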
Update: As requested in the comments, here is an example that pipes the stream to gzip on the fly:
var fs = require('fs');
var zlib = require('zlib');

var gzip = zlib.createGzip();
var out = fs.createWriteStream('input.txt.gz');

gzip.pipe(out);
gzip.write('Hello world!\n');
gzip.write('Another line\n');
gzip.end();
This will create a .gz file which, when decompressed, yields a single file of the same name (without the .gz at the end).
This might solve your problem
var fs = require('fs');
var request = require('request');
var stream = request('http://i.imgur.com/dmetFjf.jpg');
var writeStream = fs.createWriteStream('./testimg.jpg');
stream.pipe(writeStream);
Follow the link for more details
http://neethack.com/2013/12/understand-node-stream-what-i-learned-when-fixing-aws-sdk-bug/
If you're trying to avoid a blocking process, i.e. something that prevents you from doing anything else, approaching the problem asynchronously is the best solution (and is why Node.js is good at solving these types of problems). With that said, avoid the fs.*Sync methods, as they are synchronous. fs.writeFile is what I believe you're looking for. Read the docs.
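A minimal sketch; the variable hugeString here is a placeholder standing in for the actual string:
var fs = require('fs');

// placeholder; in practice this would be the 169-million-char string
var hugeString = 'a'.repeat(1000);

fs.writeFile('myOutput.txt', hugeString, function (err) {
  if (err) throw err;
  console.log('File written without blocking the event loop');
});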
I have a file named mytext.txt and I'd like to compress this file to archive.rar. How can I do this in nodejs?
I've found nothing similar for rar, only zip.
Find a rar command line utility that you can execute, e.g.:
$ rar a compressed.rar myfile.dat
Node.js can do command line calls. (See child_process.exec)
Give the normal command to the exec function, and it should get the job done.
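A minimal sketch using child_process.exec, assuming the rar binary is installed and on your PATH:
var exec = require('child_process').exec;

exec('rar a archive.rar mytext.txt', function (err, stdout, stderr) {
  if (err) {
    console.error('rar failed: ' + stderr);
    return;
  }
  console.log(stdout);
});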
For compressing a single file with gzip, the zlib module can be very useful:
(function () {
    'use strict';

    var fs = require('fs');
    var zlib = require('zlib');

    var gzip = zlib.createGzip();
    var inp = fs.createReadStream('mytext.txt');
    var out = fs.createWriteStream('mytext.txt.gz');

    inp.pipe(gzip).pipe(out);
}());
Unfortunately, Node.js doesn't natively support RAR compression/decompression. I was frustrated with this too, so I created a module called "super-winrar" that makes it easy to deal with rar files in Node.js :)
check it out: https://github.com/KiyotakaAyanokojiDev/super-winrar
Example creating a file "archive.rar" and appending the "mytext.txt" file:
const Rar = require('super-winrar');

// create a Rar instance with the file path (the file is created if it doesn't exist)
const rar = new Rar('archive.rar');

// handle errors, otherwise an exception will be thrown
rar.on('error', err => console.log(err.message));

rar.once('ready', async () => {
  await rar.append(['mytext.txt']);
});
The goal is to:
1. Create a file read stream.
2. Pipe it to gzip (zlib.createGzip()).
3. Then pipe the gzipped output to:
3.1) the HTTP response object
3.2) and a writable file stream to save the gzipped output.
So far I can get down to step 3.1:
var gzip = zlib.createGzip(),
sourceFileStream = fs.createReadStream(sourceFilePath),
targetFileStream = fs.createWriteStream(targetFilePath);
response.setHeader('Content-Encoding', 'gzip');
sourceFileStream.pipe(gzip).pipe(response);
... which works fine, but I also need to save the gzipped data to a file, so that I don't need to re-gzip every time and can stream the gzipped data directly as a response.
So how do I pipe one readable stream into two writable streams at once in Node?
Would sourceFileStream.pipe(gzip).pipe(response).pipe(targetFileStream); work in Node 0.8.x?
Pipe chaining/splitting doesn't work the way you're trying to do it here, sending one stream to two different subsequent steps:
sourceFileStream.pipe(gzip).pipe(response).pipe(targetFileStream);
However, you can pipe the same readable stream into two writable streams, e.g.:
var fs = require('fs');
var source = fs.createReadStream('source.txt');
var dest1 = fs.createWriteStream('dest1.txt');
var dest2 = fs.createWriteStream('dest2.txt');
source.pipe(dest1);
source.pipe(dest2);
I found that zlib returns a readable stream which can be later piped into multiple other streams. So I did the following to solve the above problem:
var sourceFileStream = fs.createReadStream(sourceFile);
// Even though we could chain like
// sourceFileStream.pipe(zlib.createGzip()).pipe(response);
// we need a stream with a gzipped data to pipe to two
// other streams.
var gzip = sourceFileStream.pipe(zlib.createGzip());
// This will pipe the gzipped data to response object
// and automatically close the response object.
gzip.pipe(response);
// Then I can pipe the gzipped data to a file.
gzip.pipe(fs.createWriteStream(targetFilePath));
You can use the "readable-stream-clone" package:
const fs = require("fs");
const ReadableStreamClone = require("readable-stream-clone");
const readStream = fs.createReadStream('text.txt');
const readStream1 = new ReadableStreamClone(readStream);
const readStream2 = new ReadableStreamClone(readStream);
const writeStream1 = fs.createWriteStream('sample1.txt');
const writeStream2 = fs.createWriteStream('sample2.txt');
readStream1.pipe(writeStream1);
readStream2.pipe(writeStream2);
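If you'd rather avoid a dependency, a similar split can be sketched with Node's built-in stream.PassThrough, which simply relays whatever is piped into it:
const fs = require('fs');
const { PassThrough } = require('stream');

const readStream = fs.createReadStream('text.txt');

// each PassThrough branch receives its own copy of the data
const branch1 = new PassThrough();
const branch2 = new PassThrough();
readStream.pipe(branch1);
readStream.pipe(branch2);

branch1.pipe(fs.createWriteStream('sample1.txt'));
branch2.pipe(fs.createWriteStream('sample2.txt'));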