How to download a zip and directly extract it via Node.js?

I was wondering if it is possible to use https.get() from the Node standard library to download a zip and directly extract it into a subfolder.
I have found many solutions that download the zip first and extract it afterwards. But is there a way to do it directly?
This was my attempt:
const zlib = require("node:zlib");
const fs = require("node:fs");
const { pipeline } = require("node:stream");
const https = require("node:https");

const DOWNLOAD_URL = "https://downloadserver.com/data.zip";

const unzip = zlib.createUnzip();
const output = fs.createWriteStream("folderToExtract");

https.get(DOWNLOAD_URL, (res) => {
  pipeline(res, unzip, output, (error) => {
    if (error) console.log(error);
  });
});
But I get this error:
Error: incorrect header check
at Zlib.zlibOnError [as onerror] (node:zlib:189:17) {
errno: -3,
code: 'Z_DATA_ERROR'
}
I am curious, is this even possible?

Most unzippers start at the end of the zip file, reading the central directory there and using that to find the entries to unzip. This requires having the entire zip file accessible at the start.
What you'd need is a streaming unzipper, which starts at the beginning of the zip file. You can try unzip-stream and see if it meets your needs.
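For example, here is a minimal sketch of the question's download rewritten with unzip-stream, assuming its documented Extract({ path }) interface (install with npm install unzip-stream):

const https = require("node:https");
const unzipper = require("unzip-stream");

const DOWNLOAD_URL = "https://downloadserver.com/data.zip";

https.get(DOWNLOAD_URL, (res) => {
  // Extract() parses zip entries as they arrive and writes them
  // under the given directory, so the archive is never buffered
  // to a temporary file first.
  res.pipe(unzipper.Extract({ path: "folderToExtract" }))
    .on("error", (err) => console.error(err));
});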

I think this is similar to Simplest way to download and unzip files in Node.js cross-platform?
An answer in that discussion uses the same package.
You're getting the error probably because zlib only supports gzip streams, not zip archives.

Related

decompressing a gz file using zlib package in NodeJS

I am trying to decompress a .gz file using zlib, one of the built-in libraries of Node.js. But while decompressing, it throws an incorrect header check error. I am using the following code to decompress.
import fs from 'fs';
import zlib from 'zlib';
const rStream = fs.createReadStream('./path-to-gz-file');
const wStream = fs.createWriteStream('./path-to-gz-file'.replace('.gz', ''));

rStream
  .pipe(zlib.createGunzip())
  .on('error', err => { console.log(err); })
  .pipe(wStream);
One solution found online suggested changing the encoding of the read and write streams to binary, but that doesn't work either. I've tried almost every solution to this issue that is available online, and nothing works.
If anyone has any further questions, please let me know and I will clarify as soon as possible.
PS: The same file is extracted as expected when decompressed with gzip, the default compression utility on Linux, from the command line.
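For what it's worth, "incorrect header check" from zlib usually means the data is not actually a gzip stream. A quick diagnostic (a hypothetical sketch, not part of the question) is to compare the file's first two bytes against the gzip magic number 0x1f 0x8b:

import { openSync, readSync, closeSync } from 'fs';

// A real gzip file starts with the magic bytes 0x1f 0x8b; if they
// are absent, createGunzip() fails with "incorrect header check".
const fd = openSync('./path-to-gz-file', 'r');
const header = Buffer.alloc(2);
readSync(fd, header, 0, 2, 0);
closeSync(fd);

console.log(header[0] === 0x1f && header[1] === 0x8b
  ? 'looks like gzip'
  : 'not a gzip stream');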

How to generate zip and put files into zip using stream in Node js

Currently, I am trying to make a zip file (or any other compressed format) containing a few files.
I thought it would work with the adm-zip module,
but I found out that adm-zip puts files into the zip via buffers.
It takes a lot of memory when the files are very large.
As a result, my server stopped working.
Below is what I'd done.
var zip = new AdmZip();
zip.addLocalFile('../largeFile', 'dir1'); //put largeFile into /dir1 of zip
zip.addLocalFile('../largeFile2', 'dir1');
zip.addLocalFile('../largeFile3', 'dir1/dir2');
zip.writeZip(/*target file name*/ `./${threadId}.zip`);
Is there any way to solve this?
To solve the memory issue, the best practice is to use streams rather than loading whole files into memory, for example:
import {
  createReadStream,
  createWriteStream
} from 'fs'
import { createGzip } from 'zlib'

const [, , src, dest] = process.argv

const srcStream = createReadStream(src)
const gzipStream = createGzip()
const destStream = createWriteStream(dest)

srcStream
  .pipe(gzipStream)
  .pipe(destStream)
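Note that gzip compresses a single stream; it does not produce a .zip archive with multiple entries as the question asks. For that, a streaming archive library such as archiver is one option (it is not part of the answer above, so treat this as a sketch of one possible approach):

import { createWriteStream } from 'fs'
import archiver from 'archiver' // npm install archiver

const output = createWriteStream('./archive.zip')
const archive = archiver('zip', { zlib: { level: 9 } })

archive.on('error', err => { throw err })
archive.pipe(output)

// entries are streamed from disk rather than buffered in memory
archive.file('../largeFile', { name: 'dir1/largeFile' })
archive.file('../largeFile2', { name: 'dir1/largeFile2' })
archive.file('../largeFile3', { name: 'dir1/dir2/largeFile3' })

archive.finalize()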

Electron app createWriteStream throwing ENOENT error

I'm trying to download files to the filesystem in an electron app. My code, in the main thread, looks like this:
const dir = `${__dirname}/media`;
if (!fs.existsSync(dir)) {
  fs.mkdirSync(dir);
}

const file = fs.createWriteStream(`${dir}/${name}`);
file.on("open", function() {
  const request = http.get(url, function(response) {
    response.pipe(file);
    response.on('end', function() {
      file.close();
      ...
    });
  });
  request.on('error', function(err) {
    ...
  });
});
This works when running in development using electron . But after I build it with electron-builder, I get the error in an alert:
Uncaught Exception:
Error: ENOENT, media/uploads_2016_02_BASF_Holistic_Program.jpg not found in /Users/nicholasstephan/Desktop/XXXXXXX/dist/Mac/XXXXXX.app/Contents/Resources/app.asar
at notFoundError (ELECTRON_ASAR.js:109:19)
at Object.module.(anonymous function) [as open] (ELECTRON_ASAR.js:209:16)
at WriteStream.open (fs.js:1890:6)
at new WriteStream (fs.js:1876:10)
at Object.fs.createWriteStream (fs.js:1831:10)
at next (/Users/nicholasstephan/Desktop/XXXXXXXX/dist/Mac/XXXXXXXX.app/Contents/Resources/app.asar/media.js:19:18)
at /Users/nicholasstephan/Desktop/XXXXXXXX/dist/Mac/XXXXXXXX.app/Contents/Resources/app.asar/media.js:52:4
...
The media.js line 19 referred to here is the const file = fs.createWriteStream(`${dir}/${name}`); line in the code above.
I've tried the solutions offered in about a dozen other similar stackoverflow answers, but none have fixed the problem.
What's going on here?
Thanks.
The built Electron app uses the Asar format. An asar archive is really just one big file, though Electron lets you read from it as if it were a standard directory.
I presume (though I have not seen it explicitly documented) that it is not possible to write into an asar archive with the fs functions. In any case, there are almost certainly more appropriate locations to write data.
Try writing to a different path. Electron provides a number of useful paths using app.getPath(name) so you could for example write to the userData directory which holds configuration files for your app.
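For example, here is a minimal sketch of the question's setup writing under userData instead of __dirname (name and url are taken from the original code):

const { app } = require('electron');
const path = require('path');
const fs = require('fs');
const http = require('http');

// userData lives outside the read-only app.asar, so writes succeed
// in the packaged app as well as in development.
const dir = path.join(app.getPath('userData'), 'media');
if (!fs.existsSync(dir)) {
  fs.mkdirSync(dir, { recursive: true });
}

const file = fs.createWriteStream(path.join(dir, name));
http.get(url, (response) => response.pipe(file));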

gunzip partials read from read-stream

I use Node.JS to fetch files from my S3 bucket.
The files over there are gzipped (gz).
I know that the contents of each file is composed by lines, where each line is a JSON of some record that failed to be put on Kinesis.
Each file consists of ~12K such records, and I would like to be able to process the records while the file is being downloaded.
If the file was not gzipped, that could be easily done using streams and readline module.
So, the only thing that stopping me from doing this is the gunzip process which, to my knowledge, needs to be executed on the whole file.
Is there any way of gunzipping a partial of a file?
Thanks.
EDIT 1: (bad example)
Trying what @Mark Adler suggested:
const fileStream = s3.getObject(params).createReadStream();
const lineReader = readline.createInterface({ input: fileStream });

lineReader.on('line', line => {
  const gunzipped = zlib.gunzipSync(line);
  console.log(gunzipped);
});
I get the following error:
Error: incorrect header check
at Zlib._handle.onerror (zlib.js:363:17)
Yes. node.js has a complete interface to zlib, which allows you to decompress as much of a gzip file at a time as you like.
A working example that solves the problem in the code above:
const fileStream = s3.getObject(params).createReadStream().pipe(zlib.createGunzip());
const lineReader = readline.createInterface({ input: fileStream });

lineReader.on('line', gunzippedLine => {
  console.log(gunzippedLine);
});

How can I compress a file into .rar using nodejs?

I have a file named mytext.txt and I'd like to compress it to archive.rar. How can I do this in Node.js?
I've found nothing similar for rar, only zip.
Find a rar command-line utility that you can execute, e.g.
$ rar a compressed.rar myfile.dat
Node.js can do command line calls. (See child_process.exec)
Give the normal command to the exec function, and it should get the job done.
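A minimal sketch, assuming the rar binary is installed and on PATH:

const { exec } = require('child_process');

// 'a' adds mytext.txt to archive.rar, creating the archive if needed
exec('rar a archive.rar mytext.txt', (err, stdout, stderr) => {
  if (err) return console.error(stderr || err);
  console.log('archive.rar created');
});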
For compressing a single file, the zlib module can be very useful (note that it produces a gzip file, not a zip archive).
(function () {
  'use strict';

  var zlib = require('zlib');
  var gzip = zlib.createGzip();
  var fs = require('fs');

  var inp = fs.createReadStream('mytext.txt');
  var out = fs.createWriteStream('mytext.txt.gz');

  inp.pipe(gzip).pipe(out);
}());
Unfortunately, Node.js doesn't natively support rar compression/decompression. I was frustrated with this too, so I created a module called "super-winrar", making it super easy to deal with rar files in Node.js :)
check it out: https://github.com/KiyotakaAyanokojiDev/super-winrar
Example creating a file "archive.rar" and appending the "mytext.txt" file:
const Rar = require('super-winrar');

// create a Rar instance with a file path (created if it doesn't exist)
const rar = new Rar('archive.rar');

// handle errors, otherwise it will throw an exception!
rar.on('error', err => console.log(err.message));

rar.once('ready', async () => {
  await rar.append(['mytext.txt']);
});
