Node.js file extraction module for both zip and rar?

I am working on a Node.js project that involves extracting archives in multiple formats (zip, rar, and potentially more). I have tried a few Node modules for rar extraction, such as node-unrar, but none of them handles the job well, let alone extraction of both zip and rar. Is there a wrapper module that handles extraction of multiple formats? If not, what is the most robust and easy-to-use Node module for rar extraction?

RAR
unrar-promise
const unrarp = require('unrar-promise');

unrarp
  .extractAll('rar-file-path', 'extract-directory')
  .then(result => {
    cb(null, result);
  })
  .catch(err => {
    cb(err);
  });
7z (the npm unzip package produces unreadable file names when zip entries are encoded as GBK/GB2312)
7zip-min
7za.exe supports only 7z, lzma, cab, zip, gzip, bzip2, Z and tar formats.
const _7z = require('7zip-min');

_7z.unpack('zip-7z-file-path', 'extract-directory', err => {
  if (err) {
    return cb(err);
  }
  cb(null, 'extract-directory');
});

Related

extract 7z file on S3 using node.js

Can someone suggest an npm package for extracting 7z files in Node.js?
I can see some npm packages for ZIP files, but they do not work for 7z.
I basically want to extract a password-protected 7z file on S3 and read the data from it.
Give the node-7z package a try:
npm i node-7z
import Seven from 'node-7z'

// myStream is a Readable stream
const myStream = Seven.extractFull('./archive.7z', './output/dir/', {
  $progress: true
})

myStream.on('data', function (data) {
  doStuffWith(data) //? { status: 'extracted', file: 'extracted/file.txt' }
})

myStream.on('progress', function (progress) {
  doStuffWith(progress) //? { percent: 67, fileCount: 5, file: undefined }
})

myStream.on('end', function () {
  // end of the operation, get the number of folders involved in the operation
  myStream.info.get('Folders') //? '4'
})

myStream.on('error', (err) => handleError(err))
It also supports the password option you were asking about.
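For the password part, node-7z maps an option onto 7-Zip's -p switch; a sketch along these lines, assuming the option name is password as in the node-7z docs and using a hypothetical archive path:

import Seven from 'node-7z'

// Extract a password-protected archive; the password option is forwarded
// to 7z's -p switch.
const protectedStream = Seven.extractFull('./protected.7z', './output/dir/', {
  password: 'my-secret'
})

protectedStream.on('end', () => console.log('done'))
protectedStream.on('error', (err) => handleError(err))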

How can you archive with tar in NodeJS while only storing the subdirectory you want?

Basically I want the equivalent of "How to strip path while archiving with TAR", but using the tar module imported into Node.js. Currently I'm doing this:
const gzip = zlib.createGzip();
const pack = new tar.Pack(prefix="");
const source = Readable.from('public/images/');
const destination = fs.createWriteStream('public/archive.tar.gz');

pipeline(source, pack, gzip, destination, (err) => {
  if (err) {
    console.error('An error occurred:', err);
    process.exitCode = 1;
  }
});
But doing so leaves me with entries like "public/images/a.png" and "public/images/b.png", when what I want is "a.png" and "b.png". How can I strip out the unneeded directories while keeping the files where they are on disk?
You need to change the working directory:
// cwd The current working directory for creating the archive. Defaults to process.cwd().
new tar.Pack({ cwd: "./public/images" });
const source = Readable.from('');
Source: documentation of node-tar
Example: https://github.com/npm/node-tar/blob/main/test/pack.js#L93
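Putting it together, here is a sketch using tar.create() with the gzip, cwd and file options; it assumes the images live under public/images as in the question and that the script runs from the project root. With cwd set, entries are recorded relative to that directory, so the archive contains a.png and b.png rather than public/images/a.png.

const tar = require('tar');

// tar.create (alias tar.c) returns a promise when the `file` option is given.
tar
  .create(
    { gzip: true, cwd: 'public/images', file: 'public/archive.tar.gz' },
    ['.']
  )
  .then(() => console.log('archive written'))
  .catch(err => {
    console.error('An error occurred:', err);
    process.exitCode = 1;
  });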

Bad file descriptor, read, while extracting zip file using node-stream-zip

I have a zip file that has a folder like
1234/pic1.png
1234/pic2.png
1234/data.xlsx
I am trying to extract the spreadsheet (failing that, all files) using node-stream-zip.
const StreamZip = require('node-stream-zip');

const zip = new StreamZip({
  file: path.join(downloadsDir, fileToFind),
  storeEntries: true
});

zip.on('ready', () => {
  if (!fs.existsSync('extracted')) {
    fs.mkdirSync('extracted');
  }
  zip.extract('1234/', './extracted', err => {
    console.log(err);
  });
  zip.close();
});
This produces
EBADF: bad file descriptor, read
The extracted folder contains one of the png files. But when following the guide to extract just the xlsx file, it appears that the xlsx file is the one causing this error.
zip.extract('1234/data.xlsx', './extracted.xlsx', err => {
  console.log(err);
});
Is the problem with the xlsx file? I can open it manually. Is it permissions-related? Node? This particular package?
Your problem is related to zip.close(). You're closing the archive on the same tick as you invoke zip.extract(); the extraction is asynchronous, so the file descriptor is closed before the read finishes, hence EBADF. Move zip.close() into the extract callback.
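A sketch of the corrected handler, assuming the extract callback receives (err, count) as in the node-stream-zip examples:

zip.on('ready', () => {
  if (!fs.existsSync('extracted')) {
    fs.mkdirSync('extracted');
  }
  // Close the archive only after extraction has finished,
  // not on the same tick as zip.extract().
  zip.extract('1234/', './extracted', (err, count) => {
    console.log(err ? 'Extract error' : `Extracted ${count} entries`);
    zip.close();
  });
});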

How to decompress gzip files in Node.js

I want to decompress gzip files in Node.js. I've tried some packages but nothing is working. Can you suggest a package, with sample code, that can decompress gzip files in Node.js?
The gunzip-file node package worked fine for me!
Do:
npm install gunzip-file
Then:
'use strict'

const gunzip = require('gunzip-file')

// 'sitemap.xml.gz' - source file
// 'sitemap.xml'    - destination file
// () => { ... }    - notification callback
gunzip('sitemap.xml.gz', 'sitemap.xml', () => {
  console.log('gunzip done!')
})
Finally, run with Node at your shell.
I would suggest using the built-in zlib.gunzip.
Its signature is zlib.gunzip(buf, callback): the first argument is the compressed data as a Buffer, and the second is a callback that receives two arguments (error and result).
An implementation would be:
const zlib = require('zlib');

zlib.gunzip(raw_data, function (error, result) {
  if (error) throw error;
  // Access the decompressed data here through result, as a Buffer
});
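For file-to-file decompression, the stream API in core zlib avoids buffering the whole archive in memory; a minimal sketch, reusing the file names from the answer above:

const fs = require('fs');
const zlib = require('zlib');
const { pipeline } = require('stream');

// Stream the compressed file through a Gunzip transform into the output file.
pipeline(
  fs.createReadStream('sitemap.xml.gz'),
  zlib.createGunzip(),
  fs.createWriteStream('sitemap.xml'),
  err => {
    if (err) return console.error('gunzip failed:', err);
    console.log('gunzip done!');
  }
);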
You can use the tar.gz npm package to take care of this. Write the following to extract the archive to a specific path:

const targz = require('tar.gz');

targz().extract('/bkp/backup.tar.gz', '/home/myuser')
  .then(function () {
    console.log('Job done!');
  })
  .catch(function (err) {
    console.log('Something is wrong ', err.stack);
  });
For more details, see the tar.gz package documentation on npm.
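That said, the tar.gz package is quite old; a sketch of the same extraction with node-tar (shown in an earlier answer), which detects the gzip layer automatically and uses the paths from the answer above:

const tar = require('tar');

// tar.extract (alias tar.x) returns a promise when `file` is given;
// gzip compression is detected from the file contents.
tar
  .extract({ file: '/bkp/backup.tar.gz', cwd: '/home/myuser' })
  .then(() => console.log('Job done!'))
  .catch(err => console.log('Something is wrong ', err.stack));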
More information on zipping and unzipping files in a directory: https://medium.com/@harrietty/zipping-and-unzipping-files-with-nodejs-375d2750c5e4

JPEG File Encoding and writeFile in Node JS

I'm using http.request to download a JPEG file, and then fs.writeFile to write it out to the hard drive.
None of my JPEG files can be opened, they all show an error (but they do have a file size). I have tried all of the different encodings with fs.writeFile.
What am I messing up in this process?
(The question showed the raw contents of a working file alongside the corrupted one written by fs.writeFile; the dumps are not reproduced here.)
Figured it out: I needed to call res.setEncoding('binary') on the response of my http.request.
Thank you; following the previous response, I was able to save the media correctly:
fs.writeFile(
  filepath + fileName + extension,
  mediaReceived,
  { encoding: "binary" }, // the key part: write with binary encoding
  (err) => {
    if (err) {
      console.log("An error occurred while writing the media file.");
      return console.log(err);
    }
  }
);
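An alternative that avoids the binary string round-trip altogether is to collect the response as Buffer chunks and write the concatenated buffer directly; a minimal sketch with a hypothetical URL and output path:

const fs = require('fs');
const https = require('https');

// Collect the raw response as Buffer chunks; no string encoding is involved,
// so the JPEG bytes are written out unchanged.
https.get('https://example.com/photo.jpg', res => {
  const chunks = [];
  res.on('data', chunk => chunks.push(chunk));
  res.on('end', () => {
    fs.writeFile('photo.jpg', Buffer.concat(chunks), err => {
      if (err) return console.log(err);
      console.log('JPEG saved.');
    });
  });
}).on('error', err => console.log(err));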
