I call an API that returns the buffer data of a .zip file. I want to read the buffer data of the files inside that .zip directly from the buffer, without saving the .zip file to disk. Is this possible?
Try the zlib library (it's a core Node.js module - docs: https://nodejs.org/api/zlib.html#zlib). Here is an example taken from the documentation:
const { unzip } = require('node:zlib');

const buffer = Buffer.from('eJzT0yMAAGTvBe8=', 'base64');

unzip(buffer, (err, buffer) => {
  if (err) {
    console.error('An error occurred:', err);
    process.exitCode = 1;
    return; // don't try to read the result buffer after a failure
  }
  console.log(buffer.toString());
});
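Note that zlib's unzip handles zlib/gzip-compressed data, not multi-file .zip archives. If the buffer is a full .zip archive with several entries, a dedicated archive reader is needed; here is a minimal sketch using adm-zip (my assumption - any in-memory zip library would work) that reads the entries straight from the buffer without writing anything to disk:
// assumes: npm install adm-zip
const AdmZip = require('adm-zip');

function readZipBuffer(zipBuffer) {
  const zip = new AdmZip(zipBuffer);
  // each entry exposes its name and its decompressed contents as a Buffer
  return zip.getEntries().map((entry) => ({
    name: entry.entryName,
    data: entry.getData(),
  }));
}

// zipBuffer is the Buffer returned by the API call
// readZipBuffer(zipBuffer).forEach(({ name, data }) => console.log(name, data.length));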
I am trying to write an array of objects into a csv file in node.js. I have the following code:
fs = require('fs');
const data = [{ name: 'John' }, { name: 'Peter' }];

fs.writeFile('test.csv', data, 'utf8', function (err) {
  if (err) {
    console.log('Some error occurred - file either not saved or corrupted file saved.');
  } else {
    console.log('It\'s saved!');
  }
});
However, when I open the saved CSV file it contains only weird Chinese characters. Would anyone have a clue what's going on here?
PS: I am on Windows; Node version is 10.15.0
The data has to be passed as a string - you can use JSON.stringify() to convert JavaScript objects (including arrays) to a string.
https://nodejs.org/api/fs.html#fs_fs_writefile_file_data_options_callback
const fs = require('fs');
const data = [{ name: 'John' }, { name: 'Peter' }];

fs.writeFile('test.csv', JSON.stringify(data), 'utf8', function (err) {
  if (err) {
    console.log('Some error occurred - file either not saved or corrupted file saved.');
  } else {
    console.log('It\'s saved!');
  }
});
Note: given the data you are passing, this wouldn't be a csv file - it would most likely be a JSON file.
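If an actual CSV is needed, the rows can be serialized by hand. A minimal sketch, assuming every object has the same keys and no values contain commas or quotes:
const fs = require('fs');

const data = [{ name: 'John' }, { name: 'Peter' }];

// build a header row from the keys, then one comma-separated line per object
const header = Object.keys(data[0]).join(',');
const rows = data.map((row) => Object.values(row).join(','));
const csv = [header, ...rows].join('\n');

fs.writeFile('test.csv', csv, 'utf8', function (err) {
  if (err) {
    console.log('Could not write test.csv:', err);
  } else {
    console.log('It\'s saved!');
  }
});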
I have a zip file that has a folder like
1234/pic1.png
1234/pic2.png
1234/data.xlsx
I am trying to extract the spreadsheet (failing that, all files) using node-stream-zip.
const StreamZip = require('node-stream-zip');

const zip = new StreamZip({
  file: path.join(downloadsDir, fileToFind),
  storeEntries: true
});

zip.on('ready', () => {
  if (!fs.existsSync('extracted')) {
    fs.mkdirSync('extracted');
  }
  zip.extract('1234/', './extracted', err => {
    console.log(err);
  });
  zip.close();
});
This produces
EBADF: bad file descriptor, read
The extracted folder ends up containing just one of the png files. But when following the guide to extract only the xlsx file, it appears that the xlsx file is the one causing this error.
zip.extract('1234/data.xlsx', './extracted.xlsx', err => {
  console.log(err);
});
Is the problem with the xlsx file? I can open it manually. Is it permissions-related? Node? This particular package?
Your problem is related to zip.close(). You're closing it on the same tick as you're invoking zip.extract().
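A minimal sketch of the fix, using the paths from the question - close the archive inside the extract callback so it stays open until extraction has finished:
zip.on('ready', () => {
  if (!fs.existsSync('extracted')) {
    fs.mkdirSync('extracted');
  }
  zip.extract('1234/', './extracted', err => {
    console.log(err);
    // close only after extraction has completed
    zip.close();
  });
});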
I have an unzipped xlsx file; I edit some of the files inside it in order to generate a new xlsx file containing new data.
On Linux, to recompress the files back into an xlsx, I just open a terminal in the folder where the extracted xlsx contents are and type
find . -type f | xargs zip ../newfile.xlsx
The question now is how can I do this using node.js?
The solution is to compress a direct list of the files contained in the xlsx; for some reason, if we try to compress the folder itself, the resulting file ends up corrupted (most likely because zipping the parent folder puts an extra directory level in front of every entry path, which breaks the xlsx structure).
The code looks like this if you use JSZip:
var fs = require('fs');
var JSZip = require("jszip");
var zip = new JSZip();
var file = [];
file.push("_rels/.rels");
file.push("docProps/core.xml");
file.push("docProps/app.xml");
file.push("docProps/custom.xml");
file.push("[Content_Types].xml");
file.push("xl/_rels/workbook.xml.rels");
file.push("xl/styles.xml");
file.push("xl/pivotTables/_rels/pivotTable3.xml.rels");
file.push("xl/pivotTables/_rels/pivotTable1.xml.rels");
file.push("xl/pivotTables/_rels/pivotTable2.xml.rels");
file.push("xl/pivotTables/pivotTable3.xml");
file.push("xl/pivotTables/pivotTable1.xml");
file.push("xl/pivotTables/pivotTable2.xml");
file.push("xl/workbook.xml");
file.push("xl/worksheets/_rels/sheet2.xml.rels");
file.push("xl/worksheets/_rels/sheet1.xml.rels");
file.push("xl/worksheets/_rels/sheet3.xml.rels");
file.push("xl/worksheets/sheet4.xml");
file.push("xl/worksheets/sheet1.xml");
file.push("xl/worksheets/sheet3.xml");
file.push("xl/worksheets/sheet2.xml");
file.push("xl/sharedStrings.xml");
file.push("xl/pivotCache/_rels/pivotCacheDefinition1.xml.rels");
file.push("xl/pivotCache/pivotCacheDefinition1.xml");
file.push("xl/pivotCache/pivotCacheRecords1.xml");
for (var i = 0; i < file.length; i++) {
  zip.file(file[i], fs.readFileSync("/home/user/xlsx_FILES/" + file[i]));
}

zip.generateAsync({ type: "blob" }).then(function (content) {
  // see FileSaver.js
  saveAs(content, "yourfile.xlsx");
});
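Note that saveAs comes from FileSaver.js and only exists in the browser. In a plain Node.js script the finished archive can be written to disk instead - a minimal sketch using JSZip's nodebuffer output (the output path is just an example):
zip.generateAsync({ type: "nodebuffer" }).then(function (content) {
  // write the finished archive straight to disk instead of using FileSaver.js
  fs.writeFileSync("/home/user/newfile.xlsx", content);
});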
Take a look at archiver, a compression library for Node.js. The docs for the library look comprehensive. The library also lets you append files to archives and take advantage of streaming APIs for appending to and creating new archives.
Here is an example snippet from their docs which shows how to use the library.
// require modules
var fs = require('fs');
var archiver = require('archiver');

// create a file to stream archive data to.
var output = fs.createWriteStream(__dirname + '/example.zip');
var archive = archiver('zip', {
  store: true // Sets the compression method to STORE.
});

// listen for all archive data to be written
output.on('close', function () {
  console.log(archive.pointer() + ' total bytes');
  console.log('archiver has been finalized and the output file descriptor has closed.');
});

// good practice to catch this error explicitly
archive.on('error', function (err) {
  throw err;
});

// pipe archive data to the file
archive.pipe(output);
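The snippet above stops before any entries are added. A minimal continuation (the file and folder names are placeholders) appends the files and finalizes the archive, which is what actually triggers the close event above:
// append a single file from disk, keeping its name inside the archive
archive.file('data.xlsx', { name: 'data.xlsx' });

// append the contents of a local folder under a folder inside the archive
archive.directory('images/', 'images');

// finalize the archive - no more entries can be appended after this
archive.finalize();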
I want to use the following library to compress images
https://github.com/imagemin/imagemin
The problem is: when the user uploads a file using a form, how do I pass the file details from the form to the imagemin plugin? For example, if the file form field is called example-image, how do I feed that form field to imagemin so that it can compress the image?
I tried the following (req is the Express/Node.js request object):
var filebus = new Busboy({ headers: req.headers }),
    promises = [];

filebus.on('file', function (fieldname, file, filename, encoding, contentType) {
  var fileBuffer = new Buffer(0),
      s3ImgPath;

  if (file) {
    file.on('data', function (d) {
      fileBuffer = Buffer.concat([fileBuffer, d]);
    }).on('end', function () {
      Imagemin.buffer(fileBuffer, {
        plugins: [
          imageminMozjpeg(),
          imageminPngquant({ quality: '80' })
        ]
      }).then(function (data) {
        console.log(data[0]);
        if (s3ImgPath) {
          promises.push(this.pushImageToS3(s3ImgPath, data[0].data, contentType));
        }
      });
    }.bind(this));
  }
});
But the problem is that I would rather have a buffer of the file that I can upload to S3. I don't want to write the files to a build/images folder. I want to get a buffer for the file, compress it, and upload that buffer to S3. How can I use imagemin to get a buffer of the file uploaded via an HTML form and upload that to S3?
The documentation for the output parameter shows that it is optional (though admittedly, the function declaration did not, which might be confusing).
output
Type: string
Set the destination folder to where your files will be written. If no
destination is specified no files will be written.
Therefore, you can opt out of writing the files to storage and just use the output in memory:
imagemin([file.path], {
  plugins: [
    imageminMozjpeg(),
    imageminPngquant({ quality: '65-80' })
  ]
}).then(files => {
  // upload file to S3
});
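Each entry in files exposes the compressed output as a Buffer on its data property, so it can go to S3 without ever being written to disk. A rough sketch, where pushImageToS3, s3ImgPath and contentType are the hypothetical helpers and variables from the question:
imagemin([file.path], {
  plugins: [
    imageminMozjpeg(),
    imageminPngquant({ quality: '65-80' })
  ]
}).then(files => {
  // files[0].data is a Buffer containing the compressed image
  return pushImageToS3(s3ImgPath, files[0].data, contentType);
});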
I'm using http.request to download a JPEG file. I am then using fs.writeFile to try to write the JPEG file out to the hard drive.
None of my JPEG files can be opened; they all show an error (but they do have a file size). I have tried all of the different encodings with fs.writeFile.
What am I messing up in this process?
Here's what the working file shows when viewed raw, and what the bad one written by fs.writeFile shows (screenshots not reproduced here).
Figured it out, needed to use res.setEncoding('binary'); on my http.request.
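For context, a minimal sketch of that fix (URL and file name are placeholders): set the response encoding to binary and pass the same encoding to fs.writeFile so the bytes are not mangled by a UTF-8 round trip:
const http = require('http');
const fs = require('fs');

http.get('http://example.com/photo.jpg', (res) => {
  res.setEncoding('binary'); // keep the raw bytes intact in the string chunks
  let body = '';
  res.on('data', (chunk) => {
    body += chunk;
  });
  res.on('end', () => {
    fs.writeFile('photo.jpg', body, 'binary', (err) => {
      if (err) {
        return console.log(err);
      }
      console.log('Image saved.');
    });
  });
});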
Thank you - based on the previous response, I was able to save the media correctly:
fs.writeFile(
  filepath + fileName + extension,
  mediaReceived, // to use with writeFile
  { encoding: "binary" }, // to use with writeFile ***************WORKING
  (err) => {
    if (err) {
      console.log("An error occurred while writing the media file.");
      return console.log(err);
    }
  }
);