ExcelJS - Add Rows To Existing Worksheet - node.js

I have some xlsx files stored in an s3 bucket that I am trying to update. To do this, I am creating an array of rows from a series of json files (one at a time). I get the rows from each json file as follows.
let worksheetRows = [];
for (let i = 0; i < json.length; i++) {
  worksheetRows.push({ id: json[i]['id'], url: json[i].url.toString(), name: json[i].name });
}
I then download the existing xlsx from s3 and pass the stream (existingFileStream) to the function below.
async function loadWorkbookAndUpdate(existingFileStream, stream, bucket, worksheetRows, s3ExcelKey) {
  const workbook = new ExcelJS.Workbook();
  await workbook.xlsx.read(existingFileStream)
    .then(async () => {
      let worksheetUpdated = workbook.getWorksheet('My Sheet');
      for (const worksheetRow of worksheetRows) {
        // Add a row below the headers for each entry
        worksheetUpdated.addRow([
          worksheetRow.id,
          worksheetRow.url,
          worksheetRow.name,
        ]);
      }
    });
  await workbook.xlsx.write(stream)
    .then(async () => {
      await s3.upload({
        Key: s3ExcelKey,
        Bucket: bucket,
        Body: stream,
        ServerSideEncryption: "AES256",
        ContentType: 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
      }).promise();
    })
    .catch(function (e) {
      console.log(e.message, 'Sorry. The updated master list file (xlsx) for the default images could not be uploaded to s3.');
    });
}
I then try to add the worksheetRows to the existing file stream and upload the updated stream to s3 to be saved as an updated xlsx file. Ninety-five percent of the time it works fine, but for a few of the json files I get the error below. I verified that the json in the offending files was valid, and the xlsx that existingFileStream is generated from does not appear to be corrupt.
I am not using jszip directly, unless ExcelJS has some dependency on it that I am not aware of...
Invoke Error {"errorType":"Error","errorMessage":"Corrupted zip or bug: unexpected signature (\x9B\x4C\xF3\x06, expected \x50\x4B\x03\x04)","stack":["Error: Corrupted zip or bug: unexpected signature (\x9B\x4C\xF3\x06, expected \x50\x4B\x03\x04)"," at ZipEntries.checkSignature (/var/task/node_modules/jszip/lib/zipEntries.js:28:19)"," at ZipEntries.readLocalFiles (/var/task/node_modules/jszip/lib/zipEntries.js:121:18)"," at ZipEntries.load (/var/task/node_modules/jszip/lib/zipEntries.js:258:14)"," at /var/task/node_modules/jszip/lib/load.js:48:24"," at processTicksAndRejections (internal/process/task_queues.js:97:5)"," at async XLSX.load (/var/task/node_modules/exceljs/lib/xlsx/xlsx.js:279:17)"," at async loadWorkbookAndUpdate (/var/task/app.js:176:2)","
Has anyone else run into this issue or have any ideas as to how I can debug this and figure out what's going on?

This turned out to be some random discrepancies in property names in a few of my json files, which caused my code to try to add an empty array of rows to my worksheet. The error was a little confusing because I didn't initially realize that ExcelJS has a dependency on jszip, and the file I was populating the rows from was valid json. The code above did work once I found and fixed the property name issues.
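For anyone who hits the same thing: below is a minimal sketch of the kind of guard that would have surfaced the problem earlier. The property names (id, url, name) mirror the snippet above; adjust them to your own json.
function buildWorksheetRows(json) {
  const worksheetRows = [];
  for (const record of json) {
    // Skip records whose property names don't match what we expect
    if (record.id === undefined || record.url === undefined || record.name === undefined) {
      console.warn('Skipping malformed record:', JSON.stringify(record));
      continue;
    }
    worksheetRows.push({ id: record.id, url: record.url.toString(), name: record.name });
  }
  // Refuse to touch the workbook if nothing valid was found
  if (worksheetRows.length === 0) {
    throw new Error('No valid rows found; refusing to update the workbook.');
  }
  return worksheetRows;
}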

Related

Copying a pdf using node's fs results in a different file

I have the following code, which I expect to copy a pdf, but it doesn't copy it exactly: the file size is off (286KB for the copy vs. the original 202KB) and the copy does not open in a pdf reader. I tried this in other languages and I get the same issue. I get a similar, if not identical, result from opening the original pdf as a text file in VS Code and copying and pasting the contents into a new file. Thank you!
const fs = require('fs');
fs.readFile('./original.pdf', 'utf8', (err, data) => {
  fs.writeFile('./copy.pdf', data, err => {
    console.error(err);
  });
});
EDIT: To clarify, I'm not looking for another approach/library/api, but rather an explanation of why my method does not work and a modification of either the code or the copying and pasting the contents approach. Thank you!
You can use the copyFile method from fs/promises:
import { copyFile } from 'fs/promises';
await copyFile('./original.pdf', './copy.pdf');
You can read more about it in the Node.js fs documentation.
const fs = require('fs');
fs.readFile('./original.pdf', (err, data) => {
  fs.writeFile('./copy.pdf', data, err => {
    console.error(err);
  });
});
Simply delete the encoding from the readFile call and use the variable data from the callback in the writeFile call. A pdf is binary data, and decoding it as UTF-8 replaces byte sequences that are not valid UTF-8, which is why the copy changes size and no longer opens.
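If you want to see the corruption for yourself, here is a quick sketch (any small binary file will do) that compares the raw byte count with the byte count after a UTF-8 decode:
const fs = require('fs');
// Read the same file once as raw bytes and once decoded as UTF-8
const raw = fs.readFileSync('./original.pdf');             // Buffer
const decoded = fs.readFileSync('./original.pdf', 'utf8'); // string
// Invalid UTF-8 sequences are replaced during decoding, so re-encoding
// the string usually yields a different (often larger) byte count
console.log(raw.length, Buffer.byteLength(decoded, 'utf8'));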

Adding files in directory to an array

I am really new to node.js. I need to read .json files from a directory, add them to an array, and return it. I am able to read each file separately by passing its path:
const fs = require("fs");
fs.readFile("./fashion/customer.json", "utf8", (err, jsonString) => {
if (err) {
console.log("Error reading file from disk:", err);
return;
}
try {
const customer = JSON.parse(jsonString);
console.log("Customer address is:", customer.address); // => "Customer address is: Infinity Loop Drive"
} catch (err) {
console.log("Error parsing JSON string:", err);
}
});
But the same fashion folder has multiple json files. I want to add these files to an array and then return it. I tried using readdirSync but that just returned the file names. Is it possible to add json files to an array and return it?
Basically I require an array of this format:
Array[{contents of json file1}, {contents of json file2}, .....]
Any help is appreciated!
Here is a simple solution to your question:
const fs = require("fs");
const jsonFolder = './fashion';
var customerDataArray = [];
fs.readdirSync(jsonFolder).forEach(file => {
  let fileData = JSON.parse(fs.readFileSync(jsonFolder + '/' + file));
  customerDataArray.push(fileData);
});
console.log(customerDataArray);
readdirSync returns an array of all the file names in the directory. You can use forEach to iterate through every item in the array, which in this scenario will be the file names. To read the contents of each file, use readFileSync and build the path from the directory name plus the file name. readFileSync returns a Buffer, which JSON.parse accepts (it is coerced to a string), and the parsed object is then pushed to customerDataArray.
I hope this answers your question!
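If you prefer the promise-based API, here is a sketch of the same idea that also filters the directory listing to .json files; the folder path is the one from the question:
const fs = require('fs/promises');
const path = require('path');

async function readJsonFolder(jsonFolder) {
  const fileNames = await fs.readdir(jsonFolder);
  // Only parse files that actually end in .json
  const jsonFiles = fileNames.filter(name => path.extname(name) === '.json');
  return Promise.all(
    jsonFiles.map(async name =>
      JSON.parse(await fs.readFile(path.join(jsonFolder, name), 'utf8'))
    )
  );
}

// Usage: readJsonFolder('./fashion').then(customers => console.log(customers));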

Trying to create an excel file with ExcelJS but it gives me a corrupted file

The code below gives me a corrupted file. Please help.
exports.testExcelCreation = async function () {
  // construct a streaming XLSX workbook writer with styles and shared strings
  const options = {
    filename: 'assets/Uploads/Reports/TEST/streamed-workbook.xlsx',
    useStyles: true,
    useSharedStrings: true
  };
  const workBook = new ExcelJs.stream.xlsx.WorkbookWriter(options);
  const workSheet = workBook.addWorksheet("sheet 1");
  console.log("Success");
}
I think you forgot to add await workBook.commit(); before console.log("Success");
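For completeness, here is a sketch of the question's function with the missing commit added. The streaming writer only finalizes the zip (including its central directory) on commit, so without it the output is an incomplete archive:
exports.testExcelCreation = async function () {
  const options = {
    filename: 'assets/Uploads/Reports/TEST/streamed-workbook.xlsx',
    useStyles: true,
    useSharedStrings: true
  };
  const workBook = new ExcelJs.stream.xlsx.WorkbookWriter(options);
  const workSheet = workBook.addWorksheet("sheet 1");
  // Flush and finalize the file; without this the xlsx is corrupted
  await workBook.commit();
  console.log("Success");
}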
I faced a similar problem. I was trying to update an excel file asynchronously from multiple places in my code simultaneously. If the file is opened in read mode while it is still open in write mode, it ends up corrupted.
I was stuck on the error below.
Error Error: Corrupted zip or bug: expected 16 records in central dir, got 0
at ZipEntries.readCentralDir (/node_modules/jszip/lib/zipEntries.js:146:23)
at ZipEntries.load (/node_modules/jszip/lib/zipEntries.js:257:14)
at /node_modules/jszip/lib/load.js:48:24
at processTicksAndRejections (node:internal/process/task_queues:96:5)
at async XLSX.load (/node_modules/exceljs/lib/xlsx/xlsx.js:279:17)
at async XLSX.readFile (/node_modules/exceljs/lib/xlsx/xlsx.js:55:24)
I carefully went through my code and found that I had been asynchronously calling the update-excel method multiple times simultaneously. I made the calls synchronous and removed unwanted code that was calling the update-excel method. This fixed my problem.
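One simple way to serialize the updates is to chain every call onto a single promise. scheduleExcelUpdate below is a hypothetical helper, not part of ExcelJS:
// Each update waits for the previous one, so reads never overlap writes
let excelQueue = Promise.resolve();

function scheduleExcelUpdate(updateFn) {
  excelQueue = excelQueue
    .then(() => updateFn())
    .catch(err => console.error('Excel update failed:', err));
  return excelQueue;
}

// Usage: scheduleExcelUpdate(() => updateExcel(rows));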

How to use imagemin to compress images and busboy

I want to use the following library to compress images
https://github.com/imagemin/imagemin
The problem is when the user uploads using a form, how do I plug the file details from the form into the imagemin plugin? For example, if the file form field is called example-image, how do I feed that form field's file to imagemin so that it can compress the image?
I tried:
req is from the express/nodejs req
var filebus = new Busboy({ headers: req.headers }),
    promises = [];
filebus.on('file', function (fieldname, file, filename, encoding, contentType) {
  var fileBuffer = Buffer.alloc(0), // new Buffer(0) is deprecated
      s3ImgPath;
  if (file) {
    file.on('data', function (d) {
      fileBuffer = Buffer.concat([fileBuffer, d]);
    }).on('end', function () {
      Imagemin.buffer(fileBuffer, {
        plugins: [
          imageminMozjpeg(),
          imageminPngquant({ quality: '80' })
        ]
      }).then(function (data) {
        console.log(data[0]);
        if (s3ImgPath) {
          promises.push(this.pushImageToS3(s3ImgPath, data[0].data, contentType));
        }
      });
    }.bind(this));
  }
});
But the problem is I would rather have a buffer of the file that I can upload to S3. I don't want to write the files to a build/images folder. I want to get a buffer for the file, compress it, and upload that buffer to s3. How can I use imagemin to get a buffer of the file uploaded via the html form and upload that to s3?
The documentation for the output parameter shows that it is optional (though admittedly, the function declaration does not make that obvious, which might be confusing).
output
Type: string
Set the destination folder to where your files will be written. If no
destination is specified no files will be written.
Therefore, you can opt out of writing the files to storage and just use the output in memory:
imagemin([file.path], {
  plugins: [
    imageminMozjpeg(),
    imageminPngquant({ quality: '65-80' })
  ]
}).then(files => {
  // upload file to S3
});
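If you want to stay entirely in memory, as the question asks, here is a sketch assuming a version of imagemin that exposes imagemin.buffer (v5+), imagemin-pngquant v8-style quality ranges, and the AWS SDK v2 upload call:
const imagemin = require('imagemin');
const imageminMozjpeg = require('imagemin-mozjpeg');
const imageminPngquant = require('imagemin-pngquant');

async function compressAndUpload(fileBuffer, s3, bucket, key, contentType) {
  // Compress the in-memory buffer; nothing is written to disk
  const compressed = await imagemin.buffer(fileBuffer, {
    plugins: [
      imageminMozjpeg(),
      imageminPngquant({ quality: [0.65, 0.8] })
    ]
  });
  // Upload the compressed buffer straight to S3
  return s3.upload({
    Bucket: bucket,
    Key: key,
    Body: compressed,
    ContentType: contentType
  }).promise();
}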

Check uploaded file extension in Sails js

How can we check the uploaded file extension in Sails.js?
I tried skipper and multer but got no result.
Any suggestions?
You should use the saveAs option for each file before saving.
var md5 = require('md5');
module.exports = {
  testUpload: function (req, res) {
    // setting allowed file types
    var allowedTypes = ['image/jpeg', 'image/png'];
    // skipper default upload directory .tmp/uploads/
    var allowedDir = "../../assets/images";
    // do not define dirname; use the default path
    req.file("uploadFiles").upload({
      saveAs: function (file, cb) {
        var d = new Date();
        var extension = file.filename.split('.').pop();
        // generating a unique filename with extension
        var uuid = md5(d.getMilliseconds()) + "." + extension;
        // separate allowed and disallowed file types
        if (allowedTypes.indexOf(file.headers['content-type']) === -1) {
          // save disallowed files to the default upload path
          cb(null, uuid);
        } else {
          // save allowed files to allowedDir
          cb(null, allowedDir + "/" + uuid);
        }
      }
    }, function whenDone(err, files) {
      return res.json({
        files: files,
        err: err
      });
    });
  }
}
Just get the uploaded files array and check the last chunk of the string after the dot.
req.file('file').upload({
  maxBytes: 2000000,
  dirname: 'uploadFolder'
}, function (error, files) {
  if (error) return sails.log.error(error);
  // You have a files array, so you can do this
  files[0].fd.split('.').pop(); // You get the extension
});
What is going on here? When the upload is finished you get an array of files with their filenames. You can get data from that array and see where each file is located (its full path).
The last step is splitting the string by dots and taking the last item of the resulting array with the pop() method.
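A slightly more robust sketch of the same idea uses Node's built-in path.extname instead of splitting on dots; note that it returns the extension with the leading dot:
const path = require('path');

req.file('file').upload({ maxBytes: 2000000 }, function (error, files) {
  if (error) return sails.log.error(error);
  const extension = path.extname(files[0].fd); // e.g. '.png'
});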
