Getting the file in busboy to upload to firebase storage - node.js

I am using busboy in Node to upload a file to Firebase Storage. But every time I send the POST request with a file, it says:
{
  "e": {
    "code_": "storage/invalid-argument",
    "message_": "Firebase Storage: Invalid argument in `put` at index 0: Expected Blob or File.",
    "serverResponse_": null,
    "name_": "FirebaseError"
  }
}
I can't figure out how to send the file using busboy. My code snippet is:
export const uploadAvatar = (req: any, res: any) => {
  const busboy = new BusBoy({ headers: req.headers });
  let filepath: any;
  let imgToBeUploaded: any;
  busboy.on(
    'file',
    (
      fieldname: string,
      file: any,
      filename: string,
      encoding: string,
      mimetype: string
    ) => {
      if (mimetype !== 'image/jpeg' && mimetype !== 'image/png') {
        return res.status(400).json({
          error: 'Wrong image format',
        });
      }
      const imgExt: string = filename.split('.')[
        filename.split('.').length - 1
      ];
      const imgName = `avatar_${req.user.userName}.${imgExt}`;
      filepath = path.join(os.tmpdir(), imgName);
      // modifiedUrl = `avatar_${req.user.userName}_200x200.${imgExt}`;
      file.pipe(fs.createWriteStream(filepath));
      imgToBeUploaded = { filepath, mimetype, file };
    }
  );
  busboy.on('finish', async () => {
    const storageRef = storage.ref();
    try {
      const uploadTask = await storageRef.put(imgToBeUploaded.filepath);
      console.log(`UploadTask : ${uploadTask}`);
      return res.json('File uploaded');
    } catch (e) {
      return res.status(400).json({ e });
    }
  });
  busboy.end(req.rawBody);
};
The console.log of 'file' returns the location in tmpdir where the file is stored.
Please help me figure out how to get busboy to return the file, which I can pass as the argument to storageRef.put().

For anyone who's here looking for an answer... I had this problem for over a week and kept getting Error: TypeError: Cannot read properties of undefined (reading 'byteLength'). Try doing
storageRef.put(fs.readFileSync(imgToBeUploaded.filepath))
It will actually read the data from the temporary file on your local machine into a Buffer and send it on to Firebase Storage.
It might also help to console.log your fs.statSync(imgToBeUploaded.filepath) to make sure the file was actually written. Check the size to make sure it's in the expected range for your image.

Related

Express JS API req.body shows buffer data

I created an API below:
app.post("/categories", async (req, res) => {
  console.log(`req.body: ${JSON.stringify(req.body)}`);
  console.log(`req.body.title: ${JSON.stringify(req.body.title)}`);
  console.log(`req.files: ${JSON.stringify(req.files)}`);
  res.json({});
});
Where the data passed is:
{
  "title": "Video Title",
  "description": "Video Description",
  "thumbnail": [object File],
  "video": [object File]
}
The data passed is powered by VueJS and Axios:
methods: {
  async createCategory() {
    const formData = new window.FormData();
    formData.set("title", this.category.title);
    formData.set("description", this.category.description);
    formData.set("thumbnail", this.thumbnail);
    formData.set("video", this.video);
    await $this.axios.post("clothing/v1/categories/", formData, {
      headers: { "Content-Type": "multipart/form-data" },
    });
  }
}
However, the data shown in req.body is:
req.body: {"type":"Buffer","data":[45,45,45,45,45,45,87,101,98,75,105,116,70,111,114,109,66,111,117,110,100,97,114,121,121,104,112,52,54,97,82,89,68,121,77,82,57,66,52,110,13,10,67,111,110,116,101,110,116,45,68,105,115,112,111,115,105,116,105,111,110,58,32,102,111,114,109,45,100,97,116,97,59,32,110,97,109,101,61,34,116,105,116,108,101,34,13,10,13,10,86,105,100,101,111,32,84,105,116,108,101,13,10,45,45,45,45,45,45,87,101,98,75,105,116,70,111,114,109,66,111,117,110,100,97,114,121,121,104,112,52,54,97,82,89,68,121,77,82,57,66,52,110,13,10,67,111,110,116,101,110,116,45,68,105,115,112,111,115,105,116,105,111,110,58,32,102,111,114,109,45,100,97,116,97,59,32,110,97,109,101,61,34,100,101,115,99,114,105,112,116,105,111,110,34,13,10,13,10,86,105,100,101,111,32,68,101,115,99,114,105,112,116,105,111,110,13,10,45,45,45,45,45,45,87,101,98,75,105,116,70,111,114,109,66,111,117,110,100,97,114,121,121,104,112,52,54,97,82,89,68,121,77,82,57,66,52,110,13,10,67,111,110,116,101,110,116,45,68,105,115,112,111,115,105,116,105,111,110,58,32,102,111,114,109,45,100,97,116,97,59,32,110,97,109,101,61,34,116,104,117,109,98,110,97,105,108,34,13,10,13,10,91,111,98,106,101,99,116,32,70,105,108,101,93,13,10,45,45,45,45,45,45,87,101,98,75,105,116,70,111,114,109,66,111,117,110,100,97,114,121,121,104,112,52,54,97,82,89,68,121,77,82,57,66,52,110,13,10,67,111,110,116,101,110,116,45,68,105,115,112,111,115,105,116,105,111,110,58,32,102,111,114,109,45,100,97,116,97,59,32,110,97,109,101,61,34,118,105,100,101,111,34,13,10,13,10,91,111,98,106,101,99,116,32,70,105,108,101,93,13,10,45,45,45,45,45,45,87,101,98,75,105,116,70,111,114,109,66,111,117,110,100,97,114,121,121,104,112,52,54,97,82,89,68,121,77,82,57,66,52,110,45,45,13,10]}
I am hoping that I can retrieve the passed data inside my API as something like req.body: {"title":"Example","description":"example"}, as I will use this data to save to Firestore and upload the files to Cloud Storage.
NOTE:
I tried using multer but got the error below:
> return fn.apply(this, arguments);
> ^
>
> TypeError: Cannot read properties of undefined (reading 'apply')
> at Immediate.<anonymous> (/Users/adminadmin/Desktop/projects/dayanara-environments/dayanara-clothing-api/functions/node_modules/express/lib/router/index.js:641:15)
> at processImmediate (node:internal/timers:468:21)
I did not mention that I am developing Node.js with Google Cloud Functions, and only in local and testing development.
The error below always shows whenever there is any kind of error in my code.
> return fn.apply(this, arguments);
> ^
>
> TypeError: Cannot read properties of undefined (reading 'apply')
> at Immediate.<anonymous> (/Users/adminadmin/Desktop/projects/dayanara-environments/dayanara-clothing-api/functions/node_modules/express/lib/router/index.js:641:15)
> at processImmediate (node:internal/timers:468:21)
As for the multipart, I used busboy like below:
app.post("/categories", (req, res) => {
  let writeResult;
  const storageRef = admin.storage().bucket(`gs://${storageBucket}`);
  const busboy = Busboy({headers: req.headers});
  const tmpdir = os.tmpdir();

  // This object will accumulate all the fields, keyed by their name
  const fields = {};

  // This object will accumulate all the uploaded files, keyed by their name.
  const uploads = {};

  // This code will process each non-file field in the form.
  busboy.on('field', (fieldname, val) => {
    /**
     * TODO(developer): Process submitted field values here
     */
    console.log(`Processed field ${fieldname}: ${val}.`);
    fields[fieldname] = val;
  });

  const fileWrites = [];

  // This code will process each file uploaded.
  busboy.on('file', (fieldname, file, {filename}) => {
    // Note: os.tmpdir() points to an in-memory file system on GCF
    // Thus, any files in it must fit in the instance's memory.
    console.log(`Processed file ${filename}`);
    const filepath = path.join(tmpdir, filename);
    uploads[fieldname] = filepath;

    const writeStream = fs.createWriteStream(filepath);
    file.pipe(writeStream);

    // File was processed by Busboy; wait for it to be written.
    // Note: GCF may not persist saved files across invocations.
    // Persistent files must be kept in other locations
    // (such as Cloud Storage buckets).
    const promise = new Promise((resolve, reject) => {
      file.on('end', () => {
        writeStream.end();
      });
      writeStream.on('finish', resolve);
      writeStream.on('error', reject);
    });
    fileWrites.push(promise);
  });

  // Triggered once all uploaded files are processed by Busboy.
  // We still need to wait for the disk writes (saves) to complete.
  busboy.on('finish', async () => {
    console.log('finished busboy');
    await Promise.all(fileWrites);

    /**
     * TODO(developer): Process saved files here
     */
    for (const file in uploads) {
      const filePath = uploads[file];
      const name = fields.name.replaceAll(' ', '-');
      const _filePath = filePath.split('/');
      const fileName = _filePath[_filePath.length - 1];
      const destFileName = `${name}/${fileName}`;
      // eslint-disable-next-line no-await-in-loop
      const uploaded = await storageRef.upload(filePath, {
        destination: destFileName
      });
      const _file = uploaded[0];
      const bucketFile = "https://firebasestorage.googleapis.com/v0/b/" + storageBucket + "/o/" + encodeURIComponent(_file.name) + "?alt=media";
      fields[file] = bucketFile;
    }

    writeResult = await admin
      .firestore()
      .collection(collection)
      .add({
        name: fields.name,
        description: fields.description,
        timestamp: admin.firestore.Timestamp.now(),
        thumbnail: fields.thumbnail,
        video: fields.video
      });
    const written = await writeResult.get();
    res.json(written.data());
  });

  busboy.end(req.rawBody);
});
Then I needed to change how I pass formData from my VueJS and Axios, replacing the model with refs on my file data. (I only needed to use a model in Django, so I thought it would be the same in ExpressJS.)
methods: {
  async createCategory() {
    const formData = new window.FormData();
    const thumbnail = this.$refs.thumbnail;
    const video = this.$refs.video;
    formData.set("name", this.category.name);
    formData.set("description", this.category.description);
    formData.set("thumbnail", thumbnail.files[0]);
    formData.set("video", video.files[0]);
    await $this.axios.post("clothing/v1/categories/", formData, {
      headers: { "Content-Type": "multipart/form-data" },
    });
  }
}
After the changes above, I can finally send multipart/form-data properly. The resources below helped me a lot:
https://cloud.google.com/functions/docs/samples/functions-http-form-data#functions_http_form_data-nodejs
Handling multipart/form-data POST with Express in Cloud Functions

Smooch - create attachments from buffer

I'm trying to create an image via the smooch-core API.
I have an image as a Buffer (base64), and I tried something like this:
smoochClient.attachments
  .create({
    appId: appId,
    props: {
      for: 'message',
      access: 'public',
      appUserId: appUserId
    },
    source: myBuffer
  })
  .then(() => {
    console.log('OK');
  }).catch(err => {
    console.log(JSON.stringify(err));
  });
I get this error: "status":413,"statusText":"Payload Too Large"
(When I create this image normally through Postman it works fine, so the image isn't too big; I guess it's because of how the Buffer is sent.)
Does anyone know how I can send a buffer to this API?
Are you able to submit the base64 data directly in the postman call?
Reading through the spec here it looks like source should be a filepath/name, and not raw binary data.
The easy way may be to save the base64 data to a[n appropriately encoded] file, then provide that file's path as source
Otherwise I'm not sure I'd go so far as to take apart api_instance.upload_attachment() to feed in the base64 data instead of opening/reading from the specified filename.
I found the following solution: create a temporary file, get its read stream, and send that in source instead of the myBuffer parameter. Here is the code for creating the temporary file:
async getTempFileSource(bufferData) {
  const fs = require("fs");
  // remove the mime type prefix
  if (bufferData.startsWith('data:'))
    bufferData = bufferData.split('base64,')[1];
  // get the file extension
  const type = await require('file-type').fromBuffer(Buffer.from(bufferData, 'base64'));
  if (!type) {
    console.log("getTempFileSource - The buffer data is corrupted", 'red');
    return null;
  }
  // create a temporary file
  const tempFile = require('tmp').fileSync({postfix: '.' + type.ext});
  // append the buffer data to the temp file
  fs.appendFileSync(tempFile.name, Buffer.from(bufferData, 'base64'));
  // create a read stream from the temp file
  const source = fs.createReadStream(tempFile.name);
  // remove the temp file
  tempFile.removeCallback();
  return source;
}
Here is the code for creating the attachment:
return new Promise(async (resolve, reject) => {
  const source = await getTempFileSource(bufferData);
  if (!source)
    resolve(null);
  else {
    session.smoochClient.attachments
      .create({
        appId: appId,
        props: {
          for: 'message',
          access: 'public',
          appUserId: appUserId
        },
        source: source
      })
      .then(res => {
        resolve(res);
      }).catch(err => {
        reject(err);
      });
  }
});

Read data from .xlsx file on S3 using Nodejs Lambda

I'm still new to Node.js and AWS, so forgive me if this is a noob question.
I am trying to read the data from an Excel file (.xlsx). The Lambda function receives the extension of the file type.
Here is my code:
exports.handler = async (event, context, callback) => {
  console.log('Received event:', JSON.stringify(event, null, 2));
  if (event.fileExt === undefined) {
    callback("400 Invalid Input");
  }
  let returnData = "";
  const S3 = require('aws-sdk/clients/s3');
  const s3 = new S3();
  switch (event.fileExt) {
    case "plain":
    case "txt":
      // Extract text
      const params = {Bucket: 'filestation', Key: 'MyTXT.' + event.fileExt};
      try {
        await s3.getObject(params, function(err, data) {
          if (err) console.log(err, err.stack); // an error occurred
          else { // successful response
            returnData = data.Body.toString('utf-8');
            context.done(null, returnData);
          }
        }).promise();
      } catch (error) {
        console.log(error);
        return;
      }
      break;
    case "xls":
    case "xlsx":
      returnData = "Excel";
      // Extract text
      const params2 = {Bucket: 'filestation', Key: 'MyExcel.' + event.fileExt};
      const readXlsxFile = require("read-excel-file/node");
      try {
        const doc = await s3.getObject(params2);
        const parsedDoc = await readXlsxFile(doc);
        console.log(parsedDoc);
      } catch (err) {
        console.log(err);
        const message = `Error getting object.`;
        console.log(message);
        throw new Error(message);
      }
      break;
    case "docx":
      returnData = "Word doc";
      // Extract text
      break;
    default:
      callback("400 Invalid Operator");
      break;
  }
  callback(null, returnData);
};
The text file part works, but the xlsx part makes the function time out.
I did install the read-excel-file dependency and uploaded the zip so that I have access to it.
But the function times out with this message:
"errorMessage": "2020-11-02T13:06:50.948Z 120bfb48-f29c-4e3f-9507-fc88125515fd Task timed out after 3.01 seconds"
Any help would be appreciated! Thanks for your time.
Using the xlsx npm library, here's how we did it, assuming the file is under the root project path:
const xlsx = require('xlsx');
// read your excel file
let readFile = xlsx.readFile('file_example_XLSX_5000.xlsx')
// get first-sheet's name
let sheetName = readFile.SheetNames[0];
// convert sheets to JSON. Best if sheet has a headers specified.
console.log(xlsx.utils.sheet_to_json(readFile.Sheets[sheetName]));
You need to install the xlsx (SheetJS) library in the project:
npm install xlsx
and then import the "read" function into the Lambda, get the S3 object's body, and send it to xlsx like this:
const { read } = require('sheetjs-style');
const aws = require('aws-sdk');
const s3 = new aws.S3({ apiVersion: '2006-03-01' });

exports.handler = async (event) => {
  const bucketName = 'excel-files';
  const fileKey = 'Demo Data.xlsx';

  // Simple GetObject
  let file = await s3.getObject({Bucket: bucketName, Key: fileKey}).promise();
  const wb = read(file.Body);

  const response = {
    statusCode: 200,
    body: JSON.stringify({
      read: wb.Sheets,
    }),
  };
  return response;
};
(of course, you can receive the bucket and filekey from parameters if you send them...)
Very important: use the read (not the readFile) function, and send the Body property (with capital "B") as the parameter.
I changed the timeout to 20 seconds and it works. Only one issue remains: const parsedDoc = await readXlsxFile(doc); wants to receive a string (file path), not a file.
Solved by using the xlsx npm library, using a stream and giving it buffers.
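One likely reason the earlier readXlsxFile(doc) call failed: doc is the whole S3 response object, while the spreadsheet bytes live in its Body property (a Buffer), and that Buffer (or a stream) is what a parser such as xlsx's read() expects. A stdlib-only sketch of the distinction, with the S3 call simulated:

```javascript
// Simulated shape of an aws-sdk v2 getObject() result (no AWS call here)
const s3Response = {
  Body: Buffer.from('PK\x03\x04'), // .xlsx files are zip archives; these are the magic bytes
  ContentType: 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet',
};

// Hand the parser the Buffer, not the wrapper object:
const bytes = s3Response.Body;
console.log(Buffer.isBuffer(bytes), bytes.slice(0, 2).toString()); // true PK
```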

NODE GridFStorage with additional data from request doesn't always show up

Hey guys, I'm confronting very strange behaviour with GridFS while trying to upload a file.
I send, via form data, the file I want to upload plus a code which will be set in the metadata in files.
The files are saved correctly, and the originalname field is always added to the metadata, but the code field, which is a req.body parameter, behaves very strangely.
files.ts
uploadFileFormSubmit(event) {
  const formData = new FormData();
  formData.append('file', event.target.files.item(0));
  formData.append('code', this.courseCode);
  this.fileService.uploadFile(formData).subscribe(res => ....
fileService.ts
uploadFile(data): Observable<GeneralResponse> {
  return this.http.post<GeneralResponse>('/files/uploadFile', data);
}
here is the backend part:
files.js (back-end)
const storage = new GridFsStorage({
  url: dbUrl,
  file: (req, file) => {
    return new Promise((resolve, reject) => {
      crypto.randomBytes(16, (err, buf) => {
        if (err) {
          return reject(err);
        }
        const filename = buf.toString('hex') + path.extname(file.originalname);
        console.log(req.body);
        const code = JSON.parse(JSON.stringify(req.body));
        console.log(code);
        const fileInfo = {
          filename: filename,
          metadata: {
            originalname: file.originalname,
            materialCode: code.code
          },
          bucketName: 'files'
        };
        resolve(fileInfo);
      });
    });
  }
});
As you can see, I parse req.body in order to get my property (I found this solution here, because req.body was [Object: null prototype] { code: 'myCode' }).
For some files this code data is passed, but not always.
Note that there are two console.logs (before and after JSON.parse()):
the first (null) object is an Excel file,
the second is a PDF,
the third is a JPG file,
the fourth is a PNG file.
Maybe it's something to do with the extensions, but I cannot imagine why req.body sometimes gets parsed and the code gets into the metadata, but other times not :/
So what can cause this behaviour? Thanks for the help in advance :D
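One plausible cause (a guess, but consistent with how busboy streams multipart parts in arrival order): GridFsStorage's file callback fires as soon as the file part arrives, so req.body.code is only populated if the 'code' field part precedes the file part in the request. Since FormData serializes entries in insertion order, appending the field before the file on the client makes this deterministic:

```javascript
// Node 18+ ships a global FormData with the same API as the browser's;
// the field/file values here are placeholders.
const formData = new FormData();
formData.append('code', 'CS101');                        // field part first
formData.append('file', new Blob(['...']), 'notes.txt'); // file part last

// Parts are serialized (and parsed server-side) in this order:
console.log([...formData.keys()].join(',')); // code,file
```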

download and untar file then check the content, async await problem, node.js

I am downloading a file in tar format with the request-promise module, then I untar that file with the tar module, using async/await syntax.
const list = new Promise(async (resolve, reject) => {
  const filePath = "somedir/myFile.tar.gz";
  if (!fs.existsSync(filePath)) {
    const options = {
      uri: "http://tarFileUrl",
      encoding: "binary"
    };
    try {
      console.log("download and untar");
      const response = await rp.get(options);
      const file = await fs.createWriteStream(filePath);
      file.write(response, 'binary');
      file.on('finish', () => {
        console.log('wrote all data to file');
        // here is the untar process
        tar.x(
          {
            file: filePath,
            cwd: "lists"
          }
        );
        console.log("extracted");
      });
      file.end();
    } catch(e) {
      reject();
    }
    console.log("doesn't exist");
  }

  // here I check if the file exists; if so, there is no need to download or extract it (the try/catch block)
  // then the array is created, which includes the list content line by line
  if (fs.existsSync(filePath)) {
    const file = await fs.readFileSync("lists/alreadyExtractedFile.list").toString().match(/[^\r\n]+/g);
    if (file) {
      file.map(name => {
        if (name === checkingName) {
          blackListed = true;
          return resolve(blackListed);
        }
      });
    }
    else {
      console.log("err");
    }
  }
});
The console.log output sequence is like so:
download and untar
file doesn't exist
UnhandledPromiseRejectionWarning: Error: ENOENT: no such file or directory, open '...lists/alreadyExtractedFile.list'
wrote all data to file
extracted
So the file lists/alreadyExtractedFile.list is being checked before it's created. My guess is that I am doing some async/await actions wrong; as the console.logs point out, the second checking block somehow runs earlier than the file creation and untarring processes.
Please help me figure out what I am doing wrong.
Your problem is here
const file = await fs.readFileSync("lists/alreadyExtractedFile.list").toString().match(/[^\r\n]+/g);
the readFileSync function doesn't return a promise, so you shouldn't await it:
const file = fs.readFileSync("lists/alreadyExtractedFile.list")
.toString().match(/[^\r\n]+/g);
This should solve the issue
You also need to call resolve inside the new Promise() callback.
If you are writing a local utility, you can use sync methods wherever possible (in fs, tar, etc.).
Here is a small example where a small archive from the Node.js repository is asynchronously downloaded, synchronously written and unpacked, then a file is synchronously read:
'use strict';

const fs = require('fs');
const rp = require('request-promise');
const tar = require('tar');

(async function main() {
  try {
    const url = 'https://nodejs.org/download/release/latest/node-v11.10.1-headers.tar.gz';
    const arcName = 'node-v11.10.1-headers.tar.gz';
    const response = await rp.get({ uri: url, encoding: null });
    fs.writeFileSync(arcName, response, { encoding: null });
    tar.x({ file: arcName, cwd: '.', sync: true });
    const fileContent = fs.readFileSync('node-v11.10.1/include/node/v8-version.h', 'utf8');
    console.log(fileContent.match(/[^\r\n]+/g));
  } catch (err) {
    console.error(err);
  }
})();
