Uploading filedata from API Call to MongoDB - node.js

I'm receiving file data from an API call to an external service.
I want to save this file data to my MongoDB, but I was met with the error that the files are too large.
That led me to research GridFS as an extra collection in my MongoDB.
I really can't find anything that solves my issue. I've tried to use multer to upload the data like this:
async function addFileDataToDB(fileData) {
    const storage = new GridFsStorage({
        url: mongoose.connection,
        file: (req, file) => {
            console.log(file.mimetype)
            if (file.mimetype === 'application/pdf') {
                return {
                    bucketName: 'fileBucket'
                };
            } else {
                return null;
            }
        }
    });
    const upload = multer({ storage });
    upload(fileData)
    console.log('YAY! : - )')
}
This doesn't seem like something I can use. If I understand it correctly, I can't use multer to transfer the data received by the endpoint to MongoDB. Multer seems more like something you would use to upload files from a form, etc.
I'm looking for any kind of help to point me in the right direction to upload this file data from the endpoint to a collection in MongoDB.
To clarify, the file data is in the format of a buffer containing bytes, and I'm trying to do this in Node.js/Express.
I'm new to GridFS, keep that in mind.
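For what it's worth, one direction that may be worth exploring is skipping multer entirely (it is built around multipart form uploads) and writing the buffer straight into GridFS with the native driver's GridFSBucket. A minimal sketch, assuming an already-open mongoose connection; the filename 'myFile.pdf' and the function name are placeholders:

const mongoose = require('mongoose');
const { GridFSBucket } = require('mongodb');

async function addFileDataToDB(fileData) {
    // Requires the mongoose connection to already be open.
    const bucket = new GridFSBucket(mongoose.connection.db, { bucketName: 'fileBucket' });

    await new Promise((resolve, reject) => {
        const uploadStream = bucket.openUploadStream('myFile.pdf', {
            contentType: 'application/pdf'
        });
        uploadStream.on('error', reject);
        uploadStream.on('finish', resolve);
        // fileData is the Buffer received from the external API.
        uploadStream.end(fileData);
    });
}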

Related

Writing base64 image to filesystem in React project

I'm successfully converting an image to base64 and sending it from my React frontend.
This is received in 'req.body.imageString'. When I receive it, I remove the unnecessary text at the front of the base64 string using the .pop() method.
However, when I then attempt to write the file to my filesystem to be saved and queried later, it doesn't save anything, despite there being no error and the console log reporting success.
Here's my serverless function I'm using:
module.exports = async (req, res) => {
    let base64Image = req.body.imageString
    let finalImageString = base64Image.split(';base64,').pop()
    fs.writeFile('assets/profilePictures/', finalImageString, { encoding: 'base64' }, function(err) {
        console.log("File successfully written.")
    })
    res.send(200)
}
What I'd like to do is save the file into my 'assets' folder in my React app. So this would be the basic high level structure:
/root
    /api
        myServerlessFunction.js
    /public
        /assets
Any tips would be appreciated!
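For reference, a minimal sketch of the write step under a couple of assumptions: fs.writeFile needs a full path including a filename (a bare directory like 'assets/profilePictures/' will not produce a file), and the err argument should be checked before logging success. The filename 'profile.png' and the public/assets location are placeholders:

const fs = require('fs');
const path = require('path');

module.exports = async (req, res) => {
    const base64Image = req.body.imageString;
    const finalImageString = base64Image.split(';base64,').pop();

    // A full path with a filename is required; a bare directory path creates nothing.
    const target = path.join('public', 'assets', 'profilePictures', 'profile.png');

    fs.writeFile(target, finalImageString, { encoding: 'base64' }, (err) => {
        if (err) {
            console.error(err);
            return res.status(500).send('Could not write file');
        }
        console.log('File successfully written.');
        res.status(200).send('OK');
    });
};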

How to display images of products stored on aws s3 bucket

I was practicing with this tutorial:
https://www.youtube.com/watch?v=NZElg91l_ms&t=1234s
It is working absolutely like a charm for me, but the thing is I am storing images of products in the bucket, and let's say I upload 4 images, they all get uploaded.
But when I am displaying them I get an access denied error, as I am displaying the list and the repeated requests are maybe being detected as spam.
This is how I am trying to fetch them in my React app:
// rest of the data is from a MySQL database (product name, price)
// 100+ products
{ products.map((row) => (
    <React.Fragment key={row.imgurl}>
        <div className="product-hero"><img src={`http://localhost:3909/images/${row.imgurl}`} /></div>
        <div className="text-center">{row.productName}</div>
    </React.Fragment>
)) }
As it fetches 100+ products from the DB and 100 images from AWS, it fails.
Sorry for such a detailed question, but in short: how can I fetch all product images from my bucket?
Note: I am aware that I can get only one image per call, so how can I get all the images one by one in my scenario?
// download code in my app.js
const { uploadFile, getFileStream } = require('./s3')
const app = express()

app.get('/images/:key', (req, res) => {
    console.log(req.params)
    const key = req.params.key
    const readStream = getFileStream(key)
    readStream.pipe(res)
})

// s3 file

// uploads a file to s3
function uploadFile(file) {
    const fileStream = fs.createReadStream(file.path)
    const uploadParams = {
        Bucket: bucketName,
        Body: fileStream,
        Key: file.filename
    }
    return s3.upload(uploadParams).promise()
}
exports.uploadFile = uploadFile

// downloads a file from s3
function getFileStream(fileKey) {
    const downloadParams = {
        Key: fileKey,
        Bucket: bucketName
    }
    return s3.getObject(downloadParams).createReadStream()
}
exports.getFileStream = getFileStream
It appears that your code is sending image requests to your back-end, which retrieves the objects from Amazon S3 and then serves the images in response to the request.
A much better method would be to have the URLs in the HTML page point directly to the images stored in Amazon S3. This would be highly scalable and will reduce the load on your web server.
This would require the images to be public so that the user's web browser can retrieve the images. The easiest way to do this would be to add a Bucket Policy that grants GetObject access to all users.
Alternatively, if you do not wish to make the bucket public, you can instead generate Amazon S3 pre-signed URLs, which are time-limited URLs that provide temporary access to a private object. Your back-end can calculate the pre-signed URL with a couple of lines of code, and the user's web browser will then be able to retrieve private objects from S3 for display on the page.
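With the AWS SDK v2 client already used in the question, generating a pre-signed GET URL only takes a call to s3.getSignedUrl(). A rough sketch (the route name and the 60-second expiry are arbitrary choices):

// s3 file: return a time-limited URL instead of streaming the object
function getSignedImageUrl(fileKey) {
    return s3.getSignedUrl('getObject', {
        Bucket: bucketName,
        Key: fileKey,
        Expires: 60 // seconds the URL stays valid
    });
}
exports.getSignedImageUrl = getSignedImageUrl

// app.js: hand the browser a URL that points straight at S3
app.get('/image-url/:key', (req, res) => {
    res.json({ url: getSignedImageUrl(req.params.key) });
});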
I did similar S3 image handling while building my blog's image upload functionality, but I did not use getFileStream() to upload my image.
Because nothing should be done until the image file is fully processed, I used fs.readFile(path, callback) instead to read the data.
My way generates Buffer data, but AWS S3 is smart enough to interpret this as an image. (I have only added a suffix to my filename; I don't know how to apply image headers...)
This is the relevant part of my code for reference:
fs.readFile(imgPath, (err, data) => {
    if (err) { throw err }
    // Once the file is read, upload it to AWS S3
    const objectParams = {
        Bucket: 'yuyuichiu-personal',
        Key: req.file.filename,
        Body: data
    }
    S3.putObject(objectParams, (err, data) => {
        // store image link and read image with link
    })
})
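On the "image headers" point: one option is to pass ContentType in the putObject parameters so the object is served with an image MIME type. A sketch of the same call with that parameter added (the 'image/png' value, and falling back to multer's req.file.mimetype, are assumptions about the setup):

const objectParams = {
    Bucket: 'yuyuichiu-personal',
    Key: req.file.filename,
    Body: data,
    // Tell S3 what MIME type to serve the object with
    ContentType: req.file.mimetype || 'image/png'
}
S3.putObject(objectParams, (err, data) => {
    // store image link and read image with link
})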

Cancel File Upload: Multer, MongoDB

I can't seem to find any up-to-date answers on how to cancel a file upload using Mongo, NodeJS & Angular. I've only come across some tutorials on how to delete a file, but that is NOT what I am looking for. I want to be able to cancel the file uploading process by clicking a button on my front-end.
I am storing my files directly in MongoDB in chunks using the Mongoose, Multer & GridFSBucket packages. I know that I can stop a file's uploading process on the front-end by unsubscribing from the subscribable responsible for the upload, but the upload process keeps going in the back-end when I unsubscribe. (Yes, I have double and triple checked: all the chunks keep getting uploaded until the file is fully uploaded.)
Here is my Angular code:
ngOnInit(): void {
    // Upload the file.
    this.sub = this.mediaService.addFile(this.formData).subscribe((event: HttpEvent<any>) => {
        console.log(event);
        switch (event.type) {
            case HttpEventType.Sent:
                console.log('Request has been made!');
                break;
            case HttpEventType.ResponseHeader:
                console.log('Response header has been received!');
                break;
            case HttpEventType.UploadProgress:
                // Update the upload progress!
                this.progress = Math.round(event.loaded / event.total * 100);
                console.log(`Uploading! ${this.progress}%`);
                break;
            case HttpEventType.Response:
                console.log('File successfully uploaded!', event.body);
                this.body = 'File successfully uploaded!';
        }
    },
    err => {
        this.progress = 0;
        this.body = 'Could not upload the file!';
    });
}
**CANCEL THE UPLOAD**
cancel() {
    // Unsubscribe from the upload method.
    this.sub.unsubscribe();
}
Here is my NodeJS (Express) code:
...
// Configure a strategy for uploading files.
const multerUpload = multer({
    // Set the storage strategy.
    storage: storage,
    // Set the size limit for uploaded files to 120 MB.
    limits: { fileSize: 1024 * 1024 * 120 },
    // Set the file filter.
    fileFilter: fileFilter
});

// Add new media to the database.
router.post('/add', [multerUpload.single('file')], async (req, res) => {
    return res.status(200).send();
});
What is the right way to cancel the upload without leaving any chunks in the database?
So I have been trying to get to the bottom of this for 2 days now, and I believe I have found a satisfying solution:
First, in order to cancel the file upload and delete any chunks that have already been uploaded to MongoDB, you need to adjust the fileFilter in your multer configuration so that it detects whether the request has been aborted and the upload stream has ended, and then rejects the upload by throwing an error through fileFilter's callback:
// Adjust what files can be stored.
const fileFilter = function(req, file, callback) {
    console.log('The file being filtered', file)
    req.on('aborted', () => {
        file.stream.on('end', () => {
            console.log('Cancel the upload')
            callback(new Error('Cancel.'), false);
        });
        file.stream.emit('end');
    })
}
NOTE THAT: When canceling a file upload, you must wait for the changes to show up in your database. The chunks that have already been sent to the database will first have to finish uploading before the canceled file gets deleted from the database. This might take a while depending on your internet speed and the number of bytes that were sent before canceling the upload.
Finally, you might want to set up a route in your backend to delete any chunks from files that have not been fully uploaded to the database (due to some error that might have occurred during the upload). In order to do that, you'll need to fetch all the file IDs from your .chunks collection (by following the method specified in this link) and separate the IDs of the files whose chunks have only been partially uploaded to the database from the IDs of the files that have been fully uploaded. Then you'll need to call GridFSBucket's delete() method on those IDs in order to get rid of the redundant chunks. This step is purely optional and for database maintenance reasons.
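A sketch of that optional maintenance step, working on the collections directly rather than through GridFSBucket.delete() (the bucket name 'fileBucket', the function name, and the open native-driver db handle are assumptions):

// Hypothetical cleanup routine: remove chunks whose parent document never
// made it into the .files collection (i.e. interrupted uploads).
async function removeOrphanedChunks(db) {
    const chunksColl = db.collection('fileBucket.chunks');
    const filesColl = db.collection('fileBucket.files');

    // Every files_id referenced by at least one chunk.
    const referencedIds = await chunksColl.distinct('files_id');

    // Keep only the ids with no corresponding .files document.
    const orphanIds = [];
    for (const id of referencedIds) {
        const fileDoc = await filesColl.findOne({ _id: id }, { projection: { _id: 1 } });
        if (!fileDoc) orphanIds.push(id);
    }

    if (orphanIds.length > 0) {
        await chunksColl.deleteMany({ files_id: { $in: orphanIds } });
    }
    return orphanIds;
}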
Try using a try/catch approach. There are two ways it can be done:
1. By calling an API that takes the file currently being uploaded as its parameter, and then, on the backend, performing the steps to delete and clear the chunks that are present on the server.
2. By handling it as an exception: send the file size as a validation. If the backend API has received the file at its full size, keep it; if the received file is smaller because the upload was canceled partway through, do the cleanup steps, taking the ID and clearing that file's chunks from the Mongo database.

Uploading data from firebase functions to firebase storage?

I have a website running with node.js, with the backend running on Firebase Functions. I want to store a bunch of JSON to Firebase Storage. The below snippet works just fine when I'm running on localhost, but when I upload it to Firebase Functions, it says Error: EROFS: read-only file system, open 'export-stock-trades.json'. Anyone know how to get around this?
fs.writeFile(fileNameToReadWrite, JSON.stringify(jsonObjToUploadAsFile), function(err) {
    bucket.upload(fileNameToReadWrite, {
        destination: destinationPath,
    });
    res.send({success: true});
});
I can't tell for sure, since much of the context of your function is missing, but it looks like your function is attempting to write a file to local disk first (fs.writeFile), then upload it (bucket.upload).
On Cloud Functions, code you write only has write access to /tmp, which is os.tmpdir() in Node. Read more about that in the documentation:

The only writeable part of the filesystem is the /tmp directory, which you can use to store temporary files in a function instance. This is a local disk mount point known as a "tmpfs" volume in which data written to the volume is stored in memory. Note that it will consume memory resources provisioned for the function.
This is probably what's causing your code to fail.
Incidentally, if the data you want to upload is in memory, you don't have to write it to a file first as you're doing now. You can instead use file.save() for that.
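For example, if jsonObjToUploadAsFile is already in memory, a minimal sketch with file.save() (reusing the bucket, destinationPath, and res names from the question) could look like this:

// Upload the JSON straight from memory; nothing is written to local disk.
const file = bucket.file(destinationPath);

file.save(JSON.stringify(jsonObjToUploadAsFile), { contentType: 'application/json' })
    .then(() => res.send({ success: true }))
    .catch((err) => {
        console.error(err);
        res.status(500).send({ success: false });
    });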
Another way this could work is to convert the JSON into a buffer and then perform an action like the code snippet below. I wrote an article on how you can do this using Google Cloud Storage, but it works fine with Firebase Storage as well. The only thing you need to change is the "service-account-key.json" file.
The link to the article can be found here: Link to article on medium
const { format } = require('util')
const gc = require('./config/')
const bucket = gc.bucket('all-mighti') // should be your bucket name

/**
 * @param { File } object file object that will be uploaded
 * @description - This function does the following
 * - It uploads a file to the image bucket on Google Cloud
 * - It accepts an object as an argument with the
 *   "originalname" and "buffer" as keys
 */
export const uploadImage = (file) => new Promise((resolve, reject) => {
    const { originalname, buffer } = file
    const blob = bucket.file(originalname.replace(/ /g, "_"))
    const blobStream = blob.createWriteStream({
        resumable: false
    })
    blobStream.on('finish', () => {
        const publicUrl = format(
            `https://storage.googleapis.com/${bucket.name}/${blob.name}`
        )
        resolve(publicUrl)
    })
    .on('error', () => {
        reject(`Unable to upload image, something went wrong`)
    })
    .end(buffer)
})

Advice: flatiron, formidable and aws s3

I'm new to server-side programming with node.js. I'm sticking together a tiny web app with it right now and have the usual startup learning to do. The following piece of code WORKS. But I would love to know if it's more or less the right way to do a simple file upload from a form and throw it into AWS S3:
app.router.post('/form', { stream: true }, function () {
    var req = this.req,
        res = this.res,
        form = new formidable.IncomingForm();
    form
        .parse(req, function(err, fields, files) {
            console.log('Parsed file upload' + err);
            if (err) {
                res.end('error: Upload failed: ' + err);
            } else {
                var img = fs.readFileSync(files.image.path);
                var data = {
                    Bucket: 'le-bucket',
                    Key: files.image.name,
                    Body: img
                };
                s3.client.putObject(data, function() {
                    console.log("Successfully uploaded data to myBucket/myKey");
                });
                res.end('success: Uploaded file(s)');
            }
        });
});
Note: I had to turn buffering off in union / flatiron.plugins.http.
What I would like to learn is when to stream-load a file and when to sync-load it. It will be a really tiny web app with little traffic.
If it's more or less good, then please consider this a token of working code, which I would also throw into a gist. It's not that easy to find documentation and working examples of this kind of stuff. I like flatiron a lot, but its small-module approach leads to lots of scattered docs and examples all over the net, to say nothing of tutorials.
You should use a module other than formidable because, as far as I know, formidable does not have an S3 storage option, so you would have to save the files on your server before uploading them.
I would recommend you use multiparty.
Use this example in order to upload directly to S3 without saving the file locally on your server.
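For reference, a minimal sketch of that multiparty approach, assuming an Express-style handler, an initialized AWS SDK v2 s3 client, and the 'le-bucket' bucket from the question; each incoming part is itself a readable stream that can be handed to s3.upload(), so nothing touches the local disk:

const multiparty = require('multiparty');

app.post('/form', (req, res) => {
    const form = new multiparty.Form();

    // Handles a single file part for simplicity.
    form.on('part', (part) => {
        // Parts without a filename are ordinary form fields; drain and ignore them.
        if (!part.filename) {
            part.resume();
            return;
        }
        s3.upload({
            Bucket: 'le-bucket',
            Key: part.filename,
            Body: part // the part itself is a readable stream
        }, (err) => {
            if (err) {
                console.error('Upload failed:', err);
                return res.end('error: Upload failed');
            }
            res.end('success: Uploaded file(s)');
        });
    });

    form.on('error', (err) => {
        console.error('Form error:', err);
    });

    form.parse(req);
});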
