How to send a multipart file upload straight to MongoDB in Node

I can save the file to disk with formidable and then send the file bits to MongoDB with Node, but how can I stream the file bits directly to MongoDB instead?
I don't need GridFS; these are small files. I just want to write them to a normal collection.

Use options.fileWriteStreamHandler to set up your own stream, then write to MongoDB if the API accepts a stream.
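A minimal sketch of that approach, assuming formidable v3 (where the handler receives the file object) and the official mongodb driver; the database and collection names are illustrative. A plain collection cannot consume a stream directly, so the Writable buffers the chunks and inserts a single document when the upload ends, which is fine for small files (BSON documents are capped at 16 MB):

    import http from 'http';
    import { Writable } from 'stream';
    import formidable from 'formidable';
    import { MongoClient } from 'mongodb';

    const client = new MongoClient('mongodb://localhost:27017');
    const uploads = client.db('app').collection('uploads');

    // Collect the upload in memory, then insert one document per file.
    function mongoWriteStream(file) {
      const chunks = [];
      return new Writable({
        write(chunk, _enc, done) {
          chunks.push(chunk);
          done();
        },
        final(done) {
          uploads
            .insertOne({
              filename: file.originalFilename,
              mimetype: file.mimetype,
              data: Buffer.concat(chunks), // the driver stores Buffers as BSON Binary
            })
            .then(() => done(), done);
        },
      });
    }

    http.createServer((req, res) => {
      const form = formidable({ fileWriteStreamHandler: mongoWriteStream });
      form.parse(req, (err) => {
        res.statusCode = err ? 500 : 201;
        res.end();
      });
    }).listen(3000);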

Related

Stream large file upload into database using pg-promise

I would like to allow my users to upload large files (under 1 GB) to my database. I am using a database because storing raw files can be dangerous, and I would like to have a single source of state in my system, since it is meant to be serverless.
Now, the VPS I am planning to run it on has limited RAM, and multiple users should of course be able to upload simultaneously.
So in order not to exceed this RAM, I would need to either
stream the file into the database as it is being uploaded by the user,
or first stream it into a temporary file using something like multer and then stream it from that file into PostgreSQL as a BLOB.
So is there a way to do this using pg-promise? Can I stream a file into the database without ever loading the whole thing into RAM?
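As far as I know, pg-promise itself does not ship a streaming-write API, but PostgreSQL's large-object facility gives you true streaming writes. A minimal sketch, assuming the pg and pg-large-object packages (pg-promise can expose the underlying pg client via db.connect() if you want to stay inside it); storeUpload and the pool wiring are illustrative, and large-object calls must run inside a transaction:

    import pg from 'pg';
    import { LargeObjectManager } from 'pg-large-object';

    // Streams `readable` into a new PostgreSQL large object and returns its OID,
    // without ever holding the whole file in memory.
    export async function storeUpload(pool, readable) {
      const client = await pool.connect();
      try {
        await client.query('BEGIN');
        const man = new LargeObjectManager({ pg: client });
        const [oid, stream] = await man.createAndWritableStreamAsync(16384);
        await new Promise((resolve, reject) => {
          readable.pipe(stream).on('finish', resolve).on('error', reject);
        });
        await client.query('COMMIT');
        return oid; // save this OID in a regular table to locate the file later
      } catch (err) {
        await client.query('ROLLBACK');
        throw err;
      } finally {
        client.release();
      }
    }

    // Usage: pipe the incoming request (or a multer/busboy file stream) straight in.
    // const oid = await storeUpload(new pg.Pool(), req);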

Streaming multiple files in one response in Node.js

In Node.js, how can I stream multiple files in one response stream? I want to make a single API call from the browser application, and in the response the server should be able to send back multiple files. Please share some hints or code samples.
The files are stored as blob data in MongoDB, and I use the gridfs-stream module to read them.
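One common approach, sketched here under assumptions: bundle the files into a single zip stream with the archiver package, reading each file with GridFSBucket from the official driver (the modern replacement for gridfs-stream). Database, bucket, and file names are illustrative:

    import http from 'http';
    import archiver from 'archiver';
    import { MongoClient, GridFSBucket } from 'mongodb';

    const client = new MongoClient('mongodb://localhost:27017');
    const bucket = new GridFSBucket(client.db('app'));

    http.createServer((req, res) => {
      res.setHeader('Content-Type', 'application/zip');
      res.setHeader('Content-Disposition', 'attachment; filename="files.zip"');

      // archiver consumes each GridFS read stream and emits one zip stream.
      const archive = archiver('zip');
      archive.pipe(res);
      for (const name of ['a.mp3', 'b.mp3']) {
        archive.append(bucket.openDownloadStreamByName(name), { name });
      }
      archive.finalize();
    }).listen(3000);

The browser then receives one response and can unpack the files client-side; alternatives such as multipart/mixed responses exist but have spotty browser support.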

How to serve binary (bytea) data from Postgres using Node?

I'm testing out Postgres's binary abilities by storing some mp3 data in a table. I've read that you're supposed to store files in an external filesystem like S3, but for various reasons I don't want to do that right now.
So, for now, I'd like to test storing files in the db. The mp3 files are TTS files from a third party, and I've stored them in a Postgres table. This is working OK. But how do I serve them to the client? In other words:
client HTTP-requests the files.
node requests the records (one or many) via pg-promise.
the data arrives from the db to node in binary format.
??? Do I have to convert it to an mp3 file before sending? Can I send the binary data directly? Which would be better?
client receives the file(s).
client queues the files in order for audio playback.
My main question is whether I need to convert the binary record I receive from Postgres before sending it, and if so, how to do that.
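For what it's worth, node-postgres returns bytea columns as Node Buffer objects, so no conversion to a file is needed: set the Content-Type and send the bytes as-is. A minimal sketch, assuming Express and pg-promise, with illustrative table and column names:

    import express from 'express';
    import pgPromise from 'pg-promise';

    const db = pgPromise()('postgres://localhost/app');
    const app = express();

    app.get('/audio/:id', async (req, res) => {
      try {
        const row = await db.oneOrNone(
          'SELECT data FROM tts_clips WHERE id = $1',
          [req.params.id]
        );
        if (!row) return res.sendStatus(404);
        // row.data is already a Buffer; send it with the right MIME type.
        res.type('audio/mpeg').send(row.data);
      } catch (err) {
        res.sendStatus(500);
      }
    });

    app.listen(3000);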

How to get a file from cloud storage and process it as a local file without downloading?

I am working on a project where I have to extract frames from a video using ffmpeg (Node.js). I first upload the video to Firebase Storage from my client, and then I want to process it on the backend server. However, ffmpeg only accepts a file path, as if the file were stored locally:
const ff = new ffmpeg('C:/Users/alexh/Desktop/alex/name.avi');
It will not work with a URL. I am wondering whether there is any way to get the file from a URL as if it were stored locally, or whether Firebase can provide a way to get the file. I don't want to use a Firebase trigger event because I want to send an HTTP request to the backend server.
Thank you so much.
The fluent-ffmpeg package supports operating on readable streams instead of just files, and the GCS client library supports creating a readable stream for a GCS object. By combining these, you can have ffmpeg operate directly on data from GCS.
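A minimal sketch of that combination; the bucket and file names are illustrative, and the fps filter is just one example of frame extraction:

    import ffmpeg from 'fluent-ffmpeg';
    import { Storage } from '@google-cloud/storage';

    // createReadStream() yields a Readable for the object without first
    // downloading it to disk.
    const input = new Storage()
      .bucket('my-videos')
      .file('name.avi')
      .createReadStream();

    // fluent-ffmpeg accepts a readable stream as its input.
    ffmpeg(input)
      .outputOptions(['-vf', 'fps=1']) // extract one frame per second
      .output('frames/frame-%03d.png')
      .on('end', () => console.log('done extracting frames'))
      .on('error', (err) => console.error(err))
      .run();

One caveat: ffmpeg reads piped input sequentially, so container formats that need seekable input (AVI among them) may not decode reliably from a stream; that is one reason the mount-based option below can be more robust.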
If you were running this on Linux or OS X, you could use GCS Fuse to mount the bucket on your filesystem and then point FFmpeg directly to it.

Possible to stream part of a file from Mongo with Node?

I see how I can create a readable stream for a file in GridFS in Node.js using the native mongo driver. However, I'm writing a server that responds to byte range requests, so I'd like to only stream back part of the file. What's the best way to do this? Should I just implement my own read stream that pulls data from the database in chunks? Thanks!
Unfortunately, the stream does not support arbitrary starting points, so you'll have to implement your own that seeks to the correct chunk and then streams from there.
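That said, the current official driver's GridFSBucket does accept start and end byte offsets on its download streams, which maps directly onto HTTP Range requests. A rough sketch with illustrative names and only minimal Range-header parsing (the driver treats end as exclusive):

    import http from 'http';
    import { MongoClient, GridFSBucket } from 'mongodb';

    const client = new MongoClient('mongodb://localhost:27017');
    const bucket = new GridFSBucket(client.db('app'));

    http.createServer(async (req, res) => {
      const [file] = await bucket.find({ filename: 'video.mp4' }).toArray();
      if (!file) { res.statusCode = 404; return res.end(); }

      // Handle headers like "bytes=0-1023"; a real server needs more validation.
      const m = /^bytes=(\d+)-(\d*)$/.exec(req.headers.range || '');
      if (!m) {
        res.writeHead(200, { 'Content-Length': file.length, 'Accept-Ranges': 'bytes' });
        return bucket.openDownloadStreamByName('video.mp4').pipe(res);
      }

      const start = parseInt(m[1], 10);
      const end = m[2] ? parseInt(m[2], 10) + 1 : file.length; // exclusive
      res.writeHead(206, {
        'Content-Range': `bytes ${start}-${end - 1}/${file.length}`,
        'Content-Length': end - start,
        'Accept-Ranges': 'bytes',
      });
      bucket.openDownloadStreamByName('video.mp4', { start, end }).pipe(res);
    }).listen(3000);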
