Streaming multiple files in one response - Node.js

How can I stream multiple files in one response stream in Node.js? I want to make a single API call from the browser application, and the server should send back multiple files in that one response. Please share some hints or code samples.
The files are stored as blob data in MongoDB, and I use the gridfs-stream module to read them.
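One common approach is to bundle the files into a single zip archive and stream it as it is built, so nothing is buffered whole on the server. The sketch below assumes an Express route, an already-established mongoose connection, and a hard-coded filename list; archiver and gridfs-stream are real modules, but the route and names are illustrative only.

    const express = require('express');
    const mongoose = require('mongoose');
    const Grid = require('gridfs-stream');
    const archiver = require('archiver');

    const app = express();
    // assumes mongoose.connect(...) has already completed

    app.get('/files', (req, res) => {
      const gfs = Grid(mongoose.connection.db, mongoose.mongo);
      res.attachment('files.zip');           // sets Content-Disposition
      const archive = archiver('zip');
      archive.pipe(res);                     // the zip is streamed as it is built
      ['a.mp3', 'b.mp3'].forEach((filename) => {
        // each GridFS read stream becomes one entry in the archive
        archive.append(gfs.createReadStream({ filename }), { name: filename });
      });
      archive.finalize();                    // no more entries; flush to client
    });

A multipart/mixed response is another option, but a zip archive is much easier for browser code to consume.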

Related

Transfer large files between two Node.js clients through AWS using serverless services

I want to know how to transfer large files between two Node.js clients through AWS using serverless services, without saving the files in an S3 bucket.
On the client side I used socket.io-stream to send files from one client to another through the Node.js server.
How can I stream the files to AWS and have that stream relayed back to the other client? Currently I am trying to use AWS WebSocket, but how do I use a stream with WebSocket?
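There is no stream object on the wire with WebSockets; the usual pattern is to read the file as a stream and send it chunk by chunk, which is roughly what socket.io-stream does internally. A minimal sketch with the ws package (the endpoint URL and file name are placeholders):

    const fs = require('fs');
    const WebSocket = require('ws');

    const ws = new WebSocket('wss://example.execute-api.us-east-1.amazonaws.com/prod');

    ws.on('open', () => {
      // read the file in small chunks so it is never held in memory whole
      const reader = fs.createReadStream('large-file.bin', { highWaterMark: 64 * 1024 });
      reader.on('data', (chunk) => ws.send(chunk));                 // one message per chunk
      reader.on('end', () => ws.send(JSON.stringify({ done: true })));
    });

With API Gateway WebSockets, each connection has a connectionId, so a Lambda route handler can forward each incoming chunk to the receiving client via the PostToConnection API; note that API Gateway enforces per-message payload size limits (128 KB at the time of writing), so the chunk size must stay below that.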

stream large file upload into database using pg-promise

I would like to allow my users to upload large files (up to 1 GB) to my database. I am using the database because storing raw files can be dangerous, and I would like a single source of state in my system, since it is meant to be serverless.
The VPS I am planning to run it on has limited RAM, and multiple users should of course be able to upload simultaneously.
So in order not to exceed this RAM, I would need to either
stream the file into the database as it is being uploaded by the user,
or first stream it into a file using something like multer and then stream that file into PostgreSQL as a BLOB.
So is there a way to do this using pg-promise, i.e. stream a file into the database without ever loading the whole thing into RAM?
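pg-promise itself does not stream bytea writes, but PostgreSQL's large-object facility does support streaming, and the pg-large-object package documents running inside a pg-promise transaction. A sketch along those lines (the connection string, buffer size, and the incoming fileStream are assumptions):

    const pgp = require('pg-promise')();
    const { LargeObjectManager } = require('pg-large-object');

    const db = pgp('postgres://user:pass@localhost:5432/app');

    // fileStream is the incoming upload, e.g. the request stream itself
    function storeUpload(fileStream) {
      // large objects must be read and written inside a transaction
      return db.tx('store-file', (tx) => {
        const man = new LargeObjectManager({ pgPromise: tx });
        return man.createAndWritableStreamAsync(16384)
          .then(([oid, writeStream]) => {
            fileStream.pipe(writeStream);   // chunks flow straight to Postgres
            return new Promise((resolve, reject) => {
              writeStream.on('finish', () => resolve(oid));
              writeStream.on('error', reject);
            });
          });
      });
    }

The returned oid can be stored in an ordinary table row; nothing larger than one chunk is ever held in RAM, so simultaneous uploads only cost one buffer each.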

How to send a multipart file upload straight to MongoDB in Node

I can save the file to disk with formidable and then send the file bits to Mongo with Node, but how can I stream the file bits directly to Mongo?
I don't need GridFS; these are small files. I just want to write them to the normal store.
Use options.fileWriteStreamHandler to set up your own stream, then write to MongoDB if the API accepts a stream.
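For illustration, a rough sketch of that idea: formidable's fileWriteStreamHandler option takes a function returning a Writable, so each upload can be collected and inserted as an ordinary document (fine for small files, since each one is buffered in memory). The files collection and the handler wiring are assumptions:

    const { Writable } = require('stream');
    const formidable = require('formidable');
    const { Binary } = require('mongodb');

    // `files` is an already-connected MongoDB collection
    function handleUpload(req, res, files) {
      const form = formidable({
        // return a custom Writable per file instead of a disk stream
        fileWriteStreamHandler: () => {
          const chunks = [];
          return new Writable({
            write(chunk, _enc, cb) { chunks.push(chunk); cb(); },
            final(cb) {
              // insert the finished buffer into the normal store
              files.insertOne({ data: new Binary(Buffer.concat(chunks)) })
                .then(() => cb(), cb);
            },
          });
        },
      });
      form.parse(req, (err) => {
        res.statusCode = err ? 500 : 200;
        res.end(err ? 'upload failed' : 'stored');
      });
    }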

Node.js multiple SSE requests

I have multiple SSE streams that can be accessed using a token. I need to keep all the streams open and write the responses coming from the streams to the database. I tried one stream with the http module, as described in this link:
https://medium.com/@moinism/using-nodejs-for-uni-directional-event-streaming-sse-c80538e6e82e
It works fine, but I don't know how to keep multiple streams open or how to differentiate between the streams. There is also an MQTT stream available, and I have the same question about that.
I am looking for an on-data handler that is kept open and keeps streaming from various sources, with an ID to tell them apart.
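One way to do this is to open one EventSource per token and close over an ID in each handler. A sketch with the eventsource npm package (the URLs, tokens, and saveToDb are placeholders):

    const EventSource = require('eventsource');

    const streams = [
      { id: 'stream-a', url: 'https://example.com/events', token: 'tokenA' },
      { id: 'stream-b', url: 'https://example.com/events', token: 'tokenB' },
    ];

    for (const { id, url, token } of streams) {
      const es = new EventSource(url, { headers: { Authorization: `Bearer ${token}` } });
      es.onmessage = (event) => {
        // the closed-over id differentiates the streams in the database
        saveToDb({ streamId: id, payload: event.data });
      };
      es.onerror = (err) => console.error(id, 'SSE error', err);
    }

Each connection stays open independently, and the same pattern applies to an MQTT client: subscribe once per source and tag each message with its source ID before persisting it.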

How to serve binary (bytea) data from Postgres using Node?

I'm testing out Postgres's binary abilities by storing some mp3 data in a table. I've read you're supposed to store files in external storage like S3, but for various reasons I don't want to do that right now.
So, for now, I'd like to test storing files in the database. The mp3 files are TTS files from a third party, and I've stored them in a Postgres table. This is working OK. But how do I serve them to the client? In other words:
1. The client requests the files over HTTP.
2. Node requests the records (one or many) via pg-promise.
3. The data arrives from the database to Node in binary format.
4. ??? Do I have to convert it to an mp3 file before sending? Can I send the binary data directly? Which would be better?
5. The client receives the file(s).
6. The client queues the files in order for audio playback.
My main question is whether I need to convert the binary record I receive from Postgres before sending it, and if so, how.
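No conversion should be needed: node-postgres (which pg-promise sits on) returns bytea columns as Node Buffers, so the bytes can be sent directly with the right Content-Type. A minimal sketch, assuming a tracks(id, audio bytea) table and an Express app:

    const express = require('express');
    const pgp = require('pg-promise')();

    const db = pgp('postgres://user:pass@localhost:5432/app');
    const app = express();

    app.get('/audio/:id', async (req, res) => {
      const row = await db.oneOrNone('SELECT audio FROM tracks WHERE id = $1', [req.params.id]);
      if (!row) return res.sendStatus(404);
      res.type('audio/mpeg');   // tell the client it is mp3 data
      res.send(row.audio);      // row.audio is already a Node Buffer
    });

Express sets Content-Length from the Buffer automatically, and the client can point an Audio element straight at the URL, one request per queued file.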
