NodeJS: Download torrent as stream [closed]

TL;DR: Is it possible to proxy a torrent larger than the available local disk while piping it to an outbound stream?
According to the BitTorrent spec, all torrents are stored as pieces of equal length. I want to write a node app that could pipe the torrent pieces to an HTTP upload stream. Does any library provide such functionality?
All the implementations I have found download the whole file to local storage and then propagate it further, which causes problems when running with a small disk and large files.

BitTorrent is designed for random access, to keep data available via the rarest-first strategy; see Section 2.4.2 of the BitTorrent econ paper. While it is possible to operate it in a streaming manner anyway, this generally isn't recommended and certainly shouldn't be the default; otherwise performance could severely degrade for all swarm members, or content could even become unavailable.
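If you want to experiment despite that caveat, one option is the webtorrent package (an assumption; the question doesn't name a library). Its file.createReadStream() requests pieces roughly in order and emits them as they arrive, so the whole file never has to sit in memory at once; note, though, that WebTorrent still backs pieces with a temporary store on disk by default, so this is a sketch of the streaming API rather than a true disk-free proxy. The upload endpoint is hypothetical:

    // Minimal sketch: pipe a torrent's first file into an HTTP upload
    // as it downloads, using the webtorrent package.
    const WebTorrent = require('webtorrent');
    const http = require('http');

    const client = new WebTorrent();

    client.add(process.argv[2] /* magnet URI or .torrent path */, (torrent) => {
      const file = torrent.files[0];

      // example.com/upload is a hypothetical endpoint.
      const req = http.request({
        method: 'POST',
        host: 'example.com',
        path: '/upload',
        headers: { 'Content-Length': file.length },
      }, (res) => {
        console.log('upload finished with status', res.statusCode);
        client.destroy();
      });

      file.createReadStream().pipe(req);
    });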

Related

Where to store mp4 files on a node.js server? [closed]

I am building a video streaming web app using node.js/express and MongoDB, but I'm facing the question of where to store the mp4 files that my clients will upload to my back-end. I am not sure MongoDB is suited to storing large files (on the order of gigabytes), and my current idea is to keep the files in a directory and track each file's path in MongoDB. Is this a good idea, or is there a better method?
My advice: use s3.amazonaws.com (Amazon S3).
Yes, it's much better to store only a path in MongoDB than to store the video file directly in the DB, because your DB will grow very fast if you do that. The disk space taken by both solutions is the same, but loading your DB with these files will just make it slower.
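As a rough illustration of the path-in-MongoDB approach, here is a minimal Express route that looks up a file path in a videos collection and streams the mp4 from disk, honoring Range requests so players can seek. The collection, field, and route names are our own, not a prescribed layout:

    const express = require('express');
    const fs = require('fs');
    const { ObjectId } = require('mongodb');

    const app = express();

    app.get('/videos/:id', async (req, res) => {
      const db = app.locals.db; // assumes an already-connected Db instance
      const doc = await db.collection('videos').findOne({ _id: new ObjectId(req.params.id) });
      if (!doc) return res.sendStatus(404);

      const { size } = fs.statSync(doc.path); // doc.path points at the mp4 on disk
      const range = req.headers.range;        // e.g. "bytes=0-" from a <video> tag

      if (range) {
        // Serve only the requested byte range so the player can seek.
        const [startStr, endStr] = range.replace(/bytes=/, '').split('-');
        const start = parseInt(startStr, 10);
        const end = endStr ? parseInt(endStr, 10) : size - 1;
        res.writeHead(206, {
          'Content-Range': `bytes ${start}-${end}/${size}`,
          'Accept-Ranges': 'bytes',
          'Content-Length': end - start + 1,
          'Content-Type': 'video/mp4',
        });
        fs.createReadStream(doc.path, { start, end }).pipe(res);
      } else {
        res.writeHead(200, { 'Content-Length': size, 'Content-Type': 'video/mp4' });
        fs.createReadStream(doc.path).pipe(res);
      }
    });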

When to save user data to disk? [closed]

I've been getting into web applications and node.js lately, and it's obvious that you should write user data to disk, but when? It would be overkill and very resource-intensive to write to disk every time data is updated, so when should you?
I'd recommend creating a temporary in-memory cache to store user data (for actively connected users). Read and write to this memory cache as needed to maintain user sessions and realtime functionality, then write to disk as necessary for persistent data. "Eventual persistence" is an option for avoiding frequent disk writes, but can cause problems down the line if writes fail.
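As a sketch of that pattern: a plain Map holds data for connected users, and a timer flushes it to disk periodically and on shutdown. The file name and interval here are arbitrary choices, not a recommendation:

    const fs = require('fs');

    const cache = new Map();          // userId -> user data, for connected users
    const DATA_FILE = './users.json';
    const FLUSH_MS = 30000;

    function flush() {
      const snapshot = Object.fromEntries(cache);
      // Write to a temp file then rename over the target, so a crash
      // mid-write can't leave a truncated users.json behind.
      fs.writeFileSync(DATA_FILE + '.tmp', JSON.stringify(snapshot));
      fs.renameSync(DATA_FILE + '.tmp', DATA_FILE);
    }

    setInterval(flush, FLUSH_MS);
    process.on('SIGINT', () => { flush(); process.exit(0); });

    // Session reads and writes touch only the Map:
    cache.set('user42', { score: 10 });

The atomic-rename step is one way to soften the "writes fail" risk mentioned above, though a real app would also want error handling around the flush itself.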

Uploading files in MongoDB [closed]

What would be the best practice for uploading files to MongoDB? I recently came across Multer. It's easy to use, but it uploads files to the file system instead of directly into MongoDB.
Multer is a high-level wrapper for Busboy and unfortunately doesn't call back with a stream unless you write a StorageEngine for it. The reason you want a stream so badly is that otherwise you would have to buffer the whole file in memory in your Node process before being able to do anything with it. Streams are much more efficient and allow you to pipe the data somewhere while you're still receiving it, buffering in memory only when you can't pipe the data out at the same rate you're receiving it.
Combining a custom StorageEngine with gridfs-stream would allow you to write the data to GridFS in real time as you receive it (e.g. while the user is still uploading); see the sketch after the links below.
I've found two GridFS Storage Engines for Multer:
https://github.com/ISMAELMARTINEZ/gridfs-storage-engine
https://github.com/arjandepooter/multer-gridfs
The latter doesn't seem to have any docs, but it's still easy to use by just reading through the source.
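As a rough sketch of what such an engine looks like today, here is a minimal Multer StorageEngine that pipes the incoming stream into GridFS via the official driver's GridFSBucket (a modern equivalent of gridfs-stream). _handleFile/_removeFile are Multer's documented StorageEngine contract; the rest is our own naming:

    const { GridFSBucket, ObjectId } = require('mongodb');

    function gridFsStorage(db) {
      const bucket = new GridFSBucket(db);
      return {
        // Multer calls this with file.stream, the live upload stream.
        _handleFile(req, file, cb) {
          const upload = bucket.openUploadStream(file.originalname);
          file.stream
            .pipe(upload)
            .on('error', cb)
            .on('finish', () => cb(null, { id: upload.id }));
        },
        // Called to roll back the upload if a later middleware errors.
        _removeFile(req, file, cb) {
          bucket.delete(new ObjectId(file.id)).then(() => cb(null), cb);
        },
      };
    }

    // Usage: const upload = multer({ storage: gridFsStorage(db) });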

dedicated servers for socket.io? [closed]

One of the main features of my website is a simple one-to-one chat.
I'm debating whether to dedicate a server (or a cluster) to this chat feature alone. The simpler option would be to make the feature part of the web servers and just scale out when necessary.
It's worth mentioning that I'd like to enable image transfer within the chat in the future.
So which is the better option, and why?
Well, yes. Whether to use another dedicated server doesn't depend only on how much traffic your site will have to handle: if you're dealing with images, it's a good idea to store them on a separate server and keep the root server clean.
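For a sense of how little the dedicated chat process needs, here is a minimal one-to-one chat as its own Socket.IO server; the port and event names are our own illustration:

    const { Server } = require('socket.io');

    const io = new Server(3001, { cors: { origin: '*' } });

    io.on('connection', (socket) => {
      // Each user joins a room named after their user id, so a
      // one-to-one message is just an emit to the recipient's room.
      socket.on('register', (userId) => socket.join(userId));

      socket.on('private message', ({ to, text }) => {
        io.to(to).emit('private message', { from: socket.id, text });
      });
    });

Because the chat state lives in this one process, moving it to a dedicated server later is mostly a matter of pointing clients at a different host; image transfer would then push you toward the separate storage mentioned above.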

put all images in a database or just in a folder [closed]

I am developing a website which uses a lot of images.
The images get manipulated very often (every few seconds, by the clients). All images are on a linux server. It is also possible that two clients try to change an image at the same time.
So my question is: should I put the images into a database, or just leave them in a folder? And how does the OS handle write-write collisions?
I use node.js and mongoDB on the server.
You usually store a reference to the file location inside the database. As for write-write collisions: in most cases whoever has the file open first wins, but the exact behavior depends on the OS you are working with. You will want to look into file locking; this Wikipedia article gives a good overview:
http://en.wikipedia.org/wiki/File_locking
It is also considered good practice for your code to check and notify the user if the file is in use, if write collisions are likely to occur.
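As a sketch of application-level locking in Node, here is an advisory lock built on an exclusive lock file: fs.open with the 'wx' flag fails if the file already exists, so only one writer can hold the lock at a time. The retry policy and names are our own illustration:

    const fs = require('fs/promises');

    async function acquire(lock, retries = 50) {
      for (let i = 0; i < retries; i++) {
        try {
          return await fs.open(lock, 'wx'); // fails with EEXIST if already held
        } catch (err) {
          if (err.code !== 'EEXIST') throw err;
          await new Promise((r) => setTimeout(r, 100)); // lock busy: wait, retry
        }
      }
      throw new Error('could not acquire ' + lock);
    }

    async function withLock(path, fn) {
      const handle = await acquire(path + '.lock');
      try {
        return await fn();
      } finally {
        await handle.close();
        await fs.unlink(path + '.lock');
      }
    }

    // Usage: await withLock('img/42.png', () => fs.writeFile('img/42.png', buffer));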
I suggest you store your images within MongoDB using the GridFS file system. This allows you to keep images and their metadata together and gives you atomic updates, plus two more advantages if you are using replica sets for your database:
Your images have the same high availability as the rest of your data and get backed-up together
You can, eventually, direct queries to secondary members of the set
For more information, see
http://docs.mongodb.org/manual/applications/gridfs
http://docs.mongodb.org/manual/replication/
Does this help?
Cheers
Ronald
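For completeness, a minimal sketch of the GridFS route pair this answer suggests, using the driver's GridFSBucket; the route and bucket names are our own:

    const { GridFSBucket } = require('mongodb');

    function imageRoutes(app, db) {
      const bucket = new GridFSBucket(db, { bucketName: 'images' });

      // Store: pipe the request body straight into GridFS chunks.
      app.post('/images/:name', (req, res) => {
        req.pipe(bucket.openUploadStream(req.params.name))
           .on('finish', () => res.sendStatus(201));
      });

      // Fetch: stream the stored chunks back out to the client.
      app.get('/images/:name', (req, res) => {
        bucket.openDownloadStreamByName(req.params.name)
              .on('error', () => res.sendStatus(404))
              .pipe(res);
      });
    }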
