I am building a video streaming web app using Node.js/Express and MongoDB, and I am facing an issue: where should I store the mp4 files that my clients will upload to my back-end? I am not sure whether MongoDB can handle large files (on the order of gigabytes). My current idea is to keep the files in a directory and track each file's path in MongoDB. Is this a good idea, or is there a better way to do it?
My advice: use Amazon S3 (s3.amazonaws.com).
Yes, it's much better to store only a path in MongoDB instead of storing the video file directly in the database, because your database will grow very quickly if you do that. The disk space taken by both approaches is the same, but loading your database down with these files will just make it slower.
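A minimal sketch of that "files in a directory, paths in MongoDB" idea, assuming Express, Multer and the official MongoDB Node.js driver; the route, database, collection and upload-directory names are placeholders, not from the question:

```javascript
const express = require('express');
const multer = require('multer');
const { MongoClient } = require('mongodb');

const upload = multer({ dest: './uploads' });        // Multer writes the mp4 to disk
const client = new MongoClient('mongodb://localhost:27017');
const app = express();

app.post('/videos', upload.single('video'), async (req, res) => {
  // Store only metadata and the on-disk path in MongoDB, not the file itself.
  const videos = client.db('streaming').collection('videos');
  const { insertedId } = await videos.insertOne({
    originalName: req.file.originalname,
    path: req.file.path,                             // e.g. uploads/3f9c2a...
    size: req.file.size,
    uploadedAt: new Date(),
  });
  res.status(201).json({ id: insertedId });
});

client.connect().then(() => app.listen(3000));
```

Serving a video later is then a matter of reading from that stored path, while MongoDB only ever holds small metadata documents.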
I'm trying to create a web server that saves variables on the server side in each situation, for caching purposes. Is that possible? Something like res.send(data). I'm trying to make something like a web DB (not sure if that works).
This web server uses Express.
Edit: solved.
You have two main options for saving data values on the server side:
1. Keep them in your server's memory using the Redis data store (see "Beginner's Guide to Redis and Caching with NodeJS").
2. Keep them on disk in a database such as MySQL or MongoDB.
If your JSON value is small and your code needs it frequently (session data, for example), the Redis in-memory store is a good option. If you rarely need the data, or it is large, MongoDB or MySQL is the better fit. For caching purposes, option 1 is better; a short sketch follows.
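A minimal sketch of option 1, assuming Express and the node-redis v4 client; the key names and the 60-second expiry are placeholders:

```javascript
const express = require('express');
const { createClient } = require('redis');

const app = express();
const redis = createClient();               // defaults to redis://localhost:6379

app.get('/counter', async (req, res) => {
  // INCR both creates and atomically increments the cached value.
  const hits = await redis.incr('page:hits');
  res.send({ hits });
});

app.get('/config', async (req, res) => {
  // Cache-aside: try Redis first, fall back to a slower source, then cache it.
  let value = await redis.get('app:config');
  if (value === null) {
    value = JSON.stringify({ theme: 'dark' });        // pretend this came from MySQL/MongoDB
    await redis.set('app:config', value, { EX: 60 }); // expire after 60 seconds
  }
  res.type('json').send(value);
});

redis.connect().then(() => app.listen(3000));
```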
I am a beginner and I'm creating a web app with React. I want my web app to be able to read and write a JSON or CSV file on my hard disk; I've done this easily with C++ and Python. Should I learn Node.js, Django, or something like that? I've searched and I don't know what to do.
What should I do?
Edit: In this question I really do mean my own disk; I've read the answers and now understand that this is not a good idea.
Part of the beauty of the web is that web browsers generally do not have access to the computer's filesystem. This is an intentional security choice. It would be horrible if advertisers could see the contents of your hard drive.
There are technologies that let individual websites store information on your computer in a way that acts a little bit like a filesystem, ranging from old-school cookies to localStorage and the more capable IndexedDB database.
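As a rough illustration (not taken from the answer above), here is how a React component might persist JSON with localStorage; the hook name and the "notes" key are invented for the example:

```jsx
import React, { useState, useEffect } from 'react';

function useLocalStorage(key, initialValue) {
  const [value, setValue] = useState(() => {
    const stored = window.localStorage.getItem(key);
    return stored !== null ? JSON.parse(stored) : initialValue;
  });

  useEffect(() => {
    // localStorage only stores strings, so serialize to JSON on every change.
    window.localStorage.setItem(key, JSON.stringify(value));
  }, [key, value]);

  return [value, setValue];
}

export default function Notes() {
  const [notes, setNotes] = useLocalStorage('notes', []);
  return (
    <button onClick={() => setNotes([...notes, `note ${notes.length + 1}`])}>
      {notes.length} saved notes
    </button>
  );
}
```

IndexedDB works in a similar spirit but is asynchronous and suited to larger, structured data.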
What would be the best practice for uploading files into MongoDB? I recently came across Multer. It's easy to use, but it uploads files to the filesystem instead of directly into MongoDB.
Multer is a high-level wrapper around Busboy and unfortunately doesn't give you a stream unless you write a StorageEngine for it. The reason you want a stream so badly is that otherwise you would have to buffer the whole file in memory in your Node process before being able to do anything with it. Streams are much more efficient and let you send the data somewhere while you're still receiving it, only buffering in memory when you cannot pipe the data out at the same rate it arrives.
Combining a custom StorageEngine with gridfs-stream allows you to write the data to GridFS in real time as you receive it (i.e. while the user is still uploading).
I've found two GridFS Storage Engines for Multer:
https://github.com/ISMAELMARTINEZ/gridfs-storage-engine
https://github.com/arjandepooter/multer-gridfs
The latter doesn't seem to have any docs, but it's still easy to use just by looking through the source.
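For reference, a minimal sketch of the same streaming idea without either package, using Busboy 1.x and the MongoDB driver's GridFSBucket directly; the route, database and bucket names are placeholders, not from the answer:

```javascript
const express = require('express');
const busboy = require('busboy');
const { MongoClient, GridFSBucket } = require('mongodb');

const client = new MongoClient('mongodb://localhost:27017');
const app = express();

app.post('/upload', (req, res) => {
  const bucket = new GridFSBucket(client.db('media'), { bucketName: 'uploads' });
  const bb = busboy({ headers: req.headers });

  // Busboy 1.x 'file' signature: (fieldName, fileStream, info)
  bb.on('file', (fieldName, fileStream, info) => {
    // Pipe the incoming file straight into GridFS while it is still uploading.
    const uploadStream = bucket.openUploadStream(info.filename, {
      metadata: { mimeType: info.mimeType },
    });
    fileStream.pipe(uploadStream)
      .on('error', () => res.status(500).end())
      .on('finish', () => res.status(201).json({ fileId: uploadStream.id }));
  });

  req.pipe(bb);
});

client.connect().then(() => app.listen(3000));
```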
I am working on a project where each user can post many images. We use Node.js with Express and MongoDB as the database. I was wondering which approach would be better in terms of speed and scalability:
Storing images in Mongo GridFS
Or
Storing images on Amazon S3 and the paths of the images in MongoDB, then retrieving the images using those paths.
Any thoughts are appreciated!
Thank you.
This is like comparing Go vs Node.js: there is no generally better solution.
Each has its own advantages. MongoDB (GridFS) is more of a do-it-yourself solution, while Amazon S3 is a managed one: with MongoDB you have to scale it yourself, whereas S3 is already scaled by Amazon, will be faster to start with, and is probably cheaper (S3 storage costs less than EBS). On the other hand, if you can run servers with huge amounts of RAM, MongoDB will definitely be faster, and if the MongoDB instance runs on the same machine as your app, you will have lower latency.
Also check this question: MongoDB as static files provider?
And this: What are the advantage of using MongoDB GridFS vs Amazon S3 to store assets for a product with MongoDB as the database backend?
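For concreteness, a rough sketch of the S3 route (upload the bytes to S3, keep only the key in MongoDB), assuming the AWS SDK v3 and the official MongoDB driver; the bucket, database and collection names are invented for the example:

```javascript
const { S3Client, PutObjectCommand } = require('@aws-sdk/client-s3');
const { MongoClient } = require('mongodb');

const s3 = new S3Client({ region: 'us-east-1' });
const mongo = new MongoClient('mongodb://localhost:27017');
const mongoReady = mongo.connect();

async function saveImage(userId, fileName, imageBuffer) {
  const key = `users/${userId}/${Date.now()}-${fileName}`;

  // 1. Put the image bytes in S3.
  await s3.send(new PutObjectCommand({
    Bucket: 'my-image-bucket',
    Key: key,
    Body: imageBuffer,
  }));

  // 2. Store only the reference (the "path") in MongoDB.
  await mongoReady;
  await mongo.db('app').collection('images').insertOne({
    userId,
    key,                      // fetch later with GetObject or a pre-signed URL
    createdAt: new Date(),
  });

  return key;
}

module.exports = { saveImage };
```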
I am developing a website which uses a lot of images.
The images get manipulated very often (every few seconds, by the clients). All images are on a Linux server. It is also possible that two clients try to change an image at the same time.
So my question is: should I put the images into a database or just leave them in a folder (and how does the OS handle write-write collisions?)?
I use Node.js and MongoDB on the server.
You usually store a reference to the file's location in the database. As for write-write collisions: in most cases whoever has the file open first wins, but it largely depends on the OS you are working with. You will want to look into file locking; this Wikipedia article gives a good overview:
http://en.wikipedia.org/wiki/File_locking
It is also considered good practice to check in your code whether the file is in use and notify the user, if write collisions are likely to occur.
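If all writes go through a single Node.js process, another option besides OS-level locking is to serialize writes per file inside that process. A rough sketch, with the file path and data invented for the example:

```javascript
const fs = require('fs/promises');

// Tail of the pending write chain for each file path.
const writeQueues = new Map();

function writeFileSerialized(path, data) {
  const previous = writeQueues.get(path) || Promise.resolve();
  const next = previous
    .catch(() => {})                      // keep the chain alive after a failed write
    .then(() => fs.writeFile(path, data));
  writeQueues.set(path, next);
  return next;
}

// Two clients write the same image "at the same time": the second write
// waits for the first instead of colliding with it.
const versionA = Buffer.from('image bytes from client A');
const versionB = Buffer.from('image bytes from client B');
writeFileSerialized('./images/photo.jpg', versionA);
writeFileSerialized('./images/photo.jpg', versionB)
  .then(() => console.log('both writes finished in order'));
```

Note this only protects against collisions within one process; multiple processes writing the same files still need proper file locking.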
I suggest you store your images in MongoDB using the GridFS file system. This lets you keep the images and their metadata together in the database, and it has two more advantages if you are using replica sets for your database:
Your images have the same high availability as the rest of your data and get backed up together with it
You can, eventually, direct queries to secondary members of the set
For more information, see
http://docs.mongodb.org/manual/applications/gridfs
http://docs.mongodb.org/manual/replication/
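To make that concrete, a small sketch of serving images straight out of GridFS with Express and the driver's GridFSBucket; the database and bucket names are placeholders, not from the answer:

```javascript
const express = require('express');
const { MongoClient, GridFSBucket } = require('mongodb');

const client = new MongoClient('mongodb://localhost:27017');
const app = express();

app.get('/images/:name', (req, res) => {
  const bucket = new GridFSBucket(client.db('media'), { bucketName: 'images' });

  // Stream the stored file straight from GridFS to the HTTP response.
  bucket.openDownloadStreamByName(req.params.name)
    .on('error', () => res.sendStatus(404))   // no file with that name
    .pipe(res);
});

client.connect().then(() => app.listen(3000));
```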
Does this help?
Cheers
Ronald