How to save unlimited data to a MongoDB collection - node.js

I am trying to save data from the Coinbase Pro API. The loop runs until all data is fetched, and the results are saved to a MongoDB collection. The main issue is that when we reach 16MB, the script fails.
I need a viable solution for saving unlimited data to a MongoDB collection and using it.

MongoDB documents have a maximum size of 16MB, according to the docs:
"The maximum BSON document size is 16 megabytes.
The maximum document size helps ensure that a single document cannot use excessive amount of RAM or, during transmission, excessive amount of bandwidth. To store documents larger than the maximum size, MongoDB provides the GridFS API..."
(https://docs.mongodb.com/manual/reference/limits/)
It might be worth checking out that GridFS API (but I haven't yet).
Are you trying to insert ONE document that is 16MB+, or are you trying to insert MULTIPLE documents that add up to 16MB+?
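If it is the second case and the data just accumulates from many API responses, one option is to store each fetched record as its own document instead of appending everything to one ever-growing document; individual records then never come near 16MB. A rough sketch under assumptions (the db/collection names and the fetchPage() helper are invented, not from your code):

    // Hypothetical sketch: one document per fetched record instead of one huge document.
    // 'coinbase', 'candles' and fetchPage() are assumptions for illustration only.
    const { MongoClient } = require('mongodb');

    async function saveAll() {
      const client = await MongoClient.connect('mongodb://localhost:27017');
      const candles = client.db('coinbase').collection('candles');
      let page;
      while ((page = await fetchPage()).length > 0) { // fetchPage() = your API paging logic
        await candles.insertMany(page);               // each record stays far below 16MB
      }
      await client.close();
    }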

Related

How can we save an object larger than 16MB in MongoDB? Is it a hard limit for any document?

How can we save an object larger than 16MB in MongoDB? Is it a hard limit for any document, i.e. can we not save any document larger than 16MB in MongoDB?
Or is this limit in the library that we are using to save documents to MongoDB?
You need GridFS.
GridFS is a specification for storing and retrieving files that exceed the BSON-document size limit of 16 MB.
Is it a hard limit for any document, i.e. can we not save any document larger than 16MB in MongoDB?
Yes, it is a hard limit.
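If a single object really must exceed 16MB, a minimal GridFS sketch with the node driver's GridFSBucket could look like this (the connection string, db name and file name are invented):

    // Hedged sketch: GridFS stores the payload as many small chunks in fs.chunks,
    // tracked by fs.files, so the 16MB document limit does not apply to the whole object.
    const { MongoClient, GridFSBucket } = require('mongodb');

    async function uploadLargeObject(buffer) {
      const client = await MongoClient.connect('mongodb://localhost:27017');
      const bucket = new GridFSBucket(client.db('test'));
      await new Promise((resolve, reject) => {
        bucket.openUploadStream('big-object.json') // file name is an assumption
          .on('finish', resolve)
          .on('error', reject)
          .end(buffer);
      });
      await client.close();
    }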

How can I add data larger than 2 MB as a single entry in Cosmos DB?

As there is a size limit in Cosmos DB for a single entry, how can I add data larger than 2 MB as a single entry?
The 2MB limit is a hard-limit, not expandable. You'll need to work out a different model for your storage. Also, depending on how your data is encoded, it's likely that the actual limit will be under 2MB (since data is often expanded when encoded).
If you have content within an array (the typical reason why a document would grow so large), consider refactoring this part of your data model (perhaps store references to other documents, within the array, vs the subdocuments themselves). Also, with arrays, you have to deal with an "unbounded growth" situation: even with documents under 2MB, if the array can keep growing, then eventually you'll run into a size limit issue.
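As an illustration only (all field names here are invented), the refactoring suggested above replaces an unbounded embedded array with small documents plus references:

    // Before: one document whose "items" array grows without bound.
    const embeddedOrder = {
      id: 'order-1',
      items: [{ sku: 'a', qty: 2 } /* ...potentially thousands more... */]
    };

    // After: each item is its own small document; the parent keeps only references.
    const orderItem = { id: 'item-17', orderId: 'order-1', sku: 'a', qty: 2 };
    const referencedOrder = { id: 'order-1', itemIds: ['item-17' /* , ... */] };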

How to manage big data in MongoDB collections

I have a collection called data which is the destination of the documents sent from many devices every n seconds.
What is the best practice to keep the collection alive in production without it overflowing with documents?
How could I "clean" the collection and save its content into another one? Is that the correct way?
Thank you in advance.
You cannot overflow; if you use sharding, you have almost unlimited space.
https://docs.mongodb.com/manual/reference/limits/#Sharding-Existing-Collection-Data-Size
Those are the limits for a single shard, and you have to start sharding before reaching them.
It depends on your architecture; however, the worst-case limit of about 8.192 exabytes (8,192,000 terabytes), which you get by multiplying the possible number of shards by the maximum collection size on one of them, is unreachable for most apps, even big-data ones.
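For example, sharding the data collection from the question could look roughly like this in the mongo shell (the database name and shard key are assumptions; pick a key that matches how you query):

    // Hedged sketch: shard the "data" collection on an invented key.
    use mydb
    sh.enableSharding("mydb")
    db.data.createIndex({ deviceId: 1, ts: 1 })              // the shard key needs a supporting index
    sh.shardCollection("mydb.data", { deviceId: 1, ts: 1 })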
See also:
What is the max size of collection in mongodb
MongoDB is a good database for storing large collections. You can take the steps below for better performance.
Replication
Replication means keeping copies of your data on a single server or across multiple servers.
It gives you a backup of your data every time you insert into your db.
References instead of large embedded documents
Try to build your collections with references, i.e. store references to related documents instead of embedding everything in one document.
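If replication is what you want, a minimal replica-set initiation in the mongo shell might look like this (hostnames are invented):

    // Hedged sketch: a three-member replica set; run this against one of the members.
    rs.initiate({
      _id: "rs0",
      members: [
        { _id: 0, host: "mongo1:27017" },
        { _id: 1, host: "mongo2:27017" },
        { _id: 2, host: "mongo3:27017" }
      ]
    })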

Inserting bulk data into MongoDB, I get Error: "mongo/util/concurrency/rwlock.h:204"

While inserting bulk data (around a million records) into MongoDB, I get this error:
WriteError({"code":8,"index":0,"errmsg":"assertion
C:\data\mci\src\src\mongo/util/concurrency/rwlock.h:204"
I am using Node.js and MongoDB.
Please let me know why this is happening; I can share code if needed.
For bulk insertion: the size of each document must be less than or equal to the maximum BSON document size.
The maximum BSON document size is 16 megabytes.
We can check it in the mongo shell by issuing the following command:
db.isMaster().maxBsonObjectSize/(1024*1024)
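For the bulk insert itself, splitting the million records into smaller insertMany batches keeps every request comfortably under the limit. A sketch (db/collection names and batch size are assumptions):

    // Hypothetical sketch: insert in batches so no single payload approaches 16MB.
    const { MongoClient } = require('mongodb');

    async function insertInBatches(docs, batchSize = 1000) {
      const client = await MongoClient.connect('mongodb://localhost:27017');
      const coll = client.db('test').collection('bulk');
      for (let i = 0; i < docs.length; i += batchSize) {
        // ordered: false lets the server continue past individual document errors
        await coll.insertMany(docs.slice(i, i + batchSize), { ordered: false });
      }
      await client.close();
    }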

MongoDB: How can I store files (Word, Excel, etc.)?

I have yet to look seriously into storing files such as Word, Excel, etc. in MongoDB, and I want to know: am I able to store whole docx or Excel files in MongoDB and then RETRIEVE them via querying?
Using GridFS, yes.
GridFS is a storage specification. It is not built into the DB but instead into the drivers.
You can find out more here: http://www.mongodb.org/display/DOCS/GridFS.
Its normal implementation is to break your big documents down into smaller ones and store those parts in a chunks collection, tracked by an fs.files collection which you query for your files.
MongoDB is a document database that stores JSON-like documents (called BSON). Maximum size of a BSON object is 16 megabytes, which may be too little for some use cases.
If you want to store binary data of arbitrary size, you can use GridFS (http://www.mongodb.org/display/DOCS/GridFS+Specification). GridFS automatically splits your documents (or any binary data) into several BSON objects (usually 256k in size), so you only need to worry about storing and retrieving complete documents (whatever their sizes are).
As far as I know, Mongoose doesn't support GridFS. However, you can use GridFS via its native driver's GridStore. Just run npm install mongodb and start hacking!
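Note that GridStore was the API in old versions of the native driver; current versions expose GridFS through GridFSBucket instead. A sketch of storing and retrieving a Word file with the newer API (file paths and names are invented):

    // Hedged sketch: round-trip a .docx file through GridFS with GridFSBucket.
    const fs = require('fs');
    const { MongoClient, GridFSBucket } = require('mongodb');

    async function roundTrip() {
      const client = await MongoClient.connect('mongodb://localhost:27017');
      const bucket = new GridFSBucket(client.db('files'));

      // upload: stream the file into fs.chunks, tracked by fs.files
      await new Promise((resolve, reject) => {
        fs.createReadStream('./report.docx')
          .pipe(bucket.openUploadStream('report.docx'))
          .on('finish', resolve)
          .on('error', reject);
      });

      // retrieve: look the file up by name and stream it back to disk
      await new Promise((resolve, reject) => {
        bucket.openDownloadStreamByName('report.docx')
          .pipe(fs.createWriteStream('./copy.docx'))
          .on('finish', resolve)
          .on('error', reject);
      });

      await client.close();
    }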
