MongoDB insertMany not inserting all documents when document count is over 50k - node.js

I am trying to add a list of contacts to my db, using insertMany to do so. Everything works fine if the number of contacts I am adding is within 50k; if it exceeds that, not all of the data is saved to the db. The response I get after the insert is { ok: 1, n: 1047 }. Does anybody know why this is happening?
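For reference, a common workaround is to split a large array into smaller batches and sum the counts, so a partial failure in one batch does not silently drop the rest. A minimal sketch with the Node.js driver; the batch size of 1000, the function name, and the variable names are assumptions, not from the original post:

  // Hedged sketch: insert `docs` in fixed-size batches and report the total.
  // `ordered: false` lets the driver continue past individual failed documents.
  async function insertInBatches(collection, docs, batchSize = 1000) {
    let inserted = 0;
    for (let i = 0; i < docs.length; i += batchSize) {
      const batch = docs.slice(i, i + batchSize);
      const result = await collection.insertMany(batch, { ordered: false });
      inserted += result.insertedCount; // compare against batch.length to spot drops
    }
    return inserted;
  }

Comparing the returned total against the input length makes it obvious which batch, if any, lost documents.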

Related

Insert records into database and at the same time notify users about the number of records inserted or failed in node.js

I have a requirement like this:
Maximum 500 records.
I have to insert records into a table. However, before inserting them I have to check whether that same record or its parents are already inserted.
What I want to achieve: how can I notify the user once the records are inserted in node.js?
Example: if I am uploading 400 records and 5 records are inserted, the user should be notified that 5 records were inserted; if any failed, the failed record count should be reported as well.
Any help would be really appreciated.
Igor already told you how to write a question, so you should follow what he wrote.
Now, answering your question: you basically need to control the insertion and keep two counters, for example let inserted and let notInserted (or var inserted and var notInserted).
For each insertion, check whether the record already exists; if it does, increment notInserted, otherwise increment inserted.
At the end, return the result to the user: res.json({ message: "Inserted: " + inserted + ", not inserted: " + notInserted });
Something like this!
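A hedged sketch of that counter approach, assuming an Express route and hypothetical recordExists()/insertRecord() helpers (the names and route are illustrative, not a specific API):

  app.post('/records', async (req, res) => {
    let inserted = 0;
    let notInserted = 0;
    for (const record of req.body.records) {
      if (await recordExists(record)) {   // assumed helper: checks record/parents
        notInserted += 1;
      } else {
        await insertRecord(record);       // assumed helper: performs the insert
        inserted += 1;
      }
    }
    res.json({ message: 'Inserted: ' + inserted + ', not inserted: ' + notInserted });
  });

With a maximum of 500 records, a sequential loop like this is fine, and the single response at the end carries both counts back to the user.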

CouchDB conflicts when supplying own ID with large inserts using _bulk_docs

The same code works fine when letting Couch auto-generate UUIDs. I am starting off with a completely new, empty database, yet I keep getting this:
error: conflict
reason: Document update conflict
To reiterate: I am posting new documents to an empty database, so I'm not sure how I can get update conflicts when nothing is being updated. Even stranger, the conflicting documents still show up in the DB with only a single revision, but overall there are missing records.
I am trying to insert about 38,000 records with _bulk_docs in batches of 100. I am getting these records (100 at a time) from a RETS server; each record already has a unique ID that I want to use for the couchDB _id instead of their UUIDs. I am using a promise-based library to get the records and axios to insert them into Couch. After getting the first batch of 100, I run this code to add an _id to each of the 100 records before inserting:
let batch = [];
batch = records.results.map((listing) => {
  let temp = listing;
  temp._id = listing.ListingKey;
  return temp;
});
Then insert:
axios.post('http://127.0.0.1:5984/rets_store/_bulk_docs', { docs: batch })
This is all inside of a function that I call recursively.
I know this probably won't be enough to see the issue, but I thought I'd start here. I know for sure it has something to do with my map() and adding the _id = ListingKey.
Thanks!
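One plausible cause of this symptom is repeated _id values within or across batches: CouchDB keeps the first document with a given _id (one revision) and rejects the repeats as conflicts, which matches both the single-revision documents and the missing records. A minimal sketch of de-duplicating by ListingKey before posting, using the names from the snippet above (the Set-based filter is illustrative, not from the post):

  const seen = new Set(); // hoist outside the recursive function to catch repeats across batches
  const uniqueBatch = batch.filter((doc) => {
    if (seen.has(doc._id)) return false; // drop repeated ListingKeys
    seen.add(doc._id);
    return true;
  });

  axios.post('http://127.0.0.1:5984/rets_store/_bulk_docs', { docs: uniqueBatch });

If the inserts succeed with uniqueBatch, the conflicts were coming from duplicate keys rather than from _bulk_docs itself.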

Speeding up my Cloudant query

I was wondering whether someone could provide some advice on my Cloudant query below. It is now taking upwards of 20 seconds to execute against a DB of 50,000 documents; I suspect I could be getting better speed than this.
The purpose of the query is to find all of my documents with the attribute "searchCode" equalling a specific value plus a further list of specific IDs.
Both searchCode and _id are indexed - any ideas why my query would be taking so long / what I could do to speed it up?
mydb.find({
  selector: { "$or": [{ "searchCode": searchCode }, { "_id": { "$in": idList } }] }
}, function (err, result) {
  if (!err) {
    fulfill(result.docs);
  } else {
    console.error(err);
  }
});
Thanks,
James
You could try splitting this into two separate queries:
find me documents where the searchCode = 'some value'
find me documents whose ids match a list of ids
The first can be achieved with a find call and a query like so:
{ selector: {"searchCode": searchCode} }
The second can be achieved by hitting the database's _all_docs endpoint, passing in the list of ids as a keys parameter, e.g.
GET /db/_all_docs?keys=["a","b","c"]
You might find that running both requests in parallel and merging the results gives you better performance.
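A hedged sketch of the parallel version, assuming the nano/Cloudant Node.js client used in the question (its db.fetch POSTs to _all_docs with the keys list and include_docs=true); the merge logic is illustrative:

  Promise.all([
    new Promise((resolve, reject) =>
      mydb.find({ selector: { searchCode: searchCode } }, (err, result) =>
        err ? reject(err) : resolve(result.docs))),
    new Promise((resolve, reject) =>
      mydb.fetch({ keys: idList }, (err, result) =>
        // rows for missing ids have no .doc, so filter them out
        err ? reject(err) : resolve(result.rows.filter((r) => r.doc).map((r) => r.doc)))),
  ]).then(([bySearchCode, byId]) => {
    // merge and de-duplicate by _id, since a document may match both queries
    const merged = new Map();
    bySearchCode.concat(byId).forEach((doc) => merged.set(doc._id, doc));
    fulfill(Array.from(merged.values()));
  }).catch(console.error);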

MongoDB: how to compare DB to new data

Each week I receive a new copy of source data (approx. 8,500 records and growing, with an id field that Mongo uses as _id) and I want to look for, and save while keeping the old data, any updated information (about 30 changes/additions per month are likely). I'm trying to work out the best approach.
My first thought was, for each entry in the new data, to get the DB entry with that _id, compare, and update the data where changed. But that results in 8,500 asynchronous calls over the net (to mongolab), plus 30 upserts where new/changed data needs to be saved.
So the alternative is to download everything at the outset. But then I end up with an Array from Mongo and would need to do Array.find each time to get the element that matches the new data.
Is there a Mongo command to return the results of .find({}) as a JavaScript Object keyed by _id? Or does it otherwise make sense to take the raw array from Mongo and convert it to an object myself?
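As far as I know there is no find() option that returns an object keyed by _id, but converting the array yourself is a one-liner; a minimal sketch with the Node.js driver:

  collection.find({}).toArray(function (err, docs) {
    if (err) return console.error(err);
    // build an object keyed by _id for O(1) lookups instead of Array.find
    const byId = docs.reduce(function (acc, doc) {
      acc[doc._id] = doc;
      return acc;
    }, {});
    // byId[newRecord._id] now returns the matching DB entry directly
  });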
I would store:
id + version + date + data
For each update:
Make a dump of the prod DB for local usage.
Work offline, in a local MongoDB (because you don't want to launch 9,000 queries over the web).
For each line, compare the data to the Mongo data; if it has been modified, store a new (id + version) document, else skip it.
Make a dump of your local DB.
Install the dump in the production environment.
See the MongoDB docs on dumps (mongodump).
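A hedged sketch of the per-record compare step against the local MongoDB, assuming documents are stored as { recordId, version, date, data }; all names are illustrative, not from the answer:

  async function saveChanged(collection, records) {
    for (const record of records) {
      // latest stored version of this record, if any
      const latest = await collection
        .find({ recordId: record.id })
        .sort({ version: -1 })
        .limit(1)
        .next();
      // naive deep compare via JSON; swap in a real comparison if key order varies
      const changed = !latest ||
        JSON.stringify(latest.data) !== JSON.stringify(record.data);
      if (changed) {
        await collection.insertOne({
          recordId: record.id,
          version: latest ? latest.version + 1 : 1,
          date: new Date(),
          data: record.data,
        });
      }
    }
  }

Because old versions are kept as separate documents, the history survives each weekly load; only changed or new records add rows.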

Meteor: last executed query in MongoDB?

Are Meteor Mongo and MongoDB queries the same? I am using an external MongoDB, so I need to debug my query. Is there any way to find the last executed query in Mongo?
I don't know if this works in Meteor Mongo, but you seem to be using an external Mongo, so you can set up profiling with a capped collection, so that the collection never grows over a certain size. If you only need the last op, you can make the size much smaller than this.
db.createCollection( "system.profile", { capped: true, size:4000000 } )
The mongo doc is here: http://docs.mongodb.org/manual/tutorial/manage-the-database-profiler/
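For completeness, the sequence described in that tutorial looks roughly like this in the mongo shell (the size is an example):

  db.setProfilingLevel(0)                // disable profiling first
  db.system.profile.drop()               // drop the existing profile collection
  db.createCollection("system.profile", { capped: true, size: 4000000 })
  db.setProfilingLevel(2)                // profile all operations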
From the mongo docs:
To return the most recent 10 log entries in the system.profile
collection, run a query similar to the following:
db.system.profile.find().limit(10).sort( { ts : -1 } ).pretty()
Since it's sorted inversely by time, just take the first record from the result.
Otherwise you could roll your own with a temporary client-only mongo collection:
Queries = new Mongo.Collection(null);
Create an object containing your query, remove the last record, and insert the new one.
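A hedged sketch of that idea; Mongo.Collection(null) is a real Meteor API for local collections, while logQuery and the selector are illustrative:

  const Queries = new Mongo.Collection(null); // client-only, never persisted

  function logQuery(selector) {
    Queries.remove({});                       // drop the previous entry
    Queries.insert({ selector: selector, at: new Date() });
  }

  logQuery({ userId: Meteor.userId() });      // hypothetical query to record

Because the collection is local, remove({}) with an empty selector is allowed, so it always holds exactly the one most recent query.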
