Infinite scroll with Vue and Mongo - Node.js

I want to implement infinite scrolling in my Vue app.
I'm using Mongo, Node, Vue, and Mongoose, and I'm having trouble understanding the logic on the server side.
My issue is this:
let's say I have a collection with 3 million documents,
and as the user scrolls down, he should get 5 more documents each time.
How do I implement this as a Mongo query that sends 5 documents per call and knows to continue from the last record it sent when it's called again?
Thanks!

You can apply the same logic as pagination: when the user reaches the bottom, increment the page number and pass it to the query to fetch the next batch of records!
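For instance, here's a minimal sketch of that server side with Express and Mongoose (the route, the Item model, and the page size are assumptions, not from the question):

const express = require('express');
const Item = require('./models/item'); // hypothetical Mongoose model

const app = express();
const PAGE_SIZE = 5;

// GET /items?page=0, /items?page=1, ...; the client increments `page`
// each time the user scrolls to the bottom.
app.get('/items', async (req, res) => {
  const page = parseInt(req.query.page, 10) || 0;
  const items = await Item.find()
    .sort({ _id: 1 })        // stable order, so pages never overlap
    .skip(page * PAGE_SIZE)  // skip the documents already delivered
    .limit(PAGE_SIZE);       // return 5 documents per request
  res.json(items);
});

app.listen(3000);

One caveat with 3 million documents: skip() still walks past all the skipped records, so deep pages get slower and slower. A common alternative is to have the client send back the _id of the last document it received and query Item.find({ _id: { $gt: lastId } }).limit(5) instead.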

Related

I want to update multiple documents in MongoDB

I have a collection in MongoDB that holds 66,000 documents. I now want to add a new field to all these documents: a uniqueID field that takes the form S-1111, with the number increasing for each subsequent document.
I have tried making a call to the DB and updating each document in order, but it takes too much time because it sends a lot of requests to MongoDB. Is there another way to do this faster?
I am working with Node and Mongoose.
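For what it's worth, one way to batch this is Mongoose's Model.bulkWrite, which sends the whole set of updates in a few batched round trips instead of 66,000 individual requests. A sketch only; the Item model name is made up:

const Item = require('./models/item'); // hypothetical model

async function addUniqueIds() {
  // Fetch only the _ids, in a fixed order, so the numbering is deterministic
  const docs = await Item.find({}, { _id: 1 }).sort({ _id: 1 }).lean();

  const ops = docs.map((doc, i) => ({
    updateOne: {
      filter: { _id: doc._id },
      update: { $set: { uniqueID: `S-${1111 + i}` } },
    },
  }));

  // The driver batches these internally rather than issuing one
  // request per document.
  await Item.bulkWrite(ops);
}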

Solr API: accessing a document field is taking a lot of time

I am trying to access a field in a custom request handler. I am accessing it like this for each document:
Document doc = reader.document(id);
String[] docFields = doc.getValues("state");
There are around 600,000 documents in Solr. For a query running across all the docs, it takes more than 65 seconds.
I have also tried the SolrIndexSearcher.doc method, but it also takes around 60 seconds.
Removing the above lines of code brings the qtime down to milliseconds, but I need to access that field for my algorithm.
Is there a more optimised way to do this?
It seems that you are querying one document at a time, which is slow.
If you need to query all documents, try querying *:* (instead of asking for a specific id) and then iterate over the results.
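In client terms, that suggestion looks roughly like the sketch below: one q=*:* search, paged with start/rows, instead of one lookup per document. This is only an illustration; the core name, field name, and page size are assumptions, and inside a custom request handler the same idea applies to the search you run there.

const http = require('http');

// Fetch one page of results from Solr's select handler
function solrSelect(start, rows) {
  const url = 'http://localhost:8983/solr/mycore/select'
    + '?q=*:*&fl=state&start=' + start + '&rows=' + rows + '&wt=json';
  return new Promise((resolve, reject) => {
    http.get(url, (res) => {
      let body = '';
      res.on('data', (chunk) => { body += chunk; });
      res.on('end', () => resolve(JSON.parse(body)));
    }).on('error', reject);
  });
}

async function readAllStates() {
  const rows = 1000;
  for (let start = 0; ; start += rows) {
    const { response } = await solrSelect(start, rows);
    if (response.docs.length === 0) break;
    for (const doc of response.docs) {
      // use doc.state here instead of reader.document(id).getValues("state")
    }
  }
}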

Mongoose bulk insert or update documents

I am working on a Node.js app, and I've been searching for a way around using the Model.save() function, because I want to save many documents at the same time and it would be a waste of network and processing to do it one by one.
I found a way to bulk insert. However, my model has two properties that together make a document unique: an ID and a HASH (I am getting this info from an API, so I believe I need both pieces of information to make a document unique). So I want an already existing object to be updated instead of inserted.
Is there any way to do that? I was reading about making concurrent calls to save the objects using Q, but I still think this would generate unwanted load on the Mongo server, wouldn't it? Does Mongo or Mongoose have a method to bulk insert or update, like it does for insert?
Thanks in advance
I think you are looking for the Bulk.find(<query>).upsert().update(<update>) function.
You can use it this way:
var bulk = db.yourCollection.initializeUnorderedBulkOp();
for (<your for statement>) {
    bulk.find({ ID: <your id>, HASH: <your hash> }).upsert().update({ <your update fields> });
}
bulk.execute(<your callback>);
For each document, it will look for a document matching the {ID: <your id>, HASH: <your hash>} criteria. Then:
If it finds one, it will update that document using {<your update fields>}
Otherwise, it will create a new document
As you need, it will not make a round trip to the Mongo server on each iteration of the for loop; instead, a single call is made on the bulk.execute() line.
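On newer Mongoose versions the same pattern is also available directly on the model as Model.bulkWrite; a sketch, with MyModel and the items array standing in for your own names:

// items: the objects fetched from the API, each carrying ID and HASH
const ops = items.map((item) => ({
  updateOne: {
    filter: { ID: item.ID, HASH: item.HASH },
    update: { $set: item },
    upsert: true, // insert when no document matches ID + HASH
  },
}));

// One batched call instead of one save() per document
await MyModel.bulkWrite(ops);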

Best async practices for Jade template pages created from objects in mongodb

I have this website running on Node.js, using Jade for page templates and MongoDB to store some data. One of the pages is a 'parts' page: just a list of parts I use in some project, but that's not important.
So I have this scenario:
I add an object in the form of a doc to the DB
That updates a global array of those objects
That global array gets passed to the jade template which draws the list for each element in the global array of objects.
On server startup, should the server ever restart, it populates the global array by querying the DB once.
Here's my question:
Is this the best practice I can use in an async environment like Node?
My rationale is that I'm reducing the amount of time spent communicating with the DB by keeping the docs in the app.
Would it be better to query the DB for the docs and pass them to the template each time a user loads the page?
Is using a global like this even async, or is it just too fast to tell?
Note: it's only a few records; say, fewer than 200.
Note: there will never be more than one person adding to the array at a time.
Edit: third option, is this method bad altogether?
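For reference, the pattern the question describes looks roughly like this; a sketch only, with the Part model and routes as made-up names:

const express = require('express');
const Part = require('./models/part'); // hypothetical Mongoose model

const app = express();
app.set('view engine', 'jade');
app.use(express.urlencoded({ extended: false }));

// The global array of docs
let parts = [];

// On server startup, populate the global by querying the DB once
Part.find().lean().then((docs) => { parts = docs; });

// Adding a part writes to the DB, then updates the global array
app.post('/parts', async (req, res) => {
  const doc = await Part.create(req.body);
  parts.push(doc.toObject());
  res.redirect('/parts');
});

// Rendering reads from the global array, never from the DB
app.get('/parts', (req, res) => {
  res.render('parts', { parts });
});

app.listen(3000);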

How to get Post with Comments Count in single query with CouchDB?

I can use map-reduce to build a standalone view [{key: post_id, value: comments_count}], but then I'd have to hit the DB twice: one query to get the post, another to get comments_count.
There's also another way (Rails does this): count comments manually on the application server and save the count in a comment_count attribute of the post. But then we need to update the whole post document every time a comment is added or deleted.
It seems to me that CouchDB is not tuned for this: unlike an RDBMS, where we can update only the comment_count attribute, in CouchDB we are forced to update the whole post document.
Maybe there's another way to do it?
Thanks.
The view's JSON response includes the document count as 'total_rows', so you don't need to compute anything yourself; just emit all the documents you want counted.
{"total_rows":3,"offset":0,"rows":[
{"id":...,"key":...,value:doc1},
{"id":...,"key":...,value:doc2},
{"id":...,"key":...,value:doc3}]
}
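As a point of comparison, the view the question itself describes, with the built-in _count reduce, returns the per-post count as a single row; doc.type and doc.post_id are assumed field names:

// Map: one row per comment, keyed by the post it belongs to
function (doc) {
  if (doc.type === 'comment') {
    emit(doc.post_id, null);
  }
}

// Reduce: the built-in "_count"
// Query: GET /db/_design/app/_view/comments_count?key="<post_id>"&group=true

This still leaves the second request for the post document itself, which is the trade-off the question is asking about.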
