I have a mongoose schema. I want to count records in the collection that corresponds to the schema. I don't want to count all records, only the records that satisfy some criteria. How can I execute this count synchronously?
In the mongodb console I can do `db.mycollections.find(criteria).count()`. How can I execute this query from mongoose code?
Mongoose, like most Node.js modules, is not designed for synchronous execution. A synchronous query would stall the entire app while the database performs the query, which could take a long time.
There is an asynchronous count function which you call on your model.
Assuming you made a model from your schema like so:
var MyModel = mongoose.model('mySchemaCollection', mySchema);
You can get the count like so:
MyModel.count(criteria, function (err, count) {
  /* handle count */
});
You can read more about count, as well as other types of querying, from the Mongoose Documentation.
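For example, with a hypothetical `status` field in the schema (an assumption, not from your question), the criteria is just an ordinary MongoDB query document:
// hypothetical criteria: count only documents whose status is 'active'
var criteria = { status: 'active' };
MyModel.count(criteria, function (err, count) {
  if (err) return console.error(err);
  console.log('matching documents:', count);
});
Note that in recent Mongoose versions count has been deprecated in favour of countDocuments, which takes the same kind of filter.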
I have always created a single connection with one connection string. My question is: how do I make multiple connections (to multiple MongoDB instances) when an array of connection strings is given, inside a Node.js GET API?
Let's say all the connection strings point to the same type of database, e.g., my database name is "University" and this database is available in all the different locations. I want to write one common API that returns an array of universities gathered from the different connections. How can I do it?
Example
connectionString1 = mongodb://localhost:27017
connectionString2 = mongodb://localhost:27018
connectionString3 = mongodb://localhost:27019
Now I want to connect to all three connection strings, fetch all records from each of them, and send them in the response of one common API. How can I do this efficiently? Also, after each query completes, I need to close the corresponding database connection.
Your input will help me understand this structure better.
Execute your query against each database using an example function named exec, and await the returned promise array with Promise.allSettled. Once settled, merge the results (e.g., with reduce, and maybe sort).
// each db obtained with client.db(name)
const dbArr = [db1, db2, ...];
// execute the query for the given collection across each db, return a promise array
function exec(coll, query) {
  let p = [];
  for (let db of dbArr) {
    // toArray() turns the cursor into a promise of documents
    p.push(db.collection(coll).find(query).toArray());
  }
  return p;
}
// main
async function fetchUniversitiesBy(filter) {
  try {
    // build the mongo filter doc
    const query = filter;
    const results = await Promise.allSettled(exec('university', query));
    // merge the fulfilled results; each settled object has a `status`
    // and, when fulfilled, a `value` holding the documents
    return results.reduce(
      (acc, c) => (c.status === 'fulfilled' ? [...acc, ...c.value] : acc),
      []
    );
  } catch (e) {
    console.log(e);
  } finally {
    // call each client's `close()` here
  }
}
In terms of the 'API', invoke fetchUniversitiesBy where you defined your api/universities GET route (or however you defined it). Your request params can be passed in as the filter.
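As a rough sketch, assuming an Express app and that the filter simply comes from the query string (both assumptions, not part of your question), the route could look like this:
// hypothetical Express route that delegates to fetchUniversitiesBy
app.get('/api/universities', async (req, res) => {
  const universities = await fetchUniversitiesBy(req.query);
  res.json(universities);
});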
I have a complex solution and I just need to run knex synchronously. Is that possible?
I have a scenario where a knex query is run inside Promise.mapSeries over an array with an unknown number of elements. For each element some knex query is called, including an insert query.
So this insert can affect the result for the next element of the array.
var descriptionSplitByCommas = desc.split(",");
Promise.mapSeries(descriptionSplitByCommas, function (name) {
  // knex.select
  // knex.insert if the select doesn't return results
});
This was not my initial code, so maybe even Promise.mapSeries should be removed. But I need each element of the descriptionSplitByCommas array to be processed sequentially.
Otherwise, while processing the next description in the array, I often get an SQL error because of duplicate elements inserted into a column with a unique index. This would not happen if the queries were synchronous.
I am using native promises, so I do not have experience with mapSeries and cannot tell you exactly what is going on in your current code.
However, running several asynchronous commands in series instead of in parallel is quite common. There is one important thing you have to know: once you create a Promise, you have no control over how and when it resolves. So if you create 100 Promises, they all start resolving in parallel.
This is the reason there is no native method like Promise.series - it would not be possible.
What are your options? If you need to "create the promise in one place, but run it in another", then a factory function is your friend:
const runPromiseLater = () => Promise.resolve(25);
// some code
const myRealPromise = runPromiseLater();
myRealPromise.then((value) => { /* use the value here */ });
Of course, you can create an array of these factory functions; the question then is how to run them in series.
If you can use a Node version with async/await support, then a for loop is good enough:
async function runInSeries(array) {
  for (let i = 0; i < array.length; i++) {
    await array[i]();
    // or, if the array holds plain values instead of factory functions,
    // call the method with the value: await myMethod(array[i])
  }
}
If you can't use that, then the async library is your friend: https://caolan.github.io/async/docs.html#series
If you need to use the value from previous calls, you can use async.waterfall.
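Applied to your knex scenario, a minimal async/await sketch could look like the following. The table name `descriptions` and its unique `name` column are assumptions for illustration only:
async function processDescriptions(desc) {
  const names = desc.split(',');
  for (const name of names) {
    // wait for the select before deciding whether to insert
    const rows = await knex('descriptions').where({ name: name }).select('id');
    if (rows.length === 0) {
      // the insert finishes before the next name is processed,
      // so the unique index is never violated by this loop
      await knex('descriptions').insert({ name: name });
    }
  }
}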
I have a big collection in MongoDB (size: 94,605,081,327 B; count: 54,738,234 documents). I have to change all documents in this collection, and it can't be done in a single update.
I would like to stream the data and update the collection one document at a time. Another developer on the same project recommended an approach like this:
stream.on('data', (data) => {
  stream.pause();
  data.field = 'newValue'; // update
  data.save((err) => {
    stream.resume();
  });
});
Is it a good idea? Is there a more efficient way of doing this update with Node.js and Mongoose?
You can use a bulk operation. Check out the MongoDB documentation for detailed information. If you need to involve mongoose, your code would look something like this:
var bulk = Items.collection.initializeOrderedBulkOp();
bulk.find(query).update(update);
bulk.execute(function (error) {
  callback();
});
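If each document really does need its own update (for example when the new value depends on the old one), one hedged alternative is to combine a cursor with bulkWrite so updates are sent in batches rather than one round trip per document. The batch size of 1000 and the field being set are assumptions for the sketch:
// sketch: stream the collection and flush updates in batches
async function updateAll() {
  const cursor = Items.find().cursor();
  let ops = [];
  for (let doc = await cursor.next(); doc != null; doc = await cursor.next()) {
    ops.push({
      updateOne: {
        filter: { _id: doc._id },
        update: { $set: { field: 'newValue' } }
      }
    });
    if (ops.length === 1000) {
      await Items.bulkWrite(ops); // one round trip for 1000 updates
      ops = [];
    }
  }
  if (ops.length > 0) {
    await Items.bulkWrite(ops);
  }
}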
First time working with MongoDB and Mongoose.
I have written an edit function which does something with the input params and then calls an update on the model:
edit: function (input) {
  var query = ...
  var document = ...
  return Model.update(query, document, {multi: true});
}
The function returns a Promise with the number of affected documents.
I know that the Mongoose update function
updates documents in the database without returning them
so I was wondering if there is a way to somehow:
run my edit function
if everything goes well, run a find on the model in order to retrieve the updated documents, and return the find's Promise.
Any help would be appreciated.
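A minimal sketch of that flow, assuming the same query still matches the documents after the update (which may not hold if the update changes the fields the query filters on), is to chain the find onto the update's promise:
edit: function (input) {
  var query = ...
  var document = ...
  return Model.update(query, document, {multi: true})
    .then(function () {
      // the update succeeded, now fetch the updated documents
      return Model.find(query).exec();
    });
}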
In my db, I have a "Thing", whose id is "53e5fec1bcb589c92f6f38dd".
I want to update the count member in it every time doAdd is called.
Like the code below.
But since the find and save operations are separate, I don't get the desired result.
Any best practice about this situation?
Thanks in advance!
var mongoose = require('mongoose');
mongoose.connect('mongodb://localhost/test');
var ThingSchema = new mongoose.Schema({
  count: Number,
});
var Thing = mongoose.model('Thing', ThingSchema);
var doAdd = function(id){
  Thing.findById(id, function(err, thing){
    if(******){
      // perform some logic...
      thing.count++;
    }
    thing.save();
  });
};
var id = "53e5fec1bcb589c92f6f38dd";
doAdd(id);
doAdd(id);
The problem is that you expect the count to increase by 2 because you call doAdd twice. But the code in doAdd is asynchronous, i.e., the second doAdd call runs before the first call's query and save have completed. The results are unpredictable due to this asynchronous nature. All mongoose queries and save calls are asynchronous.
So if you call
thing.save();
console.log('saved!!');
The console output appears before the document is actually saved, because node doesn't wait for the save to finish. So if you need any operation to be done after the document is saved, place it in the save callback.
thing.save(function(err){
  if(!err){
    console.log('saved');
  }
});
If you want to avoid deeply nested callbacks, have a look at the async module.
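For example, a hedged rewrite of doAdd that takes a callback lets you start the second call only after the first save has finished:
var doAdd = function(id, done){
  Thing.findById(id, function(err, thing){
    if(err) return done(err);
    // perform some logic...
    thing.count++;
    thing.save(function(err){
      done(err);
    });
  });
};

// the second increment starts only after the first save has completed
doAdd(id, function(err){
  if(!err) doAdd(id, function(){});
});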