Good morning,
I thought that by following the instructions in Push element into nested array mongoose nodejs I would manage to solve my issue, but I am still stuck.
I am trying to get the results of a Mongoose query into an array. The code below works when the number of objects is relatively small, but I get a "parse error" whenever the volume increases.
I can see I am not accounting for the fact that the code is asynchronous, but the attempts I have made end up in Promise { <pending> } at best.
const Collection = require("./schema");
const connection = require("mongoose");
let data = [];
Collection.find({}, (e, result) => {
  result.forEach(doc => {
    data.push([doc["a"], doc["b"]]);
  });
}).then(() => connection.close());
module.exports = data;
The above is obviously wrong, since I am not respecting the async nature of the operation.
I implemented the function below, but I don't understand how I should resolve the promise.
async function getdata() {
  const cursor = Collection.find().cursor();
  let data = [];
  await cursor.eachAsync(async function (doc) {
    await new Promise(resolve => resolve(data.push(doc)));
  });
}
The aim is that when I do let data = require("./queryResult.js"), data contains all the necessary data: [[a,b],[a,b]].
Could anybody give me a hand on this one?
Many thanks in advance.
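One pattern that fits the stated aim is to export the promise itself rather than the (still empty) array, since a module cannot synchronously export data that arrives asynchronously. Below is a minimal sketch: fetchDocs is a hypothetical stand-in for Collection.find().lean().exec() so the snippet is self-contained.

```javascript
// Sketch: export the promise, not the array.
// fetchDocs stands in for Collection.find().lean().exec().
const fetchDocs = async () => [{ a: 1, b: 2 }, { a: 3, b: 4 }];

// Pure helper: shape docs into [[a, b], ...] pairs.
const toPairs = (docs) => docs.map((doc) => [doc.a, doc.b]);

async function getData() {
  const docs = await fetchDocs();
  return toPairs(docs);
}

// The module exports a promise; consumers await it.
const dataPromise = getData();
module.exports = dataPromise;
```

At the call site you would then write const dataPromise = require("./queryResult.js") and await it (or chain .then) before using the array.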
I have some async code that makes calls to a Mongo database and inserts/fetches items. When I am developing locally, the code below works fine. However, when I make the Mongoose instance connect to MongoDB Atlas, issues arise. In particular, my code does not seem to work properly unless I console.log the promise, which makes no sense to me. For example, with the console.log statement, all my tests pass as expected. Without it, 35 tests fail... This is because the promise I am expecting returns null when it should return some JSON object from the database. Is my code not blocking properly?
It feels like I'm dealing with Schrodinger's cat... Any help would be appreciated. Thanks in advance.
Below is an example promise/function call. I then pass it into _executeQuery. I have await on relevant functions, so I don't think it's because I'm missing the word await somewhere.
_inSomeAsyncFunction = async () => {
  const dbQueryPromise = this._dbModel.findById(_id, modelView).lean();
  await this._executeQuery({ dbQueryPromise, isAccessPermitted: true });
};
_executeQuery basically gets the result of the promise if the user has access.
private _executeQuery = async (props: {
  isAccessPermitted: boolean;
  dbQueryPromise: Promise<any>;
}): Promise<any> => {
  const { isAccessPermitted, dbQueryPromise } = props;
  if (!isAccessPermitted) {
    throw new Error('Access denied.');
  }
  console.log(dbQueryPromise, 'promise'); // without this line, dbQueryResult would be null...
  const dbQueryResult = await dbQueryPromise;
  return dbQueryResult;
};
After some more testing, I found out that the first API call works, but any calls after that return null...
EDIT:
this._dbModel is some mongoose schema. For example,
const dbSchema = new Schema({
  name: String,
});
const dbModel = mongoose.model('DbSchema', dbSchema);
Try replacing your dbQueryPromise as follows:
const dbQueryPromise = this._dbModel.findById(_id, modelView).lean().exec();
Mongoose queries do not get executed unless you pass a callback function or call .exec().
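For background: a Mongoose Query is a thenable rather than a true Promise, and .exec() is what hands you a genuine Promise you can store, pass around, and attach handlers to. The distinction can be illustrated with a hand-made thenable standing in for the real Query class (hypothetical object, not actual Mongoose internals):

```javascript
// A query-like thenable: awaitable, but not an instance of Promise.
// exec() returns a real Promise, mirroring Mongoose's behavior.
const fakeQuery = {
  then(onFulfilled, onRejected) {
    return Promise.resolve({ name: 'doc' }).then(onFulfilled, onRejected);
  },
  exec() {
    return Promise.resolve({ name: 'doc' });
  },
};

console.log(fakeQuery instanceof Promise);        // false - merely a thenable
console.log(fakeQuery.exec() instanceof Promise); // true - a real Promise
```

This is why code that stores a query in a variable and awaits it later behaves more predictably once .exec() is added: you are holding an actual Promise from the start.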
For anyone else having similar problems, here's how I solved it:
I changed
const dbQueryResult = await dbQueryPromise;
to
const dbQueryResult = await dbQueryPromise.then((doc) => {
  return doc;
});
The goal of my function is to loop through several 'community' documents in the collection 'communities'. Each community document has a collection of documents called 'posts' where I query the document with the highest value of 'hotScore'. I then loop through those documents (contained in postsQuerySnapArray) to access the data in them.
My issue is that when I loop through the postQuerySnapArray, every document in postQuerySnap is of type undefined. I have verified that all communities contain a 'posts' collection and every post document has a 'hotScore' property. Does anyone know what could be causing this behavior? Thanks!
exports.sendNotificationTrendingPost = functions.https.onRequest(async (req, res) => {
  try {
    const db = admin.firestore();
    const communitiesQuerySnap = await db.collection('communities').get();
    const communityPromises = [];
    communitiesQuerySnap.forEach((community) => {
      let communityID = community.get('communityID');
      communityPromises.push(db.collection('communities').doc(communityID).collection('posts').orderBy('hotScore', 'desc').limit(1).get());
    });
    const postsQuerySnapArray = await Promise.all(communityPromises);
    postsQuerySnapArray.forEach((postsQuerySnap, index) => {
      const hottestPost = postsQuerySnap[0]; // postsQuerySnap[0] is undefined!
      const postID = hottestPost.get('postID'); // Thus, an error is thrown when I call get on hottestPost
      // function continues...
Finally figured out what my problem was. Instead of getting the element in postsQuerySnap by calling
const hottestPost = postsQuerySnap[0];
I changed my code to get the element using a forEach on postsQuerySnap:
var hottestPost;
postsQuerySnap.forEach((post) => {
  hottestPost = post;
});
I'm still not quite sure why postsQuerySnap[0] didn't work originally, so if anyone knows please leave a comment!
Edit: As Renaud said in his comment, a better fix would be const hottestPost = postsQuerySnap.docs[0] since postsQuerySnap is not an array.
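The difference can be seen with a minimal stand-in object that mimics a QuerySnapshot's shape (hypothetical data; the real class lives in the Firestore SDK). The documents live on the docs array property, so numeric indexing on the snapshot itself yields undefined:

```javascript
// Stand-in mimicking a Firestore QuerySnapshot: it is not an array;
// its documents live on the `docs` property, and it exposes forEach.
const postsQuerySnap = {
  docs: [
    { get: (field) => ({ postID: 'abc123', hotScore: 98 }[field]) },
  ],
  forEach(callback) {
    this.docs.forEach(callback);
  },
};

console.log(postsQuerySnap[0]);         // undefined - no numeric indexing
const hottestPost = postsQuerySnap.docs[0];
console.log(hottestPost.get('postID')); // 'abc123'
```

That is why the forEach workaround happened to work: forEach is a method the snapshot provides over its internal docs array, while [0] is looking up a property the object simply does not have.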
MongoDB version 3.6.7, MongoDB Node.js driver version 3.1.10.
I have a function which should add some data to the DB in a bulk unordered operation. When bulk.execute() is called, the data is inserted into the DB (I have manually checked this is the case); however, the BulkWriteResult object which should be generated as a result of the execution isn't being returned.
We need that result for use in other functions to determine the number of insertions etc. So when the function in question is chained in other functions it returns undefined, and when adding bulk.execute().then(console.log), nothing is logged to the terminal either.
The function takes 3 parameters: the MongoClient, the name of the collection, and the documents to be inserted into the DB, which is an array of documents.
I have also tried adding err and result callbacks, with no luck; none of the console.logs are reached.
batch.execute((err, result) => {
  console.log('RESULT INSERTED:', result.nInserted)
  console.log('RESULT ERRORS:', result.getWriteErrorCount())
  console.log('RESULT INSIGHTS:', result.getWriteErrors())
  console.log('ERROR:', err)
})
Any ideas why the BulkWriteResult would not be returned even though the bulk insertion is successful? The function was working and returning the expected object prior to upgrading the MongoDB Node driver to 3.1.10.
const insertManyMissingEntries = (database, collectionName, documents) => {
  const db = database.db('data')
  const collection = db.collection(collectionName)
  const batch = collection.initializeUnorderedBulkOp()
  documents.forEach(doc => {
    batch
      .find({ year: doc.year, month: doc.month, code: doc.code })
      .upsert()
      .updateOne({ '$setOnInsert': doc })
  })
  return batch.execute()
}
Here is your function, rewritten to use bulkWrite() instead of the bulk operations API:
const insertManyMissingEntries = (database, collectionName, documents) => {
  if (documents.length === 0)
    return Promise.reject('documents length is zero')
  const operations = documents.map((doc) => {
    const filter = { year: doc.year, month: doc.month, code: doc.code }
    const update = { $setOnInsert: doc }
    return { updateOne: { 'filter': filter, 'update': update, 'upsert': true } }
  })
  const db = database.db('data')
  const collection = db.collection(collectionName)
  const bulkOptions = { ordered: false }
  return collection.bulkWrite(operations, bulkOptions)
}
If the documents parameter is empty (i.e. length === 0), the bulkWrite call will always fail, so guard against it.
You also need to make sure you have an index on the fields year, month, and code, or your bulk operation will execute slowly, because every update will otherwise perform a collection scan to find the document.
Remember, the driver docs are your best friend:
http://mongodb.github.io/node-mongodb-native/3.1/api/Collection.html
There wasn't a way to resolve this issue, though we believe it was caused by a Docker Mongo image that was not the official one or was out of date. Even the folks at MongoDB weren't able to replicate it, or to explain any circumstances in which a promise should fail to either resolve or reject within the timescales the functions were working to. So, like many things, it remains a mystery that we had to work around: we used a ternary to handle the undefined that resulted from the promise error, without ever actually solving the underlying issue. Not ideal or best practice, but in the absence of other options it's what we did.
Thanks for everyone's help and comments to try to resolve it, much appreciated.
This is a follow-up to this Stack Overflow question: Async Cursor Iteration with Asynchronous Sub-task, with a slightly different turn this time.
While iterating over MongoDB documents, the task stops in the middle if the target DB size is too large. (There are more than 3,000 documents in a single collection, and each document consists of lengthy texts, so .toArray is not really feasible due to the memory limit. 3,000 is just a part of the whole data, and the full data might be more than 10,000 documents.) I've noticed that if the number of documents in a collection is larger than approx. 750, it just stops in the middle of the task.
I've searched previous Stack Overflow questions to solve this: some say iteration over a large collection requires using stream, each, or map instead of for/while with a cursor. When I tried these recommendations in real life, none of them worked. They also just stop in the middle, bearing almost no difference from the for/while iteration. I don't really like the idea of expanding the timeout, since it may leave the cursor drifting around in memory, but that didn't work either.
Every method below runs inside an async function.
stream method
const cursor = db.collection('mycollection').find()
cursor.on('data', async doc => {
  await doSomething(doc) // do something with doc here
})
while/for method (just replace while with for)
const cursor = db.collection('mycollection').find()
while (await cursor.hasNext()) {
  let doc = await cursor.next()
  await doSomething(doc)
}
map/each/forEach method (replace map with each/forEach)
const cursor = db.collection('mycollection').find()
cursor.map(async doc => {
  await doSomething(doc)
})
None of them shows any difference from the others. They just stop after iterating through approx. 750 documents and hang. I've even tried registering each document in a Promise.all queue and doing the async/await task all at once later, so that the cursor won't spend too much time iterating, but the same problem arises.
EDIT: I think doSomething() confuses the other readers. So I have created a sample code so that you can reproduce the problem.
const MongoClient = require('mongodb').MongoClient
const MongoUrl = 'mongodb://localhost:27017/'
const MongoDBname = 'testDB'
const MongoCollection = 'testCollection'
const moment = require('moment')

const getDB = () =>
  new Promise((resolve, reject) => {
    MongoClient.connect(MongoUrl, (err, client) => {
      if (err) return reject(err)
      console.log('successfully connected to db')
      return resolve(client.db(MongoDBname))
      client.close() // note: unreachable, as it follows the return
    })
  })

;(async () => {
  console.log(`iteration begins on ${moment().format('YYYY/MM/DD hh:mm:ss')} ------------`)
  let db = await getDB() // receives mongodb
  // iterate through all db articles...
  const cursor = await db.collection(MongoCollection).find()
  const maxDoc = await cursor.count()
  console.log('Amount of target documents: ' + maxDoc)
  let count = 0
  // replace this with stream/while/map... any other iteration method
  cursor.each((err, doc) => {
    count++
    console.log(`preloading doc No.${count} async ${(count / maxDoc * 100).toFixed(2)}%`)
  })
})()
My apologies. On the test run, it actually iterated over all the documents... I think I have really done something wrong in the other parts. I'll elaborate on this, together with the parts causing the trouble.
I am trying to use Typescript with Express and Mongoose. So far the result has been amazing. I am however stuck at a very minor part.
Premise: I am executing a Mongoose query using exec():
let result = await UserModel.User.find().exec();
I have to use async/await, as there is some processing after this line and I want to avoid callbacks throughout.
Problem
I need to get the {err, data} from the result object returned by the query. However, currently it simply holds the entire data, and I am not able to perform any error handling.
So I need a way to get the Mongoose error description when I use async/await.
Error handling using async/await is done by using try/catch:
try {
  let result = await UserModel.User.find().exec();
  ...
} catch (err) {
  ...
}
Try using the library await-to-js
Example:
const to = require('await-to-js').default
const [err, result] = await to(func())
if (err) throw err
...
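If pulling in a dependency feels heavy, the helper is small enough to hand-roll. Here is a sketch equivalent to what await-to-js's default export does (the name to and the UserModel usage below mirror the snippets above):

```javascript
// Minimal hand-rolled equivalent of await-to-js's default export:
// resolves to [null, data] on success and [err, null] on failure,
// so the caller can branch on err without a try/catch.
const to = (promise) =>
  promise.then((data) => [null, data]).catch((err) => [err, null]);

// Usage with the query from the question:
//   const [err, result] = await to(UserModel.User.find().exec());
//   if (err) { /* handle the Mongoose error here */ }
```

The tuple shape keeps the happy path flat, at the cost of having to remember to check err on every call.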