Nested document implementation in MongoDB, just like Reddit comments - Node.js

In my project, I would like to implement a comment section that consists of a list of comments.
const comment = mongoose.Schema({
    _id: mongoose.Schema.Types.ObjectId,
    commentid: String, // id consists of uuid1()
    senderid: Number,  // user's id
    body: String,      // useful information
    parent: String,    // parent id consists of uuid1()
    children: []       // I couldn't declare children: [comment]; it gives me an error
                       // because the schema references itself, so I leave it empty
                       // and fill it when a reply comes
});

const myschema = mongoose.Schema({
    _id: mongoose.Schema.Types.ObjectId,
    // other fields
    comments: [comment]
}, { collection: 'TABLE_NAME' });
In the request body, I will receive JSON in the following format:
{
    "userid": "NUMBERID HERE",
    "comment": "COMMENT BODY",
    "parent": "Comment's parent id"
}
I would like to add a comment which can be a child of another comment. How can I search and find the appropriate position?
If there is no parent in the JSON body, I'm doing this:
// comment is imported somewhere at the beginning
MyDOC.findOne({ 'id': req.params.id })
    .exec()
    .then(doc => {
        var newc = new comment();
        newc.cid = uuidv1();
        newc.sender = req.body.userid;
        newc.body = req.body.comment;
        newc.parent = "";
        newc.children = [];
        doc.comments.push(newc);
        // save to DB
        doc
            .save()
            .then(docres => {
                res.status(200).json(docres);
            })
            .catch(err => {
                res.status(500).json({ error: err });
            });
    });
I have no idea how to find a comment that resides at a deeper level.

You cannot query for an array element or object property at an unspecified, arbitrarily nested depth; MongoDB simply doesn't support it. You would instead need to handle this in the application layer. Denormalizing your data is fine in many cases, but arbitrary nesting depth is not a recommended use case for denormalization, especially since you can't index it efficiently!
If you want a pure MongoDB solution, then you'll need a different document structure. I would recommend taking a look at the documentation, particularly the section concerning an array of ancestors, in order to properly model your data.
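For illustration, here is a minimal sketch of that ancestors-array pattern applied to comments, with each comment stored as its own document rather than nested inside the post. The Comment model, field names, and uuid import are assumptions for the sketch, not the asker's schema:

// A comment lives in its own collection; 'ancestors' holds every commentid on
// the path from the root comment down to the direct parent.
const mongoose = require('mongoose');
const { v1: uuidv1 } = require('uuid');

const commentSchema = new mongoose.Schema({
    postId: mongoose.Schema.Types.ObjectId,
    commentid: String,                          // uuid, as in the question
    senderid: Number,
    body: String,
    parent: { type: String, default: null },    // direct parent's commentid
    ancestors: { type: [String], default: [] }  // [rootId, ..., parentId]
});
commentSchema.index({ ancestors: 1 });
const Comment = mongoose.model('Comment', commentSchema);

// Inserting a reply: copy the parent's ancestors and append the parent itself.
async function addReply(parentCommentId, senderid, body) {
    const parent = await Comment.findOne({ commentid: parentCommentId });
    return Comment.create({
        postId: parent.postId,
        commentid: uuidv1(),
        senderid,
        body,
        parent: parent.commentid,
        ancestors: [...parent.ancestors, parent.commentid]
    });
}

// All descendants of a comment, at any depth, in a single indexable query:
// await Comment.find({ ancestors: someCommentId });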

I have found a solution.
Manually traversing the comments and inserting in the right place works, but at first it only seemed to work up to a certain level. In my case, I could insert a comment below another comment and save it. I could also insert a third-level comment and see the JSON dump of the object I had inserted, or even wrap it in an HTTP response and send it to the client, but the database did not persist the update.
What I did is, after inserting the comment in the correct place, add this line before saving to the database:
doc.markModified('comments'); // this is added
doc.save().then(/*prepare response*/);
This tells Mongoose that the document has been changed, so it saves the updated version. Because children is an untyped array, Mongoose cannot detect changes made deep inside it; without markModified it assumes the path is unmodified and skips it on save.
Hopefully, this helps people who encounter this issue.
This is my implementation
// find the document the comment is going to be inserted into
MyDOC.findOne({ 'id': req.params.id })
    .exec()
    .then(doc => {
        // if the comment has a parent, it should be inserted as a nested child
        if (req.body.parent != null && req.body.parent != undefined) {
            // find the correct position and insert
            findComment(doc.comments, req.body);
            doc.markModified('comments');
            doc
                .save()
                .then(docres => {
                    res.status(200).json(docres);
                })
                .catch(err => {
                    console.log(err);
                    res.status(500).json({ error: err });
                });
        }
        else {
            // add the comment at the root level
            var comment = new Comment();
            comment.cid = uuidv1();
            comment.body = req.body.comment;
            comment.parent = "";
            doc.comments.push(comment);
            doc.markModified('comments');
            // save to DB
            doc
                .save()
                .then(docres => {
                    res.status(200).json(docres);
                })
                .catch(err => {
                    console.log(err);
                    res.status(500).json({ error: err });
                });
        }
    });
function findComment(comments, body)
{
    for (const element of comments) {
        // if this comment's id matches the new comment's parent, insert it here
        if (element.cid === body.parent) {
            var comment = new Comment();
            comment.cid = uuidv1();
            comment.body = body.comment;
            comment.parent = body.parent;
            element.children.push(comment);
            return true;
        }
        // otherwise keep searching in this comment's children
        if (element.children.length != 0 && findComment(element.children, body)) {
            return true;
        }
    }
    return false;
}
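As an aside on the markModified workaround above: it is needed because children is declared as an untyped array, so Mongoose cannot see changes made inside it. One possible alternative (an untested sketch, not part of the original answer) is to add the self-reference to the comment schema after it has been defined, so Mongoose knows the shape of children:

const mongoose = require('mongoose');

const commentSchema = new mongoose.Schema({
    cid: String,       // uuid
    senderid: Number,
    body: String,
    parent: String
});

// `children: [commentSchema]` can't appear inside the definition above because
// the schema doesn't exist yet; adding it afterwards works around that.
commentSchema.add({ children: [commentSchema] });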

Related

Mongoose: After finding document, iterate over a value in the document and run a new query on each

I have one schema which contains an array of references to another schema (among other fields):
const RecipeIngredient = new Schema({
    ingredientId: { // store the id ref so I can populate later
        type: Schema.Types.ObjectId,
        ref: 'ingredients',
        required: true
    },
    // there are a couple other fields but not relevant here
});
const Recipe = new Schema({
    ingredients: [RecipeIngredient]
});
I'm trying to write a route which will first find a recipe by _id, populate the ingredients array (already have this working), and finally iterate over each ingredient in that array.
router.get('/:recipeId/testing', async (req, res) => {
    const { recipeId } = req.params;
    let recipe = await Recipe
        .findById(recipeId)
        .populate({
            path: 'ingredients.ingredientId',
            model: 'Ingredient',
            select: '_id ......' // I'm selecting other fields too
        })
        .lean()
        .exec();
    if (recipe) {
        const { ingredients } = recipe;
        const newIngredients = [];
        await ingredients.forEach(async (ingr) => {
            // here I'd like to be able to run a new query
            // and append the result to an array outside of the forEach
            // I do need information about the ingr in order to run the new query
            newIngredients.push(resultOfNewQuery);
        });
        return res.json(newIngredients);
    }
    return res.status(404).json({ noRecipeFound: 'No recipe found.' });
});
I've tried approaching this in a few different ways, and the closest I've gotten was executing the new query within each iteration, but because the query is async, I return the response before I've actually collected the documents from the inner query.
I also attempted to use .cursor() in the initial query, but that won't work for me because I do need to access the ingredients field on the recipe once it is resolved before I can iterate and run the new queries.
Any ideas would be appreciated! I'm definitely open to restructuring this whole route if my approach is not ideal.
I was able to make this work by using a for loop:
const newIngredients = [];
for (let idx = 0; idx < ingredients.length; idx++) {
    const { fieldsImInterestedIn } = ingredients[idx];
    const matchingIngredients = await Ingredient
        .find(fieldsImInterestedIn)
        .lean()
        .exec()
        .catch(err => res.status(404).json({ noIngredientsFound: 'No ingredients found' }));
    newIngredients.push(ingredientsToChooseFrom[randomIndex]);
}
return res.json(newIngredients);
still a little perplexed as to why this was able to work while forEach wasn't, but I'll happily move on...
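The reason the for loop works and forEach doesn't: await only pauses the enclosing async function, and Array.prototype.forEach neither awaits nor returns the promises produced by an async callback, so awaiting the forEach itself resolves immediately and the response is sent before the inner queries finish. A rough equivalent that runs the inner queries concurrently instead of one by one (the find criteria are a placeholder, as in the answer above):

// Build one promise per ingredient, then wait for all of them together.
const newIngredients = await Promise.all(
    ingredients.map((ingr) => {
        const { fieldsImInterestedIn } = ingr; // placeholder criteria, as above
        return Ingredient.find(fieldsImInterestedIn).lean().exec();
    })
);
return res.json(newIngredients);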

Firebase Cloud Function batch write update document overwrote the entire document?

The following Cloud Function has a batch write operation that, in part, updates a single field in a document. This overwrote the entire document and now the document has a single field joinedCount: -1. Is this not the way to update individual fields in documents without overwriting them?
exports.deleteUserTEST = functions.https.onCall(async (data, _context) => {
    const uId = data.userId;
    const db = admin.firestore();
    try {
        const batch = db.batch();
        const settingsDoc = await db.collection("userSettings").doc(uId).get();
        const joinedIds = settingsDoc.get("private.joinedIds");
        Object.keys(joinedIds).forEach(function(jId, _index) {
            batch.update(
                db.collection("profiles").doc(jId),
                {
                    private: {
                        joinedCount: admin.firestore.FieldValue.increment(-1), // <-- the culprit
                    },
                },
            );
        });
        await batch.commit();
    } catch (error) {
        throw new functions.https.HttpsError("unknown", "Failed the delete the user's content.", error);
    }
    return Promise.resolve(uId);
});
Moving the solution found in the comments by @Dharmaraj into a community answer: this problem was caused by the structure of the document.
Since all the data in the document was inside the private map field, passing a new map to the update method replaces that whole map, which makes it look as though the entire document was overwritten rather than updated.
In this case, you need to reference the fields with dot notation. This updates the inner fields of the map without replacing the entire private map:
Object.keys(joinedIds).forEach(function(jId, _index) {
    batch.update(db.collection("profiles").doc(jId), {
        "private.joinedCount": admin.firestore.FieldValue.increment(-1)
    });
});
Another example from the documentation:
import { doc, setDoc, updateDoc } from "firebase/firestore";

// Create an initial document to update.
const frankDocRef = doc(db, "users", "frank");
await setDoc(frankDocRef, {
    name: "Frank",
    favorites: { food: "Pizza", color: "Blue", subject: "recess" },
    age: 12
});

// To update age and favorite color:
await updateDoc(frankDocRef, {
    "age": 13,
    "favorites.color": "Red"
});

Delete an item in array within an array

Am I doing this correctly in the backend API? How would you delete an object inside an array within a parent array on the backend? I first looked up the parent document by its id and then found the object in the tasks array using .tasks[index]. The question is: how would I delete this in Node? The tutorials I found use req.params.id to delete an item, but my case is more complicated.
exports.deleteTaskItem = async (req, res) => {
    const taskindex = req.params.id;
    const index = req.params.index;
    try {
        const taskfound = await Task.findById(taskindex);
        const taskfounditem = await taskfound.tasks[index];
        // code to type here
        res.status(204).json({
            status: "success",
            data: null
        });
    } catch (err) {
        res.status(404).json({
            status: "fail",
            message: err
        });
    }
};
I believe this piece of documentation would interest you:
https://docs.mongodb.com/manual/reference/operator/update-array/
And to specify, I believe you want to use the $pull operator.
Something like this:
const { id, index } = req.params;
await Task.findByIdAndUpdate(id, {
    $pull: {
        tasks: { _id: index }
    }
});
(Disclaimer: I did not test this out this time, sorry. But it should be close.)
edit: Now that I reread the question, I notice that you want to use the index. Personally I think it'd be easier to just use ids, since you get them automatically with sub-documents. But if you insist on using the index, maybe this answer can help:
https://stackoverflow.com/a/4970050/1497533
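For reference, the approach in that linked answer amounts to a two-step update, because $pull has no notion of a position: first $unset the element at the given index (which leaves a null in the array), then $pull the null out. A sketch using the same id and index route params as above:

// 1) null out the element at position `index`, 2) remove the null entry.
await Task.findByIdAndUpdate(id, { $unset: { [`tasks.${index}`]: 1 } });
await Task.findByIdAndUpdate(id, { $pull: { tasks: null } });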
edit again:
It seems this helped getting a working solution, so I'll copy it in from comments:
taskfound.tasks.splice(taskindex, 1);
taskfound.markModified('tasks');
await taskfound.save();

Sails.js populate nested associations

I've got a question regarding associations in Sails.js version 0.10-rc5. I've been building an app in which multiple models are associated with one another, and I've arrived at a point where I need to nest associations somehow.
There are three parts:
First, there's something like a blog post that's written by a user. In the post I want to show the associated user's information, like their username. Everything works fine here, until the next step: showing the comments associated with the post.
The comments are a separate model, called Comment, each of which also has an author (user) associated with it. I can easily show a list of the comments, but I can't figure out how to populate each Comment with the associated user's information.
In my controller I'm trying to do something like this:
Post
    .findOne(req.param('id'))
    .populate('user')
    .populate('comments') // I want to populate this comment with .populate('user') or something
    .exec(function(err, post) {
        // Handle errors & render view etc.
    });
In my Post's 'show' action I'm trying to retrieve the information like this (simplified):
<ul>
    <% _.each(post.comments, function(comment) { %>
        <li>
            <%= comment.user.name %>
            <%= comment.description %>
        </li>
    <% }); %>
</ul>
comment.user.name will be undefined, though. If I just access the 'user' property, like comment.user, it shows its ID, which tells me the user's information is not automatically populated onto the comment when I associate the comment with another model.
Anyone have any ideas on how to solve this properly? :)
Thanks in advance!
P.S.
For clarification, this is basically how I've set up the associations in the different models:
// User.js
posts: {
    collection: 'post'
},
hours: {
    collection: 'hour'
},
comments: {
    collection: 'comment'
}

// Post.js
user: {
    model: 'user'
},
comments: {
    collection: 'comment',
    via: 'post'
}

// Comment.js
user: {
    model: 'user'
},
post: {
    model: 'post'
}
Or you can use the built-in Bluebird promise feature to do it (working on Sails v0.10.5).
See the code below:
var _ = require('lodash');
...
Post
    .findOne(req.param('id'))
    .populate('user')
    .populate('comments')
    .then(function(post) {
        var commentUsers = User.find({
            id: _.pluck(post.comments, 'user')
            // _.pluck: retrieves the value of the 'user' property from all elements in the post.comments collection
        })
        .then(function(commentUsers) {
            return commentUsers;
        });
        return [post, commentUsers];
    })
    .spread(function(post, commentUsers) {
        commentUsers = _.indexBy(commentUsers, 'id');
        // _.indexBy: creates an object composed of keys generated from running each element of the
        // collection through the given callback; the value of each key is the last element that produced it
        post.comments = _.map(post.comments, function(comment) {
            comment.user = commentUsers[comment.user];
            return comment;
        });
        res.json(post);
    })
    .catch(function(err) {
        return res.serverError(err);
    });
Some explanation:
I'm using Lo-Dash to deal with the arrays. For more details, please refer to the official docs.
Notice the return value inside the first "then" function: the "commentUsers" element of the returned array "[post, commentUsers]" is itself a promise, which means it doesn't yet contain the value when it is returned. The "spread" function waits for the actual values to arrive and then continues with the rest of the work.
At the moment, there's no built-in way to populate nested associations. Your best bet is to use async to do a mapping:
async.auto({
        // First get the post
        post: function(cb) {
            Post
                .findOne(req.param('id'))
                .populate('user')
                .populate('comments')
                .exec(cb);
        },
        // Then all of the comment users, using an "in" query by
        // setting "id" criteria to an array of user IDs
        commentUsers: ['post', function(cb, results) {
            User.find({ id: _.pluck(results.post.comments, 'user') }).exec(cb);
        }],
        // Map the comment users to their comments
        map: ['commentUsers', function(cb, results) {
            // Index comment users by ID
            var commentUsers = _.indexBy(results.commentUsers, 'id');
            // Get a plain object version of post & comments
            var post = results.post.toObject();
            // Map users onto comments
            post.comments = post.comments.map(function(comment) {
                comment.user = commentUsers[comment.user];
                return comment;
            });
            return cb(null, post);
        }]
    },
    // After all the async magic is finished, return the mapped result
    // (or an error if any occurred during the async block)
    function finish(err, results) {
        if (err) { return res.serverError(err); }
        return res.json(results.map);
    }
);
It's not as pretty as nested population (which is in the works, but probably not for v0.10), but on the bright side it's actually fairly efficient.
I created an NPM module for this called nested-pop. You can find it at the link below.
https://www.npmjs.com/package/nested-pop
Use it in the following way.
var nestedPop = require('nested-pop');

User.find()
    .populate('dogs')
    .then(function(users) {
        return nestedPop(users, {
            dogs: [
                'breed'
            ]
        }).then(function(users) {
            return users;
        }).catch(function(err) {
            throw err;
        });
    }).catch(function(err) {
        throw err;
    });
Worth saying there's a pull request to add nested population: https://github.com/balderdashy/waterline/pull/1052
The pull request isn't merged at the moment, but you can use it by installing the branch directly with
npm i Atlantis-Software/waterline#deepPopulate
With it you can do something like .populate('user.comments ...').
Sails v0.11 doesn't support _.pluck and _.indexBy; use sails.util.pluck and sails.util.indexBy instead.
async.auto({
        // First get the post
        post: function(cb) {
            Post
                .findOne(req.param('id'))
                .populate('user')
                .populate('comments')
                .exec(cb);
        },
        // Then all of the comment users, using an "in" query by
        // setting "id" criteria to an array of user IDs
        commentUsers: ['post', function(cb, results) {
            User.find({ id: sails.util.pluck(results.post.comments, 'user') }).exec(cb);
        }],
        // Map the comment users to their comments
        map: ['commentUsers', function(cb, results) {
            // Index comment users by ID
            var commentUsers = sails.util.indexBy(results.commentUsers, 'id');
            // Get a plain object version of post & comments
            var post = results.post.toObject();
            // Map users onto comments
            post.comments = post.comments.map(function(comment) {
                comment.user = commentUsers[comment.user];
                return comment;
            });
            return cb(null, post);
        }]
    },
    // After all the async magic is finished, return the mapped result
    // (or an error if any occurred during the async block)
    function finish(err, results) {
        if (err) { return res.serverError(err); }
        return res.json(results.map);
    }
);
You could use the async library, which is very clean and simple to understand. For each comment related to a post you can populate as many fields as you want with dedicated tasks, execute them in parallel, and retrieve the results when all tasks are done. Finally, you only have to return the final result.
Post
    .findOne(req.param('id'))
    .populate('user')
    .populate('comments') // I want to populate this comment with .populate('user') or something
    .exec(function (err, post) {
        // populate each comment in parallel
        async.each(post.comments, function (comment, callback) {
            // you can populate many elements or only one...
            var populateTasks = {
                user: function (cb) {
                    User.findOne({ id: comment.user })
                        .exec(function (err, result) {
                            cb(err, result);
                        });
                }
            };
            async.parallel(populateTasks, function (err, resultSet) {
                if (err) { return next(err); }
                comment.user = resultSet.user;
                // finish
                callback();
            });
        }, function (err) { // final callback
            if (err) { return next(err); }
            return res.json(post);
        });
    });
As of sailsjs 1.0 the "deep populate" pull request is still open, but the following async function solution looks elegant enough IMO:
const post = await Post
    .findOne({ id: req.param('id') })
    .populate('user')
    .populate('comments');

if (post && post.comments.length > 0) {
    const ids = post.comments.map(comment => comment.id);
    post.comments = await Comment
        .find({ id: ids })
        .populate('user');
}
Granted this is an old question, but a much simpler solution would be to loop over the comments, replacing each comment's 'user' property (which is an id) with the user's full details, using async/await.
async function getPost(postId) {
    let post = await Post.findOne(postId).populate('user').populate('comments');
    for (let comment of post.comments) {
        comment.user = await User.findOne({ id: comment.user });
    }
    return post;
}
Hope this helps!
In case anyone is looking to do the same but for multiple posts, here's one way of doing it:
find all user IDs in the posts
query all users from the DB in one go
update the posts with those users
Given that the same user can write multiple comments, we make sure we reuse those objects. We also make only one additional query (whereas doing it for each post separately would mean multiple queries).
await Post.find()
    .populate('comments')
    .then(async (posts) => {
        // Collect all comment user IDs
        const userIDs = posts.reduce((acc, curr) => {
            for (const comment of curr.comments) {
                acc.add(comment.user);
            }
            return acc;
        }, new Set());

        // Get users
        const users = await User.find({ id: Array.from(userIDs) });
        const usersMap = users.reduce((acc, curr) => {
            acc[curr.id] = curr;
            return acc;
        }, {});

        // Assign users to comments
        for (const post of posts) {
            for (const comment of post.comments) {
                if (comment.user) {
                    const userID = comment.user;
                    comment.user = usersMap[userID];
                }
            }
        }
        return posts;
    });

Incorrect Subdocument Being Updated?

I've got a schema with an array of subdocuments, and I need to update just one of them. I do a findOne with the ID of the subdocument, then cut the response down to just that subdocument, at position 0 of the returned array.
No matter what I do, only the first subdocument in the parent document gets updated, even when it should be the 2nd, 3rd, etc. As far as I can tell it should be working, but I'm not a MongoDB or Mongoose expert, so I'm obviously wrong somewhere.
var template = req.params.template;
var page = req.params.page;
console.log('Template ID: ' + template);

db.Template.findOne({ 'pages._id': page }, { 'pages.$': 1 }, function (err, tmpl) {
    console.log('Matched Template ID: ' + tmpl._id);
    var pagePath = tmpl.pages[0].body;
    if (req.body.file) {
        tmpl.pages[0].background = req.body.filename;
        tmpl.save(function (err, updTmpl) {
            console.log(updTmpl);
            if (err) console.log(err);
        });
        // db.Template.findOne(tmpl._id, function (err, tpl) {
        //     console.log('Additional Matched ID: ' + tmpl._id);
        //     console.log(tpl);
        //     tpl.pages[tmpl.pages[0].number].background = req.body.filename;
        //     tpl.save(function (err, updTmpl) {
        //         if (err) console.log(err);
        //     });
        // });
    }
});
In the console, all of the IDs match up properly, and even when I return updTmpl it says it has updated the proper record, even though it has actually updated the first subdocument and not the one it claims.
The schema just in case:
var envelopeSchema = new Schema({
    background: String,
    body: String
});

var pageSchema = new Schema({
    background: String,
    number: Number,
    body: String
});

var templateSchema = new Schema({
    name: { type: String, required: true, unique: true },
    envelope: [envelopeSchema],
    pagecount: Number,
    pages: [pageSchema]
});

templateSchema.plugin(timestamps);
module.exports = mongoose.model("Template", templateSchema);
First, if you need req.body.file to be set in order for the update to execute, I would recommend checking that before you run the query.
Also, is that a typo and req.body.file is supposed to be req.body.filename? I will assume it is for the example.
Additionally (I have not done serious testing on this), I believe your call will be more efficient if you specify your Template._id:
var template_id = req.params.template,
    page_id = req.params.page;

if (req.body.filename) {
    db.Template.update(
        { _id: template_id, 'pages._id': page_id },
        { $set: { 'pages.$.background': req.body.filename } },
        function (err, res) {
            if (err) {
                // err
            } else {
                // success
            }
        });
} else {
    // return error / missing data
}
Mongoose doesn't understand documents returned with the positional projection operator. It always updates an array of subdocuments positionally, not by id. You may be interested in looking at the actual queries that mongoose is building - use mongoose.set('debug', true).
You'll have to either get the entire array, or build your own MongoDB query and go around mongoose. I would suggest the former; if pulling the entire array is going to cause performance issues, you're probably better off making each of the subdocuments a top-level document - documents that grow without bounds become problematic (at the very least because Mongo has a hard document size limit).
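To illustrate the "get the entire array" option, Mongoose document arrays have an .id() helper that finds a subdocument by its _id, so the handler can fetch the whole template, pick the right page, and save. A sketch reusing the question's variable names:

// Fetch the full document (no positional projection), then locate the
// subdocument by _id and modify it in place before saving.
db.Template.findOne({ 'pages._id': page }, function (err, tmpl) {
    if (err || !tmpl) return console.log(err);
    var pg = tmpl.pages.id(page); // DocumentArray#id looks a subdoc up by _id
    pg.background = req.body.filename;
    tmpl.save(function (err, updTmpl) {
        if (err) console.log(err);
    });
});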
I'm not familiar with mongoose but the Mongo update query might be:
db.Template.update( { "pages._id": page }, { $set: { "pages.$.body" : body } } )
