MongoDB: updating multiple docs in a loop? - node.js

I made a great effort to find a solution to this common situation, without success. Hoping somebody can help, please.
Background:
A board of messages.
Each message has its msg_id.
Each user has an id.
I need to track which user viewed which message.
For this I have a collection called viewedMessages, like this:
{
  { _id: <message_id_X>,
    viewedBy: [<user_id_?>, <user_id_?>, ...]
  },
  { _id: <message_id_Y>,
    viewedBy: [<user_id_?>, <user_id_?>, <user_id_?>, ...]
  },
  ...
}
The user is calling my node.js server once in a while, reporting which messages were viewed, like this:
{ user_id: <user_id_?>, viewed: [<message_id_?>,...] }
An entry for a message is only created when some user reports it as viewed. This is done using this command:
db.viewedMessages.update({"_id":<msg_id>},{$addToSet:{viewedBy:<user_id>}},{ upsert : true });
Now for the question.
The above update command is asynchronous. How do I iterate over an array of [msg_id, msg_id, msg_id, ...], issuing the above update command for each msg_id, and finally get a single callback telling me that everything completed OK or failed for some reason?
Would highly appreciate any help here!
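One common pattern is to collect the driver's promises and resolve once all of them have settled. A minimal sketch, assuming the Node.js native driver and the report shape shown above (the function and variable names are illustrative, not from the original post):

// Sketch: issue one upsert per reported message id and get a single
// success/failure signal for the whole batch via Promise.all.
function markViewed(db, userId, msgIds) {
  const updates = msgIds.map(msgId =>
    db.collection('viewedMessages').updateOne(
      { _id: msgId },
      { $addToSet: { viewedBy: userId } },
      { upsert: true }
    )
  );
  // Resolves when every update has succeeded; rejects on the first failure.
  return Promise.all(updates);
}

markViewed(db, report.user_id, report.viewed)
  .then(() => console.log('all messages marked as viewed'))
  .catch(err => console.error('at least one update failed', err));

A single bulkWrite with one updateOne operation per message id would achieve the same result in one round trip to the server.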

Related

How to update array in object in an array with index in mongoose?

I am developing an application using MongoDB and Express.
The application is similar to any social media app: a user can create posts, and other users can like a post and also comment on it.
I want to add a feature where a user can like a comment. I've tried a lot and searched too, but with no good results.
The JSON format of a post object:
{
  ...,
  comments: [
    {
      title: 'I am a comment in the post !',
      likers: [
        ObjectId('foo'),
        ObjectId('bar'),
      ],
    },
  ],
}
I want to be able to either push to likers or pull from it by index.
I have tried to update using the following code, but it fails since it can't read the commentIndex property and gives a syntax error.
Post.findByIdAndUpdate(postId , { '$push': { 'comments.${commentIndex}.likedUsersIds': mongoose.Types.ObjectId(liker) } })
I also tried with the $ operator, and that did not work either.
Please help me.
Thanks,
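One likely cause of the error is that 'comments.${commentIndex}.likedUsersIds' is a plain string, so ${commentIndex} is never interpolated (backticks and a computed property key are needed), and the schema above names the array likers rather than likedUsersIds. A minimal sketch of the corrected calls, assuming commentIndex and liker are available on the server:

// Sketch: add a like to the comment addressed by its index.
Post.findByIdAndUpdate(postId, {
  $push: { [`comments.${commentIndex}.likers`]: mongoose.Types.ObjectId(liker) }
}).exec();

// Removing the like is the mirror operation:
Post.findByIdAndUpdate(postId, {
  $pull: { [`comments.${commentIndex}.likers`]: mongoose.Types.ObjectId(liker) }
}).exec();

If the comment's _id is available instead of its index, the same updates can be written with arrayFilters rather than a hard-coded index.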

mongoose $pull get nModified: 0

After spending the whole day with this problem... I need help.
Node v12.14.1
mongoose v5.8.9
mongoDB v4.2.1
So everything is up to date.
I tried many ways, but this is how it should work:
model.updateOne({_id: model_id}, {$pull: {videos: {_id: video_id}}}, {multi: true})
but then I get
{ n: 1, nModified: 0, ok: 1 }
So it gets found, with no errors, but it doesn't remove/modify the object.
I can't figure out what's wrong.
So from your query, I am assuming this is what your document looks like:
{
  _id: "<some uuid>",
  videos: [
    {
      _id: "<some uuid>"
    },
    { ... n }
  ]
}
If that is the case, your update query looks good, so you may need to take a closer look at the video_id input.
Could you try the equivalent find?:
model.find({_id: model_id, 'videos._id': video_id})
Sometimes...
Sometimes the mistake took place way before we reach this line of code...
It was hard to find the cause; in the terminal output and also in Compass everything looked fine.
But at the moment where I save the video and also update the user, so that the user collection holds just the video id, title and slug, I did this:
$push: {
  videos: [{
    _id: data._id,
    title: data.title,
    slug: data.slug
  }]
}
Yes, I pushed an array containing an object into an array. But like I said, the output and also .find() made it look like everything was okay.
After I removed the [] brackets and tested with a new data set, $pull works fine; the corrected $push is sketched below.
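For reference, the corrected update without the extra array wrapper looks like this (a sketch; the field names mirror the ones above):

// Push the video object itself, not an array wrapping it.
$push: {
  videos: {
    _id: data._id,
    title: data.title,
    slug: data.slug
  }
}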
Guess this was my hardest bug... hardest bug till now!
Thanks for your help!

Creating and pushing to an array with MongoDB

I'm trying to make a messaging system that writes each message to a mongo entry. I'd like the message entry to reflect the user that sends the message, and the actual message content. This is the message schema:
const MessageSchema = new Schema({
  id: {
    type: String,
    required: true
  },
  messages: {
    type: Array,
    required: true
  },
  date: {
    type: Date,
    default: Date.now
  }
});
And this is where I either create a new entry, or append to an existing one:
Message.findOne({ id: chatId }).then(message => {
  if (message) {
    Message.update.push({ messages: { 'name': user.name, 'message': user.message } })
  } else {
    const newMessage = new Message(
      { id: chatId },
      { push: { messages: { 'name': user.name, 'message': user.message } } }
    )
    newMessage
      .save()
      .catch(err => console.log(err))
  }
})
I'd like the end result to look something like this:
id: '12345'
messages: [
  { name: 'David', message: 'message from David' },
  { name: 'Jason', message: 'message from Jason' },
  etc.
]
Is something like this possible, and if so, any suggestions on how to get this to work?
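As far as the mechanics go, the find-then-create-or-append logic can be collapsed into a single upsert. A minimal sketch, assuming the Message model above (whether storing messages this way is a good idea at all is exactly what the answer below discusses):

// Sketch: create the chat document if it does not exist yet,
// otherwise just append to its messages array, in one operation.
Message.updateOne(
  { id: chatId },
  {
    $push: { messages: { name: user.name, message: user.message } },
    $setOnInsert: { date: Date.now() }
  },
  { upsert: true }
).catch(err => console.log(err));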
This question contains lots of topics (in my mind at least). I really want to try to break this question down to its core components:
Design
As David noted (first comment), there is a design problem here - an ever-growing array as a subdocument is not ideal (please refer to this blog post for more details).
On the other hand - when we imagine what a separate collection of messages would look like, it would be something like this:
_id: ObjectId('...') // how do I identify the message
channel_id: 'cn247f9' // the message belongs to a private chat or a group
user_id: 1234 // which user posted this message
message: 'hello or something' // the message itself
Which is also not that great, because we are repeating the channel and user ids as a function of time. This is why the bucket pattern is used.
So... what is the "best" approach here?
Concept
The most relevant question right now is: "which features and loads is this chat supposed to support?". I mean, many chats only support displaying messages, without any further complexity (like searching inside a message). Keeping that in mind, there is a chance that we are storing information in our database that is practically irrelevant.
This is (almost) like storing binary data (such as an image) inside our db. We can do this, but with no actual good reason. So, if we are not going to support full-text search inside our messages, there is no point in storing the messages inside our db at all.
But... what if we do want to support full-text search? Well, who said that we need to give this task to our database? We can easily download messages (using pagination) and run the search on the client side itself (while the keyword is not found, download the previous page and search it), taking the load off our database!
So it seems that messages are not ideal for storage in the database in terms of size, functionality and load (you may consider this conclusion a shocking one).
ReDesign
Use a hybrid approach where messages are stored in a separate collection with pagination (the bucket pattern supports this, as described here)
Store messages outside your database (since you are using Node.js you may consider using chunk store), keeping only a reference to them in the database itself
Set your page size to something relevant to your application's needs, and add calculated fields (for instance: the number of messages currently in the page) to ease database load as much as possible
Schema
channels:
_id: ObjectId
pageIndex: Int32
isLastPage: Boolean
// The number of items here should not exceed page size
// when it does - a new document will be created with incremental pageIndex value
// suggestion: update previous page isLastPage field to ease querying of next page
messages:
[
{ userId: ObjectID, link: string, timestamp: Date }
]
messagesCount: Int32
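A sketch of how appending a message could look against this schema (the PAGE_SIZE constant, the channelId field tying pages to their channel, and the channels collection handle are assumptions, not part of the schema above; concurrency handling is deliberately simplified):

// Sketch: append a message under the bucket pattern.
const PAGE_SIZE = 50;

async function appendMessage(channels, channelId, userId, link) {
  const lastPage = await channels.findOne({ channelId: channelId, isLastPage: true });

  if (lastPage && lastPage.messagesCount < PAGE_SIZE) {
    // The current page still has room: push the message and bump the counter.
    await channels.updateOne(
      { _id: lastPage._id },
      {
        $push: { messages: { userId: userId, link: link, timestamp: new Date() } },
        $inc: { messagesCount: 1 }
      }
    );
  } else {
    // The page is full (or none exists yet): close it and open a new one.
    if (lastPage) {
      await channels.updateOne({ _id: lastPage._id }, { $set: { isLastPage: false } });
    }
    await channels.insertOne({
      channelId: channelId,
      pageIndex: lastPage ? lastPage.pageIndex + 1 : 0,
      isLastPage: true,
      messagesCount: 1,
      messages: [{ userId: userId, link: link, timestamp: new Date() }]
    });
  }
}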
Final Conclusion
I know - it seems like complete overkill for such a "simple" question, but Dawid Esterhuizen convinced me that designing your database to support your future loads from the very beginning is crucial, and always better than oversimplifying the db design.
The bottom line is that the question "which features and loads is this chat supposed to support?" still needs to be answered if you intend to design your db efficiently (i.e. to find the Goldilocks zone where your design suits your application's needs in the most optimal way).

"Race like" condition with Mongoose

I have a process that triggers a number of requests that in turn trigger off a number of webhooks. I know the process is complete when I've received all of my webhooks. My model looks something like this.
{
  name: 'name',
  status: 'PENDING',
  children: [{
    name: 'child1',
    status: 'PENDING'
  }, {
    name: 'child2',
    status: 'PENDING'
  }]
}
The idea is as the webhooks come in, I update the subdoc to COMPLETE. At the end of each update I check if the others are complete and if they are, I set status='COMPLETE' on the parent. It looks like one invocation of the service is marking it as COMPLETE after another invocation has determined it was still PENDING but before that second invocation has saved. When the second one saves, it overwrites COMPLETE with PENDING on the parent.
Here is the code from my schema:
methods: {
  doUpdate: function(data) {
    var child = this.children.id(data.childId);
    child.status = 'COMPLETE';
    if (_.every(this.children, { status: 'COMPLETE' }))
      this.status = 'COMPLETE';
    return this.save();
  }
}
I think you can solve your issue if you just modify and save your child objects instead of saving/overwriting the whole document each time.
To do that you can use the positional $ operator in your update statement.
Essentially it would look something like this:
db.yourChildrenCollection.update(
  { _id: this._id, 'children.name': childName },
  { $set: { 'children.$.status': 'COMPLETE' } }
)
You have to modify the variable names as I do not know your collection name etc. but I think you get the point.
The solution is to first find and update the status in one operation, using a variant of mongo's findAndModify method, and then check if the other children have completed. I was trying to do two updates in one operation. By breaking it up into two steps, I know that when I check the statuses of the other children, all other invocations will have the most recent state of the document and not get a false reading while another invocation is waiting for save() to complete.
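A sketch of that two-step approach, mirroring the doUpdate method above (not necessarily the exact code used; the field names come from the model shown in the question):

methods: {
  doUpdate: function(data) {
    // Step 1: atomically mark the child COMPLETE and get the updated document back.
    return this.constructor.findOneAndUpdate(
      { _id: this._id, 'children._id': data.childId },
      { $set: { 'children.$.status': 'COMPLETE' } },
      { new: true }
    ).then(doc => {
      // Step 2: decide on the parent using the state returned by step 1,
      // so a stale in-memory copy is never saved over newer data.
      if (doc && doc.children.every(c => c.status === 'COMPLETE')) {
        return this.constructor.updateOne({ _id: doc._id }, { $set: { status: 'COMPLETE' } });
      }
      return doc;
    });
  }
}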

Saving subdocuments with mongoose

I have this:
exports.deleteSlide = function(data, callback) {
  customers.findOne(data.query, {'files.$': 1}, function(err, data2) {
    if (data2) {
      console.log(data2.files[0]);
      data2.files[0].slides.splice((data.slide - 1), 1);
      data2.files[0].markModified('slides');
      data2.save(function(err, product, numberAffected) {
        if (numberAffected == 1) {
          console.log("manifest saved");
          var back = {success: true};
          console.log(product.files[0]);
          callback(back);
          return;
        }
      });
    }
  });
}
I get the "manifest saved" message and a callback with success being true.
When I compare the console.log from when I first find the data with the console.log after I save the data, it looks like what I expect. I don't get any errors.
However, when I look at the database after running this code, it looks like nothing was ever changed. The element that I should have deleted still appears.
What's wrong here?
EDIT:
For my query, I do {'name':'some string','files.name':'some string'}, and if the object is found, I get an array of files with one object in it.
I guess this is a subdoc.
I've looked around and it says the rules for saving subdocs are different from saving the entire document; or rather, changes to subdocs are only applied when the root object is saved.
I've been working around this by grabbing the entire root object, looping to find the actual subdoc that I want, and after manipulating that, saving the whole object.
Can I avoid doing this?
I'd probably just switch to using native drivers for this query as it is much simpler. (For that matter, I recently dropped mongoose on my primary project and am happy with the speed improvements.)
You can find documentation on getting access to the native collection elsewhere.
Following advice here:
https://stackoverflow.com/a/4588909/68567
customersNative.update(data.query, { $unset: { "slides.1": 1 } }, function(err) {
  if (err) { return callback(err); }
  customersNative.findAndModify(data.query, [],
    { $pull: { 'slides': null } }, { safe: true, 'new': true }, function(err, updated) {
      // 'updated' has the new object
    });
});
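If you would rather stay on the mongoose model than drop to the native driver, the same unset-then-pull trick can be expressed with plain updates (a sketch, assuming a mongoose version with updateOne; the field path follows the files/slides structure from the question, and the positional $ relies on data.query matching on files.name as described in the edit):

// Sketch: null out the slide by index, then pull the nulls, without
// loading and re-saving the whole document.
exports.deleteSlide = function(data, callback) {
  var position = data.slide - 1; // index of the slide to remove
  var unset = {};
  unset['files.$.slides.' + position] = 1;

  customers.updateOne(data.query, { $unset: unset })
    .then(function() {
      return customers.updateOne(data.query, { $pull: { 'files.$.slides': null } });
    })
    .then(function() { callback({ success: true }); })
    .catch(function(err) { callback({ success: false, error: err }); });
};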
