Mongodb merge update nested fields in a subdocument - node.js

I am looking for a way, in MongoDB or its Node.js driver, to merge/update a subdocument like this:
The sample document, for example in collection C:
{
  subdocument: {
    a: 1,
    b: 2
  }
}
The update query that I do NOT want to use:
db.C.updateOne(
  {},
  {
    $set: {
      "subdocument.b": 3
    }
  }
)
The update query that I want to use:
db.C.updateOne(
  {},
  {
    $set: {
      subdocument: {
        b: 3
      }
    }
  }
)
The resulting document I get when running such a query:
{
  subdocument: {
    b: 3
  }
}
The resulting merged document I would like to get instead:
{
  subdocument: {
    a: 1,
    b: 3
  }
}
For the record, the reason I want this is that I'm trying to use interfaces to avoid writing schema keys inside strings as much as possible, so that as many of them as possible are checked by the TypeScript compiler. I know the usual way to do this is to update "subdocument.b", but that is exactly what I am trying to avoid, because it means using a string.
Obviously, the subdocument is not merged but fully replaced by the standard update without options. I would like to know if there is a way to do this natively, using the MongoDB query language, the aggregation framework, the MongoDB Node.js driver, or maybe something else. From what I could learn by myself it seems unsupported, but maybe it is and someone can tell me how?
Thanks

It's very easy to do that if you use pipeline updates (MongoDB >= 4.2).
With pipeline updates all aggregation operators can be used, but only a limited set of stages (see the update database command); drivers use that command internally.
In a pipeline update you can use only aggregation operators (query operators only for the $match part, and then only through $expr). The old update operators don't work inside a pipeline.
Test code here
db.collection.update({},
[
  {
    "$addFields": {
      "subdocument": {
        "$mergeObjects": [
          "$subdocument",
          {
            "b": 3
          }
        ]
      }
    }
  }
])
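Since the whole point of the question is to keep field names checkable by the TypeScript compiler, the same pipeline update can also be issued from the Node.js driver with the partial subdocument passed as a plain object rather than a dotted string path. This is only a minimal sketch, assuming the official mongodb driver (v4+), a database named test and collection C:

const { MongoClient } = require('mongodb');

// Merge a partial object into "subdocument" without writing dotted string
// paths like "subdocument.b". Requires MongoDB >= 4.2 (pipeline updates).
async function mergeSubdocument(uri, partial) {
  const client = new MongoClient(uri);
  try {
    await client.connect();
    const coll = client.db('test').collection('C');
    // $mergeObjects keeps the existing fields of "subdocument" and only
    // overwrites the keys present in "partial".
    await coll.updateOne({}, [
      { $set: { subdocument: { $mergeObjects: ['$subdocument', partial] } } }
    ]);
  } finally {
    await client.close();
  }
}

// Example: mergeSubdocument('mongodb://localhost:27017', { b: 3 });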

Related

NodeJS MongoDb updateMany() with a condition?

I want to update a MongoDb collection with an Array of JSON objects.
However, I want the update to ignore any objects that already exist in the DB.
Sounds easy, but the key that allows me to know if the object exists is not the '_id', but a different 'id' (or other key).
Is that possible ?
Currently I am using it like this:
dbHandle.collection('contents').updateMany(contents);
where 'contents' is the Array of JSON objects.
Thanks in advance
The following operation updates all documents where violations are greater than 4 and uses $set to flag them for review:
try {
  db.restaurant.updateMany(
    { violations: { $gt: 4 } },    // Your condition
    { $set: { "Review": true } }   // Your JSON contents
  );
} catch (e) {
  print(e);
}
Change the condition accordingly.
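The operation above applies the same $set to every matching document, though. If the goal is specifically to insert only the objects whose custom id is not in the collection yet, and leave existing ones untouched, one possible approach is a bulkWrite of upserts that only write on insert. This is just a sketch, assuming every object in contents carries an id field that identifies it:

// Existing documents match the filter and $setOnInsert leaves them unchanged;
// documents whose id is not found are inserted via the upsert.
const ops = contents.map(doc => ({
  updateOne: {
    filter: { id: doc.id },        // the custom key, not _id
    update: { $setOnInsert: doc },
    upsert: true
  }
}));

await dbHandle.collection('contents').bulkWrite(ops);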

Mongoose, Nodejs - replace many documents in one I/O?

I have an array of objects and I want to store them in a collection using only one I/O operation if it's possible. If any document already exists in the collection I want to replace it, or insert it otherwise.
These are the solutions that I found, but they don't work exactly as I want:
insertMany(): this doesn't replace a document that already exists, but throws an exception instead (this is what I found in the MongoDB documentation, but I don't know if Mongoose behaves the same).
update() or updateMany() with upsert = true: this doesn't help me either, because here I would have to apply the same update to all the stored documents.
There is no replaceMany() in MongoDB or Mongoose.
Does anyone know an optimal way to do a replaceMany using Mongoose and Node.js?
There is bulkWrite (https://docs.mongodb.com/manual/reference/method/db.collection.bulkWrite/), which makes it possible to execute multiple operations at once. In your case, you can use it to perform multiple replaceOne operations with upsert. The code below shows how you can do it with Mongoose:
// Assuming *data* is an array of documents that you want to insert (or replace)
const bulkData = data.map(item => (
  {
    replaceOne: {
      upsert: true,
      filter: {
        // Filter specification. You must provide a field that
        // identifies *item*
      },
      replacement: item
    }
  }
));
db.bulkWrite(bulkData);
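For example, if each item carried a unique id field (a hypothetical key here; substitute whatever field identifies your documents), the filter could look like this with a Mongoose model:

const bulkData = data.map(item => ({
  replaceOne: {
    upsert: true,
    filter: { id: item.id },   // hypothetical unique key
    replacement: item
  }
}));

// MyModel stands in for whatever Mongoose model owns the collection
await MyModel.bulkWrite(bulkData);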
You need to query like this:
db.getCollection('hotspot').update({
  /* Your condition */
}, {
  $set: {
    "New Key": "Value"
  }
}, {
  multi: true,
  upsert: true
});
It fulfils your requirements..!!!

Mongodb Aggregation Append method for optional $match pipeline operator

I'm using nodejs + mongoosejs with MongoDB 2.6. I have a static function on a model that sums the value of all items in the collection. Each item is assigned to a project using a projectNo property. I need the static function to be able to give me the total for the collection, and, if a projectNo argument is passed, to add a $match pipeline operator to the aggregation. This will save me from having to write 2 static functions that essentially do the same thing.
To spice things up a bit I use bluebird's promisifyAll method to make the aggregation framework return a promise.
my static function that sums the entire collection:
db.collection.aggregateAsync([
  { $group: { _id: null, amount: { $sum: "$amount" } } }
])
my static function that sums only the records with a matching projectNo:
db.collection.aggregateAsync([
  { $match: { projectNo: projectNo } },
  { $group: { _id: null, amount: { $sum: "$amount" } } }
])
I really want to use the Aggregate.append method to append the $match pipeline only if a req.params.projectNo is included.
When I try to add it to the async aggregation I get an error, which makes sense because it's just a promise. If I try this:
db.collection.aggregateAsync([
  { $group: { _id: null, amount: { $sum: "$amount" } } }
]).then(function(aggregate){
  aggregate.append({ $match: { projectNo: projectNo } });
})
I get an error (append is undefined). How should I go about doing this? Or should I just live with the fact that I have two functions that do the same thing?
I read the source code to see exactly how to use the aggregate.append method. If you're building the aggregation using the chained methods, you can use append to add any pipeline operations.
So what I did instead is put the aggregation pipeline stages into an array. If there is a projectNo, I add the $match stage to the array using unshift(). I used unshift because you usually want the $match stage first, to limit the number of records before doing the rest of the pipeline operations.
var pipeline = [{ $group: { _id: null, amount: { $sum: "$amount" } } }];
if (req.params.projectNo) {
  pipeline.unshift({ $match: { projectNo: req.params.projectNo } });
}
db.collection.aggregateAsync(pipeline);
I usually make things way more complicated than I need to...
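For completeness, the whole thing can be wrapped in a single static on the schema. This is only a sketch, assuming bluebird's promisifyAll has been applied to the compiled model as described in the question, and using a hypothetical sumAmounts name:

schema.statics.sumAmounts = function (projectNo) {
  var pipeline = [
    { $group: { _id: null, amount: { $sum: '$amount' } } }
  ];
  if (projectNo) {
    // $match goes first so only the matching project's records are summed
    pipeline.unshift({ $match: { projectNo: projectNo } });
  }
  // aggregateAsync exists because the model was promisified with bluebird
  return this.aggregateAsync(pipeline);
};

// Usage: Model.sumAmounts(req.params.projectNo) returns a promise for the result.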

Mongoose update push to start of array

I want to push to the start of an array while updating (a single document). I am using findOneAndUpdate, but it seems like Mongoose doesn't support the $position operator. I can achieve this with the native driver via Model.collection.update:
{
  '$push': {
    post_IDs: {
      '$each': [articles],
      '$position': 0
    }
  }
}
but the native driver doesn't return the updated document, which is why I can't use it here. Is there any way to push to the start of the array while receiving the updated document in the callback, apart from using find() followed by save()?
Mongoose does not support the new operators directly, but the underlying driver dependency should be recent enough if your mongoose is a recent release.
You get the underlying "node native" driver functions by using the .collection accessor on the model:
Model.collection.findAndModify(
  { field: "value" },
  [],
  {
    "$push": {
      "post_IDs": {
        "$each": [articles],
        "$position": 0
      }
    }
  },
  { new: true },
  function(err, doc) {
  }
);
The method there is the .findAndModify() from the native driver. The syntax is a little different: first the "query", then a "sort" array, then the "update" document. The options are also set to return the "new" document, which is what the mongoose methods default to, but this one does not.
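Depending on the driver version, collection.findOneAndUpdate can be used the same way and also hands back the modified document. A sketch, assuming a 3.x/4.x driver that still accepts callbacks (the option name is returnOriginal: false on 3.x and returnDocument: 'after' on 4.x+):

Model.collection.findOneAndUpdate(
  { field: "value" },
  {
    "$push": {
      "post_IDs": {
        "$each": [articles],
        "$position": 0
      }
    }
  },
  { returnDocument: "after" },   // use { returnOriginal: false } on older drivers
  function (err, result) {
    // result.value holds the updated document
  }
);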

Update offspring in nested tree mongoDB, node.js

Is there any way to update nested documents by id or some other field?
I use "Full Tree in Single Document" and don't know beforehand how deep nesting can go. Need to Update, for example, answer with {id:'104'}. I can do that via 'dot notation', but since I don't know the level (depth) of nesting I can't predict how long my 'comment.answers.answers....answers.' can go.
Is there any way to directly find and update id:'104', or I still need to pass some kind of depth mark?
{
  title: 'some title',
  comment: {
    id: '101',
    author: 'Joe',
    text: 'some comment',
    answers: [
      {
        id: '102',
        author: 'Joe',
        text: 'first answer to comment',
        answers: [
          {
            id: '103',
            author: 'Done',
            text: 'first answer to first answer to comment',
            answers: []
          },
          {
            id: '104',
            author: 'Bob',
            text: 'Second answer to first answer to comment',
            answers: []
          }
        ]
      },
      {
      },
      {
      }
    ]
  }
}
I use the Node.js MongoDB driver.
In short there's no really good way to do a query of this sort. There are a few options:
You can create a query with a long $or statement, specifying each of the possible nested locations for the document:
{ $or: [
  { "comment.id": '103' },
  { "comment.answers.id": '103' },
  { "comment.answers.answers.id": '103' },
  { "comment.answers.answers.answers.id": '103' }
] }
For more information about the $or operator see the docs.
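If you can settle on a maximum depth, the $or clauses don't have to be written by hand; they can be generated. A sketch (buildIdQuery and maxDepth are made up for illustration):

// Build { $or: [ { 'comment.id': id }, { 'comment.answers.id': id }, ... ] }
// up to a fixed maximum nesting depth.
function buildIdQuery(id, maxDepth) {
  var or = [];
  var path = 'comment';
  for (var i = 0; i <= maxDepth; i++) {
    var clause = {};
    clause[path + '.id'] = id;
    or.push(clause);
    path += '.answers';
  }
  return { $or: or };
}

// buildIdQuery('104', 3) matches the document wherever id '104' sits,
// as long as it is nested no more than 3 answer levels deep.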
In truth, the better and more sustainable solution would be to use a different schema, where all comments and answers are stored in a flat array and the relationships between them are recorded in a parent field on each comment. For example:
{ comment: [
  { id: 1 },
  { id: 2, parent: 1 }
] }
For additional options and a more in-depth discussion of possible ways of modeling comment hierarchies, see the Storing Comments use case in the MongoDB documentation.
There are also a number of strategies described in the Trees in MongoDB documentation that you might want to consider.
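With a flat model like the one above, updating any answer becomes a single, depth-independent operation. A minimal sketch with the Node.js driver (the comments collection name is an assumption):

// Update answer '104' no matter how deep it sits in the thread
db.collection('comments').updateOne(
  { id: '104' },
  { $set: { text: 'edited answer text' } }
);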
I think you should store the depth level of each node and then dynamically create the queries.
