Mongoose update push to start of array - node.js

I want to push to the start of an array while updating (a single document). I am using findOneAndUpdate, but it seems that Mongoose doesn't support the $position operator. I can achieve this with the native driver via Model.collection.update:
{
  '$push': {
    post_IDs: {
      '$each': [articles],
      '$position': 0
    }
  }
}
but the native driver doesn't return the updated document, which is why I can't use it here. Is there any way to push to the start of the array while receiving the updated document in the callback, apart from using find() followed by save()?

Mongoose does not support these newer operators directly, but the underlying driver dependency should be recent enough as long as your Mongoose is a recent release.
You get the underlying "node native" driver functions by using the .collection accessor on the model:
Model.collection.findAndModify(
  { field: "value" },   // query
  [],                   // sort
  {
    "$push": {
      "post_IDs": {
        "$each": [articles],
        "$position": 0
      }
    }
  },
  { new: true },        // return the modified document
  function(err, doc) {
    // doc is the updated document
  }
);
The method there is .findAndModify() from the native driver. The syntax is a little different: first comes the "query", then a "sort" array, then the "update" document. The options are also set to return the "new" document, since .findAndModify() returns the original document by default.
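For what it's worth, .findAndModify() has since been deprecated in the native driver in favour of the promise-based .findOneAndUpdate(), and newer Mongoose releases may pass $position through unchanged. A minimal sketch of the native .findOneAndUpdate() route, assuming a recent driver version:
// Sketch only: assumes a recent node-mongodb-native driver where
// findOneAndUpdate() replaces the deprecated findAndModify().
const result = await Model.collection.findOneAndUpdate(
  { field: "value" },
  {
    "$push": {
      "post_IDs": {
        "$each": [articles],
        "$position": 0   // prepend instead of append
      }
    }
  },
  { returnDocument: "after" }   // return the updated document
);
// Depending on the driver version, the updated document is either
// `result` itself or `result.value`.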

Related

How to get the callback of update_one in MongoDB using Django?

I'm trying to update some values in a document in a collection using
dbconn.collectionName.update_one(
    {'_id': ObjectId('6396c654efd251498ea4ebbc')},
    {
        '$addToSet': {
            'updateHistory': {
                'key1': 'Text',
                'key2': 'date',
                'key3': ObjectId('633456783er5t672342134')
            }
        },
        '$set': {
            'key1': 'Text',
            'key3': ObjectId('633456783er5t672342134')
        }
    }
)
What I'm trying to achieve is to find out whether the document was updated or not from the result of update_one().
If it was updated I need to take certain actions depending on that, so how is that possible?
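There is no callback to get here: PyMongo's update_one() synchronously returns an UpdateResult, and its matched_count and modified_count fields tell you whether the document was updated. The Node.js driver used elsewhere in this thread exposes the same pattern; a minimal sketch with placeholder filter and update values:
// updateOne resolves with a result object; matchedCount and
// modifiedCount are the fields to branch on.
const result = await dbconn.collection('collectionName').updateOne(
  { _id: someId },
  { $set: { key: 'Text' } }
);

if (result.modifiedCount > 0) {
  // the document was actually changed: do the follow-up actions here
} else if (result.matchedCount > 0) {
  // the document matched but already held these values
} else {
  // no document matched the filter at all
}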

Mongodb merge update nested fields in a subdocument

I am searching for whether there exists, in MongoDB or its Node.js driver, any method to merge/update a subdocument like this:
The sample document, for example in collection C:
{
  subdocument: {
    a: 1,
    b: 2
  }
}
The update query that I do NOT want to use:
db.C.updateOne(
  {},
  {
    $set: {
      "subdocument.b": 3
    }
  }
)
The update query that I want to use:
db.C.updateOne(
  {},
  {
    $set: {
      subdocument: {
        b: 3
      }
    }
  }
)
The resulting document I get when running such a query:
{
  subdocument: {
    b: 3
  }
}
The resulting merged document I would like to get instead:
{
  subdocument: {
    a: 1,
    b: 3
  }
}
For the record, the reason I want this is that I'm trying to use interfaces to avoid, as much as possible, writing schema keys inside strings, so that as many of them as possible are checked by the TypeScript compiler. So I know the way to do this is to update "subdocument.b", but that is exactly what I am trying to avoid, since it relies on a string.
Obviously, the subdocument is not merged but fully replaced by the standard update without options. I would like to know if there is a way to do this natively using the MongoDB query language, the aggregation framework, the MongoDB Node.js driver, or maybe something else. From what I could learn by myself it seems unsupported, but maybe it is and someone can tell me how?
Thanks
It's very easy to do that if you use pipeline updates (MongoDB >= 4.2).
With pipeline updates all aggregation operators can be used, but only a limited set of stages (see the update database command, which drivers use internally).
Inside the pipeline itself you can use only aggregation operators; query operators are allowed only in the query part, via $expr. The old update operators don't work inside a pipeline.
db.collection.update({},
[
  {
    "$addFields": {
      "subdocument": {
        "$mergeObjects": [
          "$subdocument",
          {
            "b": 3
          }
        ]
      }
    }
  }
])
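The same pipeline works from the Node.js driver as well; a minimal sketch, assuming the collection is named C as in the question:
// Pipeline update (MongoDB >= 4.2): $mergeObjects keeps the existing
// fields of `subdocument` and only overwrites the ones supplied here.
await db.collection('C').updateOne({}, [
  {
    $addFields: {
      subdocument: {
        $mergeObjects: ['$subdocument', { b: 3 }]
      }
    }
  }
]);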

NodeJS MongoDb updateMany() with a condition?

I want to update a MongoDb collection with an Array of JSON objects.
However, I want the update to ignore any objects that already exist in the DB.
Sounds easy, but the key that lets me know whether an object already exists is not '_id' but a different 'id' (or some other key).
Is that possible ?
Currently I am using it like this:
dbHandle.collection('contents').updateMany(contents);
where 'contents' is the Array of JSON objects.
Thanks in advance
The following operation updates all documents where violations is greater than 4 and $sets a flag for review:
try {
  db.restaurant.updateMany(
    { violations: { $gt: 4 } }, // your condition
    { $set: { "Review": true } } // your update document
  );
} catch (e) {
  print(e);
}
Change the condition accordingly.
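Note that a plain updateMany() applies one update document to every match, which doesn't quite cover skipping objects that already exist. A sketch of one way to handle that case with bulkWrite(), assuming each object in contents carries the identifying id field:
// One upsert per object: $setOnInsert writes the fields only when no
// document with that id exists yet, so existing documents are untouched.
const ops = contents.map(doc => ({
  updateOne: {
    filter: { id: doc.id },
    update: { $setOnInsert: doc },
    upsert: true
  }
}));

await dbHandle.collection('contents').bulkWrite(ops);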

Mongoose, Nodejs - replace many documents in one I/O?

I have an array of objects and I want to store them in a collection using only one I/O operation if it's possible. If any document already exists in the collection I want to replace it, or insert it otherwise.
These are the solutions that I found, but they don't work exactly as I want:
insertMany(): this doesn't replace documents that already exist, but throws an exception instead (this is what I found in the MongoDB documentation, but I don't know if it's the same in Mongoose).
update() or updateMany() with upsert = true: this doesn't help either, because it would apply the same update to all of the stored documents.
There is no replaceMany() in MongoDB or Mongoose.
Does anyone know an optimal way to do a replaceMany using Mongoose and Node.js?
There is bulkWrite (https://docs.mongodb.com/manual/reference/method/db.collection.bulkWrite/), which makes it possible to execute multiple operations at once. In your case, you can use it to perform multiple replaceOne operations with upsert. The code below shows how you can do it with Mongoose:
// Assuming *data* is an array of documents that you want to insert (or replace)
const bulkData = data.map(item => ({
  replaceOne: {
    upsert: true,
    filter: {
      // Filter specification. You must provide a field that
      // identifies *item*
    },
    replacement: item
  }
}));

Model.bulkWrite(bulkData);
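Note that bulkWrite() sends all the operations in a single request. By default they are executed in order and execution stops at the first error; passing { ordered: false } lets the remaining operations proceed even if one of them fails.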
You need to query like this:
db.getCollection('hotspot').update({
  /* your condition */
}, {
  $set: {
    "New Key": "Value"
  }
}, {
  multi: true,
  upsert: true
});
It fulfils your requirements!

Avoid Aggregate 16MB Limit

I have a collection of about 1M documents. Each document has an internalNumber property, and I need to get all internal numbers in my Node.js code.
Previously I was using
db.docs.distinct("internalNumber")
or
collection.distinct('internalNumber', {}, {},(err, result) => { /* ... */ })
in Node.
But with the growth of the collection I started to get the error: distinct is too big, 16m cap.
Now I want to use aggregation. It consumes a lot of memory and it is slow, but it is OK since I need to do it only once at the script startup. I've tried following in Robo 3T GUI tool:
db.docs.aggregate([{$group: {_id: '$internalNumber'} }]);
It works, and I wanted to use it in node.js code the following way:
collection.aggregate([{ $group: { _id: '$internalNumber' } }],
  (err, docs) => { /* ... */ });
But in Node I get an error: "MongoError: aggregation result exceeds maximum document size (16MB) at Function.MongoError.create".
Please help to overcome that limit.
The problem is that the native driver differs from how the shell method works by default: the shell actually returns a "cursor" object, whereas the native driver needs this option set "explicitly".
Without a "cursor", .aggregate() returns a single BSON document as an array of documents, so we turn it into a cursor to avoid the limitation:
let cursor = collection.aggregate(
  [{ "$group": { "_id": "$internalNumber" } }],
  { "cursor": { "batchSize": 500 } }
);

cursor.toArray((err, docs) => {
  // work with results
});
Then you can use regular methods like .toArray() to turn the results into a JavaScript array, which on the client does not share the same limitation, or use other methods for iterating a "cursor".
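If even the final array would be too large to hold comfortably in memory, the cursor can also be consumed one document at a time; a minimal sketch, assuming a driver version recent enough that cursors are async iterables:
// Stream the grouped results instead of materializing them all at once
// with .toArray().
const cursor = collection.aggregate(
  [{ "$group": { "_id": "$internalNumber" } }],
  { "cursor": { "batchSize": 500 } }
);

const internalNumbers = [];
for await (const doc of cursor) {
  internalNumbers.push(doc._id);
}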
For Casbah users:
val pipeline = ...
collection.aggregate(pipeline, AggregationOptions(batchSize = 500, outputMode = AggregationOptions.CURSOR))
