mongo $push overwrites rather than adds to an array within a document - node.js

I am trying to create a historical record for updates to a document in Mongo DB via NodeJS. The document updates are only in one object within the document, so it seems like creating an array of historical values makes sense.
However, when I use the $push operator with db.collection.update(), it only overwrites the element at index 0 of the array rather than appending a new element.
Here is what I have:
{
  _id: ID,
  odds: {
    spread: CURRENTSPREAD,
    total: CURRENTTOTAL,
    history: [
      0: {
        spread: PREVIOUSSPREAD1,
        total: PREVIOUSTOTAL1,
        date: DATEENTERED
      }
    ]
  }
}
Here is what I would like:
{
  _id: ID,
  odds: {
    spread: CURRENTSPREAD,
    total: CURRENTTOTAL,
    history: [
      0: {
        spread: PREVIOUSSPREAD1,
        total: PREVIOUSTOTAL1,
        date: DATEENTERED1
      },
      1: {
        spread: PREVIOUSSPREAD2,
        total: PREVIOUSTOTAL2,
        date: DATEENTERED2
      },
      ...,
      n: {
        spread: PREVIOUSSPREAD-N,
        total: PREVIOUSTOTAL-N,
        date: DATEENTERED-N
      }
    ]
  }
}
There is no need to check whether the previous value exists before adding.
Here is my code:
var oddsHistoryUpdate = {
  $push: {
    'odds.history': {
      spread: game.odds.spread,
      total: game.odds.total,
      date: Date.now()
    }
  }
}

db.collection('games').update({ "_id": ID }, oddsHistoryUpdate)
  .then(finish executing)
Why is it only writing to index 0 instead of appending to the array, and how do I fix it?

Bigga_HD's answer is the correct one regarding the $push operator. However, there may be an alternative solution that is more aligned with how MongoDB works under the hood.
A single document in MongoDB has a hard limit of 16MB, and if a document is frequently updated, it is possible that the array grows so large that it hits this limit.
Alternatively, you can just insert a new document into the collection instead of pushing the old document inside an array. The new & old documents can be differentiated by their insertion date. For example:
{
  _id: ID,
  name: <some identification>,
  insert_date: ISODate(...),
  odds: {
    spread: CURRENTSPREAD,
    total: CURRENTTOTAL
  }
}
You can then query the collection using a combination of e.g. its name and insert_date, sorted by insert_date descending and limited to 1, to get the latest version:
db.collection.find({name: ...}).sort({insert_date: -1}).limit(1)
or remove the limit to find all versions:
db.collection.find({name: ...}).sort({insert_date: -1})
To support this query, you can create an index based on name and insert_date in descending order (see Create Indexes to Support Your Queries)
db.collection.createIndex({name: 1, insert_date: -1})
As a bonus, you can use a TTL index on the insert_date field to automatically delete old document versions.
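For example, a TTL index that removes versions older than 30 days might look like this (the 30-day window is an arbitrary choice):

db.collection.createIndex({ insert_date: 1 }, { expireAfterSeconds: 2592000 })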

$push
The $push operator appends a specified value to an array.
The $push operator has the form:
{ $push: { <field1>: <value1>, ... } }
If the field is absent in the document to update, $push adds the array field with the value as its element.
If the field is not an array, the operation will fail.
If the value is an array, $push appends the whole array as a single element. To add each element of the value separately, use the $each modifier with $push.
$each - Appends multiple values to the array field.
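For instance, appending several history entries in one update might look like this (a sketch reusing the question's field names; the sample values are invented):

db.games.updateOne(
  { _id: ID },
  {
    $push: {
      'odds.history': {
        $each: [
          { spread: 3.5, total: 41, date: Date.now() },
          { spread: 4.0, total: 42.5, date: Date.now() }
        ]
      }
    }
  }
)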
When used with modifiers, $push has the form:
{ $push: { <field1>: { <modifier1>: <value1>, ... }, ... } }
This should do the trick for you. Obviously, it's a very simplified example.
let oddsHistoryUpdate = {
  spread: game.odds.spread,
  total: game.odds.total,
  date: Date.now()
}

db.games.update(
  { _id: ID },
  { $push: { 'odds.history': oddsHistoryUpdate } }
)
I suggest trying Mongoose for your Node.js/MongoDB interactions.

The answer was uncovered by dnickless.
In a previous call, I was updating the main odds object, which I didn't realize was wiping out the history array.
Updating the previous call from
update({ $set: { odds: { spread: SPREAD, total: TOTAL } } })
to
update($set: {"odds.spread": SPREAD, "odds.total": TOTAL})
and then making my $push call as written, everything works fine. Setting the whole odds object replaces it wholesale, including the history array, while dot notation updates only the named fields.
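For completeness, a minimal sketch of the corrected sequence using the Node.js driver (updateOne is the modern form of update; SPREAD and TOTAL stand in for the new values):

const games = db.collection('games');

// 1. Archive the current odds first; $push appends to odds.history.
await games.updateOne(
  { _id: ID },
  { $push: { 'odds.history': { spread: game.odds.spread, total: game.odds.total, date: Date.now() } } }
);

// 2. Then update only the scalar fields; dot notation leaves odds.history intact.
await games.updateOne(
  { _id: ID },
  { $set: { 'odds.spread': SPREAD, 'odds.total': TOTAL } }
);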

Related

Using updateOne with Arrays in Mongoose with NodeJS

I'm working on an API for a custom game and I need to update specific fields in an array in my database entry. How do I do that without adding a new item to the array, instead just updating, for example, the content of entry 5 in the array?
Hello, I'm not sure of the name of your schema, so I'm assuming it's called Game. Here is an example of how to update an index in an array, assuming it's the board array field you're trying to update. Also see the documentation on updating an array field by its index for more details:
var fieldPosition = "board." + req.params.field
await Game.updateOne({
_id: 1
}, [{
$set: {
tempBoard: fieldPosition
}
},
{
$set: {
"$tempBoard": session.turn
}
},
{
$unset: ["tempBoard"]
}
])

Is it possible to group up all documents returned from a query into a dictionary-like structure based on one of their fields? [duplicate]

I have a collection in my MongoDB:
{ userId: 1234, name: 'Mike' }
{ userId: 1235, name: 'John' }
...
I want to get a result of the form
dict[userId] = document
in other words, I want a result that is a dictionary where the userId is the key and the rest of the document is the value.
How can I do that?
You can use $arrayToObject to do that; you just need to format the documents as an array of {k, v} pairs first.
It is not clear if you want one dictionary for all documents, or each document in a dictionary format. I guess you want the first option, but I'm showing both:
One dictionary with all data*, requires a $group (which also format the data):
db.collection.aggregate([
  {
    $group: {
      _id: null,
      data: { $push: { k: { $toString: "$userId" }, v: "$$ROOT" } }
    }
  },
  {
    $project: { data: { $arrayToObject: "$data" } }
  },
  {
    $replaceRoot: { newRoot: "$data" }
  }
])
See how it works on the playground example - one dict
*Notice that in this option, all the data is pushed into one document, and a document has a size limit (16 MB).
Dictionary format: If you want to get all documents as different results, but with a dictionary format, just replace the first step of the aggregation with this:
{
  $project: {
    data: [{ k: { $toString: "$userId" }, v: "$$ROOT" }],
    _id: 0
  }
},
See how it works on the playground example - dict per document
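Alternatively, you could build the dictionary client-side in Node.js. A minimal sketch using the driver (the collection name users is an assumption):

const docs = await db.collection('users').find({}).toArray();
// Key each document by its userId
const dict = Object.fromEntries(docs.map(doc => [doc.userId, doc]));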

Mongoose bulk update

I want to be able to update an array of objects where each object has a new unique value assigned to it.
Here is a simplified example of what I'm doing. items is an array of my collection items.
let items = [{_id: '903040349304', number: 55}, {_id: '12341244', number: 1166}, {_id: '667554', number: 51115}]
I want to assign a new number to each item, and then update it in collection:
items = items.map(item => {
  item.number = randomInt(0, 1000000);
  return item;
})
What would be the best way to update the collection at once? I know that I could do it in forEach instead of map, however this seems like a dirty way of doing it, as it won't do a bulk update.
items.forEach(async (item) => {
  await this.itemModel.update({ _id: item._id }, { number: randomInt(0, 1000000) });
});
I've checked updateMany as well, but my understanding is that it's only used to update documents with the same new value; in my case, every document gets a new unique value assigned to it.
After a bit of thinking, I came up with this solution using bulkWrite.
const updateQueries = items.map((item) => ({
  updateOne: {
    filter: { _id: item._id },
    // bulkWrite updates need atomic operators such as $set
    update: { $set: { number: item.number } },
  },
}));
await this.itemModel.bulkWrite(updateQueries);
About bulkWrite:
Sends multiple insertOne, updateOne, updateMany, replaceOne, deleteOne, and/or deleteMany operations to the MongoDB server in one command. This is faster than sending multiple independent operations (like if you use create()) because with bulkWrite() there is only one round trip to MongoDB.
You can run an update with an aggregation pipeline to update the documents in place, without needing to pull them first (MongoDB 4.4+ for $rand):
Step 1: get a random number with MongoDB's built-in $rand operator, which returns a number between 0 and 1
Step 2: $multiply this number by 1000000, since that is what you defined ;)
Step 3: use another $set with $floor to remove the decimal portion
YourModel.updateMany({}, [
  {
    $set: {
      value: {
        $multiply: [{ $rand: {} }, 1000000]
      }
    }
  },
  {
    $set: {
      value: { $floor: '$value' }
    }
  }
])

Mongoose find all documents where array.length is greater than 0 & sort the data

I am using mongoose to perform CRUD operation on MongoDB. This is how my schema looks.
var EmployeeSchema = new Schema({
  name: String,
  description: {
    type: String,
    default: 'No description'
  },
  departments: []
});
Each employee can belong to multiple departments. The departments array will look like [1,2,3]; in this case departments.length = 3. If the employee does not belong to any department, departments.length will be equal to 0.
I need to find all employees where departments.length > 0, and if the query returns more than 10 records, I need to get only the employees with the most departments.
Is it possible to use Mongoose.find() to get the desired result?
Presuming your model is called Employee:
Employee.find({ "departments.0": { "$exists": true } },function(err,docs) {
})
Here $exists asks whether index 0 of the array is present, which means the array has at least one element.
The same applies to a minimum number:
Employee.find({ "departments.9": { "$exists": true } },function(err,docs) {
})
So that needs to have at least 10 entries in the array to match.
Really though you should record the length of the array and update with $inc every time something is added. Then you can do:
Employee.find({ "departmentsLength": { "$gt": 0 } },function(err,docs) {
})
On the "departmentsLength" property you store. That property can be indexed, which makes it much more efficient.
For some reason, the selected answer doesn't work as of now. There is the $size operator.
Usage:
collection.find({ field: { $size: 1 } });
Will look for arrays with length 1.
You can use $where like this:
await EmployeeSchema.find( {$where:'this.departments.length>0'} )
If anyone is looking for an array length greater than 1, you can do it like below:
db.collection.find({ "arrayField.1" : { $exists: true }})
The above query checks whether the array field has a value at index 1, which means it has more than 1 item in the array. Note: array indexes start from 0.
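For an arbitrary threshold, a sketch using $expr with $size (MongoDB 3.6+):

// Matches documents whose array field has more than 2 elements
db.collection.find({ $expr: { $gt: [{ $size: "$arrayField" }, 2] } })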

MongoDB aggregate to find first document with value greater than n

Given a LARGE (hundreds of thousands) collection of event documents (see below for example), what is the most performant method to retrieve the first event with an _id greater than (n) ?
Example Document
{
  _id: NumberLong(352757), // Uniqueness guaranteed
  type: "BallDropped",
  createdAt: "2014-01-01T00:00:00Z",
  // ... followed by dynamic properties of unknown size
}
Current Implementation
Given a collection of many events, retrieve the first event with an _id greater than 35.
First, retrieve the _id of the event using aggregate.
I do this assuming that the projection stage (returning just the _id) will be more performant than cycling over full documents of unknown size.
db.events.aggregate([
  { $project: { _id: 1 } },
  { $match: { _id: { $gt: NumberLong(35) } } },
  { $sort: { _id: 1 } },
  { $limit: 1 }
])
Then, I call findOne with the returned _id to retrieve that document.
What are your thoughts?
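For comparison, the same lookup can be expressed as one query that walks the default _id index and avoids the second round trip (a sketch, not a benchmarked claim):

db.events.find({ _id: { $gt: NumberLong(35) } }).sort({ _id: 1 }).limit(1)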
