Most efficient way to check if element exists in a set - node.js

In my MongoDB database I have a collection holding user posts.
Within that collection each post document has a field called "likes", which holds an array of the ids of the users who have liked that post. When querying, I would like to pass a user id and get a boolean in the result telling me whether that id exists in the array, i.e. whether the user has already liked the post. I understand this would be easy to do with two queries (one to get the post and one to check whether the user has liked it), but I would like to find the most efficient way to do this.
For example, one of my documents looks like this:
{
  _id: 24jef247jos991,
  post: "Test Post",
  likes: ["userid1", "userid2"]
}
When I query as "userid1" I would like the result to be:
{
  _id: 24jef247jos991,
  post: "Test Post",
  likes: ["userid1", "userid2"],
  userLiked: true
}
But when I query as, say, "userid3" I would like:
{
  _id: 24jef247jos991,
  post: "Test Post",
  likes: ["userid1", "userid2"],
  userLiked: false
}

You can add an $addFields stage that checks each document's likes array against the input user id.
db.collection.aggregate( [
  {
    $addFields: {
      "userLiked": { $in: [ "userid1", "$likes" ] }
    }
  }
] )

Starting from MongoDB 3.4 you can use the $in aggregation operator to check whether an array contains a given element, and the $addFields aggregation stage to add the newly computed value to your document without explicitly projecting the other fields.
db.collection.aggregate( [
{ "$addFields": { "userLiked": { "$in": [ "userid1", "$likes" ] } } }
])
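For completeness, this is roughly how you would run that pipeline from Node.js with the official MongoDB driver; the posts collection name and the userId variable are placeholders rather than anything from the question:
// Inside an async function; `db` is an open Db instance from the MongoDB
// Node.js driver and `userId` holds the id of the requesting user (both assumed).
const posts = await db.collection('posts').aggregate([
  { $addFields: { userLiked: { $in: [userId, '$likes'] } } }
]).toArray();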
In MongoDB 3.2, you can use the $setIsSubset operator with a square-bracket [] array literal to do this. The downside of this approach is that you need to manually $project all the fields in your document. Also, the $setIsSubset operator will de-duplicate your array, which may not be what you want.
db.collection.aggregate([
  { "$project": {
    "post": 1, "likes": 1,
    "userLiked": { "$setIsSubset": [ [ "userid3" ], "$likes" ] }
  }}
])
Finally, if your mongod version is 3.0 or older, you need to use the $literal operator instead of the [] notation.
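A sketch of what that looks like, reusing the $project stage above with the input array wrapped in $literal:
db.collection.aggregate([
  { "$project": {
    "post": 1, "likes": 1,
    "userLiked": { "$setIsSubset": [ { "$literal": [ "userid3" ] }, "$likes" ] }
  }}
])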

Related

updateOne nested Array in mongodb

I have a groups collection that has an array field order containing ids.
I would like to use updateOne to set multiple items in that order array.
I tried this, which updates one value in the array:
db.groups.updateOne(
  { _id: '831e0572-0f04-4d84-b1cf-64ffa9a12199' },
  { $set: { 'order.0': 'b6386841-2ff7-4d90-af5d-7499dd49ca4b' } }
)
That correctly updates (or sets) the array value with index 0.
However, I want to set more array values, and since updateOne also supports an aggregation pipeline I tried this:
db.slides.updateOne(
  { _id: '831e0572-0f04-4d84-b1cf-64ffa9a12199' },
  [
    { $set: { 'order.0': 'b6386841-2ff7-4d90-af5d-7499dd49ca4b1' } }
  ]
)
This does NOTHING if the order array is empty. But if it's not, it replaces every element in the order array with an object { 0: 'b6386841-2ff7-4d90-af5d-7499dd49ca4b1' }.
I don't understand that behavior.
In the optimal case I would just do
db.slides.updateOne(
  { _id: '831e0572-0f04-4d84-b1cf-64ffa9a12199' },
  [
    { $set: { 'order.0': 'b6386841-2ff7-4d90-af5d-7499dd49ca4b1' } },
    { $set: { 'order.1': 'otherid' } },
    { $set: { 'order.2': 'anotherone' } }
  ]
)
And that would just update the order array with the values.
What is happening here and how can I achieve my desired behavior?
Updating an array element by its index position is only supported in regular update queries, not in aggregation-pipeline updates.
This feature is explained in the documentation of the regular update $set operator, but not in the aggregation $set stage. In a pipeline, 'order.0' is treated as an embedded field path, so for an array field it sets a field named 0 on every element, which is why each element gets replaced with an object { 0: '...' } (and why nothing happens when the array is empty).
The correct implementation with a regular update query:
db.slides.updateOne(
  { _id: '831e0572-0f04-4d84-b1cf-64ffa9a12199' },
  {
    $set: {
      'order.0': 'b6386841-2ff7-4d90-af5d-7499dd49ca4b1',
      'order.1': 'otherid',
      'order.2': 'anotherone'
    }
  }
)
If you are looking for an aggregation-only query, it is a much longer process than the regular update query above, and I don't recommend it. Instead, you can build the update document in your client-side language and use a regular update query, roughly as sketched below.
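A minimal sketch of that client-side approach, assuming a connected Node.js driver db handle and that indexes and values are parallel arrays describing which positions to set and what to put there:
// e.g. indexes = [0, 1, 2], values = ['id-a', 'id-b', 'id-c'] (assumed inputs)
const setOps = {};
indexes.forEach((idx, i) => {
  setOps[`order.${idx}`] = values[i];
});

await db.collection('slides').updateOne(
  { _id: '831e0572-0f04-4d84-b1cf-64ffa9a12199' },
  { $set: setOps }
);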
If you have to use the aggregation framework, try this (you will have to pass the array of indexes and the array of updated values separately):
$map and $range to iterate over the order array by index
$cond and $arrayElemAt to check whether the current index is in the array of indexes that have to be updated. If it is, take the value at the same position in the array of new values. If it is not, keep the current value.
NOTE: This will work only if the array of indexes that you want to update starts from 0 and goes up (as in your example).
db.collection.update(
  { _id: '831e0572-0f04-4d84-b1cf-64ffa9a12199' },
  [
    {
      "$set": {
        "order": {
          "$map": {
            input: { $range: [ 0, { $size: "$order" } ] },
            in: {
              $cond: [
                { $in: [ "$$this", [ 0, 1, 2 ] ] },
                {
                  $arrayElemAt: [
                    [ "b6386841-2ff7-4d90-af5d-7499dd49ca4b1", "otherid", "anotherone" ],
                    "$$this"
                  ]
                },
                { $arrayElemAt: [ "$order", "$$this" ] }
              ]
            }
          }
        }
      }
    }
  ]
)
Here is the working example: https://mongoplayground.net/p/P4irM9Ouyza

Mongoose: Using $addFields, $filter and $map inside deeply nested array in mongoose document

I have a mongoose schema that is structured like this:
Schema E = {
  _id,
  ... some fields,
  details: [
    {
      ...somefields,
      people: [ObjectIds]
    }
  ]
}
First, I have an aggregate query where I am using $geoNear then $match, and then $facet.
After the operations the document that I get is as follows:
estates: [
  {
    _id,
    ... some fields,
    details: [
      {
        ...somefields,
        people: [ObjectIds]
      }
    ],
    ... other fields
  },
  ... more estate objects
],
page: [...some objects]
I have an array called approved which has some object Ids.
I want to filter the people array inside estates.details while keeping the rest of the fields intact.
The result I want is as follows:
NOTE: The field filteredPeople is the array I want after filtering people against approved.
estates: [
  {
    _id,
    ... some fields,
    details: [
      {
        ...somefields,
        filteredPeople: [ObjectIds],
        numberOfPeople: Size of people array
      }
    ],
    ... other fields
  },
  ... more estate objects
],
page: [...some objects]
This is what I tried doing:
{
  "estates": {
    "$map": {
      "input": "$estates",
      "as": "estate",
      "in": {
        "details": {
          "$map": {
            "input": "$$estate.details",
            "as": "detail",
            "in": {
              "filteredPeople": {
                "$filter": {
                  "input": "$$detail.people",
                  "as": "people",
                  "cond": { "$in": ["$$people", approved] }
                }
              }
            }
          }
        }
      }
    }
  }
}
But this erases the other fields. The other way would be to create a separate field called estatePeople where the result of the $addFields would be stored.
I could then try to merge the two arrays. But I don't have any field to match them on, as the second estatePeople array will contain nothing but filteredPeople. So I would somehow have to merge the two arrays purely by index, i.e. by the position in which the elements appear.
Can someone please help me out on how to get the desired document with relatively good performance?
For anyone who has the same problem:
In the end, I was unable to find any way to execute the query that I wanted with reasonable performance.
This schema design is not the optimal one for such complicated queries. What I ended up doing was turning each entry of the details array into its own document, and then creating a parent schema that keeps references to the details belonging to the same estate.
You can use reverse referencing or forward referencing depending on the queries that you want to execute, roughly as sketched below.
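A rough sketch of that kind of redesign in Mongoose; the schema and field names here are illustrative, not the exact ones from the project:
const mongoose = require('mongoose');

// Each detail becomes its own document, with a reverse reference to its estate.
const DetailSchema = new mongoose.Schema({
  estate: { type: mongoose.Schema.Types.ObjectId, ref: 'Estate' },
  people: [mongoose.Schema.Types.ObjectId]
  // ...other detail fields
});

// The parent keeps forward references to the details of the same estate.
const EstateSchema = new mongoose.Schema({
  // ...estate fields
  details: [{ type: mongoose.Schema.Types.ObjectId, ref: 'Detail' }]
});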

mongodb find with calculated field

I'm trying to create a MongoDB query that uses one of the document's own field values inside the filter. For example:
var myIdVariable = '1jig23h34r34r30h';
var myVisibleVariable = false;
var myDistanceVariable = 100;
db.getCollection.find({
  '_id': myIdVariable,
  'isVisible': myVisibleVariable,
  'distanceRange': {$lte: {myDistanceVariable - distanceRange}}
})
So I want to filter on distanceRange based on the calculation (myDistanceVariable - distanceRange), with distanceRange taken from the document being queried.
I don't know if I have explained my problem clearly. Is this possible?
Thank you.
Use the $expr operator to build a query expression that lets you compare fields of the same document, and in particular compare the distanceRange field with a calculation involving the field itself and your variables.
You need the $and aggregation operator to combine this with the other conditions, so your final query would look like the following:
db.getCollection('collectionName').find({
  '$expr': {
    '$and': [
      { '$eq': [ '$isVisible', myVisibleVariable ] },
      { '$lte': [
        '$distanceRange',
        { '$subtract': [ myDistanceVariable, '$distanceRange' ] }
      ] }
    ]
  }
})
If your MongoDB server doesn't support the $expr operator (it was introduced in 3.6), go for the aggregation framework route with $redact:
db.getCollection('collectionName').aggregate([
  { "$redact": {
    "$cond": [
      {
        '$and': [
          { '$eq': [ '$isVisible', myVisibleVariable ] },
          { '$lte': [
            '$distanceRange',
            { '$subtract': [ myDistanceVariable, '$distanceRange' ] }
          ] }
        ]
      },
      "$$KEEP",
      "$$PRUNE"
    ]
  } }
])
Note
Including the _id in the query expressions means you are narrowing down your selection to a single document, and the query may not return any results, since it is looking for the specific document with that _id AND that same document must also satisfy the other query expressions.

Query data where userID is in multiple IDs

I am trying to make a query and I don't know the right way to do this.
The mongo collection structure contains multiple user IDs (uid), and I want to make a query that gets all the data ("Albums") where the user ID matches one of the uids.
I do not know if the structure of the collection is good for that, and I would like to know if I should do it differently.
{
  "_id": ObjectId("55814a9799677ba44e7826d1"),
  "album": "album1",
  "pictures": [
    "1434536659272.jpg",
    "1434552570177.jpg",
    "1434552756857.jpg",
    "1434552795100.jpg"
  ],
  "uid": [
    "12814a8546677ba44e745d85",
    "e745d677ba4412814e745d7b",
    "28114a85466e745d677d85qs"
  ],
  "__v": 0
}
I just searched on the internet and found this documentation http://docs.mongodb.org/manual/reference/operator/query/in/ but I'm not certain that this is the right way.
In short, I need to know whether I am using the right structure for the collection and whether the "$in" operator is the right solution (knowing that a document may have a lot of user IDs: between 2 and 2000 maximum).
You don't need $in unless you are matching for more than one possible value in a field, and that field does not have to be an array. $in is in fact shorthand for $or.
You just need a simple query here:
Model.find({ "uid": "12814a8546677ba44e745d85" },function(err,results) {
})
If you want "multiple" user id's then you can use $in:
Model.find(
{ "uid": { "$in": [
"12814a8546677ba44e745d85",
"e745d677ba4412814e745d7b",
] } },
function(err,results) {
}
)
Which is short for $or in this way:
Model.find(
{
"$or": [
{ "uid": "12814a8546677ba44e745d85" },
{ "uid": "e745d677ba4412814e745d7b" }
]
},
function(err,results) {
}
)
Just to answer your question, you can use the below query to get the desired result.
db.mycollection.find( {uid : {$in : ["28114a85466e745d677d85qs"] } } )
However, you need to revisit your data structure; this looks like a many-to-many problem and you might need to think about introducing an intermediate (junction) collection for that, roughly as sketched below.
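A rough sketch of what such a junction collection could look like; the collection name albumUsers and the field names are illustrative only:
// One document per (user, album) pair:
{
  "_id": ObjectId("..."),
  "userId": "12814a8546677ba44e745d85",
  "albumId": ObjectId("55814a9799677ba44e7826d1")
}

// Albums a given user belongs to:
db.albumUsers.find({ userId: "12814a8546677ba44e745d85" })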

Mongodb how can I query a subset of an array for a specific document

I want to run a query to select specific documents. Then on each document, open up an array of sub documents and run a query to filter those sub docs.
Example:
{
  "_id" : ObjectID(23412351346435),
  "list" : [
    { date: ISODate("2015-01-12T00:00:00.000Z"), name: "Jan 12" },
    { date: ISODate("2015-01-13T00:00:00.000Z"), name: "Jan 13" },
    { date: ISODate("2015-01-14T00:00:00.000Z"), name: "Jan 14" }
  ]
}
I'm guessing I can do something with Mongo's aggregate function. I have been able to match the documents I want, but how can I get a sub-query going on the array? I tried using $elemMatch, but that only returns the first item in the array that matches the date range.
To be clear, when I query for ObjectID(23412351346435) and the date range 2015-01-12 to 2015-01-13, I want it to return this:
{
  "_id" : ObjectID(23412351346435),
  "list" : [
    { date: ISODate("2015-01-12T00:00:00.000Z"), name: "Jan 12" },
    { date: ISODate("2015-01-13T00:00:00.000Z"), name: "Jan 13" }
  ]
}
As you guessed, you can use aggregation to get the results you are looking for. The steps you need in your aggregation pipeline are the following.
Match the documents you want. (I've done that here by _id).
Unwind the list array.
Match the dates you are looking for using a range query.
Group the documents back by _id.
Your query should look something like this:
db.collection.aggregate([
{ "$match": { "_id": ObjectId("54ef8b0acfb269d664de0b48")} },
{ "$unwind": "$list" },
{ "$match": {
"list.date": { $gte: ISODate("2015-01-12T00:00:00.000Z"),
$lte: ISODate("2015-01-13T00:00:00.000Z")
}
}},
{ "$group": {
"_id": "$_id",
"list": { "$push": { "date": "$list.date", "name": "$list.name" }}
}}
]);
You can use the $unwind operator. It allows you to take an array in a document, and clone that document with each of the array elements. Then you can match on the field. If necessary, you can use $group to wind the document back up.
[
  {$match:{...}},
  {$unwind:"$myfield"},
  {$match:{"myfield.name":"Jan 12"}},
  {$group:{ _id:"$_id", "myfield":{$push:"$myfield"} }}
]
Note that $unwind is slow for large sets of documents, and combinatoric for multiple arrays. It is the only option you really have for Mongo 2.4.x though. You might be better served by rearranging your data so that you do not have arrays.
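On MongoDB 3.2 and newer you can also avoid $unwind entirely by projecting a filtered copy of the array with $filter. A sketch using the example data above (this is not part of the original answer, just an alternative for newer servers):
db.collection.aggregate([
  { "$match": { "_id": ObjectId("54ef8b0acfb269d664de0b48") } },
  { "$project": {
    "list": {
      "$filter": {
        "input": "$list",
        "as": "item",
        "cond": { "$and": [
          { "$gte": [ "$$item.date", ISODate("2015-01-12T00:00:00.000Z") ] },
          { "$lte": [ "$$item.date", ISODate("2015-01-13T00:00:00.000Z") ] }
        ] }
      }
    }
  }}
])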
