MongoDB query using $near not working when nested - node.js

I have several documents in my Articles collection. Every document has a location value and some extra data. The location value looks like this:
"loc" : {
"type" : "Point",
"coordinates" : [
4,
54
]
}
I can build an index by executing the following command:
db.articles.ensureIndex({loc:"2dsphere"});
And I can query documents based on their location and a $maxDistance with the following query:
db.articles.find({ loc : { $near : {$geometry : {type : "Point" , coordinates : [4, 54] }, $maxDistance : 1000 } } });
This works perfectly!
However, when I change where the "loc" object sits in my document (nesting it deeper), my query always returns zero results. My document should look like this (this is a minimized version):
{
    "articledata" : {
        "content" : {
            "contact" : {
                "loc" : {
                    "type" : "Point",
                    "coordinates" : [
                        4.1,
                        54
                    ]
                }
            }
        }
    }
}
When I rebuild my index:
db.articles.ensureIndex({"articledata.content.contact.loc":"2dsphere"});
and execute my query again after changing my 'loc' location in the document:
db.articles.find({ "articledata.content.contact.loc" : { $near : {$geometry : {type : "Point" , coordinates : [4, 54] }, $maxDistance : 10000 } } });
There are no results.
It's probably some stupid mistake but I really can't find the problem...
Is there anyone who can help me out?
Thanks in advance!

Related

MongoDB: How to find documents by value?

Two documents containing ObjectId("6148a371c13a6a0be492ebf4")
Document 1
{
    "_id" : ObjectId("6144f66fb9543917f96fc"),
    "refId" : "ford",
    "template" : "6144f61cb96d772317f96f9",
    "fieldValues" : {
        "PDV" : [
            "6126938cd24a8aa3d37b4992",
            ObjectId("6148a371c13a6a0be492ebf4")
        ]
    },
    "group" : ObjectId("6144f66fb96d7731917f96fd"),
    "createdAt" : ISODate("2021-09-17T20:11:27.440Z"),
    "updatedAt" : ISODate("2021-09-20T15:06:26.146Z"),
    "__v" : 0
}
Document 2
{
    "_id" : ObjectId("6144f66fb96d77rr3217f96fc"),
    "refId" : "CCM",
    "template" : "6144f613296d7731917f96f9",
    "fieldValues" : {
        "DDB" : [
            "6126938cd2448aa3d37b4992",
            "5443938cd2448aa3d37b4992",
            ObjectId("6148a371c13a6a0be492ebf4")
        ]
    },
    "group" : ObjectId("6144f66fb96de431917f96fd"),
    "createdAt" : ISODate("2021-09-17T20:11:27.440Z"),
    "updatedAt" : ISODate("2021-09-20T15:06:26.146Z"),
    "__v" : 0
}
The ObjectId we are looking for is always inside fieldValues, but instead of PDV or DDB the key will always have a different name.
So we can't use this type of query:
db.getCollection('products').find({"fieldValues.PDV":ObjectId('6148a371c13a6a0be492ebf4')})
PS: The query has to run entirely on the DB; we can't afford to fetch all products and do the calculation on the backend, as there might be millions of products.
You can use this one:
db.collection.aggregate([
    {
        $set: {
            kv: { $first: { $objectToArray: "$fieldValues" } }
        }
    },
    { $match: { "kv.v": ObjectId("6148a371c13a6a0be492ebf4") } },
    { $unset: "kv" }
])
Mongo Playground
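Note that $first only inspects the first key of fieldValues. If a document could ever carry more than one key under fieldValues, a variant of the same idea (a sketch only, reusing the products collection name from the question) keeps the whole key/value array and matches against every entry:
db.products.aggregate([
    {
        $set: {
            // every key/value pair of fieldValues, e.g. [ { k: "PDV", v: [ ... ] } ]
            kv: { $objectToArray: "$fieldValues" }
        }
    },
    // "kv.v" traverses both the kv array and each v array,
    // so this matches if any value list contains the ObjectId
    { $match: { "kv.v": ObjectId("6148a371c13a6a0be492ebf4") } },
    { $unset: "kv" }
])
Keep in mind that matching on a computed field such as kv cannot use an index, so either form scans the collection.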
db.products.find({'_id': ObjectId("6148a371c13a6a0be492ebf4")})
The mistake in your code is that you used key instead of _id.
This way of writing it is much easier on the fingers though.
You'd think a solution like this would work, but one reason it may not is that you're trying to use === on an object. If you refer to this thread, it might help to use .equals() instead of ===.
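As a minimal sketch of that comparison (assuming the official mongodb Node.js driver and its ObjectId class):
const { ObjectId } = require('mongodb');

const a = new ObjectId('6148a371c13a6a0be492ebf4');
const b = new ObjectId('6148a371c13a6a0be492ebf4');

console.log(a === b);     // false: two distinct object instances
console.log(a.equals(b)); // true: compares the underlying 12-byte value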

$add,$subtract aggregation-framework in mongodb

Hi, I am including the sample data:
// collection: test
{
    "_id" : {
        "date" : ISODate("2020-02-11T17:00:00Z"),
        "userId" : ObjectId("5e43e5cdc11f750864f46820"),
        "adminId" : ObjectId("5e43de778b57693cd46859eb")
    },
    "outstanding" : 212.39999999999998,
    "totalBill" : 342.4,
    "totalPayment" : 130
}
{
    "_id" : {
        "date" : ISODate("2020-02-11T17:00:00Z"),
        "userId" : ObjectId("5e43e73169fe1e3fc07eb7c5"),
        "adminId" : ObjectId("5e43de778b57693cd46859eb")
    },
    "outstanding" : 797.8399999999999,
    "totalBill" : 797.8399999999999,
    "totalPayment" : 0
}
I need to structure a query which does the following things:
I need to calculate actualOutstanding = (totalBill + outstanding) - totalPayment,
and I need to save this actualOutstanding back into the same collection, in the same document, matched by {"_id" : {"date", "userId", "adminId"}}.
NOTE: userId is different in the two documents.
Mongo version 4.2+ introduced pipelined updates, meaning we can now use aggregation expressions to update documents.
db.collection.updateOne(
    {
        '_id.adminId': ObjectId("5e43de778b57693cd46859eb"),
        '_id.userId': ObjectId("5e43e73169fe1e3fc07eb7c5"),
        '_id.date': ISODate("2020-02-11T17:00:00Z")
    },
    [
        { '$set': {
            actualOutstanding: {
                $subtract: [ { $add: ['$totalBill', '$outstanding'] }, '$totalPayment' ]
            }
        } }
    ]
);
For any older Mongo version you have to split it into two actions: first query and calculate, then update the document with the calculated value.
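A rough sketch of that two-step fallback in the shell (using the test collection named in the question; note the read-then-write is not atomic):
// Step 1: read the document and compute the value on the client
var doc = db.test.findOne({
    '_id.adminId': ObjectId("5e43de778b57693cd46859eb"),
    '_id.userId': ObjectId("5e43e73169fe1e3fc07eb7c5"),
    '_id.date': ISODate("2020-02-11T17:00:00Z")
});

var actualOutstanding = (doc.totalBill + doc.outstanding) - doc.totalPayment;

// Step 2: write the computed value back to the same document
db.test.updateOne({ _id: doc._id }, { $set: { actualOutstanding: actualOutstanding } });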

MongoDB-Query Optimization

I have a collection with a sub-document consisting of more than 40K records.
My aggregate query takes about 300 secs. I have tried optimizing the same using compound as well as multi-key indexing, which completes in 180 secs.
I still require a reduced query time execution.
here is my collection:
{
    "_id" : ObjectId("545b32cc7e9b99112e7ddd97"),
    "grp_id" : 654,
    "user_id" : 2,
    "mod_on" : ISODate("2014-11-06T08:35:40.857Z"),
    "crtd_on" : ISODate("2014-11-06T08:35:24.791Z"),
    "uploadTp" : 0,
    "tp" : 1,
    "status" : 3,
    "id_url" : [
        { "mid" : "xyz12793" },
        { "mid" : "xyz12794" },
        { "mid" : "xyz12795" },
        { "mid" : "xyz12796" }
    ],
    "incl" : 1,
    "total_cnt" : 25,
    "succ_cnt" : 25,
    "fail_cnt" : 0
}
and the following is my query:
db.member_id_transactions.aggregate([ { '$match':
{ id_url: { '$elemMatch': { mid: 'xyz12794' } } } },
{ '$unwind': '$id_url' },
{ '$match': { grp_id: 654, 'id_url.mid': 'xyz12794' } } ])
has anyone faced the same issue?
Here's the output of the aggregate query with the explain option:
{
    "result" : [
        {
            "_id" : ObjectId("546342467e6d1f4951b56285"),
            "grp_id" : 685,
            "user_id" : 2,
            "mod_on" : ISODate("2014-11-12T11:24:01.336Z"),
            "crtd_on" : ISODate("2014-11-12T11:19:34.682Z"),
            "uploadTp" : 1,
            "tp" : 1,
            "status" : 3,
            "id_url" : [
                { "mid" : "xyz12793" },
                { "mid" : "xyz12794" },
                { "mid" : "xyz12795" },
                { "mid" : "xyz12796" }
            ],
            "incl" : 1,
            "__v" : 0,
            "total_cnt" : 21406,
            "succ_cnt" : 21402,
            "fail_cnt" : 4
        }
    ],
    "ok" : 1,
    "$gleStats" : {
        "lastOpTime" : Timestamp(0, 0),
        "electionId" : ObjectId("545c8d37ab9cc679383a1b1b")
    }
}
One way to reduce the number of records being filtered further is to include the field grp_id in the first $match operator.
db.member_id_transactions.aggregate([
{$match:{ "id_url.mid": 'xyz12794',"grp_id": 654 } },
{$unwind: "$id_url" },
{$match: { "id_url.mid": "xyz12794" } }
])
See how the performance is now. Add grp_id to the index to get better response time.
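As a sketch, the suggested compound index could be created like this (the field order is an assumption and should be checked against your workload; ensureIndex is the shell helper of that era, createIndex on current versions):
db.member_id_transactions.ensureIndex({ "grp_id": 1, "id_url.mid": 1 })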
The above aggregation query, though it works, is unnecessary. Since you are not altering the structure of the document, and you expect only one element in the array to match the filter condition, you could just use a simple find with a projection.
db.member_id_transactions.find(
{ "id_url.mid": "xyz12794","grp_id": 654 },
{"_id":0,"grp_id":1,"id_url":{$elemMatch:{"mid":"xyz12794"}},
"user_id":1,"mod_on":1,"crtd_on":1,"uploadTp":1,
"tp":1,"status":1,"incl":1,"total_cnt":1,
"succ_cnt":1,"fail_cnt":1
}
)

Using near with elemMatch in Mongoose

I am searching within a collection of Stores. Stores have an embedded collection of outlets with locations. My goal is to return the set of stores that have outlets near a geolocation, and also only return those Outlets within that location.
I can successfully restrict the query to only return Stores that have an Outlet at a particular location using 'near':
Store
.where('isActive').equals(true)
.where('outlets.location')
.near({ center: [153.027117, -27.468515], maxDistance: 1000 / 6378137, spherical: true })
.where('outlets.isActive').equals(true)
.where('products.productType').equals('53433f1f3e02e39addde1954')
.where('products.isActive').equals(true)
.select('name outlets')
.select({'products': {$elemMatch: {'isActive': true, 'productType': '53433f1f3e02e39addde1954'}}})
.select('name outlets')
.execQ()
.then(function (results) {
console.log(results);
})
.fail(function (err) {
console.log(err);
})
.done();
The problem I have is that the store document returns all the outlets, not just the outlet that matched the geolocation. I've tried using elemMatch within a select like I did with the products:
.select({'outlets': {$elemMatch: {'location': {near:{ center: [153.027117, -27.468515], maxDistance: 10000 / 6378137, spherical: true }}}}})
However it returns an empty array. Can you use the near operator in an elemMatch clause? Am I doing it incorrectly? Is there a more efficient/faster/better way to achieve the goal?
I see what you are trying to do here, but there seem to be a few flaws in this sort of design. Though it is not exactly your document structure, I see you are trying to do something like this:
{
    "_id" : ObjectId("5344badd519563414f23fdf8"),
    "store" : "Mine",
    "outlets" : [
        {
            "name" : "somewhere",
            "loc" : {
                "type" : "Point",
                "coordinates" : [
                    150.975131,
                    -33.8440366
                ]
            }
        },
        {
            "name" : "else",
            "loc" : {
                "type" : "Point",
                "coordinates" : [
                    151.3651524,
                    -33.8389783
                ]
            }
        }
    ]
}
{
    "_id" : ObjectId("5344be6f519563414f23fdf9"),
    "store" : "Another",
    "outlets" : [
        {
            "name" : "else",
            "loc" : {
                "type" : "Point",
                "coordinates" : [
                    151.3651524,
                    -33.8389783
                ]
            }
        },
        {
            "name" : "somewhere",
            "loc" : {
                "type" : "Point",
                "coordinates" : [
                    150.975131,
                    -33.8440366
                ]
            }
        }
    ]
}
So basically you appear to be attempting to nest the outlet locations within an array in a top level document.
What I am referring to as a flaw here is that, by design, any type of "near" based query is going to return more than one result. That does seem logical when you look at its purpose. You can of course restrict the results with "maxDistance", but generally there will be more than one.
So the only way is to .limit() the results returned by the cursor to a single "nearest" response. Also note that with some operations the results are not necessarily sorted with the "nearest" response first.
Now as these results are actually contained within an array of the document, remember that .find() itself does not actually "filter" the results of an array, so of course the document will contain all of the array contents.
If you tried to "project" with a positional $ operator, then the problem falls back to the original point because there is no singular actual match, so it is not possible to return an "index" value for the matching element. If you in fact did try this, you would always get the default index value of 0, so just returning the first element.
If you then thought you could run off to aggregate and try to actually "de-normalize" the array entries, you would be out of luck, because the geospatial match has to use the index in the first stage of the aggregation pipeline, before you get a chance to unwind the array.
So the summary of this is that embedded entries like this are not suited to this design where you need to do geo-spatial matching on those store locations. The locations are better off in a separate collection:
{
    "_id" : ObjectId("5344bec7519563414f23fdfa"),
    "store" : "Mine",
    "name" : "else",
    "loc" : {
        "type" : "Point",
        "coordinates" : [
            151.3651524,
            -33.8389783
        ]
    }
}
{
    "_id" : ObjectId("5344bed5519563414f23fdfb"),
    "store" : "Mine",
    "name" : "somewhere",
    "loc" : {
        "type" : "Point",
        "coordinates" : [
            150.975131,
            -33.8440366
        ]
    }
}
So that would allow you to "limit" the result to the "nearest" match by setting the limit to 1. You can also include any values you need for filtering, such as the "store". If you need to, you can include other information beyond what you filter on, or just keep the ObjectId values of the outlets in the array of the original store document, or possibly even duplicate the data in both collections.
But since the very nature of these queries is to return more than one match, there is no way you are going to get this to work on embedded documents. So your solution will require some changes to your overall schema.
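As a rough sketch of what a query against such a separate collection could look like (the collection name outlets is an assumption, the coordinates are the ones from your question, and loc needs a 2dsphere index):
db.outlets.ensureIndex({ "loc": "2dsphere" })

// nearest outlet of a given store, limited to a single result
db.outlets.find({
    "store": "Mine",
    "loc": {
        "$near": {
            "$geometry": { "type": "Point", "coordinates": [153.027117, -27.468515] },
            "$maxDistance": 1000
        }
    }
}).limit(1)
Because $near returns results sorted by distance, limit(1) gives the single nearest outlet.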

MongoDB geospatial index, how to use it with array elements?

I would like to get Kevin's pub spots near a given position. Here is the userSpots collection:
{
    user: 'Kevin',
    spots: [
        {
            name: 'a',
            type: 'pub',
            location: [x, y]
        },
        {
            name: 'b',
            type: 'gym',
            location: [v, w]
        }
    ]
},
{
    user: 'Layla',
    spots: [
        ...
    ]
}
Here is what I tried :
db.userSpots.findOne(
    {
        user: 'Kevin',
        spots: {
            $elemMatch: {
                location: { $nearSphere: [lng, lat], $maxDistance: d },
                type: 'pub'
            }
        }
    },
    function(err){...}
)
I get a strange error. Mongo tells me there is no 2d index on the location field. But when I check with db.userSpots.getIndexes(), the 2d index is there. Why doesn't MongoDB see the index? Is there something I am doing wrong?
MongoError: can't find special index: 2d for : { spots: { $elemMatch: { type:'pub',location:{ $nearSphere: [lng,lat], $maxDistance: d}}}, user:'Kevin'}
db.userSpots.getIndexes() output:
{
    "0" : {
        "v" : 1,
        "key" : {
            "_id" : 1
        },
        "ns" : "mydb.userSpots",
        "name" : "_id_"
    },
    "1" : {
        "v" : 1,
        "key" : {
            "spots.location" : "2d"
        },
        "ns" : "mydb.usersBoxes",
        "name" : "spots.location_2d",
        "background" : true,
        "safe" : null
    }
}
For a similar geospatial app, I transformed the location into GeoJSON:
{
    "_id" : ObjectId("5252cbdd9520b8b18ee4b1c3"),
    "name" : "Seattle - Downtown",
    "location" : {
        "type" : "Point",
        "coordinates" : [
            -122.33145,
            47.60789
        ]
    }
}
(The coordinates are in longitude/latitude order. Mongo's use of GeoJSON is described here.)
The index is created using:
db.userSpots.ensureIndex({"location": "2dsphere"})
In my aggregation pipeline, I find matches using:
{"$match":{"location":{"$geoWithin": {"$centerSphere":[[location.coordinates[0], location.coordinates[1]], radius/3959]}}}}
(where radius is measured in miles - the magic number 3959 is the Earth's radius in miles, so the division converts the distance to radians).
To index documents containing an array of geo data, MongoDB uses a multi-key index. A multi-key index unwinds the document into one index entry per array value before indexing, so the index treats the key field as a single-value field rather than an array.
Try the query without the $elemMatch operator.
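A sketch of what that might look like, addressing the indexed path directly instead of wrapping it in $elemMatch (lng, lat and d are placeholders, as in your question):
db.userSpots.find({
    user: 'Kevin',
    'spots.type': 'pub',
    'spots.location': { $nearSphere: [lng, lat], $maxDistance: d }
})
Note that without $elemMatch the type and location conditions are no longer guaranteed to apply to the same array element.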
