Sort by date with null first - node.js

I'm trying to find documents in a collection, ordered by their date. This works, but the documents with null in the date field end up at the bottom, and I want them first.
MyModel.find({ }, null, { sort: { date: -1 } }, function(err, models) {
// Models sorted with the "largest" date first and models with null dates last
});
If I change the sort to { date: 1 } I do get the documents with null first, but the rest of the order is reversed, which I do not want.
How can I achieve the desired behaviour?

Unfortunately it appears there's no trivial way to do this, akin to SQL's "nulls first" / "nulls last" syntax. This feature is mentioned in this Mongo bug:
https://jira.mongodb.org/browse/SERVER-153
Where it's looped in with the larger feature of custom sorting functions. (I really wish this were broken out into a separate feature request because I would think implementing only the nulls first/last bit should be much easier than custom sorting functions and still provide a lot of value.)
Anyway, the work-around for the time being is to just query separately for the null values in addition to your existing query (see also: How are null values in a MongoDB index sorted?):
MyModel.find({ date: null });
MyModel.find({ date: { $ne: null } }).sort({ date: -1 });
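If you need both result sets as a single list in that order, a minimal sketch (assuming a Mongoose version with promise support; the helper name is just illustrative) is to run the two queries and concatenate the results in application code:
async function findNullsFirst() {
  // Documents with no date first, then the rest with the "largest" date first
  const nullDates = await MyModel.find({ date: null }).exec();
  const withDates = await MyModel.find({ date: { $ne: null } }).sort({ date: -1 }).exec();
  return nullDates.concat(withDates);
}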

Related

Upsert and $inc Sub-document in Array

The following schema is intended to record total views and views for a very specific day only.
const usersSchema = new Schema({
  totalProductsViews: { type: Number, default: 0 },
  productsViewsStatistics: [{
    day: { type: String, default: new Date().toISOString().slice(0, 10), unique: true },
    count: { type: Number, default: 0 }
  }],
});
So today's views will be stored in a different subdocument from yesterday's. To implement this I tried to use upsert, so that a subdocument is created each day a product is viewed and the counts are incremented and recorded per day. I tried the following function, but it doesn't work the way I intended.
usersSchema.statics.increaseProductsViews = async function (id) {
  // Based on day only.
  const todayDate = new Date().toISOString().slice(0, 10);
  const result = await this.findByIdAndUpdate(id, {
    $inc: {
      totalProductsViews: 1,
      'productsViewsStatistics.$[sub].count': 1
    },
  },
  {
    upsert: true,
    arrayFilters: [{ 'sub.day': todayDate }],
    new: true
  });
  console.log(result);
  return result;
};
What do I miss to get the functionality I want? Any help will be appreciated.
What you are trying to do here actually requires you to understand some concepts you may not have grasped yet. The two primary ones being:
You cannot use any positional update as part of an upsert since it requires data to be present
Adding items into arrays mixed with "upsert" is generally a problem that you cannot do in a single statement.
It's a little unclear if "upsert" is your actual intention anyway or if you just presumed that was what you had to add in order to get your statement to work. It does complicate things if that is your intent, even if that's unlikely given the findByIdAndUpdate() usage, which would imply you were actually expecting the "document" to always be present.
At any rate, it's clear you actually expect to "update the array element when found, OR insert a new array element where not found". This is actually a two-write process, and three when you consider the "upsert" case as well.
For this, you actually need to invoke the statements via bulkWrite():
usersSchema.statics.increaseProductsViews = async function (_id) {
  // Based on day only.
  const todayDate = new Date().toISOString().slice(0, 10);
  await this.bulkWrite([
    // Try to match an existing element and update it ( do NOT upsert )
    {
      "updateOne": {
        "filter": { _id, "productsViewsStatistics.day": todayDate },
        "update": {
          "$inc": {
            "totalProductsViews": 1,
            "productsViewsStatistics.$.count": 1
          }
        }
      }
    },
    // Try to $push where the element is not there but the document is - ( do NOT upsert )
    {
      "updateOne": {
        "filter": { _id, "productsViewsStatistics.day": { "$ne": todayDate } },
        "update": {
          "$inc": { "totalProductsViews": 1 },
          "$push": { "productsViewsStatistics": { "day": todayDate, "count": 1 } }
        }
      }
    },
    // Finally attempt upsert where the "document" was not there at all,
    // only if you actually mean it - so optional
    {
      "updateOne": {
        "filter": { _id },
        "update": {
          "$setOnInsert": {
            "totalProductsViews": 1,
            "productsViewsStatistics": [{ "day": todayDate, "count": 1 }]
          }
        },
        "upsert": true
      }
    }
  ]);
  // return the modified document if you really must
  return this.findById(_id); // Not atomic, but the lesser of all evils
}
So there's a real good reason here why the positional filtered [<identifier>] operator does not apply here. The main reason is that its intended purpose is to update multiple matching array elements, and you only ever want to update one. There is a specific operator for that, the positional $ operator, which does exactly that. Its condition, however, must be included within the query predicate ( the "filter" property in updateOne statements ) just as demonstrated in the first two statements of the bulkWrite() above.
So the main problem with using the positional filtered [<identifier>] operator is that, just as the first two statements show, you cannot alternate between $inc and $push depending on whether the document actually contains an array entry for the day. At best, no update will be applied when the current day is not matched by the expression in arrayFilters.
At worst, an actual "upsert" will throw an error because MongoDB cannot decipher the "path name" from the statement, and of course you simply cannot $inc something that does not exist as a "new" array element. That needs a $push.
That leaves you with the mechanic that you also cannot do both the $inc and $push within a single statement. MongoDB will error that you are attempting to "modify the same path" as an illegal operation. Much the same applies to $setOnInsert since whilst that operator only applies to "upsert" operations, it does not preclude the other operations from happening.
Thus the logical steps fall back to what the comments in the code also describe:
Attempt to match where the document contains an existing array element, then update that element. Using $inc in this case
Attempt to match where the document exists but the array element is not present and then $push a new element for the given day with the default count, updating other elements appropriately
IF you actually did intend to upsert documents ( not array elements, because that's the above steps ) then finally actually attempt an upsert creating new properties including a new array.
Finally there is the issue of the bulkWrite(). Whilst this is a single request to the server with a single response, it still is effectively three ( or two if that's all you need ) operations. There is no way around that and it is better than issuing chained separate requests using findByIdAndUpdate() or even updateOne().
Of course the main operational difference from the code you attempted to implement is that this method does not return the modified document. There is no way to get a "document response" from any "Bulk" operation at all.
As such the actual "bulk" process will only ever modify a document with one of the three statements submitted, based on the presented logic and, most importantly, the order of those statements. But if you actually want to "return the document" after modification, then the only way to do that is with a separate request to fetch the document.
The only caveat here is that there is the small possibility that other modifications could have occurred to the document other than the "array upsert" since the read and update are separated. There really is no way around that, without possibly "chaining" three separate requests to the server and then deciding which "response document" actually applied the update you wanted to achieve.
So with that context it's generally considered the lesser of evils to do the read separately. It's not ideal, but it's the best option available from a bad bunch.
As a final note, I would strongly suggest storing the day property as a BSON Date instead of as a string. It actually takes fewer bytes to store and is far more useful in that form. As such the following constructor is probably the clearest and least hacky:
const todayDate = new Date(new Date().setUTCHours(0,0,0,0))
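A sketch of what the corresponding schema field from the question might then look like (field names as used above; the default is supplied as a function so it is evaluated per document):
productsViewsStatistics: [{
  day: { type: Date, default: () => new Date(new Date().setUTCHours(0, 0, 0, 0)) },
  count: { type: Number, default: 0 }
}]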

Include $or in $match of aggregate doesn't work in mongoose? [duplicate]

So I have an embedded document that tracks group memberships. Each embedded document has an ID pointing to the group in another collection, a start date, and an optional expire date.
I want to query for current members of a group. "Current" means the start time is less than the current time, and the expire time is greater than the current time OR null.
This conditional query is totally blocking me up. I could do it by running two queries and merging the results, but that seems ugly and requires loading in all results at once. Or I could default the expire time to some arbitrary date in the far future, but that seems even uglier and potentially brittle. In SQL I'd just express it with "(expires >= Now()) OR (expires IS NULL)" -- but I don't know how to do that in Mongo.
Any ideas? Thanks very much in advance.
Just thought I'd update in-case anyone stumbles across this page in the future. As of 1.5.3, mongo now supports a real $or operator: http://www.mongodb.org/display/DOCS/Advanced+Queries#AdvancedQueries-%24or
Your query of "(expires >= Now()) OR (expires IS NULL)" can now be rendered as:
{$or: [{expires: {$gte: new Date()}}, {expires: null}]}
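Applied to the original question, a minimal sketch might look like the following (the collection name memberships and the field names start and expires are illustrative, not taken from your schema):
var now = new Date();
db.memberships.find({
  start: { $lte: now },            // already started
  $or: [
    { expires: { $gte: now } },    // not yet expired
    { expires: null }              // or never expires
  ]
});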
In case anyone finds it useful, www.querymongo.com does translation between SQL and MongoDB, including OR clauses. It can be really helpful for figuring out syntax when you know the SQL equivalent.
In the case of OR statements, it looks like this
SQL:
SELECT * FROM collection WHERE columnA = 3 OR columnB = 'string';
MongoDB:
db.collection.find({
"$or": [{
"columnA": 3
}, {
"columnB": "string"
}]
});
MongoDB query with an 'or' condition
db.getCollection('movie').find({$or:[{"type":"smartreply"},{"category":"small_talk"}]})
MongoDB query with an 'or', 'and', condition combined.
db.getCollection('movie').find({"applicationId":"2b5958d9629026491c30b42f2d5256fa8",$or:[{"type":"smartreply"},{"category":"small_talk"}]})
Query objects in Mongo by default AND expressions together. Before the $or operator mentioned above was added, Mongo did not have a dedicated OR operator for such queries, but there were still ways to express them.
Use $in or $where.
It's going to be something like this:
db.mycollection.find({ $where: function() {
  var now = new Date(); // "current": already started AND ( not expired yet OR never expires )
  return ( this.startTime < now && ( this.expireTime > now || this.expireTime == null ) );
} });
db.Lead.find({
  // $or must be part of the query filter ( the first argument ), combined with the other conditions
  "name": { "$regex": ".*" + "Ravi" + ".*" },
  "$or": [
    { "added_by": "arunkrishna#aarux.com" },
    { "added_by": "aruna#aarux.com" }
  ]
});
Using a $where query will be slow, in part because it can't use indexes. For this sort of problem, I think it would be better to store a high value for the "expires" field that will naturally always be greater than Now(). You can either store a very high date millions of years in the future, or use a separate type to indicate never. The cross-type sort order is defined here.
An empty Regex or MaxKey (if your language supports it) are both good choices.
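If you go the sentinel route, a rough sketch (collection and field names are illustrative) is to store the largest value a JavaScript Date can hold and then use a plain range query:
var NEVER_EXPIRES = new Date(8640000000000000); // maximum millisecond value a JavaScript Date accepts
db.memberships.insertOne({ group: groupId, startTime: new Date(), expireTime: NEVER_EXPIRES });
// "Current" members then need no $or at all:
db.memberships.find({ startTime: { $lte: new Date() }, expireTime: { $gte: new Date() } });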

Mongodb check if value is in a nested array

I have a collection in my database that contains a field which is composed of 3 arrays, like this :
use_homepage: {
home: [Array],
hidden: [Array],
archive: [Array]
}
This field represents the homepage of a user.
Each array contains an ObjectID that identifies projects shown on the user homepage.
I would like to check if my project id is in use_homepage.home or use_homepage.hidden, and if it is, remove the id from the array that match.
Can I do this with 1 (or 2 max) requests or do I have to make a request each time I have to check in another array ?
In case you expect to update one document at most, you can try this:
db.entities.findAndModify({
  query: { $or: [
    { home: ObjectId('<HERE YOUR ID TO BE FOUND>') },
    { hidden: ObjectId('<HERE YOUR ID TO BE FOUND>') }
  ]},
  update: { $pull: {
    home: ObjectId('<HERE YOUR ID TO BE DELETED>'),
    hidden: ObjectId('<HERE YOUR ID TO BE DELETED>')
  }}
});
As you can see, in general, you can search for some value and delete some other value.
The statement returns the original matching document (i.e. before the deletion is performed). If you want the modified document you can add the following attribute:
new: true
In case you search for many documents to update, this solution does not work, since findAndModify() works just on the first document matching the query condition.
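If you do need to update many documents at once, a rough sketch (using the use_homepage paths from your schema) is a single updateMany() with an $or filter; $pull simply has no effect on arrays that do not contain the id:
db.User.updateMany(
  { $or: [
    { "use_homepage.home": id },
    { "use_homepage.hidden": id }
  ]},
  { $pull: {
    "use_homepage.home": id,
    "use_homepage.hidden": id
  }}
);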
Finally, I ended up making 2 requests to do the job:
db.User.find({ "use_homepage.home": id }, { _id: 1 }).toArray(function(err, result) {
  // If some users have the id in their home array, pull it from each of them
  var users_match_ids = result.map(function(doc) { return doc._id; });
  db.User.updateMany({ _id: { $in: users_match_ids } }, {
    $pull: { "use_homepage.home": id }
  });
  // Do the same with the 'hidden' array
});
If anyone sees this post and has a better solution, I'll take it :)

CouchDB reducing sums with date filter

I'm pretty new to couchdb and map/reduce in general. I have the following view:
{
"_id": "_design/keys",
"views": {
"keys": {
"map": "function(doc) { for (var thing in doc) { if (doc.created_at != null) { emit([thing, doc.created_at],1); } } }",
"reduce": "function(key,values) { return sum(values); }"
}
}
}
This works well to give me a sum of the count of all document keys in the database with the proper group_level:
.../_design/keys/_view/keys?group_level=1
{"rows":[
{"key":["_id"],"value":2},
{"key":["_rev"],"value":2},
{"key":["created_at"],"value":2},
{"key":["testing"],"value":2}
]}
Now what I want to do is reduce these mapped documents by date, which is an ISO 8601 string:
{"rows":[
{"key":["_id","2015-11-25T21:13:58Z"],"value":1},
{"key":["_id","2015-11-25T21:14:39Z"],"value":1},
{"key":["_rev","2015-11-25T21:13:58Z"],"value":1},
{"key":["_rev","2015-11-25T21:14:39Z"],"value":1},
{"key":["created_at","2015-11-25T21:13:58Z"],"value":1},
{"key":["created_at","2015-11-25T21:14:39Z"],"value":1},
{"key":["testing","2015-11-25T21:13:58Z"],"value":1},
{"key":["testing","2015-11-25T21:14:39Z"],"value":1}
]}
But I still want the results grouped by the first part of the key. That is, I want to specify a start time of 2015-11-25T21:13:57Z and an end time of 2015-11-25T21:13:59Z, and get back everything with the time stamp of 2015-11-25T21:13:58Z, like so:
{"rows":[
{"key":["_id"],"value":1},
{"key":["_rev"],"value":1},
{"key":["created_at"],"value":1},
{"key":["testing"],"value":1}
]}
How can I do this?
You should use your view function to emit the date component of the timestamp (which as you note is conveniently already in hierarchical structure) as a complex key:
Instead of [thing, "2015-11-26T..."], emit the key as [thing, 2015, 11, 26, 21, 13, 58]
Then you can range query on the complex keys to different levels (year, month, date, time). Note that if you use times other than Zulu time you may need to use the view function to read the timezone and emit in Zulu time so that everything sorts correctly. (Please pardon any typos; this was entered from a mobile device.)
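A minimal sketch of such a map function, keeping the [thing, ...] prefix so group_level still applies (it assumes created_at parses cleanly with new Date(); the reduce function can stay as it is):
function(doc) {
  if (doc.created_at != null) {
    var d = new Date(doc.created_at);
    for (var thing in doc) {
      // emit [field name, year, month, day, hour, minute, second] in UTC
      emit([thing, d.getUTCFullYear(), d.getUTCMonth() + 1, d.getUTCDate(),
            d.getUTCHours(), d.getUTCMinutes(), d.getUTCSeconds()], 1);
    }
  }
}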
I had a similar problem just a few days ago and found that List Functions are a pretty easy way to solve this. You could simply use the date as key, the things as values, do the counting in the list function and can still use all the regular view features to define start and end keys.

Mongoose/Mongodb previous and next in embedded document

I'm learning Mongodb/Mongoose/Express and have come across a fairly complex query (relative to my current level of understanding anyway) that I'm not sure how best to approach. I have a collection - to keep it simple let's call it entities - with an embedded actions array:
name: String
actions: [{
name: String
date: Date
}]
What I'd like to do is to return an array of documents with each containing the most recent action (or most recent to a specified date), and the next action (based on the same date).
Would this be possible with one find() query, or would I need to break this down into multiple queries and merge the results to generate one result array? I'm looking for the most efficient route possible.
Provided that your "actions" are inserted with the "most recent" being the last entry in the list, and usually this will be the case unless you are specifically updating items and changing dates, then all you really want to do is "project" the last item of the array. This is what the $slice projection operation is for:
Model.find({},{ "actions": { "$slice": -1 } },function(err,docs) {
// contains an array with the last item
});
If indeed you are "updating" array items and changing dates, but you want to query for the most recent on a regular basis, then you are probably best off keeping the array ordered. You can do this with a few modifiers such as:
Model.update(
{
"_id": ObjectId("541f7bbb699e6dd5a7caf2d6"),
},
{
"$push": { "actions": { "$each": [], "$sort": { "date": 1 } } }
},
function(err,numAffected) {
}
);
This is actually more of a trick that you can do with the $sort modifier to simply sort the existing array elements without adding or removing anything. In versions prior to 2.6 you also need the $slice "update" modifier in here; it can be set to a value larger than the expected number of array elements if you do not actually want to restrict the array size, though capping the array is probably a good idea anyway.
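For reference, a sketch of the same update with $slice included, as required before MongoDB 2.6 (the cap of -100 is an arbitrary example, not something from the question):
Model.update(
  { "_id": ObjectId("541f7bbb699e6dd5a7caf2d6") },
  { "$push": {
    "actions": {
      "$each": [],              // no new elements; just trigger the modifiers
      "$sort": { "date": 1 },   // keep the array ordered by date
      "$slice": -100            // keep (at most) the last 100 elements
    }
  }},
  function(err, numAffected) {
  }
);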
Unfortunately, if you were "updating" via a $set statement, then you cannot do this "sorting" in a single update statement, as MongoDB will not allow both types of operations on the array at once. But if you can live with that, then this is a way to keep the array ordered so the first query form works.
If it just seems too hard to keep an array ordered by date, then you can in fact retrieve the largest value by means of the .aggregate() method. This allows greater manipulation of the documents than is available to basic queries, at a little more cost:
Model.aggregate([
// Unwind the array to de-normalize as documents
{ "$unwind": "$actions" },
// Sort the contents per document _id and inner date
{ "$sort": { "_id": 1, "actions.date": 1 } },
// Group back with the "last" element only
{ "$group": {
"_id": "$_id",
"name": { "$last": "$name" },
"actions": { "$last": "$actions" }
}}
],
function(err,docs) {
})
And that will "pull apart" the array using the $unwind operator, then process with a next stage to $sort the contents by "date". In the $group pipeline stage the "_id" means to use the original document key to "collect" on, and the $last operator picks the field values from the "last" document ( de-normalized ) on that grouping boundary.
So there are various things that you can do, but of course the best way is to keep your array ordered and use the basic projection operators to simply get the last item in the list.
