Solr: group by different date ranges

I have a lot of data in Solr like this:
{
  id: some_id
  date: 2008-01-01T00:00:00Z
  price: 34.20
  currency: "CAD"
  weight: 39.9
  etc.
}
I'd like to perform searches on it to find the unique set of ids, and group them by time. So sometimes I want to find the items that satisfy the search for each day, or week, or month.
The first way I tried was to set an fq (filter query) to the date range I want and facet.field=id to get the unique ids for that range, but to do this for every day I'd have to run 365 (or 366) separate queries, which is quite a pain and very slow.
A solution to this was to use facet.pivot=date,id, which breaks the results down by day and then, for each day, gives the set of ids. This is perfect for the daily case! But how do I achieve the same thing weekly? Or monthly?
What I want is for the first field in facet.pivot, which is date, to be a range of values. So instead of getting this:
{
  "responseHeader":{
    ...
  },
  "facet_counts":{
    ...
    "facet_pivot":{
      "date,id":[{
        "field":"date",
        "value":"2008-01-01T00:00:00Z",
        "count":923,
        "pivot":[{
          "field":"id",
          "value":18,
          "count":1},
         {
          "field":"id",
          "value":66,
          "count":1},
         {
          "field":"id",
          "value":70,
          "count":1}
        ]
      },
      ...]
    }
  }
}
I'd like to get something like this:
{
  "responseHeader":{
    ...
  },
  "facet_counts":{
    ...
    "facet_pivot":{
      "date,id":[{
        "field":"date",
        "value":"2008-01-01T00:00:00Z TO 2008-01-31T00:00:00Z",
        "count":923,
        "pivot":[...similar to above]
      },
      ...]
    }
  }
}
In other words, instead of grouping on the exact value of date, it would group on a range/interval. I've toyed around with Solr's interval and range faceting but can't seem to get something that works.

Try the following to facet over a monthly range with a gap of one week:
facet.range={!tag=rdt}date&facet.range.start=NOW/DAY&facet.range.gap=+7DAY&facet.range.end=NOW/DAY+30DAY&facet=true&facet.pivot={!range=rdt}date,id
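For reference, here is roughly what the full request could look like as a URL; the host, port, core name, and q=*:* are placeholders, and the + in the date math must be URL-encoded as %2B:

http://localhost:8983/solr/mycore/select?q=*:*
  &facet=true
  &facet.range={!tag=rdt}date
  &facet.range.start=NOW/DAY
  &facet.range.end=NOW/DAY%2B30DAY
  &facet.range.gap=%2B7DAY
  &facet.pivot={!range=rdt}date,id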
I hope this helps!

Related

MongoDb: Best way to store time range

I need to store something like startTime and endTime in my document. To give some more context, these will reflect the opening and closing times for a shop. So, for example, startTime could be 9AM and endTime could be 9PM. What is the best way to store this? This is what I am doing right now:
timings: {
  startTime: {
    type: String,
    required: [true, "....."]
  },
  endTime: {
    type: String,
    required: [true, "....."]
  }
}
The idea is to store the values as strings ("9AM", "9PM") and do some sort of time parsing each time I query the database. But I was wondering if there was a better approach to this? Another idea I had is to store it as DateTime and ignore the date part. What else can I do? I'd like to avoid parsing/processing on application level as much as possible and leverage the power of mongodb.
I'm using mongoose and nodeJS.
I would agree that the Date type is not a good fit (it drags in time zones, and you might not want to go there ...).
I would store it as a number, not a string. Why? Because you might want to query it (e.g. "give me all shops that open after 8pm"), and doing that with a string will be annoying ...
I'd go with this:
{
  startTime: {
    value: number,
    amOrPm: string // (if you don't want to use a 24-hour base)
  },
  endTime: {
    value: number,
    amOrPm: string // (if you don't want to use a 24-hour base)
  },
  timeOffset: number // so you keep track of the offset from the base timezone
}
You could also store the minutes, or even store the time only as "minutes elapsed since midnight" and convert it on every access.
Having the offset this way won't let you easily query for a specific moment across different timezones, but I guess that's totally unnecessary in your case.
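As a minimal sketch of the minutes-since-midnight idea (the schema and model names here are made up for illustration, not from the question):

const mongoose = require('mongoose');

// Hypothetical schema: opening hours stored as minutes since midnight,
// so 9AM is 540 and 9PM is 1260.
const shopSchema = new mongoose.Schema({
  name: String,
  timings: {
    startTime: { type: Number, min: 0, max: 1439, required: true },
    endTime: { type: Number, min: 0, max: 1439, required: true }
  }
});
const Shop = mongoose.model('Shop', shopSchema);

// "Give me all shops that open after 8pm" becomes a plain numeric comparison
// (e.g. inside an async route handler):
const lateShops = await Shop.find({ 'timings.startTime': { $gt: 20 * 60 } });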
Also, you could store days as a number ('officially' Sunday is 0, then Monday is 1), but nowadays it is just as easy to store the name, so...
Edit: for the days, maybe it's better to go with an array:
{
  daysOpened: [0, 1, 4]
}
And finally, what if each day has different opening hours? Then you might have to consider an array of days, each containing the opening time and closing time, like above.
If you want to get even more detailed, some shops close over midday and some others (like restaurants) only open twice a day; you could then offer them a schedule with boxes to tick and store that in an array.
Let us know if you need to build something like that!

Mongodb/mongoose query for completely overlapping dates (that could span multiple documents)

I'm having some issues designing a query that deals with overlapping dates.
So here's the scenario; I can't reveal too much about the actual project, but here is a similar example. Let's say I have a FleaMarket. It has a bunch of data about itself such as name, location, etc.
A FleaMarket has many Stalls, which are available to be booked for a portion of the year (anywhere from 2 days up to the whole year). So the FleaMarket needs to specify when in the year it will be open. Most markets would either be open all year or all summer/fall, but it could possibly be broken down further (because seasons determine pricing). Each FleaMarket defines its Seasons, each of which has a startDate and endDate (including year).
Here's an ERD to model this example:
When a user attempts to book a Stall, they have already selected a FleaMarket (although ideally it would be nice to search based on availability in the future). It's really easy to tell if a Stall is already booked for the requested dates:
bookings = await Booking.find({
  startDate: { $lt: <requested end date> },
  endDate: { $gt: <requested start date> },
  fleaMarketId: <flea market id>,
}).select('stallId');

bookedIds = bookings.map(b => b.stallId);

stalls = await Stall.find({
  fleaMarketId: <flea market id>,
  _id: { $nin: bookedIds }
});
The issue I'm having is determining whether a Stall is available for the specified Season. The problem is that 2 seasons could be sequential, so a booking could span 2 seasons.
I originally tried a query like so:
seasons = await Season.find({
  fleaMarketId: <flea market id>,
  startDate: { $lt: <requested end date> },
  endDate: { $gt: <requested start date> }
});
And then I programmatically checked whether any returned seasons were sequential, and plucked out the stalls that were available in all of those seasons. But unfortunately I just realized this won't work if the requested dates only partially overlap a season (e.g. the request is Jan 1 2020 - Jan 10 2020, but the season is defined as Jan 2 2020 - May 1 2020).
Is there a way to check for completely overlapping dates when the coverage could span multiple documents? I was thinking about calculating the current and future available season dates (stored as total ranges) and denormalizing them onto the Stall.
At this point I'm almost thinking I need to restructure the schema quite a bit. Any recommendations? I know this seems very relational, but pretty much everywhere else the application doesn't do much with the relationships; it's just this search that is quite problematic.
Update:
I just had the thought of creating some sort of Calendar document that stores a centralized list of availability for a FleaMarket, updated on a rolling basis so it only holds present and future data, slowly wiping away (or archiving in a different format) the historical data. Perhaps this will solve my issue; I will be discussing it with my team soon.
So as I said in an update in my post, I came up with the idea to create a rolling calendar.
For anyone who is interested, here's what I got:
I created an Availability collection, that contains documents like the following:
{
  marketId: ObjectId('5dd705c0eeeaf900450e7009'),
  stallId: ObjectId('5dde9fc3bf30e500280f80ce'),
  availableDates: [
    {
      date: '2020-01-01T00:00:00.000Z',
      price: 30.0,
      seasonId: '5dd708e7534f3700a9cad0e7',
    },
    {
      date: '2020-01-02T00:00:00.000Z',
      price: 30.0,
      seasonId: '5dd708e7534f3700a9cad0e7',
    },
    {
      date: '2020-01-03T00:00:00.000Z',
      price: 35.0,
      seasonId: '5dd708e7534f3700a9cad0e8',
    }
  ],
  bookedDuring: [
    '2020-01-01T00:00:00.000Z',
    '2020-01-02T00:00:00.000Z'
  ]
}
Then handling updates to this collection:
Seasons
- When creating, $push the new dates onto each stall (and drop dates from the past); see the sketch after this list.
- When updating, remove the old dates and add the new ones (or calculate the difference; either works, depending on the integrity of this data).
- When deleting, remove the dates.
Stalls
- When creating, insert records for the associated seasons.
- When deleting, delete the records from the Availability collection.
Bookings
- When creating, add the dates to bookedDuring.
- When updating, add or remove dates from bookedDuring.
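As a rough illustration of that first Seasons step (the Availability model and the datesInSeason / season variables are assumptions based on the document shape above, not actual code from the project):

// Hypothetical sketch: when a Season is created, push one entry per day of
// the season onto every Availability document for that market.
const entries = datesInSeason.map(date => ({
  date,
  price: season.price,
  seasonId: season._id,
}));

await Availability.updateMany(
  { marketId: season.fleaMarketId },
  { $push: { availableDates: { $each: entries } } }
);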
Then, to find the available stalls for a market, you can query with { marketId: <market id>, 'availableDates.date': { $all: [/* each desired day */] }, bookedDuring: { $nin: [/* same dates */] } } and pluck out the stallId of each result.
And to find markets that have availability, run the same query without the marketId filter and select the distinct marketIds.
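In Mongoose, that lookup might look roughly like this (a sketch; desiredDates is assumed to be an array of the requested days, normalized the same way as the stored dates):

// Stalls available in one market for every desired day.
const available = await Availability.find({
  marketId,
  'availableDates.date': { $all: desiredDates },
  bookedDuring: { $nin: desiredDates },
}).select('stallId');
const availableStallIds = available.map(a => a.stallId);

// Markets that have at least one stall available for those days.
const marketIds = await Availability.distinct('marketId', {
  'availableDates.date': { $all: desiredDates },
  bookedDuring: { $nin: desiredDates },
});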

Mongoose - How to get the data changes in the past two weeks

Suppose I have a book schema. It has a few fields.
const Book = new schema({
  title: String,
  content: String,
  like: Number
});
How do I get the book with the most likes in the past 2 weeks?
Most importantly, it needs to be updated daily.
For example, suppose I only need likes from the past two days. 5 people like the book on Day 1, 7 people on Day 2, and 3 people on Day 3. Hence, I expect 12 likes (5 + 7) on Day 2 and 10 likes (7 + 3) on Day 3.
I intend to add a field holding a 14-element array:
{
  ...,
  likeCnt: {
    type: Array,
    default: Array(14).fill(0)
  }
}
So I only update element new Date().getDate() % 14 when someone likes the book.
However, I need a cron job that zeroes out the day's likeCnt slot for every book, every day.
Please tell me a more efficient solution.
Thanks a lot.
I would create a new array in which I keep the date of every change (every like) that happens to the data.
{
  ...,
  likes: [Date],
}
Then query it this way to get the changes from the last two weeks:
collection.find({
  likes: {
    $gte: new Date(Date.now() - 1209600000), // 14 days in milliseconds
  },
});
This solution means you keep the data even after the two weeks have passed.
You can either remove the old entries periodically, or keep them so you can change the functionality later. What if next month you need data over 3 weeks? Always think generic and evolvable.
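To answer the original question ("which book got the most likes in the past 2 weeks?") with this structure, a hedged aggregation sketch could look like this, assuming every book document has the likes: [Date] array proposed above:

const TWO_WEEKS_MS = 14 * 24 * 60 * 60 * 1000;
const since = new Date(Date.now() - TWO_WEEKS_MS);

// Count only the likes newer than `since`, then take the top book.
const [topBook] = await Book.aggregate([
  {
    $project: {
      title: 1,
      recentLikes: {
        $size: {
          $filter: {
            input: '$likes',
            as: 'like',
            cond: { $gte: ['$$like', since] },
          },
        },
      },
    },
  },
  { $sort: { recentLikes: -1 } },
  { $limit: 1 },
]);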

MongoDB fetch rows before and after find result

I'm using MongoDB (Mongoose on Node.js). I have a very large DB of events; each event has a field seq (sequence), the order of the events.
I want to allow my users to find all the occurrences of a given event.
For example:
The user is searching for the event "ButtonClicked"; I should return all the locations where this event happened, in this example say [239, 1992, 5932].
This is easy, and I can just search for the requested event, and return the seq field.
Now I want to let the user view 20 events before, and 20 events after a specific seq.
It would have been great if I could do something like this:
db.events.find( { id:"ButtonClicked", seq: 1992 } ).before(20).after(20);
How can I do that?
Please note that the field seq might start with any number, and skip numbers, but it is incremental!
For example: [3,4,5,6,7,12,13,15,56,57...]
Also, note that the solution can ignore seq; I mentioned this field because I think it can help the solution.
Thanks!
You could use comparison query operators, in particular $gte and $lte, using seq as an offset for the comparison.
Try:
var seqOffset = 1992;
db.events.find( { seq: { $gte: seqOffset - 20, $lte: seqOffset + 20 } } );
You might not get exactly 40 events, since, as you mentioned, seq might skip numbers.
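If you need exactly 20 events on each side regardless of gaps in seq, a hedged alternative (mongo-shell style, using the collection and field names from the question) is to run two sorted, limited queries around the anchor event:

// Find the anchor event first.
var target = db.events.findOne({ id: "ButtonClicked", seq: 1992 });

// 20 events strictly before it (fetched newest-first, then reversed into
// chronological order) and 20 events strictly after it.
var before = db.events.find({ seq: { $lt: target.seq } })
                      .sort({ seq: -1 }).limit(20).toArray().reverse();
var after = db.events.find({ seq: { $gt: target.seq } })
                     .sort({ seq: 1 }).limit(20).toArray();

var window = before.concat([target], after);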

How to get a list of all CouchDB documents that are valid on a given date?

I have a large collection of documents and each is valid for a range of days. The range could be from 1 week up to 1 year. I want to be able to get all the documents that are valid on a specific day.
How would I do that?
As an example say I have the following two documents:
doc1 = {
  // 1 year ago to today
  start_at: "2012-03-22T00:00:00Z",
  end_at: "2013-03-22T00:00:00Z"
}

doc2 = {
  // 2 months ago to today
  start_at: "2013-01-22T00:00:00Z",
  end_at: "2013-03-22T00:00:00Z"
}
And a map function:
(doc) ->
  emit([doc.start_at, doc.end_at], null)
So for a date of 6 months ago I would only get doc1, a date of 1 week ago I would get both documents, and with a date of tomorrow I would receive no documents.
Note that the actual resolution needs to be down to the second at which the request is made, and there are lots of documents, so strategies that emit a key for every valid second would not be appropriate.
You could call emit for each day in your range, and then you can easily pick out the documents available for a specific day.
function(doc) {
  // Walk one day at a time from start_at to end_at, emitting one key per day.
  var day = new Date(doc.start_at),
      end = new Date(doc.end_at).getTime();
  do {
    emit(day);
    day = new Date(day.getFullYear(), day.getMonth(), day.getDate() + 1);
  } while (day.getTime() <= end);
}
Even though you will have lots of documents, if you leave out the value part (2nd param) of your emit, the index will be as small as it could possibly be.
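As a usage sketch (the database name, design document, and view name here are assumptions): if the map function emits a plain date string such as day.toISOString().slice(0, 10) instead of the Date object, a single day's documents can then be fetched with one key query:

GET /mydb/_design/main/_view/valid_on?key="2012-09-22"&include_docs=true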
If you need to get more sophisticated, you could try out couchdb-lucene. You can index date fields as date objects and execute range queries with multiple fields in 1 request.
You can translate the problem into a computational geometry location problem. Treat each document as a point [x, y] = [start_at, end_at] in the two-dimensional plane; the query for documents valid at a given date is then the set of points in the rectangle bounded by left = -infinity, right = date (start_at < date) and bottom = date, top = infinity (end_at > date).
Unfortunately, the CouchDB team underrates the power of computational geometry and does not support multidimensional queries. There is the GeoCouch extension, which lets you run this kind of query as easily as:
http://localhost:5984/places/_design/main/_spatial/points?bbox=0,0,180,90
on a view emitting a spatial value:
emit({ type: "Point", coordinates: [doc.start_at, doc.end_at] }, doc);
The problem is the data type. You get floats in the ranges [-180.0, 180.0] / [-90.0, 90.0], but you need at least an int (UNIX time format). If GeoCouch works for you with ranges bigger than 180.0, and the precision of floating-point operations designed for geographic calculations is sufficient for dates with second precision, your problem is solved :) I am sure that with a few tricks and hacks you could solve this problem efficiently in geo software. If not GeoCouch, then perhaps Elasticsearch (which also supports multidimensional queries), which is easy to use with CouchDB via its River plugin system.
