Modeling a time-based application in Node.js

I'm developing an auction-style web app where products are available for a certain period of time.
I would like to know how you would model that.
So far, what I've done is store products in the DB:
{
  ...
  id: "p001",
  name: "Product 1",
  expire_date: "Mon Oct 7 2013 01:23:45 UTC",
  ...
}
Whenever a client requests that product, I test *current_date < expire_date*.
If true, I show the product data and, client side, a countdown timer. If the timer reaches 0, I disable the related controls.
But, server side, there are some operations that need to be done even if nobody has requested that product, for example notifying the owner that his product has ended.
I could scan the whole collection of products on each request, but that seems cumbersome to me.
I thought of triggering a routine with cron every n minutes, but I'd like to know if you can think of any better solutions.
Thank you!

Some thoughts:
Index the expire_date field. You'll want it if you're scanning for auction items older than a certain date.
Consider adding a second field, expired (or active), so you can also do other types of non-date searches (you can always, and should anyway, reject auctions that have expired).
Assuming you add a second field, active for example, you can further limit the scans to only those auction items that are active and beyond the expiration date. Consider a compound index for those cases. (Over time, you'll have more and more expired items you don't need to scan through.)
Yes, you should add a timed task using your favorite technique to scan for expired auctions. There are lots of ways to do this -- your infrastructure will help determine what makes sense.
Keep a local cache of current auction items in memory if possible to make scanning as efficient as possible. There's no reason to hit the database if nothing is expiring.
Again, always check when retrieving from the database to confirm that items are still active -- there can easily be race conditions where items expire while being retrieved for display.
You'll possibly want to store the state of status e-mails, etc. in the database so that server restarts, etc. are handled properly.
It might be something like:
{
...
id: "p001",
name: "Product 1",
expire_date: ISODate("Mon Oct 7 2013 01:23:45 UTC"),
active: true,
...
}
// console
db.auctions.ensureIndex({expire_date: -1, active: 1})
// javascript idea:
var theExpirationDate = new Date(2013, 9, 6, 0, 0, 0); // months are 0-indexed: 9 = October
db.auctions.find({ expire_date : { "$lte" : theExpirationDate }, active: true })

Scanning the entire collection on each request sounds like a huge waste of processing time.
I would use something like pm2 to handle both keeping track of your main server process and running periodic tasks with its built-in cron-like functionality.
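For the periodic scan itself, here is a minimal sketch. The names (`findExpired`, `startExpirationSweep`) and document shapes are illustrative, not from any particular library; a pure helper picks out the expired items, and a plain `setInterval` drives it, though a cron library would work just as well.

```javascript
// Pure helper: returns products that are still active but past their
// expire_date. Keeping it pure makes it easy to test.
function findExpired(products, now) {
  return products.filter(function (p) {
    return p.active && p.expire_date <= now;
  });
}

// Timer wiring (hypothetical callbacks): fetch active products, notify
// owners of anything that has expired, repeat every intervalMs.
function startExpirationSweep(fetchActiveProducts, onExpired, intervalMs) {
  return setInterval(function () {
    fetchActiveProducts(function (err, products) {
      if (err) return console.error(err);
      findExpired(products, new Date()).forEach(onExpired);
    });
  }, intervalMs);
}
```

The `onExpired` callback would also flip `active` to false and record that the owner was notified, so server restarts don't re-send e-mails.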

Related

WooCommerce Subscriptions: how to determine the last correctly paid order for a given subscription

Is there any already-programmed method to get the last correctly-paid order for a given subscription?
$subscription->get_last_order() will return the last associated order, no matter if that order involved a correct-payment or not.
$subscription->get_related_orders() will return the whole list of orders, and the list can include pending-payment or failed orders.
I think if you wrap / trigger $subscription->get_last_order() with the woocommerce_subscription_payment_complete action (https://docs.woocommerce.com/document/subscriptions/develop/action-reference/), you would essentially achieve that objective. That hook fires for both initial subscription orders and renewal orders, and it ensures the $last_order is paid for. Something like this:
add_action( 'woocommerce_subscription_payment_complete', 'set_last_order' );
function set_last_order( $subscription ) {
    $last_order = $subscription->get_last_order( 'all', 'any' );
    // If you want to be able to reference that $last_order at any time,
    // you could save/update that order ID to post meta so that you can
    // grab it any time outside of the action.
}
I know that seems a little clunky, but it's the best way I can think of. The only other option that comes to mind would be to loop through $subscription->get_related_orders() checking is_paid() from high IDs to low IDs and grabbing the first one from there.

EventSourcing for Aggregates that rely on other aggregates

I'm currently working on a calendar system written in an event-sourced style.
What I'm currently struggling with is how to handle an event that creates lots of smaller events, and how to store those other events in a way that allows them to be replayed.
For example, I may trigger a CreateReminderSchedule, which then triggers the construction of many smaller events such as CreateReminder.
{
  id: 1,
  description: "Clean room",
  weekdays: [5],
  start: 01.12.2018,
  end: 01.12.2018,
  type: CREATEREMINDERSCHEDULE
}
This will then create loads of CreateReminder aggregates with different ids so you can edit the smaller ones i.e.
{
  id: 2,
  description: "Clean room",
  date: 07.12.2018,
  type: CREATEREMINDER,
  scheduleId: 1
}
So to me one problem is that when I replay all the events, the CreateReminderSchedule will retrigger CreateReminder events, which means I'll have more reminders than needed after the replay.
Is the answer to remove the smaller events and just have one big create event listing all the ids of the reminders within the event like:
{
  id: 1,
  description: "Clean room",
  weekdays: [5],
  start: 01.12.2018,
  end: 01.12.2018,
  type: CREATEREMINDERSCHEDULE,
  reminderIds: [2, 3, 4, 5, ...]
}
But if I do it this way, I won't have the base event for all my reminder aggregates.
Note that the reminders must be aware of the ReminderSchedule so I can later change the ReminderSchedule and update all the reminders related to it.
Perhaps you are confusing events with commands. You could have a command that is processed to create your reminders (in the form of events, i.e. ReminderCreated), which are then applied to your aggregate to create your reminder objects. This state is recreated the same way every time you replay your events from the source.
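To make the distinction concrete, here is a rough sketch (all names and shapes are hypothetical): the CreateReminderSchedule command is handled once and emits plain events; replay only applies events to rebuild state, so no new reminders are ever generated during replay.

```javascript
// Command handler: decision logic runs once, at command time, and its
// output is a list of events -- these are what get persisted.
function handleCreateReminderSchedule(command) {
  var events = [{
    type: 'REMINDER_SCHEDULE_CREATED',
    id: command.id,
    description: command.description
  }];
  command.dates.forEach(function (date, i) {
    events.push({
      type: 'REMINDER_CREATED',
      id: command.id + '-' + (i + 1),
      scheduleId: command.id,
      description: command.description,
      date: date
    });
  });
  return events;
}

// Replay: applying events is pure state reconstruction. No new events are
// emitted here, so replaying never creates extra reminders.
function replay(events) {
  var state = { schedules: {}, reminders: {} };
  events.forEach(function (e) {
    if (e.type === 'REMINDER_SCHEDULE_CREATED') state.schedules[e.id] = e;
    if (e.type === 'REMINDER_CREATED') state.reminders[e.id] = e;
  });
  return state;
}
```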

User Segmentation Engine using MongoDB

I have an analytics system that tracks customers and their attributes as well as their behavior in the form of events. It is implemented using Node.js and MongoDB (with Mongoose).
Now I need to implement a segmentation feature that allows grouping stored users into segments based on certain conditions, for example something like purchases > 3 AND country = 'Netherlands'.
An important requirement here is that the segments get updated in realtime and not just periodically. This basically means that every time a user's attributes change or he triggers a new event, I have to re-check which segments he belongs to.
My current approach is to store the conditions for the segments as MongoDB queries that I can then execute on the user collection to determine which users belong to a certain segment.
For example a segment to filter out all users that are using Gmail would look like this:
{
_id: '591638bf833f8c843e4fef24',
name: 'Gmail Users',
condition: {'email': { $regex : '.*gmail.*'}}
}
When a user matches the condition I would then store that he belongs to the 'Gmail Users' segment directly on the user's document:
{
username: 'john.doe',
email: 'john.doe@gmail.com',
segments: ['591638bf833f8c843e4fef24']
}
However, by doing this I would have to execute all queries for all segments every time a user's data changes, so I can check whether he is part of each segment or not. This feels complicated and cumbersome from a performance point of view.
Can you think of any alternative way to approach this? Maybe use a rule-engine and do the processing in the application and not on the database?
Unfortunately I don't know of a better approach, but you can optimize this solution a little bit.
I would do the same:
Store the segment conditions in a collection
Once you find a matching user, store the segment id in the user's document (segments)
An important requirement here is that the segments get updated in realtime and not just periodically.
You have no choice: you need to re-run the segmentation query every time a segment changes.
I would have to execute all queries for all segments every time a user's data changes
This is where I would change your solution, actually just optimise it a little bit:
You don't need to run the segmentation queries on the whole collection. If you put the user's id into the query with an $and, MongoDB will fetch the user first and only then check the rest of the segmentation conditions. Make sure MongoDB uses the user's _id as an index; you can use .explain() to check this, or .hint() to force it. Unfortunately you still need to run N+1 queries if you have N segments (+1 for the user update).
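As a sketch of that trick (the function name and the 'user123' id are illustrative only), the per-user check could build its query like this:

```javascript
// Combine a stored segment condition with the user's _id via $and, so the
// resulting query only ever has to examine one document.
function buildSegmentCheckQuery(userId, condition) {
  return { $and: [{ _id: userId }, condition] };
}

// With the 'Gmail Users' segment from above, testing one user becomes
// something like: db.users.find(buildSegmentCheckQuery(id, segment.condition))
var query = buildSegmentCheckQuery('user123', { email: { $regex: '.*gmail.*' } });
```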
I would fetch every segment and store them in a cache (Redis). If someone changes a segment, I would update the cache as well (or just invalidate the cache and let the next query handle the rest, depending on the implementation). The point is to have every segment available without hitting the database: when a user record is updated, go through every segment in Node.js, validate the user against the conditions, and update the user's segments array in the original update query, so no extra database operation is required.
I know it could be a pain in the ass to implement something like this, but it doesn't overload the database.
Update
Let me give you some technical details about my second suggestion:
(This is just a pseudo code!)
Segment cache
module.exports = function() {
  return new Promise(function(resolve, reject) {
    Redis.get('cache:segments', function(err, segments) {
      if (err) return reject(err);
      // Segments are cached
      if (segments) {
        return resolve(JSON.parse(segments));
      }
      // Fetch segments and save them to the cache
      Segments.find().exec(function(err, segments) {
        if (err) return reject(err);
        // Save to the cache with a 60-second expiration
        Redis.set('cache:segments', JSON.stringify(segments), 'EX', 60, function(err) {
          if (err) return reject(err);
          return resolve(segments);
        });
      });
    });
  });
};
User update
// ...
let user = yield User.findOne({ _id: ObjectId(req.body.userId) });
// etc ...
// fetch segments from cache or from the database
let segments = yield segmentCache();
let userSegments = [];
segments.forEach(function(segment) {
if(checkSegment(user, segment)) {
userSegments.push(segment._id)
}
});
// Override user's segments with userSegments
This is where the magic happens: somehow you need to define the conditions in a way that lets you evaluate them in an if statement.
Hint: Lodash has functions for this: _.gt, _.gte, _.eq ...
Check segments
module.exports = function(user, segment) {
  let keys = Object.keys(segment.condition);
  for (let key of keys) {
    // Reject as soon as one condition does not match
    if (user[key] !== segment.condition[key]) {
      return false;
    }
  }
  return true;
};
You are already storing an entire segment "query" in a document in the segments collection -- why not include a field in the same document that enumerates which fields of the users document affect membership in that segment?
Since the action of changing user data knows which fields are being changed, it can fetch only the segments that are computed from those fields, significantly reducing the number of segmentation "queries" you have to re-run.
Note that a change in user's data may add them to a segment they are not currently a member of, so checking only the segments currently stored in the user is not sufficient.
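A small sketch of that idea (the `fields` array is a hypothetical addition to the segment document, not something MongoDB provides): given the set of changed user fields, select only the segments whose conditions depend on them.

```javascript
// Return only the segments whose conditions depend on at least one of the
// changed fields; everything else can be skipped for this update.
function segmentsAffectedBy(changedFields, segments) {
  return segments.filter(function (segment) {
    return segment.fields.some(function (field) {
      return changedFields.indexOf(field) !== -1;
    });
  });
}
```

On a user update you would run the segmentation check only for `segmentsAffectedBy(changedFields, allSegments)`, plus (per the note above) the segments the user is currently a member of, since a change can also remove membership.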

How to implement a 'past due' notification in Meteor.js

I have a "task" model (collection) in my app; each task has a due date, and I'd like to send out notifications once a task is past due.
How should I implement the "past due" property so the system can detect "past due" at any time?
Do I set up cron job to check every minute or is there a better way?
I'd recommend using synced-cron for this. It has a nice interface, and if you expand to multiple instances, you don't have to worry about each instance trying to execute the task. Here is an example of how you could use it:
SyncedCron.add({
  name: 'Notify users about past-due tasks',
  schedule: function(parser) {
    // check every two minutes
    return parser.recur().every(2).minute();
  },
  job: function() {
    if (Tasks.find({dueAt: {$lte: new Date}}).count())
      emailUsersAboutPastDueTasks();
  }
});
Of course, you'd also want to record which users had been notified or run this less frequently so your users don't get bombarded with notifications.
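One way to sketch that bookkeeping (the `notified` flag is a hypothetical field on the task document, not part of synced-cron): select only tasks that are past due and not yet flagged, then flag them after e-mailing.

```javascript
// Pick tasks that are past due and have not been notified yet; the caller
// would e-mail their users and then set notified = true on each task.
function selectTasksToNotify(tasks, now) {
  return tasks.filter(function (t) {
    return t.dueAt <= now && !t.notified;
  });
}
```

In Mongo terms the same filter would be `{dueAt: {$lte: new Date}, notified: {$ne: true}}`, followed by an update that sets the flag, so each task triggers at most one notification even across restarts.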

Turn Based Participant Timeout Date Always NULL

I have been working on a two-player turn-based game that uses a custom UI for match management. I'm considering restricting the app to iOS 6+ in order to use player timeouts. I would like to show the user the remaining amount of time to move, but participant.timeoutDate is always null. Per the WWDC 2012 video (which says the timeout won't apply to the last participant in nextParticipants), I pass an array with two entries (opponent at index 0, local player at index 1) when calling endTurnWithNextParticipants:turnTimeout:matchData:completionHandler: to take a turn. I've tried both GKTurnTimeoutDefault and various integer literals ... no luck ... it always seems to be null. The player's last turn date works fine.
On the subject of player timeouts ... after I get them working, how is this delivered? I see GKTurnBasedMatchOutcomeTimeExpired ... does this come in a turn event?
From Apple's developer forum (Elian Gidoni), the doc should read:
timeoutDate
The date and time when the participant's turn timed out. (read-only)
