How to realize an endless Queue in NodeJS

I have a table t_user in a MySQL database with a field "user_isonline". This flag can change for every user every second (driven by user interaction on an external website).
Now I would like to write a Node.js script that runs all day, checking whether there are users with "user_isonline" = true; if so, it should put them into a queue and process them somehow.
As I am very new to Node.js and async programming, I have only a few ideas on how to even start.
It would be great to have a few slim lines of code and not to use any pre-built package.
Thank you very much in advance.

With the setTimeout function that Node provides you can create a timer which queries the database to check whether something has changed (i.e. query for isOnline and append the new users to your queue).
More details about setTimeout: Timers Doc NodeJS
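A minimal sketch of that idea, where `fetchOnlineUsers` is a stand-in for your actual MySQL query (the function names and the 3-second interval are assumptions, not part of the question):

```javascript
// Minimal polling sketch: fetchOnlineUsers is a placeholder for the real
// MySQL query (e.g. SELECT user_id FROM t_user WHERE user_isonline = 1).
const queue = [];
const seen = new Set(); // avoid queueing the same user twice

async function pollOnce(fetchOnlineUsers) {
  const users = await fetchOnlineUsers();
  for (const user of users) {
    if (!seen.has(user)) {
      seen.add(user);
      queue.push(user); // hand off to whatever processes the queue
    }
  }
}

function startPolling(fetchOnlineUsers, intervalMs) {
  // Run one check, then re-arm the timer. Using setTimeout instead of
  // setInterval means a slow query never overlaps with the next one.
  pollOnce(fetchOnlineUsers).finally(() =>
    setTimeout(() => startPolling(fetchOnlineUsers, intervalMs), intervalMs)
  );
}
```

You would then call `startPolling(yourQueryFunction, 3000)` once at startup and process `queue` in whatever way fits your use case.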

Related

Run a schedule function for every user in node js

I need to find the best possible way to do the following task.
I have different users, let's say over 500, and every user has a scheduled function that needs to run twice a day. But if a user's phone is off, that function won't run, since its code is written on the client side.
What I want to do now is run the scheduled function in the backend using Node.js, but I don't know how to run it for every user (note: every user has a different schedule). That's why I wrote it on the client side in the first place, but with the possibility that the phone might be switched off, that approach is unreliable.
What should I do in this scenario? Any leads?
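One simple approach, sketched below under the assumption that each user's schedule is stored as a time of day (the `user.hour`/`user.minute` shape is hypothetical): compute the delay until the next run and arm one `setTimeout` per user. For ~500 users this is well within what a single Node process handles.

```javascript
// How many milliseconds until the next occurrence of hour:minute today/tomorrow.
function msUntil(hour, minute) {
  const now = new Date();
  const next = new Date(now);
  next.setHours(hour, minute, 0, 0);
  if (next <= now) next.setDate(next.getDate() + 1); // already passed today
  return next - now;
}

// Arm a timer for one user; when it fires, run the task and re-arm
// for the next day. Returns the timer handle so it can be cancelled.
function scheduleUser(user, task) {
  const delay = msUntil(user.hour, user.minute);
  return setTimeout(() => {
    task(user);
    scheduleUser(user, task); // re-arm for the next occurrence
  }, delay);
}
```

For "twice a day" you would simply call `scheduleUser` twice per user, once for each stored time. Note that timers do not survive a process restart, so you would re-read the schedules from the database on startup.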

Switch Databases dynamically

I'm doing a POS(point of sale) as Saas with React in the frontend, NodeJs in backend(API Rest) and MongoDB as the database.
I've finished a basic version and now I want every registered user to have their own database.
After reading some articles and questions on the internet, my conclusion was to switch between databases each time the frontend consumes the backend (API).
General Logic:
User Log in
In the backend, I use a general database to check the user's credentials and also to get the name of that user's database.
Each time the frontend consumes the API, the following code is executed in a middleware to determine which database the API should use:
var dbUser = db.useDb('nameDataBaseUser');
var Product = dbUser.model('Product', ProductSquema);
I have the schemas and the variable 'db' defined once, fixed in the code:
var db = mongoose.createConnection('mongodb://localhost');
Problem:
I don't know if this is the correct solution for what I am trying to build, but it seems inefficient to me that the models are generated again each time the API is called, because in some endpoints (i.e. in some middlewares) I have up to 4 different models.
Question:
Is this the best way, or does anyone have a suggestion for how to approach this problem?
Not sure about the idea of creating a new database for each new user. That creates a lot of complexity, makes the system difficult to maintain, and makes it hard to access the data later for analytics and the like. Why not use a new collection per user instead? That way you only need one set of database access credentials. Furthermore, creating a new collection happens automatically when you first store data in it.
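If you do stay with the database-per-user approach, one way to avoid rebuilding the models on every request is to memoize them. A minimal sketch (the `buildModel` callback is where your `db.useDb(dbName).model(...)` call would go; the function names here are assumptions):

```javascript
// Memoize compiled models so each (database, model) pair is built once.
const modelCache = new Map();

function getTenantModel(dbName, modelName, buildModel) {
  const key = `${dbName}:${modelName}`;
  if (!modelCache.has(key)) {
    // buildModel is where the real call would go, e.g.
    // db.useDb(dbName).model(modelName, ProductSquema)
    modelCache.set(key, buildModel(dbName, modelName));
  }
  return modelCache.get(key);
}
```

The middleware would then call `getTenantModel(nameDataBaseUser, 'Product', ...)` and pay the model-compilation cost only on the first request per tenant.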

Real-Time Database Messaging

We've got an application in Django running against a PGSQL database. One of the functions we've grown to support is real-time messaging to our UI when data is updated in the backend DB.
So... for example we show the contents of a customer table in our UI, as records are added/removed/updated from the backend customer DB table we echo those updates to our UI in real-time via some redis/socket.io/node.js magic.
Currently we've rolled our own solution for this entire thing using overloaded save() methods on the Django table models. That actually works pretty well for our current needs, but as tables grow into the GBs of data, it is starting to slow down on some of the larger ones while our engine digs through the currently 'subscribed' UIs and works out which updates need to go to which clients.
Curious what other options might exist here. I believe MongoDB and other no-sql type engines support some constructs like this out of the box but I'm not finding an exact hit when Googling for better solutions.
Currently we've rolled our own solution for this entire thing using
overloaded save() methods on the Django table models.
Instead of working on the app level you might want to work on the lower, database level.
Add a PostgreSQL trigger after row insertion, and use pg_notify to notify external apps of the change.
Then in NodeJS:
var PGPubsub = require('pg-pubsub');
var pubsubInstance = new PGPubsub('postgres://username@localhost/database');
pubsubInstance.addChannel('channelName', function (channelPayload) {
// Handle the notification and its payload
// If the payload was JSON it has already been parsed for you
});
See the pg-pubsub package documentation for more details.
And you will be able to do the same in Python: https://pypi.python.org/pypi/pgpubsub/0.0.2.
Finally, you might want to use data partitioning in PostgreSQL. Long story short, PostgreSQL already has everything you need :)

Node w/Express MongoDb/Mongoose how to trigger 2nd database request to create an event logger

I'm trying to create an event logger system that records database events. I initially built it to run on the front end (sending more than one request to the API), but have decided it would be much better to do this all on the back end. When a database request is made, e.g. when a user creates/modifies/deletes a document, I would like a second event to fire that records the change along with some accompanying info.
I am struggling with how to add this to my Node/Mongo API and am wondering what the best practice is. I've read about event emitters, however I'm not sure if this would be the best way to trigger the second event; in addition, I'm not sure how to pass info through the emitter to the second Mongoose request.
Any guidance would be appreciated.
Looks like I overlooked the next() command.

Is this possible with node.js? (JSON, SQL and pushing changes)

I have a website which can have up to 500 concurrent viewers, with data updated every three seconds. Currently each user has an AJAX object that calls a web page every three seconds; that page queries a DB and returns the results.
What I would love to do is have each client hold a socket to a Node.js process. That process would poll the DB every 3 seconds for updated data and, if anything had changed, announce it (ideally as JSON) so that each client has the data pushed to it and updates the page accordingly.
If this is possible, does anyone have a recommendation as to where I start? I am fairly familiar with JS but node.js seems to confuse me.
Thanks
I myself have only a little experience with node.js, but it is absolutely doable and looks like a perfect use case for node.js.
I recommend starting with an Express tutorial and later on using socket.io.
I don't know which DBMS you are using, but there is probably a nice package for that as well. Just look through this list.
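The poll-and-push half of the idea can be sketched independently of socket.io. Here `fetchData` and `broadcast` are placeholders for your DB query and for `io.emit(...)` respectively (both names are assumptions):

```javascript
// Poll-and-push sketch: only notify clients when the data actually changed.
let lastPayload = null;

async function checkForUpdates(fetchData, broadcast) {
  const data = await fetchData();
  const payload = JSON.stringify(data);
  if (payload !== lastPayload) {   // cheap change detection via JSON compare
    lastPayload = payload;
    broadcast(payload);            // e.g. io.emit('update', payload)
  }
}

function startPushLoop(fetchData, broadcast, intervalMs = 3000) {
  // setTimeout re-armed after each check, so a slow query never overlaps.
  checkForUpdates(fetchData, broadcast).finally(() =>
    setTimeout(() => startPushLoop(fetchData, broadcast, intervalMs), intervalMs)
  );
}
```

The win over the current AJAX setup is that the DB is queried once per interval for all 500 viewers instead of once per viewer, and clients only receive traffic when something actually changed.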
