Structuring Session Data in MongoDB - node.js

This might be a bad title, but I was having trouble thinking of a good way to phrase my problem. Basically, I have a Node.js application that has session management. Each session interacts with a set of data independent of the other sessions. I am having trouble coming up with a way to structure this in MongoDB. Things I have thought of:
Currently I'm storing a list of JSON "pages" that each have an ID corresponding to the session using them. I'm almost positive this will not scale well, though, because these "pages" will be read and updated frequently: if I'm connected to Session1000, I'm going to have to search through 1000 items looking for the correct ID every time I update something from that session. If 1000 people are doing that roughly once a second, well...
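(For reference, a minimal sketch of that layout with the Node.js driver; the database, collection, and field names are assumptions. Note that an index on the session field turns the per-update search into an index lookup rather than a scan over every stored page.)

    const { MongoClient } = require('mongodb');

    async function main() {
      const client = await MongoClient.connect('mongodb://localhost:27017');
      const pages = client.db('app').collection('pages');

      // Created once at startup; afterwards queries on sessionId are index hits,
      // not scans over every stored "page".
      await pages.createIndex({ sessionId: 1 });

      // Read/update only the pages belonging to one session.
      await pages.updateOne(
        { sessionId: 'Session1000', name: 'home' },
        { $set: { contents: { updated: true } } }
      );

      await client.close();
    }

    main().catch(console.error);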
Ideally I would like to store each session in a separate collection, but the sessions need to be created and referenced dynamically, and I can't find a way in MongoDB to access a collection without hard-coding its name.
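(For what it's worth, the Node.js driver's db.collection() does accept a name computed at runtime, so a per-session collection wouldn't have to be hard-coded; the naming scheme below is hypothetical.)

    // `db` is a connected Db instance; the per-session naming scheme is hypothetical.
    function sessionCollection(db, sessionId) {
      return db.collection('session_' + sessionId); // name resolved at runtime
    }

    // e.g. await sessionCollection(db, 'Session1000').insertOne({ page: 'home' });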
Hopefully this accurately describes my problem. Does anyone have any ideas to help me structure the database so that accessing and updating it stays fast and scales well?

Related

How can I create multiple instances of a MongoDB connection and use each one based on some condition

Recently I came across an interesting problem. Basically I have a database for schools that manages records of teachers, students, parents, etc.
Theory:
Now, with every changing school session I want to be able to update (say) a student's class information while keeping the old information as well. One solution I thought of would be to create a new instance of the MongoDB database that is an exact copy of the original and runs on the same server, so I'd have two instances of the same database (one holding the current school session's data and one holding last year's) running on the same server.
Now I can query a specific instance based on my needs and get the appropriate data. Any changes would be made on the current school session, while the old one would be treated as historical data.
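(A minimal sketch of that idea with the Node.js driver, assuming two databases on the same server; the database names and the condition used to pick one are hypothetical.)

    const { MongoClient } = require('mongodb');

    async function main() {
      const client = await MongoClient.connect('mongodb://localhost:27017');

      // Two databases on the same server: one live, one historical
      // (names are hypothetical).
      const current = client.db('school_current');
      const lastYear = client.db('school_2022');

      // Query whichever instance the condition calls for.
      const wantHistory = process.argv.includes('--last-year');
      const students = (wantHistory ? lastYear : current).collection('students');

      console.log(await students.findOne({ name: 'Jane Doe' }));
      await client.close();
    }

    main().catch(console.error);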
Application:
Now I can't exactly figure out how to do it. I've looked into MongoDB database versioning, but even that doesn't seem to do it. One thing I know I don't want to do is create an extra field named version on each collection's schema and manipulate it somehow, which is what most of the solutions seem to suggest. I know about the __v field, but I don't think that would be very useful (I could be wrong).
Any help is appreciated. Thanks!

Couchbase retrieving relational docs in nodeJS

I am still debating which way to go, and possibly storing certain information in its own doc. For example, the customer can have addresses; each address would be its own doc, and the customer doc would store an array of reference keys under addresses. The benefit would be that I could update these docs directly by key, versus having to get the customer doc first, find the array index of the address, and then either modify the whole doc or use the sub-document API to replace the content at that index.
Where I am stuck is how to retrieve those referenced sub-documents. Is N1QL the only way to go, or does the KV API offer a way to do this short of retrieving the whole customer doc, then looping through the address array and retrieving all referenced docs that way? I know Ottoman offers something like that, but I am having an issue with the latest SDK 2.6 and Ottoman, as it's not very well maintained. So hopefully someone can share some insight on what the best way is and why.
If you want to rely on key/value, then you'll need to do the multiple lookups as you've described. I'm not very familiar with Ottoman: it might do this for you, but behind the scenes it will still be multiple key/value operations and/or N1QL.
With N1QL, you can perform JOINs, but again, behind the scenes it's going to eventually be pulling documents out by key/value. It just does those extra steps for you. Direct key/value is always going to be the fastest route.
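For illustration, a rough sketch of the key/value route with the 2.x Node SDK, as described above; the bucket name, document field names, and error handling are assumptions.

    const couchbase = require('couchbase');

    const cluster = new couchbase.Cluster('couchbase://localhost');
    const bucket = cluster.openBucket('default');

    // Fetch the customer doc, then each address doc it references by key.
    function getCustomerWithAddresses(customerKey, done) {
      bucket.get(customerKey, (err, res) => {
        if (err) return done(err);
        const customer = res.value;
        const keys = customer.addresses || []; // array of referenced doc keys
        if (keys.length === 0) return done(null, { customer, addresses: [] });

        const addresses = new Array(keys.length);
        let pending = keys.length;
        let failed = false;
        keys.forEach((key, i) => {
          // One extra KV round trip per referenced address document.
          bucket.get(key, (e, addrRes) => {
            if (failed) return;
            if (e) { failed = true; return done(e); }
            addresses[i] = addrRes.value;
            if (--pending === 0) done(null, { customer, addresses });
          });
        });
      });
    }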
If you are still in the process of deciding whether to split the data among multiple documents or "denormalize" it into a single doc, one thing you should think about is how often you're going to access customer and addresses together versus how often you're going to access them separately. If you're reading/writing customer+address often, consider putting them in one document. Otherwise, consider putting them in multiple documents.
The third option is to store it in both places, or rather to "cache" the address data in the customer document. This is tricky, because the two copies can get out of sync if you're not careful. So make sure it's worth it before you go down that road.

MEAN Stack: static list best practice

This is a general best practice question:
I am building a MEAN (MongoDB, Express, Angular, Node) website. I have a user object that can have a gender [Mr or Miss] and a city [Paris, New York, anything].
So this is quite a common problem: where should I store those lists that rarely change and never exceed, let's say, 50 rows?
1/ Is it better to have them stored in the database (Mongo) with a foreign key in the user table, so I have a gender table and a city table? But then every time I access these lists I need to hit the database.
2/ Is it better to have them stored in a file or in a controller? But this seems a bit risky to me.
3/ Maybe there is another way that I don't know about.
I am not sure what is the best solution.
Are you concerned about an extra database call to get a list out?
If it were me, I'd pick option 1 and store the lists in the database. If you store the value descriptions only on the front end, you run the risk of discrepancies if you later update your database's foreign keys but forget to update your controller or file, which seems rather fragile. It also makes internationalization harder, because you'd have to start storing city names and genders in files or controllers in multiple languages. Storing things is what a database is for, and an additional call to fetch a small list really doesn't have that big an impact on your performance.
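If you go this route with Mongoose, a minimal sketch might look like the following (the model and field names are assumptions, not anything from the original post):

    const mongoose = require('mongoose');

    // Small reference collections that rarely change.
    const City = mongoose.model('City', new mongoose.Schema({ name: String }));
    const Gender = mongoose.model('Gender', new mongoose.Schema({ label: String }));

    // The user stores foreign keys pointing at those lists.
    const User = mongoose.model('User', new mongoose.Schema({
      name: String,
      gender: { type: mongoose.Schema.Types.ObjectId, ref: 'Gender' },
      city: { type: mongoose.Schema.Types.ObjectId, ref: 'City' },
    }));

    // Reading a list for a dropdown is a single cheap query:
    //   const cities = await City.find().lean();
    // And a user can be expanded when needed:
    //   const user = await User.findById(id).populate('gender city');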
Angular's $http service, which you are probably using to call your API, has a caching option, which means you'll only need to retrieve the list once per app instantiation.
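For example, a small AngularJS service that caches the list after the first request (the module name and endpoint URL are assumptions):

    angular.module('app').factory('CityList', ['$http', function ($http) {
      return {
        // `cache: true` stores the response in $http's default cache,
        // so the list is fetched from the server only once per app instance.
        all: function () {
          return $http.get('/api/cities', { cache: true })
            .then(function (response) { return response.data; });
        }
      };
    }]);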
You could alternatively have a look at this post by Josh, who found a way to pre-populate a directive with JSON from the server before loading it.

CouchDB - human readable id

I'm using CouchDB with Node.js. Right now there is only one node involved, and even in the remote future there are no plans to change that. While I can remove most of the cases where a short, auto-increment-like ID is required (it can be sparse, but not random-looking), there remains one place where users actually need to enter the ID of a product. I'd like to keep this ID as short as possible and in a more human-readable format than something like '4ab234acde242349b', since it sometimes has to be typed by hand and so on.
However, in the database it can be stored under whatever ID pleases CouchDB (the default auto-generated UUID), but it should also be possible to give it a number that can be used to identify it. What I have thought about is creating a document that consists of an array of all the UUIDs from CouchDB. When I create a new product from Node, I would run an update handler that appends the new unique ID to the end of that document. To obtain the product's ID I'd then query the array, and on the client side, using indexOf, I could get the index as a short ID.
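(As a sketch, the mapping described above would be roughly the following; the field name on the lookup document is assumed.)

    // lookupDoc.uuids is the array document holding every product UUID,
    // appended to by the update handler whenever a product is created.
    function toShortId(lookupDoc, uuid) {
      return lookupDoc.uuids.indexOf(uuid); // position in the array = short numeric ID
    }

    function toUuid(lookupDoc, shortId) {
      return lookupDoc.uuids[shortId];      // short ID -> CouchDB UUID
    }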
I don't know if this is feasible. From a performance point of view I can say the following: there are more queries that go numerical ID -> UUID than UUID -> numerical ID. There will be at most 7000 new entries a year in the database. Also, there is currently no use case where a product can be deleted, but I'd prefer not to rely on that.
Are there any other applicable ways to generate a shorter and more human-readable ID that can be associated with my document?
/EDIT
From a technical point of view it seems to be working. I can do both conversions, number <-> UUID, and it seems to go well. I don't know if this works well with replication and so on, but since there is said array, I guess it should, right?
You have two choices here:
Set your human-readable ID as the _id field. You can simply set it in the document-creation calls to the DB and it will be accepted (a short sketch follows after the caveats below). This can be the more lightweight solution, but it comes with some limitations:
It has to be unique. You should also be careful about clients that intend to create a new document but instead overwrite an existing one.
It can only contain alphanumeric characters plus a few special characters. In my experience, allowing extra character types is asking for trouble.
It cannot be longer than a practical string-length limit (CouchDB doesn't define one, but you should). Long IDs will badly inflate the size of your views (indexes), and that can make them slower.
If these things are no problem for you, then you should go with this solution.
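A minimal sketch of this first option using the nano client (the client choice, database name, and ID format are all assumptions):

    const nano = require('nano')('http://localhost:5984');
    const products = nano.db.use('products');

    // The human-readable ID is simply used as the document's _id.
    // insert() fails with a 409 conflict if that _id already exists,
    // which guards against accidentally overwriting another product.
    products.insert({ _id: 'prod-7031', name: 'Widget' }, (err, body) => {
      if (err) return console.error(err);
      console.log('created', body.id, body.rev);
    });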
As you said yourself, let the _id be a UUID and put the human-readable ID in another field. To reach a document by its human-readable ID, you can create a view emitting that field as the key, and then either emit the document as the value or fetch it via the include_docs=true option. Whenever the view is queried, CouchDB updates it incrementally and returns the list. This is essentially the same as you maintaining a document with an array/object of IDs inside it, except that with a CouchDB view you get better performance.
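A sketch of such a view; the field name shortId and the design-document and database names are assumptions:

    // Map function for a view (e.g. _design/products/_view/by_short_id)
    // that indexes documents by their human-readable ID:
    function (doc) {
      if (doc.shortId) {
        emit(doc.shortId, null);   // key = human-readable ID, value not needed
      }
    }

    // Then look a product up by its short ID with include_docs=true, e.g.:
    // GET /products/_design/products/_view/by_short_id?key=7031&include_docs=true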
This might also be slightly slower on querying and inserting. If the IDs are inserted sequentially it's fine; if not, CouchDB will take slightly more time to insert each one at the right place in the index. This doesn't work well with huge volumes of inserts hitting the DB.
Querying shouldn't be more than about 10% slower than the first option, and even 10% is a generous estimate; it will most probably be less than 5%. In my own CouchDB application I switched from reading by _id to reading from a view by key, and the slowdown was so small that, even when making 100 queries at the same time, it wasn't noticeable from the user's point of view.
This is how people query documents by fields other than the ID, for example looking up a user document by email when the user logs in.
If you don't know how CouchDB views work, you should read the views chapter of CouchDB: The Definitive Guide.
Also make sure you stay away from documents with huge arrays inside them. I think CouchDB has a limit of 4 GB per document. I remember having documents with very long arrays and really long query times, because the view had to iterate over each array item. In the end I created one document per array item instead, and it was way faster.

Understanding Kohana ORM Relationships

I know this question has been asked a million times, but I can't seem to find one that really gives me a good understanding of how relationships work in Kohana's ORM Module.
I have a database with 5 tables:
approved_submissions
-submission_id
-contents
favorites
-user_id
-submission_id
ratings
-user_id
-submission_id
-rating
users
-user_id
votes
-user_id
-submission_id
-vote
Right now, favorites, ratings, and votes have a primary key consisting of every column in the table, so as to prevent a user favoriting the same submission_id multiple times, voting on the same submission_id multiple times, etc. I also believe these fields are set up with foreign keys referencing approved_submissions and users, so as to prevent invalid data existing in the respective fields.
Using the DB module, I can access and update these tables no problem. I really feel as though ORM may offer a more powerful and accessible way to accomplish the same things using less code.
Can you demonstrate how I might update a user voting on a submission_id? A user removing a favorite submission_id? A user changing their rating on a particular submission_id?
Also, do I need to make changes to my database structure or is it okay the way it is?
You're probably looking for has_many_through relationships.
So to add a new submission, you'd do something like
$user->add('submissions', $submission);
and to remove
$user->remove('submissions', $submission);
You may want to consider restructuring your database table and key names so you don't end up doing a lot of configuration.
