Can I use only redis and node? - node.js

I've already looked around on the internet, and it only left me with more doubts about which architecture to use.
I am thinking of using:
Back end:
Node.js, MongoDB, Redis.
Front end:
Vue.js
It is a chat application that needs to be able to scale horizontally.
The big question is: do I have to use MongoDB at all?
If not, is it safe to keep the user data only in Redis?
Or should I keep MongoDB for users and Redis for messages?
Thanks to anyone who has worked with something like this or knows the subject better than I do.

In my opinion, you can use just MongoDB, or Redis, or MySQL, and so on.
On the whole:
MongoDB: a document database
MySQL: a relational database
Redis: an in-memory data structure store. You can use Redis as a database, but more people use it as a cache to speed up request processing.
So I think you can use MongoDB or MySQL as the base storage. If your data gets too big, or you want to speed up requests, you can add Redis as a cache.
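
As a rough illustration of that split, here is a minimal cache-aside sketch with MongoDB as the source of truth and Redis in front of it. It assumes the official mongodb and redis (v4) npm packages; the chat database, users collection and user:<id> key format are made-up names, so adjust them to your own schema.

const { MongoClient } = require("mongodb");
const { createClient } = require("redis");

const mongo = new MongoClient("mongodb://localhost:27017");
const redis = createClient({ url: "redis://localhost:6379" });

// Users live in MongoDB (source of truth); Redis only holds short-lived copies.
async function getUser(userId) {
  const cached = await redis.get(`user:${userId}`);
  if (cached) return JSON.parse(cached); // fast path: served from memory

  const user = await mongo.db("chat").collection("users").findOne({ userId });
  if (user) {
    // a short TTL keeps Mongo authoritative if the two ever disagree
    await redis.set(`user:${userId}`, JSON.stringify(user), { EX: 60 });
  }
  return user;
}

async function main() {
  await Promise.all([mongo.connect(), redis.connect()]);
  console.log(await getUser("some-user-id"));
  await Promise.all([mongo.close(), redis.quit()]);
}

main().catch(console.error);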

Related

MERN stack and Socket to MongoDB - real-time data to the frontend from the database

I am setting up a website with the MERN stack. In the backend I will be constantly fetching data from an API/socket and saving it to the MongoDB database. On the frontend I want React to show/update the data in real time via a socket.
I am a bit worried about the number of requests/connections the server side and MongoDB can handle.
It's not clear to me how to set up a proper system to handle millions of users.
If someone can give me some info on what, how and/or where to search on how to set up a stable system, any info is welcome. Thank you.
This is a suggestion for the MongoDB portion.
You may want to try their official courses to understand how MongoDB transactions work: MongoDB University, and for hands-on practice the "MongoDB for JavaScript Developers" course.
Here are some things to look out for:
Connection pooling (see the sketch after this list)
Understanding how transactions work
Indexes, especially how to create good ones, e.g. by following the ESR (Equality, Sort, Range) rule
MongoDB clusters: write to the primary and read from secondaries
I will update the list if I have time and remember more of the stuff.
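
For the connection-pooling point, a minimal sketch of the usual pattern, assuming the official mongodb driver (4.x or newer): create one MongoClient per process and reuse it for every request instead of connecting per request. MONGODB_URI and the "app" database name are placeholders.

const { MongoClient } = require("mongodb");

const client = new MongoClient(process.env.MONGODB_URI, {
  maxPoolSize: 50, // upper bound on concurrent sockets to Mongo from this process
  minPoolSize: 5,  // keep a few connections warm
});

let db;
async function getDb() {
  if (!db) {
    await client.connect(); // a no-op if the client is already connected
    db = client.db("app");
  }
  return db;
}

module.exports = { getDb };

Every request handler then calls getDb() and shares the same pool, which keeps the number of connections to MongoDB bounded no matter how many requests arrive.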

Node.js sharding architecture with many MongoDB databases: approaches

We have an architecture problem on our project. The project requires sharding, since we need almost unlimited scalability for part of the services.
Currently we use Node.js + MongoDB (Mongoose) and MySQL (TypeORM). Data is separated into databases through a simple 'DB Locator'. So the Node process needs connections to a lot of DBs (up to 1000).
Example request flow:
HTTP request from the client with a Shop ID;
Get the DB IP address/credentials from the 'DB Locator' service by Shop ID;
Create a connection to the specific database holding that shop's data;
Perform the DB queries.
We tried to implement it in two ways:
Create a connection for each request and close it on response.
Problems:
we can't use the connection after the response (this is the main problem, because sometimes we need some asynchronous follow-up work);
it works slower.
Keep all connections open.
Problems:
we hit the simultaneous-connections limit or some other limit;
memory leaks.
Which way is better? How do we avoid the described problems? Maybe there is a better solution?
Solution #1 worked perfectly for us in PHP, since PHP runs a single process per request and simply drops the connections when the process ends. As we know, Express is pure JS code running in V8 and is not process based.
It would be great to close unused connections automatically, but we can't find an option to do that.
The short answer: stop using MongoDB with Mongoose 😏
Longer answer:
MongoDB is a document-oriented DBMS. Its main use case is data that is not particularly structured, that you have to store but don't need to query too heavily. Lazy indexing, dynamic typing and a few other traits keep it from being used like an RDBMS, but it is great as storage for logs or any serialized data.
The worst part here is Mongoose. It is a library that makes you feel like your trash box is a wonderful world with relations, virtual fields and many other things that should not be in a document-oriented DBMS. There is also a lot of legacy code from previous versions that causes trouble with connection management.
You already use TypeORM, and it can work instead of Mongoose, with some restrictions, for sure.
It handles connections exactly the same way as its MySQL connection management.
Here is some more documentation: https://github.com/typeorm/typeorm/blob/master/docs/mongodb.md#defining-entities-and-columns
In this case you can use your TypeORM Repository as a transparent client that initializes connections and closes them, or keeps them alive, on demand.
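
As a hedged sketch of that "open on demand, close when idle" idea, assuming TypeORM 0.3+'s DataSource API: cache one DataSource per shop and tear down the ones nobody has used for a while. getShopDbConfig, the entity list and the 10-minute idle window are placeholders rather than anything from the original answer, and a real version would also guard against two requests initializing the same shop concurrently.

const { DataSource } = require("typeorm");

const pool = new Map(); // shopId -> { ds, lastUsed }
const IDLE_MS = 10 * 60 * 1000;

async function getShopDataSource(shopId) {
  let entry = pool.get(shopId);
  if (!entry) {
    const cfg = await getShopDbConfig(shopId); // your 'DB Locator' lookup (placeholder)
    const ds = new DataSource({
      type: "mongodb",
      url: cfg.url,
      entities: [/* shop entities */],
    });
    await ds.initialize();
    entry = { ds, lastUsed: 0 };
    pool.set(shopId, entry);
  }
  entry.lastUsed = Date.now();
  return entry.ds;
}

// Periodically drop connections that have not been used for a while.
setInterval(async () => {
  for (const [shopId, { ds, lastUsed }] of pool) {
    if (Date.now() - lastUsed > IDLE_MS) {
      pool.delete(shopId);
      await ds.destroy();
    }
  }
}, 60 * 1000).unref();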

How to configure Mongoose with an already existing Mongo connection

I have an app that is already working with the native Node Mongo driver (v3.0).
I'm now trying to slowly introduce Mongoose to make the app easier to maintain. I would like to do this gradually, so I rewrote all the user-related operations with Mongoose and left the rest as it was. I noticed that my app now creates two connections to my Mongo DB. This is clearly because Mongoose knows nothing about my existing connection.
I would like to handle connecting to and disconnecting from Mongo myself and give Mongoose a reference to the already existing connection, but I can't find anything like this in the docs.
Is this even possible, or will I need two different connections until my app is fully rewritten to use Mongoose exclusively?
EDIT: My app runs as an AWS Lambda function, which has to connect to and disconnect from Mongo on every request, so having two concurrent connections per request effectively halves my available Mongo connections. That's why I'm concerned about the extra connection.
Turns out the answer to this is to do it the other way around: connect through Mongoose first and then grab the underlying connection from it.
let mongoConnection = mongoose.connection.client
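
A slightly fuller sketch of the same approach, assuming Mongoose 5.13+ (which also exposes mongoose.connection.getClient()); MONGO_URI is a placeholder, and caching in module scope is just one common way to avoid reconnecting on every warm Lambda invocation.

const mongoose = require("mongoose");

let cached; // survives across warm Lambda invocations in module scope

async function getDb() {
  if (!cached) {
    await mongoose.connect(process.env.MONGO_URI);
    cached = {
      mongoose,
      // the very same MongoClient Mongoose uses, so there is no second connection
      client: mongoose.connection.getClient(),
    };
  }
  return cached;
}

module.exports = { getDb };

Existing native-driver code can keep calling cached.client.db().collection(...), while newly rewritten code goes through Mongoose models on the same underlying connection.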

Memcached vs. Redis for maintaining persistent sessions?

I want to have persistent sessions on my server. I am using Node.js with Express, and I first read about connect-redis and connect-mongo. I read that Redis is faster than Mongo, which is why I decided to use it, but now I have also found a module named memcached and I don't know which will be better for my project. Also, with Memcached, is the data stored in memory or somewhere else? Because if it is in memory it must be the fastest.
If you have already set up Redis then I would stick with it, as it is very fast and easy to manage. Memcached and Redis are very similar when used for caching; the key difference is that Redis can be set to persist to disk in the background, meaning that if the server goes down the data in memory can be reloaded.
Personally, I would not use MongoDB for session persistence, for speed reasons. However, if I were using Memcached I would possibly use Mongo as a backup for the sessions: e.g. write session data to both Memcached and Mongo, but only read from Memcached, and use Mongo to restore it if an error occurs.
Bottom line, I think your choice of Redis is the best one for what you've described.
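
For reference, a minimal sketch of Redis-backed Express sessions. It assumes the express-session, connect-redis (v7) and redis (v4) packages; the secret, key prefix and cookie lifetime are placeholders, and older connect-redis releases are wired up differently (require('connect-redis')(session)).

const express = require("express");
const session = require("express-session");
const { createClient } = require("redis");
const RedisStore = require("connect-redis").default;

const app = express();

const redisClient = createClient({ url: "redis://localhost:6379" });
redisClient.connect().catch(console.error);

app.use(session({
  store: new RedisStore({ client: redisClient, prefix: "sess:" }),
  secret: "change-me",                 // placeholder secret
  resave: false,
  saveUninitialized: false,
  cookie: { maxAge: 60 * 60 * 1000 },  // 1 hour
}));

app.get("/", (req, res) => {
  req.session.views = (req.session.views || 0) + 1;
  res.send("views: " + req.session.views);
});

app.listen(3000);

Because the sessions live in Redis rather than in process memory, they survive restarts (given Redis persistence) and are shared by every Node process pointed at the same Redis.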

Redis Store with Socket.io

What is the benefit of using Redis as the Socket.IO memory store, and does it need additional resources? I'm using MongoDB as the database. Can I use MongoDB as a memory store for Socket.IO, or should I replace MongoDB with Redis as the database? What would be more efficient for building a real-time web app that supports the maximum number of concurrent connections?
Can I use MongoDB as a memory store for Socket.IO?
Yes, you can try mong.socket.io.
Should I replace MongoDB with Redis as the database?
Redis and MongoDB are different kinds of databases: MongoDB is document-oriented, while Redis is key/value-oriented (we can even say that Redis is a data-structure server).
What would be more efficient for building a real-time web app with the maximum number of concurrent connections?
Redis will definitely be faster than Mongo on that front. It supports pub/sub out of the box (while mong.socket.io uses a collection to simulate pub/sub), but you must know that all the data stored in Redis has to live in memory (here the only data stored in Redis will be additional Socket.IO bookkeeping information).
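
The memory-store mechanism the question refers to belongs to old Socket.IO versions; in current Socket.IO the same horizontal-scaling job is done by an adapter. As a hedged sketch of that pattern, assuming the @socket.io/redis-adapter and redis (v4) packages:

const { createServer } = require("http");
const { Server } = require("socket.io");
const { createAdapter } = require("@socket.io/redis-adapter");
const { createClient } = require("redis");

const httpServer = createServer();
const io = new Server(httpServer);

const pubClient = createClient({ url: "redis://localhost:6379" });
const subClient = pubClient.duplicate();

Promise.all([pubClient.connect(), subClient.connect()]).then(() => {
  // broadcasts are relayed through Redis pub/sub to every other Node process
  io.adapter(createAdapter(pubClient, subClient));
  httpServer.listen(3000);
});

Every Node process wired to the same Redis this way sees broadcasts from the others, which is what lets a chat app scale horizontally across processes or machines.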
