Redis Store with Socket.io - node.js

What is the benefit of using Redis as the Socket.io memory store, and does it need additional resources? I'm using MongoDB as the database. Can I use MongoDB as the memory store for Socket.io, or should I replace MongoDB with Redis as the database? What would be more efficient for building a real-time web app and supporting the maximum number of concurrent connections?

can i use MongoDB as memory store for Socket.io
Yes, you can try mong.socket.io
do i replace MongoDB with Redis as database?
Redis and MongoDB are different kinds of databases: MongoDB is document-oriented, while Redis is key/value-oriented (you could even say Redis is a data-structure server).
What would be more efficient for building a real-time web app and providing maximum concurrent connections?
Redis will definitely be faster than Mongo here: it supports pub/sub out of the box (while mong.socket.io uses a collection to simulate pub/sub). But be aware that all data stored in Redis must live in memory (in this case, the only data stored in Redis will be the additional Socket.io bookkeeping information).
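To make the pub/sub point concrete, here is a minimal sketch of the pattern Redis provides out of the box: each Node process publishes events to a channel and every other process receives them. It assumes the `redis` npm package (v4 API) and a local Redis server; the envelope helpers and names are mine, not part of Socket.io:

```javascript
// Pure helpers: wrap/unwrap an event for the wire (usable without Redis).
function makeEnvelope(event, payload) {
  return JSON.stringify({ event: event, payload: payload });
}

function parseEnvelope(raw) {
  return JSON.parse(raw);
}

// Wiring (only runs when a Redis server is available).
async function startBridge(channel, onEvent) {
  const { createClient } = require('redis'); // lazy require: helpers above work without it
  const pub = createClient();
  const sub = pub.duplicate(); // a subscribed connection cannot issue other commands
  await pub.connect();
  await sub.connect();
  await sub.subscribe(channel, (raw) => {
    const { event, payload } = parseEnvelope(raw);
    onEvent(event, payload);
  });
  return {
    publish: (event, payload) => pub.publish(channel, makeEnvelope(event, payload)),
  };
}

module.exports = { makeEnvelope, parseEnvelope, startBridge };
```

Every process that calls `startBridge` with the same channel name sees every published event, which is exactly the "man in the middle" role RedisStore plays between Socket.io processes.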

Related

Can I use only redis and node?

I've already looked around the internet, and it only left me with doubts about which architecture to use.
I am thinking of using:
BackEnd:
Node.js, Mongo, Redis.
FrontEnd:
You travel
It is a chat application that needs to be able to scale horizontally.
The big question is: do I have to use Mongo?
If not, is it safe to keep user data in Redis?
Or should I leave Mongo for users, and Redis for messages?
I thank anyone who has worked with something like this or knows the subject better than I do.
In my opinion, you can use MongoDB, Redis, MySQL, and so on.
On the whole:
MongoDB: a document database
MySQL: a relational database
Redis: an in-memory data structure store. You can use Redis as a primary database, but more people use it as a cache to speed up request processing.
So I think you can use Mongo or MySQL as the base storage, and add Redis as a cache when your data gets big or you want to speed up requests.
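The "add Redis to speed up requests" advice usually means the cache-aside pattern: check the cache first, fall back to the primary store on a miss, then populate the cache so the next request is served from memory. A minimal sketch, assuming any cache client with async get/set (a node-redis client would fit); the function and parameter names are illustrative:

```javascript
// Cache-aside: check the cache, fall back to the primary store, then
// populate the cache so the next request skips the database.
// `cache` is anything with async get(key) / set(key, value, ttlSeconds);
// `load` is the slow lookup against Mongo/MySQL.
async function cacheAside(cache, key, ttlSeconds, load) {
  const hit = await cache.get(key);
  if (hit !== null && hit !== undefined) {
    return JSON.parse(hit); // served from memory, no database round trip
  }
  const value = await load(key); // slow path: primary database
  await cache.set(key, JSON.stringify(value), ttlSeconds);
  return value;
}

module.exports = { cacheAside };
```

With node-redis you would adapt the `set` call to pass the TTL, e.g. `client.set(key, val, { EX: ttlSeconds })`.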

Set cookie on browser with node js

I'm quite new to Node.js and I've encountered a problem with cookies in the client-side browser.
To summarize: I am trying to set a cookie for a page (which is not yet opened) from a Node http server. My goal is to set the cookie for this page, then open the URL with Node (the opn module), and have the page read the cookie via document.cookie.
Also, the URL I'm opening is hosted on the Node server (localhost/mail/sendingMail/index.html) on port 32126.
I've tried a lot of modules and requests but couldn't find one that did the job. Is this possible, and if so, how?
If this isn't possible, any other way of sending data to this page would be nice!
I think you should use Redis for this. I also use it, and it is very fast because it stores data in RAM.
In Redis you can store objects, hashes, or arrays, and it is easy to use from Node by installing the redis npm package.
This is the documentation I prefer; it describes how to use Redis with Node, but remember that you have to install Redis itself first.
About Redis:
Redis is an open source (BSD licensed), in-memory data structure store, used as a database, cache and message broker. It supports data structures such as strings, hashes, lists, sets, sorted sets with range queries, bitmaps, hyperloglogs and geospatial indexes with radius queries. Redis has built-in replication, Lua scripting, LRU eviction, transactions and different levels of on-disk persistence, and provides high availability via Redis Sentinel and automatic partitioning with Redis Cluster.
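As an illustration of "storing objects as hashes", here is a small sketch assuming the `redis` npm package (v4 API) and a running server; the flattening helper and the key naming are mine:

```javascript
// Redis hash values are strings, so flatten a plain object into
// string-valued fields before handing it to HSET.
function toHashFields(obj) {
  const out = {};
  for (const [field, value] of Object.entries(obj)) {
    out[field] = String(value);
  }
  return out;
}

// Wiring sketch (needs a running Redis server and the `redis` package).
async function saveUser(id, user) {
  const { createClient } = require('redis'); // lazy require: helper above works without it
  const client = createClient();
  await client.connect();
  await client.hSet(`user:${id}`, toHashFields(user)); // one hash per user
  const stored = await client.hGetAll(`user:${id}`);   // read the whole object back
  await client.quit();
  return stored;
}
```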

Performant async I/O for relational databases

We are trying to optimize our database requests and also reduce the number of database connections in our Node.js servers to Postgres database.
Do Redis database drivers and Redis database connections perform better than relational database drivers/connections in Node.js? I hear tales of needing more database connections for a relational database. My experience with Redis is that just a few connections will suffice, for a given Node.js server process, no matter the load on the server.
With async I/O, can connections be reused, or perhaps the same DB connection used for multiple queries/requests in parallel? Does it differ between database vendors?
Does anyone have a recommendation for the best database connection library for Node.js + Postgres in terms of performance?
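On connection reuse: a Postgres connection runs one query at a time, so drivers multiplex many concurrent requests over a small pool of connections rather than opening one per request. Below is a toy sketch of the checkout/release pattern that real libraries such as node-postgres's pg.Pool implement; all names are mine, and this is not pg's actual code:

```javascript
// Toy connection pool: a fixed set of connections, each checked out for one
// query at a time; callers waiting for a free connection queue up as promises.
class TinyPool {
  constructor(connections) {
    this.idle = [...connections];
    this.waiters = [];
  }
  acquire() {
    if (this.idle.length > 0) {
      return Promise.resolve(this.idle.pop());
    }
    return new Promise((resolve) => this.waiters.push(resolve));
  }
  release(conn) {
    const waiter = this.waiters.shift();
    if (waiter) waiter(conn);    // hand the connection straight to a waiter
    else this.idle.push(conn);
  }
  async query(sql) {
    const conn = await this.acquire();
    try {
      return await conn.query(sql); // one in-flight query per connection
    } finally {
      this.release(conn);           // always return it, even on error
    }
  }
}

module.exports = { TinyPool };
```

With pg itself you would simply create `new Pool({ max: 10 })` and call `pool.query(...)`, which queues callers in essentially this way; that is why a handful of connections per process usually suffices.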

Memcache v/s redis for maintaining persistent sessions?

I want to set up persistent sessions on the server. I am using Node.js with Express, and I first read about connect-redis and connect-mongo. I read that Redis is faster than Mongo, which is why I decided to use it, but now I've also found a module named memcached, and I don't know which will be better for my project. Also, does Memcached store its data in memory? Because if it does, it should be the fastest.
If you have already set up Redis then I would stick with it, as it is very fast and easy to manage. Memcached and Redis are very similar when used for caching; the key difference is that Redis can be set to persist to disk in the background, meaning that if the server goes down, the data that was in memory can be reloaded.
Personally, I would not use MongoDB for session persistence, for speed reasons. However, if I were using Memcached, I might use Mongo as a backup for the sessions: write session data to both Memcached and Mongo, but only read from Memcached, and use Mongo to restore if an error occurs.
Bottom line, I think your choice to use Redis is the best one for what you've described.
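For reference, Redis-backed sessions in Express are wired up roughly like this. This is a configuration sketch assuming the express, express-session, connect-redis, and redis packages (the connect-redis v7-style constructor is shown; it has changed across major releases, so check your installed versions):

```javascript
const express = require('express');
const session = require('express-session');
const { createClient } = require('redis');
const RedisStore = require('connect-redis').default; // connect-redis v7 style

const redisClient = createClient();
redisClient.connect().catch(console.error);

const app = express();
app.use(session({
  store: new RedisStore({ client: redisClient }), // sessions live in Redis, not process memory
  secret: 'replace-with-a-real-secret',
  resave: false,
  saveUninitialized: false,
  cookie: { maxAge: 24 * 60 * 60 * 1000 }, // one day
}));
```

Because the sessions live in Redis rather than in the Node process, they survive restarts and are shared across multiple server processes.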

node js using Mongo and Redis simultaneously

I am developing a small project using Node.js. I am using Mongoose for my models, so I am using MongoDB, and I keep sessions in a MongoStore. I also want to use Socket.io while running several Node processes. From the Socket.io docs:
The MemoryStore only allows you deploy socket.io on a single process.
If you want to scale to multiple process and / or multiple servers
you can use our RedisStore which uses the Redis
NoSQL database as man in the middle.
So I think I need Redis too. I am new to Node, and I want to know: is it normal to use two databases to manage different parts of an application? Or is there a way to work with Socket.io when running several Node processes while using only MongoDB?
Just recently, a solution has appeared that uses a MongoStore with pub/sub functionality via mubsub (pub/sub for Node.js and MongoDB).
It can be attached to socket.io in almost the same way as you would with RedisStore:
io.configure(function() {
  io.set('store', new MongoStore({host: 'localhost', port: 27017, db: 'session_db'}));
});
More information and source at: https://github.com/kof/socket.io-mongo
The Redis store is already built into Socket.IO, and more importantly it has two features that are particularly needed for Socket.IO:
1) Publish-subscribe (to communicate between processes)
2) Key-value store (to store all the info about connections)
While the key-value store part can be done with MongoDB, it doesn't provide the pub-sub functionality.
Bottom line: if you need to scale beyond one process (meaning you are expecting more than a few thousand concurrent requests), then RedisStore is the solution.
Resources:
Examples of using RedisStore in Socket.io:
http://www.ranu.com.ar/2011/11/redisstore-and-rooms-with-socketio.html
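For completeness, wiring up the built-in RedisStore in the Socket.io 0.9-era API that these answers refer to looked roughly like this (a sketch; later Socket.io versions dropped stores in favour of the separate socket.io-redis adapter):

```javascript
var io = require('socket.io').listen(8080);
var RedisStore = require('socket.io/lib/stores/redis');
var redis = require('socket.io/node_modules/redis');

io.set('store', new RedisStore({
  redisPub: redis.createClient(),    // publishes events to the other processes
  redisSub: redis.createClient(),    // receives events from the other processes
  redisClient: redis.createClient()  // general key/value storage for connection info
}));
```

The three clients map directly onto the two features listed above: pub/sub for cross-process communication, plus a key/value connection for storing connection state.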
