How do I sync Redis master to another standalone Redis instance?

I have a Master/Slave environment that I would like to sync (daily) with another standalone Redis instance.
I know that I can dump the data, transfer it to the other server, and then import the data.
Is there a way to stream the data from Redis to Redis?

You can make your standalone instance a slave of the Redis master that you want to sync from.
You don't need to configure anything on the master. Just run the SLAVEOF host port command on the standalone instance to start the replication; when the synchronization is done, run SLAVEOF NO ONE to stop replicating and make the instance independent again.
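If you want to script the daily sync from Node.js, a minimal sketch could look like the following (assuming the ioredis client, which exposes Redis commands as methods; the master address and polling interval are placeholders):

// Sketch only: assumes ioredis and a reachable master at MASTER_HOST:MASTER_PORT.
const Redis = require('ioredis');

const MASTER_HOST = '10.0.0.5';   // placeholder address of the master to copy from
const MASTER_PORT = 6379;

const standalone = new Redis({ host: '127.0.0.1', port: 6379 }); // the instance to sync

async function syncOnce() {
  // Start replicating from the master; nothing changes on the master itself.
  await standalone.slaveof(MASTER_HOST, MASTER_PORT);

  // Poll INFO replication until the replication link is up.
  let done = false;
  while (!done) {
    const info = await standalone.info('replication');
    done = info.includes('master_link_status:up');
    if (!done) await new Promise((resolve) => setTimeout(resolve, 1000));
  }

  // Detach again so the instance goes back to being a standalone master.
  await standalone.slaveof('NO', 'ONE');
  standalone.disconnect();
}

syncOnce().catch(console.error);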

Related

Splitting read & write to Redis with Node.js

I have set up Redis on three separate instances and have configured them so that one instance is the master and the other two are replicas of it. I have used sentinels to make sure the setup is highly available. I have a Node.js application which needs to use Redis. How do I achieve the read and write splitting in my application, given that if my Redis master goes down, one of my read replicas becomes the master and the writes need to go to it?
As far as I know, ioredis is the only Node.js Redis client that supports sentinels.
"ioredis guarantees that the node you connected to is always a master even after a failover. When a failover happens, instead of trying to reconnect to the failed node (which will be demoted to slave when it's available again), ioredis will ask sentinels for the new master node and connect to it. All commands sent during the failover are queued and will be executed when the new connection is established so that none of the commands will be lost."

Node.js master/slave replication with read and write queries split up

My application runs on Node.js and uses PostgreSQL (pg-promise) for the database connection. I want all write queries to go to the master DB instance and all read queries to the slave instance. I have set up the server configuration in the postgresql.conf and pg_hba.conf files.
Now, how will the application know that read queries should go to the slave and writes to the master? Is there any library we have to install?
What you need is pgpool-II - http://www.pgpool.net/mediawiki/index.php/Main_Page
It is a multi-purpose tool: it can not only replicate your master DB to slave DBs, but it will also do load balancing for you. You just have to connect to your pgpool server, and it will route your read/write queries accordingly.
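On the application side nothing special is then needed: you point pg-promise at the pgpool port instead of at PostgreSQL directly. A minimal sketch, assuming pgpool listens on its usual port 9999 and using placeholder host names and credentials:

const pgp = require('pg-promise')();

// Connect to pgpool-II rather than to PostgreSQL directly; pgpool decides
// whether each query goes to the master or to a read-only slave.
// Host, port and credentials are placeholders for your own setup.
const db = pgp({
  host: 'pgpool.internal',
  port: 9999,
  database: 'appdb',
  user: 'appuser',
  password: 'secret',
});

async function example() {
  // pgpool can route this SELECT to a slave (load-balanced read)...
  const users = await db.any('SELECT id, name FROM users LIMIT 10');

  // ...and sends this INSERT to the master (write).
  await db.none('INSERT INTO audit_log(message) VALUES($1)', ['checked users']);

  console.log(users);
}

example().catch(console.error);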

Node.js connect to AWS ElastiCache replication group

I have a Redis replication group with 1 master and 2 slave nodes. The slave nodes are read-only. I am using node_redis to connect to the Redis endpoint.
Now I want my application to connect only to slave nodes for any read query, and only write queries should go to the master node. Do I have to make any changes in my application, or can I connect to the master node and ElastiCache will automatically redirect read queries to the slave nodes?
Point the read queries to the endpoint of a slave node if they are used for non-critical purposes.
Another point to note is that data on a slave node may be stale.
Keep in mind that the primary node can also be used for reads.
ElastiCache does the load balancing of read queries.
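A minimal sketch of that split with node_redis (v3-style createClient; the primary and replica endpoints below are placeholders for your ElastiCache endpoints):

const redis = require('redis');
const { promisify } = require('util');

// Placeholder endpoints; replace with your replication group's
// primary endpoint and a replica (or reader) endpoint.
const writeClient = redis.createClient({
  host: 'my-primary-endpoint.cache.amazonaws.com',
  port: 6379,
});
const readClient = redis.createClient({
  host: 'my-replica-endpoint.cache.amazonaws.com',
  port: 6379,
});

const setAsync = promisify(writeClient.set).bind(writeClient);
const getAsync = promisify(readClient.get).bind(readClient);

async function example() {
  await setAsync('session:42', 'active');      // writes always go to the primary
  const value = await getAsync('session:42');  // reads come from the replica (may be stale)
  console.log(value);
}

example().catch(console.error);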

Connecting to both master and slave in a replicated Redis cluster

I'm setting up a simple 1 master - N slaves Redis cluster (low write rate, high read count). How to set this up is well documented on the Redis website; however, there is no information (or I missed it) about how the clients (Node.js servers in my case) handle the cluster. Do my servers need to have two Redis connections open: one to the master (for writes) and one to a slave load balancer (for reads)? Does the Redis driver handle this automatically and send reads to slaves and writes to the master?
The only approach I found was using the thunk-redis library. This library supports connecting to a Redis master-slave setup without having a cluster configured or using a sentinel.
You simply add multiple IP addresses to the client:
const client = redis.createClient(['127.0.0.1:6379', '127.0.0.1:6380'], {onlyMaster: false});
You don't need to connect to one particular instance specifically; every instance in a Redis cluster has information about the whole cluster. So even if you connect to one master, your client can connect to any instance in the cluster, and if you try to update a key present on a different master (other than the one you connected to), the Redis client takes care of it by following the redirection provided by the server.
To answer your second question, you can enable reads from a slave with the READONLY command.
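If you are open to a different client, ioredis's cluster mode can do this routing for you: its scaleReads option sends read commands to replicas (issuing READONLY on those connections under the hood). A rough sketch, with the node address as a placeholder:

const Redis = require('ioredis');

// The startup node address is a placeholder; any reachable cluster node will do,
// since the client discovers the rest of the topology from it.
const cluster = new Redis.Cluster(
  [{ host: '127.0.0.1', port: 7000 }],
  { scaleReads: 'slave' } // route read-only commands to replicas, writes to masters
);

async function example() {
  await cluster.set('counter', 1);            // routed to the master owning the slot
  const value = await cluster.get('counter'); // routed to a replica of that master
  console.log(value);
}

example().catch(console.error);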

Redis Cluster in multiple threads

I'm currently using Redis in cluster mode with 3 master instances. I'm using Jedis (the Java client) in a listening server; for every piece of data received, I create a new thread, and the thread then makes an update in Redis.
My question is: how can I use the Redis Cluster instance from multiple threads with a pool configuration?
JedisCluster is thread-safe.
It contains a JedisPool for each node internally, so you don't need to worry about sharing a JedisCluster instance across multiple threads.
