Connection pooling in elasticsearch - node.js

How can we configure connection pooling for Elasticsearch in Node.js, in order to handle instance failures and detect dead nodes?
How can I customize the Transport, ConnectionPool, and Connection classes of the Elasticsearch client in Node.js?

This feature is supported by the new RC1 client.
Here is an example:
'use strict'

// docker run -p 9200:9200 -p 9300:9300 --rm -e "discovery.type=single-node" docker.elastic.co/elasticsearch/elasticsearch:6.7.0
const { Client } = require('@elastic/elasticsearch')

const client = new Client({
  nodes: ['http://127.0.0.1:9200/'],
  requestTimeout: 2000,
  sniffInterval: 500,
  sniffOnStart: true,
  sniffOnConnectionFault: true
})

client.on('sniff', (err, req) => {
  console.log('sniff', err ? err.message : '', `${JSON.stringify(req.meta.sniff)}`)
})

setInterval(async () => {
  try {
    const info = await client.info()
    console.log(info.body.name)
  } catch (err) {
    console.log(err.message)
  }
}, 1500)
Note: install the v6 RC client:
"@elastic/elasticsearch": "6.7.0-rc.1",

Related

AWS Redis Cluster MOVED Error using redis node library

I have created a Redis MemoryDB cluster with 2 nodes in AWS.
I connect to it using redis node library v4.0.0 like this:
import { createCluster } from 'redis';

(async () => {
  const REDIS_USERNAME = 'test-username';
  const REDIS_PASSWORD = 'test-pass';

  const cluster = createCluster({
    rootNodes: [
      {
        url: `rediss://node1.amazonaws.com:6379`,
      },
      {
        url: `rediss://node2.amazonaws.com:6379`,
      },
    ],
    defaults: {
      url: `rediss://cluster.amazonaws.com:6379`,
      username: REDIS_USERNAME,
      password: REDIS_PASSWORD,
    }
  });

  cluster.on('error', (err) => console.log('Redis Cluster Error', err));

  await cluster.connect();
  console.log('connected to cluster...');

  await cluster.set('key', 'value');
  const value = await cluster.get('key');
  console.log('Value', value);

  await cluster.disconnect();
})();
But sometimes I get the error ReplyError: MOVED 12539 rediss://node2.amazonaws.com:6379 and I cannot get the value from the key.
Do you have any idea if there is something wrong with the configuration of the cluster or with the code using redis node library?
Edit:
I tried it with the ioredis library and it works, so something is wrong with the redis library.
Node.js Version: 16
Redis Server Version: 6
I created an issue in the redis library, so it's going to be fixed soon by this PR.
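For reference, here is a minimal sketch of what the equivalent ioredis cluster connection might look like; the endpoints and credentials are the placeholders from the question, the dnsLookup override is a common workaround for TLS clusters behind AWS DNS rather than something confirmed by the asker, and the username option needs a reasonably recent ioredis.
const Redis = require('ioredis');

// Sketch only: endpoints and credentials are the placeholders from the question.
const cluster = new Redis.Cluster(
  [{ host: 'node1.amazonaws.com', port: 6379 }],
  {
    // resolve node addresses as returned; a common workaround for TLS clusters on AWS
    dnsLookup: (address, callback) => callback(null, address),
    redisOptions: {
      tls: {},
      username: 'test-username',
      password: 'test-pass',
    },
  }
);

cluster.on('error', (err) => console.log('Redis Cluster Error', err));

cluster.set('key', 'value')
  .then(() => cluster.get('key'))
  .then((value) => console.log('Value', value))
  .catch((err) => console.error(err))
  .finally(() => cluster.disconnect());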

Failed to connect ElastiCache from NodeJS server on Elastic Beanstalk

We have a Node.js server with Express on AWS Elastic Beanstalk, and we are trying to connect it to ElastiCache (clustered Redis), but we get the error Redis Client Connection Error ClusterAllFailedError: Failed to refresh slots cache. The error seems very common, as a lot of people are facing the same bug. To connect to ElastiCache we are using an npm module named ioredis.
A lot of people recommend using the same VPC and security group for both ElastiCache and Elastic Beanstalk. We are already using the same VPC, and on Elastic Beanstalk we are using two security groups, one of which matches the security group of ElastiCache. For the default VPC we have enabled All Traffic for the inbound and outbound rules, but we are still facing the same bug.
To connect to ElastiCache from the Node.js server I am using the following code:
const Redis = require("ioredis");

exports.connect = () => {
  const client = new Redis.Cluster(
    ["xxxxx.xxxxx.clustercfg.use1.cache.amazonaws.com:6379"],
    {
      slotsRefreshTimeout: 10000,
      dnsLookup: (address, callback) => callback(null, address),
      redisOptions: {
        showFriendlyErrorStack: true,
        tls: {
          checkServerIdentity: (/*host, cert*/) => {
            // skip certificate hostname validation
            return undefined;
          },
        },
      },
    }
  );

  client.on("ready", () => {
    console.log("Redis Client Ready");
  });
  client.on("connect", () => {
    console.log("Redis Client Connected");
  });
  client.on("error", (error) => {
    console.log("Redis Client Connection Error", error);
  });
  client.on("reconnecting", () => {
    console.log("Redis Client Reconnecting");
  });
  client.on("end", () => {
    console.log("Redis Client Connection ended");
  });

  return client;
};
(Screenshots in the original post: ElastiCache configuration, default VPC security group inbound and outbound rules, Elastic Beanstalk security group, and the error output from Elastic Beanstalk.)
Versions:
Node.js running on 64bit Amazon Linux with platform version 4.15.1
NodeJS version: 12.18.3
ioredis version: 4.17.3
npm version: 6.14.6
express version: 4.17.1
UPDATE: I am able to access ElastiCache from Elastic Beanstalk if I ssh in and use redis-cli, but I am unable to access it using ioredis in Node.js running on Elastic Beanstalk.
I have a similar setup and eventually got it working. A few key points:
Elastic Beanstalk and ElastiCache have to be in the same VPC
ElastiCache's security group should have an inbound rule to allow traffic from Elastic Beanstalk
Here's the code to connect:
import { RedisPubSub } from 'graphql-redis-subscriptions';
import Redis from 'ioredis';
import config from '../../config/env';

const options = {
  // AWS host will look like this: somecache-dev-ro.k6sjdj.ng.0001.use1.cache.amazonaws.com
  host: config.redis.host || 'localhost',
  port: config.redis.port || 6379,
  retryStrategy: (times: number): number => {
    // reconnect after
    return Math.min(times * 50, 2000);
  },
};

export const pubsub = new RedisPubSub({
  publisher: new Redis(options),
  subscriber: new Redis(options),
});
I was debugging a similar issue. To access redis, I had to add tls: {} to the ioredis options:
{
  host: process.env.REDIS_HOST,
  port: process.env.REDIS_PORT,
  password: process.env.REDIS_PASSWORD,
  tls: {}
}
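For context, a minimal sketch of those options plugged into a standalone ioredis client (the environment variable names are the ones from the snippet above):
const Redis = require("ioredis");

// Sketch: passing the options above to a standalone ioredis client.
const client = new Redis({
  host: process.env.REDIS_HOST,
  port: Number(process.env.REDIS_PORT),
  password: process.env.REDIS_PASSWORD,
  tls: {}, // enable TLS, needed for ElastiCache with in-transit encryption
});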
You can simply create the connection like this:
const Redis = require("ioredis");

const client = new Redis(
  6379,
  "Configuration Endpoint (xxx.xxxx.xxxcache.amazonaws.com)"
);

client.on("ready", () => {
  console.log("Redis Client Ready");
});
client.on("connect", () => {
  console.log("Redis Client Connected");
});
client.on("error", (error) => {
  console.log("Redis Client Connection Error", error);
});

Connection to postgresql db from node js

I'm trying to make a connection from my Node.js script to my database, but it seems there is a suspicious issue I'm not able to figure out.
At the moment, this is my code:
const { Pool } = require('pg');

const pool = new Pool({
  user: 'user',
  host: '192.168.1.xxx',
  database: 'database',
  password: 'password',
  port: 5432,
});

pool.on('error', (err, client) => {
  console.error('Error:', err);
});

const query = `SELECT * FROM users`;

pool.connect()
  .then((client) => {
    client.query(query)
      .then(res => {
        for (let row of res.rows) {
          console.log(row);
        }
      })
      .catch(err => {
        console.error(err);
      });
  })
  .catch(err => {
    console.error(err);
  });
The issue seems to be in pool.connect(), but I can't understand what I'm missing because I get no errors in the log. I've installed the pg module in the directory of my project with npm install --prefix pg, and I know the modules are loaded correctly.
I edited postgresql.conf:
# - Connection Settings -
listen_addresses = '*'
and pg_hba.conf
host database user 192.168.1.0/24 md5
to make the database reachable via LAN, and it seems to work, because I'm able to connect successfully with apps like DBeaver... but I can't with Node.js.
Is there some kind of configuration I have to activate?
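One way to see where it stalls is to fail fast and log each step; here is a minimal debugging sketch, assuming the same credentials as above (connectionTimeoutMillis is a node-postgres Pool option that turns a silent hang into an error):
const { Pool } = require('pg');

const pool = new Pool({
  user: 'user',
  host: '192.168.1.xxx',
  database: 'database',
  password: 'password',
  port: 5432,
  connectionTimeoutMillis: 5000, // error out instead of waiting forever
});

(async () => {
  let client;
  try {
    client = await pool.connect();
    console.log('connected');
    const res = await client.query('SELECT * FROM users');
    for (const row of res.rows) {
      console.log(row);
    }
  } catch (err) {
    console.error('connection or query failed:', err);
  } finally {
    if (client) client.release(); // return the client to the pool
    await pool.end();
  }
})();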

elasticsearch is not able to establish connection on ubuntu machine

I am trying to run Elasticsearch with my Express/Node.js app on Ubuntu 17.10, but somehow it gives the error shown below:
Error: Request error, retrying
HEAD http://localhost:9200/ => connect ECONNREFUSED 127.0.0.1:9200
My code for that is as shown below:
const connectElasticSearchClient = () => {
  client = new elasticsearch.Client({
    hosts: ['localhost:9200'],
    maxRetries: 10,
    keepAlive: true,
    maxSockets: 10,
    minSockets: 10
  });

  client.ping({
    requestTimeout: 300000,
  }, function (error) {
    if (error) {
      console.error('elasticsearch cluster is down!' + error);
    } else {
      console.log('Everything is ok');
    }
  });
};
I am trying to run it on node version 8.11.1. The version of elasticsearch npm package is 13.3.1.
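Since ECONNREFUSED means nothing accepted the TCP connection on 127.0.0.1:9200, it can help to probe the port independently of the elasticsearch client first; a minimal sketch using Node's built-in http module (the host and port are the ones from the error above):
const http = require('http');

// Probe the Elasticsearch HTTP port directly; if this also fails with
// ECONNREFUSED, the service is not running or not listening on port 9200.
http.get('http://127.0.0.1:9200/', (res) => {
  console.log('Elasticsearch responded with status', res.statusCode);
  res.resume();
}).on('error', (err) => {
  console.error('Probe failed:', err.message);
});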

Node-Redis: ready check failed - NOAUTH Authentication required

I am seeing some strange Redis behavior:
const redis = require('redis');

const { REDIS_URL: redisUrl, REDIS_PASSWORD: redisPassword } = process.env;

const client = redis.createClient(redisUrl, {
  no_ready_check: true,
  auth_pass: redisPassword
});

client.on('connect', () => {
  redisPassword && client.auth(redisPassword);
});

client.on('error', err => {
  global.console.log(err.message)
});
But every time I receive the following error:
throw er; // Unhandled 'error' event
ReplyError: Ready check failed: NOAUTH Authentication required.
Why unhandled? I set an error handler.
Why did the ready check fail? I disabled it in the options.
I'm not sure why your code throws this error, but I tried this code on my local machine and it works well.
const redis = require('redis');

const redisPassword = "password";

const client = redis.createClient({
  host: '127.0.0.1',
  no_ready_check: true,
  auth_pass: redisPassword,
});

client.on('connect', () => {
  global.console.log("connected");
});

client.on('error', err => {
  global.console.log(err.message)
});

client.set("foo", 'bar');
client.get("foo", function (err, reply) {
  global.console.log(reply.toString())
})
Running node client.js outputs:
connected
bar
NOAUTH Authentication required is raised when Redis processes a command and finds that the client has not authenticated, so it complains.
I guess the redisUrl you pass to createClient may have a problem; try to debug it, or switch to the approach in my code. Hopefully you can fix it.
And one more thing: the client.auth(redisPassword) call is not necessary, because if you set an auth_pass or password option, the redis client will automatically send the AUTH command to the server before any other command.
If you have the Redis URI saved as a string, you need to decompose it into an object. For ioredis you can use this function:
export function decomposeRedisUrl(url) {
  const [[, , password, host, port]] = [...(url.matchAll(/redis:\/\/(([^@]*)@)?(.*?):(\d*)/g))];
  return { password, host, port };
}
There are tests for this function:
it("redis url should be decomposed correctly with password", () => {
expect(decomposeRedisUrl("redis://pass#host.com:9183")).to.eql({
password: "pass",
host: "host.com",
port: "9183",
});
});
it("redis url should be decomposed correctly without password", () => {
expect(decomposeRedisUrl("redis://localhost:6379")).to.eql({
password: undefined,
host: "localhost",
port: "6379",
});
});
And usage:
import Redis from "ioredis";

async function getKeysFromRedisUrl(url) {
  const rc = new Redis(decomposeRedisUrl(url));
  const keys = await rc.keys("*");
  rc.disconnect();
  return keys;
}

describe("Redis can connect", () => {
  it("with cloud", async () => {
    expect(await getKeysFromRedisUrl("redis://pass@host.com:9183")).to.be.an("array");
  });
  it("with local redis instance", async () => {
    expect(await getKeysFromRedisUrl("redis://localhost:6379")).to.be.an("array");
  });
});
Note that the username is not handled by this function.
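As an aside, ioredis can also take the connection URI as a string directly, which sidesteps the decomposition entirely; a minimal sketch (assuming a standard redis://user:pass@host:port URI, with the username part requiring Redis 6 ACLs and a reasonably recent ioredis):
import Redis from "ioredis";

// Sketch: ioredis parses the connection string itself, including password,
// host, and port, so no manual decomposition is needed.
const rc = new Redis("redis://user:pass@host.com:9183");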
If you're using Docker to run Redis, check whether your docker-compose file has the command redis-server --requirepass redis.
Then check your .env file to make sure you're using that password.
That was the problem in my case, and I was able to fix it by adding the password to the .env file.
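For example, a minimal sketch of connecting with that password from Node, assuming a local container started with redis-server --requirepass redis and node_redis v3-style options as used above:
const redis = require('redis');

// Sketch: the password must match the value given to --requirepass,
// read here from the .env-provided environment variable.
const client = redis.createClient({
  host: '127.0.0.1',
  port: 6379,
  password: process.env.REDIS_PASSWORD,
});

client.on('connect', () => console.log('connected'));
client.on('error', (err) => console.log(err.message));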
