ERROR: [ioredis] Unhandled error event: Error: connect ETIMEDOUT - node.js

I am facing a Redis connection timeout issue in my Node.js application.
I have tried this code:
new Redis({
  connectTimeout: 10000
})
But it was of no use; it didn't fix the problem.
[ioredis] Unhandled error event: Error: connect ETIMEDOUT
at Socket.<anonymous> (/code/node_modules/ioredis/lib/redis.js:291:21)
at Object.onceWrapper (events.js:313:30)
at emitNone (events.js:106:13)
at Socket.emit (events.js:208:7)
at Socket._onTimeout (net.js:407:8)
at ontimeout (timers.js:475:11)
at tryOnTimeout (timers.js:310:5)
at Timer.listOnTimeout (timers.js:270:5)

When you say container, I'd assume you mean docker containers. If you're using the docker network, I believe you can always change the host to the name of your docker container within the network. If you're using docker-compose, here's an idea of how that may be done:
version: '3'
services:
  app:
    image: app_image
    depends_on:
      - redis
    networks:
      - app_network
  redis:
    image: redis
    networks:
      - app_network
networks:
  app_network: {}
So in your app, you'd then do
const redis = new Redis({
  host: 'redis'  // the compose service name; the host option takes a hostname, not a redis:// URL
})

Related

MongoDB & nodejs with Docker "MongoTimeoutError: Server selection timed out after 30000 ms"

I am using MongoDB with a Node.js backend. I have a problem, and the Docker logs for it are below.
{ MongoTimeoutError: Server selection timed out after 30000 ms
at Timeout.setTimeout [as _onTimeout] (/usr/src/app/node_modules/mongodb/lib/core/sdam/topology.js:897:9)
at ontimeout (timers.js:436:11)
at tryOnTimeout (timers.js:300:5)
at listOnTimeout (timers.js:263:5)
at Timer.processTimers (timers.js:223:10)
name: 'MongoTimeoutError',
reason:
{ Error: connect ECONNREFUSED 127.0.0.1:27017
at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1113:14)
name: 'MongoNetworkError',
errorLabels: [ 'TransientTransactionError' ],
[Symbol(mongoErrorContextSymbol)]: {} },
[Symbol(mongoErrorContextSymbol)]: {} }
(node:1) UnhandledPromiseRejectionWarning: MongoTimeoutError: Server selection timed out after 30000 ms
at Timeout.setTimeout [as _onTimeout] (/usr/src/app/node_modules/mongodb/lib/core/sdam/topology.js:897:9)
at ontimeout (timers.js:436:11)
at tryOnTimeout (timers.js:300:5)
at listOnTimeout (timers.js:263:5)
at Timer.processTimers (timers.js:223:10)
(node:1) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). (rejection id: 1)
(node:1) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.
This is my Dockerfile
#FROM node:10.12.0
FROM node:alpine
# Create app directory
WORKDIR /usr/src/app
# Install app dependencies
# A wildcard is used to ensure both package.json AND package-lock.json are copied
# where available (npm#5+)
COPY package.json ./
RUN npm install
# If you are building your code for production
# RUN npm install --only=production
# Bundle app source
COPY . .
EXPOSE 3000
CMD [ "node", "app.js" ]
This is docker-compose.yml
version: "2"
services:
app:
container_name: app_test
restart: always
build: .
ports:
- "3000:3000"
links:
- mongo_test
depends_on:
- mongo_test
networks:
- nodeapp_network
mongo_test:
container_name: mongo_test
image: mongo
volumes:
- ./data:/data/db
ports:
- "27017:27017"
networks:
- nodeapp_network
networks:
nodeapp_network:
driver: bridge
The following code connects to MongoDB with Mongoose in Node.js:
mongoose.connect('mongodb://mongo_test:27017/collectionName', {useNewUrlParser: true});
I get the same result if I use localhost, 127.0.0.1, or an IP address instead of mongo_test.
I have consulted several blogs and Stack Overflow posts.
What can I do?
(P.S. My English is not good, so please understand even if the wording is awkward. Thank you.)
There's nothing wrong with the current configuration; after a few retries, app.js connects to the database.
Here's what is happening:
docker-compose launches the MongoDB service, but its port is not yet open
docker-compose starts app_test, which tries to connect and fails (2 to 3 times)
After a few seconds, the MongoDB port is open on 27017
Since your config has restart: always, the Node.js app restarts and connects successfully to the database
To avoid these restarts, you can write it like this:
setTimeout(async function () {
  try {
    await mongoose.connect('mongodb://mongo_test:27017/collectionName',
      { useNewUrlParser: true }
    );
  } catch (error) {
    console.log('Error connecting - retrying in 30 seconds');
  }
}, 30000); // Wait for 30 seconds and try to connect again
Explanation:
connectTimeoutMS is a MongoDB connection option; it sets the number of milliseconds a socket stays inactive before closing during the connection phase of the driver, that is, when the application initiates a connection, when a replica set connects to new members, or when a replica set reconnects to members. The default value of 30000 milliseconds means the driver will wait up to 30 seconds for a response from a MongoDB server. link
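If you want the driver to give up faster than the 30-second default, these timeouts can be passed through the connection options. A minimal sketch, assuming the mongo_test hostname from the compose file above; note that serverSelectionTimeoutMS (the option behind the "Server selection timed out" message) may require useUnifiedTopology: true depending on your driver version:
const mongoose = require('mongoose');

mongoose.connect('mongodb://mongo_test:27017/collectionName', {
  useNewUrlParser: true,
  useUnifiedTopology: true,        // required for serverSelectionTimeoutMS on 3.x drivers
  connectTimeoutMS: 10000,         // socket must connect within 10 seconds
  serverSelectionTimeoutMS: 10000, // fail server selection after 10 seconds instead of 30
});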
Compose does not wait until your mongo_test container is “ready”.
The problem of waiting for a database to be ready is really just a subset of a much larger problem of distributed systems. In production, your database could become unavailable or move hosts at any time. Your application needs to be resilient to these types of failures. link
Using depends_on in your app_test service is still no guarantee that the connection between your services can be established; it only controls start order. (A healthcheck-based alternative is sketched after the solution below.)
Solution:
You can update your code to connect to MongoDB:
const connectWithRetry = () => {
  mongoose
    .connect('mongodb://mongo_test:27017/collectionName', { useNewUrlParser: true })
    .then(() => console.log('successfully connected to DB'))
    .catch((e) => {
      console.log(e);
      setTimeout(connectWithRetry, 5000);
    });
};
connectWithRetry();
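Alternatively, you can make Compose itself wait until Mongo answers before starting the app. This is a sketch, not part of the original answer: it assumes Compose file format 2.1+ (condition-style depends_on is not supported in the 3.x file format) and that the mongo shell is available in the official image:
version: "2.1"
services:
  app:
    build: .
    depends_on:
      mongo_test:
        condition: service_healthy   # wait for the healthcheck below to pass
  mongo_test:
    image: mongo
    healthcheck:
      # ping the server with the mongo shell shipped in the official image
      test: ["CMD-SHELL", "echo 'db.runCommand(\"ping\").ok' | mongo localhost:27017/test --quiet"]
      interval: 10s
      timeout: 5s
      retries: 5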

Docker: Unhandled Promise rejection when trying to access mongoDB using docker

So I was trying to deploy my very basic Node.js application using Docker on a remote Linux server (DigitalOcean droplet), following this tutorial. I'm pretty new to deployment-related stuff, so I'm surely making some basic error. Please bear with me for the long post.
The code for the project and the Docker/docker-compose config files I was using:
.env
MONGO_USERNAME=sammy
MONGO_PASSWORD=password
MONGO_DB=sharkinfo
PORT=8080
MONGO_PORT=27017
MONGO_HOSTNAME=127.0.0.1
Dockerfile
FROM node:10
# Create app directory
WORKDIR /usr/src/app
# Install app dependencies
# A wildcard is used to ensure both package.json AND package-lock.json are copied
# where available (npm#5+)
COPY package*.json ./
RUN npm install
# If you are building your code for production
# RUN npm ci --only=production
# Bundle app source
COPY . .
EXPOSE 8080
CMD [ "node", "src/index.js"]
docker-compose.yml
version: '3'
services:
  nodejs:
    build: .
    env_file: .env
    environment:
      - MONGO_USERNAME=$MONGO_USERNAME
      - MONGO_PASSWORD=$MONGO_PASSWORD
      - MONGO_HOSTNAME=myDB
      - MONGO_PORT=$MONGO_PORT
      - MONGO_DB=$MONGO_DB
    ports:
      - "80:8080"
    networks:
      - app-network
  myDB:
    image: mongo:4.1.8-xenial
    env_file: .env
    environment:
      - MONGO_INITDB_ROOT_USERNAME=$MONGO_USERNAME
      - MONGO_INITDB_ROOT_PASSWORD=$MONGO_PASSWORD
    volumes:
      - dbdata:/data/db
    networks:
      - app-network
networks:
  app-network:
    driver: bridge
volumes:
  dbdata:
mongoose.js
const mongoose = require('mongoose');
const {
  MONGO_USERNAME,
  MONGO_PASSWORD,
  MONGO_HOSTNAME,
  MONGO_PORT,
  MONGO_DB
} = process.env;
const url = `mongodb://${MONGO_USERNAME}:${MONGO_PASSWORD}@${MONGO_HOSTNAME}:${MONGO_PORT}/${MONGO_DB}?authSource=admin`;
mongoose.connect(url, {
  useNewUrlParser: true,
  useCreateIndex: true,
  useFindAndModify: false
});
Then I built and ran the Docker images using the commands (in order): docker-compose build, then docker-compose up -d.
Both the Node and MongoDB containers show up in the list of running Docker processes. However, when I do docker logs node-application, I get an error:
Server is up on port 8080
(node:1) UnhandledPromiseRejectionWarning: MongoNetworkError: failed to connect to server [mydb:27017] on first connect [MongoNetworkError: connect ECONNREFUSED 172.19.0.3:27017]
at Pool.<anonymous> (/usr/src/app/node_modules/mongodb-core/lib/topologies/server.js:564:11)
at Pool.emit (events.js:198:13)
at Connection.<anonymous> (/usr/src/app/node_modules/mongodb-core/lib/connection/pool.js:317:12)
at Object.onceWrapper (events.js:286:20)
at Connection.emit (events.js:198:13)
at Socket.<anonymous> (/usr/src/app/node_modules/mongodb-core/lib/connection/connection.js:246:50)
at Object.onceWrapper (events.js:286:20)
at Socket.emit (events.js:198:13)
at emitErrorNT (internal/streams/destroy.js:91:8)
at emitErrorAndCloseNT (internal/streams/destroy.js:59:3)
at process._tickCallback (internal/process/next_tick.js:63:19)
(node:1) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). (rejection id: 1)
(node:1) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.
I've tried various tweaks to my code with no success and it continues to give the same error. What could be the issue here?

How to connect to postgres inside docker compose with NodeJs?

I have installed Mainflux on a cloud server using Docker. The Postgres DB is also running in a Docker container. I need to connect to the Postgres DB from Node.js (programmatically). I found the "pg" module for connecting to a cloud Postgres DB, but I am unable to connect to the Postgres instance running in the Docker container. My code is below; please let me know how to connect to the "Docker Postgres".
const pg = require('pg');
const conStringPri = 'postgres://mainflux:mainflux@MYIP/mainflux-things-db';
const Client = pg.Client;
const client = new Client({ connectionString: conStringPri });
client.connect();
client.query('CREATE DATABASE DB_Name')
  .then(() => client.end());
I'm getting the error below:
(node:8084) UnhandledPromiseRejectionWarning: Error: connect ECONNREFUSED MYIP:5432
at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1191:14)
(node:8084) UnhandledPromiseRejectionWarning: Unhandled promise rejection.
This error originated either by throwing inside of an async function
without a catch block or by rejecting a promise which was not handled with
.catch() (rejection id: 1)
(node:8084) [DEP0018] DeprecationWarning: Unhandled promise rejections are
deprecated. In the future, promise rejections that are not handled will
terminate the Node.js process with a
non-zero exit code.
(node:8084) UnhandledPromiseRejectionWarning: Error: Connection
terminated unexpectedly
at Connection.con.once
(D:\postgres_Nodejs\node_modules\pg\lib\client.js:200:9)
at Object.onceWrapper (events.js:313:30)
at emitNone (events.js:106:13)
at Connection.emit (events.js:208:7)
at Socket.
(D:\postgres_Nodejs\node_modules\pg\lib\connection.js:76:10)
at emitOne (events.js:116:13)
at Socket.emit (events.js:211:7)
at TCP._handle.close [as _onclose] (net.js:561:12)
If you started Mainflux using the docker-compose configuration supplied here, https://github.com/mainflux/mainflux/blob/master/docker/docker-compose.yml, then your PostgreSQL container doesn't have its port exposed to the host. In order to connect to the database, you need to expose this port.
Here's an example of how the part of the docker-compose would look, with the things-db container having port 5432 (default PostgreSQL port) exposed
things-db:
  image: postgres:10.2-alpine
  container_name: mainflux-things-db
  restart: on-failure
  environment:
    POSTGRES_USER: mainflux
    POSTGRES_PASSWORD: mainflux
    POSTGRES_DB: things
  networks:
    - mainflux-base-net
  ports:
    - 5432:5432
So you will need to modify your docker-compose.yml.
Please note that the Mainflux docker compose has 2 PostgreSQL databases in 2 containers: things-db and users-db.
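Once the port is exposed, connecting from outside the container should work. A minimal sketch with pg, assuming the credentials and the things database from the compose snippet above, and keeping MYIP as the placeholder for the Docker host's address as in the question:
const { Client } = require('pg');

const client = new Client({
  // MYIP is a placeholder for the Docker host's address, as in the question
  connectionString: 'postgres://mainflux:mainflux@MYIP:5432/things',
});

client.connect()
  .then(() => client.query('SELECT version()'))   // simple query to verify the connection
  .then((res) => console.log(res.rows[0]))
  .catch((err) => console.error('connection failed:', err.message))
  .then(() => client.end());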

Connecting to Mongodb docker container from another docker container

I am trying to connect to a MongoDB instance running in a Docker container from another Docker container where my Node.js code is running.
So I run MongoDB docker using the following command:
docker run --name my-local-mongo -v mongo-data:/data/db -p 27017:27017 -d mongo
I can access it from the browser by typing 0.0.0.0:27017; however, when I try to connect from the Node.js code, I get the following error. My url variable is:
var url = "mongodb://0.0.0.0:27017/surveydb";
{ MongoNetworkError: failed to connect to server [0.0.0.0:27017] on first connect [MongoNetworkError: connect ECONNREFUSED 0.0.0.0:27017]
at Pool.<anonymous> (/usr/src/appg08/node_modules/mongodb-core/lib/topologies/server.js:564:11)
at Pool.emit (events.js:182:13)
at Connection.<anonymous> (/usr/src/appg08/node_modules/mongodb-core/lib/connection/pool.js:317:12)
at Object.onceWrapper (events.js:273:13)
at Connection.emit (events.js:182:13)
at Socket.<anonymous> (/usr/src/appg08/node_modules/mongodb-core/lib/connection/connection.js:246:50)
at Object.onceWrapper (events.js:273:13)
at Socket.emit (events.js:182:13)
at emitErrorNT (internal/streams/destroy.js:82:8)
at emitErrorAndCloseNT (internal/streams/destroy.js:50:3)
name: 'MongoNetworkError',
errorLabels: [ 'TransientTransactionError' ],
[Symbol(mongoErrorContextSymbol)]: {} }
Thanks.
var url = "mongodb://0.0.0.0:27017/surveydb";
That address only resolves inside your Node.js container itself, so you need to find out the IP address of the Mongo container, or use the gateway address of the container network:
var url = "mongodb://172.17.0.1:27017/surveydb";
or
var url = "mongodb://ipaddressofmongocontainer:27017/surveydb";

No living connections Error while Elasticsearch connections in nodejs

I am having this problem while connecting to Elasticsearch.
var elasticsearch = require('elasticsearch');
var client = new elasticsearch.Client({
  host: 'localhost:9200',
  log: 'trace'
});
Elasticsearch ERROR: 2016-07-19T19:09:26Z
Error: Request error, retrying -- connect ECONNREFUSED 127.0.0.1:9200
at Log.error (/root/git_build/FirstMoveChess/node_modules/elasticsearch/src/lib/log.js:225:56)
at checkRespForFailure (/root/git_build/FirstMoveChess/node_modules/elasticsearch/src/lib/transport.js:195:18)
at HttpConnector. (/root/git_build/FirstMoveChess/node_modules/elasticsearch/src/lib/connectors/http.js:154:7)
at ClientRequest.bound (/root/git_build/FirstMoveChess/node_modules/lodash-node/modern/internals/baseBind.js:56:17)
at emitOne (events.js:96:13)
at ClientRequest.emit (events.js:188:7)
at Socket.socketErrorListener (_http_client.js:308:9)
at emitOne (events.js:96:13)
at Socket.emit (events.js:188:7)
at emitErrorNT (net.js:1272:8)
at _combinedTickCallback (internal/process/next_tick.js:74:11)
at process._tickCallback (internal/process/next_tick.js:98:9)
Elasticsearch TRACE: 2016-07-19T19:09:27Z
-> HEAD http://localhost:9200/
I understand this question is quite old, but I wanted to share how you can solve this problem.
If you are using Elasticsearch locally
The first thing you must do is run Elasticsearch on your machine.
Error: Request error, retrying -- connect ECONNREFUSED 127.0.0.1:9200
The message above indicates that Elasticsearch is not running locally.
So, visit the link and follow the instructions.
Docker environment
It gets much trickier here.
First, follow the instruction here.
And in case you're using the Node.js Elasticsearch client, you have to specify the Elasticsearch host as 172.24.0.1.
If you use container_name or the private IP of the container in docker-compose.yml, it won't work.
In the case of a Docker container environment, after changing from http://localhost:9200 to http://ipaddress:9200 in docker-compose.yml,
please also change the following line in docker-compose.yml that is related to CORS.
Change this
- http.cors.allow-origin=/https?://localhost(:[0-9]+)?/
into this
- http.cors.allow-origin=*
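To verify connectivity once the host is set correctly, the legacy elasticsearch JavaScript client has a ping method; a minimal sketch:
var elasticsearch = require('elasticsearch');
var client = new elasticsearch.Client({ host: 'localhost:9200' }); // or your container host

// ping the cluster; an error here means the host/port is unreachable
client.ping({ requestTimeout: 3000 }, function (error) {
  if (error) {
    console.error('Elasticsearch cluster is down or unreachable!');
  } else {
    console.log('Elasticsearch is reachable');
  }
});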
