Node.js application not connecting to Redis in docker-compose

I have a Node.js application that connects to a Redis instance. I am using docker-compose for the setup and running docker-compose up. Here is my docker-compose.yml file:
# Specify docker-compose version.
version: '3'
# Define the services/containers to be run.
services:
  express:
    build: .
    container_name: node-app
    ports:
      - '8000:8000'
    depends_on:
      - redis-cache
  redis-cache:
    image: redis
    ports:
      - 6379:6379
My Dockerfile:
FROM node:12-alpine
WORKDIR /app
COPY . .
RUN npm ci
EXPOSE 8000
CMD [ “npm”, “start” ] // Also tried CMD [ “node”, “index.js” ]
I am getting the following error:
redis-cache_1 | 1:M 19 Jan 2021 04:23:49.411 * Ready to accept connections
node-app | sh: “start”: unknown operand
node-app exited with code 2
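The "unknown operand" message is almost certainly caused by the quote characters: the CMD line above uses curly (typographic) quotes, so Docker cannot parse it as the JSON exec form and falls back to running the line through sh. A sketch of the same line with plain ASCII quotes:
# JSON (exec) form requires plain ASCII double quotes
CMD ["npm", "start"]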
So I went inside the container and ran npm start manually. The Node.js app started successfully but gave the following error while connecting to Redis:
Error: Redis connection to redis-cache:6379 failed - getaddrinfo ENOTFOUND redis-cache
at GetAddrInfoReqWrap.onlookup [as oncomplete] (dns.js:66:26) {
errno: 'ENOTFOUND',
code: 'ENOTFOUND',
syscall: 'getaddrinfo',
hostname: 'redis-cache'
}
I also tried node index.js but still got the same error. This is how I connect to the Redis instance in my Node.js application:
const client = redis.createClient({host: 'redis-cache', port: 6379});
I have tried various answers on Stack Overflow as well as other sites, but none of them works for me. Please help!
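For what it's worth, the hostname redis-cache only resolves inside containers attached to the same Compose network, so running the app with a plain docker run (rather than through docker-compose) would produce exactly this ENOTFOUND. Below is a minimal sketch of the connection code with a retry strategy (node-redis v3 API, matching the createClient call above), so the app keeps retrying instead of crashing while Redis comes up:
const redis = require('redis');

// 'redis-cache' is the Compose service name and resolves via Docker's internal DNS
const client = redis.createClient({
  host: 'redis-cache',
  port: 6379,
  // Back off and retry (capped at 3s) instead of throwing if Redis is not ready yet
  retry_strategy: (options) => Math.min(options.attempt * 100, 3000),
});

client.on('error', (err) => console.error('Redis connection error:', err));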

Related

Node.js LoopBack framework, docker-compose.yml, MongoDB Connector - Error: MongoServerSelectionError: connect ECONNREFUSED

I can't connect MongoDB to my LoopBack application.
I can connect to the database with Mongo Express; I can create databases and collections. But it's running on localhost. From my app I need to connect with mongo:27017.
docker-compose.yml
version: "3.7"
services:
web:
build:
context: .
dockerfile: .docker/node/Dockerfile
volumes:
- .:/home/node/app
ports:
- 3000:3000
depends_on:
- mongo
links:
- mongo
mongo:
image: mongo:latest
restart: always
volumes:
- ./src/datasources/mongodb:/data/db
environment:
MONGO_INITDB_ROOT_USERNAME: root
MONGO_INITDB_ROOT_PASSWORD: example
ports:
- 27017:27017
mongo-express:
image: mongo-express
restart: always
ports:
- 8081:8081
environment:
ME_CONFIG_MONGODB_ADMINUSERNAME: root
ME_CONFIG_MONGODB_ADMINPASSWORD: example
Dockerfile
# Check out https://hub.docker.com/_/node to select a new base image
FROM node:10-slim
# Set to a non-root built-in user `node`
USER node
# Create app directory (with user `node`)
RUN mkdir -p /home/node/app
WORKDIR /home/node/app
# Install app dependencies
# A wildcard is used to ensure both package.json AND package-lock.json are copied
# where available (npm@5+)
COPY --chown=node package*.json ./
RUN npm install
# Bundle app source code
COPY --chown=node . /home/node/app
RUN npm run build
# Bind to all network interfaces so that it can be mapped to the host OS
ENV HOST=0.0.0.0 PORT=3000
EXPOSE ${PORT}
CMD [ "node", "." ]
LoopBack datasources - mongo-db.datasource.config.json
{
  "name": "MongoDB",
  "connector": "mongodb",
  "url": "",
  "host": "mongo",
  "port": 27017,
  "user": "root",
  "password": "sportee",
  "database": "sportee",
  "useNewUrlParser": true
}
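As an aside, the same datasource can also be expressed as a single connection URL instead of separate host/port/user fields; a hedged sketch, assuming the root credentials from the compose file above and that the root user authenticates against the admin database:
{
  "name": "MongoDB",
  "connector": "mongodb",
  "url": "mongodb://root:example@mongo:27017/sportee?authSource=admin",
  "useNewUrlParser": true
}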
docker error
MongoServerSelectionError: connect ECONNREFUSED 127.0.0.1:27017
at Timeout.waitQueueMember.timer.setTimeout [as _onTimeout] (/home/node/app/node_modules/mongodb/lib/core/sdam/topology.js:430:30)
at ontimeout (timers.js:436:11)
at tryOnTimeout (timers.js:300:5)
at listOnTimeout (timers.js:263:5)
at Timer.processTimers (timers.js:223:10)
Emitted 'error' event at:
at MongoDbDataSource.postInit (/home/node/app/node_modules/loopback-datasource-juggler/lib/datasource.js:502:16)
at onError (/home/node/app/node_modules/loopback-connector-mongodb/lib/mongodb.js:316:21)
at /home/node/app/node_modules/loopback-connector-mongodb/lib/mongodb.js:324:9
at /home/node/app/node_modules/mongodb/lib/utils.js:722:9
at err (/home/node/app/node_modules/mongodb/lib/mongo_client.js:216:23)
at connectCallback (/home/node/app/node_modules/mongodb/lib/operations/connect.js:350:5)
at topology.connect.err (/home/node/app/node_modules/mongodb/lib/operations/connect.js:583:14)
at Object.selectServer.err [as callback] (/home/node/app/node_modules/mongodb/lib/core/sdam/topology.js:285:11)
at Timeout.waitQueueMember.timer.setTimeout [as _onTimeout] (/home/node/app/node_modules/mongodb/lib/core/sdam/topology.js:435:25)
[... lines matching original stack trace ...]
Unhandled error in GET /users: 500 Error: Timeout in connecting after 5000 ms
at Timeout._onTimeout (/home/node/app/node_modules/loopback-datasource-juggler/lib/datasource.js:2640:10)
at /home/node/app/node_modules/loopback-datasource-juggler/lib/datasource.js:343:12
Can someone help please? :)
UPDATE:
I resolved my problem by adding networks to my docker-compose.yml:
networks:
  app-tier:
    driver: bridge
  me-tier:
    driver: bridge
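Note that the top-level networks block above only declares the networks; each service also has to list the network(s) it should join. A sketch of how that could look for the compose file above (the exact assignment per service is an assumption):
services:
  web:
    networks:
      - app-tier
  mongo:
    networks:
      - app-tier
      - me-tier
  mongo-express:
    networks:
      - me-tier
networks:
  app-tier:
    driver: bridge
  me-tier:
    driver: bridge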

DNS in Docker container not working with Vue.js and Node: Error: getaddrinfo ENOTFOUND db

I created a Vue.js app which I am now trying to dockerize.
As backend I'm using Node.js for API calls, which uses MySQL as the database.
So I'm trying to create a ui container, a db container, and also a container for nginx.
When trying to build these via docker-compose up --build, I get the following error.
Does anybody know what the problem is here?
$ vue-cli-service build
- Building for production...
events.js:187
throw er; // Unhandled 'error' event
^
Error: getaddrinfo ENOTFOUND db
at GetAddrInfoReqWrap.onlookup [as oncomplete] (dns.js:60:26)
--------------------
at Protocol._enqueue (/app/node_modules/mysql/lib/protocol/Protocol.js:144:48)
at Protocol.handshake (/app/node_modules/mysql/lib/protocol/Protocol.js:51:23)
at Connection.connect (/app/node_modules/mysql/lib/Connection.js:116:18)
[...]
at processTicksAndRejections (internal/process/task_queues.js:80:21) {
errno: 'ENOTFOUND',
code: 'ENOTFOUND',
syscall: 'getaddrinfo',
hostname: 'db',
fatal: true
}
error Command failed with exit code 1.
info Visit https://yarnpkg.com/en/docs/cli/run for documentation about this command.
ERROR: Service 'ui' failed to build: The command '/bin/sh -c yarn run build' returned a non-zero code: 1
This is my docker-compose.yml
version: '3'
networks:
  app-tier:
services:
  db:
    image: mysql:5.7
    volumes:
      - ./sql:/docker-entrypoint-initdb.d
    restart: always
    networks:
      - app-tier
    environment:
      - MYSQL_USER=user
      [...]
    ports:
      - 3306:3306
  ui:
    build: ./ui
    container_name: ui
    restart: always
    networks:
      - app-tier
    expose:
      - 80
    ports:
      - 3000:3000
    [...]
This is ./ui/Dockerfile:
# build stage
FROM node:12.14.0-alpine as build-stage
[...]
# RUN yarn server
RUN yarn run build
# Production stage
FROM nginx:1.16.1-alpine as production-stage
COPY --from=build-stage /app/dist /usr/share/nginx/html
[...]
This is ./ui/src/server/db.js:
const mysql = require('mysql');
const connection = mysql.createConnection({
  host: 'db',
  user: 'root',
  password: 'password',
  database: 'dbname',
  port: '3306'
});
connection.connect();
module.exports = connection;
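Note that the stack trace above is emitted during RUN yarn run build, i.e. while the image is being built, and the build container is not attached to the Compose network, so the hostname db cannot resolve there. One way around this, assuming the frontend build does not actually need a live database, is to open the connection lazily at runtime instead of at module load; a sketch (the DB_HOST variable is a hypothetical override, not from the original code):
const mysql = require('mysql');

let connection;

// Create (and cache) the connection on first use instead of at import time,
// so merely requiring this module during the build does not try to reach 'db'.
function getConnection() {
  if (!connection) {
    connection = mysql.createConnection({
      host: process.env.DB_HOST || 'db', // DB_HOST is a hypothetical env override
      user: 'root',
      password: 'password',
      database: 'dbname',
      port: 3306,
    });
    connection.connect();
  }
  return connection;
}

module.exports = { getConnection };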

Postgres ECONNREFUSED on Docker Compose with NodeJS [duplicate]

This question already has answers here:
ECONNREFUSED for Postgres on nodeJS with dockers
(7 answers)
Closed 2 years ago.
I get ECONNREFUSED when trying to connect to a Postgres server in Docker from a Node.js app in Docker when running both via docker-compose. However, I can connect from my host machine. Here is my docker-compose.yml:
version: "2.4"
services:
api:
build:
context: .
target: dev
depends_on:
- postgres
ports:
- "8080:8080"
- "9229:9229"
networks:
- backend
environment:
- NODE_ENV=development
- PGHOST=postgres
- PGPASSWORD=12345678
- PGUSER=test
- PGDATABASE=test
- PGPORT=5433
volumes:
- .:/node/app
- /node/app/node_modules # Use empty volume to hide the node_modules from the host os
postgres:
image: postgres:11
restart: always
ports:
- "5433:5432"
networks:
- backend
volumes:
- db-data:/var/lib/postgresql/data
environment:
POSTGRES_PASSWORD: 12345678
POSTGRES_USER: test
POSTGRES_DB: test
networks:
backend:
volumes:
db-data:
The Node.js code:
const client = new Client({
  user: process.env.PGUSER,
  host: process.env.PGHOST,
  database: process.env.PGDATABASE,
  password: process.env.PGPASSWORD,
  port: Number(process.env.PGPORT),
});
client.connect();
The error:
{ Error: connect ECONNREFUSED 172.22.0.2:5433
api_1 | at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1106:14)
api_1 | errno: 'ECONNREFUSED',
api_1 | code: 'ECONNREFUSED',
api_1 | syscall: 'connect',
api_1 | address: '172.22.0.2',
api_1 | port: 5433 }
At the same time I can connect from the host OS to the database server without any problems. Is there any problem with networking?
Edit: The DB server is ready to accept connections before the Node.js app tries to connect (I also tried retrying the connection from within the Node.js app).
No, there is nothing wrong with networking. It's just that you're connecting on the wrong port.
Inside the Compose network, your postgres container exposes port 5432, so it only accepts requests via that port inside the Compose network. You just need to change PGPORT=5433 to PGPORT=5432.
The reason you can access it from your host OS is that docker-compose mapped the ports 5433:5432, so all requests to 5433 from outside (the host OS) are passed to 5432 inside your Compose network.
Hope that is clear enough for you to solve the issue.
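In other words, only the PGPORT value inside the api service needs to change; the host-side mapping can stay as it is. A sketch of the relevant fragment:
services:
  api:
    environment:
      - PGHOST=postgres
      - PGPORT=5432   # talk to the container port, not the host-mapped one
  postgres:
    ports:
      - "5433:5432"   # host 5433 -> container 5432, only relevant from the host OS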

ECONNREFUSED when trying to connect to redis via node using docker-compose

I'm trying to switch over to Docker, but for some reason my Node.js application won't connect to Redis running in Docker.
This is my docker compose:
version: "3"
services:
chatty-backend:
container_name: chatty-backend
build: ./
volumes:
- ./:/usr/src/chatty-backend
command: npm start
working_dir: /usr/src/chatty-backend
ports:
- "5000:5000"
links:
- redis
- mongo
depends_on:
- redis
- mongo
mongo:
image: mongo:bionic
ports:
- "27017:27017"
redis:
image: redis
ports:
- "6379:6379"
command:
redis-server
This is where I'm connecting to redis:
import * as Redis from 'ioredis';
const redis = new Redis({ host: 'redis', port: 6379 });
export default redis;
I also tried new Redis('redis://redis:6379').
I'm getting this error currently:
Error: connect ECONNREFUSED 127.0.0.1:6379
at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1056:14) {
errno: 'ECONNREFUSED',
code: 'ECONNREFUSED',
syscall: 'connect',
address: '127.0.0.1',
port: 6379
}
The issue was caused by graphql-redis-subscriptions.
I forgot to pass it the updated config.
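For anyone hitting the same thing: graphql-redis-subscriptions opens its own Redis connections, so the host option has to be passed to it as well, otherwise it falls back to 127.0.0.1. A rough sketch of what that could look like (the exact pub/sub setup in the app is an assumption):
import * as Redis from 'ioredis';
import { RedisPubSub } from 'graphql-redis-subscriptions';

// Reuse the same options so the pub/sub clients also target the 'redis' service
// instead of the default 127.0.0.1
const options = { host: 'redis', port: 6379 };

export const pubsub = new RedisPubSub({
  publisher: new Redis(options),
  subscriber: new Redis(options),
});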

ECONNREFUSED error in docker-compose with NodeJS and postgresql in google cloud

I have created my React app with Node.js and PostgreSQL and deployed it in Google Cloud. I created Docker images of Postgres and Node.js and uploaded the images to Docker Hub. From gcloud I am accessing those images.
This is my docker-compose-production.yml file.
version: '2.0'
services:
  postgres:
    image: mycompany/myapp:pglatest
    restart: always
    volumes:
      - ./backdata/databackups:/var/lib/postgresql/backdata
    ports:
      - "5433:5432"
  backend:
    image: mycompany/myapp:nodelatest7
    command: npm run start
    ports:
      - "5001:5000"
    depends_on:
      - postgres
    environment:
      POSTGRES_URL: postgresql://postgres:root@postgres:5432/db_mydb
      DEBUG: hms-backend:*
When I run the command
sudo docker-compose -f docker-compose-production.yml up --build -d
two images are created.
After that I run the following command to tail the logs:
sudo docker-compose -f docker-compose-production.yml logs -t backend
I'm getting this error:
backend_1 | 2018-09-15T09:12:18.926001351Z REST API listening on port 5000
backend_1 | 2018-09-15T09:12:18.937246598Z error { Error: connect ECONNREFUSED 192.168.80.2:5432
backend_1 | 2018-09-15T09:12:18.937268668Z at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1191:14)
backend_1 | 2018-09-15T09:12:18.937280934Z errno: 'ECONNREFUSED',
backend_1 | 2018-09-15T09:12:18.937283960Z code: 'ECONNREFUSED',
backend_1 | 2018-09-15T09:12:18.937286817Z syscall: 'connect',
backend_1 | 2018-09-15T09:12:18.937289488Z address: '192.168.80.2',
backend_1 | 2018-09-15T09:12:18.937292260Z port: 5432 }
How do I solve this problem?
For me your Postgres URL is wrong: postgresql://postgres:root@postgres:5432/db_mydb
It should be postgresql://postgres:root@postgres:5433/db_mydb, since the Postgres "exposed" port is 5433.
Hmm, but I think you should add "container_name" in the docker-compose:
services:
  postgres:
    container_name: my_postgres
and use this name as the "address" of your Postgres:
postgresql://my_postgres:root@postgres:5433/db_mydb
