Docker container connection to host's Kafka throws: Error: connect ECONNREFUSED 127.0.0.1:9092 - node.js

I have a Node.js app, containerized as a Linux container, which uses the kafka-node library.
Kafka runs on the host machine, which runs Windows, with:
Zookeeper port: 2181
Kafka broker port: 9092
I run the Node.js container with the following command:
docker container run --network host --name nm name:1.0
In order to connect to the host's Kafka, I am using the following:
client = new kafka.KafkaClient({kafkaHost: "localhost:9092"});
But this throws:
Error: connect ECONNREFUSED 127.0.0.1:9092
at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1126:14) {
errno: 'ECONNREFUSED',
code: 'ECONNREFUSED',
syscall: 'connect',
address: '127.0.0.1',
port: 9092
}
When I change the connection code to:
client = new kafka.KafkaClient({kafkaHost: "host.docker.internal:9092"});
I am getting:
TimeoutError: Request timed out after 30000ms
at new TimeoutError (/usr/src/app/node_modules/kafka-node/lib/errors/TimeoutError.js:6:9)
at Timeout._onTimeout (/usr/src/app/node_modules/kafka-node/lib/kafkaClient.js:491:14)
at listOnTimeout (internal/timers.js:531:17)
at processTimers (internal/timers.js:475:7) {
message: 'Request timed out after 30000ms'
}
Can someone advise what I am doing wrong?
UPDATE
When switching to a Linux host machine, the localhost approach above works just fine.

Same problem here, but I was able to solve it.
You must set the KAFKA_ADVERTISED_HOST_NAME environment variable on the Kafka broker to the hostname your clients connect with (for example host.docker.internal).
My example:
version: '3'
services:
  zookeeper:
    image: wurstmeister/zookeeper
  kafka:
    image: wurstmeister/kafka
    ports:
      - "9092:9092"
    hostname: 'kafka-internal.io'
    environment:
      KAFKA_ADVERTISED_HOST_NAME: kafka-internal.io
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
Now I can connect from inside a container using kafka-internal.io:9092 :)
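If you would rather advertise host.docker.internal directly (as mentioned above) instead of a custom hostname, a minimal sketch of the same compose file might look like this. It assumes Docker Desktop, where host.docker.internal resolves to the host machine, and reuses the wurstmeister image's KAFKA_ADVERTISED_HOST_NAME variable from the example above:
version: '3'
services:
  zookeeper:
    image: wurstmeister/zookeeper
  kafka:
    image: wurstmeister/kafka
    ports:
      - "9092:9092"
    environment:
      # The broker hands this name back to clients, so it must be resolvable
      # and reachable from wherever the client runs.
      KAFKA_ADVERTISED_HOST_NAME: host.docker.internal
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
If, as in the original question, the broker runs natively on the Windows host rather than in a container, the equivalent change is to set advertised.listeners (for example to PLAINTEXT://host.docker.internal:9092) in the broker's server.properties.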

Not the best answer, but switching to a Linux host makes the first approach (localhost:9092 with --network host) work just fine, since on Linux the container shares the host's network namespace, whereas on Docker Desktop for Windows the containers run inside a VM.

Related

Can't access the webapp running in Docker with Docusaurus

I have built a Docker container with a Docusaurus server and opened the port, but the website is not accessible and throws an error like the one below. How do I make the Docusaurus server listen on all IP addresses?
yarn run v1.22.19
warning package.json: No license field
$ docusaurus-start
LiveReload server started on port 35729
Docusaurus server started on port 3000
Request failed: Error: connect ECONNREFUSED 127.0.0.1:80
at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1247:16) { errno: -111, code: 'ECONNREFUSED', syscall: 'connect',
address: '127.0.0.1', port: 80 }

Node.js application not connecting to Redis in docker-compose

I have a Node.js application which connects to a Redis instance. I am using docker-compose for the setup and running docker-compose up. Here is my docker-compose.yml file:
# Specify docker-compose version.
version: '3'
# Define the services/containers to be run.
services:
  express:
    build: .
    container_name: node-app
    ports:
      - '8000:8000'
    depends_on:
      - redis-cache
  redis-cache:
    image: redis
    ports:
      - 6379:6379
My Dockerfile:
FROM node:12-alpine
WORKDIR /app
COPY . .
RUN npm ci
EXPOSE 8000
CMD [ “npm”, “start” ] // Also tried CMD [ “node”, “index.js” ]
I am getting the following error:
redis-cache_1 | 1:M 19 Jan 2021 04:23:49.411 * Ready to accept connections
node-app | sh: “start”: unknown operand
node-app exited with code 2
So I went inside the container and manually ran npm start. The Node.js app started successfully but gave the following error while connecting to Redis:
Error: Redis connection to redis-cache:6379 failed - getaddrinfo ENOTFOUND redis-cache
at GetAddrInfoReqWrap.onlookup [as oncomplete] (dns.js:66:26) {
errno: 'ENOTFOUND',
code: 'ENOTFOUND',
syscall: 'getaddrinfo',
hostname: 'redis-cache'
}
I also tried node index.js but still got the same error. This is how I am connecting to my Redis instance in my Node.js application:
const client = redis.createClient({host: 'redis-cache', port: 6379});
I have tried various answers on Stack Overflow as well as various other sites, but none worked for me. Please help!

Access a database on a server from a localhost Node project

I have bought a VPS for my website and built a small project with Vue and Node.
It uses PostgreSQL, which works fine on localhost.
I want to switch my database from localhost to the server database, where I have installed Postgres and created a database and a table.
My db.js file looks like this:
const Pool = require('pg').Pool;
const pool = new Pool({
  user: "postgres",
  password: "root",
  database: "todo_database",
  host: "45.111.241.15",
  port: 5432
});
module.exports = pool;
Then I tried to send form data from the Vue page running on localhost, and it gave the following error:
error connecting db Error: connect ECONNREFUSED 45.111.241.15:5432
at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1141:16) {
errno: 'ECONNREFUSED',
code: 'ECONNREFUSED',
syscall: 'connect',
address: '45.111.241.15',
port: 5432
I do not have much knowledge about this or how to solve it.
Can you please guide me on what I should do to connect to my server DB?
I followed these steps and it worked:
Edit your /etc/postgresql/9.1/main/postgresql.conf and change the connection settings:
listen_addresses = '*' # what IP address(es) to listen on;
In /etc/postgresql/10/main/pg_hba.conf, under "IPv4 local connections", add:
host all all 0.0.0.0/0 md5
Now restart your DBMS:
sudo service postgresql restart
Now you can connect with:
psql -h hostname(IP) -p 5432 -U postgres

Postgres ECONNREFUSED on Docker Compose with NodeJS [duplicate]

This question already has answers here:
ECONNREFUSED for Postgres on nodeJS with dockers
(7 answers)
Closed 2 years ago.
I get ECONNREFUSED when trying to connect to a Postgres server in Docker from a Node.js app in Docker, with both running via docker-compose. However, I can connect from my host machine. Here is my docker-compose.yml:
version: "2.4"
services:
api:
build:
context: .
target: dev
depends_on:
- postgres
ports:
- "8080:8080"
- "9229:9229"
networks:
- backend
environment:
- NODE_ENV=development
- PGHOST=postgres
- PGPASSWORD=12345678
- PGUSER=test
- PGDATABASE=test
- PGPORT=5433
volumes:
- .:/node/app
- /node/app/node_modules # Use empty volume to hide the node_modules from the host os
postgres:
image: postgres:11
restart: always
ports:
- "5433:5432"
networks:
- backend
volumes:
- db-data:/var/lib/postgresql/data
environment:
POSTGRES_PASSWORD: 12345678
POSTGRES_USER: test
POSTGRES_DB: test
networks:
backend:
volumes:
db-data:
The Node.js code:
const client = new Client({
  user: process.env.PGUSER,
  host: process.env.PGHOST,
  database: process.env.PGDATABASE,
  password: process.env.PGPASSWORD,
  port: Number(process.env.PGPORT),
});
client.connect();
The error:
{ Error: connect ECONNREFUSED 172.22.0.2:5433
api_1 | at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1106:14)
api_1 | errno: 'ECONNREFUSED',
api_1 | code: 'ECONNREFUSED',
api_1 | syscall: 'connect',
api_1 | address: '172.22.0.2',
api_1 | port: 5433 }
At the same time, I can connect from the host OS to the database server without any problems. Is there some problem with the networking?
Edit: The DB server is ready to accept connections before the Node.js app tries to connect (I also tried retrying the connection from within the Node.js app).
No, there is nothing wrong with the networking; you're simply connecting on the wrong port.
Inside the compose network, your postgres container exposes port 5432, so it only accepts requests on that port from other containers on the network. You just need to change PGPORT=5433 to PGPORT=5432.
The reason you can connect from your host OS is that docker-compose mapped your ports 5433:5432, so all requests to 5433 from outside (the host OS) are forwarded to 5432 inside the compose network.
Hope that is clear enough for you to solve the issue.
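For reference, a minimal sketch of the relevant parts of that compose file with the change applied (only PGPORT changes; the 5433:5432 mapping stays so the host OS can still connect on 5433):
services:
  api:
    environment:
      - PGHOST=postgres   # the service name resolves inside the compose network
      - PGPORT=5432       # the container port, not the 5433 host mapping
  postgres:
    ports:
      - "5433:5432"       # host 5433 -> container 5432 (for access from the host OS)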

ECONNREFUSED error in docker-compose with NodeJS and postgresql in google cloud

I have created my React app with Node.js and PostgreSQL and deployed it in Google Cloud. I created Docker images for Postgres and Node.js and uploaded them to Docker Hub. From gcloud I am accessing those images.
This is my docker-compose-production.yml file.
version: '2.0'
services:
  postgres:
    image: mycompany/myapp:pglatest
    restart: always
    volumes:
      - ./backdata/databackups:/var/lib/postgresql/backdata
    ports:
      - "5433:5432"
  backend:
    image: mycompany/myapp:nodelatest7
    command: npm run start
    ports:
      - "5001:5000"
    depends_on:
      - postgres
    environment:
      POSTGRES_URL: postgresql://postgres:root#postgres:5432/db_mydb
      DEBUG: hms-backend:*
When I run the command
sudo docker-compose -f docker-compose-production.yml up --build -d
the two images are created. After that, I run the logs command
sudo docker-compose -f docker-compose-production.yml logs -t backend
and I'm getting this error:
backend_1 | 2018-09-15T09:12:18.926001351Z REST API listening on port 5000
backend_1 | 2018-09-15T09:12:18.937246598Z error { Error: connect ECONNREFUSED 192.168.80.2:5432
backend_1 | 2018-09-15T09:12:18.937268668Z at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1191:14)
backend_1 | 2018-09-15T09:12:18.937280934Z errno: 'ECONNREFUSED',
backend_1 | 2018-09-15T09:12:18.937283960Z code: 'ECONNREFUSED',
backend_1 | 2018-09-15T09:12:18.937286817Z syscall: 'connect',
backend_1 | 2018-09-15T09:12:18.937289488Z address: '192.168.80.2',
backend_1 | 2018-09-15T09:12:18.937292260Z port: 5432 }
How do I solve this problem?
To me, your Postgres URL looks wrong: postgresql://postgres:root#postgres:5432/db_mydb
It should be postgresql://postgres:root#postgres:5433/db_mydb, since the postgres "exposed" port is 5433.
Hmm, but I think you should also add a "container_name" in the docker-compose file:
services:
  postgres:
    container_name: my_postgres
and use this name as the "address" of your Postgres:
postgresql://postgres:root#my_postgres:5433/db_mydb
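For completeness, here is a minimal sketch of the question's compose file with the container_name suggestion applied. Note that, following the reasoning in the answer to the previous question above, a container on the same compose network reaches Postgres on the container port 5432, while the 5433 mapping only applies from outside. The URL below is an assumption based on the standard postgresql://user:password@host:port/db form:
services:
  postgres:
    container_name: my_postgres
    image: mycompany/myapp:pglatest
    ports:
      - "5433:5432"        # host 5433 -> container 5432
  backend:
    image: mycompany/myapp:nodelatest7
    depends_on:
      - postgres
    environment:
      # my_postgres (or the service name "postgres") resolves inside the compose network
      POSTGRES_URL: postgresql://postgres:root@my_postgres:5432/db_mydb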
