Docker Postgres: connect ECONNREFUSED when running the application - node.js

I have a compose file that looks as follows:
version: "3.9"
services:
postgresdb:
image: postgres:14.4-alpine
container_name: pgsql
restart: always
volumes:
- ./db/init.sql:/docker-entrypoint-initdb.d/0_init.sql
# - $HOME/database:/var/lib/postgresql/data
ports:
- "8081:5432"
expose:
- "8081"
environment:
POSTGRES_DB: todos
POSTGRES_USER: admin
POSTGRES_PASSWORD: password
SERVICE_TAGS: dev
SERVICE_NAME: postgresdb
networks:
- internalnet
nodeapp:
container_name: server
build: .
image: nodeapp:1.0
ports:
- "3002:3002"
expose:
- "3002"
environment:
DB_HOST: postgresdb
DB_PORT: 8081
DB_USER: "admin"
DB_PASSWORD: "password"
DB_NAME: todos
DB_CONNECTION_LIMIT: 20
SERVICE_TAGS: dev
SERVICE_NAME: nodeapp
PORT: 3002
depends_on:
- postgresdb
networks:
- internalnet
networks:
internalnet:
driver: bridge
When I run the compose up -d command, here is what the ps command gives me:
CONTAINER ID   IMAGE                  COMMAND                  CREATED          STATUS         PORTS                              NAMES
53a12ab5a9ae   nodeapp:1.0            "docker-entrypoint.s…"   12 seconds ago   Up 5 seconds   0.0.0.0:3002->3002/tcp             server
98cda5442cb0   postgres:14.4-alpine   "docker-entrypoint.s…"   14 seconds ago   Up 8 seconds   8081/tcp, 0.0.0.0:8081->5432/tcp   pgsql
To me this looks okay. But when I try to send a request to the server, I get an error saying:
"connect ECONNREFUSED 192.*.*.*:8081"
I don't know why my application is failing to connect to Postgres, even though they are on the same network.
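For reference, inside a Compose network a container reaches another service by its service name and the container-internal port, not the published host port. A minimal connection sketch, assuming the app uses the node-postgres (pg) client (the question does not show the driver code):

// Sketch only: assumes the node-postgres (pg) client.
// Inside the Compose network the hostname is the service name ("postgresdb")
// and the port is the container port (5432), not the published host port (8081).
const { Pool } = require('pg');

const pool = new Pool({
  host: process.env.DB_HOST, // "postgresdb" from the compose file
  port: 5432,                // container-internal Postgres port
  user: process.env.DB_USER,
  password: process.env.DB_PASSWORD,
  database: process.env.DB_NAME,
});

pool.query('SELECT 1')
  .then(() => console.log('connected'))
  .catch((err) => console.error('connection failed:', err));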

Related

Can't start pgadmin container on Linux server

I'm trying to migrate a project from MySQL to Postgres using Docker and a docker-compose file.
I'm connected to a Linux server remotely.
My docker-compose file:
version: '3.7'
services:
  database:
    container_name: ${PROJECT_NAME}-database
    image: postgres:12
    restart: unless-stopped
    environment:
      POSTGRES_USER: user
      POSTGRES_PASSWORD: admin
      POSTGRES_DB: dbtest
    ports:
      - "${POSTGRES_PORT}:5432"
    volumes:
      - ./docker/postgres/local_pgdata:/var/lib/postgresql/data
  pgadmin:
    image: dpage/pgadmin4
    depends_on:
      - database
    container_name: ${PROJECT_NAME}-pgadmin4
    restart: unless-stopped
    ports:
      - "${PGADMIN_PORT}:5454"
    environment:
      PGADMIN_DEFAULT_EMAIL: khaled.boussoffara-prestataire@labanquepostale.fr
      PGADMIN_DEFAULT_PASSWORD: admin
      PGADMIN_LISTEN_PORT: 5454
    volumes:
      - ./docker/pgadmin/pgadmin-data:/var/lib/pgadmin
My env file:
PROJECT_NAME=iig
PROJECT_FOLDER_NAME=sf_iig_api
HTTP_PORT=12078
HTTPS_PORT=12077
POSTGRES_PORT=12076
PGADMIN_PORT=5050
docker-compose ps:
I can't start pgadmin:
Your compose file seems okay to me; I use different ports, but my setup is quite close to yours.
The error message recommends checking the proxy and firewall (vérifier le proxy et le pare-feu) ... did you check that? I would use netcat:
nc -v -z RemoteHost Port
At least this could result in a helpful error message.
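If you would rather test from Node instead of netcat, an equivalent TCP reachability check with Node's built-in net module might look like this (a sketch; the host is a placeholder and 12076 is the POSTGRES_PORT from the .env above):

// Sketch: quick TCP reachability check, similar to nc -v -z.
// Replace 'RemoteHost' with the server's address; 12076 is POSTGRES_PORT from the .env file.
const net = require('net');

const socket = net.createConnection({ host: 'RemoteHost', port: 12076, timeout: 5000 });
socket.on('connect', () => { console.log('port reachable'); socket.end(); });
socket.on('timeout', () => { console.error('connection timed out'); socket.destroy(); });
socket.on('error', (err) => console.error('connection failed:', err.message));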

Backend can't query dockerized PostgreSQL

I'm running my containers via a docker-compose file. They are on the same network and I can ping the database container from my backend container. I use the database service name as the hostname in the connection string, and it doesn't raise any error about not being able to find the host. Instead, it just hangs and times out.
I have a test endpoint which is just supposed to test the connection. When I hit that endpoint, the database container logs "invalid packet length", and on the frontend nothing happens until it times out. I have no idea what's wrong. Any help?
version: '3.2'
services:
  server:
    restart: always
    build:
      dockerfile: Dockerfile
      context: ./nginx
    depends_on:
      - backend
      - frontend
      - database
    ports:
      - '5000:80'
    networks:
      - app_network
  database:
    image: postgres:latest
    container_name: database
    ports:
      - "5432:5432"
    restart: always
    hostname: database
    environment:
      POSTGRES_PASSWORD: 1234
      POSTGRES_USER: postgres
  backend:
    build:
      context: ./backend
      dockerfile: ./Dockerfile
    image: kalendae:backend
    hostname: backend
    container_name: backend
    environment:
      - WAIT_HOSTS=database:5432
      - DATABASE_HOST=database
      - DATABASE_PORT=5432
      - PORT=5051
  frontend:
    build:
      context: ./frontend
      dockerfile: ./Dockerfile
    image: kalendae:frontend
    hostname: frontend
    container_name: frontend
    environment:
      - WAIT_HOSTS=backend:5051
      - REACT_APP_BACKEND_HOST=localhost
      - REACT_APP_BACKEND_PORT=5051
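For reference, the connection settings above would typically be consumed along these lines; this is only a sketch, assuming a Node backend with the pg client (the question does not show the backend code):

// Sketch: assumes a Node backend using the pg client.
// DATABASE_HOST=database and DATABASE_PORT=5432 come from the compose file above.
const { Client } = require('pg');

const client = new Client({
  host: process.env.DATABASE_HOST,         // "database" (service name / hostname)
  port: Number(process.env.DATABASE_PORT), // 5432, the container port
  user: 'postgres',
  password: '1234',
  database: 'postgres', // default database, since POSTGRES_DB is not set above
});

client.connect()
  .then(() => console.log('connected to postgres'))
  .catch((err) => console.error('connection failed:', err));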

Can't get docker-compose networking to work

I'm trying to make requests to my server container from my client app in another container. The Docker Compose docs state that the network is set up automatically, so shouldn't all ports be accessible from all containers? When I make a curl request to port 4000 from outside of the container (in a fresh terminal), it works. However, when I enter the client container (selektor-client) and try the same request, it fails.
curl --request POST http://localhost:4000/api/music
What am I doing wrong?
docker-compose.yaml:
version: "3"
services:
client:
container_name: selektor-client
restart: always
build: ./client
ports:
- "3000:3000"
volumes:
- ./client/:/client/
- /client/node_modules/
command: ["yarn", "start"]
server:
container_name: selektor-server
restart: always
build: ./server
ports:
- "4000:4000"
volumes:
- ./server/:/server/
depends_on:
- mongo
command: ["yarn", "start"]
mongo:
container_name: selektor-mongo
command: mongod --noauth
build: .
restart: always
volumes:
# - ./mongo-init.js:/docker-entrypoint-initdb.d/mongo-init.js:ro
- data-volume:/data/db
ports:
- "27017:27017"
volumes:
data-volume:
When containers of the same stack call each other, the hostname is, by default, the service name, and the port is the internal port: <servicename>:<internal_port>. So, based on this part of your example:
version: "3"
server:
container_name: selektor-server
restart: always
build: ./server
ports:
- "4000:4000"
volumes:
- ./server/:/server/
depends_on:
- mongo
command: ["yarn", "start"]
The URL your client has to use to reach the server is http://server:4000
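So the request from inside the client container would target http://server:4000/api/music instead of localhost. As a sketch, assuming the call is made from code running in the client container (for example with Node's built-in fetch):

// Sketch: reach the server container by its service name, not localhost.
// The endpoint path is the one from the question's curl command.
fetch('http://server:4000/api/music', { method: 'POST' })
  .then((res) => console.log('status:', res.status))
  .catch((err) => console.error('request failed:', err));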
I had the same problem. My solution was to define a network with a subnet mask and assign a static IP to each container:
version: "3"
networks:
my_network:
driver: bridge
ipam:
config:
- subnet: 172.20.0.0/24
services:
mongodb:
image: mongo:4.2.1
container_name: mongo
command: mongod --auth
hostname: mongo
networks:
my_network:
ipv4_address: 172.20.0.5
volumes:
- /data/db/mongo:/data/db
ports:
- "27017:27017"
rabitmq:
hostname: rabbitmq
container_name: rabbitmq
image: rabbitmq:latest
networks:
my_network:
ipv4_address: 172.20.0.3
volumes:
- /var/lib/rabbitmq:/data/db
ports:
- "5672:5672"
- "15672:15672"
restart: always
You can then access the containers through their static IP addresses.
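For instance, a Node client could then point at the static address (a sketch; the credentials are placeholders, since the compose above starts mongod with --auth):

// Sketch: connect to the MongoDB container via the static IP from the compose above.
// The username and password are placeholders; mongod runs with --auth in that setup.
const { MongoClient } = require('mongodb');

const client = new MongoClient('mongodb://user:password@172.20.0.5:27017');

client.connect()
  .then(() => console.log('connected to mongo'))
  .catch((err) => console.error('connection failed:', err));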

Docker Compose error connecting a Postgres database with a Node API

I am getting a connection error with docker-compose when I try to access a Postgres database through an API in Node.
I'm using Sequelize as the ORM to access the database, but I don't know what is going wrong.
docker-compose.yml:
version: '3.5'
services:
  api-service:
    build:
      context: .
      dockerfile: ./api-docker.dockerfile
    image: api-service
    container_name: api-service
    restart: always
    env_file: .env
    environment:
      - NODE_ENV=$NODE_ENV
    ports:
      - ${PORT}:3000
    volumes:
      - .:/home/node/api
      - node_modules:/home/node/api/node_modules
    depends_on:
      - postgres-db
    networks:
      - api-network
    command: npm run start:dev
  postgres-db:
    expose:
      - ${PORT_SERVICE}
    ports:
      - ${PORT_SERVICE}:5432
    restart: always
    env_file: .env
    volumes:
      - pgReportData:/var/lib/postgresql/data
    environment:
      POSTGRES_USER: ${USER_SERVICE}
      POSTGRES_PASSWORD: ${PASSWORD_SERVICE}
      POSTGRES_DB: ${DATABASE_SERVICE}
    networks:
      - api-network
    container_name: postgres-db
    image: postgres:10
networks:
  api-network:
    driver: bridge
volumes:
  pgReportData:
    driver: local
  node_modules:
.env:
NODE_ENV=development
PORT=30780
HOST_SERVICE=postgres-db
DATABASE_SERVICE=base
USER_SERVICE=user
PASSWORD_SERVICE=password
DIALECT=postgres
PORT_SERVICE=5444
api-docker.dockerfile:
FROM node:12
WORKDIR /src
COPY . .
COPY --chown=node:node . .
USER node
RUN npm install
EXPOSE $PORT
ENTRYPOINT ["npm", "run", "start:dev"]
And when I run:
docker-compose up
I'm getting this error:
Any idea?
Can someone help me?
If Node is the only app that connects to postgres-db, you can remove the networks and expose the port Postgres is running on (5432). To connect to the db you can simply use the container name as the host.
Connection string: "postgres://YourUserName:YourPassword@postgres-db:5432/YourDatabase";
version: '3.5'
services:
  api-service:
    build:
      context: .
      dockerfile: ./api-docker.dockerfile
    image: api-service
    container_name: api-service
    restart: always
    env_file: .env
    environment:
      - NODE_ENV=$NODE_ENV
    ports:
      - ${PORT}:3000
    volumes:
      - .:/home/node/api
      - node_modules:/home/node/api/node_modules
    depends_on:
      - postgres-db
    command: npm run start:dev
  postgres-db:
    expose:
      - 5432
    restart: always
    env_file: .env
    volumes:
      - pgReportData:/var/lib/postgresql/data
    environment:
      POSTGRES_USER: ${USER_SERVICE}
      POSTGRES_PASSWORD: ${PASSWORD_SERVICE}
      POSTGRES_DB: ${DATABASE_SERVICE}
    container_name: postgres-db
    image: postgres:10
volumes:
  pgReportData:
    driver: local
  node_modules:
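A minimal Sequelize setup matching that connection string could then look like this (a sketch; the user, password, and database values are the ones defined in the .env file above):

// Sketch: Sequelize connecting through the container name as host.
// USER_SERVICE, PASSWORD_SERVICE and DATABASE_SERVICE come from the .env file above.
const { Sequelize } = require('sequelize');

const sequelize = new Sequelize(
  process.env.DATABASE_SERVICE, // base
  process.env.USER_SERVICE,     // user
  process.env.PASSWORD_SERVICE, // password
  {
    host: 'postgres-db', // container/service name, not localhost
    port: 5432,          // internal Postgres port
    dialect: 'postgres',
  }
);

sequelize.authenticate()
  .then(() => console.log('database connection OK'))
  .catch((err) => console.error('unable to connect:', err));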

Backend container can't connect to database - Sequelize, Node, Postgres, Docker

When I try to connect my backend (using Sequelize) I get the following error:
error ConnectionRefusedError [SequelizeConnectionRefusedError]: connect ECONNREFUSED 127.0.0.1:5432
docker-compose.yml:
version: "3.7"
services:
frontend:
build:
context: ./client
dockerfile: Dockerfile
image: client
ports:
- "3000:3000"
volumes:
- ./client:/usr/src/app
backend:
build:
context: ./server
dockerfile: Dockerfile
image: server
ports:
- "8000:8000"
volumes:
- ./server:/usr/src/app
db:
image: postgres
environment:
POSTGRES_DB: ckl
POSTGRES_USER: postgres
POSTGRES_PASSWORD: docker
ports:
- "5432:5432"
What am I doing wrong?
Thanks in advance.
Assuming your backend is connecting to the db, you should add a depends_on:
backend:
  build:
    context: ./server
    dockerfile: Dockerfile
  image: server
  depends_on:
    - db
  ports:
    - "8000:8000"
  volumes:
    - ./server:/usr/src/app
The db will now be accessible at the host db:5432. If your application is configured to connect to localhost:5432 or 127.0.0.1:5432, you'll need to replace the hostname localhost with db. Your Postgres connection string might also not have a host and might be trying to connect to localhost by default; you should be able to look at Sequelize's docs to figure out how to pass a host.
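A sketch of what passing the host to Sequelize might look like, using the credentials and database name from the compose file above:

// Sketch: pass the host explicitly so Sequelize does not default to localhost.
// The credentials and database name are the ones from the compose file above.
const { Sequelize } = require('sequelize');

const sequelize = new Sequelize('postgres://postgres:docker@db:5432/ckl', {
  dialect: 'postgres',
});

sequelize.authenticate()
  .then(() => console.log('connected'))
  .catch((err) => console.error('still refused:', err));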
