Docker: Node.js container can't connect to MongoDB container

I am facing a weird issue connecting to MongoDB, which runs in a separate container from my Node.js container. The following error is displayed when the app tries to connect to MongoDB.
My Dockerfile
FROM node:latest
WORKDIR /usr/src/app
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 8000
CMD ["npm","start"]
docker-compose
version: '3'
services:
  web:
    build: .
    ports:
      - "8000:8000"
    links:
      - mongo
      - redis
  mongo:
    image: mongo
    ports:
      - "49155:49155"
  redis:
    image: "redis:alpine"
mongo config
mongoose: { // MongoDB
  // uri: mongodb://username:password@host:port/database?options
  uri: `mongodb://localhost:27017/${DB_NAME}`,
  options: {
  },
  seed: {
    path: '/api/models/seeds/',
    list: [
      {
        file: 'user.seed',
        schema: 'User',
        plant: 'once' // once - always - never
      },
      {
        file: 'example.seed',
        schema: 'Example',
        plant: 'once'
      }
    ]
  },
},
Issue
(screenshot of the MongoDB connection error omitted)
I have only just started learning Docker; please help.

On creating a new container, Docker attaches that container to a default internal bridge network.
See https://docs.docker.com/network/ for the available network drivers.
To make MongoDB reachable on localhost, you will have to set
network_mode: "host"
in the mongo section of your docker-compose file, as sketched below.
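For example, a minimal sketch of that change (note that network_mode: "host" makes the container share the host's network stack, so the ports mapping is ignored and mongod listens on the host's 27017 directly):

mongo:
  image: mongo
  network_mode: "host"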

indrajeet's answer will work, but since you are venturing into docker-compose, it is better to treat each of your services/containers as a separate host. In your case, the error occurs because your web app is trying to connect to a mongo service on its own (web) localhost.
Instead you should connect to the mongo service/container, which docker-compose conveniently lets your web container reach via the hostname "mongo" (the service name you specified in the docker-compose yaml).
My answer is to change the uri line to
uri: `mongodb://mongo:27017/${DB_NAME}`,
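For illustration, a minimal sketch of the resulting mongoose call (DB_NAME comes from your own config, as in the question):

const mongoose = require('mongoose');

// "mongo" resolves to the mongo service on the compose network
mongoose.connect(`mongodb://mongo:27017/${DB_NAME}`)
  .then(() => console.log('connected to mongo'))
  .catch((err) => console.error('connection failed:', err));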

Related

How to connect a node docker container with postgres docker container

I have a CRUD app working on node, on my local machine. It is running on node, with postgres as the database, using knex.js as a query builder, etc.
I have created a Dockerfile and a docker-compose file, and the containers start, but the node container can't reach the postgres container. I suspect it has to do with the environment variables, but I am not sure. Here is my Dockerfile:
FROM node:12
# Create app directory
WORKDIR /usr/src/app
# Install app dependencies
# A wildcard is used to ensure both package.json AND package-lock.json are copied
# where available (npm@5+)
COPY package*.json ./
RUN npm ci --only=production
# Bundle app source
COPY . .
ENV PORT=8080
EXPOSE 8080
CMD [ "npm", "start" ]
This is the docker-compose file:
version: '2'
services:
  postgres:
    image: postgres:alpine
    environment:
      POSTGRES_PASSWORD: password
      POSTGRES_USER: app
      POSTGRES_DB: db
  app:
    build: .
    depends_on:
      - "postgres"
    links:
      - "postgres"
    environment:
      DB_PASSWORD: 'password'
      DB_USER: 'app'
      DB_NAME: 'db'
      DB_HOST: 'postgres'
      PORT: 8080
    ports:
      - '8080:8080'
    command: npm start
Also, here is the knex.js file at the root that handles the db connections based on the environment:
// Update with your config settings.
module.exports = {
  development: {
    client: 'pg',
    connection: 'postgres://localhost/db'
  },
  test: {
    client: 'pg',
    connection: 'postgres://localhost/test-db'
  }
};
Additionally, when I check the hosts file inside the node app's container, I don't see anything mentioning the link to the postgres container. Any help would be appreciated, thanks.
Your node application is not connecting because it is trying to connect to itself: you are referencing localhost. Your database is in a second container, which is not local, so you need to reference it by service name, which would be postgres.
So assuming your application is handling authentication another way, your config would be something like this:
// Update with your config settings.
module.exports = {
  development: {
    client: 'pg',
    connection: 'postgres://postgres/db'
  },
  test: {
    client: 'pg',
    connection: 'postgres://postgres/test-db'
  }
};
If you can, you should use the environment variables you assigned to the app container.
Docker-compose creates an internal network shared by the different containers it launches.
Since app and postgres are two separate containers, they are considered two separate hosts. This causes app to look for postgres on its own container when you point it at localhost instead of at the postgres container.
You can solve this by simply replacing localhost with postgres in your knex.js file.
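For example, a minimal knexfile sketch that reads the variables defined in the docker-compose file above (DB_HOST, DB_USER, DB_PASSWORD, DB_NAME), falling back to localhost for runs outside Docker:

// Update with your config settings.
module.exports = {
  development: {
    client: 'pg',
    connection: {
      host: process.env.DB_HOST || 'localhost', // 'postgres' inside compose
      user: process.env.DB_USER || 'app',
      password: process.env.DB_PASSWORD || 'password',
      database: process.env.DB_NAME || 'db'
    }
  }
};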

mongoose connection failure using docker-compose

I have a Node express server consuming a Mongo database.
I'm trying to create a container for each of them using docker-compose.
Here's my docker-compose.yml file:
version: "2"
services:
server:
container_name: server
restart: always
build: .
ports:
- "3000:3000"
depends_on:
- db
db:
container_name: db
image: mongo
volumes:
- /var/lib/mongodb:/data/db
ports:
- "27017:27017"
And my Dockerfile:
FROM node:latest
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
COPY package.json /usr/src/app
RUN npm install
COPY . /usr/src/app
RUN npm run build-run
EXPOSE 3000
I saw in many tutorials that, when using Docker to create a Mongo container, the connection string in mongoose.connect should be updated to use Docker's container name resolution.
So I changed my connection string according to my docker-compose file:
private readonly CONNECTION_STRING: String = 'mongodb://db/search-people-db'

public connect(): void {
  mongoose.connect(this.CONNECTION_STRING)
  // this._db is assumed to be mongoose.connection elsewhere in the class
  this._db.on('error', (err) => {
    console.log(`mongoose server failed to start: ${err}`)
  })
  this._db.once('open', () => {
    console.log(`mongoose server running using ${this.CONNECTION_STRING}`)
  })
}
However, when running sudo docker-compose up, I keep getting the following error:
Mongoose server failed to start: MongoNetworkError: failed to connect to server [db:27017] on first connect [MongoNetworkError: getaddrinfo ENOTFOUND db db:27017]
What am I doing wrong? Thanks in advance.
The MongoDB container boots up, but MongoDB itself needs more time to start, so your application cannot connect until it is fully started.
As the Docker documentation suggests, you should have your application wait before running its connection code.
I suggest making mongoose retry if it couldn't connect the first time, or letting the application crash if it can't connect; Docker will then run your container again.
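A rough sketch of that retry idea, using the connection string from the question (connectWithRetry is a hypothetical helper name, not the poster's code):

const mongoose = require('mongoose');

// keep retrying until mongod inside the db container has finished starting
function connectWithRetry() {
  mongoose.connect('mongodb://db/search-people-db')
    .then(() => console.log('mongoose connected'))
    .catch(() => {
      console.log('mongo not ready yet, retrying in 5s');
      setTimeout(connectWithRetry, 5000);
    });
}
connectWithRetry();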
Replace depends_on with links in your docker-compose.yml and try running the command again.

Cannot connect to MongoDB via node.js in Docker

My node.js express app cannot connect to the MongoDB running in Docker. I'm not that familiar with Docker.
node.js connection:
import mongodb from 'mongodb';
...
mongodb.MongoClient.connect('mongodb://localhost:27017', ... );
Dockerfile:
FROM node:argon
RUN mkdir /app
WORKDIR /app
COPY package.json /app
RUN npm install
COPY . /app
EXPOSE 3000
CMD ["npm", "start"]
docker-compose.yml
version: "2"
services:
  web:
    build: .
    volumes:
      - ./:/app
    ports:
      - "3000:3000"
    links:
      - mongo
  mongo:
    image: mongo
    ports:
      - "27017:27017"
Build command: docker build -t NAME .
Run command: docker run -ti -p 3000:3000 NAME
Connection error:
[MongoError: failed to connect to server [localhost:27017] on first connect [MongoError: connect ECONNREFUSED 127.0.0.1:27017]]
name: 'MongoError',
message: 'failed to connect to server [localhost:27017] on first connect [MongoError: connect ECONNREFUSED 127.0.0.1:27017]'
Try:
mongodb.MongoClient.connect('mongodb://mongo:27017', ... );
Change your docker-compose.yml:
version: "2"
services:
web:
build: .
volumes:
- ./:/app
ports:
- "3000:3000"
links:
- mongo
mongo:
image: mongo
ports:
- "27017:27017"
And use some docker compose commands:
docker-compose down
docker-compose build
docker-compose up -d mongo
docker-compose up web
Try this.
When using linked docker containers, you should use the name of the container. In this case, for example, your connection to mongodb should be mongodb.MongoClient.connect('mongodb://mongo:27017', ... ); instead of mongodb.MongoClient.connect('mongodb://localhost:27017', ... );. The reason for changing it to mongo is that you used the links attribute pointing to mongo in your docker-compose.yml. That results in a hostname entry for mongo inside the web docker container. Reference: linking-containers.
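To verify this, you can inspect the hosts file from inside the running container, e.g. (a sketch; web is the service name from the compose file, and note that newer compose versions resolve service names via an embedded DNS server rather than /etc/hosts entries):

docker-compose exec web cat /etc/hosts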
The docker-compose.yml also seems to be lacking an indentation: the mongo attribute should be at the same level as web.
version: '2'
services:
  web:
    build: .
    volumes: ['./:/app']
    ports: ['3000:3000']
    links: [mongo]
  mongo:
    image: mongo
    ports: ['27017:27017']
I tried your configuration with my own Docker setup: I updated docker-compose.yml, then ran docker-compose build and docker-compose up. See the logs of my local run.
I am not sure if you still have this question, but the datasources.json should use
"host": "mongo"
rather than "localhost".
In my logs I see:
mongo | NETWORK [listener] connection accepted from 172.22.0.3:47880 #1 (1 connection now open)
As you can see, docker-compose attaches mongo to a separate internal network. The 172.22.0.x address is an internal IP the Docker daemon uses to route between the containers of a compose project, so localhost is no longer in the game.
At least, it works for me.
datasources.json:
"mongoDS": {
  "host": "mongo",
  "port": 27017,
  ...
In my case it works like this: just link to the database container from the command line (db is my running database container):
sudo docker run -it --link db:db1 --publish 4000-4006:4000-4006 --name backend backend:latest

Link nodejs app to Rethinkdb from another container

I made two containers, one for RethinkDB and one for a nodejs app.
I want to connect my nodejs app to this RethinkDB, but every time I try I get an error:
Error:{"message":"Failed to connect to localhost:58015\nFull error:\n{\"code\":\"ECONNREFUSED\"
But I can connect the same nodejs app, running without Docker, to the RethinkDB through the published port (58015).
My Docker compose config looks like this:
# Rethink DB
rethink:
  build: docker/rethinkdb
  container_name: rethink
  ports:
    - 58080:8080
    - 58015:28015
    - 59015:29015

# NodeJS
nodejs:
  build: docker/nodejs
  container_name: nodejs
  ports:
    - 53000:3000
    - 55000:5000
  depends_on:
    - rethink
To connect my app to the db I set the host and port inside a JS config file
database: {
  servers: [
    {
      host: process.env.DB_PORT_28015_TCP_ADDR || 'localhost',
      port: process.env.DB_PORT_28015_TCP_PORT || 28015
    }
  ],
  name: 'atlas'
},
I tried with the RethinkDB port (28015) and with my published port (58015) without success.
I tried to link these two containers with links and network_mode, without success either.
None of the solutions I tried work.
I think my Rethink container is not ready when the nodejs app tries to connect, but I am not sure that is really the problem.
The nodejs app is running with pm2.
How can I make this app connect to my db?
For your config, you should use
# Rethink DB
rethink:
  build: docker/rethinkdb
  container_name: rethink
  ports:
    - 58080:8080
    - 58015:28015
    - 59015:29015

# NodeJS
nodejs:
  build: docker/nodejs
  container_name: nodejs
  ports:
    - 53000:3000
    - 55000:5000
  links:
    - rethink
  depends_on:
    - rethink
and in JS code
database: {
  servers: [
    {
      host: process.env.DB_PORT_28015_TCP_ADDR || 'rethink',
      port: process.env.DB_PORT_28015_TCP_PORT || 28015
    }
  ],
  name: 'atlas'
},
As far as I know, one Docker container will not see another unless they are specifically linked and on the same network:
docker run \
  --name ${NEWAPP} \
  --restart=always \
  --env MYAPPPAR=${PROJ} \
  -v /var/log/docker/node/logs:/usr/src/app/log \
  --link myapp_rethink_1:myapp_rethink_1 \
  --net myapp_default \
  -p ${PORT}:9000 \
  -d ${NEWAPP}
So you need both --net and --link:
--link format is sourcecontainername:containeraliasname
--net is needed so that containers can find each other via the internal DNS by container name. You can list networks with 'docker network ls'.
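For example, a minimal sketch of putting two containers on one user-defined network (the network and container names here are illustrative); on such a network, Docker's built-in DNS resolves container names, so the containers can reach each other directly:

docker network create myapp_default
docker run -d --name myapp_rethink_1 --net myapp_default rethinkdb
docker run -d --name mybackend --net myapp_default -p 9000:9000 ${NEWAPP}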
When using newer versions of docker-compose, your services are configured to run on the same network.
The top-level service name in your docker-compose.yml becomes the hostname that needs to be specified when connecting to RethinkDB from your app:
# docker-compose.yml
version: '3.2'
services:
  web:
    build: .
    links:
      - db
    ...
  db:
    image: rethinkdb
    ...
Using the example above, you can connect to RethinkDB via the host named 'db' within the app configuration file:
module.exports = {
  rethinkdb: {
    host: 'db',
    port: 28015
  }
};
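For illustration, a minimal sketch of consuming that config with the official rethinkdb driver (the require path './config' is an assumption for the file above):

const r = require('rethinkdb');
const config = require('./config'); // hypothetical path to the config file above

// 'db' resolves to the RethinkDB service on the compose network
r.connect({ host: config.rethinkdb.host, port: config.rethinkdb.port })
  .then((conn) => console.log('connected to rethinkdb'))
  .catch((err) => console.error('connection failed:', err));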
