Can't connect to MongoDB on Docker Container - node.js

I'm working on a Node.js API that uses MongoDB. Right now I'm facing one problem: when I try to connect to the database I get this error: MongooseServerSelectionError: connect ECONNREFUSED 127.0.0.1:27017. The API is running in a Docker container, but the database is local; I'm not running Mongo in a Docker container.
This is my connection string: mongodb://localhost:27017/database
and this is my Dockerfile:
FROM node:alpine
RUN apk add dumb-init
ENV PORT=4000
WORKDIR /usr/src/app
COPY package.json package-lock.json /usr/src/app/
RUN npm ci
COPY . /usr/src/app/
USER node
EXPOSE 4000
CMD ["dumb-init", "node", "/usr/src/app/app.js"]
UPDATE
Forgot to add the docker-compose.yml
Here it is:
version: '2.1'
services:
  api:
    build:
      context: .
      dockerfile: ./Dockerfile.api
    ports:
      - "4000:4000"
      - "27017:27017"
    extra_hosts:
      "host.docker.internal": host-gateway
    volumes:
      - ./var/:/var
    restart: on-failure
Can someone tell me what is wrong or what else is missing?

Change the hostname in your MongoDB connection string from localhost to host.docker.internal,
i.e.: mongodb://host.docker.internal:27017/database
https://docs.docker.com/desktop/networking/#i-want-to-connect-from-a-container-to-a-service-on-the-host
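For illustration, a minimal Mongoose connection sketch with that hostname might look like the following (the database name comes from the question's connection string; the MONGO_URL environment variable is just an assumption added for configurability):

// app.js: minimal connection sketch (assumes Mongoose, as the error message suggests)
const mongoose = require('mongoose');

// host.docker.internal resolves to the host machine from inside the container;
// on Linux it needs the extra_hosts "host.docker.internal": host-gateway entry shown above
const uri = process.env.MONGO_URL || 'mongodb://host.docker.internal:27017/database';

mongoose.connect(uri)
  .then(() => console.log('Connected to MongoDB running on the host'))
  .catch((err) => console.error('Connection failed:', err.message));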

Related

MongoDB Database data getting deleted while using Docker on Digital Ocean droplet

I am hosting my website http://apgiiit.com/ on the Digital Ocean cloud using Docker containers. The site is built with Express and MongoDB. But it seems that when I run the docker-compose down command, all of my database data gets wiped out somehow. I have no idea why this is happening. Any help would be greatly appreciated. Here are my docker-compose and Docker files for the project.
version: '3'
services:
  app:
    container_name: express_blog
    restart: always
    build: .
    ports:
      - '80:5000'
    links:
      - mongo
  mongo:
    container_name: mongo
    image: mongo
    ports:
      - '27017:27017'
    volumes:
      - ./mongodb:/data/db/
volumes:
  mongodb:
    external: true
Here's the Dockerfile used to run Express.
FROM node:12
WORKDIR /usr/src/app
COPY package.json ./
RUN npm install
COPY . .
EXPOSE 3000
CMD ["npm", "run", "dev"]
I am using an external volume for storing the MongoDB data. I've created another volume using the docker volume command and I'm using that volume in the docker-compose file. What am I doing wrong here?
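For reference, a service that actually uses the declared named volume would mount it by name rather than by a ./ path; a minimal sketch (not the poster's exact file) looks like this:

services:
  mongo:
    image: mongo
    ports:
      - '27017:27017'
    volumes:
      # a bare name (no leading ./) refers to the named volume declared below;
      # ./mongodb would instead be a bind mount to a host directory
      - mongodb:/data/db

volumes:
  mongodb:
    external: true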

Docker-compose builds but app does not serve on localhost

Docker newbie here. The docker-compose file builds without any issues, but when I try to run my app on localhost:4200, I get a "localhost didn't send any data" message in Chrome and "the server unexpectedly dropped the connection" in Safari. I am working on macOS Catalina. Here is my yml file:
version: '3.0'
services:
  my-portal:
    build: .
    ports:
      - "4200:4200"
    depends_on:
      - backend
  backend:
    build: ./backend
    ports:
      - "3000:3000"
    environment:
      POSTGRES_HOST: host.docker.internal
      POSTGRES_USER: "postgres"
      POSTGRES_PASSWORD: mypwd
    depends_on:
      - db
  db:
    image: postgres:9.6-alpine
    environment:
      POSTGRES_DB: mydb
      POSTGRES_USER: "postgres"
      POSTGRES_PASSWORD: mypwd
      POSTGRES_HOST: host.docker.internal
    ports:
      - 5432:5432
    restart: always
    volumes:
      - ./docker/db/data:/var/lib/postgresql/data
Log for Angular:
/docker-entrypoint.sh: Configuration complete; ready for start up
Log for Node: db connected
Log for Postgres: database system is ready to accept connections
Below are my Angular and Node Docker files:
FROM node:latest AS builder
WORKDIR /app
COPY . .
RUN npm install
RUN npm run build --prod
EXPOSE 4200
# Stage 2
FROM nginx:alpine
COPY --from=builder /app/dist/* /usr/share/nginx/html/
Node:
FROM node:12
WORKDIR /backend
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 3000
CMD [ "node", "server.js" ]
When I created the Angular image and ran my app on localhost:4200, it worked fine. Please let me know if I am missing anything.
Your Angular container is built FROM nginx, and you use the default Nginx configuration from the Docker Hub nginx image. That listens on port 80, so that's the port number you need to use in the ports: directive:
services:
  quickcoms-portal:
    build: .
    ports:
      - "4200:80" # <-- second port must match nginx image's port
    depends_on:
      - backend
The EXPOSE directive in the first stage is completely ignored and you can delete it. The FROM nginx line causes docker build to essentially start over from a new base image, so your final image is stock Nginx plus the files you COPY --from=builder.
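For context, the stock nginx image serves files from /usr/share/nginx/html on port 80; its bundled default configuration looks roughly like this (abridged, quoted from the nginx image rather than from the question):

# /etc/nginx/conf.d/default.conf (abridged), as shipped in the nginx image
server {
    listen       80;                      # the container-side port to publish against
    server_name  localhost;

    location / {
        root   /usr/share/nginx/html;     # where COPY --from=builder placed the Angular build
        index  index.html index.htm;
    }
}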

Nodejs application docker unable to connect to mongodb docker container

I have a Node.js application which is dockerized and needs a replicated MongoDB database. I have built my replicated MongoDB setup in docker-compose and it's working just fine. If I run the command docker inspect mongodb-primary | grep IPAddress it prints:
"IPAddress": "",
"IPAddress": "172.18.0.2",
Now in my application I use this IP in the Mongo connection string (with the protocol prefix, of course), but the application cannot connect to MongoDB and throws this error message (the application is also a Docker container):
message: 'failed to connect to server [172.18.0.2:27017] on first connect [MongoNetworkError: connection 1 to 172.18.0.2:27017 timed out]',
Here is my MongoDB docker-compose file:
version: '2'
services:
  mongodb-primary:
    image: 'bitnami/mongodb:latest'
    environment:
      - MONGODB_REPLICA_SET_MODE=primary
    volumes:
      - 'mongodb_master_data:/bitnami'
  mongodb-secondary:
    image: 'bitnami/mongodb:latest'
    depends_on:
      - mongodb-primary
    environment:
      - MONGODB_REPLICA_SET_MODE=secondary
      - MONGODB_PRIMARY_HOST=mongodb-primary
      - MONGODB_PRIMARY_PORT_NUMBER=27017
  mongodb-arbiter:
    image: 'bitnami/mongodb:latest'
    depends_on:
      - mongodb-primary
    environment:
      - MONGODB_REPLICA_SET_MODE=arbiter
      - MONGODB_PRIMARY_HOST=mongodb-primary
      - MONGODB_PRIMARY_PORT_NUMBER=27017
volumes:
  mongodb_master_data:
    driver: local
and my Node.js application Dockerfile is:
FROM node:6.0
# Create app directory
WORKDIR /usr/src/app
# Install app dependencies
# A wildcard is used to ensure both package.json AND package-lock.json are copied
# where available (npm#5+)
COPY package*.json ./
RUN npm install
# If you are building your code for production
# RUN npm install --only=production
# Bundle app source
COPY . .
EXPOSE 3001
CMD [ "npm", "start" ]
how can I fix this?
Your docker-compose setup does not automatically expose TCP ports to the outside world, such as your host PC (I assume your Node.js app runs on the host and is not included in docker-compose). This is the behavior of Docker bridge networks; you can read more at https://docs.docker.com/network/bridge/
You have to do one of the following:
Include your Node.js container in the docker-compose file
or
Expose the ports from docker-compose.yml
I had the same issue and adding the ports fixed it for me.
Make sure your connection URL uses the service name:
mongodb://mongo:27017/ and not localhost.
mongo:
  image: mongo
  expose:
    - 27017
  ports:
    - "27017:27017"
  volumes:
    - ./data/db:/data/db
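To illustrate the other option mentioned above (running the Node.js app inside the same compose project so it can reach the replica set by service name), a sketch of an added service might look like this; the service name api, the published port, and the MONGO_URL variable are assumptions, not taken from the question:

# added under services: in the same docker-compose.yml (sketch)
  api:
    build: .                    # builds the Node.js Dockerfile shown above
    ports:
      - "3001:3001"
    environment:
      # service names resolve on the compose network, so no container IP is needed
      - MONGO_URL=mongodb://mongodb-primary:27017/database
    depends_on:
      - mongodb-primary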

Docker is not building container with changes in source code

I'm relatively new to Docker and I just created a Node.js application that should connect to other services also running on Docker.
So I have the source code, a Dockerfile to set up the image, and a docker-compose file to orchestrate the environment.
I had a few problems in the beginning, so I updated my source code and found out that the changes are not picked up by the next docker-compose build.
For example, I commented out all the lines that connect to Redis and MongoDB. I run the application locally and it's fine, but when I build it again in a container, I still get the "Connection refused..." errors.
I tried many things and this is what I have at the moment:
Dockerfile
FROM node:9
WORKDIR /app
COPY package.json /app
RUN npm install
COPY . /app
CMD node app.js
EXPOSE 8090
docker-compose.yml
version: '3'
services:
  app:
    build: .
    ports:
      - "8090:8090"
    container_name: app
  redis:
    image: redis:latest
    ports:
      - "6379:6379"
    container_name: redis
  mongodb:
    image: mongo:latest
    container_name: "mongodb"
    volumes:
      - ./data/db:/data/db
    ports:
      - 27017:27017
up.sh
sudo docker stop app
sudo docker rm app
docker-compose build --no-cache app
sudo docker-compose up --force-recreate
Any ideas on what could be the problem? Why doesn't it use the current source code? Is it using some sort of cache?

Cannot connect to MongoDB via node.js in Docker

My Node.js Express app cannot connect to MongoDB running in Docker. I'm not that familiar with Docker.
node.js connection:
import mongodb from 'mongodb';
...
mongodb.MongoClient.connect('mongodb://localhost:27017', ... );
Dockerfile:
FROM node:argon
RUN mkdir /app
WORKDIR /app
COPY package.json /app
RUN npm install
COPY . /app
EXPOSE 3000
CMD ["npm", "start"]
docker-compose.yml
version: "2"
services:
  web:
    build: .
    volumes:
      - ./:/app
    ports:
      - "3000:3000"
    links:
      - mongo
  mongo:
    image: mongo
    ports:
      - "27017:27017"
Build command: docker build -t NAME .
Run command: docker run -ti -p 3000:3000 NAME
Connection error:
[MongoError: failed to connect to server [localhost:27017] on first connect [MongoError: connect ECONNREFUSED 127.0.0.1:27017]]
name: 'MongoError',
message: 'failed to connect to server [localhost:27017] on first connect [MongoError: connect ECONNREFUSED 127.0.0.1:27017]'
Try:
mongodb.MongoClient.connect('mongodb://mongo:27017', ... );
Change your docker-compose.yml:
version: "2"
services:
  web:
    build: .
    volumes:
      - ./:/app
    ports:
      - "3000:3000"
    links:
      - mongo
  mongo:
    image: mongo
    ports:
      - "27017:27017"
And use some docker compose commands:
docker-compose down
docker-compose build
docker-compose up -d mongo
docker-compose up web
Try this.
When using linked Docker containers you should use the name of the container. In this case, for example, your connection to MongoDB should be mongodb.MongoClient.connect('mongodb://mongo:27017', ... ); instead of mongodb.MongoClient.connect('mongodb://localhost:27017', ... );. The reason for changing it to mongo is that you used the links attribute pointing to mongo in your docker-compose.yml, which results in a hostname entry for mongo in the /etc/hosts of the web container. Reference: linking-containers.
The docker-compose.yml also seems to be lacking indentation: the mongo attribute should be at the same level as web.
version: '2'
services:
  web:
    build: .
    volumes: ['./:/app']
    ports: [ '3000:3000' ]
    links: [ mongo ]
  mongo:
    image: mongo
    ports: [ '27017:27017' ]
I tried your configuration with my own Docker setup; what I did was update the docker-compose.yml, then run docker-compose build and docker-compose up (see the logs of my local run).
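As a concrete sketch of that change using the plain mongodb driver from the question (promise style; the database name test is a placeholder):

// "mongo" is the compose service name, resolvable from inside the web container
const { MongoClient } = require('mongodb');

MongoClient.connect('mongodb://mongo:27017')
  .then((client) => console.log('connected to', client.db('test').databaseName))
  .catch((err) => console.error('failed to connect:', err.message));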
I am not sure if you still have this question, but the datasources.json should be:
"host": "mongo"
rather than "localhost".
In my logs I see:
mongo | NETWORK [listener] connection accepted from 172.22.0.3:47880 #1 (1 connection now open)
As you can see, Docker Compose puts mongo on its own internal network. The 172.22.0.x address is an internal IP assigned by the Docker daemon to the compose containers, so localhost is no longer in the game.
At least, it works for me.
datasources.json
"mongoDS": {
"host": "mongo",
"port": 27017,
...
In my case it works like this: just link to the database container from the command line. db is my already-running database container:
sudo docker run -it --link db:db1 --publish 4000-4006:4000-4006 --name backend backend:latest
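With that --link db:db1 flag the database container is reachable from inside backend under the alias db1, so the connection string would look something like this sketch (the database name is a placeholder):

// db1 is the link alias created by --link db:db1 above
const { MongoClient } = require('mongodb');

MongoClient.connect('mongodb://db1:27017/database')
  .then(() => console.log('connected via the db1 link alias'))
  .catch((err) => console.error('connection failed:', err.message));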
