Connect docker compose containers without links - node.js

https://docs.docker.com/compose/networking/
At above official docker document, I found the part
version: "3"
services:
web:
build: .
ports:
- "8000:8000"
db:
image: postgres
ports:
- "8001:5432"
Each container can now look up the hostname web or db and get back the appropriate container’s IP address. For example, web’s application code could connect to the URL postgres://db:5432 and start using the Postgres database.
I understood this paragraph to mean that I can connect Docker containers to each other without specifying links: or networks: explicitly, because the docker-compose.yml snippet above has neither a links: nor a networks: section, and the document says web's application code could connect to the URL postgres://db:5432.
So I tried to test a simple docker-compose setup with a Node.js Express app and MongoDB using the approach above. I thought I could connect to MongoDB from the Express app with just mongodb://mongo:27017/myapp, but I cannot connect to MongoDB from the Express container. I think I followed Docker's official manual, but I don't know why it's not working. Of course I can connect to MongoDB using links: or networks:, but I heard links is deprecated and I cannot find the proper way to use networks:.
I think I might have misunderstood something; please correct me.
Below is my docker-compose.yml
version: '3'
services:
  app:
    container_name: node
    restart: always
    build: .
    ports:
      - '3000:3000'
  mongo:
    image: mongo
    ports:
      - '27017:27017'
In the Express app, I connect to MongoDB with:
mongoose.connect('mongodb://mongo:27017/myapp', {
  useMongoClient: true
});
// also doesn't work with mongodb://mongo/myapp
Plus, here is my Dockerfile:
FROM node:10.17-alpine3.9
ENV NODE_ENV development
WORKDIR /usr/src/app
COPY ["package*.json", "npm-shrinkwrap.json*", "./"]
RUN rm -rf node_modules
RUN apk --no-cache --virtual build-dependencies add \
    python \
    make \
    g++ \
    && npm install \
    && apk del build-dependencies
COPY . .
EXPOSE 3000
CMD npm start

If you want to connect to a MongoDB instance running on the local host, you should use host network mode.
docker-compose.yml file content:
version: '2.1'
services:
  z2padmin_docker:
    image: z2padmin_docker
    build: .
    environment:
      NODE_ENV: production
    volumes: [/home/ankit/Z2PDATAHUB/uploads:/mnt/Z2PDATAHUB/uploads]
    ports:
      - 5000:5000
    network_mode: host
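With network_mode: host the container shares the host's network stack, so the app would reach a MongoDB running on the host via localhost rather than a Compose service name (note that host networking is only fully supported on Linux, not on Docker Desktop for Mac/Windows). A minimal sketch of the connection code under that assumption:

// Minimal sketch, assuming MongoDB listens on the host's port 27017
// and the app container runs with network_mode: host.
const mongoose = require('mongoose');

mongoose.connect('mongodb://localhost:27017/myapp', {
  useMongoClient: true // Mongoose 4.x option; drop it on Mongoose 5+
})
  .then(() => console.log('connected to MongoDB'))
  .catch(err => console.error('MongoDB connection failed:', err));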

Related

Docker NodeJS app can't connect to postgres database

I am fairly new to Docker and docker-compose. I tried to spin up a few services using Docker, consisting of a Node.js (Nest.js) API, a Postgres DB, and pgAdmin. Without the API (Node.js) app being dockerized I could connect to the Docker database containers, but now that I have also dockerized the Node app, it is not connecting anymore and I am clueless why. Is there anything wrong with the way I have set it up?
Here is my docker-compose file
version: "3"
services:
nftapi:
env_file:
- .env
build:
context: .
ports:
- '5000:5000'
depends_on:
- postgres
volumes:
- .:/app
- /app/node_modules
networks:
- postgres
postgres:
container_name: postgres
image: postgres:latest
ports:
- "5432:5432"
volumes:
- /data/postgres:/data/postgres
env_file:
- docker.env
networks:
- postgres
pgadmin:
links:
- postgres:postgres
container_name: pgadmin
image: dpage/pgadmin4
ports:
- "8080:80"
volumes:
- /data/pgadmin:/root/.pgadmin
env_file:
- docker.env
networks:
- postgres
networks:
postgres:
driver: bridge
This is the Node.js app's Dockerfile, which builds successfully. In the logs I see the app is trying to connect to the database but it can't (no specific error), it just doesn't find the DB.
# Image source
FROM node:14-alpine
# Docker working directory
WORKDIR /app
# Copying file into APP directory of docker
COPY ./package.json /app/
RUN apk update && \
    apk add git
# Then install the NPM module
RUN yarn install
# Copy current directory to APP folder
COPY . /app/
EXPOSE 5000
CMD ["npm", "run", "start:dev"]
I have 2 env files in my project's root directory.
.env
docker.env
As mentioned above, when I remove the "nftapi" service from Docker and run the Node.js app with a simple npm start, it connects to the Postgres container.
TypeOrmModule.forRoot({
  type: 'postgres',
  host: process.env.POSTGRES_HOST,
  port: Number(process.env.POSTGRES_PORT),
  username: process.env.POSTGRES_USER,
  password: process.env.POSTGRES_PASSWORD,
  database: process.env.POSTGRES_DB,
  synchronize: true,
  entities: ['dist/**/*.entity{.ts,.js}'],
}),
The host in the .env file that is used in the TypeORM module is localhost.
When using networks with docker-compose you should use the name of the service as your hostname, so in your case the hostname should be postgres and not localhost.
You can read more about it here:
https://docs.docker.com/compose/networking/
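For example, a sketch of the env values that would point the dockerized app at the postgres service instead of localhost (the variable names are taken from the TypeORM config above; the credentials are placeholders):

# hypothetical docker.env / .env values
# "postgres" resolves to the database container on the shared compose network;
# localhost only works when the app runs outside of Docker
POSTGRES_HOST=postgres
POSTGRES_PORT=5432
POSTGRES_USER=myuser
POSTGRES_PASSWORD=mypassword
POSTGRES_DB=mydb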

MongoDB Database data getting deleted while using Docker on Digital Ocean droplet

I am hosting my website http://apgiiit.com/ on the DigitalOcean cloud using Docker containers. The site is built using Express and MongoDB. But it seems that when I run the docker-compose down command, all of my database data somehow gets wiped out. I have no idea why this is happening. Any help would be greatly appreciated. Here are my docker-compose and Docker files for the project.
version: '3'
services:
  app:
    container_name: express_blog
    restart: always
    build: .
    ports:
      - '80:5000'
    links:
      - mongo
  mongo:
    container_name: mongo
    image: mongo
    ports:
      - '27017:27017'
    volumes:
      - ./mongodb:/data/db/
volumes:
  mongodb:
    external: true
Here's the other Dockerfile, used to run Express.
FROM node:12
WORKDIR /usr/src/app
COPY package.json ./
RUN npm install
COPY . .
EXPOSE 3000
CMD ["npm", "run", "dev"]
I am using an external volume for storing MongoDB data. I've created another volume using the docker volume command and am using that volume in the docker-compose file. What am I doing wrong here?
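For reference, a sketch of what mounting the external named volume mongodb (rather than the relative ./mongodb bind-mount path used above) would look like; both forms are valid compose syntax, but they point at different storage locations:

# sketch: reference the external named volume by name instead of a relative path
services:
  mongo:
    image: mongo
    ports:
      - '27017:27017'
    volumes:
      - mongodb:/data/db
volumes:
  mongodb:
    external: true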

Connect mongo and sapper server with docker

I am working on a Sapper / Svelte project and I need to build the Sapper project and connect it to MongoDB (I need to start Mongo from docker-compose.yml).
At the moment I am trying to connect the app to the local Mongo on localhost:27017, but it can't establish the connection. What should I do?
Here is my docker-compose:
version: "3.4"
services:
myapp:
image: my_image
deploy:
update_config:
delay: 30s
parallelism: 1
failure_action: rollback
ports:
- "3000:3000"
and here is my Dockerfile:
FROM node:lts-alpine
WORKDIR /app
COPY static static
COPY emails emails
COPY package.json .
ENV NODE_ENV production
RUN npm install
COPY __sapper__/build __sapper__/build
EXPOSE 3000
CMD ["node", "__sapper__/build/index.js"]
Also, what should I do to start the Mongo deployment directly from Compose? I have Mongo on Docker, but I need to start both directly from Compose.
I think a mongo service should be added to the services section of docker-compose.yml.
For example:
services:
  mongodb:
    image: mongo
    restart: always
    environment:
      MONGO_INITDB_ROOT_USERNAME: root
      MONGO_INITDB_ROOT_PASSWORD: example
Then the Node application can access MongoDB by the service name (e.g. mongodb:27017).
I think this URL will help.
https://hub.docker.com/_/mongo
version: "3.4"
services:
app:
image: yourimage
ports:
- "3000:3000"
environment:
- MONGODB_URL=mongodb://yourip/yourdb
mongodb:
image: mongo
restart: always
ports:
- "yourportsdb:yourportsdb"
It is not necessary to authenticate Mongo with a user and password; optionally, you can pass the environment variables as suggested by @Jihoon Yeo.
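A minimal sketch of the application side, assuming the MONGODB_URL environment variable from the compose snippet above and the official mongodb Node.js driver (the database name and fallback URL are placeholders):

// Read the connection string injected by docker-compose.
// The hostname inside it should be the compose service name (e.g. mongodb),
// not localhost, when the app itself runs in a container.
const { MongoClient } = require('mongodb');

const url = process.env.MONGODB_URL || 'mongodb://mongodb:27017/yourdb';

async function main() {
  const client = new MongoClient(url);
  await client.connect(); // throws if the service name doesn't resolve or Mongo is unreachable
  console.log('connected to', url);
  await client.close();
}

main().catch(console.error);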

Docker-compose builds but app does not serve on localhost

Docker newbie here. The docker-compose file builds without any issues, but when I try to open my app on localhost:4200 I get "localhost didn't send any data" in Chrome and "the server unexpectedly dropped the connection" in Safari. I am working on macOS Catalina. Here is my yml file:
version: '3.0'
services:
  my-portal:
    build: .
    ports:
      - "4200:4200"
    depends_on:
      - backend
  backend:
    build: ./backend
    ports:
      - "3000:3000"
    environment:
      POSTGRES_HOST: host.docker.internal
      POSTGRES_USER: "postgres"
      POSTGRES_PASSWORD: mypwd
    depends_on:
      - db
  db:
    image: postgres:9.6-alpine
    environment:
      POSTGRES_DB: mydb
      POSTGRES_USER: "postgres"
      POSTGRES_PASSWORD: mypwd
      POSTGRES_HOST: host.docker.internal
    ports:
      - 5432:5432
    restart: always
    volumes:
      - ./docker/db/data:/var/lib/postgresql/data
Log for Angular:
/docker-entrypoint.sh: Configuration complete; ready for start up
Log for Node: db connected
Log for Postgres: database system is ready to accept connections
Below are my Angular and Node Docker files:
FROM node:latest AS builder
WORKDIR /app
COPY . .
RUN npm install
RUN npm run build --prod
EXPOSE 4200
# Stage 2
FROM nginx:alpine
COPY --from=builder /app/dist/* /usr/share/nginx/html/
Node:
FROM node:12
WORKDIR /backend
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 3000
CMD [ "node", "server.js" ]
When I created the Angular image on its own and ran my app on localhost:4200 it worked fine. Please let me know if I am missing anything.
Your Angular container is built FROM nginx, and you use the default Nginx configuration from the Docker Hub nginx image. That listens on port 80, so that's the port number you need to use in the ports: directive:
services:
  quickcoms-portal:
    build: .
    ports:
      - "4200:80" # <-- second port must match nginx image's port
    depends_on:
      - backend
The EXPOSE directive in the first stage is completely ignored and you can delete it. The FROM nginx line causes docker build to basically completely start over from a new base image, so your final image is stock Nginx plus the files you COPY --from=builder.
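For reference, the stock nginx image's default.conf is roughly the following (details vary slightly between image versions), which is why the container answers on port 80 rather than 4200:

# approximate sketch of the stock default.conf shipped in the nginx image
server {
    listen 80;
    server_name localhost;

    location / {
        root /usr/share/nginx/html;
        index index.html index.htm;
    }
}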

Docker is not building container with changes in source code

I'm relatively new to Docker and I just created a Node.js application that should connect to other services also running on Docker.
So I have the source code, a Dockerfile to set up this image, and a docker-compose file to orchestrate the environment.
I had a few problems in the beginning, so I updated my source code and found out that it's not getting picked up in the next docker-compose build.
For example, I commented out all the lines that connect to Redis and MongoDB. I run the application locally and it's fine, but when I build it again in a container, I get "Connection refused..." errors.
I tried many things and this is what I have at the moment:
Dockerfile
FROM node:9
WORKDIR /app
COPY package.json /app
RUN npm install
COPY . /app
CMD node app.js
EXPOSE 8090
docker-compose.yml
version: '3'
services:
  app:
    build: .
    ports:
      - "8090:8090"
    container_name: app
  redis:
    image: redis:latest
    ports:
      - "6379:6379"
    container_name: redis
  mongodb:
    image: mongo:latest
    container_name: "mongodb"
    volumes:
      - ./data/db:/data/db
    ports:
      - 27017:27017
up.sh
sudo docker stop app
sudo docker rm app
docker-compose build --no-cache app
sudo docker-compose up --force-recreate
Any ideas on what could be the problem? Why doesn't it use the current source code? It seems to be using some sort of cache.
