Run multiple Docker containers at once using docker-compose - node.js

The Problem
Currently I've created a Dockerfile and a docker-compose.yml to run my rest-api and database using docker-compose up.
What I want to do now is add another container, namely the web application (built with React). I'm a little bit confused about how to do that, since I just started learning Docker 2 days ago.
Folder Structure
This is my current folder structure:
Folder: rest-api (NodeJS)
  Dockerfile
  docker-compose.yml
The Question
In the end I want to be able to run docker-compose up to fire up both the rest-api and the web-app.
Do I need to create a separate Dockerfile in every folder and create a 'global' docker-compose.yml to link everything together?
New folder structure:
docker-compose.yml
Folder: rest-api (NodeJS)
  Dockerfile
Folder: web-app (React)
  Dockerfile
My current setup to run the rest-api and database
Dockerfile
FROM node:13.10
# The destination of the app in the container
WORKDIR /usr/src/app
# Copy package.json, package-lock.json and tsconfig.json to the workdir
COPY package*.json ./
COPY tsconfig.json ./
# Postgres connection environment variables
ENV POSTGRES_USER root
ENV POSTGRES_PASSWORD 12345
ENV POSTGRES_DB postgres
ENV POSTGRES_URI 'postgres://postgres:12345@postgres:5432/postgres'
RUN npm install
COPY . .
EXPOSE 3000
CMD ["npm", "start"]
docker-compose.yml
version: '3'
services:
  node:
    container_name: rest-api
    restart: always
    build: .
    environment:
      PORT: 3000
    ports:
      - '80:3000'
    links:
      - postgres
  postgres:
    container_name: postgres-database
    image: postgres
    environment:
      POSTGRES_URI: 'postgres://postgres:12345@postgres-database:5432/postgres'
      POSTGRES_PASSWORD: 12345
    ports:
      - '5432:5432'

OK, so there are quite a few ways to approach this, and it is more or less down to your preference.
If you want to go with your proposed folder structure (which is fine), then you can, for example, do it like so:
Have a Dockerfile in the root of each of your applications which builds that specific application (as you already suggested), place your docker-compose.yml file in the parent folder of both applications (exactly as you proposed), and then make some changes to your docker-compose.yml. I've only kept the essential parts; note that links are no longer necessary, as the internal networking will resolve the service names to the corresponding service IP addresses.
version: '3'
services:
  node:
    build:
      context: rest-api
    environment:
      PORT: 3000
    ports:
      - '3000:3000'
  web:
    image: web-app
    build:
      context: web-app
    ports:
      - 80:80
  postgres:
    image: postgres
    environment:
      POSTGRES_URI: 'postgres://postgres:12345@postgres-database:5432/postgres'
      POSTGRES_PASSWORD: 12345
    ports:
      - '5432:5432'
So the context is what tells Docker that what you are building is actually in a different directory; all of the commands executed in the Dockerfile will be relative to that folder.
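This answer refers to a web-app image but never shows its Dockerfile; as a hedged sketch (the create-react-app layout and the nginx serving step are assumptions, not from the question), web-app/Dockerfile could look like:

# web-app/Dockerfile - hypothetical sketch, assuming a standard
# create-react-app project whose production build lands in /build
FROM node:13.10 AS build
WORKDIR /usr/src/app
COPY package*.json ./
RUN npm install
COPY . .
RUN npm run build

# Serve the static bundle with nginx on port 80,
# matching the '80:80' mapping in the compose file above
FROM nginx:alpine
COPY --from=build /usr/src/app/build /usr/share/nginx/html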
I also changed the port mappings, because you will probably want to access your web app via the standard HTTP port. Note that the web-app will be able to communicate with the rest-api container by using the node hostname, as long as the node service binds to 0.0.0.0:3000 (not 127.0.0.1:3000).
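As an illustration of that last point (assuming an Express app, which the question does not show), the server entrypoint would bind like this:

// server.js - minimal sketch; Express is an assumption here.
// Binding to 0.0.0.0 makes the API reachable from other containers on the
// compose network; 127.0.0.1 would only accept traffic from inside this container.
const express = require('express');
const app = express();

app.get('/health', (req, res) => res.send('ok'));

const port = process.env.PORT || 3000;
app.listen(port, '0.0.0.0', () => {
  console.log(`rest-api listening on 0.0.0.0:${port}`);
});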

Related

Docker and NodeJS: could not connect to the container

I'm trying to dockerize a simple NodeJS API. I've tested it standalone and it's working, but after dockerizing it I can't connect to the container. Two things stand out: the container is permanently restarting, and I cannot connect to it.
After trying to establish a connection with a GET request, the container begins to restart, and about a minute later it is up again for only a few seconds.
This is my Dockerfile:
FROM node:lts-buster-slim
# Create app directory
WORKDIR /opt/myapps/noderest01
COPY package.json /opt/myapps/noderest01/package.json
COPY package-lock.json /opt/myapps/noderest01/package-lock.json
RUN npm ci
COPY . /opt/myapps/noderest01
EXPOSE 3005
CMD [ "npm", "run", "dev" ]
And this is my YAML file:
services:
  rest01:
    container_name: rest01
    ports:
      - "3005:3005"
    restart: always
    build: .
    volumes:
      - rest01:/opt/myapps/noderest01
      - rest01nmodules:/opt/myapps/noderest01/node_modules
    networks:
      - node-rest01
volumes:
  rest01:
  rest01nmodules:
networks:
  node-rest01:
I used this command to build the image and start the container: docker-compose -f docker-compose.yaml up -d
I'm sure I need to update my YAML or Dockerfile to fix this. I've been searching for a while but can't find the origin of the problem, so I'd like to ask for your advice on how to fix my Docker files and connect to the container. If you have any suggestions, please let me know.
Best.

Using Docker with Node image to develop a VuejS (NuxtJs) app

The situation
I have to work on a Vue.js (Nuxt.js) SPA, so I'm trying to use Docker with a Node image to avoid installing Node on my PC, but I can't figure out how to make it work.
The project
The source code is in its own application folder, since it is versioned, and at the root level there is the docker-compose.yaml file.
The folder structure
my-project-folder
├ application
| └ ...
└ docker-compose.yaml
The docker-compose.yaml
version: "3.3"
services:
node:
# container_name: prova_node
restart: 'no'
image: node:lts-alpine
working_dir: /app
volumes:
- ./application:/app
The problem
The container starts but quits immediately with exit status 0 (so it executed correctly), but this way I can't use it to work on the project.
There is probably something I'm missing about the Node image or Docker in general. What I would like to do is connect to the Docker container to run npm commands like install, run start, etc., and then check the application in the browser at localhost:3000 or wherever it runs.
I would suggest using a Dockerfile with node as the base image and then creating an entrypoint that runs the application. That eliminates the need for volumes, which are used when we want to maintain state for our containers.
Your Dockerfile may look something like this:
FROM node:lts-alpine
RUN mkdir /app
COPY application/ /app/
# Install dependencies inside the image (assumes package.json lives in application/)
RUN npm install --prefix /app
EXPOSE 3000
CMD npm start --prefix /app
You can then either run it directly with the docker run command or use a docker-compose.yaml like the following:
version: "3.3"
services:
node:
# container_name: prova_node
restart: 'no'
build:
context: .
ports:
- 3000:3000
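With those two files in place you can, for example, build and start it like this (the image tag below is an arbitrary name, not from the question):

# With compose: build the image and start the container,
# then open http://localhost:3000 in the browser
docker-compose up --build

# Or without compose:
docker build -t my-nuxt-app .
docker run --rm -p 3000:3000 my-nuxt-app

If the app only listens on 127.0.0.1 inside the container, you may also need to configure it to bind to 0.0.0.0 before it is reachable from the host.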

My Docker container for a MEAN website dashboard does not work in Docker. Nodejs keeps restarting

For a university project I am working on, I am trying to get a MEAN stack website up and running via Docker images and containers. However, when I run the command:
docker-compose up --build
It results in this: nodejs permanently restarting.
When the command is run, I get these messages, at various points, which look like errors to me:
failed to get console mode for stdout: The handle is invalid.
and
nodejs exited with code 0
and then it seems like the connection to MongoDB keeps starting and ending, with these messages:
db | 2021-03-30T08:50:22.519+0000 I NETWORK [listener] connection accepted from 172.21.0.2:39627 #27 (1 connection now open)
db | 2021-03-30T08:50:22.519+0000 I NETWORK [conn27] end connection 172.21.0.2:39627 (0 connections now open)
Prior to running the above command, I tested that the website works without a connection to MongoDB by running docker build . in the Angular root folder containing the Dockerfile; the Express API side of it works, as I can visit the dashboard at http://localhost:3000/.
The full sequence of commands I run to reach the failed state is as follows:
docker-compose pull → docker-compose build → docker-compose up --build.
I am using Docker Desktop and running the commands in Powershell on Windows 10 Pro.
My Dockerfile is as follows:
# We use the official image as a parent image.
FROM node:10-alpine
RUN mkdir -p /home/node/app/node_modules && chown -R node:node /home/node/app
# Set the working directory.
WORKDIR /home/node/app
# Copy the file(s) from your host to your current location.
COPY package*.json ./
# Change the user to node. This will apply to both the runtime user and the following commands.
USER node
# Run the command inside your image filesystem.
RUN npm install
COPY --chown=node:node . .
# Build the website
RUN ./node_modules/.bin/ng build
# Add metadata to the image to describe which port the container is listening on at runtime.
EXPOSE 3000
# Run the specified command within the container.
CMD [ "node", "server.js" ]
And my docker-compose.yml is:
version: '3'
services:
  nodejs:
    build:
      context: .
      dockerfile: Dockerfile
    image: nodejs
    container_name: nodejs
    restart: unless-stopped
    env_file: .env
    environment:
      - MONGO_USERNAME=$MONGO_USERNAME
      - MONGO_PASSWORD=$MONGO_PASSWORD
      - MONGO_HOSTNAME=db
      - MONGO_PORT=$MONGO_PORT
      - MONGO_DB=$MONGO_DB
    ports:
      - "3000:3000"
    networks:
      - app-network
    command: ./wait-for.sh db:27017 -- /home/node/app/node_modules/.bin/nodemon server.js
  db:
    image: mongo:4.1.8-xenial
    container_name: db
    restart: unless-stopped
    env_file: .env
    environment:
      - MONGO_INITDB_ROOT_USERNAME=$MONGO_USERNAME
      - MONGO_INITDB_ROOT_PASSWORD=$MONGO_PASSWORD
    volumes:
      - dbdata:/data/db
    networks:
      - app-network
networks:
  app-network:
    driver: bridge
volumes:
  dbdata:
These are the same files that have been provided by the University and supposedly work.
I am new to Docker and containerization but shall try and provide you with any additional information, should you need it.

How to copy build files from one container to another or host on docker

I am trying to dockerize an application with a PHP backend and a Vue.js frontend. The backend works as I expect; however, after running npm run build within the frontend container, I need to copy the build files from the dist folder to the nginx container, or to the host and then use a volume to bring those files into the nginx container.
I tried to use a named volume:
services:
  frontend:
    .....
    volumes:
      - static:/var/www/frontend/dist
  nginx:
    .....
    volumes:
      - static:/var/www/frontend/dist
volumes:
  static:
I also tried the following, as suggested here, to bring the dist folder back to the host:
services:
  frontend:
    .....
    volumes:
      - ./frontend/dist:/var/www/frontend/dist
However, none of the above options works for me. Below are my docker-compose.yml file and frontend Dockerfile.
version: "3"
services:
database:
image: mysql:5.7.22
.....
backend:
build:
context: ./docker/php
dockerfile: Dockerfile
.....
frontend:
build:
context: .
dockerfile: docker/node/Dockerfile
target: 'build-stage'
container_name: frontend
stdin_open: true
tty: true
volumes:
- ./frontend:/var/www/frontend
nginx:
build:
context: ./docker/nginx
dockerfile: Dockerfile
container_name: nginx
restart: unless-stopped
ports:
- 80:80
volumes:
- ./backend/public:/var/www/backend/public:ro
- ./frontend/dist:/var/www/frontend/dist:ro
depends_on:
- backend
- frontend
Frontend Dockerfile
# Develop stage
FROM node:lts-alpine as develop-stage
WORKDIR /var/www/frontend
COPY /frontend/package*.json ./
RUN npm install
COPY /frontend .
# Build stage
FROM develop-stage as build-stage
RUN npm run build
You can combine the frontend image and the Nginx image into a single multi-stage build. This basically just involves copying your docker/node/Dockerfile as-is into the start of docker/nginx/Dockerfile, and then COPY --from=build-stage into the final image. You will also need to adjust some paths since you'll need to make the build context be the root of your project.
# Essentially what you had in the question
FROM node:lts AS frontend
WORKDIR /frontend
COPY frontend/package*.json .
RUN npm install
COPY frontend .
RUN npm run build
# And then assemble the Nginx image with content
FROM nginx
COPY --from=frontend /frontend/dist /var/www/html
Once you've done this, you can completely delete the frontend container from your docker-compose.yml file. Note that it never did anything – the image didn't declare a CMD, and the docker-compose.yml didn't provide a command: to run either – and so this shouldn't really change your application.
You can use a similar technique to copy the static files from your PHP application into the Nginx proxy. When all is said and done, this leaves you with a simpler docker-compose.yml file:
version: "3.8"
services:
database:
image: mysql:5.7.22
.....
backend:
build: ./docker/php # don't need to explicitly name Dockerfile
# note: will not need volumes: to export files
.....
# no frontend container any more
nginx:
build:
context: . # because you're COPYing files from other components
dockerfile: docker/nginx/Dockerfile
restart: unless-stopped
ports:
- 80:80
# no volumes:, everything is built into the image
depends_on:
- backend
(There are two practical problems with trying to share content using Docker named volumes. The first is that volumes never update their content once they're created, and in fact hide the content in their original image, so using volumes here actually causes changes in your source code to be ignored in favor of arbitrarily old content. In environments like Kubernetes, the cluster doesn't even provide the copy-on-first-use that Docker has, and there are significant limitations on how files can be shared between containers. This works better if you build self-contained images and don't try to share files.)
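As a hedged sketch of the "similar technique" mentioned above (the paths here are assumptions, not taken from the question), the final stage of docker/nginx/Dockerfile could gain one more COPY so the PHP application's static files also end up in the proxy image:

# Continuing the Dockerfile shown earlier in this answer
FROM nginx
COPY --from=frontend /frontend/dist /var/www/html
# Hypothetical: the PHP app's public assets, copied straight from the
# build context (which is the project root in this setup)
COPY backend/public /var/www/backend/public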

NodeJS 14 in a Docker container can't connect to Postgres DB (in/out docker)

I'm making a React Native app with a REST API (Node.js, Express) and PostgreSQL.
Everything works fine when hosted on my local machine.
Everything works fine when the API is hosted on my machine and PostgreSQL is in a Docker container.
But when the backend and the database are both in Docker, the database is reachable from every computer on my local network, but not by the backend.
I'm using docker-compose.
version: '3'
services:
  wallnerbackend:
    build:
      context: ./backend/
      dockerfile: ../Dockerfiles/server.dockerfile
    ports:
      - "8080:8080"
  wallnerdatabase:
    build:
      context: .
      dockerfile: ./Dockerfiles/postgresql.dockerfile
    ports:
      - "5432:5432"
    volumes:
      - db-data:/var/lib/postgresql/data
    env_file: .env_docker
volumes:
  db-data:
.env_docker and .env have the same parameters (only the name changes).
Here are my Dockerfiles:
Backend
FROM node:14.1
COPY package*.json ./
RUN npm install
COPY . .
CMD ["npm", "start"]
Database
FROM postgres:alpine
COPY ./wallnerdb.sql /docker-entrypoint-initdb.d/
I tried changing the hostname in the Postgres connection URL to the name of the Docker container, my host IP address, and localhost, but with no results.
It's also the same .env file (in my Node repo, with db_name, password, etc.) that I use locally to connect my backend to the DB.
Since you are using Node.js 14 in the Docker container, make sure that you have the latest pg dependency installed:
https://github.com/brianc/node-postgres/issues/2180
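For example (assuming npm as the package manager), the dependency can be bumped with:

# Upgrade the pg client library to its latest release
npm install pg@latest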
Alternatively: Downgrade to Node 12.
Also make sure that both the database and the backend are in the same network, and that the backend declares a depends_on for the database.
version: '3'
services:
  wallnerbackend:
    build:
      context: ./backend/
      dockerfile: ../Dockerfiles/server.dockerfile
    ports:
      - '8080:8080'
    networks:
      - default
    depends_on:
      - wallnerdatabase
  wallnerdatabase:
    build:
      context: .
      dockerfile: ./Dockerfiles/postgresql.dockerfile
    ports:
      - '5432:5432'
    volumes:
      - db-data:/var/lib/postgresql/data
    env_file: .env_docker
    networks:
      - default
volumes:
  db-data:
networks:
  default:
This should not be necessary in your case, as pointed out in the comments, since Docker Compose already creates a default network.
The container name "wallnerdatabase" is the hostname of your database, unless configured otherwise.
I expect the issue to be in the database connection URL since you did not share it.
Containers in the same network in a docker-compose.yml can reach each other using the service name. In your case the service name of the database is wallnerdatabase so this is the hostname that you should use in the database connection URL.
The database connection URL that you should use in your backend service should be similar to this:
postgres://user:password@wallnerdatabase:5432/dbname
Also make sure that the backend code is calling the database using the hostname wallnerdatabase as it is defined in the docker-compose.yml file.
Here is the reference on Networking in Docker Compose.
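For illustration, a minimal connection sketch using the node-postgres (pg) package; the user, password and database name below are placeholders, and only the hostname detail comes from the compose file above:

// db.js - hedged sketch using the node-postgres (pg) package
const { Pool } = require('pg');

const pool = new Pool({
  host: 'wallnerdatabase', // the compose service name, not localhost
  port: 5432,
  user: 'user',            // placeholder
  password: 'password',    // placeholder
  database: 'dbname',      // placeholder
});

module.exports = pool;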
You should access your DB using the service name as the hostname. Here is my working example - https://gitlab.com/gintsgints/vue-fullstack/-/blob/master/docker-compose.yml
