Node + Docker Compose: Development and production setup

I'm looking for a way to have both a development and a production environment in my project using Docker, Docker Compose, and Node.js.
How do I approach this?
Basically, what I want is one command to start my production environment and another to start my development environment (which could use nodemon, for example).
Here is my Dockerfile
FROM node:13-alpine
RUN mkdir /app
WORKDIR /app
COPY . /app
RUN npm install
RUN npm run build
EXPOSE 1234
CMD ["npm", "run", "prod"] # <--- Have a possibility to run something like "npm run dev" here instead
docker-compose.yml
version: "3"
services:
findus:
build: .
ports:
- "1234:1234"
links:
- mongo
container_name: myapp
mongo:
image: mongo
restart: always
ports:
- "4444:4444"
package.json
// ...
"scripts": {
  "build": "tsc",
  "dev": "nodemon source/index.ts",
  "prod": "node build/index.js"
},
// ...

You can make use of ENTRYPOINT and pass the command to the Docker container. Then you can use docker-compose file inheritance (override files) to launch Compose for the environment you want, appending the command to the entrypoint.
Dockerfile:
FROM node:13-alpine
RUN mkdir /app
WORKDIR /app
COPY . /app
RUN npm install
RUN npm run build
EXPOSE 1234
ENTRYPOINT ["npm", "run"]
Main docker-compose.yml:
version: "3"
services:
findus:
build: .
ports:
- "1234:1234"
links:
- mongo
container_name: myapp
mongo:
image: mongo
restart: always
ports:
- "4444:4444"
Then have two extra docker-compose files that append the command passed to the image entrypoint. For development, docker-compose.dev.yml:
version: "3"
services:
findus:
command: dev
and docker-compose.prod.yml:
version: "3"
services:
findus:
command: prod
Then, to launch the dev environment:
docker-compose -f docker-compose.yml -f docker-compose.dev.yml up
and for the prod environment:
docker-compose -f docker-compose.yml -f docker-compose.prod.yml up
The command will be appended to the ENTRYPOINT instruction, so the container effectively runs npm run dev or npm run prod.
This approach could also work with environment variables if you wanted to pass the command that way. You can find more information in the docs.
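For instance, a minimal sketch of the environment-variable variant, using Compose variable substitution (NPM_SCRIPT is a hypothetical variable name; it falls back to prod when unset):
version: "3"
services:
  findus:
    build: .
    # NPM_SCRIPT is a hypothetical variable; its value is appended to the "npm run" entrypoint
    command: ${NPM_SCRIPT:-prod}
Then the dev environment could be launched with:
NPM_SCRIPT=dev docker-compose up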

You can create a structure like this:
docker-compose.yml
docker-compose.dev.yml
docker-compose.prod.yml
where the base configuration resides in docker-compose.yml, while environment-specific details such as ports or user credentials go in docker-compose.dev.yml or docker-compose.prod.yml.
And then you can run the dev environment with:
docker-compose \
-f docker-compose.yml \
-f docker-compose.dev.yml \
up -d
Or the prod environment with:
docker-compose \
-f docker-compose.yml \
-f docker-compose.prod.yml \
up -d
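As a minimal sketch of what the overrides might contain (the values are illustrative, not taken from a real setup), docker-compose.dev.yml could mount the source tree and expose a debug port:
version: "3"
services:
  findus:
    # mount the source for live editing during development
    volumes:
      - .:/app
    ports:
      - "9229:9229"
while docker-compose.prod.yml might only pin restart behavior:
version: "3"
services:
  findus:
    restart: always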

One way to do it is to create two "targets" in your Dockerfile, like this:
Dockerfile:
FROM node:13-alpine AS development
RUN mkdir /app
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
RUN npm run build

FROM node:13-alpine AS production
ARG NODE_ENV=production
ENV NODE_ENV=${NODE_ENV}
WORKDIR /app
COPY package*.json ./
RUN npm install --only=production
COPY . .
COPY --from=development /app/dist ./dist
CMD ["node", "dist/index.js"]
And then, in your docker-compose, build only the development target:
version: '3.7'
services:
  main:
    container_name: main
    build:
      context: .
      target: development
    command: npm run dev
    ...
So in your dev environment, you can run:
docker-compose up
and then in prod you can build and run the production target directly with Docker (myapp is just an illustrative tag):
docker build --target production -t myapp .
docker run myapp
I'm assuming that npm run build generates a dist folder, so it's better to point node at that build output instead of at index.js in your project root. (Note that your package.json above outputs to build/, so adjust the paths to match your tsc configuration.)

Related

Update docker container packages in a node app

I have an issue.
I'm using Docker to host my Node apps while I'm studying.
Everything runs normally on the first build, but if I add a new package afterwards, it doesn't get installed.
So the question is: do I have to run "docker-compose up --build" every time I install a new package or change my package.json?
Here is an example of my Dockerfile and docker-compose.yml
Dockerfile
FROM node:alpine
WORKDIR /home/app_node
COPY package.json ./
RUN npm install
COPY . .
EXPOSE 3333
CMD ["npm", "start"]
docker-compose.yml
version: "3"
services:
app:
build: .
image: testeeee
command: npm start
ports:
- "3333:3333"
volumes:
- .:/home/app_node
- /home/app_node/node_modules
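A hedged note on what is probably happening: package.json is baked in at build time (COPY package.json plus RUN npm install), so the image does need rebuilding when it changes, and the anonymous volume /home/app_node/node_modules keeps the modules from the first build even across rebuilds. Recent docker-compose versions can recreate anonymous volumes at startup:
docker-compose up --build --renew-anon-volumes
(-V is the short form of --renew-anon-volumes.)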

Docker compose for monorepo with only one package.json for client and server

I have a monorepo, I am using only one package.json for both client and server, and I am running the app using an env var.
This is my folder structure
and this is my docker compose file
version: '3.8'
services:
  api:
    container_name: api
    build: ./src/api
    ports:
      - 8888:8888
    volumes:
      - .:/app
      - /app/node_modules
    environment:
      - APP_ENV=staging
  manager:
    container_name: manager
    build: ./src/manager
    ports:
      - 3000:3000
    volumes:
      - .:/app
      - /app/node_modules
    environment:
      - APP_ENV=staging
this is my Dockerfile for the individual apps
FROM public.ecr.aws/docker/library/node:16.13.0
# ENV NODE_OPTIONS=--max-old-space-size=8192
RUN npm install -g npm@8.1.0
# Bundle application source.
RUN mkdir -p /usr/src/app
COPY ["package*.json", "../../"]
WORKDIR /usr/src/app
# Bundle application source.
COPY . /usr/src/app
# WORKDIR ../../
RUN npm cache clear --force
RUN npm install
RUN npm ci --production
COPY . .
EXPOSE 8888
CMD ["node", "../../index.js"]
and this is my index.js, where I am running the apps using env variables
import { chdir } from 'node:process';

switch (process.env.ENTRYPOINT) {
  case 'api':
    await import('./src/api/index.js');
    break;
  case 'manager':
    chdir('src/manager');
    await import('./src/manager/server.js');
    break;
}
the package.json has scripts something like this:
"start:api": "ENTRYPOINT=api node index.js",
"start:manager": "ENTRYPOINT=manager node index.js",

getting a local docker-composed node / express & react app deployed on GCP using a single Dockerfile

I have a nodejs / express / react app running locally that starts a node server at :3001 and a react app at :3000 which can make requests to the express API.
I then made a /client/Dockerfile and a /server/Dockerfile, and a /docker-compose.yml which is capable of running my app locally without issue.
I now want to deploy this to GCP's Cloud Run, but GCP does not allow multiple docker images / docker-compose (AFAIK, it only has a single field for "Dockerfile"), so I am trying to reconfigure things to work with a single Dockerfile.
Here's what I had for the working, local instance:
client/Dockerfile
FROM node:lts-slim
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
EXPOSE 3000
CMD [ "npm", "start" ]
server/Dockerfile
FROM node:lts-slim
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
EXPOSE 3001
# You can change this
CMD [ "npm", "run", "dev" ]
docker-compose.yml
version: "3"
services:
client:
container_name: gcp-cloudrun-client
build:
context: ./client
dockerfile: Dockerfile
image: mheavers/gcp-cloudrun-client
ports:
- "3000:3000"
volumes:
- ./client:/usr/src/app
server:
container_name: gcp-cloudrun-server
build:
context: ./server
dockerfile: Dockerfile
image: mheavers/gcp-cloudrun-server
ports:
- "3001:3001"
volumes:
- ./server:/usr/src/app
How do I combine all this into a single Dockerfile and do away with docker-compose?
This was my final Dockerfile to replace docker-compose:
FROM node:lts-slim AS client
WORKDIR /usr/src/app
COPY client/ ./client/
RUN cd client && npm install && npm run build
FROM node:lts-slim AS server
WORKDIR /root/
COPY --from=client /usr/src/app/client/build ./client/build
COPY server/package*.json ./server/
RUN cd server && npm install
COPY server/index.js ./server/
EXPOSE 80
CMD ["node", "./server/index.js"]

Unable to enable hot reload for React on docker

I'm new to docker so I'm sure I'm missing something.
I'm trying to create a container with a react app. I'm using docker on a Windows 10 machine.
This is my Dockerfile
FROM node:latest
EXPOSE 3000
ENV PATH /app/node_modules/.bin:$PATH
# install app dependencies
COPY package.json ./
COPY package-lock.json ./
RUN npm install --silent
RUN npm install react-scripts#3.4.1 -g --silent
COPY . /app
WORKDIR /app
CMD ["npm","run", "start"]
and this is my docker-compose.yml
version: '3.7'
services:
  sample:
    container_name: prova-react1
    build:
      context: .
      dockerfile: Dockerfile
    volumes:
      - '.:/app'
      - '/app/node_modules'
    ports:
      - 3000:3000
    environment:
      - CHOKIDAR_USEPOLLING=true
      - COMPOSE_CONVERT_WINDOWS_PATHS=1
When I start the container, everything works fine in the browser, but when I go back to Visual Studio Code, modify a file, and save, nothing changes in the container or on the website.
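A hedged note in case the CRA version matters here: react-scripts 5 moved to webpack 5, whose watcher reads WATCHPACK_POLLING rather than CHOKIDAR_USEPOLLING, so on newer setups the environment block would need:
    environment:
      - WATCHPACK_POLLING=true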

docker - sh: nodemon: not found

I am creating a docker container on my EC2 instance.
When I run docker-compose up --build, the test-web container is not created successfully.
I ran docker logs test-web to check the logs, and I see the error below:
sh: nodemon: not found
I tried adding nodemon as a dependency in package.json and running docker-compose up --build again, but it still does not work.
Dockerfile
FROM node:lts-alpine
WORKDIR /server
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 3030
CMD ["npm", "run", "dev"]
docker-compose.yml
version: '2.1'
services:
  test-db:
    image: mysql:5.7
    environment:
      - MYSQL_ALLOW_EMPTY_PASSWORD=true
      - MYSQL_USER=admin
      - MYSQL_PASSWORD=12345
      - MYSQL_DATABASE=test
    volumes:
      - ./db-data:/var/lib/mysql
    ports:
      - 3306:3306
  test-web:
    environment:
      - NODE_ENV=local
      #- DEBUG=*
      - PORT=3030
    build: .
    command: >
      ./wait-for-db-redis.sh test-db npm run dev
    volumes:
      - ./:/server
    ports:
      - "3030:3030"
    depends_on:
      - test-db
package.json
"scripts": {
"dev": "nodemon --legacy-watch src/",
},
I added RUN npm install --global nodemon to my Dockerfile and it works now.
Dockerfile
FROM node:lts-alpine
RUN npm install --global nodemon
WORKDIR /server
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 3030
CMD ["npm", "run", "dev"]
