docker - sh: nodemon: not found - node.js

I am creating a Docker container on my EC2 instance.
When I run docker-compose up --build, the test-web container is not created successfully.
When I check the logs with docker logs test-web, I see the error below:
sh: nodemon: not found
I tried adding nodemon as a dependency in package.json and running docker-compose up --build again, but it still does not work.
Dockerfile
FROM node:lts-alpine
WORKDIR /server
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 3030
CMD ["npm", "run", "dev"]
docker-compose.yml
version: '2.1'
services:
  test-db:
    image: mysql:5.7
    environment:
      - MYSQL_ALLOW_EMPTY_PASSWORD=true
      - MYSQL_USER=admin
      - MYSQL_PASSWORD=12345
      - MYSQL_DATABASE=test
    volumes:
      - ./db-data:/var/lib/mysql
    ports:
      - 3306:3306
  test-web:
    environment:
      - NODE_ENV=local
      #- DEBUG=*
      - PORT=3030
    build: .
    command: >
      ./wait-for-db-redis.sh test-db npm run dev
    volumes:
      - ./:/server
    ports:
      - "3030:3030"
    depends_on:
      - test-db
package.json
"scripts": {
  "dev": "nodemon --legacy-watch src/"
},
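Since the dev script invokes nodemon, the binary must be resolvable inside the container. An alternative to installing it globally is declaring it in package.json so that npm install at build time pulls it in. A minimal sketch (the version pin is an assumption, not from the original post):

```json
{
  "scripts": {
    "dev": "nodemon --legacy-watch src/"
  },
  "devDependencies": {
    "nodemon": "^2.0.22"
  }
}
```

Note that npm install skips devDependencies when NODE_ENV=production or npm's production config is set, and a bind mount over the install directory can still hide the installed modules, so this alone may not be enough.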

I added RUN npm install --global nodemon to the Dockerfile and it works now.
Dockerfile
FROM node:lts-alpine
RUN npm install --global nodemon
WORKDIR /server
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 3030
CMD ["npm", "run", "dev"]
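A likely reason the package.json approach alone failed is the bind mount in docker-compose.yml: ./:/server overlays the image's /server with the host directory, so a node_modules installed at build time is hidden unless the host has one too. A common pattern is to add an anonymous volume for node_modules; a sketch, assuming the service layout above:

```yaml
test-web:
  volumes:
    - ./:/server
    - /server/node_modules   # anonymous volume keeps the image's node_modules visible
```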

Related

What should the behaviour of running `docker-compose exec <container_name> npm install` be on the host?

I am setting up Docker for development purposes, with a separate Dockerfile.dev for a NextJS front-end and an ExpressJS-powered back-end. For the NextJS application, the volume mounting is done such that the application runs perfectly fine on the dev server in the container. However, because the node_modules directory on the host is empty, I enter the following command in a new terminal instance:
docker-compose exec frontend npm install
This gives me the modules I need so that I get normal linting while developing.
However, on starting up the Express backend, I had issues with installing mongodb from npm (module not found when running the container). So I resorted to the same strategy and ran docker-compose exec backend npm install, and everything works. However, the node_modules directory on the host is still empty, which is not the same behaviour as with the frontend.
Why is this the case?
#Dockerfile.dev (frontend)
FROM node:19-alpine
WORKDIR "/app"
COPY ./package*.json ./
RUN npm install
COPY . .
CMD ["npm", "run", "dev"]
#Dockerfile.dev (backend)
FROM node:19-alpine
WORKDIR "/app"
RUN npm install -g nodemon
COPY ./package*.json ./
RUN npm install
COPY . .
CMD ["npm", "run", "dev"]
docker-compose.yml:
version: '3'
services:
  server:
    build:
      dockerfile: Dockerfile.dev
      context: ./server
    volumes:
      - /app/node_modules
      - ./server:/app
    ports:
      - "5000:5000"
  client:
    build:
      dockerfile: Dockerfile.dev
      context: ./client
    volumes:
      - /app/node_modules
      - ./client:/app
    ports:
      - "3000:3000"
  mongodb:
    image: mongo:latest
    volumes:
      - /home/panhaboth/Apps/home-server/data:/data/db
    ports:
      - "27017:27017"
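A plausible explanation, sketched here as an assumption rather than a confirmed answer: the anonymous volume /app/node_modules shadows the node_modules subdirectory of the bind mount, so an npm install run inside the container writes into that anonymous volume, never into the host directory. Annotated:

```yaml
volumes:
  - /app/node_modules   # anonymous volume: wins for this path; "npm install" inside
                        # the container lands here, so the host copy stays empty
  - ./server:/app       # bind mount: everything else under /app comes from the host
```

If host-side linting is the goal, running npm install on the host (outside Docker) is the usual workaround.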

Dev dependencies not getting installed in docker container

I am new to Docker. I am trying to create containers for React and Express and run both containers on the same network using Docker Compose.
Below is my dockerfile for frontend:
FROM node:alpine
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 3000
CMD ["npm","run","start"]
Below is my dockerfile for backend
FROM node:alpine
WORKDIR /app
COPY package*.json ./
RUN NODE_ENV=development npm install
COPY . .
EXPOSE 5000
CMD ["npm","run","server"]
Below is my docker-compose.yml
version: '3'
services:
  client:
    build:
      context: './frontend'
      dockerfile: Dockerfile
    ports:
      - 3000:3000
    container_name: react_cont
    environment:
      - WATCHPACK_POLLING=true
    networks:
      - mern
    volumes:
      - ./frontend:/app
    depends_on:
      - server
  server:
    build:
      context: './backend'
      dockerfile: Dockerfile
    ports:
      - 5000:5000
    container_name: express_cont
    networks:
      - mern
    volumes:
      - ./backend:/app
networks:
  mern:
The react container is created and runs successfully, but the express container is not created, failing with the error
sh: nodemon: not found
I had installed nodemon as a dev dependency.
Any help is appreciated. Thanks in advance.
Answering my own question.
I had installed nodemon globally on my machine but forgot to install it as a dependency of this project.
Since nodemon was installed globally on my machine, I got no errors while running my server.js with nodemon: the nodemon server.js script worked while I was developing the project locally, before moving it into a Docker container.
But since my package.json did not list nodemon as a dependency, and I had not installed it separately in the container, nodemon never got installed in the image, which produced the error.
You can try deleting the node_modules folder from your source code and adding the --production=false flag explicitly to the npm install command. I think it's a caching problem.
You may need to install the nodemon package globally in your Docker image:
RUN NODE_ENV=development npm install && npm install --global nodemon
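The suggestions above can be combined into the backend Dockerfile; a sketch (base image and paths taken from the question, the flag placement is the suggestion, not verified against the original project):

```dockerfile
FROM node:alpine
WORKDIR /app
COPY package*.json ./
# --production=false forces devDependencies (e.g. nodemon) to be installed
# even if NODE_ENV or npm's production config says otherwise
RUN npm install --production=false
COPY . .
EXPOSE 5000
CMD ["npm","run","server"]
```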

Docker container can't find an installed node dependency

When I start up my Docker container, it can't find one of my installed dependencies. I've tried removing my build and node_modules folders, going into my Docker container's terminal, running yarn list, and searching for the dependency there (it is listed). I've also tried running yarn add <dependency>, and it says it works, but it still throws the error error TS2307: Cannot find module '<dependency>' or its corresponding type declarations.
It works fine locally, just not in Docker.
Here is my Dockerfile
FROM node:16-alpine AS dev
WORKDIR /app
COPY yarn.lock ./
COPY package.json ./
COPY . .
WORKDIR /app/api
RUN yarn install
RUN yarn build
EXPOSE 8000
CMD ["yarn", "dev"]
And my docker compose:
services:
  api:
    profiles: ['api', 'web']
    container_name: slots-api
    stdin_open: true
    build:
      context: ./
      dockerfile: api/Dockerfile
      target: dev
    environment:
      NODE_ENV: development
    ports:
      - 8000:8000
      - 9229:9229 # for debugging
    volumes:
      - ./api:/app/api
      - /app/node_modules/ # do not mount node_modules
      - /app/api/node_modules/ # do not mount node_modules
    depends_on:
      - database
    command: yarn dev

Docker compose for monorepo with only one package.json for client and server

I have a monorepo, I am using a single package.json for both client and server, and I am selecting which app to run via an environment variable.
This is my folder structure
and this is my docker compose file
version: '3.8'
services:
  api:
    container_name: api
    build: ./src/api
    ports:
      - 8888:8888
    volumes:
      - .:/app
      - /app/node_modules
    environment:
      - APP_ENV=staging
  manager:
    container_name: manager
    build: ./src/manager
    ports:
      - 3000:3000
    volumes:
      - .:/app
      - /app/node_modules
    environment:
      - APP_ENV=staging
and this is my Dockerfile for the individual apps
FROM public.ecr.aws/docker/library/node:16.13.0
# ENV NODE_OPTIONS=--max-old-space-size=8192
RUN npm install -g npm@8.1.0
# Bundle application source.
RUN mkdir -p /usr/src/app
COPY ["package*.json", "../../"]
WORKDIR /usr/src/app
# Bundle application source.
COPY . /usr/src/app
# WORKDIR ../../
RUN npm cache clear --force
RUN npm install
RUN npm ci --production
COPY . .
EXPOSE 8888
CMD ["node", "../../index.js"]
and this is my index.js, where I run the apps using env variables
switch (process.env.ENTRYPOINT) {
  case 'api':
    await import('./src/api/index.js');
    break;
  case 'manager':
    chdir('src/manager');
    await import('./src/manager/server.js');
    break;
}
The package.json has scripts like this:
"start:api": "ENTRYPOINT=api node index.js",
"start:manager": "ENTRYPOINT=manager node index.js",
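Given the index.js switch above, each compose service could select its app by setting ENTRYPOINT directly instead of going through the npm scripts; a sketch (service names taken from the compose file above, the command/environment wiring is an assumption, not from the original post):

```yaml
services:
  api:
    environment:
      - ENTRYPOINT=api      # index.js switches on this and imports ./src/api/index.js
    command: ["node", "index.js"]
  manager:
    environment:
      - ENTRYPOINT=manager  # index.js chdirs and imports ./src/manager/server.js
    command: ["node", "index.js"]
```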

Node + Docker Compose: Development and production setup

I'm looking for a solution to have both a development and a production environment in my project using docker, docker-compose and nodejs.
How do I approach this?
Basically what I want is a command to start my docker production environment, and a command to start my development environment (which could use nodemon for example).
Here is my Dockerfile
FROM node:13-alpine
RUN mkdir /app
WORKDIR /app
COPY . /app
RUN npm install
RUN npm run build
EXPOSE 1234
CMD ["npm", "run", "prod"] # <--- Have a possibility to run something like "npm run dev" here instead
docker-compose.yml
version: "3"
services:
  findus:
    build: .
    ports:
      - "1234:1234"
    links:
      - mongo
    container_name: myapp
  mongo:
    image: mongo
    restart: always
    ports:
      - "4444:4444"
package.json
// ...
"scripts": {
  "build": "tsc",
  "dev": "nodemon source/index.ts",
  "prod": "node build/index.js"
},
// ...
You can make use of ENTRYPOINT and pass the command to the Docker container. Then you can use docker-compose inheritance to launch the compose file for the environment you want and append a command to the entrypoint.
Dockerfile :
FROM node:13-alpine
RUN mkdir /app
WORKDIR /app
COPY . /app
RUN npm install
RUN npm run build
EXPOSE 1234
ENTRYPOINT ["npm", "run"]
Main docker-compose.yml :
version: "3"
services:
  findus:
    build: .
    ports:
      - "1234:1234"
    links:
      - mongo
    container_name: myapp
  mongo:
    image: mongo
    restart: always
    ports:
      - "4444:4444"
And then have two docker-compose files to append the command passed to the image entry point. For development - docker-compose.dev.yml :
version: "3"
services:
  findus:
    command: dev
and docker-compose.prod.yml :
version: "3"
services:
  findus:
    command: prod
Then to launch dev environment :
docker-compose -f docker-compose.yml -f docker-compose.dev.yml up
and for prod environment :
docker-compose -f docker-compose.yml -f docker-compose.prod.yml up
So the command will be appended to the ENTRYPOINT instruction.
This approach could also work with environment variables if you wanted to pass the command as an environment variable. You can find more information in the docs.
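As a sketch of the environment-variable variant (the RUN_SCRIPT name is an assumption, not from the docs), docker-compose substitutes ${VAR:-default} from the host environment before starting the container:

```yaml
services:
  findus:
    entrypoint: ["sh", "-c", "npm run ${RUN_SCRIPT:-prod}"]
```

Then RUN_SCRIPT=dev docker-compose up starts the dev script without a second compose file, while plain docker-compose up falls back to prod.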
You can create a structure like this:
docker-compose.yml -->
docker-compose.dev.yml
docker-compose.prod.yml
Where the base configuration resides in docker-compose.yml, while environment-specific info such as ports or user credentials would be in docker-compose.dev.yml or docker-compose.prod.yml
And then you can run the dev environment with:
docker-compose \
-f docker-compose.yml \
-f docker-compose.dev.yml \
up -d
Or the prod environment with:
docker-compose \
-f docker-compose.yml \
-f docker-compose.prod.yml \
up -d
One way to do it is to create two "targets" on your Dockerfile like this:
Dockerfile:
FROM node:13-alpine AS development
RUN mkdir /app
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
RUN npm run build
FROM node:13-alpine AS production
ARG NODE_ENV=production
ENV NODE_ENV=${NODE_ENV}
WORKDIR /app
COPY package*.json ./
RUN npm install --only=production
COPY . .
COPY --from=development /app/dist ./dist
CMD ["node", "dist/index.js"]
And then in your docker-compose, run only the development target:
version: '3.7'
services:
  main:
    container_name: main
    build:
      context: .
      target: development
    command: npm run dev
    ...
So in your dev environment, you can run:
docker-compose up
and then in prod you can build the production stage directly with
docker build --target production .
I'm assuming that when you run "npm run build" you are generating a dist/production folder, so it's better to run node there instead of pointing straight at index.js in your root folder.