Docker container can't find an installed node dependency - node.js

When I start up my Docker container, it can't find one of my installed dependencies. I've tried removing my build and node_modules folders, going into my Docker container's terminal, running yarn list, and searching for the dependency there (it is listed). I've also tried running yarn add <dependency>, and it says it works, but the app still throws the error error TS2307: Cannot find module '<dependency>' or its corresponding type declarations.
It works fine locally, just not in Docker.
Here is my Dockerfile:
FROM node:16-alpine AS dev
WORKDIR /app
COPY yarn.lock ./
COPY package.json ./
COPY . .
WORKDIR /app/api
RUN yarn install
RUN yarn build
EXPOSE 8000
CMD ["yarn", "dev"]
And my docker compose:
services:
  api:
    profiles: ['api', 'web']
    container_name: slots-api
    stdin_open: true
    build:
      context: ./
      dockerfile: api/Dockerfile
      target: dev
    environment:
      NODE_ENV: development
    ports:
      - 8000:8000
      - 9229:9229 # for debugging
    volumes:
      - ./api:/app/api
      - /app/node_modules/ # do not mount node_modules
      - /app/api/node_modules/ # do not mount node_modules
    depends_on:
      - database
    command: yarn dev
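One thing worth checking (an educated guess, since the question carries no accepted answer): the anonymous volumes at /app/node_modules/ and /app/api/node_modules/ are seeded from the image the first time the container is created and then persist across image rebuilds, so a dependency added after that point can exist in the image while the running container still sees the stale volume. A minimal sketch of forcing them to be recreated:

# Rebuild the image so the new dependency is baked in
docker compose build api

# Remove containers and their anonymous volumes (-v), discarding the stale node_modules
docker compose down -v

# Recreate everything; the anonymous volumes are re-seeded from the fresh image
docker compose up api

If this is the cause, any future yarn add needs the same down -v / up cycle (or running yarn add inside the container) to keep image and volume in sync.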

Related

What should the behaviour of running `docker-compose exec <container_name> npm install` be on the host?

I am setting up Docker for development purposes, with a separate Dockerfile.dev for a Next.js front-end and an Express-powered back-end. For the Next.js application, the volume mounting is done such that the application runs perfectly fine on the dev server, in the container. However, because the node_modules directory on the host is empty, I enter the following command in a new terminal instance:
docker-compose exec frontend npm install
This gives me my needed modules, so that I get normal linting while developing.
However, on starting up the Express backend, I had issues with installing mongodb from npm (module not found when running the container). So I resorted to the same strategy and ran docker-compose exec backend npm install, and everything works. However, the node_modules directory on the host is still empty, which is not the same behaviour as with the frontend.
Why is this the case?
#Dockerfile.dev (frontend)
FROM node:19-alpine
WORKDIR "/app"
COPY ./package*.json .
RUN npm install
COPY . .
CMD ["npm", "run", "dev"]
#Dockerfile.dev (backend)
FROM node:19-alpine
WORKDIR "/app"
RUN npm install -g nodemon
COPY ./package*.json .
RUN npm install
COPY . .
CMD ["npm", "run", "dev"]
docker-compose.yml:
version: '3'
services:
  server:
    build:
      dockerfile: Dockerfile.dev
      context: ./server
    volumes:
      - /app/node_modules
      - ./server:/app
    ports:
      - "5000:5000"
  client:
    build:
      dockerfile: Dockerfile.dev
      context: ./client
    volumes:
      - /app/node_modules
      - ./client:/app
    ports:
      - "3000:3000"
  mongodb:
    image: mongo:latest
    volumes:
      - /home/panhaboth/Apps/home-server/data:/data/db
    ports:
      - "27017:27017"

Docker compose does not work if node_modules are not installed on local machine

My goal is to create a Docker dev environment for a full-stack app: React, Node.js and MongoDB (with hot reloading). My current dev environment works, but I noticed that docker compose up only works if the node_modules are installed on my local machine; otherwise it returns an error that nodemon is not installed, or that react-scripts cannot be found. It seems like Docker is looking for the node files on my local machine, but it should be installing them in the container during the build, right?
docker-compose.yml
version: '3.8'
services:
  server:
    build: ./server
    container_name: server_backend
    ports:
      - '7000:7000'
    volumes:
      - ./server:/app
  client:
    build: ./client
    container_name: client_frontend
    ports:
      - '3000:3000'
    volumes:
      - ./client:/app
    stdin_open: true
    tty: true
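The likely mechanism (offered as a reading of the files above, not an accepted answer): the bind mounts ./server:/app and ./client:/app replace the image's /app at runtime, hiding the node_modules that RUN npm install created during the build, which is exactly why things only work when the host happens to have its own node_modules. One common sketch of a fix is an anonymous volume per service, so the image's modules stay visible:

services:
  server:
    build: ./server
    ports:
      - '7000:7000'
    volumes:
      - ./server:/app
      - /app/node_modules   # shadows the bind mount, keeps the image's modules

(The same pair of volume entries would apply to the client service.)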
Dockerfile (backend server)
FROM node:16-bullseye-slim
WORKDIR /app
COPY package.json .
COPY package-lock.json .
RUN npm install
COPY . .
EXPOSE 7000
CMD ["node", "index.js"]
Dockerfile (frontend)
FROM node:16-bullseye-slim
WORKDIR /app
COPY package.json .
RUN npm install
COPY . .
EXPOSE 3000
CMD ["npm", "start"]
app architecture
- MyApp
-- client
-- server
Could it be that you are copying the whole directory, node_modules included, when you execute the COPY . . command? Try either specifying the directories to copy or adding a .dockerignore.
Found this in the documentation: https://docs.docker.com/language/nodejs/build-images/#create-a-dockerignore-file
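For reference, a minimal .dockerignore along the lines of that documentation page might look like this (entries beyond node_modules are illustrative; adjust to the project):

# .dockerignore — keep host-built artifacts out of the build context
node_modules
npm-debug.log
.git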

Unable to enable hot reload for React on docker

I'm new to docker so I'm sure I'm missing something.
I'm trying to create a container with a react app. I'm using docker on a Windows 10 machine.
This is my Dockerfile:
FROM node:latest
EXPOSE 3000
ENV PATH /app/node_modules/.bin:$PATH
# install app dependencies
COPY package.json ./
COPY package-lock.json ./
RUN npm install --silent
RUN npm install react-scripts@3.4.1 -g --silent
COPY . /app
WORKDIR /app
CMD ["npm","run", "start"]
And this is my docker compose:
version: '3.7'
services:
  sample:
    container_name: prova-react1
    build:
      context: .
      dockerfile: Dockerfile
    volumes:
      - '.:/app'
      - '/app/node_modules'
    ports:
      - 3000:3000
    environment:
      - CHOKIDAR_USEPOLLING=true
      - COMPOSE_CONVERT_WINDOWS_PATHS=1
When I start the container, everything works fine if I go to the browser. But when I go back to Visual Studio Code, make a modification to the files, and save, nothing happens in the container or on the website.
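One caveat worth adding (an assumption about the react-scripts version, which the Dockerfile pins only for the global install): CHOKIDAR_USEPOLLING=true is the polling switch for react-scripts 4 and earlier (webpack 4), while react-scripts 5 moved to webpack 5, where the equivalent is WATCHPACK_POLLING=true. A hedged sketch of an environment block covering both:

environment:
  - CHOKIDAR_USEPOLLING=true    # react-scripts <= 4 (webpack 4 / chokidar)
  - WATCHPACK_POLLING=true      # react-scripts >= 5 (webpack 5 / watchpack)
  - COMPOSE_CONVERT_WINDOWS_PATHS=1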

Node application won't start in Docker if there's a shared volume

I generated a Gatsby website locally, then set up a Docker container to run it. It's a standard setup: copying package.json, installing the modules, copying the local files and starting the dev script with a shared volume.
But when it starts, I run into an error:
gatsby_1 | ERROR
gatsby_1 |
gatsby_1 | There was a problem loading the local develop command. Gatsby may not be installed in your site's "node_modules" directory. Perhaps you need to run "npm install"? You might need to delete your "package-lock.json" as well.
Here is the Dockerfile :
FROM node:14.11.0
RUN npm install -g gatsby-cli
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 8000
CMD ["gatsby", "develop", "-H", "0.0.0.0"]
And the docker-compose.yml :
version: "3"
services:
db:
image: mariadb:latest
environment:
MYSQL_ROOT_PASSWORD: root
volumes:
- ./db:/var/lib/mysql
adminer:
image: adminer:latest
ports:
- 8082:8080
wordpress:
image: wordpress:5.5.1-php7.4-apache
ports:
- 8081:80
volumes:
- ./wordpess/plugins:/var/www/html/wp-content/plugins
- ./wordpess/themes:/var/www/html/wp-content/themes
gatsby:
build: ./gatsby
ports:
- 8080:8000
volumes:
- ./gatsby:/app
But if I remove the volume for the Gatsby container, everything runs well.
So my guess is there's something wrong with permissions somewhere, but I can't figure it out.
Here is a repo with the whole project.
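A plausible diagnosis (the question shows no accepted answer, so treat this as a sketch): the ./gatsby:/app bind mount replaces the image's /app wholesale at container start, so the node_modules produced by RUN npm install vanishes at runtime; that matches Gatsby's complaint and would explain why removing the volume fixes it, without any permissions being involved. The usual counter-measure is the anonymous-volume trick seen elsewhere on this page:

gatsby:
  build: ./gatsby
  ports:
    - 8080:8000
  volumes:
    - ./gatsby:/app
    - /app/node_modules   # keep the image's installed modules visible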

How to use sqlite3 with docker compose

Between the following tutorials:
Dockerizing create-react-app
Developing microservices - Node, react & docker
I have been able to convert my Node.js app to dockerized microservices, which are up and running and connecting to services. However, my app uses SQLite/Sequelize, and this was working perfectly prior to dockerizing.
With the new setup, I get this error:
/usr/src/app/node_modules/sequelize/lib/dialects/sqlite/connection-manager.js:31
    throw new Error('Please install sqlite3 package manually');
Error: Please install sqlite3 package manually
    at new ConnectionManager (/usr/src/app/node_modules/sequelize/lib/dialects/sqlite/connection-manager.js:31:15)
My questions are:
Is it possible to use sqlite3 with Docker?
If so, can anyone share a sample docker-compose.yml and Dockerfile combo that works for this?
My docker-compose.yml
version: '3.5'
services:
  user-service:
    container_name: user-service
    build: ./services/user/
    volumes:
      - './services/user:/usr/src/app'
      - './services/user/package.json:/usr/src/package.json'
    ports:
      - '9000:9000' # expose ports - HOST:CONTAINER
  web-service:
    container_name: web-service
    build:
      context: ./services/web
      dockerfile: Dockerfile
    volumes:
      - './services/web:/usr/src/app'
      - '/usr/src/app/node_modules'
    ports:
      - '3000:3000' # expose ports - HOST:CONTAINER
    environment:
      - NODE_ENV=development
    depends_on:
      - user-service
My user/ Dockerfile
FROM node:latest
# set working directory
RUN mkdir /usr/src/app
WORKDIR /usr/src/app
# add `/usr/src/node_modules/.bin` to $PATH
ENV PATH /usr/src/app/node_modules/.bin:$PATH
# install and cache app dependencies
ADD package.json /usr/src/package.json
RUN npm install
# start app
CMD ["npm", "start"]
My web/ Dockerfile
FROM node:latest
# set working directory
RUN mkdir /usr/src/app
WORKDIR /usr/src/app
# add `/usr/src/app/node_modules/.bin` to $PATH
ENV PATH /usr/src/app/node_modules/.bin:$PATH
# install and cache app dependencies
COPY package.json /usr/src/app/package.json
RUN npm install
RUN npm install react-scripts@1.1.4
RUN npm install gulp -g
# start app
CMD ["npm", "start"]
Many thanks.
Got it. The issue was that my local node_modules were being copied into the container. Hence, in sqlite3's lib/binding there was node-v57-darwin-x64 instead of the expected node-v57-linux-x64. Hence the mess.
I updated the Dockerfiles and docker-compose.yml as follows:
My docker-compose.yml
services:
  user-service:
    container_name: user-service
    build:
      context: ./services/user/
      dockerfile: Dockerfile
    volumes:
      - './services/user:/usr/src/app'
      - '/usr/src/node_modules'
    ports:
      - '9000:9000' # expose ports - HOST:CONTAINER
  web-service:
    container_name: web-service
    build:
      context: ./services/web/
      dockerfile: Dockerfile
    volumes:
      - './services/web:/usr/src/app'
      - '/usr/src/app/node_modules'
    ports:
      - '3000:3000' # expose ports - HOST:CONTAINER
    environment:
      - NODE_ENV=development
    depends_on:
      - user-service
My user/ Dockerfile
FROM node:latest
# set working directory
RUN mkdir /usr/src/app
WORKDIR /usr/src/app
# add `/usr/src/node_modules/.bin` to $PATH
ENV PATH /usr/src/node_modules/.bin:$PATH
# install and cache app dependencies
ADD package.json /usr/src/package.json
RUN npm install
# start app
CMD ["npm", "start"]
Helpful posts
Getting npm packages to be installed with docker-compose
