How to use sqlite3 with Docker Compose - Node.js

Using the following tutorials:
Dockerizing create-react-app
Developing microservices - Node, React & Docker
I have been able to convert my Node.js app into dockerized microservices, which are up and running and connecting to each other. However, my app uses SQLite/Sequelize, and this was working perfectly prior to dockerizing.
With the new setup, I get the error:
/usr/src/app/node_modules/sequelize/lib/dialects/sqlite/connection-manager.js:31
throw new Error('Please install sqlite3 package manually');
Error: Please install sqlite3 package manually
    at new ConnectionManager (/usr/src/app/node_modules/sequelize/lib/dialects/sqlite/connection-manager.js:31:15)
My questions are:
Is it possible to use sqlite3 with Docker?
If so, could anyone share a sample docker-compose.yml and Dockerfile combination that works for this?
My docker-compose.yml
version: '3.5'
services:
  user-service:
    container_name: user-service
    build: ./services/user/
    volumes:
      - './services/user:/usr/src/app'
      - './services/user/package.json:/usr/src/package.json'
    ports:
      - '9000:9000' # expose ports - HOST:CONTAINER
  web-service:
    container_name: web-service
    build:
      context: ./services/web
      dockerfile: Dockerfile
    volumes:
      - './services/web:/usr/src/app'
      - '/usr/src/app/node_modules'
    ports:
      - '3000:3000' # expose ports - HOST:CONTAINER
    environment:
      - NODE_ENV=development
    depends_on:
      - user-service
My user/ Dockerfile
FROM node:latest
# set working directory
RUN mkdir /usr/src/app
WORKDIR /usr/src/app
# add `/usr/src/node_modules/.bin` to $PATH
ENV PATH /usr/src/app/node_modules/.bin:$PATH
# install and cache app dependencies
ADD package.json /usr/src/package.json
RUN npm install
# start app
CMD ["npm", "start"]
My web/ Dockerfile
FROM node:latest
# set working directory
RUN mkdir /usr/src/app
WORKDIR /usr/src/app
# add `/usr/src/app/node_modules/.bin` to $PATH
ENV PATH /usr/src/app/node_modules/.bin:$PATH
# install and cache app dependencies
COPY package.json /usr/src/app/package.json
RUN npm install
RUN npm install react-scripts@1.1.4
RUN npm install gulp -g
# start app
CMD ["npm", "start"]
Many thanks.

Got it. The issue was that my local node_modules were being copied into the container. As a result, sqlite3's lib/binding contained node-v57-darwin-x64 instead of the expected node-v57-linux-x64 - hence the mess.
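One way to verify this is to check which prebuilt binding actually ended up inside the running container; a sketch, assuming the service names from the compose file above (the binding directory name varies with your Node and sqlite3 versions):

```shell
# open a shell inside the running user-service container
docker-compose exec user-service sh

# inside the container: list the sqlite3 bindings that are installed
ls node_modules/sqlite3/lib/binding
# a Linux image should show a node-v57-linux-x64 style directory,
# not the macOS node-v57-darwin-x64 copied in from the host

# if the wrong binding is present, recompile it for the container's platform
npm rebuild sqlite3
```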
I updated the Dockerfiles and docker-compose.yml as follows:
My docker-compose.yml
services:
  user-service:
    container_name: user-service
    build:
      context: ./services/user/
      dockerfile: Dockerfile
    volumes:
      - './services/user:/usr/src/app'
      - '/usr/src/node_modules'
    ports:
      - '9000:9000' # expose ports - HOST:CONTAINER
  web-service:
    container_name: web-service
    build:
      context: ./services/web/
      dockerfile: Dockerfile
    volumes:
      - './services/web:/usr/src/app'
      - '/usr/src/app/node_modules'
    ports:
      - '3000:3000' # expose ports - HOST:CONTAINER
    environment:
      - NODE_ENV=development
    depends_on:
      - user-service
My user/ Dockerfile
FROM node:latest
# set working directory
RUN mkdir /usr/src/app
WORKDIR /usr/src/app
# add `/usr/src/node_modules/.bin` to $PATH
ENV PATH /usr/src/node_modules/.bin:$PATH
# install and cache app dependencies
ADD package.json /usr/src/package.json
RUN npm install
# start app
CMD ["npm", "start"]
Helpful posts
Getting npm packages to be installed with docker-compose

Related

What should the behaviour of running `docker-compose exec <container_name> npm install` be on the host?

I am setting up Docker for development purposes, with a separate Dockerfile.dev for a NextJS front-end and an ExpressJS-powered back-end. For the NextJS application, the volume mounting is done such that the application runs perfectly fine in the dev server, in the container. However, due to the empty node_modules directory on the host, I enter the following command in a new terminal instance:
docker-compose exec frontend npm install
This gives me my needed modules so that I get normal linting while developing.
However, on starting up the Express backend, I had issues with installing mongodb from npm (module not found when running the container). So I resorted to the same strategy by performing docker-compose exec backend npm install, and everything works. However, the node_modules directory on the host is still empty, which is not the same behaviour as with the frontend.
Why is this the case?
#Dockerfile.dev (frontend)
FROM node:19-alpine
WORKDIR "/app"
COPY ./package*.json .
RUN npm install
COPY . .
CMD ["npm", "run", "dev"]
#Dockerfile.dev (backend)
FROM node:19-alpine
WORKDIR "/app"
RUN npm install -g nodemon
COPY ./package*.json .
RUN npm install
COPY . .
CMD ["npm", "run", "dev"]
docker-compose.yml:
version: '3'
services:
  server:
    build:
      dockerfile: Dockerfile.dev
      context: ./server
    volumes:
      - /app/node_modules
      - ./server:/app
    ports:
      - "5000:5000"
  client:
    build:
      dockerfile: Dockerfile.dev
      context: ./client
    volumes:
      - /app/node_modules
      - ./client:/app
    ports:
      - "3000:3000"
  mongodb:
    image: mongo:latest
    volumes:
      - /home/panhaboth/Apps/home-server/data:/data/db
    ports:
      - "27017:27017"

NestJS does not connect with MongoDB when using Docker containers

The NestJS app connects normally with MongoDB, but after creating Docker containers for them, NestJS no longer connects to MongoDB.
here's the Dockerfile
# Base image
FROM node:16-alpine
# Create app directory
WORKDIR /app
# A wildcard is used to ensure both package.json AND package-lock.json are copied
COPY package*.json ./
# Install app dependencies
RUN yarn install
# Bundle app source
COPY . .
# Creates a "dist" folder with the production build
RUN yarn build
here's the docker compose file
version: '3.8'
services:
  mongodb:
    image: mongo:latest
    env_file:
      - .env
    ports:
      - 27017:27017
    volumes:
      - mongodb_data_container:/data/db
  api:
    build: .
    volumes:
      - .:/app
      - /app/node_modules
    ports:
      - ${PORT}:${PORT}
    command: npm run start:dev
    env_file:
      - .env
    depends_on:
      - mongodb
volumes:
  mongodb_data_container:
here's .env file
PORT=3000
DB_CONNECTION_STRING=mongodb://127.0.0.1:27017/db-name
here's the connect method inside NestJS app
MongooseModule.forRoot(process.env.DB_CONNECTION_STRING)
For everyone facing the same issue
replace mongodb://127.0.0.1:27017/db-name
with mongodb://mongodb:27017/db-name
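The reason: inside the api container, 127.0.0.1 refers to that container itself, not to the mongodb container. Compose puts the services on a shared network where each one is reachable under its service name, so the .env file becomes:

```
PORT=3000
DB_CONNECTION_STRING=mongodb://mongodb:27017/db-name
```

Here `mongodb` is the service name from the compose file above; if the service were named differently, the hostname would change accordingly.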

Docker compose for monorepo with only one package.json for client and server

I have a monorepo and I am using only one package.json for both client and server, and I am running the apps using an env var.
This is my folder structure
and this is my docker compose file
version: '3.8'
services:
  api:
    container_name: api
    build: ./src/api
    ports:
      - 8888:8888
    volumes:
      - .:/app
      - /app/node_modules
    environment:
      - APP_ENV=staging
  manager:
    container_name: manager
    build: ./src/manager
    ports:
      - 3000:3000
    volumes:
      - .:/app
      - /app/node_modules
    environment:
      - APP_ENV=staging
this is my Dockerfile for the individual apps
FROM public.ecr.aws/docker/library/node:16.13.0
# ENV NODE_OPTIONS=--max-old-space-size=8192
RUN npm install -g npm@8.1.0
# Bundle application source.
RUN mkdir -p /usr/src/app
COPY ["package*.json", "../../"]
WORKDIR /usr/src/app
# Bundle application source.
COPY . /usr/src/app
# WORKDIR ../../
RUN npm cache clear --force
RUN npm install
RUN npm ci --production
COPY . .
EXPOSE 8888
CMD ["node", "../../index.js"]
and this is my index.js where i am running the apps using env variables
import { chdir } from 'process';

switch (process.env.ENTRYPOINT) {
  case 'api':
    await import('./src/api/index.js');
    break;
  case 'manager':
    chdir('src/manager');
    await import('./src/manager/server.js');
    break;
}
the package.json has scripts something like this
"start:api": "ENTRYPOINT=api node index.js",
"start:manager": "ENTRYPOINT=manager node index.js",

Docker compose does not work if node_modules are not installed on local machine

My goal is to create a Docker dev environment for a full-stack app: React, NodeJS and MongoDB (with hot reloading). My current dev environment works, but I noticed that "docker compose up" will only work if the node_modules are installed on my local machine - otherwise it returns an error that nodemon is not installed, or react-scripts is not found. It seems like Docker is looking for the node files on my local machine, but it should be installing them in the container during the compose, right?
docker-compose.yml
version: '3.8'
services:
  server:
    build: ./server
    container_name: server_backend
    ports:
      - '7000:7000'
    volumes:
      - ./server:/app
  client:
    build: ./client
    container_name: client_frontend
    ports:
      - '3000:3000'
    volumes:
      - ./client:/app
    stdin_open: true
    tty: true
Dockerfile (backend server)
FROM node:16-bullseye-slim
WORKDIR /app
COPY package.json .
COPY package-lock.json .
RUN npm install
COPY . .
EXPOSE 7000
CMD ["node", "index.js"]
Dockerfile (frontend)
FROM node:16-bullseye-slim
WORKDIR /app
COPY package.json .
RUN npm install
COPY . .
EXPOSE 3000
CMD ["npm", "start"]
app architecture
- MyApp
-- client
-- server
Could it be that you are copying the whole directory, including node_modules, when you execute the COPY . . command? Try either specifying the directories to copy or adding a .dockerignore.
Found this in the documentation, https://docs.docker.com/language/nodejs/build-images/#create-a-dockerignore-file
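For example, a minimal .dockerignore next to each Dockerfile would keep the host's modules out of the build context:

```
node_modules
npm-debug.log
.git
```

With node_modules excluded, COPY . . no longer overwrites the modules that RUN npm install created in the image.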

Unable to enable hot reload for React on docker

I'm new to docker so I'm sure I'm missing something.
I'm trying to create a container with a react app. I'm using docker on a Windows 10 machine.
This is my docker file
FROM node:latest
EXPOSE 3000
ENV PATH /app/node_modules/.bin:$PATH
# install app dependencies
COPY package.json ./
COPY package-lock.json ./
RUN npm install --silent
RUN npm install react-scripts@3.4.1 -g --silent
COPY . /app
WORKDIR /app
CMD ["npm","run", "start"]
and this is my docker compose
version: '3.7'
services:
  sample:
    container_name: prova-react1
    build:
      context: .
      dockerfile: Dockerfile
    volumes:
      - '.:/app'
      - '/app/node_modules'
    ports:
      - 3000:3000
    environment:
      - CHOKIDAR_USEPOLLING=true
      - COMPOSE_CONVERT_WINDOWS_PATHS=1
When I start the container, everything works fine in the browser. But when I go back to Visual Studio Code, make a modification to the files and save, nothing happens in the container or on the website.
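One thing worth checking: CHOKIDAR_USEPOLLING applies to the webpack 4 based react-scripts. If the project is on react-scripts 5 (webpack 5), the polling variable is WATCHPACK_POLLING instead, so the environment section would look like this (a sketch, keep whichever line matches your react-scripts version):

```yaml
environment:
  - CHOKIDAR_USEPOLLING=true       # react-scripts 4 and earlier (webpack 4)
  - WATCHPACK_POLLING=true         # react-scripts 5 (webpack 5)
  - COMPOSE_CONVERT_WINDOWS_PATHS=1
```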
