How to separate backend and frontend in Docker (Nuxt + Node.js)

I want to split my Nuxt app and dockerize the backend and frontend into separate folders:
frontend - one container (Nuxt.js)
backend - second container (Express.js)
Project structure:
my_nuxt_app
|-backend
|-frontend
docker-compose.yaml
Locally, this construction works:
serverMiddleware: [
  { path: '/api', handler: '../backend' }
],
But I don't understand how to recreate this in Docker. The serverMiddleware setting needs to link to the backend container, but I don't understand how. Please help if you know.
version: '3'
services:
  frontend:
    container_name: frontend
    build:
      context: ./frontend
    ports:
      - 8080:8080
  backend:
    container_name: backend
    build:
      context: ./backend
    ports:
      - 3000:3000
backend Dockerfile
FROM node:16.16.0-alpine
RUN npm i --location=global --force pm2
RUN npm i --location=global --force yarn
WORKDIR /backend
COPY . .
CMD ["pm2-runtime", "backend.js","--json","--no-auto-exit","--only","backend"]
frontend Dockerfile
FROM node:16.16.0-alpine
RUN npm i --location=global --force yarn
WORKDIR /mmc
COPY . .
CMD ["yarn","dev"]

Don't use the serverMiddleware property.
Just change these props in nuxt.config.js:
server: {
  host: '0.0.0.0',
  port: 8080
},
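That makes the Nuxt server listen on all interfaces, so the container's port mapping works. To reach the Express container from the frontend, point API calls at the Compose service name rather than at a relative handler path. A minimal sketch, assuming the Nuxt 2 modules @nuxtjs/axios and @nuxtjs/proxy are installed and the backend service is named backend as in the compose file above:

// nuxt.config.js (sketch)
export default {
  server: {
    host: '0.0.0.0', // listen on all interfaces inside the container
    port: 8080
  },
  modules: ['@nuxtjs/axios'],
  axios: {
    proxy: true // send axios requests through the proxy table below
  },
  proxy: {
    // 'backend' resolves via Docker's internal DNS on the Compose network
    '/api': 'http://backend:3000'
  }
}

With this, the frontend container forwards /api requests to the backend container, and no serverMiddleware entry is needed.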

Related

What should the behaviour of running `docker-compose exec <container_name> npm install` be on the host?

I am setting up Docker for development purposes, with a separate Dockerfile.dev for a Next.js front-end and an Express-powered back-end. For the Next.js application, the volume mounting is done such that the application runs perfectly fine on the dev server in the container. However, due to the empty node_modules directory on the host, I enter the following command in a new terminal instance:
docker-compose exec frontend npm install
This gives me the modules I need so that I get normal linting while developing.
However, on starting up the Express backend, I had issues with installing mongodb from npm (module not found when running the container). So I resorted to the same strategy, ran docker-compose exec backend npm install, and everything works. However, the node_modules directory on the host is still empty, which is not the same behaviour as with the frontend.
Why is this the case?
#Dockerfile.dev (frontend)
FROM node:19-alpine
WORKDIR "/app"
COPY ./package*.json .
RUN npm install
COPY . .
CMD ["npm", "run", "dev"]
#Dockerfile.dev (backend)
FROM node:19-alpine
WORKDIR "/app"
RUN npm install -g nodemon
COPY ./package*.json .
RUN npm install
COPY . .
CMD ["npm", "run", "dev"]
docker-compose.yml:
version: '3'
services:
  server:
    build:
      dockerfile: Dockerfile.dev
      context: ./server
    volumes:
      - /app/node_modules
      - ./server:/app
    ports:
      - "5000:5000"
  client:
    build:
      dockerfile: Dockerfile.dev
      context: ./client
    volumes:
      - /app/node_modules
      - ./client:/app
    ports:
      - "3000:3000"
  mongodb:
    image: mongo:latest
    volumes:
      - /home/panhaboth/Apps/home-server/data:/data/db
    ports:
      - "27017:27017"

React app URLs are undefined when running in Nginx in Docker

I am deploying my React app (after building it) to an Nginx server in Docker.
This React app connects to a Node.js server running on localhost:3000.
The React app runs on localhost:3005.
When the React app is deployed to Nginx + Docker, the API URLs referring to the Node.js server show up as undefined:
POST http://undefined/api/auth/login net::ERR_NAME_NOT_RESOLVED
It should be: http://localhost:3000/api/auth/login
This issue does not seem to come from React but from Nginx or Docker.
The React app works perfectly fine using serve /build -p 3005, basically testing it without Nginx + Docker on a basic local server.
Also, I am not using any environment variables; all URLs are hard-coded.
I have not added any configuration for Nginx; I am using the default Docker image and just copying my React app into it.
Here is the relevant part of my Docker configuration.
Dockerfile.dev (react app)
FROM nginx:1.23
WORKDIR /react
COPY ./build/ /usr/share/nginx/html
EXPOSE 3005
Dockerfile.dev (nodejs server)
FROM node:16.15-alpine3.15
WORKDIR /usr/src/server
COPY ./package.json .
RUN npm install
COPY . .
ENV NODE_ENV=development
EXPOSE 3000
CMD ["npm", "run", "app"]
Docker Compose:
version: "3"
services:
  client:
    build:
      context: ./react
      dockerfile: Dockerfile.dev
    ports:
      - "3005:80"
    volumes:
      - /react/node_modules
      - ./react:/react
    deploy:
      restart_policy:
        condition: always
  node-server:
    network_mode: "host"
    build:
      context: ./server
      dockerfile: Dockerfile.dev
    ports:
      - "3000:3000"
    deploy:
      restart_policy:
        condition: always
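One common way to sidestep the problem is to stop hard-coding http://localhost:3000 in the bundle and let Nginx proxy a relative /api path to the backend over the Compose network. A sketch of such a config, assuming the service keeps the name node-server and network_mode: "host" is removed (host networking takes the container off the Compose network, so its service name would not resolve):

# default.conf (sketch) - copied over the default Nginx site config
server {
    listen 80;

    # serve the static React build
    location / {
        root /usr/share/nginx/html;
        try_files $uri /index.html;
    }

    # forward API calls to the backend container
    location /api/ {
        proxy_pass http://node-server:3000;
    }
}

The React code would then call fetch('/api/auth/login') with no host or port at all.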

Docker React + Express app not running after one process runs

Hi, I dockerized a React.js and Express.js project. Everything worked well when I had written separate docker-compose files.
But now I have written one compose file.
docker-compose-all-dev.yml file
version: '3.7'
services:
  client:
    container_name: react-dev
    build:
      context: ./client
      dockerfile: Dockerfile.react-dev
    ports:
      - 3000:3000
  server:
    container_name: server-dev
    build:
      context: ./server
      dockerfile: Dockerfile.server-dev
    ports:
      - 5000:5000
Now only the client is running. Why isn't the backend server running?
But it works when I run them as two different files, like this.
docker-compose-client.yml file:
version: '3.7'
services:
  client:
    container_name: react-dev
    build:
      context: ./client
      dockerfile: Dockerfile.react-dev
    ports:
      - 3000:3000
and docker-compose-server.yml file
version: '3.7'
services:
  server:
    container_name: server-dev
    build:
      context: ./server
      dockerfile: Dockerfile.server-dev
    ports:
      - 5000:5000
Can anyone tell me what the possible issue is with running both apps from one compose file, and how I can solve it?
For your reference.
My Dockerfile.server-dev file
FROM node:14
WORKDIR /usr/src/app
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 8080
CMD [ "node", "server.js" ]
and my Dockerfile.react-dev file
FROM node:14.1-alpine as build
WORKDIR /app
COPY . /app
ENV PATH /app/node_modules/.bin:$PATH
RUN yarn config delete proxy
RUN npm config rm proxy
RUN npm config rm https-proxy
RUN npm install
RUN npm start
I don't know what the issue actually is with running two development servers in one docker-compose file.
There are at least a few problems. Your Dockerfile.react-dev is missing its ENTRYPOINT/CMD part; you should not start your server with a RUN instruction. Use ENTRYPOINT, and possibly CMD, to start it instead. Another problem is that you expose a different port in Dockerfile.server-dev (8080) than in your compose file (5000).
Solved the issue. I just added the line CMD ["npm", "start"] and removed RUN npm start; now it is working.
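Put together, the corrected Dockerfile.react-dev would presumably look like this (a sketch combining the original file with the described fix):

FROM node:14.1-alpine as build
WORKDIR /app
COPY . /app
ENV PATH /app/node_modules/.bin:$PATH
RUN yarn config delete proxy
RUN npm config rm proxy
RUN npm config rm https-proxy
# install dependencies at build time...
RUN npm install
# ...but start the dev server only when the container runs
CMD ["npm", "start"]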

Run Google Firestore Emulator with Docker-Compose

I am trying to run my Node project as well as the Firestore Emulator with docker-compose locally in a dev environment.
I have a Dockerfile for my Node project that looks like this:
# FROM was missing from the original snippet; assuming a Node base image
FROM node:alpine
WORKDIR /app
ADD package*.json ./
RUN npm install
ADD bin ./bin
CMD [ "npm", "run", "dev" ]
Then I have a separate Dockerfile called Dockerfile.firestore for containerizing the Firestore Emulator. This Dockerfile looks like this:
FROM node:alpine
RUN apk add openjdk11
RUN npm install -g firebase-tools
WORKDIR /app
CMD [ "firebase", "--project=xrechnung-app", "emulators:start", "--only", "firestore" ]
The docker-compose.yml is written in the following way:
version: "3"
services:
api:
image: api
build:
context: api
dockerfile: Dockerfile.dev
depends_on:
- db
environment:
- PORT=3000
ports:
- 3000:3000
volumes:
- ./api/src:/app/src
db:
image: firestore
build:
context: api
dockerfile: Dockerfile.firestore
ports:
- 4000:4000
- 8080:8080
volumes:
- .cache/firebase/emulators/:/app/.cache/firebase/emulators/
I'm not sure about the last two lines, but I found a hint in the Google Cloud docs that this could prevent repeated downloads of the emulator.
When spinning the containers up with docker-compose up, the Node project runs without problems and is available at localhost:3000. The emulator also spins up, and the console logs that it is running, but I can't make it available on the prescribed ports (4000 and 8080).
Has anyone tried a similar thing already? I appreciate your help.
You probably need to set the host in the firebase.json file, like this:
{
  "emulators": {
    "firestore": {
      "port": 8080,
      "host": "0.0.0.0"
    }
  }
}
By default, the emulator listens only on localhost.
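Port 4000 is normally the Emulator Suite UI, which has the same localhost default, so the same override likely applies there too. A sketch, assuming the UI is what you expect on 4000:

{
  "emulators": {
    "firestore": {
      "port": 8080,
      "host": "0.0.0.0"
    },
    "ui": {
      "enabled": true,
      "port": 4000,
      "host": "0.0.0.0"
    }
  }
}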

How to use sqlite3 with docker compose

Working from the following tutorials:
Dockerizing create-react-app
Developing microservices - Node, react & docker
I have been able to convert my Node.js app into dockerized microservices, which are up and running and connecting to services. However, my app uses SQLite/Sequelize, and this was working perfectly prior to dockerizing.
With the new setup, I get this error:
/usr/src/app/node_modules/sequelize/lib/dialects/sqlite/connection-manager.js:31
throw new Error('Please install sqlite3 package manually');
Error: Please install sqlite3 package manually
    at new ConnectionManager (/usr/src/app/node_modules/sequelize/lib/dialects/sqlite/connection-manager.js:31:15)
My questions are:
Is it possible to use sqlite3 with Docker?
If so, is anyone able to share a sample docker-compose.yml and Dockerfile combo that works for this?
My docker-compose.yml
version: '3.5'
services:
  user-service:
    container_name: user-service
    build: ./services/user/
    volumes:
      - './services/user:/usr/src/app'
      - './services/user/package.json:/usr/src/package.json'
    ports:
      - '9000:9000' # expose ports - HOST:CONTAINER
  web-service:
    container_name: web-service
    build:
      context: ./services/web
      dockerfile: Dockerfile
    volumes:
      - './services/web:/usr/src/app'
      - '/usr/src/app/node_modules'
    ports:
      - '3000:3000' # expose ports - HOST:CONTAINER
    environment:
      - NODE_ENV=development
    depends_on:
      - user-service
My user/ Dockerfile
FROM node:latest
# set working directory
RUN mkdir /usr/src/app
WORKDIR /usr/src/app
# add `/usr/src/app/node_modules/.bin` to $PATH
ENV PATH /usr/src/app/node_modules/.bin:$PATH
# install and cache app dependencies
ADD package.json /usr/src/package.json
RUN npm install
# start app
CMD ["npm", "start"]
My web/ Dockerfile
FROM node:latest
# set working directory
RUN mkdir /usr/src/app
WORKDIR /usr/src/app
# add `/usr/src/app/node_modules/.bin` to $PATH
ENV PATH /usr/src/app/node_modules/.bin:$PATH
# install and cache app dependencies
COPY package.json /usr/src/app/package.json
RUN npm install
RUN npm install react-scripts@1.1.4
RUN npm install gulp -g
# start app
CMD ["npm", "start"]
Many thanks.
Got it. The issue was that my local node_modules were being copied into the container. Hence, in the sqlite3 lib/binding, node-v57-darwin-x64 was present instead of what was expected: node-v57-linux-x64. Hence the mess.
I updated the Dockerfiles and docker-compose.yml as follows:
My docker-compose.yml
services:
  user-service:
    container_name: user-service
    build:
      context: ./services/user/
      dockerfile: Dockerfile
    volumes:
      - './services/user:/usr/src/app'
      - '/usr/src/node_modules'
    ports:
      - '9000:9000' # expose ports - HOST:CONTAINER
  web-service:
    container_name: web-service
    build:
      context: ./services/web/
      dockerfile: Dockerfile
    volumes:
      - './services/web:/usr/src/app'
      - '/usr/src/app/node_modules'
    ports:
      - '3000:3000' # expose ports - HOST:CONTAINER
    environment:
      - NODE_ENV=development
    depends_on:
      - user-service
My user/ Dockerfile
FROM node:latest
# set working directory
RUN mkdir /usr/src/app
WORKDIR /usr/src/app
# add `/usr/src/node_modules/.bin` to $PATH
ENV PATH /usr/src/node_modules/.bin:$PATH
# install and cache app dependencies
ADD package.json /usr/src/package.json
RUN npm install
# start app
CMD ["npm", "start"]
Helpful posts
Getting npm packages to be installed with docker-compose
