Dockerizing React-Express-MongoDB Atlas App with Docker-Compose and Proxy - node.js

I have a React-Express app that connects to MongoDB Atlas and is deployed to Google App Engine. Currently, the client folder is stored inside the backend folder. I manually build the React client app with npm run build, then use gcloud app deploy to push the entire backend app and the built React files to GAE, where it is connected to a custom domain. I am looking to dockerize the app, but I am running into problems getting my backend up and running. Before Docker, I was using the express.static middleware to serve the React files, and had a setupProxy.js file in the client directory that I used to redirect API requests.
My Express app middleware:
if (process.env.NODE_ENV === 'production') {
  app.use(express.static(path.join(__dirname, '/../../client/build')));
}

if (process.env.NODE_ENV === 'production') {
  app.get('/*', (req, res) => {
    res.sendFile(path.join(__dirname, '/../../client/build/index.html'));
  });
}
My client Dockerfile:
FROM node:14-slim
WORKDIR /usr/src/app
COPY ./package.json ./
RUN npm install
COPY . .
RUN npm run build
EXPOSE 3000
CMD [ "npm", "start"]
My backend Dockerfile:
FROM node:14-slim
WORKDIR /usr/src/app
COPY ./package.json ./
RUN npm install
COPY . .
EXPOSE 5000
CMD [ "npm", "start"]
docker-compose.yml
version: '3.4'
services:
  backend:
    image: backend
    build:
      context: backend
      dockerfile: ./Dockerfile
    environment:
      NODE_ENV: production
    ports:
      - 5000:5000
    networks:
      - mern-app
  client:
    image: client
    build:
      context: client
      dockerfile: ./Dockerfile
    stdin_open: true
    environment:
      NODE_ENV: production
    ports:
      - 3000:3000
    networks:
      - mern-app
    depends_on:
      - backend
networks:
  mern-app:
    driver: bridge
I'm hoping that someone can provide some insight/assistance on how to effectively connect my React and Express apps in their containers using Docker Compose so that I can make calls to my API server. Thanks in advance!
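For reference, one way the dev-server proxy could be pointed at the backend container is sketched below, assuming the Compose service name backend from the file above (createProxyMiddleware is the export in http-proxy-middleware v1+; older versions export the proxy function directly):
// client/src/setupProxy.js - sketch only; 'backend' is the Compose service name assumed above.
const { createProxyMiddleware } = require('http-proxy-middleware');

module.exports = function (app) {
  // Inside the Compose network, containers reach each other by service name,
  // so /api requests are forwarded to the backend container rather than localhost.
  app.use('/api', createProxyMiddleware({ target: 'http://backend:5000', changeOrigin: true }));
};
Keep in mind that this dev-server proxy only runs under npm start; in a production image the Express static/sendFile setup shown earlier (or a reverse proxy in front of both services) still has to do the routing.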

Related

How to separate backend and frontend in Docker (Nuxt)

I would like to separate my Nuxt app and dockerize the backend and frontend in different folders:
frontend - one container (Nuxt.js)
backend - second container (Express.js)
Project folder structure:
my_nuxt_app
|- backend
|- frontend
docker-compose.yaml
When I run it locally, this configuration works:
serverMiddleware: [
  { path: '/api', handler: '../backend' }
],
But I don't understand how to set this up in Docker. I think I need to point the serverMiddleware setting at the backend container, but I don't know how. Please help if you know.
version: '3'
services:
  frontend:
    container_name: frontend
    build:
      context: ./frontend
    ports:
      - 8080:8080
  backend:
    container_name: backend
    build:
      context: ./backend
    ports:
      - 3000:3000
backend Dockerfile
FROM node:16.16.0-alpine
RUN npm i --location=global --force pm2
RUN npm i --location=global --force yarn
WORKDIR /backend
COPY . .
CMD ["pm2-runtime", "backend.js","--json","--no-auto-exit","--only","backend"]
frontend Dockerfile
FROM node:16.16.0-alpine
RUN npm i --location=global --force yarn
WORKDIR /mmc
COPY . .
CMD ["yarn","dev"]
Don't use the serverMiddleware property.
Just change these props in nuxt.config.js:
server: {
  host: '0.0.0.0',
  port: 8080
},
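Once both containers share the Compose network, the Express backend can be reached by its service name instead of localhost. A minimal sketch of a server-side request, assuming the service name backend (port 3000) from the compose file above and an axios dependency that the question does not show:
// Sketch only: server-side code running inside the frontend container.
// 'backend' resolves through Compose's internal DNS; the browser cannot resolve it
// and must use the host-published port instead (e.g. http://localhost:3000).
const axios = require('axios');

async function fetchFromApi() {
  // '/api/health' is a hypothetical route, used purely for illustration.
  const res = await axios.get('http://backend:3000/api/health');
  return res.data;
}

module.exports = { fetchFromApi };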

React app URLs are undefined when running in Nginx in Docker

I am deploying my React app (after building it) to an nginx server in Docker.
The React app connects to a Node.js server running on localhost:3000.
The React app itself runs on localhost:3005.
When the React app is deployed to nginx+Docker, the API URLs referring to the Node.js server show up as undefined:
POST http://undefined/api/auth/login net::ERR_NAME_NOT_RESOLVED
It should be: http://localhost:3000/api/auth/login
This issue does not seem to come from React but from nginx or Docker.
The React app works perfectly fine using serve /build -p 3005, basically testing it without nginx+Docker on a basic local server.
Also, I am not using any environment variables; all URLs are hard-coded.
I have not added any configuration for nginx; I am using the default Docker image and just copying my React app into it.
Here is the relevant part of my Docker configuration.
Dockerfile.dev (react app)
FROM nginx:1.23
WORKDIR /react
COPY ./build/ /usr/share/nginx/html
EXPOSE 3005
Dockerfile.dev (nodejs server)
FROM node:16.15-alpine3.15
WORKDIR /usr/src/server
COPY ./package.json .
RUN npm install
COPY . .
ENV NODE_ENV=development
EXPOSE 3000
CMD ["npm", "run", "app"]
Docker compose:
version: "3"
services:
client:
build:
context: ./react
dockerfile: Dockerfile.dev
ports:
- "3005:80"
volumes:
- /react/node_modules
- ./react:/react
deploy:
restart_policy:
condition: always
node-server:
network_mode: "host"
build:
context: ./server
dockerfile: Dockerfile.dev
ports:
- "3000:3000"
deploy:
restart_policy:
condition: always
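One hedged sketch of how this is often handled with create-react-app: read the API origin from a build-time environment variable with an explicit fallback, because CRA inlines REACT_APP_* values when npm run build runs. The variable name REACT_APP_API_URL below is an assumption, not something from the question:
// src/apiConfig.js - sketch only; REACT_APP_API_URL is a hypothetical variable name.
// create-react-app bakes REACT_APP_* variables in at build time, so the value must be
// set in the environment (or passed as a Docker build ARG) when `npm run build` executes.
const API_URL = process.env.REACT_APP_API_URL || 'http://localhost:3000';

export default API_URL;

// Usage elsewhere in the app:
//   fetch(`${API_URL}/api/auth/login`, { method: 'POST' })
// Interpolating an unset value into the URL string is one common way to end up
// with requests like http://undefined/api/auth/login.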

How to communicate Express API and React in separate Docker containers

I have a simple application that grabs data from Express and displays it in React. It works as intended without Docker, but not when launching them as containers. Both React and Express launch and can be viewed in the browser at localhost:3000 and localhost:5000 after running Docker.
How they are communicating
In the react-app package.json, I have
"proxy": "http://localhost:5000"
and a fetch to the express route.
React Dockerfile
FROM node:17 as build
WORKDIR /code
COPY package*.json ./
RUN npm install
COPY . .
RUN npm run build
FROM nginx:1.12-alpine
COPY --from=build /code/build /usr/share/nginx/html
EXPOSE 80
CMD ["nginx", "-g", "daemon off;"]
Express Dockerfile
FROM node:17
WORKDIR /usr/src/app
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 5000
CMD ["npm", "start"]
docker-compose.yml
version: "3"
services:
react-app:
image: react
stdin_open: true
ports:
- "3000:80"
networks:
- react-express
api-server:
image: express
ports:
- "5000:5000"
networks:
- react-express
networks:
react-express:
driver: bridge
From your example I figured out that you are using react-scripts. If so, the proxy parameter works only for development with npm start:
Keep in mind that proxy only has effect in development (with npm start), and it is up to you to ensure that URLs like /api/todos point to the right thing in production.
See: https://create-react-app.dev/docs/proxying-api-requests-in-development/
Using a proxy in package.json does not work here, so instead you can put this in your React app (the same Dockerfile and docker-compose setup is used):
const api = axios.create({
  baseURL: "http://localhost:5000"
})
and make requests to Express like this:
api.post("/logs", { data: value })
  .then(res => {
    console.log(res)
  })
This may raise a CORS error, so you can put this in your Express API, in the same file where you set the port and call listen:
import cors from 'cors'

const app = express();
app.use(cors({
  origin: 'http://localhost:3000'
}))
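If you want the hard-coded baseURL above to vary between environments, one hedged option (not part of the original answer) is to read it from a build-time variable and fall back to the published host port:
// Sketch only: REACT_APP_API_URL is a hypothetical variable name, inlined by
// create-react-app at build time. The fallback http://localhost:5000 works in the
// browser because docker-compose publishes the api-server's port 5000 to the host.
import axios from 'axios'

const api = axios.create({
  baseURL: process.env.REACT_APP_API_URL || "http://localhost:5000"
})

export default api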

Node.js and React inside Docker containers

I am trying to run a Node/React project inside Docker containers. I have a Node.js server for the APIs and the client app. I also have concurrently installed, and everything works fine when running npm run dev.
The issue is that when I run the server and app via a docker-compose.yml file, I get the following error from the client:
client | [HPM] Error occurred while trying to proxy request /api/current_user from localhost:3000 to http://localhost:5000 (ECONNREFUSED) (https://nodejs.org/api/errors.html#errors_common_system_errors)
Here is the docker-compose.yml
version: "3"
services:
frontend:
container_name: client
build:
context: ./client
dockerfile: Dockerfile
image: client
ports:
- "3000:3000"
volumes:
- ./client:/usr/src/app
networks:
- local
backend:
container_name: server
build:
context: ./
dockerfile: Dockerfile
image: server
ports:
- "5000:5000"
depends_on:
- frontend
volumes:
- ./:/usr/src/app
networks:
- local
networks:
local:
driver: bridge
Server Dockerfile
FROM node:lts-slim
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
EXPOSE 5000
# You can change this
CMD [ "npm", "run", "dev" ]
Client Dockerfile
FROM node:lts-slim
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
EXPOSE 3000
CMD [ "npm", "start" ]
I am using "http-proxy-middleware": "^0.21.0" so my setupProxy.js is
const proxy = require('http-proxy-middleware');

module.exports = function(app) {
  app.use(proxy('/auth/google', { target: 'http://localhost:5000' }));
  app.use(proxy('/api/**', { target: 'http://localhost:5000' }));
};
You should use the container_name instead of localhost (here the backend's container_name is server):
app.use(proxy('/auth/google', { target: 'http://server:5000' }));
app.use(proxy('/api/**', { target: 'http://server:5000' }));
You can also check these details by inspecting your network with the following command:
docker inspect <network_name>
It will show all the containers connected to the network, along with the host names created for them.
NOTE: Host names are based on container_name when it is set, otherwise on the service name.

Using Node to proxy external traffic from a container through another service and mock the response

I have a Node application which I want to write integration tests for. For that to work I need to be able to mock both HTTP requests and WebSocket connections.
I use docker-compose to define my app's dependencies. The relevant part of my docker-compose.yml is:
version: "3.2"
services:
app:
build: .
command: npm run dev
depends_on:
- proxycontainer
environment:
HTTP_PROXY: proxycontainer:8080
NO_PROXY: localhost,127.0.0.1
proxycontainer:
build: ./proxy
I have a simple Dockerfile for the node app
FROM node:8.12.0-alpine as base
WORKDIR /usr/app
COPY package*.json ./
RUN apk add --no-cache --virtual .gyp \
python \
make \
g++ \
&& npm install \
&& apk del .gyp
FROM base as build
COPY . .
RUN npm run build
The proxy Dockerfile looks like
FROM node:8.12.0-alpine
WORKDIR /usr/app
COPY package*.json ./
RUN npm install
COPY . .
CMD [ "npm", "run", "proxy" ]
Where npm run proxy runs node ./index.js on this simple file:
const express = require('express')
const proxy = require('http-proxy-middleware')
const app = express()
app.use('/', proxy({ target: 'http://www.example.org', changeOrigin: true, logLevel: 'debug' }))
app.listen(8080)
To test just the proxy, I have replaced my app with:
const axios = require('axios');

const testProxy = async () => {
  const data = await axios.get("http://example.org/");
  console.log(data.data);
};

testProxy();
When running this example I get the error: Error: connect EINVAL 0.0.31.144:80 - Local (0.0.0.0:0)
So how do I proxy external requests from one Node Docker service into a Node proxy service, which can then mock the response for HTTP and WebSocket connections?
If I remove the HTTP_PROXY env variable, everything works as expected.
Did you try using an http:// prefix before your proxy container's name?
HTTP_PROXY: http://proxycontainer:8080
You might also create a custom network and assign a local IP address to each container, so you could access them using a static IP.
version: "3.2"
services:
app:
build: .
command: npm run dev
depends_on:
- proxycontainer
environment:
HTTP_PROXY: http://172.28.1.2:8080/
NO_PROXY: localhost,127.0.0.1
networks:
proxy_net:
ipv4_address: 172.28.1.1
proxycontainer:
build: ./proxy
networks:
proxy_net:
ipv4_address: 172.28.1.2
networks:
proxy_net:
ipam:
driver: default
config:
- subnet: 172.28.0.0/16
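If the HTTP_PROXY variable is still not picked up by the HTTP client, an explicit per-request proxy setting is another option. A minimal sketch, assuming the Compose service name proxycontainer from the file above and axios's proxy request option:
// Sketch only: route a single request through the proxy container explicitly,
// instead of relying on the HTTP_PROXY environment variable.
// 'proxycontainer' is the Compose service name defined above.
const axios = require('axios');

const testProxyExplicit = async () => {
  const data = await axios.get('http://example.org/', {
    proxy: { host: 'proxycontainer', port: 8080 },
  });
  console.log(data.data);
};

testProxyExplicit();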
