How to communicate Express API and React in separate Docker containers - node.js

I have a simple application that fetches data from Express and displays it in React. It works as intended without Docker, but not when the two are launched as containers. Both React and Express start and can be viewed in the browser at localhost:3000 and localhost:5000 after starting the containers.
How they are communicating
In the react-app package.json, I have
"proxy": "http://localhost:5000"
and a fetch to the express route.
React Dockerfile
FROM node:17 as build
WORKDIR /code
COPY package*.json ./
RUN npm install
COPY . .
RUN npm run build
FROM nginx:1.12-alpine
COPY --from=build /code/build /usr/share/nginx/html
EXPOSE 80
CMD ["nginx", "-g", "daemon off;"]
Express Dockerfile
FROM node:17
WORKDIR /usr/src/app
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 5000
CMD ["npm", "start"]
docker-compose.yml
version: "3"
services:
  react-app:
    image: react
    stdin_open: true
    ports:
      - "3000:80"
    networks:
      - react-express
  api-server:
    image: express
    ports:
      - "5000:5000"
    networks:
      - react-express
networks:
  react-express:
    driver: bridge

From your example I figure you are using react-scripts?
If so, the proxy setting only works in development, with npm start. From the docs:
"Keep in mind that proxy only has effect in development (with npm start), and it is up to you to ensure that URLs like /api/todos point to the right thing in production."
See: https://create-react-app.dev/docs/proxying-api-requests-in-development/
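In production, one common option (not part of the original answer, so treat it as a sketch) is to let the nginx container that serves the build forward API calls to the api-server service from the compose file above:

```nginx
# default.conf — hypothetical site config, copied into the image with
# COPY nginx.conf /etc/nginx/conf.d/default.conf in the React Dockerfile
server {
    listen 80;
    root /usr/share/nginx/html;
    index index.html;

    # forward API calls to the api-server service on the compose network
    location /api/ {
        proxy_pass http://api-server:5000;
    }

    # serve the React build for everything else
    location / {
        try_files $uri /index.html;
    }
}
```

With this in place the React code can use relative URLs like /api/todos in both development and production.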

Using a proxy in package.json does not work here, so instead you can put this in your React app (the same Dockerfile and docker-compose setup is used):
import axios from 'axios'

const api = axios.create({
  baseURL: "http://localhost:5000"
})
and make requests to Express like this:
api.post("/logs", { data: value })
  .then(res => {
    console.log(res)
  })
This may raise a CORS error, so in your Express API, in the same file where you create the app and have it listening on the port, add:
import cors from 'cors'

const app = express();
app.use(cors({
  origin: 'http://localhost:3000'
}))
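Hard-coding http://localhost:5000 only works while the API port is published on the host. A more flexible sketch (the variable name REACT_APP_API_URL is my assumption, not from the answer): read the base URL from a build-time environment variable:

```javascript
// Sketch: pick the API base URL from a build-time environment variable,
// falling back to the hard-coded dev address. REACT_APP_API_URL is a
// hypothetical name; Create React App only inlines env variables whose
// names start with REACT_APP_.
function apiBaseUrl(env) {
  return env.REACT_APP_API_URL || "http://localhost:5000";
}

// usage with the axios instance from the answer above:
// const api = axios.create({ baseURL: apiBaseUrl(process.env) });
```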

Related

how to separate backend and frontend in docker nuxt

I would like to separate my Nuxt app and dockerize the backend and frontend into different folders:
frontend - one container (Nuxt.js)
backend - second container (Express.js)
Project folder structure:
my_nuxt_app
|-backend
|-frontend
Locally, this construction works:
serverMiddleware: [
  { path: '/api', handler: '../backend' }
],
but I don't understand how to set this up in Docker. I think I need to point serverMiddleware at the backend container, but I don't understand how. Please help if you know.
docker-compose.yaml
version: '3'
services:
  frontend:
    container_name: frontend
    build:
      context: ./frontend
    ports:
      - 8080:8080
  backend:
    container_name: backend
    build:
      context: ./backend
    ports:
      - 3000:3000
backend Dockerfile
FROM node:16.16.0-alpine
RUN npm i --location=global --force pm2
RUN npm i --location=global --force yarn
WORKDIR /backend
COPY . .
CMD ["pm2-runtime", "backend.js","--json","--no-auto-exit","--only","backend"]
frontend Dockerfile
FROM node:16.16.0-alpine
RUN npm i --location=global --force yarn
WORKDIR /mmc
COPY . .
CMD ["yarn","dev"]
Don't use the serverMiddleware property.
Just change these props in nuxt.config.js:
server: {
  host: '0.0.0.0',
  port: 8080
},
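If the frontend still needs to reach the API over HTTP, one option (an assumption on my part, since the answer does not show it) is Nuxt's proxy support with the compose service name instead of a relative serverMiddleware path:

```javascript
// nuxt.config.js — sketch, assuming @nuxtjs/axios and @nuxtjs/proxy are installed
export default {
  server: {
    host: '0.0.0.0', // listen on all interfaces so Docker's port mapping works
    port: 8080,
  },
  modules: ['@nuxtjs/axios', '@nuxtjs/proxy'],
  axios: { proxy: true },
  proxy: {
    // "backend" is the compose service name, resolved by Docker's internal DNS
    '/api': 'http://backend:3000',
  },
}
```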

React app URLs are undefined when running in Nginx in Docker

I am deploying my React app (after building it) to an Nginx server in Docker.
The React app connects to a Node.js server running on: localhost:3000
The React app runs on localhost:3005
When the React app is deployed to Nginx + Docker, the API URLs referring to the Node.js server show up as undefined:
POST http://undefined/api/auth/login net::ERR_NAME_NOT_RESOLVED
It should be: http://localhost:3000/api/auth/login
This issue does not seem to come from React but from Nginx or Docker.
The React app works perfectly fine using serve /build -p 3005, basically testing it without Nginx + Docker on a basic local server.
Also, I am not using any environment variables; all URLs are hard-coded.
I have not added any configuration for Nginx; I am using the default Docker image and just copying my React app into it.
Here is the relevant part of my Docker configuration.
Dockerfile.dev (React app)
FROM nginx:1.23
WORKDIR /react
COPY ./build/ /usr/share/nginx/html
EXPOSE 3005
Dockerfile.dev (Node.js server)
FROM node:16.15-alpine3.15
WORKDIR /usr/src/server
COPY ./package.json .
RUN npm install
COPY . .
ENV NODE_ENV=development
EXPOSE 3000
CMD ["npm", "run", "app"]
Docker Compose:
version: "3"
services:
  client:
    build:
      context: ./react
      dockerfile: Dockerfile.dev
    ports:
      - "3005:80"
    volumes:
      - /react/node_modules
      - ./react:/react
    deploy:
      restart_policy:
        condition: always
  node-server:
    network_mode: "host"
    build:
      context: ./server
      dockerfile: Dockerfile.dev
    ports:
      - "3000:3000"
    deploy:
      restart_policy:
        condition: always

Dockerizing React-Express- MongoDB Atlas App with Docker-Compose and Proxy

I have a react-express app that connects to MongoDB Atlas and is deployed to Google App Engine. Currently, the client folder is stored inside the backend folder. I manually build the React client app with npm run build, then use gcloud app deploy to push the entire backend app and the built React files to GAE, where it is connected to a custom domain. I am looking to dockerize the app, but am running into problems with getting my backend up and running. Before Docker, I was using express static middleware to serve the React files, and had a setupProxy.js file in the client directory that I used to redirect api requests.
My express app middleware:
if (process.env.NODE_ENV === 'production') {
  app.use(express.static(path.join(__dirname, '/../../client/build')));
}
if (process.env.NODE_ENV === 'production') {
  app.get('/*', (req, res) => {
    res.sendFile(path.join(__dirname, '/../../client/build/index.html'));
  });
}
My client Dockerfile:
FROM node:14-slim
WORKDIR /usr/src/app
COPY ./package.json ./
RUN npm install
COPY . .
RUN npm run build
EXPOSE 3000
CMD [ "npm", "start"]
My backend Dockerfile:
FROM node:14-slim
WORKDIR /usr/src/app
COPY ./package.json ./
RUN npm install
COPY . .
EXPOSE 5000
CMD [ "npm", "start"]
docker-compose.yml
version: '3.4'
services:
  backend:
    image: backend
    build:
      context: backend
      dockerfile: ./Dockerfile
    environment:
      NODE_ENV: production
    ports:
      - 5000:5000
    networks:
      - mern-app
  client:
    image: client
    build:
      context: client
      dockerfile: ./Dockerfile
    stdin_open: true
    environment:
      NODE_ENV: production
    ports:
      - 3000:3000
    networks:
      - mern-app
    depends_on:
      - backend
networks:
  mern-app:
    driver: bridge
I'm hoping that someone can provide some insight/assistance regarding how to effectively connect my React and Express apps inside a container using Docker Compose so that I can make calls to my API server. Thanks in advance!
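One approach that follows from the other answers on this page (a sketch, not a verified fix): keep the API calls relative in React and point the dev-server proxy at the compose service name. The proxy runs inside the client container's Node process, so the service name resolves there (http-proxy-middleware is assumed to be installed, as CRA's setupProxy.js requires it):

```javascript
// client/src/setupProxy.js — sketch; "backend" is the service name from
// the docker-compose.yml above, resolved by Docker's internal DNS
const { createProxyMiddleware } = require('http-proxy-middleware');

module.exports = function (app) {
  app.use(
    '/api',
    createProxyMiddleware({
      target: 'http://backend:5000', // not localhost: that would be the client container itself
      changeOrigin: true,
    })
  );
};
```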

Error occurred while trying to proxy request while running on Docker

I am trying to deploy my React + Spring Boot app to Docker. However, the API from the backend does not seem to be connected to my React app, although I have already checked port 8080 of the Spring Boot server and checked proxy.js in the React app. It keeps producing an "Error occurred while trying to proxy request" error. Please help me with this!
Here's the proxy.js
export default {
  dev: {
    '/api/': {
      target: 'http://localhost:8080/',
      changeOrigin: true,
      pathRewrite: {
        '^': '',
      },
    },
  }
}
This is the dockerfile of the React App
FROM node:12
# Create app directory
WORKDIR /usr/src/app
# Install app dependencies
# A wildcard is used to ensure both package.json AND package-lock.json are copied
# where available (npm#5+)
COPY package*.json ./
RUN npm install
# If you are building your code for production
# RUN npm ci --only=production
# Bundle app source
COPY . .
EXPOSE 8000
ENTRYPOINT npm run dev
The Backend Dockerfile
FROM openjdk:8-jdk-alpine
EXPOSE 8080
RUN addgroup -S spring && adduser -S spring -G spring
USER spring:spring
ARG JAR_FILE=target/*.jar
COPY ${JAR_FILE} app.jar
ENTRYPOINT ["java","-jar","/app.jar"]
And the docker-compose.yml file
version: "3"
services:
  server:
    build:
      context: ./service
      dockerfile: ./Dockerfile
    ports:
      - "8080:8080"
    image: academy-server
  client:
    build:
      context: ./web
      dockerfile: ./Dockerfile
    ports:
      - "8000:8000"
    image: academy-client
    links:
      - "server"
Running in Docker is the same as running your front end and backend on two different machines. As such, you cannot use localhost to talk to your backend; instead, you need to use the service names as defined in your docker-compose.yml. So in your case you should use 'server' instead of localhost.
Docker Compose automatically creates an internal network, attaches both of your containers to that network, and uses the service names for routing between the containers.
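Applied to the proxy.js above, that means replacing the localhost target with the service name (a sketch of the change, keeping the rest of the file as posted):

```javascript
// proxy.js — same structure as before, but targeting the "server" service
export default {
  dev: {
    '/api/': {
      // "server" is the service name from docker-compose.yml; Docker's
      // internal DNS resolves it to the backend container
      target: 'http://server:8080/',
      changeOrigin: true,
      pathRewrite: {
        '^': '',
      },
    },
  }
}
```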

Using Node to proxy external traffic from a container through another service and mock the response

I have a Node application that I want to write integration tests for. For that to work, I need to be able to mock both HTTP requests and WebSocket connections.
I use docker-compose to define my app dependencies. The relevant part of my docker-compose.yml is
version: "3.2"
services:
  app:
    build: .
    command: npm run dev
    depends_on:
      - proxycontainer
    environment:
      HTTP_PROXY: proxycontainer:8080
      NO_PROXY: localhost,127.0.0.1
  proxycontainer:
    build: ./proxy
I have a simple Dockerfile for the node app
FROM node:8.12.0-alpine as base
WORKDIR /usr/app
COPY package*.json ./
RUN apk add --no-cache --virtual .gyp \
    python \
    make \
    g++ \
    && npm install \
    && apk del .gyp
FROM base as build
COPY . .
RUN npm run build
The proxy Dockerfile looks like
FROM node:8.12.0-alpine
WORKDIR /usr/app
COPY package*.json ./
RUN npm install
COPY . .
CMD [ "npm", "run", "proxy" ]
Where npm run proxy is running node ./index.js on this simple file
const express = require('express')
const proxy = require('http-proxy-middleware')
const app = express()
app.use('/', proxy({ target: 'http://www.example.org', changeOrigin: true, logLevel: 'debug' }))
app.listen(8080)
To test just the proxy, I have replaced my app with
const axios = require("axios")

const testProxy = async () => {
  const data = await axios.get("http://example.org/");
  console.log(data.data)
}

testProxy()
When running this example I get the error Error: connect EINVAL 0.0.31.144:80 - Local (0.0.0.0:0)
So how do I proxy external requests from one Node Docker service through a Node proxy service, which can then mock the response for HTTP and WebSocket connections?
If I remove the HTTP_PROXY env variable, everything works as expected.
Did you try using an http:// prefix before your proxy container's name?
HTTP_PROXY: http://proxycontainer:8080
You might also create a custom network and assign a local IP address to each container, so you could access them using a static IP.
version: "3.2"
services:
  app:
    build: .
    command: npm run dev
    depends_on:
      - proxycontainer
    environment:
      HTTP_PROXY: http://172.28.1.2:8080/
      NO_PROXY: localhost,127.0.0.1
    networks:
      proxy_net:
        ipv4_address: 172.28.1.1
  proxycontainer:
    build: ./proxy
    networks:
      proxy_net:
        ipv4_address: 172.28.1.2
networks:
  proxy_net:
    ipam:
      driver: default
      config:
        - subnet: 172.28.0.0/16
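Why the http:// prefix matters can be seen with Node's built-in WHATWG URL parser (a sketch of the parsing behaviour, not of the proxy library's exact internals):

```javascript
// With a scheme, host and port parse as expected
const withScheme = new URL("http://proxycontainer:8080");
console.log(withScheme.hostname, withScheme.port); // proxycontainer 8080

// Without a scheme, "proxycontainer:" is read as the protocol and there is
// no usable host — which is how a malformed proxy address can surface as a
// bogus connect error such as the EINVAL above
const withoutScheme = new URL("proxycontainer:8080");
console.log(withoutScheme.protocol); // proxycontainer:
console.log(withoutScheme.hostname); // "" (empty)
```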
