Error occurred while trying to proxy request while running on Docker - node.js

I am trying to deploy my React + Spring Boot app to Docker. However, the backend API does not seem to be reachable from my React app, even though I have already checked that the Spring Boot server is listening on port 8080 and checked proxy.js in the React app. It keeps throwing an "Error occurred while trying to proxy request" error. Please help me figure this out!
Here's the proxy.js
export default {
  dev: {
    '/api/': {
      target: 'http://localhost:8080/',
      changeOrigin: true,
      pathRewrite: {
        '^': '',
      },
    },
  }
}
This is the Dockerfile of the React app:
FROM node:12
# Create app directory
WORKDIR /usr/src/app
# Install app dependencies
# A wildcard is used to ensure both package.json AND package-lock.json are copied
# where available (npm#5+)
COPY package*.json ./
RUN npm install
# If you are building your code for production
# RUN npm ci --only=production
# Bundle app source
COPY . .
EXPOSE 8000
ENTRYPOINT npm run dev
The Backend Dockerfile
FROM openjdk:8-jdk-alpine
EXPOSE 8080
RUN addgroup -S spring && adduser -S spring -G spring
USER spring:spring
ARG JAR_FILE=target/*.jar
COPY ${JAR_FILE} app.jar
ENTRYPOINT ["java","-jar","/app.jar"]
And the docker-compose.yml file
version: "3"
services:
server:
build:
context: ./service
dockerfile: ./Dockerfile
ports:
- "8080:8080"
image: academy-server
client:
build:
context: ./web
dockerfile: ./Dockerfile
ports:
- "8000:8000"
image: academy-client
links:
- "server"

Running in Docker is the same as running your frontend and backend on two different machines. As such, you cannot use localhost to talk to your backend. Instead, you need to use the service names as defined in your docker-compose file. So in your case you should use 'server' instead of localhost.
docker-compose automatically creates an internal network, attaches both of your containers to that network, and uses the service names for routing between the containers.
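A minimal sketch of the adjusted proxy.js, assuming the compose file above where the backend service is named server:
export default {
  dev: {
    '/api/': {
      // 'server' is the docker-compose service name, resolvable on the compose network
      target: 'http://server:8080/',
      changeOrigin: true,
      pathRewrite: {
        '^': '',
      },
    },
  }
}
Note that this only affects the dev-server proxy running inside the client container; the browser still reaches the frontend itself through the published port 8000.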

Related

Not able to connect to vue app in docker container (Vue+Flask+Docker)

I am trying to set up a skeleton project for a web app. Since I have no experience using Docker, I followed this tutorial for a Flask+Vue+Docker setup:
https://www.section.io/engineering-education/how-to-build-a-vue-app-with-flask-sqlite-backend-using-docker/
The backend and frontend run correctly on their own; now I wanted to dockerize the parts as described, with docker-compose and separate containers for backend and frontend. Now when I try to connect to localhost:8080 I get this:
"This page isn't working, localhost did not send any data"
This is my frontend dockerfile:
#Base image
FROM node:lts-alpine
#Install serve package
RUN npm i -g serve
# Set the working directory
WORKDIR /app
# Copy the package.json and package-lock.json
COPY package*.json ./
# install project dependencies
RUN npm install
# Copy the project files
COPY . .
# Build the project
RUN npm run build
# Expose a port
EXPOSE 5000
# Executables
CMD [ "serve", "-s", "dist"]
and this is the docker-compose.yml
version: '3.8'
services:
  backend:
    build: ./backend
    ports:
      - 5000:5000
  frontend:
    build: ./frontend
    ports:
      - 8080:5000
In the Docker Desktop GUI for the frontend container I get the log message "Accepting connections at http://localhost:3000", but when I open it in the browser it connects me to port 8080.
During my research I found that many people say I have to make the app serve on 0.0.0.0 for it to work from a Docker container, but I don't know how to configure that. I tried adding
devServer: {
  public: '0.0.0.0:8080'
}
to my vue.config.js, which did not change anything. Others suggested changing the docker run command to incorporate the host change, but I don't use that; I use docker-compose up to start the app.
Sorry for my big confusion, I hope someone can help me out here. I really hope it's something simple I am overlooking.
Thanks to everyone trying to help in advance!
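One thing worth noting from the log above: the serve package listens on port 3000 by default, while the Dockerfile exposes 5000 and the compose file maps 8080:5000, so nothing is listening on the mapped container port. A minimal sketch of a CMD that pins serve to the exposed port, assuming serve's standard -l/--listen flag:
# -l 5000 makes serve listen on the port that EXPOSE and the 8080:5000 mapping expect
CMD [ "serve", "-s", "dist", "-l", "5000" ]
Alternatively, the compose mapping could be changed to 8080:3000.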

how to separate backend and frontend in docker nuxt

I would like to separate my Nuxt app and dockerize the backend and frontend paths in different folders.
frontend - one container (Nuxt.js)
backend - second container (Express.js)
project structure folders
my_nuxt_app
|-backend
|-frontend
docker-compose.yaml
When I run it locally, this setup works:
serverMiddleware: [
  { path: '/api', handler: '../backend' }
],
But I don't understand how to set this up with Docker. I need to link to the backend container in the serverMiddleware settings, but I don't understand how. Please help me if you know.
version: '3'
services:
  frontend:
    container_name: frontend
    build:
      context: ./frontend
    ports:
      - 8080:8080
  backend:
    container_name: backend
    build:
      context: ./backend
    ports:
      - 3000:3000
backend Dockerfile
FROM node:16.16.0-alpine
RUN npm i --location=global --force pm2
RUN npm i --location=global --force yarn
WORKDIR /backend
COPY . .
CMD ["pm2-runtime", "backend.js","--json","--no-auto-exit","--only","backend"]
frontend Dockerfile
FROM node:16.16.0-alpine
RUN npm i --location=global --force yarn
WORKDIR /mmc
COPY . .
CMD ["yarn","dev"]
Don't use the serverMiddleware property.
Just change these props in nuxt.config.js:
server: {
  host: '0.0.0.0',
  port: 8080
},
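To call the Express backend from the Nuxt container, the compose service name can be used as the hostname for server-side requests (the browser itself still needs the published host port, e.g. http://localhost:3000). A minimal sketch, assuming axios and a hypothetical /api/items endpoint on the backend:
// runs inside the frontend container; 'backend' resolves through the compose network
const axios = require('axios')

async function fetchFromBackend() {
  const res = await axios.get('http://backend:3000/api/items') // hypothetical endpoint
  return res.data
}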

React app URLs are undefined when running in Nginx in Docker

I am deploying my React app (after building it) to an Nginx server in Docker.
This React app connects to a Node.js server running on: localhost:3000
The React app is running on localhost:3005
When the React app is deployed to Nginx+Docker, the API URLs referring to the Node.js server show up as undefined.
POST http://undefined/api/auth/login net::ERR_NAME_NOT_RESOLVED
It should be: http://localhost:3000/api/auth/login
This issue does not seem to come from React but from Nginx or Docker.
The React app works perfectly fine when using: serve /build -p 3005, basically testing it without Nginx+Docker on a basic local server.
Also, I am not using any environment variables; all URLs are hard-coded.
I have not added any configuration for Nginx; I am using the default Docker image and just copying my React app into it.
Here is the relevant part of my Docker configuration.
Dockerfile.dev (React app)
FROM nginx:1.23
WORKDIR /react
COPY ./build/ /usr/share/nginx/html
EXPOSE 3005
Dockerfile.dev (Node.js server)
FROM node:16.15-alpine3.15
WORKDIR /usr/src/server
COPY ./package.json .
RUN npm install
COPY . .
ENV NODE_ENV=development
EXPOSE 3000
CMD ["npm", "run", "app"]
Docker compose:
version: "3"
services:
client:
build:
context: ./react
dockerfile: Dockerfile.dev
ports:
- "3005:80"
volumes:
- /react/node_modules
- ./react:/react
deploy:
restart_policy:
condition: always
node-server:
network_mode: "host"
build:
context: ./server
dockerfile: Dockerfile.dev
ports:
- "3000:3000"
deploy:
restart_policy:
condition: always
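Since a truly hard-coded URL cannot produce the string undefined, it is worth checking whether the API base URL is in fact read from a variable that is unset at build time; Create React App inlines process.env.REACT_APP_* values when npm run build runs. A minimal sketch of that pattern, where REACT_APP_API_URL is an assumed variable name:
// src/config.js (hypothetical module) - the value is baked into the bundle at build time,
// so REACT_APP_API_URL must be set in the environment of `npm run build`
const API_URL = process.env.REACT_APP_API_URL || 'http://localhost:3000';
export default API_URL;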

Unable to deploy node image (NestJS) on AWS Elastic Beanstalk

I am completely new to AWS as well as containerization technology. What I am trying to achieve is deploying a Node image to AWS.
As I am working with NestJS, here is my main.ts bootstrap method:
async function bootstrap() {
  const app = await NestFactory.create(AppModule);
  const port = 5000;
  app.setGlobalPrefix('api/v1');
  await app.listen(port);
  Logger.log(`Server is running on port ${port}`, "Bootstrap");
}
bootstrap();
I am also using Travis CI to ship my container to AWS
My Dockerfile
# Download base image
FROM node:alpine as builder
# Define Base Directory
WORKDIR /usr/app/Api
# Copy and restore packages
COPY ./package*.json ./
RUN npm install
# Copy all other directories
COPY ./ ./
# Setup base command
CMD ["npm", "run", "start"]
My .travis.yml file, which is the Travis CI config:
sudo: required
services:
  - docker
branches:
  only:
    - master
before_install:
  - docker build -t xx/api .
script:
  - docker run xx/api npm run test
deploy:
  provider: elasticbeanstalk
  region: "us-east-2"
  app: "api"
  env: "api-env"
  bucket_name: "name"
  bucket_path: "api"
  on:
    branch: master
  access_key_id: "$AWS_ACCESS_KEY"
  secret_access_key: "$AWS_SECRET_KEY"
Every time code is pushed from Travis CI, my Elastic Beanstalk environment starts building and then fails.
So I started googling to solve the issue. What I found is that I need to expose a port using NGINX, i.e. expose port 80:
FROM nginx
EXPOSE 80
COPY --from=builder /app/build /usr/share/nginx/html
My question is: how should I incorporate NGINX into my Dockerfile? My application does not serve static content, so if I just move all my build artefacts to /usr/share/nginx/html this will simply not work. What I need is to run my Node server, and at the same time run another container with NGINX that exposes port 80 and proxies my requests.
How should I do that? Any help?
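One common pattern for this is to keep the Node app in its own container and put an nginx container in front of it that proxies to the compose service name. A minimal docker-compose sketch, assuming a local nginx/default.conf whose location / block contains proxy_pass http://api:5000;:
version: "3"
services:
  api:
    build: .            # the NestJS image from the Dockerfile above, listening on 5000
    expose:
      - "5000"
  nginx:
    image: nginx:alpine
    ports:
      - "80:80"
    volumes:
      # assumed local file containing: location / { proxy_pass http://api:5000; }
      - ./nginx/default.conf:/etc/nginx/conf.d/default.conf:ro
    depends_on:
      - api
With a setup like this only the nginx service publishes port 80, and it forwards every request to the Node app over the internal network.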

Using node to proxy external traffic from a container through another service to mock the response

I have a Node application which I want to write integration tests for. For that to work, I need to be able to mock both HTTP requests and websocket connections.
I use docker-compose to define my app dependencies. The relevant part of my docker-compose.yml is:
version: "3.2"
services:
app:
build: .
command: npm run dev
depends_on:
- proxycontainer
environment:
HTTP_PROXY: proxycontainer:8080
NO_PROXY: localhost,127.0.0.1
proxycontainer:
build: ./proxy
I have a simple Dockerfile for the node app
FROM node:8.12.0-alpine as base
WORKDIR /usr/app
COPY package*.json ./
RUN apk add --no-cache --virtual .gyp \
    python \
    make \
    g++ \
    && npm install \
    && apk del .gyp
FROM base as build
COPY . .
RUN npm run build
The proxy Dockerfile looks like
FROM node:8.12.0-alpine
WORKDIR /usr/app
COPY package*.json ./
RUN npm install
COPY . .
CMD [ "npm", "run", "proxy" ]
Where npm run proxy runs node ./index.js on this simple file:
const express = require('express')
const proxy = require('http-proxy-middleware')
const app = express()
app.use('/', proxy({ target: 'http://www.example.org', changeOrigin: true, logLevel: 'debug' }))
app.listen(8080)
To test just the proxy, I have replaced my app with:
const axios = require("axios");

const testProxy = async () => {
  const data = await axios.get("http://example.org/");
  console.log(data.data)
}
testProxy()
When running this example, I get the error Error: connect EINVAL 0.0.31.144:80 - Local (0.0.0.0:0)
So how do I proxy external requests from one Node Docker service into a Node proxy service, which can then mock the response for HTTP and websocket connections?
If I remove the HTTP_PROXY env variable, everything works as expected.
Did you try using a http:// prefix before your proxy container's name?
HTTP_PROXY: http://proxycontainer:8080
You might also create a custom network and assign a local IP address to each container, so you could access them using a static IP.
version: "3.2"
services:
app:
build: .
command: npm run dev
depends_on:
- proxycontainer
environment:
HTTP_PROXY: http://172.28.1.2:8080/
NO_PROXY: localhost,127.0.0.1
networks:
proxy_net:
ipv4_address: 172.28.1.1
proxycontainer:
build: ./proxy
networks:
proxy_net:
ipv4_address: 172.28.1.2
networks:
proxy_net:
ipam:
driver: default
config:
- subnet: 172.28.0.0/16
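For the websocket half of the question, http-proxy-middleware can also forward upgrade requests. A sketch of the proxy's index.js, assuming the ws option and the external upgrade hook described in the http-proxy-middleware documentation:
const express = require('express')
const proxy = require('http-proxy-middleware')

const app = express()
// ws: true lets the middleware proxy websocket traffic in addition to plain HTTP
const wsProxy = proxy({ target: 'http://www.example.org', changeOrigin: true, ws: true, logLevel: 'debug' })
app.use('/', wsProxy)

const server = app.listen(8080)
// forward HTTP upgrade events to the proxy so websocket handshakes are passed through
server.on('upgrade', wsProxy.upgrade)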
