Node.js and React inside Docker containers - node.js

I am trying to run a Node/React project inside Docker containers. I have a Node.js server for the APIs and a client app. I also have concurrently installed, and everything works fine when running npm run dev.
The issue is that when I run the server and the app via a docker-compose.yml file, I get the following error from the client:
client | [HPM] Error occurred while trying to proxy request /api/current_user from localhost:3000 to http://localhost:5000 (ECONNREFUSED) (https://nodejs.org/api/errors.html#errors_common_system_errors)
Here is the docker-compose.yml
version: "3"
services:
frontend:
container_name: client
build:
context: ./client
dockerfile: Dockerfile
image: client
ports:
- "3000:3000"
volumes:
- ./client:/usr/src/app
networks:
- local
backend:
container_name: server
build:
context: ./
dockerfile: Dockerfile
image: server
ports:
- "5000:5000"
depends_on:
- frontend
volumes:
- ./:/usr/src/app
networks:
- local
networks:
local:
driver: bridge
Server Dockerfile
FROM node:lts-slim
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
EXPOSE 5000
# You can change this
CMD [ "npm", "run", "dev" ]
Client Dockerfile
FROM node:lts-slim
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
EXPOSE 3000
CMD [ "npm", "start" ]
I am using "http-proxy-middleware": "^0.21.0", so my setupProxy.js is:
const proxy = require('http-proxy-middleware');

module.exports = function(app) {
  app.use(proxy('/auth/google', { target: 'http://localhost:5000' }));
  app.use(proxy('/api/**', { target: 'http://localhost:5000' }));
};

You should use the container name (or Compose service name) instead of localhost:
app.use(proxy('/auth/google', { target: 'http://server:5000' }));
app.use(proxy('/api/**', { target: 'http://server:5000' }));
You can also check these details by inspecting your network with the following command:
docker inspect <network_name>
It will show all the containers connected to the network, along with the host names created for those containers.
NOTE: Host names are based on container_name if it is set, otherwise on the service name.
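If you prefer not to hard-code the host, the proxy target can also come from an environment variable. A minimal sketch, assuming an API_HOST variable that you would add yourself under the frontend service's environment in docker-compose.yml (it is not in the file above):
// client/src/setupProxy.js -- sketch only; API_HOST is an assumed variable (e.g. API_HOST=server)
const proxy = require('http-proxy-middleware');

const target = 'http://' + (process.env.API_HOST || 'localhost') + ':5000';

module.exports = function(app) {
  app.use(proxy('/auth/google', { target }));
  app.use(proxy('/api/**', { target }));
};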

Related

Can't access Node.js/Express API from Vue.js app on same Docker network

I am building a web architecture with a Vue.js frontend, a Node.js/Express backend, and a MongoDB database. Everything is dockerized in separate containers on the same subnet.
Docker config files:
docker-compose.yml
version: "2"
services:
backend:
container_name: WESCANADMINBACK
image: "node:latest"
depends_on:
- db
working_dir: /home/node/app
volumes:
- /home/***/***/data/backend:/home/node/app
- /usr/app/node_modules
environment:
- MONGO_URL=mongodb://172.0.43.9:27017/wescan
- APP_PORT=8080
expose:
- "8080"
command: "node app.js"
networks:
adminNet:
ipv4_address: 172.0.43.8
db:
container_name: WESCANADMINDB
image: mongo:4.0
restart: always
networks:
adminNet:
ipv4_address: 172.0.43.9
frontend:
container_name: WESCANADMIN
build:
context: /home/***/***/data
volumes:
- /home/***/***/data:/app
- /app/node_modules
expose:
- "80"
environment:
- BACKEND_URL=http://172.0.43.8/wescan
networks:
adminNet:
ipv4_address: 172.0.43.10
networks:
adminNet:
driver: bridge
ipam:
config:
- subnet: 172.0.43.0/24
Dockerfile (frontend)
FROM node:latest as builder
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
RUN npm run build
FROM nginx:alpine as production-build
COPY ./.nginx/nginx.conf /etc/nginx/nginx.conf
RUN rm -rf /usr/share/nginx/html/*
COPY --from=builder /app/dist /usr/share/nginx/html
ENV HOST 0.0.0.0
EXPOSE 80
ENTRYPOINT ["nginx", "-g", "daemon off;"]
I tried sending a request from my server to the IP address of the API and it works. However, when the request is sent from the web application with Axios, it is impossible to reach the API.
import axios from 'axios'

const instance = axios.create({
  crossDomain: true,
  baseURL: 'http://172.0.43.8:8080/api'
})

export default instance
I tried a lot of things from the web, like ENV HOST 0.0.0.0 in the Dockerfile or crossDomain: true in the Axios parameters, but nothing worked.

Dockerizing React-Express-MongoDB Atlas App with Docker Compose and Proxy

I have a react-express app that connects to MongoDB Atlas and is deployed to Google App Engine. Currently, the client folder is stored inside the backend folder. I manually build the React client app with npm run build, then use gcloud app deploy to push the entire backend app and the built React files to GAE, where it is connected to a custom domain. I am looking to dockerize the app, but am running into problems with getting my backend up and running. Before Docker, I was using express static middleware to serve the React files, and had a setupProxy.js file in the client directory that I used to redirect api requests.
My express app middleware:
if (process.env.NODE_ENV === 'production') {
  app.use(express.static(path.join(__dirname, '/../../client/build')));
}

if (process.env.NODE_ENV === 'production') {
  app.get('/*', (req, res) => {
    res.sendFile(path.join(__dirname, '/../../client/build/index.html'));
  });
}
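For reference, a setupProxy.js that forwards API calls to the backend service under Compose typically looks something like this (a sketch only; the context path and options are assumptions based on the docker-compose.yml below):
// client/src/setupProxy.js -- hypothetical sketch, not the poster's actual file
const { createProxyMiddleware } = require('http-proxy-middleware');

module.exports = function (app) {
  // "backend" is the Compose service name; 5000 matches the backend Dockerfile's EXPOSE
  app.use('/api', createProxyMiddleware({ target: 'http://backend:5000', changeOrigin: true }));
};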
My client Dockerfile:
FROM node:14-slim
WORKDIR /usr/src/app
COPY ./package.json ./
RUN npm install
COPY . .
RUN npm run build
EXPOSE 3000
CMD [ "npm", "start"]
My backend Dockerfile:
FROM node:14-slim
WORKDIR /usr/src/app
COPY ./package.json ./
RUN npm install
COPY . .
EXPOSE 5000
CMD [ "npm", "start"]
docker-compose.yml
version: '3.4'
services:
  backend:
    image: backend
    build:
      context: backend
      dockerfile: ./Dockerfile
    environment:
      NODE_ENV: production
    ports:
      - 5000:5000
    networks:
      - mern-app
  client:
    image: client
    build:
      context: client
      dockerfile: ./Dockerfile
    stdin_open: true
    environment:
      NODE_ENV: production
    ports:
      - 3000:3000
    networks:
      - mern-app
    depends_on:
      - backend
networks:
  mern-app:
    driver: bridge
I'm hoping someone can provide some insight into how to effectively connect my React and Express apps in their containers using Docker Compose so that I can make calls to my API server. Thanks in advance!

Docker-compose builds but app does not serve on localhost

Docker newbie here. The docker-compose file builds without any issues, but when I try to open my app on localhost:4200 I get "localhost didn't send any data" in Chrome and "the server unexpectedly dropped the connection" in Safari. I am working on macOS Catalina. Here is my yml file:
version: '3.0'
services:
  my-portal:
    build: .
    ports:
      - "4200:4200"
    depends_on:
      - backend
  backend:
    build: ./backend
    ports:
      - "3000:3000"
    environment:
      POSTGRES_HOST: host.docker.internal
      POSTGRES_USER: "postgres"
      POSTGRES_PASSWORD: mypwd
    depends_on:
      - db
  db:
    image: postgres:9.6-alpine
    environment:
      POSTGRES_DB: mydb
      POSTGRES_USER: "postgres"
      POSTGRES_PASSWORD: mypwd
      POSTGRES_HOST: host.docker.internal
    ports:
      - 5432:5432
    restart: always
    volumes:
      - ./docker/db/data:/var/lib/postgresql/data
Log for Angular:
/docker-entrypoint.sh: Configuration complete; ready for start up
Log for Node: db connected
Log for Postgres: database system is ready to accept connections
Below are my Angular and Node Docker files:
FROM node:latest AS builder
WORKDIR /app
COPY . .
RUN npm install
RUN npm run build --prod
EXPOSE 4200
# Stage 2
FROM nginx:alpine
COPY --from=builder /app/dist/* /usr/share/nginx/html/
Node:
FROM node:12
WORKDIR /backend
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 3000
CMD [ "node", "server.js" ]
When I created the Angular image and ran my app on localhost:4200, it worked fine. Please let me know if I am missing anything.
Your Angular container is built FROM nginx, and you use the default Nginx configuration from the Docker Hub nginx image. That listens on port 80, so that's the port number you need to use in the ports: directive:
services:
  my-portal:
    build: .
    ports:
      - "4200:80" # <-- second port must match nginx image's port
    depends_on:
      - backend
The EXPOSE directive in the first stage is ignored and you can delete it. The second FROM nginx:alpine line causes docker build to start over from a new base image, so your final image is stock Nginx plus the files you COPY --from=builder.
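For reference, the default config shipped in the nginx image (/etc/nginx/conf.d/default.conf) looks roughly like this, abridged, which is why the container serves on port 80:
server {
    listen       80;
    server_name  localhost;

    location / {
        root   /usr/share/nginx/html;
        index  index.html index.htm;
    }
}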

Connection Refused Error on React using env Variables from Docker

I'm trying to define some env variables in my Docker files in order to use them in my React application.
The following is my Dockerfile on the Node server side:
FROM node:lts-slim
RUN mkdir -p /app
WORKDIR /app
# install node_modules
ADD package.json /app/package.json
RUN npm install --loglevel verbose
# copy codebase to docker codebase
ADD . /app
EXPOSE 8081
# You can change this
CMD [ "nodemon", "serverApp.js" ]
This is my docker-compose file:
version: "3"
services:
frontend:
stdin_open: true
container_name: firestore_manager
build:
context: ./client/firestore-app
dockerfile: DockerFile
image: rasilvap/firestore_manager
ports:
- "3000:3000"
volumes:
- ./client/firestore-app:/app
environment:
- BACKEND_HOST=backend
- BACKEND_PORT=8081
depends_on:
- backend
backend:
container_name: firestore_manager_server
build:
context: ./server
dockerfile: Dockerfile
image: rasilvap/firestore_manager_server
ports:
- "8081:8081"
volumes:
- ./server:/app
This is how I'm using it in the React code:
axios.delete(`http://backend:8081/firestore/`, request).then((res) => {....
But I'm getting a connection refused error. I'm new to React and not quite sure how I can achieve this.
Any ideas?
Your way of requesting the service looks fine: http://foo-service:port.
I think your issue is a security issue.
Because the two applications are not considered to be on the same origin (origin = domain + protocol + port), you run into a CORS (Cross-Origin Resource Sharing) requirement.
In that scenario, your browser will not perform the Ajax request unless the backend, in response to the preflight CORS request, agrees to share its resources with that other "origin".
So enable CORS in the backend to solve the issue (each API/framework has its own way).
That post may help you with Firebase.
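For an Express backend, enabling CORS is usually a one-liner with the cors middleware. A minimal sketch (the allowed origin is an assumption based on the frontend's published port):
// serverApp.js -- sketch only; adjust or omit the origin as needed
const express = require('express');
const cors = require('cors');

const app = express();
app.use(cors({ origin: 'http://localhost:3000' })); // or app.use(cors()) to allow any origin

// ...routes...

app.listen(8081);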

Docker, Nuxt.js and Django getting error (ECONNREFUSED)

I am currently working on Docker with Django and Nuxt.js. I can get JSON data from https://jsonplaceholder.typicode.com/posts in asyncData or nuxtServerInit; it fetches correctly and there is no error. But when I want to fetch from my Django REST API service, I get an error (CORS headers are added). I tested fetching posts from the Django REST API with the created() hook and it works. I don't understand why it gets the error.
pages/index.vue
export default {
  asyncData(context) {
    return context.app.$axios.$get('http://127.0.0.1:8000') // this is throwing an error
      .then((res) => {
        let posts = [];
        posts = res;
        return { posts: res }
      })
  },
  created() {
    this.$axios.$get('http://127.0.0.1:8000') // this is not throwing any error
      .then(e => { console.log(e); })
  }
}
My docker-compose file:
version: '3'
networks:
  main:
    driver: bridge
services:
  api:
    container_name: blog_api
    build:
      context: ./backend
    ports:
      - "8000:8000"
    command: >
      sh -c "python manage.py runserver 0.0.0.0:8000"
    volumes:
      - ./backend:/app
    networks:
      - main
  web:
    container_name: blog_web
    build:
      context: ./frontend
    ports:
      - "8080:8080"
    volumes:
      - ./frontend:/code
    networks:
      - main
Backend Dockerfile
FROM python:3.8-alpine
ENV PYTHONUNBUFFERED 1
ENV PYTHONDONTWRITEBYTECODE 1
RUN mkdir /app
WORKDIR /app
COPY ./requirements.txt /app/
RUN pip install -r requirements.txt
EXPOSE 8000
ADD . /app
Frontend Dockerfile
FROM node:13-alpine
WORKDIR /code/
COPY . .
EXPOSE 8080
ENV NUXT_HOST=0.0.0.0
ENV NUXT_PORT=8080
RUN npm install
CMD ["npm", "run", "dev"]
nuxt.config.js
server: {
  port: 8080,
  host: '0.0.0.0'
}
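One detail worth keeping in mind: asyncData runs inside the blog_web container during server-side rendering, where 127.0.0.1:8000 is the Nuxt container itself, while created() runs in the browser, where the host's published port is reachable. A minimal sketch of switching the base URL by context ("api" is the Compose service name from above; treat this as an assumption, not a confirmed fix):
// pages/index.vue -- sketch only
export default {
  asyncData(context) {
    // process.server is true during SSR inside the container
    const base = process.server ? 'http://api:8000' : 'http://127.0.0.1:8000'
    return context.app.$axios.$get(base)
      .then((res) => ({ posts: res }))
  }
}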
