Docker connect NodeJS container with Apache container in front end JS - node.js

I am building a chat application that I am implementing in Docker. I have a NodeJS container with socket.io and a container with an Apache server and the website on it.
The thing is, I need to connect the website (with JavaScript) to the NodeJS server. I have looked at the Docker Compose docs and read about networking. The docs said that the address should be the name of the container. But when I try that, I get the following error in my browser console:
GET http://nodejs:3000/socket.io/socket.io.js net::ERR_NAME_NOT_RESOLVED
The whole project works outside containers. The only thing I cannot figure out is the connection between the NodeJS container and the Apache container.
code that throws the error:
<script type="text/javascript" src="//nodejs:3000/socket.io/socket.io.js"></script>
My docker compose file:
version: '3.5'
services:
  apache:
    build:
      context: ./
      dockerfile: ./Dockerfile
    networks:
      default:
    ports:
      - 8080:80
    volumes:
      - ./:/var/www/html
    container_name: apache
  nodejs:
    image: node:latest
    working_dir: /home/node/app
    networks:
      default:
    ports:
      - '3001:3000'
    volumes:
      - './node_server/:/home/node/app'
    command: [npm, start]
    depends_on:
      - mongodb
    container_name: nodejs
networks:
  default:
    driver: bridge
Can anyone explain how to successfully connect the Apache container to the NodeJS container so it can serve the socket.io.js file?
I can give more of the source code if needed.

The nodejs service is exposing port 3001, not 3000: 3001:3000 is a port mapping which forwards host port 3001 to container port 3000. So you would need to point it to nodejs:3001.
However, I don't think that'll work, since the nodejs hostname is not resolvable by the browser. You need to point it to the host on which Docker is running, since that is where you are exposing those ports. If you are running this locally it might look like:
<script type="text/javascript" src="//localhost:3001/socket.io/socket.io.js"></script>
In other words, you are not connecting to the nodejs server from the apache service; you are accessing it externally through the browser.
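A minimal sketch of the alternative (assuming you run Compose locally and are free to change the published port) is to publish the same number on the host, so the page can keep requesting port 3000:

nodejs:
  image: node:latest
  ports:
    - '3000:3000'   # host port now matches the container port

and then load the script from the Docker host instead of the service name:

<script type="text/javascript" src="//localhost:3000/socket.io/socket.io.js"></script>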

Related

I can't connect a nodejs app to a redis server using docker

Good morning guys.
I'm having a problem connecting a nodejs application, in a container, to another container that runs a redis server. On my local machine I can connect the application to this redis container without any problem. However, when I bring the application up in a container, a timeout error is returned.
I'm new to docker and I don't understand why I can connect to this docker container from the application running locally on my machine, but that same connection doesn't work when I run the application in a container.
I tried using docker-compose, but from what I understand it will bring up another container for the redis server, instead of using the redis container that is already in docker.
To connect to redis I'm using the following code:
const { createClient } = require('redis');

const client = createClient({
  socket: {
    host: process.env.REDIS_HOST,
    port: Number(process.env.REDIS_PORT)
  }
});
Where REDIS_HOST is the address of my container running on the server and REDIS_PORT is the port where this container is running on my server.
To run redis on docker I used the following guide: https://redis.io/docs/stack/get-started/install/docker/
I apologize if my problem was not very clear, I'm still studying docker.
You mentioned you are using Docker Compose. Here's an example showing how to start Redis in a container, make your Node application wait for that container, and then use an environment variable in your Node application to specify the name of the host to connect to Redis on. In this example it connects to the container running Redis via the service that I've called "redis":
version: "3.9"
services:
redis:
container_name: redis_kaboom
image: "redislabs/redismod"
ports:
- 6379:6379
volumes:
- ./redisdata:/data
entrypoint:
redis-server
--loadmodule /usr/lib/redis/modules/rejson.so
--appendonly yes
deploy:
replicas: 1
restart_policy:
condition: on-failure
node:
container_name: node_kaboom
build: .
volumes:
- .:/app
- /app/node_modules
command: sh -c "npm run load && npm run dev"
depends_on:
- redis
ports:
- 8080:8080
environment:
- REDIS_HOST=redis
So in your Node code you'd then use the value of process.env.REDIS_HOST to connect to the right Redis host. Here I'm not using a password or a non-standard port; you could also supply those as environment variables that match the configuration of the Redis container in Docker Compose, if you needed to.
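A minimal sketch of what that connection code could look like with node-redis v4 (the fallback values are illustrative assumptions, not part of the compose file above):

const { createClient } = require('redis');

// REDIS_HOST is injected by the compose file's environment section;
// the fallbacks below are only for running outside Docker.
const client = createClient({
  socket: {
    host: process.env.REDIS_HOST || 'redis',
    port: Number(process.env.REDIS_PORT) || 6379
  }
});

client.connect()
  .then(() => console.log('Connected to Redis at', process.env.REDIS_HOST))
  .catch((err) => console.error('Redis connection failed', err));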
Disclosure: I work for Redis.

SvelteKit SSR fetch() when backend is in a Docker container

I use docker compose for my project. It includes these containers:
Nginx
PostgreSQL
Backend (Node.js)
Frontend (SvelteKit)
I use SvelteKit's load function to send requests to my backend. In short, it sends an http request to the backend container on either the client side or the server side. This means the request can be sent not only by the browser but also by the container itself.
I can't get it to work on both client-side and server-side fetch. Only one of them is working.
I tried these URLs:
http://api.localhost/articles (only client-side request works)
http://api.host.docker.internal/articles (only server-side request works)
http://backend:8080/articles (only server-side request works)
I get this error:
From SvelteKit:
FetchError: request to http://api.localhost/articles failed, reason: connect ECONNREFUSED 127.0.0.1:80
From Nginx:
Timeout error
Docker-compose.yml file:
version: '3.8'
services:
  webserver:
    restart: unless-stopped
    image: nginx:latest
    ports:
      - 80:80
      - 443:443
    depends_on:
      - frontend
      - backend
    networks:
      - webserver
    volumes:
      - ./webserver/nginx/conf/:/etc/nginx/conf.d/
      - ./webserver/certbot/www:/var/www/certbot/:ro
      - ./webserver/certbot/conf/:/etc/nginx/ssl/:ro
  backend:
    restart: unless-stopped
    build:
      context: ./backend
      target: development
    ports:
      - 8080:8080
    depends_on:
      - db
    networks:
      - database
      - webserver
    volumes:
      - ./backend:/app
  frontend:
    restart: unless-stopped
    build:
      context: ./frontend
      target: development
    ports:
      - 3000:3000
    depends_on:
      - backend
    networks:
      - webserver
networks:
  database:
    driver: bridge
  webserver:
    driver: bridge
How can I send a server-side request to the docker container using http://api.localhost/articles as the URL? I also want my container to be accessible by other containers as http://backend:8080 if possible.
Use SvelteKit's externalFetch hook to have a different, overridden API URL on the server side than in the browser.
In docker-compose, the containers should be able to access each other by name if they are in the same Docker network.
Your frontend docker SSR should be able to call your backend docker by using the URL:
http://backend:8080
The web browser should be able to call your backend by using the URL:
(whatever is configured in your Nginx configuration files)
Naturally, there are many reasons why this could fail. The best way to tackle this is to test the URLs one by one, server by server, using curl and by entering addresses into the web browser's address bar. It's not possible to say exactly why it fails, because the question does not contain enough information or a generally repeatable recipe for the issue.
For further information, here is our sample configuration for a dockerised SvelteKit frontend. The internal backend shortcut is defined using hooks and configuration variables. Here is our externalFetch example.
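A minimal sketch of such a hook (the hostnames come from the compose file above; treat the exact rewrite as an assumption about your setup):

// src/hooks.js
export async function externalFetch(request) {
  // During SSR inside the frontend container the public hostname is not
  // resolvable, so rewrite it to the internal Compose service name.
  if (request.url.startsWith('http://api.localhost/')) {
    request = new Request(
      request.url.replace('http://api.localhost/', 'http://backend:8080/'),
      request
    );
  }
  return fetch(request);
}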
From a docker-compose setup you will be able to curl from one container to another using the DNS name (the service name you gave in the compose file):
curl -XGET backend:8080
You can also achieve this by running all of these containers on the host network driver.
Regarding http://api.localhost/articles: you can change /etc/hosts and specify the IP you want your computer to try to communicate with when that URL is used.
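For example, a sketch of such an /etc/hosts entry on the machine running the browser (the IP is an assumption; use whichever host actually publishes the API):

# /etc/hosts
127.0.0.1   api.localhost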

How to connect to Node API docker container from Angular Nginx container

I am currently working on an Angular app using a REST API (Express, Nodejs) and Postgresql. Everything worked well when hosted on my local machine. After testing, I moved the images to an Ubuntu server so the app can be hosted on an external port. I am able to access the Angular frontend using https://server-external-ip:80, but when trying to log in, Nginx is not connecting to the Node API. Here is my docker-compose file:
version: '3.0'
services:
  db:
    image: postgres:9.6-alpine
    environment:
      POSTGRES_DB: myDb
      POSTGRES_PASSWORD: myPwd
    ports:
      - 5432:5432
    restart: always
    volumes:
      - ./postgres-data:/var/lib/postgresql/data
    networks:
      - my-network
  backend: # name of the second service
    image: myId/mynodeapi
    ports:
      - 3000:3000
    environment:
      POSTGRES_DB: myDb
      POSTGRES_PASSWORD: myPwd
      POSTGRES_PORT: 5432
      POSTGRES_HOST: db
    depends_on:
      - db
    networks:
      - my-network
    command: bash -c "sleep 20 && node server.js"
  myapp:
    image: myId/myangularapp
    ports:
      - "80:80"
    depends_on:
      - backend
    networks:
      - my-network
networks:
  my-network:
I am not sure what the apiUrl should be. I have tried the following and nothing worked:
apiUrl: "http://backend:3000/api"
apiUrl: "http://server-external-ip:3000/api"
apiUrl: "http://server-internal-ip:3000/api"
apiUrl: "http://localhost:3000/api"
I think you should use the docker-compose service name as a DNS name. It seems you have several docker hosts/ports available; the following are in your docker-compose structure:
db:5432
http://backend:3000
http://myapp
Make sure to use db as POSTGRES_HOST in the environment part of the backend service.
Take a look at my repo; I think it is the best way to learn how a similar project works and how to build several apps with nginx. You can also check my docker-compose.yml: it uses several services that are proxied by nginx and work together.
On this link you'll find a nginx/default.conf file that contains several nginx upstream configurations; please take a look at how I used docker-compose service references there as hosts.
Inside the client/ directory, I also have another nginx acting as the web server of a react.js project.
In the server/ directory, there is a Node.js API; it connects to Redis and a Postgres SQL database, also built from docker-compose.yml.
If you need to route or redirect traffic to /api you can use some nginx config like this.
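A hypothetical sketch of such a default.conf (not the linked repo's file; the backend service name and port 3000 are taken from the compose file above):

upstream api {
    server backend:3000;
}

server {
    listen 80;

    # Serve the Angular build from this container
    location / {
        root /usr/share/nginx/html;
        try_files $uri $uri/ /index.html;
    }

    # Forward /api requests to the Node API container by its service name
    location /api {
        proxy_pass http://api;
        proxy_set_header Host $host;
    }
}

With a config like this the Angular build can use a relative apiUrl such as "/api", so the browser only ever talks to nginx and nginx reaches the backend over the Compose network.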
I think this use case can be useful for you and other users!

NodeJS 14 in a Docker container can't connect to Postgres DB (in/out docker)

I'm making a React-Native app using Rest API (NodeJS, Express) and PostgreSQL.
Everything works fine when hosted on my local machine.
Everything works fine when the API is hosted on my machine and PostgreSQL is in a docker container.
But when the backend and frontend are both in docker, the database is reachable from my local machine, but not by the backend.
I'm using docker-compose.
version: '3'
services:
  wallnerbackend:
    build:
      context: ./backend/
      dockerfile: ../Dockerfiles/server.dockerfile
    ports:
      - "8080:8080"
  wallnerdatabase:
    build:
      context: .
      dockerfile: ./Dockerfiles/postgresql.dockerfile
    ports:
      - "5432:5432"
    volumes:
      - db-data:/var/lib/postgresql/data
    env_file: .env_docker
volumes:
  db-data:
.env_docker and .env have the same parameters (just name changing).
Here is my dockerfiles:
Backend
FROM node:14.1
COPY package*.json ./
RUN npm install
COPY . .
CMD ["npm", "start"]
Database
FROM postgres:alpine
COPY ./wallnerdb.sql /docker-entrypoint-initdb.d/
I tried changing the hostname in the connection URL to postgres by using the container name, my host IP address, and localhost, but with no results.
It's also the same .env file (in my node repo, with db_name, password, etc.) that I use locally to connect my backend to the db.
Since you are using NodeJS 14 in the Docker Container - make sure that you have the latest pg dependency installed:
https://github.com/brianc/node-postgres/issues/2180
Alternatively: Downgrade to Node 12.
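For example, a sketch (run wherever the backend's package.json lives, before rebuilding the image):

npm install pg@latest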
Also make sure that both the database and the "backend" are in the same network, and that the backend "depends on" the database.
version: '3'
services:
  wallnerbackend:
    build:
      context: ./backend/
      dockerfile: ../Dockerfiles/server.dockerfile
    ports:
      - '8080:8080'
    networks:
      - default
    depends_on:
      - wallnerdatabase
  wallnerdatabase:
    build:
      context: .
      dockerfile: ./Dockerfiles/postgresql.dockerfile
    ports:
      - '5432:5432'
    volumes:
      - db-data:/var/lib/postgresql/data
    env_file: .env_docker
    networks:
      - default
volumes:
  db-data:
networks:
  default:
This should not be necessary in your case - as pointed out in the comments - since Docker Compose already creates a default network.
The container name "wallnerdatabase" is the host name of your database, if not configured otherwise.
I expect the issue to be in the database connection URL since you did not share it.
Containers in the same network in a docker-compose.yml can reach each other using the service name. In your case the service name of the database is wallnerdatabase so this is the hostname that you should use in the database connection URL.
The database connection URL that you should use in your backend service should be similar to this:
postgres://user:password@wallnerdatabase:5432/dbname
Also make sure that the backend code is calling the database using the hostname wallnerdatabase as it is defined in the docker-compose.yml file.
Here is the reference on Networking in Docker Compose.
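A minimal sketch of that connection in the backend using node-postgres (the DATABASE_URL variable and the credentials are assumptions; substitute the values from your .env_docker):

const { Pool } = require('pg');

// Use the Compose service name "wallnerdatabase" as the hostname.
const pool = new Pool({
  connectionString:
    process.env.DATABASE_URL ||
    'postgres://user:password@wallnerdatabase:5432/dbname'
});

pool.query('SELECT 1')
  .then(() => console.log('database reachable'))
  .catch((err) => console.error('connection failed:', err.message));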
You should access your DB using the service name as the hostname. Here is my working example - https://gitlab.com/gintsgints/vue-fullstack/-/blob/master/docker-compose.yml

How to run a node js docker container using docker-compose to manage php app assets

Let's say we have three services:
- php+ apache
- mysql
- nodejs
I know how to use docker-compose to set up an application linking the mysql and php+apache services. I was wondering how we can add a node.js service just to manage js/css assets. The purpose of the node.js service is just to manage javascript/css resources. Since docker provides this flexibility, I was wondering about using a docker service instead of setting up node.js on my host computer.
version: '3.2'
services:
  web:
    build: .
    image: lap
    volumes:
      - ./webroot:/var/www/app
      - ./configs/php.ini:/usr/local/etc/php/php.ini
      - ./configs/vhost.conf:/etc/apache2/sites-available/000-default.conf
    links:
      - dbs:mysql
  dbs:
    image: mysql
    ports:
      - "3307:3306"
    environment:
      - MYSQL_ROOT_PASSWORD=root
      - MYSQL_PASSWORD=rest
      - MYSQL_DATABASE=symfony_rest
      - MYSQL_USER=restman
    volumes:
      - /var/mysql:/var/lib/mysql
      - ./configs/mysql.cnf:/etc/mysql/conf.d/mysql.cnf
  node:
    image: node
    volumes:
      - ./webroot:/var/app
    working_dir: /var/app
I am not sure this is the correct strategy; I am sharing ./webroot with both the web and node services. docker-compose up -d only starts mysql and web and fails to start the node container; probably there is no valid entrypoint set.
If you want to use the node js service separately from the PHP service, you must set two more options to keep node running. One is stdin_open and the other is tty, like below:
stdin_open: true
tty: true
This is equivalent to the CLI flag -it, like below:
docker container run --name nodeapp -it node:latest
If you have a separate port on which to run your node app (e.g. your frontend is completely separate from your backend and you must run it independently, for instance with npm run start), you must publish that port like below:
ports:
- 3000:3000
The ports structure is hostPort:containerPort.
This means: publish port 3000 from inside the node container to port 3000 on the host system. In other words, it makes port 3000 inside your container accessible on your system, so you can access it at localhost:3000.
In the end, your node service would look like below:
node:
  image: node
  stdin_open: true
  tty: true
  volumes:
    - ./webroot:/var/app
  working_dir: /var/app
You can also add an nginx service to docker-compose, and nginx can take care of forwarding requests to the php container or the node.js container. You need some server that binds to port 80 and redirects requests to the designated container.
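A sketch of what such a service could look like in the compose file above (the nginx.conf path and its contents are assumptions):

nginx:
  image: nginx:latest
  ports:
    - "80:80"
  volumes:
    - ./configs/nginx.conf:/etc/nginx/conf.d/default.conf
  depends_on:
    - web
    - node

Inside that config, location blocks can proxy_pass to http://web for PHP pages and to the node service for asset or dev-server requests, using the Compose service names as hostnames.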
