How to connect to a NodeJS API from an AngularJS Docker container in the browser? - node.js

I have two Docker containers: an AngularJS application and a NodeJS application. I'm using Azure App Service with a multi-container Docker configuration, driven by the same docker-compose file I use locally. From the browser I get the application's UI, but when I click the button that should call an API on the NodeJS side, nothing happens. My Dockerfiles and docker-compose file are simple and straightforward.
I set nodejs as the container name for the NodeJS application and used that same name in the AngularJS code to reach the API, but it does not work from the browser.
docker-compose.yml
version: '3.3'
services:
  angularjs:
    container_name: 'angularjs'
    image: 'ui:1'
    ports:
      - '4200:4200'
    depends_on:
      - 'nodejs'
  nodejs:
    container_name: 'nodejs'
    image: 'api:1'
    ports:
      - '4000:4000'
In the AngularJS code, I'm calling http://nodejs:4000/data with a GET request. With this setup I get the error ERR_NAME_NOT_RESOLVED.

Related

How to deploy a multi-container docker with GitHub actions to Azure web application?

I want to deploy a web application, built with Docker, to an Azure web app.
There are plenty of tutorials and docs about deploying a single Docker image to Azure, but how do I deploy multiple images?
I want to achieve this:
Local development with Docker-Compose. Works.
Versioning with GitHub. Works.
GitHub Actions: building the Docker images and pushing them to Docker Hub (maybe not necessary if the images are registered in Azure). Works.
Deploy everything to Azure and run the web application.
There is a similar question here: How to deploy a multi-container app to Azure with a Github action?
But I want to avoid the manual step mentioned in the answer.
My docker-compose.yml:
version: '3.8'
services:
  server-app:
    image: tensorflow/serving
    command:
      - --model_config_file=/models/models.config
    ports:
      - 8501:8501
    container_name: TF_serving
    tty: true
    volumes:
      - type: bind
        source: ./content
        target: /models/
  client-app:
    build:
      context: ./client-app
      dockerfile: dockerfile
    image: user1234/client-app:latest
    restart: unless-stopped
    ports:
      - 7862:80

Multi-container app built via docker-compose in Azure Container Registry works locally but fails on Azure App Service

I containerized an app that has two servers to run: rasa and action_server. I used docker-compose to build the images and then pushed them to Azure Container Registry.
I run the images locally from Azure Container Registry and everything works fine on localhost.
However, when I deployed the app on Azure App Services, the app restarts with an error.
2022-11-04T17:57:56.583Z ERROR - Start multi-container app failed
2022-11-04T17:57:56.593Z INFO - Stopping site demosite because it failed during startup.
2022-11-04T18:01:36.904Z INFO - Starting multi-container app..
2022-11-04T18:01:36.910Z ERROR - Exception in multi-container config parsing: Exception: System.InvalidCastException, Msg: Unable to cast object of type 'YamlDotNet.RepresentationModel.YamlScalarNode' to type 'YamlDotNet.RepresentationModel.YamlMappingNode'.
2022-11-04T18:01:36.911Z ERROR - Start multi-container app failed
2022-11-04T18:01:36.914Z INFO - Stopping site demosite because it failed during startup.
docker-compose.yaml
version: '3'
services:
  rasa:
    container_name: "rasa_server"
    user: root
    build:
      context: .
    volumes:
      - "./:/app"
    ports:
      - "80:80"
  action_server:
    container_name: "action_server"
    build:
      context: actions
    volumes:
      - ./actions:/app/actions
      - ./data:/app/data
    ports:
      - "5055:5055"
I tried to make another app but the problem remains.
As stated in the documentation, build is unsupported in App Service multi-container configurations. You need to modify your compose file to reference the images stored in ACR.
Also, App Service only listens on ports 80/443, so you can't expose your action_server on port 5055.
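A minimal sketch of the adjusted compose file, assuming the images were pushed to a registry named myregistry with the tags below (all names here are placeholders):

```yaml
version: '3'
services:
  rasa:
    # Reference the prebuilt image in ACR instead of a build: section
    image: myregistry.azurecr.io/rasa_server:latest
    ports:
      - "80:80"
  action_server:
    # No published ports: App Service only routes 80/443, so the
    # rasa service must reach this container over the internal network
    image: myregistry.azurecr.io/action_server:latest
```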

SvelteKit SSR fetch() when backend is in a Docker container

I use docker compose for my project. It includes these containers:
Nginx
PostgreSQL
Backend (Node.js)
Frontend (SvelteKit)
I use SvelteKit's Load function to send requests to my backend. In short, it sends an HTTP request to the backend container on either the client side or the server side, meaning the request can be sent not only by the browser but also by the container itself.
I can't get both client-side and server-side fetch to work; only one of them works at a time.
I tried these URLs:
http://api.localhost/articles (only client-side request works)
http://api.host.docker.internal/articles (only server-side request works)
http://backend:8080/articles (only server-side request works)
I get this error:
From SvelteKit:
FetchError: request to http://api.localhost/articles failed, reason: connect ECONNREFUSED 127.0.0.1:80
From Nginx:
Timeout error
Docker-compose.yml file:
version: '3.8'
services:
  webserver:
    restart: unless-stopped
    image: nginx:latest
    ports:
      - 80:80
      - 443:443
    depends_on:
      - frontend
      - backend
    networks:
      - webserver
    volumes:
      - ./webserver/nginx/conf/:/etc/nginx/conf.d/
      - ./webserver/certbot/www:/var/www/certbot/:ro
      - ./webserver/certbot/conf/:/etc/nginx/ssl/:ro
  backend:
    restart: unless-stopped
    build:
      context: ./backend
      target: development
    ports:
      - 8080:8080
    depends_on:
      - db
    networks:
      - database
      - webserver
    volumes:
      - ./backend:/app
  frontend:
    restart: unless-stopped
    build:
      context: ./frontend
      target: development
    ports:
      - 3000:3000
    depends_on:
      - backend
    networks:
      - webserver
networks:
  database:
    driver: bridge
  webserver:
    driver: bridge
How can I send a server-side request to a Docker container using http://api.localhost/articles as the URL? I also want my container to be accessible by other containers as http://backend:8080 if possible.
Use SvelteKit's externalFetch hook to override the API URL so that server-side and client-side code use different addresses.
In docker-compose, containers can reach each other by service name as long as they are on the same Docker network.
Your frontend docker SSR should be able to call your backend docker by using the URL:
http://backend:8080
Web browser should be able to call your backend by using the URL:
(whatever reads in your Nginx configuration files)
Naturally, there are many reasons why this could fail. The best way to tackle it is to test URLs one by one, server by server, using curl and by entering the addresses into the browser's address bar. It's not possible to pinpoint the exact reason it fails, because the question does not contain enough information or a reproducible recipe for the issue.
For further information, here is our sample configuration for a dockerised SvelteKit frontend. The internal backend shortcut is defined using hooks and configuration variables. Here is our externalFetch example.
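As a rough sketch, an externalFetch hook in src/hooks.js can rewrite the public API host to the internal Docker hostname for server-side requests. The two URLs below are assumptions based on the compose file above:

```javascript
// src/hooks.js (SvelteKit pre-1.0 hook; hostnames are assumptions)
const PUBLIC_API = 'http://api.localhost';  // what the browser uses
const INTERNAL_API = 'http://backend:8080'; // what SSR inside Docker uses

export async function externalFetch(request) {
  if (request.url.startsWith(PUBLIC_API)) {
    // Re-target the request at the backend container on the Docker network
    request = new Request(
      INTERNAL_API + request.url.slice(PUBLIC_API.length),
      request
    );
  }
  return fetch(request);
}
```

With this in place, the Load function can always use the public URL; the hook transparently swaps in the container address when the fetch happens server-side.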
From a docker-compose setup you can curl from one container to another using its DNS name (the service name you gave in the compose file):
curl -X GET http://backend:8080
You can also achieve this by running all of these containers on the host network driver.
Regarding http://api.localhost/articles:
You can edit /etc/hosts
and specify the IP address your computer should contact whenever the URL http://api.localhost/articles is used.
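For example, to make api.localhost resolve to your own machine (where Nginx publishes port 80 in the compose file above), you could append a hosts entry:

```shell
# Resolve api.localhost to the loopback interface so the browser
# and host-side tools hit the local Nginx on port 80.
echo "127.0.0.1 api.localhost" | sudo tee -a /etc/hosts
```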

Docker compose networking issue with react app

I am running a react app and a json server with docker-compose.
Usually I connect to the json server from my react app by the following:
fetch('localhost:8080/classes')
.then(response => response.json())
.then(classes => this.setState({classlist:classes}));
Here is my docker-compose file:
version: "3"
services:
  frontend:
    container_name: react_app
    build:
      context: ./client
      dockerfile: Dockerfile
    image: praventz/react_app
    ports:
      - "3000:3000"
    volumes:
      - ./client:/usr/src/app
  backend:
    container_name: json_server
    build:
      context: ./server
      dockerfile: Dockerfile
    image: praventz/json_server
    ports:
      - "8080:8080"
    volumes:
      - ./server:/usr/src/app
The problem is I can't get my React app to fetch this information from the json server.
On my local machine I use 192.168.99.100:3000 to see my React app
and 192.168.99.100:8080 to see the json server, but I can't connect them with any of the following:
backend:8080/classes
json_server:8080/classes
backend/classes
json_server/classes
{host:"json_server/classes", port:8080}
{host:"backend/classes", port:8080}
Both the react app and the json server are running perfectly fine independently with docker-compose up.
What should I be putting in fetch() ?
Remember that the React application always runs in some user's browser; it has no idea that Docker is involved, and can't reach or use any of the Docker-related networking setup.
on my local machine I use [...] 192.168.99.100:8080 to see the json server
Then that's what you need in your React application too.
You might consider setting up some sort of proxy in front of this that can, for example, forward URL paths beginning with /api to the backend container and forward other URLs to the frontend container (or better still, run a tool like Webpack to compile your React application to static files and serve that directly). If you have that setup, then the React application can use a path /api/v1/... with no host, and it will be resolved relative to whatever the browser thinks "the current host" is, which should usually be the proxy.
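A minimal sketch of such a proxy as an Nginx config; the upstream names backend and frontend are assumptions matching the compose service names:

```nginx
server {
    listen 80;

    # Requests beginning with /api go to the backend container
    # (the trailing slash strips the /api prefix before forwarding).
    location /api/ {
        proxy_pass http://backend:8080/;
    }

    # Everything else goes to the frontend (or a static build).
    location / {
        proxy_pass http://frontend:3000/;
    }
}
```

With this in front, the React code can call fetch('/api/classes') with no hostname, and the browser resolves it against whatever host served the page.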
You have two solutions:
use CORS on Express server see https://www.npmjs.com/package/cors
set up proxy/reverse proxy using NGINX

Docker connect NodeJS container with Apache container in front end JS

I am building a chat application that I am implementing in Docker. I have a NodeJS container with socket.io and a container with an Apache server serving the website.
The thing is, the website's JavaScript needs to connect to the NodeJS server. I have looked at the Docker Compose docs and read about networking. The docs say the address should be the name of the container, but when I try that I get the following error in my browser console:
GET http://nodejs:3000/socket.io/socket.io.js net::ERR_NAME_NOT_RESOLVED
The whole project works outside containers. The only thing I cannot figure out is the connection between the NodeJS container and the Apache container.
code that throws the error:
<script type="text/javascript" src="//nodejs:3000/socket.io/socket.io.js"></script>
My docker compose file:
version: '3.5'
services:
  apache:
    build:
      context: ./
      dockerfile: ./Dockerfile
    networks:
      default:
    ports:
      - 8080:80
    volumes:
      - ./:/var/www/html
    container_name: apache
  nodejs:
    image: node:latest
    working_dir: /home/node/app
    networks:
      default:
    ports:
      - '3001:3000'
    volumes:
      - './node_server/:/home/node/app'
    command: [npm, start]
    depends_on:
      - mongodb
    container_name: nodejs
networks:
  default:
    driver: bridge
Can anyone explain how to successfully connect the Apache container to the NodeJS container so it can serve the socket.io.js file?
I can give more of the source code if needed.
The nodejs service is exposed on port 3001, not 3000: 3001:3000 is a port mapping that forwards host port 3001 to container port 3000. So you would need to point to nodejs:3001.
However, that won't work either, since the nodejs hostname is not resolvable by the browser. You need to point to the host where Docker is running, since that is where the ports are published. If you are running this locally, it might look like:
<script type="text/javascript" src="//localhost:3001/socket.io/socket.io.js"></script>
In other words, you are not connecting to the nodejs server from the apache service, you are accessing it externally through the browser.
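To summarize the two vantage points (the commands are illustrative and assume the stack above is running):

```shell
# From inside another container on the same compose network:
# use the service name and the *container* port.
curl http://nodejs:3000/socket.io/socket.io.js

# From the browser or the host machine: use the host address and
# the *host* port from the '3001:3000' mapping.
curl http://localhost:3001/socket.io/socket.io.js
```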
