How to Dockerise a front-end application to communicate with an already dockerised API application - node.js

Currently, I am developing an application that makes use of a fully dockerised Node API service (served locally by Nginx inside Docker) and a simple React front-end served locally using PHP Valet.
Running docker-compose up builds the dockerised application successfully; it is hosted by Nginx on port 80, which proxy-passes requests to the Node API on port 9999.
The problem I am having is getting the React front-end to communicate with the dockerised Node API.
Ideally, I would like to either move the dockerised Node API service so that it is served by PHP Valet, or move the front-end web app so that it is also served by the dockerised Nginx service.
How can I fully dockerise the front-end and back-end services together so that they can talk to each other (ideally both hosted on port 80)? Alternatively, how can I use Valet or MAMP together with the containerised Node API?

With or without Docker, you will run into the same-origin policy for frontend requests to the backend (XMLHttpRequest and some other request types), since two URLs only have the same origin if the protocol, host and port are all the same.
If the two apps run as containers, the hosts differ by design, since they are two distinct containers. The same applies if you keep the frontend app on the host machine.
So in any case, you should enable CORS on the backend side (Node.js), at least for requests from the frontend host. Nothing is needed on the frontend side, since CORS is handled by the browser: it sends preflight requests to the target host to negotiate resource sharing.
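For example, a minimal sketch with Express and the cors middleware (the allowed origin is an assumption; set it to wherever the frontend is actually served from, e.g. the Valet host):

```ts
import express from 'express';
import cors from 'cors';

const app = express();

// Allow cross-origin requests from the frontend host only
// (hypothetical origin; adjust to where Valet serves the React app)
app.use(cors({ origin: 'http://frontend.test' }));

app.get('/api/hello', (_req, res) => {
  res.json({ ok: true });
});

// Port matches the question's Node API
app.listen(9999);
```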
As for making the frontend a Docker container: if the frontend is designed to work with the Node backend, you should probably build it as a Docker image/container as well and use docker-compose to manage them together; it makes them simpler to manage and maintain.
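A minimal docker-compose sketch of that layout (service names and build paths are assumptions; the ports match the question, Nginx on 80 and the Node API on 9999):

```yaml
version: "3"
services:
  api:
    build: ./api        # assumed path to the Node API project
    expose:
      - "9999"          # reachable by other services, not the host
  frontend:
    build: ./frontend   # e.g. an Nginx image serving the React build
    ports:
      - "80:80"         # the single port published on the host
    depends_on:
      - api
```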

Build the front-end to static HTML and serve it from the backend's static directory.
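A minimal sketch of that approach, assuming a create-react-app style build output in a build folder:

```ts
import express from 'express';
import path from 'path';

const app = express();

// Serve the compiled front-end (e.g. what "npm run build" produces)
app.use(express.static(path.join(__dirname, 'build')));

app.listen(80);
```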
The React front-end is loaded onto a user's machine and executes there, in their browser. It communicates with the server over HTTP(S).
How are you serving the React app now? A React app is just plain text files that are served over HTTP.

Related

Deploying back-end and front-end on AWS

I have a full-stack application with Node and Express for the back-end (with a Postgres database already created in AWS RDS) and Angular for the front-end.
When running locally in development, I have the back-end listening on port 3000, with a Pool client connected to the Postgres DB in AWS RDS. Separately, I have my front-end listening on port 4200.
When I run the server and the Angular app on these two different ports and open my browser, everything works fine.
Now my questions are about how to deploy this same structure to AWS all together.
Should I deploy the back-end and front-end in AWS listening on these two different ports, or should they listen on the same one with a proxy server like Nginx in front, as I have been reading?
In the latter case, how?
If you have a simple setup, and you always deploy frontend and backend at the same time, one option would be to just let your Express backend also serve your static Angular files.
This avoids any unnecessary complexity and components.
So, you have both frontend and backend exposed on the same port. Maybe all your backend APIs listen under the /api route; everything else is served statically, with a fallback to index.html for client-side routing.
By having frontend and backend on the same origin (host:port), you also avoid any CORS issues.
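A sketch of that single-origin layout (the Angular dist path and the port are assumptions):

```ts
import express from 'express';
import path from 'path';

const app = express();

// Assumed Angular build output directory
const dist = path.join(__dirname, 'dist', 'my-app');

// Backend API routes live under /api
app.get('/api/health', (_req, res) => {
  res.json({ status: 'ok' });
});

// Everything else is served statically from the Angular build
app.use(express.static(dist));

// Fallback so deep links like /users/42 still load the SPA
app.get('*', (_req, res) => {
  res.sendFile(path.join(dist, 'index.html'));
});

app.listen(3000);
```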

HTTP requests between two Docker services work fine but I don't know why [duplicate]

I have a ReactJS project with its own Dockerfile, exposing port 3000:3000.
I also have a PHP project with its own Dockerfile, exposing port 80:80. The PHP app also has containers for MySQL, Redis and Nginx.
For the PHP app, I have a docker-compose file that creates a network (my-net) for PHP, Nginx, MySQL and Redis to communicate on. However, I now want the ReactJS app (which is in a separate project) to be able to communicate with the PHP app.
I added a docker-compose file to the React project and added it to the my-net network from the PHP project, declaring the network as external so that Compose doesn't try to create it again.
This seems to work: from the ReactJS container, I can ping app (the name of my backend service) and it works properly. However, from the ReactJS code, if I use something like axios to try and hit the backend API, it can't resolve app or http://app or any variation. It can, however, access the backend if I substitute the underlying IP address into the axios call.
So there seems to be some issue with the hostname resolution, and presumably this is on the axios/JavaScript end. Is there something I'm missing, or a reason this isn't working?
When the JavaScript runs in a browser (outside of Docker), you cannot use app, because that hostname is only resolvable inside the Docker network (via Docker's embedded DNS server).
To access your PHP server from outside, use localhost and the exposed port (80) instead.
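For example, the React code would point axios at the published port rather than at the service name (port 80, per the question):

```ts
import axios from 'axios';

// "app" only resolves via Docker's embedded DNS, i.e. from inside
// the Docker network. Browser code must use the port published on
// the host instead.
const api = axios.create({ baseURL: 'http://localhost:80' });

// e.g. api.get('/users') now reaches the PHP container through the
// published port mapping, not through the service name.
```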

Using an auth cookie when the local server and SPA run on different ports

I have a Nest.js webserver listening on localhost:3000 and an Angular frontend served on localhost:4200 (with the dev server). These ports are the defaults. My authentication flow consists of sending an access token in a cookie to the frontend, which doesn't get sent back on subsequent calls because the different ports make the origins differ. Can I overcome this issue somehow? I understand that if I don't run the Angular dev server and only run npm run build, then I can serve the static files with Nest.js, which would solve the domain issue for the cookie, but this way I would lose watch mode and hot reloading for Angular. Any suggestions?
Your Nest.js webserver needs to set the Access-Control-Allow-Origin header so that the browser running the frontend doesn't complain about communicating with that host.
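With Nest.js this can be done through enableCors; a minimal sketch, assuming the default ports from the question:

```ts
import { NestFactory } from '@nestjs/core';
import { AppModule } from './app.module';

async function bootstrap() {
  const app = await NestFactory.create(AppModule);
  app.enableCors({
    origin: 'http://localhost:4200', // the Angular dev server
    credentials: true,               // allow cookies on CORS requests
  });
  await app.listen(3000);
}
bootstrap();
```

Note that credentials: true matters here: without it (and without sending the Angular requests with withCredentials: true) the browser won't attach the auth cookie to cross-origin calls.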

Is there a way to run a React app in Nginx? (Docker, Nginx knowledge required)

I have a React app and a Node.js API server. The React app makes fetch requests to the Node.js server. The React app and the Node.js server are deployed in their own containers.
The problem is that I can access the Node.js server directly in the browser, so is there a way to 'hide' the Node.js backend server and allow access only through the frontend?
It should work something like this:
The React app makes a fetch request
Nginx intercepts the request and forwards it to the Node.js server
Node.js handles the request
I think it can be done with an Nginx reverse proxy or Docker networks or somehow...
Yes, there is. What you do is run a docker-compose setup with three Docker containers: one container runs Nginx, the second hosts the create-react-app UI, and the third runs the Node.js API. You then set an Nginx routing rule so that all /api/* routes are reverse-proxied to the Node.js API, and make sure all other GET requests (/*) go to the hosted create-react-app.
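A sketch of that Nginx routing rule (the upstream hosts ui and api are assumed docker-compose service names, and the ports are assumptions):

```nginx
server {
    listen 80;

    # Reverse-proxy all /api/* requests to the Node.js API container
    location /api {
        proxy_pass http://api:5000;
    }

    # Every other request goes to the create-react-app container,
    # so the backend is never exposed directly on the host
    location / {
        proxy_pass http://ui:3000;
    }
}
```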
Here is a similar example on Medium: https://medium.com/@xiaolishen/develop-in-docker-a-node-backend-and-a-react-front-end-talking-to-each-other-5c522156f634

Running two applications on a Linux server and routing between them

I have got two applications:
a Node.js application and an Angular application.
I would like to host them both on the same Linux server (Linode).
I also have a DNS record, for example: forexample.com.
I would like that when I navigate to api.forexample.com, the Linux server routes the request to the Angular application, so that I see the Angular pages.
The Node.js application is an API application; I would like other people to make all their HTTP requests to api.forexample.com/api.
So the question is: how do I set up this routing inside the Linux server?
Generally speaking, to run multiple applications on a server, you first need to add an A record to your DNS configuration for api.forexample.com.
Then you can use Nginx to handle the two applications. The way it works is that each application runs locally on its own port, and Nginx matches the URL you provide and maps it to the appropriate application (see the sketch below). Check out this tutorial: Configure Nginx as a web server
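A hypothetical Nginx config for the routing described in the question (the local ports, 4200 for Angular and 3000 for the Node API, are assumptions):

```nginx
server {
    listen 80;
    server_name api.forexample.com;

    # Requests to api.forexample.com/api go to the Node.js API
    location /api {
        proxy_pass http://127.0.0.1:3000;
    }

    # Everything else serves the Angular application
    location / {
        proxy_pass http://127.0.0.1:4200;
    }
}
```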
In your situation, you could alternatively serve the Angular app from the Node application.
Check this too: How to serve an angular2 app in a node.js server
