Deploying back-end and front-end on AWS - node.js

I have a full-stack application with Node and Express for the back-end (with Postgres in AWS RDS, already created) and Angular for the front-end.
When running locally in development, I have the back-end listening on port 3000 and connected via a Pool client to the Postgres DB in AWS RDS. Separately, my front-end is listening on port 4200.
When I run the server and the Angular app on these two different ports and open my browser, everything works fine.
Now my questions are about how to deploy this same structure to AWS all together.
Should I deploy the back-end and front-end on AWS listening on these two different ports, or should they listen on the same one behind a proxy server like Nginx, as I have been reading?
In the latter case, how?

If you have a simple setup and you always deploy the frontend and backend at the same time, one option is to just let your Express backend also serve your static Angular files.
This avoids any unnecessary complexity and components.
So you have both frontend and backend exposed on the same port. For example, all your backend APIs listen under the /api route; everything else is served statically, with a fallback to index.html for client-side routing.
By having frontend and backend on the same origin (host:port), you also avoid any CORS issues.

Related

Is it possible to not hardcode the Express server's IP in a React-based frontend app?

I have a backend app written in TypeScript, which has an Express server running on port 5001. I have a front-end app written using React. Both apps are hosted on the same machine.
The React app needs to fetch data from the backend app on port 5001, served by the Express server.
The machine has an IP of 10.250.1.4, for example. So from the host machine, I could use localhost:5001 or 10.250.1.4:5001 in the axios requests, and both will work.
The React development server for the front-end app runs on port 3000 in this case.
When another client machine, at 10.250.1.6 for example, accesses 10.250.1.4:3000 from its browser, the React app is sent to the client. In this case, the axios requests from the client to the backend server will only work if axios points at 10.250.1.4 rather than localhost.
I wish to write the axios calls in the front-end app in a way that doesn't require specifying the IP of the Express server. Both the front-end and backend apps will always be hosted on the same machine.
Is there a way?
Thanks.

How does the "proxy" field in a create-react-app's package.json work?

I have a NodeJS backend running at http://localhost:4050, and I had configured my React application to make API calls there. To deploy on Heroku, I had to change the PORT variable in the backend to process.env.PORT. As a result, when I put the React app's build folder in the backend's server folder and deployed to Heroku, the React application was still looking for localhost:4050 and naturally failed to make calls, because Heroku ran the application on an arbitrary, different port. But apparently adding that very same http://localhost:4050 as "proxy": "http://localhost:4050" in the package.json file made it work. I'm really curious how doing that fixed it.
The proxy field in package.json is used to proxy all requests from the frontend domain to the backend. For example, say you have:
Backend (REST API): localhost:5000/api/user/:id
Frontend (React.JS app): localhost:3000/user
If you call axios.get('/api/user/123'), the browser will send this request to localhost:3000/api/user/123, but the React dev server will then proxy it to localhost:5000/api/user/123.
Please note that this only applies in the development environment. If you want to deploy your React.JS app, there's a better way: https://blog.heroku.com/deploying-react-with-zero-configuration
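For concreteness, the development-time setup described above might look like this in the React app's package.json (the port 5000 here is just the example backend from above):

```json
{
  "name": "my-react-app",
  "proxy": "http://localhost:5000",
  "scripts": {
    "start": "react-scripts start"
  }
}
```

With this in place, a relative call like axios.get('/api/user/123') hits the dev server on port 3000, which forwards it to localhost:5000/api/user/123, so no IP or port is hardcoded in the frontend code.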

How to deploy different MERN apps to digital ocean on a single droplet?

I've always used heroku to deploy my MERN apps. For the mongo db I use MongoDB Atlas, but in my job they want to migrate all the projects to DigitalOcean. I have several questions regarding this:
1. Can I have MongoDB + Node.js backend + React app on a single droplet?
2. Can I deploy two or more apps on a single droplet? (The apps have different domains.)
3. Is there a video tutorial about this? (I've read lots of documentation and got many errors while trying to do it. My eyes hurt 🙃)
For example if I have in Heroku two apps for the same website, one app for the nodejs backend and another one for the react frontend... can I do the same on DigitalOcean?
Thanks in advance!
Yeah, you can deploy multiple services on a single server; they just need to listen on different ports.
For example, let's consider that a MongoDB server is running on port 27017, a Node.js http server is running on port 5000, and a React app is running on port 8000.
Say, your server's IP is 13.13.13.13.
Then you can access your MongoDB server, Node.js http server, and React app at 13.13.13.13:27017, 13.13.13.13:5000, and 13.13.13.13:8000, respectively, from anywhere on the Internet where your IP isn't blocked.
Now, on your server, you set up iptables to redirect incoming connections on port 80 to port 8000. You can then access your React app by visiting 13.13.13.13, with no need to specify the port anymore.
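That iptables redirect could look something like the rule below (run as root; the port numbers match the example, and this is a sketch of one common way to do it, not the only one):

```shell
# Redirect incoming TCP traffic on port 80 to the React app on port 8000
iptables -t nat -A PREROUTING -p tcp --dport 80 -j REDIRECT --to-port 8000
```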
Now, let's say you add DNS records for example.com and api.example.com pointing to your IP. Since A records and CNAME records can't point to a port, both of your domains will take you to your React app. You'll have to explicitly specify the port number along with your domain if you want to access your Node.js backend, like http://example.com:5000 or http://api.example.com:5000.
If you don't want to access your backend using the port number, you can use Nginx as a reverse proxy: set it up to route all traffic for api.example.com to your backend server listening on localhost:5000.
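A minimal Nginx server block for that reverse proxy could look like this (the domain and port are taken from the example above; in practice you would also add TLS):

```nginx
server {
    listen 80;
    server_name api.example.com;

    location / {
        # Forward all traffic for api.example.com to the Node.js backend
        proxy_pass http://localhost:5000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```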

How to Dockerise front-end application to communicate with already dockerised api application

Currently, I am developing an application that makes use of a fully dockerised Node API service (served locally by Docker Nginx) and a simple React front-end which is served locally using PHP Valet.
Running docker-compose up builds the dockerised application, successfully hosted on Nginx on port 80, which proxy-passes requests to the Node API on port 9999.
The problem I am having is getting the React front-end to communicate with the dockerised Node API.
Ideally, I would like to either move the dockerised Node API service to be served by PHP Valet, or move the front-end web app to also be served by the Docker Nginx service.
How can I fully dockerise the front-end and back-end services together so that they can talk to each other (ideally both hosted on port 80)? Alternatively, how can I use Valet or MAMP together with the containerised Node API?
With or without Docker, you will run into the same-origin policy for frontend requests to the backend (XMLHttpRequest and some other requests), since two URLs have the same origin only if the protocol, host, and port are the same.
If the two apps were containers, the hosts would differ by design, since they would be two distinct containers. And if you kept the frontend app on the host, that would also be the case.
So in any case, you should enable CORS on the backend side (NodeJS), at least for requests from the frontend host. On the frontend side, nothing is needed, since CORS is a browser mechanism: the browser sends pre-flight requests to the target host to get a resource-sharing agreement.
As for making the frontend a Docker container: if the frontend is designed to work with the Node backend, you should probably also make it a Docker image/container and use docker-compose to manage them; it will make them simpler to manage and maintain.
Build the front-end to static HTML and serve it from the backend's static directory.
The React front-end is loaded on a user's machine and executes there in their browser. It communicates to the server over HTTP(S).
How are you serving the React app now? A React app is just plain text files that are served over HTTP.

Is there way to run React app in nginx(Docker, nginx knowledge required)

I have a React app and a Node.js API server. The React app makes fetch requests to the Node.js server. The React app and the Node.js server are deployed in their own containers.
The problem is that I can access the Node.js server directly in the browser, so is there a way to 'hide' the Node.js backend server and allow access only through the frontend?
It should work something like this:
1. The React app makes a fetch request.
2. Nginx intercepts the request and forwards it to the Node.js server.
3. The Node.js server handles the request.
I think it can be done with an Nginx reverse proxy, Docker networks, or a combination of the two.
Yes, there is. You run a docker-compose setup with three containers: one runs Nginx, the second hosts the create-react-app UI, and the third runs the Node.js API. You then set an Nginx routing rule so that all /api/* routes are reverse-proxied to the Node.js API, and make sure all other GET requests (/*) go to the hosted create-react-app.
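That three-container layout might be sketched in a docker-compose.yml like this (the service names, build paths, and internal ports are illustrative assumptions):

```yaml
services:
  nginx:
    image: nginx:alpine
    ports:
      - "80:80"
    volumes:
      - ./nginx.conf:/etc/nginx/conf.d/default.conf
    depends_on:
      - ui
      - api
  ui:
    build: ./frontend   # create-react-app, listening on 3000 inside the container
  api:
    build: ./backend    # Node.js API, listening on 5000 inside the container
```

In the mounted nginx.conf, a `location /api/ { proxy_pass http://api:5000; }` block sends API traffic to the backend container, and `location / { proxy_pass http://ui:3000; }` sends everything else to the React container; Docker's internal DNS resolves the service names `api` and `ui`. Since only the nginx container publishes a port, the Node.js API is no longer reachable directly from the browser.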
Here is a similar example on Medium: https://medium.com/@xiaolishen/develop-in-docker-a-node-backend-and-a-react-front-end-talking-to-each-other-5c522156f634
