Can you use the React server to forward your requests to the backend? - node.js

I am working on a tool where the frontend is built with React (create-react-app, served with serve) and the backend with Express + Node, and the two servers run in different Docker containers on a remote machine. The problem is that the sysadmin says he will only make the frontend accessible from the outside world; the backend container will only be reachable from the frontend container. I make XMLHttpRequest calls to /API/example, and since the backend has no public URL, can I use the frontend server to forward my requests to the backend?
To make myself clear, this is what I want to do: the frontend runs at https://frontend.com, I make an API/example call from the browser to https://frontend.com/API/example, and the server forwards this call to the backend and replies with its response.

You should use some sort of web server instead of the serve npm package. The problem in your case is that you want to proxy all requests to the backend, and serve is only a static file server. The easiest approach is a small custom Node.js script that serves the static assets (e.g. with Express) and proxies all API requests to the backend (e.g. with express-http-proxy).
Sample code would look like this:
const express = require('express');
const proxy = require('express-http-proxy');

const app = express();
app.use('/api', proxy('http://backend.url')); // proxy all incoming requests on the /api route to the backend base URL
app.use(express.static('path/to/your/static/assets')); // the files previously served with serve
app.listen(3000, () => console.log('Frontend listening on port: 3000'));
Another possible way is to use nginx, Apache or whatever web server you'd like and do essentially the same thing.
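For the nginx route, a minimal server block doing the same thing could look like the sketch below. The backend hostname `backend.url` and the static root are placeholders, to be replaced with your actual container hostname and build path:

```nginx
server {
    listen 80;

    # Proxy all /API calls to the internal backend container
    location /API/ {
        proxy_pass http://backend.url;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }

    # Serve the create-react-app build for everything else
    location / {
        root /usr/share/nginx/html;
        try_files $uri /index.html; # fallback for client-side routing
    }
}
```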

Related

How to retrieve Nodejs/Express web server port number from a loaded-in-browser Javascript file

My situation is the following:
I have a Node/Express server.js file which is run by Node.js. This backend is for my website, which I deploy to Heroku. The web server listens on const PORT = process.env.PORT || 8000 and serves two endpoints:
'/', or root (this responds by sending file index.html)
'/transcription'
When a request is made to /transcription, an Axios post request is made to a third-party API to get a temporary token. This token is passed into the response from /transcription as res.json(data). No problems so far.
The trickiness enters here. I have a JavaScript file, asr.js, which is loaded when I visit the page asr.html (different from index.html). In asr.js, I fetch http://localhost:8000/transcription. This works locally; I get my temporary token and continue with my transcription. However, when I push to an environment like Heroku, where I don't know what the port number will be, I cannot successfully make that fetch request to either http://localhost:8000/transcription or http://localhost:${PORT}/transcription, because process is not defined in my asr.js file; it is not executed by Node.
So, the question I have is how can I determine what my Node/Express server port is when I deploy my app to Heroku such that I can use it in asr.js when I visit asr.html?
Or maybe that's not the right question to be asking? Community, please help!
Here is a snapshot of my folder hierarchy, for more context. Let me know if more information is required or if my initial explanation is not enough.
To be clear, I use middleware to statically serve the public folder.
Unless I'm missing something here, why would you need the port in production?
Node is very rarely exposed directly as the web server; typically NGINX proxies requests to the backend, so the port only matters to NGINX when it forwards those requests to Node. Otherwise your app would not work on a plain domain without a port in the URL.
Long story short: your frontend JS should not need the port number, and can just use the domain on port 80 / 443.
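In practice that means asr.js can fetch with a relative URL and let the browser resolve it against the page's own origin, whatever port Heroku assigned. A quick sketch of how the resolution works (the Heroku hostname below is a made-up placeholder):

```javascript
// In asr.js you can simply write: fetch('/transcription')
// The browser resolves that relative path against the page's origin,
// exactly as the URL constructor does:
const page = 'https://my-app.herokuapp.com/asr.html'; // hypothetical deploy URL
const endpoint = new URL('/transcription', page).href;
console.log(endpoint); // https://my-app.herokuapp.com/transcription
```

No port appears anywhere in the frontend code, so the same file works locally and on Heroku.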

Deploying back-end and front-end on AWS

I have a full-stack application with Node and Express for the back-end (with Postgres in an AWS RDS instance already created) and Angular for the front-end.
When running locally in development, the back-end listens on port 3000 and connects a Pool client to the Postgres DB in AWS RDS. Separately, my front-end listens on port 4200.
When running the server and the Angular app on these two different ports and opening my browser, everything works fine.
Now my questions are about how to deploy this same structure to AWS all together.
Should I deploy the back-end and front-end on AWS listening on these two different ports, or should they listen on the same one behind a proxy server like Nginx, as I have been reading?
In the latter case, how?
If you have a simple setup, and you always deploy frontend and backend at the same time, one option would be: just let your Express backend also serve your static Angular files.
This avoids any unnecessary complexity and components.
So you have both frontend and backend exposed on the same port. Maybe all your backend APIs listen under the /api route; everything else is served statically, with a fallback to index.html for client-side routing.
By having frontend and backend on the same origin (host:port), you also avoid any CORS issues.

How to Dockerise a front-end application to communicate with an already-dockerised API application

Currently, I am developing an application that uses a fully dockerised Node API service (served locally by Docker Nginx) and a simple React front-end which is served locally using PHP Valet.
Running docker-compose up builds the dockerised application, successfully hosted by Nginx on port 80, which proxy-passes requests to the Node API on port 9999.
The problem I am having is getting the React front-end to communicate with the dockerised Node API.
Ideally I would like to either move the dockerised Node API service to be served by PHP Valet, or move the front-end web app to also be served by the Docker Nginx service.
How can I fully dockerise the front-end and back-end services together so that they can talk to each other (ideally both hosted on port 80), or alternatively, how can I use Valet or MAMP together with the containerised Node API?
With or without Docker, you will run into the same-origin policy for frontend requests to the backend (XMLHttpRequest and some other requests), since two URLs have the same origin only if the protocol, port and host are the same.
If the two apps were containers, the hosts would differ by design, since they would be two distinct containers. And if you kept the frontend app on the host, that would also be the case.
So in any case, you should enable CORS on the backend side (Node.js), at least for requests from the frontend host. On the frontend side nothing is needed, since CORS is handled by the browser: it sends pre-flight requests to the target host to obtain a resource-sharing agreement.
As for making the frontend a Docker container: if the frontend is designed to work with the Node backend, you should probably also make it a Docker image/container and use docker-compose to manage the two; it will make them simpler to manage/maintain.
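A docker-compose sketch of that layout could look like the fragment below. The service names, build paths and ports are assumptions; the key point is that on the compose network the frontend reaches the API by its service name (http://api:9999), and only the frontend is published to the host:

```yaml
version: "3"
services:
  api:
    build: ./api          # the existing dockerised Node API
    expose:
      - "9999"            # reachable as http://api:9999 from other services only
  frontend:
    build: ./frontend     # e.g. an nginx image serving the React build
    ports:
      - "80:80"           # the only service published to the host
    depends_on:
      - api
```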
Build the front-end to static HTML and serve it from the backend from the static directory.
The React front-end is loaded onto a user's machine and executes there, in their browser. It communicates with the server over HTTP(S).
How are you serving the React app now? A React app is just plain text files served over HTTP.

Using an auth cookie when the local server and SPA run on different ports

I have a nest.js web server listening on localhost:3000 and an Angular frontend served at localhost:4200 (with the dev server). These ports are the defaults. My authentication flow consists of sending an access token in a cookie to the frontend, which doesn't get sent back on subsequent calls because of the different-domain issues caused by the different ports. Can I overcome this somehow? I understand that if I don't run npm serve for the Angular application, only npm run build, then no development server is started and I can serve the static files with nest.js, which would solve the domain issue for the cookie, but that way I would lose watch mode and hot reloading for Angular. Any suggestions?
Your nest.js web server needs to set the Access-Control-Allow-Origin header so that the browser running the frontend doesn't complain about communicating with that host.

Is there a way to run a React app in nginx (Docker, nginx knowledge required)

I have a React app and a Node.js API server. The React app makes fetch requests to the Node.js server. The React app and the Node.js server are deployed in their own containers.
The problem is that I can access the Node.js server directly in the browser, so is there a way to 'hide' the Node.js backend server and allow access only through the frontend?
It should work something like this
The React app makes a fetch request
nginx intercepts the request and forwards it to the Node.js server
the Node.js server handles the request
I think it can be done with an nginx reverse proxy, or Docker networks, or somehow...
Yes, there is. Run docker-compose with three containers: one runs nginx, the second hosts the create-react-app UI, and the third runs the Node.js API. Then set an nginx routing rule so that all /api/* routes are reverse-proxied to the Node.js API, and make sure all other GET requests (/*) go to the hosted create-react-app.
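That nginx routing rule could look roughly like this; the upstream hostnames ui and api are the compose service names and are assumptions, as are their ports:

```nginx
server {
    listen 80;

    # /api/* is reverse-proxied to the Node.js API container
    location /api/ {
        proxy_pass http://api:9999;
    }

    # everything else goes to the container hosting the create-react-app UI
    location / {
        proxy_pass http://ui:3000;
    }
}
```

With only the nginx container publishing a port to the host, the Node.js server is no longer directly reachable from the browser.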
Here is a similar example on Medium: https://medium.com/@xiaolishen/develop-in-docker-a-node-backend-and-a-react-front-end-talking-to-each-other-5c522156f634
