Calling different Node.js APIs from a React front end

Greetings,
I have a project with a Node.js (TypeScript) backend and a React frontend served with Express.
On the backend I have three Docker containers (1-Postgres, 2-ServiceA, 3-ServiceB), each assigned to a different port.
The three containers/services are brought up with docker-compose build and docker-compose up, and after that I run npm run start:dev in each of the ServiceA and ServiceB folders.
My problem is how to call the ServiceA API from the React/Express frontend on one browser page, and then call the ServiceB API on another browser page.
Help Please.

If I understand your question correctly, I suggest you use an API gateway such as KrakenD (https://www.krakend.io/).
An API gateway aggregates your multiple backends (which currently run on different ports) under one single port, basically like a proxy, and that single port is the one your Express/React frontend talks to.
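If you would rather stay in Node, another option is to put a small Express proxy in front of the two services yourself. This is only a sketch, assuming ServiceA listens on port 4001 and ServiceB on 4002 (adjust to your docker-compose port mappings), and it uses the http-proxy-middleware package:

// gateway.js - a single entry point in front of ServiceA and ServiceB
const express = require('express');
const { createProxyMiddleware } = require('http-proxy-middleware');

const app = express();

// Requests under /service-a are forwarded to ServiceA (assumed port 4001)
app.use('/service-a', createProxyMiddleware({
  target: 'http://localhost:4001',
  changeOrigin: true,
  pathRewrite: { '^/service-a': '' }, // strip the prefix before forwarding
}));

// Requests under /service-b are forwarded to ServiceB (assumed port 4002)
app.use('/service-b', createProxyMiddleware({
  target: 'http://localhost:4002',
  changeOrigin: true,
  pathRewrite: { '^/service-b': '' },
}));

app.listen(8080, () => console.log('Gateway listening on http://localhost:8080'));

The React app then only needs one base URL (http://localhost:8080) and picks the service via the path, for example fetch('http://localhost:8080/service-a/users') on one page and fetch('http://localhost:8080/service-b/orders') on another.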

Related

Deploying back-end and front-end on AWS

I have a full-stack application with Node and Express for the back end (with Postgres already set up in AWS RDS) and Angular for the front end.
Now, when running locally in development, the back end listens on port 3000 and connects a Pool client to the Postgres DB in AWS RDS. Separately, the front end listens on port 4200.
When I run the server and the Angular app on these two different ports and open my browser, everything works fine.
Now my questions are about how to deploy this same structure to AWS all together.
Should I deploy the back end and front end on AWS listening on these two different ports, or should they listen on the same one behind a proxy server like Nginx, as I have been reading?
In the latter case, how?
If you have a simple setup and you always deploy the frontend and backend at the same time, one option is to just let your Express backend also serve your static Angular files.
This avoids unnecessary complexity and extra components.
That way, both frontend and backend are exposed on the same port. For example, all your backend APIs listen under the /api route, and everything else is served statically, with a fallback to index.html for client-side routing.
By having frontend and backend on the same origin (host:port), you also avoid any CORS issues.
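A minimal sketch of that layout, assuming the Angular build output lives in dist/my-app (adjust the path and routes to your project):

const express = require('express');
const path = require('path');

const app = express();

// All backend APIs live under /api
app.get('/api/health', (req, res) => res.json({ ok: true }));

// Everything else is served from the Angular build output (assumed path)
const staticDir = path.join(__dirname, 'dist/my-app');
app.use(express.static(staticDir));

// Fallback to index.html so Angular's client-side routing survives a refresh
app.use((req, res) => res.sendFile(path.join(staticDir, 'index.html')));

app.listen(process.env.PORT || 3000);

On AWS you then deploy this single Node process and expose one port; a reverse proxy like Nginx in front becomes optional rather than required.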

HTTP requests between two Docker services work fine but I don't know why [duplicate]

I have a ReactJS project with its own Dockerfile, exposing port 3000:3000.
I also have a PHP project with its own Dockerfile, exposing port 80:80. The PHP app also has containers for MySQL, Redis and Nginx
For the PHP app, I have a docker-compose file that creates a network (my-net) for PHP, Nginx, MySQL and Redis to communicate on. However, I now want the ReactJS (which is in a separate project) to be able to communicate with the PHP app.
I added a docker-compose file to the React project, attached it to the my-net network from the PHP project, and declared the network as external so that Compose doesn't try to create it.
This seems to work: from the ReactJS container, I can ping app (the name of my backend service) and it works properly. However, from the ReactJS code, if I use something like axios to try and hit the backend API, it can't resolve app or http://app or any variation. It can, however, access the backend if I substitute the underlying IP address into axios.
So there seems to be some issue with hostname resolution, presumably on the axios / JavaScript end. Is there something I'm missing, or a reason this isn't working?
When the JavaScript runs in a browser (outside of Docker), you cannot use app, because that hostname is only resolvable inside the Docker network (via Docker's embedded DNS server).
To access your PHP server from outside the Docker network, use localhost and the exposed port (80) instead.
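In practice that means the axios base URL has to depend on where the code runs. A small sketch, with a hypothetical env variable and URLs you would adjust to your own port mappings:

import axios from 'axios';

// In the browser the Docker service name "app" is not resolvable,
// so point axios at the port published by docker-compose instead.
const api = axios.create({
  baseURL: process.env.REACT_APP_API_URL || 'http://localhost:80',
});

// Example call against the PHP backend
api.get('/api/users').then((res) => console.log(res.data));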

How to Dockerise a front-end application to communicate with an already dockerised API application

Currently, I am developing an application that uses a fully dockerised Node API service (served locally by Nginx in Docker) and a simple React front end which is served locally using PHP Valet.
Running docker-compose up builds the dockerised application, hosted by Nginx on port 80, which proxy-passes requests to the Node API on port 9999.
The problem I am having is getting the React front end to communicate with the dockerised Node API.
Ideally I would like to either move the dockerised Node API service to be served by PHP Valet, or move the front-end web app to also be served by the Docker Nginx service.
How can I fully dockerise the front-end and back-end services together so that they can talk to each other (ideally both hosted on port 80), or alternatively, how can I use Valet or MAMP together with the containerised Node API?
With or without Docker, you will run into the same-origin policy for frontend requests to the backend (XMLHttpRequest and some other request types), since two URLs only have the same origin if the protocol, host and port are all the same.
If the two apps are containers, the hosts differ by design, since they are two distinct containers. And if you keep the frontend app on the host machine, that is also the case.
So in any case, you should enable CORS on the backend side (Node.js), at least for requests coming from the frontend host. Nothing is needed on the frontend side, since CORS is handled by the browser: it sends preflight requests to the target host to obtain the resource-sharing agreement.
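A minimal sketch of enabling that on the Node side, using the cors middleware and assuming the frontend is served from http://localhost:3000 (adjust the origin to whatever host Valet gives you):

const express = require('express');
const cors = require('cors');

const app = express();

// Allow cross-origin requests from the frontend's origin (assumed URL)
app.use(cors({ origin: 'http://localhost:3000' }));

// Example API route behind CORS
app.get('/api/ping', (req, res) => res.json({ pong: true }));

app.listen(9999);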
As for making the frontend a Docker container: if the frontend is designed to work with the Node backend, you should probably also turn it into a Docker image/container and use docker-compose to manage them both; it will make them simpler to manage and maintain.
Build the front end to static HTML and serve it from the backend's static directory.
The React front end is loaded onto a user's machine and executes there, in their browser. It communicates with the server over HTTP(S).
How are you serving the React app now? A React app is just plain text files that are served over HTTP.

How to join a React app and an Express app?

I have a React app with these directories:
-node_modules
-public
-src
When I run it (npm start), it starts.
On the other hand, I have some Node.js database config files and a server.js that I don't know where to put.
Moreover, I want to know how I can start both apps together and, more generally, how to merge these two apps.
I'm new to both of these BTW. Thanks.
The concept you are probably trying to understand is that you have two applications. The first one is called the backend (server.js). The second one is the frontend (the React app). Usually you will run them separately (check this tutorial). Let's suppose:
The backend will start on port 5000 and serve an API.
The frontend will serve pages (HTML + JavaScript) and might run on port 3000.
So you need to open two terminals (or prompts on Windows) and start two processes:
Terminal 1 - Backend
node server.js
Terminal 2 - Frontend
yarn dev
In this case, you can make HTTP requests from the frontend directly to your API (backend). For example: http://localhost:5000/api/something
If you hit http://localhost:3000 you should see your web page, loaded from the index.html file, along with the whole React application.
The frontend is just the user interface running in the client's browser, so it has to make requests to the backend to actually save and load data (which is where the database lives).
It's also possible to serve the frontend files from your backend, but it seems the concept you need right now is the separation of frontend and backend.
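To make the backend half concrete, here is a minimal sketch of what server.js could look like (the /api/something route and port 5000 are just the assumptions used above):

// server.js - minimal Express backend
const express = require('express');

const app = express();

// Example API route; the React app calls this at http://localhost:5000/api/something
app.get('/api/something', (req, res) => {
  res.json({ message: 'Hello from the backend' });
});

app.listen(5000, () => console.log('API listening on http://localhost:5000'));

From the React side you would then call it with fetch('http://localhost:5000/api/something') or axios, enabling CORS on the backend or, with create-react-app, setting the proxy field in package.json during development.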

How to implement create-react-app with a Node.js backend on Heroku?

I am looking to serve out some data from some json and csv files to my front end, which is built on create-react-app.
My idea is to have them both listen on different ports (3000 for React, 3001 for the Express backend) and then make API calls to 3001 to get whatever data I need through AJAX.
How can this work in production?
For instance, with Heroku, how would I go about deploying this, since it's listening on two different ports?
What unforeseen issues will this have, or is there a better method?
To add some more info:
All my backend does is parse some CSV and JSON files and serve the result as formatted, edited JSON.
If you use Heroku, the recommended approach is to use two separate dynos: one for serving the static files and React, and the other for the API server.
Alternatively, you can run both in a single dyno with PM2, using its fork mode.
In both cases, since the two servers don't share the same port, you will have some problems with sessions and with making API requests. I think this can be solved by using token-based authorization like JWT, or a separate session store like Redis.
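If you go the PM2 route, a minimal sketch of an ecosystem file could look like this (the file names, app names and the idea of a separate static-server.js serving the create-react-app build folder are assumptions, not a fixed recipe):

// ecosystem.config.js - run the API and a static file server in one dyno
module.exports = {
  apps: [
    {
      name: 'api',            // Express backend that parses the CSV/JSON files (port 3001)
      script: './server.js',  // assumed entry point
      exec_mode: 'fork',
    },
    {
      name: 'web',            // small Express app serving the React build/ folder
      script: './static-server.js',
      exec_mode: 'fork',
    },
  ],
};

You would then start both with pm2-runtime ecosystem.config.js (for example from your Procfile), keeping in mind that Heroku only routes external traffic to the single $PORT it assigns to the dyno.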
