NodeJS http module vs Nginx Server

I have read that Nginx can be set up as a proxy in front of a Node.js application, but I am unsure what additional purpose this serves and what advantages it has compared to just listening with the http module provided by Node.js.

For one, you can serve multiple Node applications on one server, with host-based virtual servers managed by nginx, so that requests to the same port but with different Host: HTTP headers reach different Node applications.
Also, nginx can be set up to serve static assets without hitting your Node app, and to do some caching if you need it.
Those are two things you can achieve by adding nginx to the mix, but you may not need them in your case. Also, you can run a reverse proxy with Node and without nginx if that's what you prefer.
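As a rough sketch of that last point, here is host-based routing done in Node itself with the http-proxy package (hostnames and ports are made up for illustration, and this assumes a recent http-proxy release):

var http = require('http');
var httpProxy = require('http-proxy');

// One proxy instance; the target is picked per request from the Host header.
var proxy = httpProxy.createProxyServer({});
var targets = {
  'app1.example.com': 'http://127.0.0.1:3001',
  'app2.example.com': 'http://127.0.0.1:3002'
};

http.createServer(function (req, res) {
  var target = targets[req.headers.host];
  if (!target) {
    res.writeHead(404);
    return res.end('Unknown host');
  }
  proxy.web(req, res, { target: target });
}).listen(80);

nginx gives you the same kind of dispatching (plus static files and caching) without having to write or maintain this code yourself.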

Related

Why is a web server (i.e Nginx) REQUIRED for FastAPI but not Express API?

A typical request gets processed like this:
request -> Nginx reverse proxy -> AWS EC2 -> Express API/FastAPI -> response
but my biggest confusion is why FastAPI absolutely NEEDS Nginx to work, while an Express API doesn't (despite Node.js and Python both having an http module and hence being able to make web servers). Why do I need Nginx at all for FastAPI? Can't an AWS EC2 instance act as a web server like Nginx?
This post says it is so we can hide the port number in the url, but that being the only reason sounds unreasonable to me.
Why is a web server (i.e Nginx) REQUIRED for FastAPI
Nginx is not required for FastAPI. You can listen on external ports and handle requests without a reverse proxy. Even the FastAPI documentation includes setting up Nginx under the Advanced section (Ref).
This post says it's so we can hide the port number in the url, but this being the only reason seems silly to me.
You can still hide the port number in the url if you run the Express app with sudo and listen on port 80 or 443.
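For illustration, a minimal Express app bound straight to port 80 with no proxy in front (just a sketch: it assumes express is installed, and binding to ports below 1024 on Linux needs root or CAP_NET_BIND_SERVICE):

const express = require('express'); // assumes: npm install express
const app = express();

app.get('/', (req, res) => {
  res.send('Served directly on port 80, no port number in the URL');
});

// Port 80 is privileged, so start with sudo or grant the capability to the node binary.
app.listen(80, () => console.log('Listening on port 80'));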
Can't an AWS EC2 instance act as a web server like Nginx?
Yes it can.
There are benefits to using a proxy server like Nginx. Proxy servers can handle load balancing, caching, and SSL termination, and they can handle a large number of concurrent connections efficiently.
Using a proxy server is a best practice. However, it is not a requirement for FastAPI or Express.

Why use nginx if there is a proxy middleware for nodejs?

I'm really confused by reverse proxies. What I understood is: in a forward proxy the client knows the destination server but the server doesn't know the client; in a reverse proxy the server knows the client but the client doesn't know that the "server" it's visiting is actually proxying to some other server. And to use a reverse proxy you can use NGINX. But if we can use that, why do Express framework middlewares like http-proxy-middleware exist?
And if my understanding of proxies and reverse proxies is wrong, please correct me.
Let's take an abstract example:
You will agree that you must be using port 3000 or something to run NodeJS... Right?
And let's say you also use Angular/React or HTML+CSS for your frontend website, which runs on, say, port 4200 (the default for Angular).
Now what if you want to have only one server and want two different services (the frontend in Angular and the backend in NodeJS) to run on that single server?
Then you need something between your client and the server to distinguish between the requests: whether to forward them to Angular, to NodeJS, or to any other service running on the same server.
That is what a reverse proxy such as NGINX does: you define some rules, and on the basis of those rules the same server can be used to serve various services.
This is the simplest example I can think of off the top of my head.
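To tie this back to the question: middleware such as http-proxy-middleware does the same forwarding job, but from inside a Node process instead of a separate NGINX process. A minimal sketch, assuming the http-proxy-middleware v2-style API and the example ports above:

const express = require('express');
const { createProxyMiddleware } = require('http-proxy-middleware');

const app = express();

// API calls go to the Node backend on port 3000...
app.use('/api', createProxyMiddleware({ target: 'http://localhost:3000', changeOrigin: true }));

// ...everything else goes to the Angular dev server on port 4200.
app.use('/', createProxyMiddleware({ target: 'http://localhost:4200', changeOrigin: true }));

app.listen(80);

The trade-off is mostly operational: NGINX sits in front as a separate, battle-tested process, while the middleware lives inside your own Node app and is handy in development setups.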

Do I need a different server to run node.js

Sorry if this is the wrong question for this forum, but I am simply stuck and need some advice. I have a shared hosting service and a cloud-based hosting server with Node.js installed. I want to host my website as normal, but I also want to add real-time chat and location tracking using Node.js. I am confused by what I am reading in several places, because Node.js is itself a server but is not designed to host websites? So I have to run two different servers, one for the website and one to run Node.js? When I set up the cloud server with a Node.js script running, I can no longer access the webpages.
What's the best way for me to achieve this, as I am just going round in circles? Also, is there a way I can set up a server on my PC and run and test both of these together beforehand, so I can see what is needed and get it working? It would stop me ordering servers I don't need.
Many thanks for any help or advice.
Node can serve webpages using a framework like Express, but it can cause conflicts if run on the same port as another webserver program (Apache, etc.). One solution could be to serve your webpages through your webserver on port 80 (or 443 for HTTPS) and run your Node server on a different port in order to send information back and forth.
There are a number of ways you can achieve this but here is one popular approach.
You can use NGINX as your front facing web server and proxy the requests to your backend Node service.
In NGINX, for example, you will configure your upstream service as follows:
upstream lucyservice {
  server 127.0.0.1:8000;
  keepalive 64;
}
The 8000 you see above is just an example, you may be running your Node service on a different port.
Further in your config (in the server config section) you will proxy the requests to your service as follows:
location / {
  proxy_pass http://lucyservice;
}
Your Node service can be running under a process manager like forever or pm2. You can have multiple Node services running in a cluster, depending on how many processors your machine has, etc.
So to recap: your front-facing web server will handle all traffic on port 80 (HTTP) and/or 443 (HTTPS) and will proxy the requests to your Node service running on whatever port(s) you define. All of this can happen on one single server, or on multiple servers if you need/desire.
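For completeness, here is a minimal sketch of the kind of Node service the upstream block above could point at (the port and response body are just placeholders):

var http = require('http');

// Listen only on localhost; nginx on 80/443 proxies external traffic here.
http.createServer(function (req, res) {
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end('Hello from the Node service behind nginx\n');
}).listen(8000, '127.0.0.1');

You would typically keep this process alive with forever or pm2, e.g. pm2 start app.js.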

How to point different subdomains to different applications on the same server? (use node.js as a proxy?)

I'm setting up a node.js server but I would also like to have Apache running on there at the same time. Node is going to be the main website, and there will be subdomains that point to Apache.
The only way I can think of to do this is to have the different applications listen on different ports and then have a proxy application that listens on port 80 and "redirects" to the right port according to the subdomain used. I'm not sure if this is the right way to do it, or how to do it if it is.
Research has shown me that it could be possible to use Apache as this proxy, though I would prefer it if I didn't have to. If I could somehow use node.js to do it, that would be fantastic (my preferred solution). If that is impractical/impossible, then of course I am open to other ideas.
I would really appreciate some guidance as to how to do this.
You wanted a solution that can serve both Node.js and Apache at the same time, and you wanted Node.js to do the reverse proxying. However, it is best to use a program designed for reverse proxying (Nginx, HAProxy) for that job. Using Node.js as a reverse proxy server will be inefficient.
Nginx is something I recommend. It is simple and highly efficient. You can have the Nginx server at the very front, taking in all the requests.
Here is how to set up Nginx as a reverse proxy for Node:
http://www.nginxtips.com/how-to-setup-nginx-as-proxy-for-nodejs/
And here is how to set up Nginx as a reverse proxy for Apache:
http://www.howtoforge.com/how-to-set-up-nginx-as-a-reverse-proxy-for-apache2-on-ubuntu-12.04
Simply combining the settings from the two will enable you to serve Apache and Node at the same time.
Have a look through this thread
While it discusses some issues using http-proxy with WebSockets on <= 0.8.x, if you aren't doing that, you should be fine.
You can create a very basic proxy listener like so:
var http = require('http'),
    httpProxy = require('http-proxy');
httpProxy.createServer(8888, 'localhost').listen(80);
And create a back-end server like so:
var http = require('http');
http.createServer(function (req, res) { res.end('Hello from the back-end'); }).listen(8888);
But of course, more complex implementations can be accomplished by reading the http-proxy documentation.
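For instance, here is a sketch of a slightly fuller setup with a current http-proxy release, routing the bare domain to Node and subdomains to Apache, and passing WebSocket upgrades through as well (the hostnames and ports are made up):

var http = require('http');
var httpProxy = require('http-proxy');

var proxy = httpProxy.createProxyServer({});

// Bare domain (and www) goes to Node on 8888; any other subdomain goes to Apache on 8080.
function targetFor(host) {
  return /^(www\.)?example\.com$/.test(host || '')
    ? 'http://127.0.0.1:8888'
    : 'http://127.0.0.1:8080';
}

var server = http.createServer(function (req, res) {
  proxy.web(req, res, { target: targetFor(req.headers.host) });
});

// WebSocket connections start as HTTP upgrade requests, so proxy those too.
server.on('upgrade', function (req, socket, head) {
  proxy.ws(req, socket, head, { target: targetFor(req.headers.host) });
});

server.listen(80);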

Hosting PHP and Node.js apps on the same server with multiple domains

I have a Linode VPS, currently running lighttpd to serve up my PHP websites and listening on port 80.
I'm also running Node.js, which listens on port 81, and uses websockets and HTTP to interact with the client.
There are a couple of different domains that I would like to point to this server. Ideally, I would like the domains which host the PHP sites to all talk to the same lighttpd server, and the sites which use Node.js to somehow be routed to the port Node.js is listening on, unbeknownst to the client (i.e. no 30x redirect).
example-php1.com:80 -> linodebox:80 lighttpd /var/www/example1
example-php2.com:80 -> linodebox:80 lighttpd /var/www/example2
example-node.com:80 -> linodebox:81 node.js
Is there a way to do this, either by setting DNS entries or by tweaking iptables? Does lighttpd need to be a proxy for Node.js? The websockets feature needs to work without any fallbacks, and visiting a non-Node domain, e.g. example-php1.com:81, should not expose the Node application.
I feel the perfect solution wouldn't require changes to existing application code nor require proxying between software web servers, but I could be wrong.
What's up Tom!?
I recommend HA-Proxy; it's one of the highest-performance proxies out there and should accomplish what you're trying to do.
I'm doing something similar with nginx acting as a proxy, it's easy but not the fastest.
HA-Proxy's website is here http://haproxy.1wt.eu
If you wanted a 'pure' solution, you could probably get the answer from looking at HA-Proxy's source code. You can't really do it with iptables: something has to read the HTTP Host header to determine which site the request is for and route it locally.
I had basically the same problem and I ended up using node-http-proxy (also available in npm as http-proxy).
You just need a simple config file:
{
  router: {
    'example-php1.com': 'linodebox:80',
    'example-php2.com': 'linodebox:80',
    'example-node.com': 'linodebox:81'
  }
}
Then just run node-http-proxy --config options.json and you're set. If you want to run lighttpd and node on the same machine, you'll have to start lighttpd on a different port (I use 81 for php and 3000 for node - adjusting the config is easy). I also use forever to manage my node instances.
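As a sketch, the adjusted router from that setup (lighttpd moved to port 81, Node on 3000, hostnames still the example ones) might look like:

{
  router: {
    'example-php1.com': '127.0.0.1:81',
    'example-php2.com': '127.0.0.1:81',
    'example-node.com': '127.0.0.1:3000'
  }
}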
Ya'll are gonna hate me...
I ended up going with a second IP address, then followed the Linode tutorial to set up multiple static IPs. Then I configured lighttpd to bind to one IP address and Node.js to bind to the other.
This isn't a great solution as it doesn't scale.
Update: lighttpd 1.4.46 (released back in 2017) added multiple ways to accept WebSocket connections:
lighttpd mod_wstunnel
lighttpd mod_proxy
lighttpd mod_cgi
