Hosting PHP and Node.js apps on the same server with multiple domains - node.js

I have a Linode VPS, currently running lighttpd to serve up my PHP websites and listening on port 80.
I'm also running Node.js, which listens on port 81, and uses websockets and HTTP to interact with the client.
There are a couple of different domains that I would like to point at this server. Ideally, the domains that host the PHP sites would all talk to the same lighttpd server, and the sites that use Node.js would be routed to the port Node.js is listening on without the client knowing (i.e. no 30x redirect).
example-php1.com:80 -> linodebox:80 lighttpd /var/www/example1
example-php2.com:80 -> linodebox:80 lighttpd /var/www/example2
example-node.com:80 -> linodebox:81 node.js
Is there a way to do this, either by setting DNS entries or tweaking iptables? Does lighttpd need to be a proxy for node.js? The websockets feature needs to work without any fallbacks, and visiting a non-node domain, e.g. example-php1.com:81, should not expose the node application.
I feel the perfect solution wouldn't require changes to existing application code nor require proxying between software web servers, but I could be wrong.

What's up Tom!?
I recommend HA-Proxy; it's one of the highest-performance proxies out there and should accomplish what you're trying to do.
I'm doing something similar with nginx acting as a proxy; it's easy but not the fastest.
HA-Proxy's website is here: http://haproxy.1wt.eu
If you wanted a 'pure' solution, you could probably get the answer by looking at HA-Proxy's source code. You can't really do it with iptables: something has to read the HTTP Host header to determine which site a request is for and route it locally.
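For reference, host-based routing in HA-Proxy looks roughly like this (a sketch only; it assumes lighttpd has been moved to port 8080 so the proxy can own port 80, and the backend names are made up):
frontend http-in
    bind *:80
    mode http
    acl is_node hdr(host) -i example-node.com
    use_backend node_app if is_node
    default_backend lighttpd_php

backend lighttpd_php
    mode http
    server php 127.0.0.1:8080

backend node_app
    mode http
    server node 127.0.0.1:81
Websockets pass through the same frontend, since HA-Proxy simply tunnels the connection once it has been upgraded.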

I had basically the same problem and I ended up using node-http-proxy (also available in npm as http-proxy).
You just need a simple config file:
{
  "router": {
    "example-php1.com": "linodebox:80",
    "example-php2.com": "linodebox:80",
    "example-node.com": "linodebox:81"
  }
}
Then just run node-http-proxy --config options.json and you're set. If you want to run lighttpd and node on the same machine, you'll have to start lighttpd on a different port (I use 81 for php and 3000 for node - adjusting the config is easy). I also use forever to manage my node instances.
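The lighttpd side of that is a one-line change (a sketch, assuming the stock config location):
# /etc/lighttpd/lighttpd.conf - move lighttpd off port 80 so the proxy can bind it
server.port = 81
After that, only the proxy needs to own port 80.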

Y'all are gonna hate me...
I ended up going with a second IP address, then followed the Linode tutorial to set up multiple static IPs. Then I configured lighttpd to bind to one IP address and Node.js to bind to the other.
This isn't a great solution as it doesn't scale.
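For anyone who tries the same thing, the relevant pieces look roughly like this (the addresses are placeholders for the two static IPs):
# /etc/lighttpd/lighttpd.conf - pin lighttpd to the first address
server.bind = "203.0.113.10"
server.port = 80
and on the Node side, something like server.listen(80, "203.0.113.11") binds the app to the second address only.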

Update: lighttpd 1.4.46 (released back in 2017) added multiple ways to accept WebSocket connections:
lighttpd mod_wstunnel
lighttpd mod_proxy
lighttpd mod_cgi
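With mod_proxy, for example, per-host routing plus the websocket upgrade looks roughly like this (a sketch pieced together from the lighttpd wiki; double-check the directives against your version):
server.modules += ( "mod_proxy" )

# route one host to the node app and let upgraded connections through
$HTTP["host"] == "example-node.com" {
    proxy.server = ( "" => ( ( "host" => "127.0.0.1", "port" => 81 ) ) )
    proxy.header = ( "upgrade" => "enable" )
}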

Related

Is there a way to "host" an existing web service on port X as a network path of another web service on port 80?

What I'm trying to do is create an access website for my own services that run on my linux server at home.
The services I'm using are accessible through <my_domain>:<respective_port_num>.
For example, there's a Plex instance listening on port X, transmission-remote (a torrenting client) listening on port Y, and another custom processing service on port Z.
I've created a simple website using Python Flask, which I can access remotely, that redirects paths to ports (so <my_domain>/plex turns into <my_domain>:X). Is there a way to serve these services on the network paths I've assigned to them, so I don't need to open a port for each service? I want to be able to channel an existing service on :X to <my_domain>/plex without having to modify it; I'm sure it's possible.
I have a bit of a hard time understanding your question.
You certainly can use e.g. nginx as a reverse proxy in front of your web application: listen on any port and forward requests to the upstream application on any other port, e.g. your Flask application.
Let's say, my domain is example.com.
I then can configure e.g. nginx to listen on port 80 (and 443 for SSL), and then proxy all requests to e.g. port 8000, where Flask is running locally.
Yes, this is called using nginx as a reverse proxy. It is well documented on the internet and even in the official docs. Your nginx.conf would have something like:
location /my/flask/app/ {
    # Assuming your flask app is at localhost:8000
    proxy_pass http://localhost:8000;
}
From the user's perspective, they will be connecting to your.nginx.server.com/my/flask/app/. But behind the scenes nginx will actually forward the request to your app and serve its response back to the user.
You can deploy nginx as a Docker container; I recommend doing this, as it keeps nginx's files and configs separate from your own work and makes it easier to fiddle with as you learn. Keep in mind that nginx is only for HTTP, though. You can't use it to proxy things like SSH or arbitrary protocols (not without a lot of hassle, anyway). If the services generate their own URLs, you might also need to configure them to account for the path prefix they are served under.
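Putting that together for the services mentioned in the question, the server block could look something like this sketch (Plex and Transmission default ports are assumed; swap in whatever X, Y and Z actually are):
server {
    listen 80;
    server_name example.com;

    # the Flask app, e.g. behind gunicorn on port 8000
    location / {
        proxy_pass http://127.0.0.1:8000;
    }

    # Plex normally listens on 32400; the trailing slash strips the /plex/ prefix
    location /plex/ {
        proxy_pass http://127.0.0.1:32400/;
        proxy_set_header Host $host;
    }

    # Transmission's web UI defaults to 9091 and already lives under /transmission/
    location /transmission/ {
        proxy_pass http://127.0.0.1:9091;
    }
}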
BTW, usually Flask is not served directly to the internet; instead nginx talks to something like Gunicorn, which handles various network-related concerns: https://vsupalov.com/what-is-gunicorn/

Do I need a different server to run node.js

Sorry if this is the wrong question for this forum, but I am simply stuck and need some advice. I have a shared hosting service and a cloud-based hosting server with Node.js installed. I want to host my website as normal, but I also want to add real-time chat and location tracking using Node.js. I am confused by what I am reading in several places, because Node.js is itself a server but not designed to host websites? So I have to run two different servers, one for the website and one to run Node.js? When I set up the cloud server with a Node.js script running, I can no longer access the webpages.
What's the best way for me to achieve this? I am just going round in circles. Also, is there a way I can set up a server on my PC and run and test both of these together beforehand, so I can see what is needed and get it working? It would stop me ordering servers I don't need.
Many thanks for any help or advice.
Node can serve webpages using a framework like Express, but can cause conflicts if run on the same port as another webserver program (Apache, etc). One solution could be to serve your webpages through your webserver on port 80 (or 443 for HTTPS) and run your node server on a different port in order to send information back and forth.
There are a number of ways you can achieve this but here is one popular approach.
You can use NGINX as your front facing web server and proxy the requests to your backend Node service.
In NGINX, for example, you will configure your upstream service as follows:
upstream lucyservice {
    server 127.0.0.1:8000;
    keepalive 64;
}
The 8000 you see above is just an example, you may be running your Node service on a different port.
Further in your config (in the server config section) you will proxy the requests to your service as follows:
location / {
    proxy_pass http://lucyservice;
}
Your Node service can run under a process manager like forever or pm2. You can have multiple Node services running in a cluster, depending on how many processors your machine has, etc.
So to recap: your front-facing web server handles all traffic on port 80 (HTTP) and/or 443 (HTTPS) and proxies the requests to your Node service running on whatever port(s) you define. All of this can happen on one single server, or multiple if you need / desire.
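If the real-time side uses websockets, the proxied location also needs the HTTP/1.1 upgrade headers. A minimal sketch, reusing the lucyservice upstream above (the /socket.io/ path is only an example):
location /socket.io/ {
    proxy_pass http://lucyservice;
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";
    proxy_set_header Host $host;
}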

Running NodeJS/Express projects on production server

1. With different apps, different website domains, etc., NodeJS cannot go to production listening on *some port other than 80*, right? If I am wrong, how do I deal with NodeJS apps serving multiple websites on the same machine? (There is no virtual host in a NodeJS/Express server, is there?)
2. So for going to production, the only alternative is to use some proxy forwarding requests to the NodeJS/Express server's IP:port, isn't it? If yes, and if the proxy and NodeJS are on different servers, what should Express start and listen on? (Say, server.listen(port, '0.0.0.0') or server.listen(port, '::')?)
3. Any other alternatives for going to production with NodeJS/Express projects?
You can use 80 but with sudo. However, it's not recommended.
You're right you need a proxy (nginx, haproxy, etc..) to sit in front of your Node.js app in order to use port 80.
I think you can omit the host from server.listen so it will accept connections on ::.
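For example, a minimal sketch of an Express app meant to sit behind such a proxy (the port and route are arbitrary):
const express = require('express');
const app = express();

app.get('/', (req, res) => {
  res.send('hello from behind the proxy');
});

// no host argument: listen on all addresses (:: and 0.0.0.0);
// pass '127.0.0.1' instead if the proxy runs on the same machine
app.listen(3000, () => {
  console.log('app listening on port 3000');
});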
NGINX is the best option for doing what you expect; see the NGINX documentation on the official website.

Nginx + SSL + Rails + Juggernaut (Node.js) + Engineyard

I have two different applications on the same server. One of them is running on port 80 (mydomain.com), the other on port 443 (sub.mydomain.com) with a wildcard certificate.
The first application is only for informational purposes and doesn't need websockets support.
The second application should have secure websockets support (wss protocol).
I tried to set up the Juggernaut gem (for websockets) for my Rails app with an nginx server on the Engineyard cloud, but I have one problem. The Engineyard cloud provides only two open ports: 80 and 443. I know that nginx does not fully support HTTP 1.1 reverse proxying, so I can't use nginx proxying to redirect websocket requests to a specific local port (in my case, 8080).
I tried using HAProxy, and it worked for me when I used only insecure websockets, but I need to support secure websockets. As I understand it, in this case I should use something like stunnel to terminate my HTTPS requests and then hand them to HAProxy, but when I tested it I saw that the server ran several times slower, and I still couldn't get the secure socket connection working :(
Maybe I'm doing something wrong? Maybe someone can tell me how to set up nginx for multiple applications (one of which needs to work via HTTPS) and secure websockets using only two ports (80 and 443).
P.S. I also tried node-http-proxy; in this case I was able to set up a proxy for the different nginx applications, but I couldn't get websockets working (only the "handshake" went through nginx, never "switching protocols").
I did some research on the various reverse proxies and websockets not too long ago. The bottom line is that websockets is new, and the reverse proxy support for it is very poor right now.
The recommendation I saw and I agree with is that you should run your websockets on a different stack than the rest of your items. That usually means putting it on a separate domain or subdomain.
You still have to deal with the complexities of getting the reverse proxies working, but it will be less complicated if you don't have to worry about breaking the other stuff.
Also, I agree that maybe you'll get better answers at serverfault or superuser.

Hosting node.js for a specific domain only on a VPS

I have a VPS where I have hosted a few sites. All based on LAMP stack, so it was no big deal. They provide WHM/cpanel for managing different sites. I decided to try node.js, bought a separate domain for it, and I need some clue how to point that domain to the node.js application.
So here are the questions:
1) What is the best way to host a node.js application on a specific domain without hampering the other sites? How do I configure the domain? Yes, I'd like to use the default HTTP port (80) for node.
2) As Apache is already listening on port 80, is it a good idea to use Apache mod_proxy for this purpose? I mean, if I want to use websockets, will Apache still use separate threads to maintain the connections to node?
PS. I have already seen this question, but the answers don't seem to be convincing.
Edit:
I forgot to mention, I have an unused dedicated IP for that VPS which I can use for node.js.
Follow these steps:
1. Go to "WHM >> Service Configuration >> Apache Configuration >> Reserved IPs Editor" and reserve the IP that you want to use for node.js. This releases the IP from Apache.
2. Create a DNS A record for the domain, e.g. example.com A YOUR_IP_ADDRESS.
3. Tell the node.js server to listen on that IP using server.listen(80, "YOUR_IP_ADDRESS");
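A bare-bones node server bound that way would look like this (a sketch; binding to port 80 requires root or an equivalent capability):
const http = require('http');

// answer only on the dedicated IP so Apache keeps the other address
http.createServer((req, res) => {
  res.end('node answering on its own dedicated IP');
}).listen(80, 'YOUR_IP_ADDRESS');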
If Apache is already listening to port 80, then the only thing you can do is proxy to your node instance. And yes, apache will create a new thread for each connection.
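If you do go the mod_proxy route instead, the vhost is short. A sketch, assuming mod_proxy and mod_proxy_http are enabled and the node app listens on port 3000 (websockets additionally need mod_proxy_wstunnel):
<VirtualHost *:80>
    ServerName example-node.com
    ProxyPreserveHost On
    ProxyPass / http://127.0.0.1:3000/
    ProxyPassReverse / http://127.0.0.1:3000/
</VirtualHost>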
As others have mentioned, there's not a whole lot you can do here. Apache is currently driving your server and node.js won't like riding shotgun.
I'd recommend checking out things like nodester, no.de, heroku, and so on.
