Load balance request traffic with multiple Node servers using NGINX - node.js

According to this answer:
You should run multiple Node servers on one box, one per core, and split request traffic between them. This provides excellent CPU affinity and will scale throughput nearly linearly with core count.
Got it, so let's say our box has 2 cores for simplicity.
I need a complete example of a Hello World app being load balanced between two Node servers using NGINX.
This should include any NGINX configuration as well.

app.js
var http = require('http');
// the port is passed as the first command-line argument, e.g. node app.js 8001
var port = parseInt(process.argv[2], 10);
http.createServer(function (req, res) {
  res.writeHead(200, {'Content-Type': 'text/plain'});
  res.end('Hello World\n');
}).listen(port);
console.log('Server running at http://localhost:' + port + '/');
nginx configuration
upstream app {
    server localhost:8001;
    server localhost:8002;
}
server {
    listen 80;
    location / {
        proxy_pass http://app;
    }
}
Launch your app
node app.js 8001
node app.js 8002
HttpUpstreamModule documentation
Additional reading material
cluster module - still experimental, but you don't need nginx (a short sketch follows this list)
forever module - in case your app crashes
nginx and websockets - how to proxy websockets in the new nginx version
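For comparison, here is a minimal sketch of the cluster approach mentioned above (no NGINX in front; the filename and port are placeholders, and the request handler is the same Hello World as in app.js):
var cluster = require('cluster');
var http = require('http');
var os = require('os');
if (cluster.isMaster) {
  // fork one worker per CPU core; the master only manages workers
  for (var i = 0; i < os.cpus().length; i++) {
    cluster.fork();
  }
} else {
  // all workers share the same listening socket on port 8000
  http.createServer(function (req, res) {
    res.writeHead(200, {'Content-Type': 'text/plain'});
    res.end('Hello World\n');
  }).listen(8000);
}
Run it with a single node cluster-app.js and the cluster module distributes incoming connections across the workers instead of an NGINX upstream.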

Related

I can't run my node application in my on prem VM

I'm trying to run a Node.js application on an on-prem VM which is running RHEL 7. I'm not experienced with RHEL 7 and can't seem to find any details on running Node.js apps on it.
My app is super simple. It's a server which returns a message...
var http = require('http');
http.createServer(function (req, res) {
  res.writeHead(200, {'Content-Type': 'text/plain'});
  res.end('Welcome Node.js');
}).listen(8000);
console.log('Server running on http://ip:8000');
I run the application and try accessing the IP with the port in the browser from within the correct network. Am I missing something? My assumption was that RHEL 7 was the server and would show me the application once I opened it in the browser.
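A quick sketch (same handler, with an explicit host and a listen callback) that confirms what the server is actually bound to:
var http = require('http');
var server = http.createServer(function (req, res) {
  res.writeHead(200, {'Content-Type': 'text/plain'});
  res.end('Welcome Node.js');
});
// bind to all interfaces and log the bound address once the server is listening
server.listen(8000, '0.0.0.0', function () {
  var addr = server.address();
  console.log('Server listening on ' + addr.address + ':' + addr.port);
});
If this logs 0.0.0.0:8000 and curl localhost:8000 works on the VM itself but the browser still can't reach it from outside, the RHEL 7 firewall (firewalld by default) is the usual suspect and port 8000/tcp has to be opened, much like the firewall answer to the VPS question further down.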

How to access meteor app from outside without passing through NginX?

I am hosting a Meteor app on an Ubuntu Linux machine. The app is listening on port 3000. If I use a web server like NginX and forward the HTTP requests from port 80 to 3000, I can browse to the server from the outside and reach the app. However, when I try to access the app directly at port 3000, i.e. browse to http://myhost:3000, it just tries to connect and nothing happens.
I have made sure that all firewalls are down and that the app is listening on all interfaces, i.e. 0.0.0.0:3000, so that is not the issue.
To verify that the port was actually reachable, I created a simple Node.js web server:
var http = require('http');
http.createServer(function (req, res) {
  res.writeHead(200, {'Content-Type': 'text/html'});
  res.write('Hello World!');
  res.end();
}).listen(3000);
Now, browsing to the server, I can see "Hello World!". So this obviously works, and the reason I cannot reach Meteor has nothing to do with firewalls or unopened ports.
Thus it seems there is something strange about accessing a Meteor app directly at port 3000. But why? I use the following environment variables:
export MONGO_URL=mongodb://localhost:27017/meteor
export HOST=myhost
export PORT=3000
export ROOT_URL=http://myhost
So what am I missing? Ports are open and I can see that the node process is listening on port 3000 when I run netstat -tulpan.
I was using the force-ssl Meteor package, which redirects back to the ROOT_URL without the port number. So the solution is to remove that package to make the app work on a custom port.
I discussed this on the Meteor forums, where I got the solution:
https://forums.meteor.com/t/can-not-access-meteor-app-without-passing-through-nginx-server/40739/11
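If you would rather keep force-ssl, a sketch of the usual alternative is to let NginX terminate SSL and tell the package the original request was HTTPS via the X-Forwarded-Proto header (certificate paths below are placeholders):
server {
    listen 443 ssl;
    server_name myhost;
    ssl_certificate /etc/nginx/ssl/myhost.crt;
    ssl_certificate_key /etc/nginx/ssl/myhost.key;
    location / {
        proxy_pass http://127.0.0.1:3000;
        proxy_http_version 1.1;
        # tell Meteor the original connection was HTTPS so force-ssl does not redirect
        proxy_set_header X-Forwarded-Proto $scheme;
        # keep the websocket (DDP) connection working
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
    }
}
That still routes everything through NginX, though, so for direct access on port 3000 removing the package remains the simple fix.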

Comparing "hello world" on EC2 Micro between node and nginx, why node so slow?

I've hit the need to put a load balancer in front of some Node.js servers, so I decided to compare Nginx and Node.js.
To do this test I simply spun up an EC2 micro instance (running Ubuntu 14.04) and installed Nginx and Node.js.
My nginx.conf file is:
user www-data;
worker_processes 1;
pid /run/nginx.pid;
http {
    server {
        listen 443 ssl;
        return 200 "hello world!";
        ssl_certificate /home/bitnami/server.crt;
        ssl_certificate_key /home/bitnami/server.key;
    }
}
events {
    worker_connections 768;
}
And my Node.js code is:
var https = require('https');
var fs = require('fs');
var serverOptions = {
  key: fs.readFileSync("/home/bitnami/server.key"),
  cert: fs.readFileSync("/home/bitnami/server.crt")
};
https.createServer(serverOptions, function (req, res) {
  res.writeHead(200, {'Content-Type': 'text/plain'});
  res.end('Hello World\n');
}).listen(443);
console.log('Server running');
I then used another EC2 server (m3.medium due to memory needs) to run wrk with the command
./wrk -t12 -c400 -d30s https://ec2-54-190-184-119.us-west-2.compute.amazonaws.com
The end result was that Nginx could consistently pump through 5x more reqs/second than Node.js (12,748 vs 2,458), while using less memory (both were CPU limited).
My question is, since I'm not exactly great/experienced/knowledgeable in server admin or setup, am I doing something to severely mess up Node.js? And can I confidently draw the conclusion that in this situation, Nginx is absolutely the better choice?

Running a node.js server on my VPS on port 3000 and the connection times out

On HostGator I have a VPS running CentOS. I installed Node.js and screen.
I added the following code to a file named index.js:
var http = require('http');
http.createServer(function (req, res) {
  res.writeHead(200, {'Content-Type': 'text/html'});
  res.end('<html><body><h1>Hello World</h1></body></html>');
}).listen(3000);
console.log('Server running on port 3000.');
On 'screen:1' I run the following command:
node index.js
It gives me the console output stating 'Server running on port 3000.'
I switch to 'screen:0' and run the following command:
curl localhost:3000
and I get the following response:
<html><body><h1>Hello World</h1></body></html>
Yet, when I try my server's IP address (substitute the xxx for a real IP address, because I'm not disclosing my VPS IP address):
xxx.xxx.xxx.xxx:3000
The page never comes up and eventually it times out.
I've tried various ports (8080, 7000) to no avail.
Do I need to place the iOS project in a different directory?
Currently I have it in /root/Projects/NodeTutorial2/index.js.
What do I need to do to get a hello world response from my VPS?
If you're getting a response on the box, but not from other boxes, it's almost certainly a firewall issue. Turning off iptables or allowing traffic in on the port in question is one option, but an easier / more appropriate option is to simply have your app use port 80 (for HTTP) or 443 (for HTTPS). You can either do that by listening on that port in the app directly, or by having a web server that acts as a reverse proxy for you (e.g. NGINX or Apache).
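A minimal sketch of that reverse-proxy option (assuming NGINX is installed on the VPS and index.js keeps listening on 3000):
server {
    listen 80;
    server_name xxx.xxx.xxx.xxx;
    location / {
        # forward traffic arriving on port 80 to the Node app on port 3000
        proxy_pass http://127.0.0.1:3000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
Since port 80 is normally already open, no iptables changes are needed; the alternative is to open port 3000 in the firewall.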

Multiple Node.js servers using NGINX proxies

The Goal:
Use multiple live Node.js servers, independent of each other, under different doc roots.
Using NGINX:
server {
    server_name .lolwut1.com;
    root /var/www/html/lolwut1;
    # proxy pass to nodejs
    location / {
        proxy_pass http://127.0.0.1:5001/;
    }
}
server {
    server_name .lolwut2.com;
    root /var/www/html/lolwut2;
    # proxy pass to nodejs
    location / {
        proxy_pass http://127.0.0.1:5002/;
    }
}
/var/www/html/lolwut1/app.js
var http = require('http');
var server = http.createServer(function (request, response) {
  response.writeHead(200, {"Content-Type": "text/plain"});
  response.end("lolwut1\n");
});
server.listen(5001);
/var/www/html/lolwut2/app.js
var http = require('http');
var server = http.createServer(function (request, response) {
  response.writeHead(200, {"Content-Type": "text/plain"});
  response.end("lolwut2\n");
});
server.listen(5002);
So when I run node app.js in /var/www/html/lolwut1/ and hit lolwut1.com, I'm all good.
Questions:
But now what if I want to start the second Node server?
Is this a bad approach?... Am I thinking about this the wrong way?
What are the advantages/disadvantages of using node.js with a connect.vhost directive as a router rather than NGINX?
Use forever to start and stop your node apps.
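For example (assuming forever is installed globally via npm):
npm install -g forever
cd /var/www/html/lolwut1 && forever start app.js
cd /var/www/html/lolwut2 && forever start app.js
forever list
forever stop 0
forever list shows the running apps with their indexes, and forever stop takes such an index (or the script name).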
You're doing it right! This approach has worked well for me for quite a while.
Connect vhost Advantage: You don't have to install and configure nginx. The whole stack is node.js.
Nginx Advantage: Nginx is a mature and stable web server. It's very unlikely to crash or exhibit strange behavior. It can also host your static site, PHP site, etc.
If it were me, unless I needed some particular feature of Nginx, I'd pick Connect vhost or node-http-proxy for the sake of having an all-node.js stack.
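For reference, a rough sketch of what that all-node.js routing layer could look like with node-http-proxy (hostnames and backend ports taken from the NGINX config above; treat it as a starting point, not a drop-in):
var http = require('http');
var httpProxy = require('http-proxy');
var proxy = httpProxy.createProxyServer({});
// map incoming Host headers to the backends defined above
var targets = {
  'lolwut1.com': 'http://127.0.0.1:5001',
  'lolwut2.com': 'http://127.0.0.1:5002'
};
http.createServer(function (req, res) {
  // strip an optional www. prefix and any port from the Host header
  var host = (req.headers.host || '').replace(/^www\./, '').split(':')[0];
  var target = targets[host];
  if (target) {
    proxy.web(req, res, { target: target });
  } else {
    res.writeHead(404, {'Content-Type': 'text/plain'});
    res.end('Unknown host\n');
  }
}).listen(80);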
But now what if I want to start the second node server? Is this a bad approach?...
When you cd to /var/www/html/lolwut2/ and run node app.js, this should start the second server on port 5002, and lolwut2.com should work.
Am I thinking about this the wrong way?
That's a valid way to run multiple Node apps on the same server if you have enough memory and plenty of CPU power. This is also a good way to scale a single Node app on the same machine to take advantage of multiple cores: run several Node processes and use the upstream directive, as in the first answer above (see also https://serverfault.com/questions/179247/can-nginx-round-robin-to-a-server-list-on-different-ports).
