Use Heroku server URL in nginx.conf.erb for load balancing - node.js

I have 2 servers:
Server 1 handles load balancing with Nginx - https://server1.herokuapp.com/
Server 2 serves the RESTful APIs - https://server2.herokuapp.com/
Here is my nginx.conf.erb configuration on Server 1: https://gist.github.com/ntvinh11586/5b6fde3e804482aa400f3f7faca3d65f
When I call https://server1.herokuapp.com/, instead of getting data back from https://server2.herokuapp.com/, I get a 400 - Bad Request. I don't know whether something in my nginx.conf.erb is wrong or whether I also need to set up Nginx on Server 2.
I tried to research this, but almost all the tutorials I found configure localhost rather than specific hosts like Heroku.
So what should I do to make this work?

You need to configure your app as follows -
# upstream nodebeats {
#     server server2.herokuapp.com;
# }
server {
    listen <%= ENV['PORT'] %>;
    server_name herokuapp.com;
    root "/app/";
    large_client_header_buffers 4 32k;

    location / {
        proxy_redirect off;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
        proxy_set_header Host $http_host;
        proxy_set_header X-NginX-Proxy true;
        proxy_set_header Connection "";
        proxy_http_version 1.1;
        proxy_pass http://localhost:<node-app-port>;
    }
}
My two cents:
Comment out the upstream and work with the single server server1.herokuapp.com first. Get that working with the implementation above, and then you can move on to adding server2.herokuapp.com for load balancing.
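Once that works, a minimal sketch of re-enabling the upstream for server2 might look like the following. This is an assumption rather than a tested configuration: Heroku's router picks the target app by the Host header, so the proxied requests generally need the Host header set to the backend app's own hostname (forwarding $http_host, i.e. server1's hostname, is a common reason the router rejects the request).
upstream nodebeats {
    server server2.herokuapp.com:80;
}
server {
    listen <%= ENV['PORT'] %>;
    server_name server1.herokuapp.com;

    location / {
        proxy_http_version 1.1;
        proxy_set_header Connection "";
        # Heroku routes by Host, so send the backend app's hostname
        proxy_set_header Host server2.herokuapp.com;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_pass http://nodebeats;
    }
}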

Related

Deployment of an e-learning project based on freecodecamp code

I made a fork of the freecodecamp project on GitHub and modified the design to meet my client's requirements. Locally everything works fine, but when I deploy it on a DigitalOcean droplet with Nginx as a proxy, a problem occurs when authenticating with Auth0: the access token is not sent to the client. Basically, the freecodecamp application uses Auth0 to handle all the authentication.
Since everything works fine locally, I suspect my online Nginx configuration might be the problem.
I created two configuration files in Nginx, one for the client and one for the API.
The configuration file for the API has the following content:
server {
    listen 80;
    listen [::]:80;

    root /var/www/html/freeCodeCamp;
    server_name my_domain_name.com;

    location / {
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header Host $host;
        proxy_set_header X-NginX-Proxy true;
        proxy_pass http://localhost:3000/;
        proxy_redirect http://localhost:3000/ http://$server_name/;
    }
}
The configuration file for the client has the following content:
server {
    listen 80;
    listen [::]:80;

    root /var/www/html/freeCodeCamp;
    server_name my_domain_name.com;

    location / {
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header Host $host;
        proxy_set_header X-NginX-Proxy true;
        proxy_pass http://localhost:8000/;
        proxy_redirect http://localhost:8000/ http://$server_name/;
    }
}
I would like to have your opinion on the subject. Thanks in advance.
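For what it's worth, one thing that often matters when an authentication flow runs behind a reverse proxy is whether the app can see the original scheme and host of the request; behind the configs above it only ever sees http://localhost:3000 or http://localhost:8000. A hedged sketch of the extra header sometimes added for that (illustrative, not a confirmed fix for this Auth0 issue):
location / {
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    # Forward the original protocol so the app can build correct
    # external callback/redirect URLs instead of localhost ones
    proxy_set_header X-Forwarded-Proto $scheme;
    proxy_set_header Host $host;
    proxy_pass http://localhost:3000/;
}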

Node.js API with Nginx: Block API access to the public

I'm using Nginx with a Node.js backend. I use Node.js for authenticating users and API calls. Currently I have the following in my Nginx configuration:
location /api {
    proxy_pass http://localhost:5000/api;
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection 'upgrade';
    proxy_set_header Host $host;
    proxy_cache_bypass $http_upgrade;
    proxy_redirect off;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header X-Forwarded-Proto $scheme;
}
However, this allows anyone to access /api directly. Is there a way to configure this so that users can't directly access /api?
Update
I tried adding:
allow 127.0.0.1; deny all; to the Nginx config; however, this also blocks Nginx from getting the resources. In other words, users can no longer log in or get resources from the API.
I also added a middleware to Express, but it always receives 127.0.0.1 (localhost) as req.ip, so I cannot do anything on the Node.js side to prevent this, because all requests are forwarded by Nginx.
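For context, the attempt described above corresponds roughly to the following inside the existing location block (a reconstruction of what the question describes, not a fix: the allow/deny rules are evaluated against the browser's IP address, and it is the browser that calls /api, so legitimate users are denied as well):
location /api {
    # Only loopback clients may reach /api; everyone else gets 403.
    # Since the browser itself requests /api, this also locks out real users.
    allow 127.0.0.1;
    deny all;

    proxy_pass http://localhost:5000/api;
    proxy_http_version 1.1;
    proxy_set_header Host $host;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
}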

How to implement REST API Versioning with nodejs + NGINX?

For the past few days, I have been working on how to implement API versioning with the help of NGINX.
At the application level I am able to implement it, but that requires 2 different controllers, 2 different routes, 2 different models, etc. I don't want to do that.
I want two different projects, v1 and v2. Using NGINX, if my URL contains v1 it should point to the v1 project, and if the URL contains v2 it should point to the v2 project.
I know we can do that using NGINX alias or root, but I don't know how.
In fact, we are talking about how to configure Nginx as a reverse proxy and proxy to different projects depending on the content of the URL.
In your case, you need to:
Configure the Sails projects on different ports. For example:
for API.V1: sails.config.port -> 3010
for API.V2: sails.config.port -> 3020
Add two upstreams to the Nginx configuration (nginx.conf), for example with Nginx and the API projects located on the same server.
Add two locations for the different APIs to the Nginx configuration (inside the server block in nginx.conf).
Nginx configuration might look like this:
upstream api_v1 {
    server 127.0.0.1:3010;
    keepalive 64;
}

upstream api_v2 {
    server 127.0.0.1:3020;
    keepalive 64;
}

server {
    listen 80;
    server_name example.com;

    location /api/v1 {
        proxy_pass http://api_v1;
        proxy_http_version 1.1;
        proxy_set_header Connection "";
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Real-IP $remote_addr;
    }

    location /api/v2 {
        proxy_pass http://api_v2;
        proxy_http_version 1.1;
        proxy_set_header Connection "";
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
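One detail worth noting: because proxy_pass above has no URI part, the full path including the /api/v1 or /api/v2 prefix is forwarded unchanged, so each Sails project has to mount its routes under that prefix. If each project should instead see version-less paths, a hedged variant (illustrative, not part of the original answer) strips the prefix before proxying:
location /api/v1/ {
    # Drop the version prefix so the v1 app sees /users instead of /api/v1/users
    rewrite ^/api/v1/(.*)$ /$1 break;
    proxy_pass http://api_v1;
    proxy_http_version 1.1;
    proxy_set_header Connection "";
    proxy_set_header Host $host;
}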

Nginx, Node, Angular - Subfolder API/URL configuration

I currently have a Node/Angular app that runs as expected when pointed directly at the configured port (8081 for the purposes of explaining my situation). I'm able to POST, GET, PUT, and DELETE as expected.
My goal is to have the Node application running at mydomain.com/subfolder. When Nginx is configured with a location of '/', everything works as expected. Config below:
upstream app_yourdomain {
    server 127.0.0.1:8081;
}

server {
    listen 0.0.0.0:80;
    server_name yourdomain.com yourdomain;
    access_log /var/log/nginx/yourdomain.log;

    location / {
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header Host $http_host;
        proxy_set_header X-NginX-Proxy true;
        proxy_pass http://app_yourdomain/;
        proxy_redirect off;
    }
}
As soon as I change the location to /subfolder, however, my GET, POST, PUT, and DELETE requests return 404 responses. The index.html configured in the Node application is still returned, though. Configuration below:
upstream app_yourdomain {
    server 127.0.0.1:8081;
}

server {
    listen 0.0.0.0:80;
    server_name yourdomain.com yourdomain;
    access_log /var/log/nginx/yourdomain.log;

    location /subfolder {
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header Host $http_host;
        proxy_set_header X-NginX-Proxy true;
        proxy_pass http://app_yourdomain/;
        proxy_redirect off;
    }
}
In my Angular factory, I have my requests structured like return $http.get('/subfolder'); or return $http.post('/subfolder', {data: data});.
And within my Node application, I have the routes defined like app.get('/subfolder', somefunction); or app.post('/subfolder', somefunction);.
Again, when the application runs from the root of the domain, it works fine. When it is configured to live in a subfolder of the domain, however, the requests no longer work.
My end goal is to have multiple Node applications running from sub-folders of a main domain. I've been fighting with this for a while and found several articles on hosting multiple Node apps on a single server, but they seem geared toward separate domains. I'd like (if possible) for these to run as separate apps on the same domain.
Any thoughts/tricks/pointers? Thanks!
Modify your Nginx file to look like this:
upstream node {
    server 127.0.0.1:3000;
}

server {
    listen 0.0.0.0:80;
    server_name yourdomain.com yourdomain;
    access_log /var/log/nginx/yourdomain.log;

    location /node {
        rewrite /node(.*) $1 break;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header Host $http_host;
        proxy_set_header X-NginX-Proxy true;
        proxy_pass http://node;
        proxy_redirect http://node/ /node;
    }
}
I got the above from here: http://skovalyov.blogspot.com/2012/07/deploy-multiple-node-applications-on.html and it works for me
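Since the end goal above is several Node apps under one domain, here is a hedged sketch of how that pattern extends (the upstream names, ports, and path prefixes are illustrative assumptions, not from the linked article). Each app gets its own upstream and prefix location; the trailing slashes on the location and on proxy_pass make Nginx replace the prefix, so every app can keep its routes mounted at '/':
upstream app_one {
    server 127.0.0.1:3000;
}
upstream app_two {
    server 127.0.0.1:3001;
}

server {
    listen 80;
    server_name yourdomain.com;

    location /app1/ {
        proxy_set_header Host $http_host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        # a request for /app1/users is proxied as /users
        proxy_pass http://app_one/;
    }

    location /app2/ {
        proxy_set_header Host $http_host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        # a request for /app2/users is proxied as /users
        proxy_pass http://app_two/;
    }
}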

https response quite slow using NGINX for NodeJs application

I have a web application mapped to multiple domains. One of the domains uses SSL while the other one is plain HTTP.
I tried to use Nginx with Node.js. My HTTPS responses are very, very slow. Please have a look at the conf file and help me get rid of this problem.
upstream myserver {
    server 127.0.0.1:4502;
    server 127.0.0.1:4500;
}

server {
    listen 0.0.0.0:80;
    server_name a.myserver.com;
    access_log /var/log/nginx/nodetest.log;

    location / {
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header Host $http_host;
        proxy_set_header X-NginX-Proxy true;
        proxy_pass http://myserver/;
        proxy_redirect off;
    }
}
server {
    listen 0.0.0.0:443;
    server_name myapps.com;
    access_log off;

    ssl on;
    ssl_certificate /mnt/drives/ssl_certificates/daffodilapps/ssl-bundle.crt;
    ssl_certificate_key /mnt/drives/ssl_certificates/daffodilapps/ryans-key.pem;

    location / {
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header Host $host;
        proxy_set_header X-NginX-Proxy true;
        proxy_pass http://myserver/;
        proxy_redirect off;
    }
}
It seems likely that your node.js application is not a factor, since you access it the same way in the end. Still, you might want to validate that by doing a performance test for http vs. https against a static page. If it turns out that nginx SSL performance is still poor, you might try changing the cipher suites that nginx offers, per the advice in Nginx Performance Tuning for SSL.
Also, you don't list what kind of VM you are using, but you might also try upgrading to a non-shared-core VM if you're currently on a f1/g1. The extra (and dedicated) CPU should help SSL performance, which is also mentioned in the article when they switched from a micro to a regular EC2 VM.
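As a starting point for that kind of tuning, here is a hedged sketch of SSL settings that are commonly adjusted for handshake performance (the protocol versions, cipher string, and cache sizes are illustrative, not taken from the article):
server {
    listen 0.0.0.0:443 ssl;
    server_name myapps.com;

    ssl_certificate /mnt/drives/ssl_certificates/daffodilapps/ssl-bundle.crt;
    ssl_certificate_key /mnt/drives/ssl_certificates/daffodilapps/ryans-key.pem;

    # Reuse SSL sessions so returning clients skip the full handshake
    ssl_session_cache shared:SSL:10m;
    ssl_session_timeout 10m;

    # Restrict to modern protocols and faster cipher suites (illustrative)
    ssl_protocols TLSv1.2 TLSv1.3;
    ssl_ciphers HIGH:!aNULL:!MD5;
    ssl_prefer_server_ciphers on;

    # Keep client connections open to avoid repeated handshakes
    keepalive_timeout 65;

    location / {
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_pass http://myserver/;
    }
}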
