How to run multiple StrongLoop LoopBack apps on the same server? - node.js

I'm currently running two StrongLoop LoopBack apps (Node.js apps) on a single server on different ports. Both apps were created using slc lb project and slc lb model from the command line.
Is it possible to run these apps on a single port with different paths and/or subdomains? If so, how do I do that on a Linux machine?
Example:
http://api.server.com:3000/app1/ for first app.
http://api.server.com:3000/app2/ for second app.
thanks.

Since LoopBack applications are regular Express applications, you can mount them on a path of the master app.
var loopback = require('loopback'); // or: var express = require('express');
var app1 = require('path/to/app1');
var app2 = require('path/to/app2');

var root = loopback(); // or express();
root.use('/app1', app1);
root.use('/app2', app2);
root.listen(3000);
The obvious drawback is tight runtime coupling between app1 and app2: whenever you upgrade either of them, you have to restart the whole server (i.e. both apps), and a fatal failure in one app brings down the other as well.
The solution presented by @fiskeben is more robust, since each app is isolated.
On the other hand, my solution is probably easier to manage (you have only one Node process instead of nginx plus one Node process per app), and it also lets you configure middleware shared by both apps:
var express = require('express'); // needed for the shared middleware below

var root = loopback();
root.use(express.logger()); // Express 3.x; with Express 4+ use a standalone logger such as morgan
// etc.
root.use('/app1', app1);
root.use('/app2', app2);
root.listen(3000);

You would need some sort of proxy in front of your servers, for example nginx. nginx will listen on a port (say, 80) and forward incoming requests to other servers on the machine based on rules you define (hostname, path, headers, etc.).
I'm no expert on nginx but I would configure it something like this:
server {
    listen 80;
    server_name api.server.com;

    location /app1 {
        proxy_pass http://localhost:3000;
    }

    location /app2 {
        proxy_pass http://localhost:3001;
    }
}
nginx also supports passing query strings, paths and everything else, but I'll leave it up to you to put the pieces together :)
Look at the proxy server documentation for nginx.

Related

How to get SSL to play nice with non-ssl protocol data

Short background: if we go back to about 2006, we (i.e. my company) used a Java client app embedded in the browser that connected via port 443 to a C backend running on port 8068 on an in-house server. When the Java app was first developed, port 443 was the only port we knew would not be blocked by the customers using the software (for ease of installation, and because the customers' in-house staff often didn't have the access or knowledge to change their internal firewalls).
Fast-forward to 2016, and I'm hired to help develop a NodeJS/JavaScript version of that Java app. The Java app continues to be used while its replacement is developed, but whoops: we learn that browsers will drop support for embedded Java in the near future. So we switch to Java Web Start, so that customers can continue to download the app and it still connects to the in-house server with its port 443 -> 8068 routing.
2017 rolls around and, don't you know, we can't use the upcoming JS web app with HTTPS/SSL and the Java app at the same time, because they use the same port. "OK, let's use NGINX to solve the problem." But due to in-house politics, customer needs, and turnover of web developer staff, we never get around to truly making that work.
So here we are in 2020, ready to deploy the new web version of the client software, and the whole 443 mess rears its ugly head again.
Essentially I am looking to allow (for the time being) the Java app to continue using 443, but I now need to let the web app use HTTPS too. Back in 2017/2018 we Googled ways to let them cohabit through NGINX, but we never really got it to work properly, or the examples and tutorials were incomplete or confusing. It seemed like we needed to either use streaming along the lines of https://www.nginx.com/blog/running-non-ssl-protocols-over-ssl-port-nginx-1-15-2/ , or look at the incoming request and do an "if (HTTPS) { route to the Node.js server } else { assume it must be the Java app and route to port 8068 }" sort of arrangement inside the NGINX config file.
The links we found back then appear to no longer exist, so if anyone knows of an NGINX configuration that allows an HTTPS website to hand off to a non-SSL application that still needs to use 443, I would greatly appreciate it. Any docs and/or tutorials that point us in the right direction would be helpful too. Thanks in advance!
You can do this using the ssl_preread option. Basically, this option gives access to the variable $ssl_preread_protocol, which contains the protocol negotiated on the SSL port. If no valid protocol was detected, the variable will be empty.
Using this variable, you could use the following configuration for your environment:
stream {
    upstream java {
        server __your_java_server_ip__:8068;
    }

    upstream nodejs {
        server __your_node_js_server_ip__:443;
    }

    map $ssl_preread_protocol $upstream {
        default java;
        "TLSv1.2" nodejs;
    }

    server {
        listen 443;
        proxy_pass $upstream;
        ssl_preread on;
    }
}
In your case, this configuration will pass the connection directly to your Node.js and Java backend servers, so Node.js will need to negotiate the SSL itself (a sketch of that is at the end of this answer). You can hand this work to NGINX using another server context, like:
stream {
    upstream java {
        server __your_java_server_ip__:8068;
    }

    upstream nodejs {
        server 127.0.0.1:444;
    }

    map $ssl_preread_protocol $upstream {
        default java;
        "TLSv1.2" nodejs;
    }

    server {
        listen 443;
        proxy_pass $upstream;
        ssl_preread on;
    }
}

http {
    server {
        listen 444 ssl;

        __your_ssl_cert_configurations_here__

        location / {
            proxy_pass http://__your_nodejs_server_ip__:80;
        }
    }
}
You'll need NGINX at least version 1.15.2 for this configuration to work, built with the ngx_stream_ssl_preread_module module (it is not built by default, so NGINX must be compiled with the --with-stream_ssl_preread_module configuration parameter; nginx -V shows the compile options of your build).
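If you go with the first configuration, where the TLS connection is passed straight through, the Node.js app itself has to terminate SSL on port 443. A minimal sketch of that with Express and Node's built-in https module (the certificate paths are placeholders, not from the original setup):

const fs = require('fs');
const https = require('https');
const express = require('express');

const app = express();
app.get('/', (req, res) => res.send('Hello over HTTPS'));

// Placeholder certificate paths - replace with your real key and certificate.
const options = {
  key: fs.readFileSync('/etc/ssl/private/server.key'),
  cert: fs.readFileSync('/etc/ssl/certs/server.crt')
};

https.createServer(options, app).listen(443);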
Source: https://www.nginx.com/blog/running-non-ssl-protocols-over-ssl-port-nginx-1-15-2/

NodeJS Express - Two NodeJS instances on same port (vhost)

I'm trying to run 2 instances of NodeJS on the same port and server from different server.js files (different dirs, configs, etc.). My server provider told me that a vhost is running for a different domain, hence the question: how do I handle this in a NodeJS Express app? I've tried to use vhost from https://github.com/expressjs/vhost like this:
const app = express();
const vhost = require('vhost');
app.use(vhost('example1.org', app));
// Start up the Node server
app.listen(4100, () => {
console.log(`Node server listening on 4100`);
});
And for second application like that:
const app = express();
const vhost = require('vhost');
app.use(vhost('example2.org', app));
// Start up the Node server
app.listen(4100, () => {
console.log(`Node server listening on 4100`);
});
But when I try to run the second instance I get EADDRINUSE :::4100, so vhost doesn't work here.
Do you know how to fix it?
You can only have one process listening on a given port, not just in Node.js, but generally (with exceptions that don't apply here).
You can achieve what you need in one of two ways:
Combine the Node apps
You could make the apps into one application, listen once, and then dispatch requests for each host to separate bits of code, as sketched below. If you still want code separation, the separate bits of code could be npm modules that are written and maintained in isolation.
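A minimal sketch of this combined approach, using the expressjs/vhost middleware from the question (the domains, port, and handlers are illustrative):

const express = require('express');
const vhost = require('vhost');

// Each site is a self-contained Express app (it could just as well be
// required in from its own module or npm package).
const site1 = express();
site1.get('/', (req, res) => res.send('Hello from example1.org'));

const site2 = express();
site2.get('/', (req, res) => res.send('Hello from example2.org'));

// One parent app owns the port and dispatches on the Host header.
const parent = express();
parent.use(vhost('example1.org', site1));
parent.use(vhost('example2.org', site2));

parent.listen(4100, () => console.log('Node server listening on 4100'));

Only the parent app calls listen, so there is no EADDRINUSE; vhost just hands each request to whichever sub-app matches the Host header.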
Use a web server to proxy the requests
You could run the 2 Node processes on some free ports, say 5000 and 5001, and use a web server to forward requests to them automatically based on host. I'd recommend Nginx for this, as its proxying capabilities are both relatively easy to set up and powerful. It's also fairly good at not using too many system resources. Apache and others can also be used for this, but my personal preference would be Nginx.
Conclusion
My recommendation would be that you install a web server and forward requests on the exposed port to the separately running Node processes. I'd actually recommend running Node behind a proxy by default for a project, and only exposing it directly in exceptional circumstances. You get a lot of configuration options, security, and scalability benefits if your app already sits behind a well-hardened server setup.

Can I run multiple loopback.io apps on same port?

Referring to the following question:
Running multiple Node (Express) apps on same port
Can I run multiple apps (backend, REST API) on the same port if I am using StrongLoop LoopBack to generate my Node apps?
Generally what you will be doing is running multiple instances of your app on different ports and putting some sort of load balancer in front that switches among the instances, exposing them as one port.
Assuming you've started 3 instances on ports 3001, 3002, and 3003, you can do it in nginx like this:
http {
    upstream myloopbackapp {
        server localhost:3001;
        server localhost:3002;
        server localhost:3003;
    }

    server {
        listen 80;

        location / {
            proxy_pass http://myloopbackapp;
        }
    }
}
Further reading: http://nginx.org/en/docs/http/load_balancing.html
There are equally easy ways to do this in Apache and IIS as well.
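On the Node side, a simple way to run three instances of the same app on different ports is to read the port from an environment variable. A minimal sketch, assuming a plain Express/LoopBack-style server.js (the route is illustrative):

var express = require('express');
var app = express();

// Each instance gets its port from the environment, e.g. PORT=3001 node server.js
var port = process.env.PORT || 3001;

app.get('/status', function (req, res) {
  res.json({ pid: process.pid, port: port });
});

app.listen(port, function () {
  console.log('Instance ' + process.pid + ' listening on port ' + port);
});

You would then start the instances with PORT=3001 node server.js, PORT=3002 node server.js, and PORT=3003 node server.js (or via a process manager) and point the nginx upstream above at them.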

Dividing express routes among Node JS Clusters

I have a large set of routes in a Node JS application I'm trying to scale to multiple CPU cores (via NodeJS clusters).
The plan I had in mind was to have different workers handling a different set of express.js routes. For example:
/api/ requests handled by WorkerA
/admin/ handled by WorkerB
/blog/ handled by WorkerC
etc
Simply using a conditional with the worker ID is not sufficient, since requests can still land at the wrong worker. Also, the processes all run on the same port, so I can't just match & proxy_pass on the URL from inside nginx.
At this point, I'm thinking about swapping out the cluster routing (from master to worker) so that it matches on the URL and routes to the correct worker, instead of using the built-in round-robin approach. But this seems a bit hacky, and I'm wondering if anyone else has solved this or has any other ideas.
My solution was to run multiple Express apps listening on different ports, and to set up an Nginx server in front to proxy the requests.
Say you have three Express apps, each handling a specific set of routes and listening on a separate port (8081, 8082, 8083); they should, of course, run in cluster mode (a cluster sketch follows the listing below):
//API app used to handle /api routing
apiApp.listen(8081);
//Admin app used to handle /admin routing
adminApp.listen(8082);
//Blog app used to handle /blog routing
blogApp.listen(8083);
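The listing above leaves out the cluster setup itself; a minimal sketch of running one of these apps (here the API app) across all CPU cores with Node's built-in cluster module could look like this (the route is illustrative):

const cluster = require('cluster');
const os = require('os');

if (cluster.isMaster) {
  // Fork one worker per CPU core; the workers share the listening socket.
  os.cpus().forEach(() => cluster.fork());

  cluster.on('exit', (worker) => {
    console.log(`Worker ${worker.process.pid} died, starting a new one`);
    cluster.fork();
  });
} else {
  const express = require('express');
  const apiApp = express();

  // Illustrative route; in practice this is where the /api routers are mounted.
  apiApp.get('/api/ping', (req, res) => res.json({ pid: process.pid }));

  apiApp.listen(8081);
}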
And configure the Nginx server to proxy the requests:
server {
    # let nginx listen on a public port
    listen 80;

    location /api {
        proxy_pass http://127.0.0.1:8081;
    }

    location /admin {
        proxy_pass http://127.0.0.1:8082;
    }

    location /blog {
        proxy_pass http://127.0.0.1:8083;
    }
}
proxy_pass simply tells nginx to forward requests for /api to the server listening on 8081. See the nginx documentation on proxy_pass for the full details.
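Note that because proxy_pass is used without a URI part, nginx forwards the original request path unchanged, so each app still receives its own prefix. A minimal sketch of the API app accounting for that (the route is illustrative):

const express = require('express');
const apiApp = express();

// nginx does not strip the /api prefix, so the routes are mounted under it here too.
const apiRouter = express.Router();
apiRouter.get('/users', (req, res) => res.json([]));

apiApp.use('/api', apiRouter);
apiApp.listen(8081);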

Running multiple sites on node.js

I'm planning to build three sites using node.js. I have some common templates among the sites. Should I run all three sites on a single node.js instance?
I'm aware of the 'vhost' middleware that allows you to run multiple domains on a single HTTP server. Is there a better option for doing this?
I've also got some static HTML templates and I'm not sure how to deal with these in node.js.
Finally, I would like to know the hosting options for this kind of setup.
I myself just had to do this exact same thing. What you want to do is use some sort of reverse proxy.
The one I use is here: https://github.com/nodejitsu/node-http-proxy
Simply install the proxy package: npm install http-proxy
What I do is have the proxy running on the server on port 80. I set the DNS up on each domain to point to this server.
Each application is running on the same server (I'm using screen sessions).
For example:
MySiteApplication1 - 3001
MySiteApplication2 - 3002
MySiteApplication3 - 3003
Then your proxy server file would look like this:
var httpProxy = require('http-proxy');

var server = httpProxy.createServer({
  router: {
    'mysite1.com': 'localhost:3001',
    'mysite2.com': 'localhost:3002',
    'mysite3.com': 'localhost:3003'
  }
});

server.listen(80);
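Note that this routing-table style of createServer comes from the old 0.x releases of http-proxy; the 1.x rewrite removed it, so there you would route on the Host header yourself. A minimal sketch under that assumption, reusing the hostnames and ports above:

var http = require('http');
var httpProxy = require('http-proxy');

var proxy = httpProxy.createProxyServer({});

var targets = {
  'mysite1.com': 'http://localhost:3001',
  'mysite2.com': 'http://localhost:3002',
  'mysite3.com': 'http://localhost:3003'
};

http.createServer(function (req, res) {
  var host = (req.headers.host || '').split(':')[0];
  var target = targets[host];

  if (target) {
    // Forward the request to the matching backend application.
    proxy.web(req, res, { target: target });
  } else {
    res.statusCode = 502;
    res.end('Unknown host: ' + host);
  }
}).listen(80);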
