NodeJS Express - Two Node.js instances on the same port (vhost)

I'm trying to run two instances of Node.js on the same port and server, from different server.js files (different directories, configs, etc.). My server provider told me that a vhost is running for a different domain, hence the question: how do I handle this in a Node.js Express app? I've tried to use vhost from https://github.com/expressjs/vhost like this:
const express = require('express');
const vhost = require('vhost');

const app = express();
app.use(vhost('example1.org', app));

// Start up the Node server
app.listen(4100, () => {
  console.log(`Node server listening on 4100`);
});
And for the second application like this:
const express = require('express');
const vhost = require('vhost');

const app = express();
app.use(vhost('example2.org', app));

// Start up the Node server
app.listen(4100, () => {
  console.log(`Node server listening on 4100`);
});
But when I try to run the second instance I get EADDRINUSE :::4100, so vhost doesn't seem to work here.
Do you know how to fix it?

You can only have one process listen to one port, not just in Node.js, but generally (with exceptions that don't apply here).
You can achieve what you need in one of two ways:
Combine the node apps
You could combine the apps into one application, listen once, and then forward requests for each host to separate bits of code. If you still want code separation, those separate bits of code could be npm modules that are written and maintained in isolation; a minimal sketch follows below.
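A minimal sketch of that approach, reusing the question's hosts and port (the sub-apps and their routes are only illustrative):
const express = require('express');
const vhost = require('vhost');

// Each site is its own Express app; in practice these could be
// separate npm modules that you require() here.
const site1 = express();
site1.get('/', (req, res) => res.send('Hello from example1.org'));

const site2 = express();
site2.get('/', (req, res) => res.send('Hello from example2.org'));

// One parent app listens once and dispatches by Host header.
const app = express();
app.use(vhost('example1.org', site1));
app.use(vhost('example2.org', site2));

app.listen(4100, () => {
  console.log('Node server listening on 4100');
});
The difference from the question's snippets is that vhost here wraps two separate sub-apps inside one listening process, rather than each app wrapping itself.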
Use a webserver to proxy the requests
You could run the two Node processes on free ports, say 5000 and 5001, and use a webserver in front to forward requests to them automatically based on the host. I'd recommend Nginx for this, as its proxying capabilities are both relatively easy to set up and powerful. It's also fairly good at not using too many system resources. Apache and others can also be used for this, but my personal preference would be Nginx.
Conclusion
My recommendation would be to install a webserver and forward requests on the exposed port to the separately running Node processes. I'd actually recommend running Node behind a proxy as the default for a project, and only exposing it directly in exceptional circumstances: you get a lot of configuration options, security, and scalability benefits when your app sits behind a well-hardened server setup.

Related

How to run a gRPC Server and an Express Server on the Same PORT in cloud run

Are there any known techniques or hacks for running both an Express server and a gRPC server on the same port? I am aware Cloud Run exposes just a single port for a service instance.
So, I'm literally just looking for hacks as it stands now.
Like below:
const express = require('express');
const grpc = require('@grpc/grpc-js'); // assumed imports for this snippet

const app = express();
const gRPCServer = new grpc.Server();
const port = process.env.PORT;

app.listen(port, () => {});
gRPCServer.bindAsync(`0.0.0.0:${port}`, grpc.ServerCredentials.createInsecure(), () => {
  gRPCServer.start();
});
I wish to expose some routes to my users via Express and only use gRPC for internal microservice communication.
I saw https://github.com/grpc-ecosystem/grpc-gateway, but there is very little documentation on how to use it with Node.js, plus I DON'T want to generate client libraries. I prefer dynamic code generation.

Host a Node.js bot (express and botkit)

I just made a bot in Node.js for the Cisco Webex Teams application. My bot uses "express" and "botkit". Express requires listening on port 3000, and Botkit listens on port 8080.
I tried heroku.com, but it does not accept two predefined ports and does not save files written dynamically (fs.write).
var PUBLIC_URL = "http://a796e3b7.ngrok.io";
var port = '3000';
var ACCESS_TOKEN = 'xxx';
var SECRET = "xxx";

var express = require('express');
var multer = require('multer'); // provides upload.any() used below
var upload = multer();
var app = express();
var Botkit = require('botkit');

var controller = Botkit.webexbot({
  log: true,
  public_address: PUBLIC_URL,
  access_token: ACCESS_TOKEN,
  secret: SECRET,
  webhook_name: process.env.WEBHOOK_NAME || 'Email2Webex',
});
// bot instance was missing from the snippet but is needed by createWebhookEndpoints
var bot = controller.spawn({});

controller.setupWebserver(8080, function(err, webserver) {
  controller.createWebhookEndpoints(webserver, bot, function() {
    console.log("Webhooks set up!");
  });
});

app.post('/mailgun', upload.any(), function(req, res, next) {
  res.end('ok');
});

app.listen(port);
Currently I use ngrok to host the bot locally on my computer, and I want to be able to host it on a server so I do not have to worry about it. How can I do that?
You can't set the port on Heroku apps. Heroku sets the port you're supposed to use through the PORT environment variable, and you should use it via process.env.PORT. Generally speaking, deployed applications should not run on development ports like 8080 - if it's an HTTP server, it must listen on port 80, for example.
In order to have two apps listening at the same time, I suggest you refactor your code and combine both your bot and your app into a single Express server that listens on the port defined by Heroku's PORT environment variable, as sketched below.
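A rough sketch of that combined setup, assuming Botkit's createWebhookEndpoints can be pointed at an existing Express app (the /mailgun route comes from the question; everything else is illustrative):
var express = require('express');
var Botkit = require('botkit');

var app = express();
var port = process.env.PORT || 3000; // Heroku injects PORT

var controller = Botkit.webexbot({
  public_address: process.env.PUBLIC_URL,
  access_token: process.env.ACCESS_TOKEN,
  secret: process.env.SECRET,
});
var bot = controller.spawn({});

// Register the Botkit webhook routes on the same Express app,
// instead of letting setupWebserver open a second port.
controller.createWebhookEndpoints(app, bot, function() {
  console.log('Webhooks set up!');
});

// Your own routes live on the same app.
app.post('/mailgun', function(req, res) {
  res.end('ok');
});

// One server, one port.
app.listen(port);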
Concerning access to the file system, it is borderline possible to use it, but there are tight security restrictions, so code that runs on your machine is likely to break on the server. Generally speaking, it's a bad idea to access the file system directly on Heroku, except for read-only actions on deployed files. That is in part because the file system is ephemeral, so don't assume your written files will always be there. Most issues related to the caveats of using the file system can be resolved by using database or file storage features provided by Heroku, though.

Formatting node apps for AWS deployment

I'm trying to deploy a Node.js app on AWS EC2 via Elastic Beanstalk. My problem is I can't figure out how to move from my localhost testing environment to the AWS setup. Right now, my app works on port 8081 using the following code:
var server = app.listen(8081, function () {
  var host = server.address().address
  var port = server.address().port
})
How would I change this server variable to work on an actual domain?
Assuming your intent is to provide a public-facing web application, your code will work as is, albeit with a few caveats:
Currently your server will listen on port 8081. Once deployed to AWS, users would have to browse to www.somedomain.com:8081 to reach your application (assuming the host instance allows traffic on that port; see below).
If your intent is to have users reach your application at www.somedomain.com - without specifying a port - you'll want the server to listen on port 80 instead.
var server = app.listen(80, function () { ... });
In either case you'll need to ensure that the security group rules for the EC2 host instance allow incoming TCP traffic on the listening port. Likewise, if your EC2 host instance is behind a load balancer you'll need to allow incoming traffic on the appropriate ports there as well.
For something a little fancier, you can try deploying your application to Elastic Beanstalk using Docker and exposing port 8081 in the dockerfile. This way users would still reach it at www.somedomain.com (via http port 80) and you could continue to develop and test locally using port 8081.
One final note: you didn't provide much information about what your application is or how you intend to use it, so I'm making quite a few assumptions based only on the information provided.
This code works great for me with node on Elastic Beanstalk, and allows me to seamlessly switch between localhost and remote development without changing any code:
var port = process.env.PORT || 8081;
var server = app.listen(port, function () {
  // server is started!!!
});

How to run multiple StrongLoop LoopBack apps on the same server?

I'm currently running two StrongLoop LoopBack apps (Node.js apps) on a single server on different ports. Both apps were created using slc lb project and slc lb model from the command line.
Is it possible to run these apps on a single port with different paths and/or subdomains? If so, how do I do that on a Linux machine?
Example:
http://api.server.com:3000/app1/ for first app.
http://api.server.com:3000/app2/ for second app.
thanks.
Since LoopBack applications are regular Express applications, you can mount them on a path of the master app.
var loopback = require('loopback');

var app1 = require('path/to/app1');
var app2 = require('path/to/app2');

var root = loopback(); // or express();
root.use('/app1', app1);
root.use('/app2', app2);
root.listen(3000);
The obvious drawback is high runtime coupling between app1 and app2 - whenever you are upgrading either of them, you have to restart the whole server (i.e. both of them). Also a fatal failure in one app brings down the whole server.
The solution presented by #fiskeben is more robust, since each app is isolated.
On the other hand, my solution is probably easier to manage (you have only one Node process instead of nginx + per-app Node processes) and also allows you to configure middleware shared by both apps.
var root = loopback();
root.use(express.logger());
// etc.
root.use('/app1', app1);
root.use('/app2', app2);
root.listen(3000);
You would need some sort of proxy in front of your server, for example nginx. nginx will listen to a port (say, 80) and redirect incoming requests to other servers on the machine based on some rules you define (hostname, path, headers, etc).
I'm no expert on nginx but I would configure it something like this:
server {
  listen 80;
  server_name api.server.com;

  location /app1 {
    proxy_pass http://localhost:3000;
  }

  location /app2 {
    proxy_pass http://localhost:3001;
  }
}
nginx also supports passing query strings, paths and everything else, but I'll leave it up to you to put the pieces together :)
Look at the proxy server documentation for nginx.

Running multiple sites on node.js

I'm planning to build three sites using Node.js. I have some common templates among the sites. Should I run all three sites on a single Node.js instance?
I'm aware of the 'vhost' middleware that allows you to run multiple domains on a single HTTP server. Is there any better option to do this?
I've also got some static HTML templates, and I'm not sure how to deal with these in Node.js.
Finally, I would like to know the hosting options for this kind of setup.
I myself just had to do this exact same thing. What you want to do is use some sort of reverse proxy.
The one I use is here: https://github.com/nodejitsu/node-http-proxy
Simply install the proxy package: npm install http-proxy
What I do is have the proxy running on the server on port 80. I set the DNS up on each domain to point to this server.
Each application runs on the same server (I'm using screen sessions).
For example:
MySiteApplication1 - 3001
MySiteApplication2 - 3002
MySiteApplication3 - 3003
Then your proxy server file would look like this:
var httpProxy = require('http-proxy');

var server = httpProxy.createServer({
  router: {
    'mysite1.com': 'localhost:3001',
    'mysite2.com': 'localhost:3002',
    'mysite3.com': 'localhost:3003'
  }
});

server.listen(80);
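Note that the router table above comes from older node-http-proxy releases; in current versions of http-proxy that option is gone and you route on the Host header yourself. A rough sketch of the same idea under that assumption, keeping the example hostnames and ports:
var http = require('http');
var httpProxy = require('http-proxy');

var targets = {
  'mysite1.com': 'http://localhost:3001',
  'mysite2.com': 'http://localhost:3002',
  'mysite3.com': 'http://localhost:3003'
};

var proxy = httpProxy.createProxyServer({});

http.createServer(function(req, res) {
  // Strip any port from the Host header, then look up the target app.
  var host = (req.headers.host || '').split(':')[0];
  var target = targets[host];
  if (!target) {
    res.writeHead(404);
    return res.end('Unknown host');
  }
  proxy.web(req, res, { target: target });
}).listen(80);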
