How to create a sails.js app that will listen on two different ports simultaneously - node.js

For example, I have a sails.js application that (by default) listens on port 1337. I want to configure it to listen on two different ports at the same time - one for SSL and one for non-SSL traffic. Is this even possible? I have scoured the documentation and cannot find an example that shows me anything other than setting a single port value.
Do I have to create a front end (like Apache or nginx) to do it, or is it possible to stick with a pure node.js solution - perhaps with express?
I should add that I am only using the server for web sockets via socket.io.
A working example would be great, but any tips and pointers would help.

The simplest approach would probably be to run your server twice - once listening on the SSL port and once on the non-SSL port.
Just make sure you share common data like sessions and persistent global variables, maybe using something like redis (sails can automatically base your session on redis, and can even bind a model to the redis server while keeping the rest of your models on your current database).
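As a rough sketch of what "running the server twice" could look like (my own example, not part of the original answer): you can make the port and SSL settings environment-driven in config/local.js, then start the same Sails app twice. The environment variable names and certificate paths are assumptions.
// config/local.js - a minimal sketch, assuming env-driven port/SSL settings
var fs = require('fs');

module.exports = {
  // Each instance gets its own port via the PORT environment variable.
  port: process.env.PORT || 1337,

  // Only the SSL instance sets SSL_KEY / SSL_CERT (hypothetical variable names).
  ssl: process.env.SSL_KEY ? {
    key: fs.readFileSync(process.env.SSL_KEY),
    cert: fs.readFileSync(process.env.SSL_CERT)
  } : undefined
};
You would then launch the app twice - for example once with PORT=1337 for plain traffic and once with PORT=1443 plus the SSL variables - and point both instances at the same redis-backed session store as described above.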

Related

Do I need a different server to run node.js

Sorry if this is the wrong question for this forum, but I am simply stuck and need some advice. I have a shared hosting service and a cloud-based hosting server with node.js installed. I want to host my website as normal, but I also want to add real-time chat and location tracking using node.js. I am confused by what I am reading in several places, because node.js is itself a server but not designed to host websites? So do I have to run 2 different servers - one for the website and one to run node.js? When I set up the cloud one with a node.js script running, I can no longer access the webpages.
What's the best way for me to achieve this? I am just going round in circles. Also, is there a way I can set up a server on my PC and run and test both of these together beforehand, so I can see what is needed and get it working? It would stop me ordering servers I don't need.
Many thanks for any help or advice.
Node can serve webpages using a framework like Express, but it can cause conflicts if run on the same port as another webserver program (Apache, etc). One solution could be to serve your webpages through your webserver on port 80 (or 443 for HTTPS) and run your node server on a different port in order to send information back and forth.
There are a number of ways you can achieve this but here is one popular approach.
You can use NGINX as your front facing web server and proxy the requests to your backend Node service.
In NGINX, for example, you will configure your upstream service as follows:
upstream lucyservice {
server 127.0.0.1:8000;
keepalive 64;
}
The 8000 you see above is just an example; you may be running your Node service on a different port.
Further in your config (in the server config section) you will proxy the requests to your service as follows:
location / {
proxy_pass http://lucyservice;
}
Your Node service can be running in a process manager like forever / pm2 etc. You can have multiple Node services running in a cluster depending on how many processors your machine has, etc.
So to recap - your front facing web server will be handling all traffic on port 80 (HTTP) and/or 443 (HTTPS), and this will proxy the requests to your Node service running on whatever port(s) you define. All of this can happen on one single server, or multiple if you need/desire.
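For completeness, here is a minimal sketch of what the backend Node service behind that proxy might look like (not from the original answer; the port 8000 simply matches the upstream example above, and the Express route is illustrative):
var express = require('express');
var app = express();

app.get('/', function (req, res) {
  res.send('Hello from the Node service behind NGINX');
});

// Bind to localhost only, since NGINX is the one facing the outside world.
app.listen(8000, '127.0.0.1', function () {
  console.log('Node service listening on 127.0.0.1:8000');
});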

NodeJS - securely connect to external redis server

On my main server, I fetch data from an external/separate redis server which is accessed through an API, https://localhost:7000/api/?token=****, which works. However, the token and API are not secure, and since I want to keep the redis server separate, this technique isn't suited to my case.
In my case I want to have 2 independent servers A and B.
A should load data from B without using an API or URL call... Instead it should use a port (e.g. //server:123). This way server B can only be accessed from A.
I want this approach to work for both development and production. AWS has "Server Groups" I believe, but that's production only...
So is there a way to create this kind of connection with nodejs? I also want to know if this is only possible having a running server already, since I don't have one yet.
Note: In case you are wondering, I use redis to store private keys for encryption, so I need a secure, separate server which can be controlled independently
It is not very clear what you're trying to do since accessing data from another server without using an API does not really make sense. Anything you do to access it is some type of API.
If you want to make it so that only server A can access server B, then you have a number of choices to make that secure:
1. Require authentication whenever server B is accessed, and make it so that only server A has those authentication credentials.
2. Assuming server A and server B are in the same server infrastructure, put the server B API on a port that is not available to the outside world, but is only available from within your server infrastructure (this usually involves picking a port that your firewall to the outside is blocking access to).
3. On server B, only accept connections on its API from the specific IP address of server A.
You can even implement more than one of these options at once. For example, it's not uncommon to use 1) and 2) together, as in the sketch below.
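A minimal sketch of what options 1) and 2) might look like together, assuming an Express-based API on server B; the header name, shared-secret environment variable, port and internal IP address are all illustrative assumptions, not part of the answer:
var express = require('express');
var app = express();

// Shared secret provisioned on server A and server B only (assumed env var name).
var SHARED_SECRET = process.env.INTERNAL_API_TOKEN;

// Option 1: require credentials that only server A holds.
app.use(function (req, res, next) {
  if (!SHARED_SECRET || req.get('X-Internal-Token') !== SHARED_SECRET) {
    return res.status(403).end();
  }
  next();
});

app.get('/api/keys/:id', function (req, res) {
  // ...look the key up in redis here...
  res.json({ id: req.params.id });
});

// Option 2: bind to a private interface / firewalled port so the outside
// world never reaches this API directly.
app.listen(7000, '10.0.0.5', function () {
  console.log('internal API listening on 10.0.0.5:7000');
});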
Stunnel is built for exactly that! Basically speaking, it's like a VPN, but for ports rather than whole machines. It's a bit complicated - you will have to deal with certificates and a couple of other things (configuring both servers...) - but once it's done it's a breeze to launch and reuse (just launch a file). Give it a try!
See this link: https://www.digitalocean.com/community/tutorials/how-to-set-up-an-ssl-tunnel-using-stunnel-on-ubuntu
You should also consider adding an iptables rule on the database server to allow access from your server only.
Edit:
Keep in mind that redis was designed to be used in a trusted environment. This means that the security layer will not be redis itself, but third-party software that you'll need to set up.
For dev purposes there is no need to make this bulletproof. And even if you want to, it's quite hard to do, because the security of your app mainly depends on the infrastructure of the company that will host it.
That being said, if you want to secure a redis instance in a localhost environment, an iptables rule allowing only localhost to access port 6379 will be sufficient.
The other thing that could compromise the security of your redis DB is the app itself. Validating EVERYTHING the app accepts is a good start.
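To make the localhost point concrete, here is an illustrative sketch of how the app would then talk to that locked-down instance. It assumes you also set requirepass in redis.conf and uses the classic node_redis options object; both the password setup and the environment variable name are my assumptions:
var redis = require('redis');

// Redis is reachable only from localhost (iptables) and protected by requirepass.
var client = redis.createClient({
  host: '127.0.0.1',
  port: 6379,
  password: process.env.REDIS_PASSWORD  // assumed env var name
});

client.on('error', function (err) {
  console.error('redis error:', err);
});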
Finally, if you want to dive a bit deeper, take a look at this link:
https://www.digitalocean.com/community/tutorials/how-to-secure-your-redis-installation-on-ubuntu-14-04
Hope this helps!

How to use Node js in conjunction with Webmin

I have a server running webmin (different domains pointing to different app/directories). Currently I can have my php app running from a directory and all I need to do in order to make it live is get webmin to direct that domain to that specific directory.
Can I do the same with a node js app? If not, how can I use node and webmin in the same box?
I know you didn't say this specifically, but assuming you're hosting the other web stuff through, say, Apache, you would need to leverage that, but you can probably get the effect you want. Basically, it sounds like you want to be able to use "host header" separation for services, rather than having a separate IP address for, say, Apache and Node.js to each use.
So, if you let Apache bind to the main port you're using (80, 443, or both), then you would run node and have it configured to listen on a different port (say 8080, as in the example you left in another comment). You can then use mod_proxy in Apache and have it route requests with certain domain names to Node. Here's a more concrete example of this, but really the idea is not specific to Node. It can apply to any other process that wants to respond to HTTP requests on your server (or even on a different server).
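As an illustration of the Node side of that setup (my own sketch, not part of the answer): the port 8080 comes from the example above, and the forwarded-header handling assumes Apache is configured to pass X-Forwarded-Host.
var http = require('http');

http.createServer(function (req, res) {
  // Behind mod_proxy the original domain arrives in X-Forwarded-Host
  // (if Apache passes it); otherwise fall back to the Host header.
  var originalHost = req.headers['x-forwarded-host'] || req.headers.host;
  res.end('Handled by Node for ' + originalHost + '\n');
}).listen(8080, '127.0.0.1');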

How to serve different node.js applications using the same port?

I have a site for hosting my dev projects. Each project is a node.js file. The problem is I can only have 1 project online at the same time - unless I host them on different ports. But suppose I want to publish 2 projects like this: my_site.com/foo, my_site.com/bar, where the first is managed by "foo.js" and the second by "bar.js". How can I do that?
You need a proxy in front. You assign each separate node process a different port. The proxy forwards traffic on port 80 to the appropriate backend node process.
This can be accomplished with node-http-proxy (https://github.com/nodejitsu/node-http-proxy), or any web server. Nginx and lighttpd make it ridiculously easy, Apache less so but still completely doable.
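For example, a minimal node-http-proxy front end that routes by path could look like this (my sketch; the backend ports 3001 and 3002 are assumptions):
var http = require('http');
var httpProxy = require('http-proxy');

var proxy = httpProxy.createProxyServer({});

http.createServer(function (req, res) {
  // Route by path prefix to the appropriate backend node process.
  if (req.url.indexOf('/foo') === 0) {
    proxy.web(req, res, { target: 'http://127.0.0.1:3001' });
  } else if (req.url.indexOf('/bar') === 0) {
    proxy.web(req, res, { target: 'http://127.0.0.1:3002' });
  } else {
    res.statusCode = 404;
    res.end();
  }
}).listen(80);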
Set up an Nginx process to reverse proxy to your Node processes. The Nginx process will hold onto the port and send requests for my_site.com/foo to the node foo.js backend process and send requests for my_site.com/bar to the node bar.js backend process.
This way your Node processes stay completely independent and can easily be separated out to different servers later if one of them becomes popular.
If you are using express/connect, you can mount each project as a sub-app under its own path, with something along the lines of
var express = require("express"),
    bar = require("./bar"),  // bar.js exports an express app
    foo = require("./foo"),  // foo.js exports an express app
    app = express();

app.use("/bar", bar);
app.use("/foo", foo);
app.listen(80);
where bar.js and foo.js are each a separate file.
NOTE: Not Tested

Node.js introduction

Please pardon my ignorance of node.js. I have started reading about node.js and have some perceptions which might be wrong, so I need to get them clarified.
When we use the createServer() method, does it create a virtual server? Not sure whether the term "virtual" is appropriate, but it's the best way I can describe it :)
I am confused about how I should deploy my application, which has node.js + other custom js files as a part of it. If I deploy my application on the main server, does that mean I have two servers?
Thanks for bearing with me.
I will try to answer that:
Q1:
createServer basically creates a server inside your Node process which listens on the specified port for requests. So yes, you can call it a virtual server which constantly listens for requests on that port.
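For illustration (my example, not the answerer's), this is all it takes; the port 8456 is just the one used later in this answer:
var http = require('http');

// createServer returns a server object; listen() makes it wait for
// requests on the given port inside this same Node process.
http.createServer(function (req, res) {
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end('Hello from Node\n');
}).listen(8456);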
Q2:
Yes, you can say that it now has 2 servers.
For example: your server had Apache initially, which listens on port 80 (you can access it as http://example.com/; it looks for port 80 by default),
and then you also start the Node service listening on some other port, for example port 8456 (you can access it as http://example.com:8456/, which will look for port 8456).
So yes, you can say there are two servers.
EDIT
Q: So what would be the difference if the page is served by the physical server and the virtual server created by node.js?
The physical server (Apache) and the Node server are 2 different things, and there is no way a single request goes to both servers.
For example:
I use an Apache server to host my website running on PHP. It serves all the HTML content of my website (which involves connecting to MySQL for data).
Some of the requests could be:
http://example.com/reports.php
http://example.com/search.php
At the other end I might be using a nodejs server for a totally different purpose. For example, I might use it for an API which returns JSON/XML. I can use this API myself for some dynamic content by making AJAX calls with javascript or simple cURL commands from PHP. Or I might also make this API available to the public.
Some of the requests could be:
http://example.com:8456/getList?apikey=&param1=&param2=
I would choose a Node.js server for the API because of its ability to handle concurrent requests; since its file operations are asynchronous, it can be much faster than PHP for this kind of work.
In this case I have a website which is not only running on PHP but is a combination of 2 different technologies (PHP on Apache and Node.js), and hence the 2 servers are totally different: they run on the same machine but each has its own execution space.
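As an illustration of the kind of API endpoint described above (my sketch; the /getList route and parameters simply mirror the example URL, and the empty result is a placeholder):
var http = require('http');
var url = require('url');

http.createServer(function (req, res) {
  var parsed = url.parse(req.url, true);
  if (parsed.pathname === '/getList') {
    // A real implementation would validate apikey and query a data store.
    res.writeHead(200, { 'Content-Type': 'application/json' });
    res.end(JSON.stringify({ apikey: parsed.query.apikey || null, items: [] }));
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(8456);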
Third Question:
So what would be the difference if the page is served by the physical server and the virtual server created by node.js?
If I might add, it's a virtual server in the sense that Apache is also a virtual HTTP server listening on whatever port. Of course Apache has a lot more modules, plugins and configuration to it, whereas Node's is lighter (kind of like WEBrick for Rails), non-blocking and agile to build on. Then again, Apache is more stable. In other words, it's a choice of software; both sit on the server listening on a particular port set by you.
That said, there are deployment methods that allow you to place a node application behind software such as nginx (another piece of server-side software) or HAProxy (a load balancer with a lot of power), so really it's all up to how you choose to configure it.
Maybe I'm getting too far from your question, but I hope this helps!
Also, you should give the answer to the other guy, he came first ;)
