How to use Node.js in conjunction with Webmin

I have a server running Webmin (different domains pointing to different apps/directories). Currently I can have my PHP app running from a directory, and all I need to do in order to make it live is get Webmin to point that domain at that specific directory.
Can I do the same with a Node.js app? If not, how can I use Node and Webmin on the same box?

I know you didn't say this specifically, but assuming you're hosting the other web stuff through, say, Apache, you would need to leverage that, and you can probably get the effect you want. Basically, it sounds like you want to use "host header" separation for services, rather than having a separate IP address for, say, Apache and Node.js to each use.
So, if you let Apache bind to the main port you're using (80/443/both), then you would run Node configured to listen on a different port (say 8080, as in the example you left in another comment). You can then use mod_proxy in Apache and have it route requests with certain domain names to Node. Here's a more concrete example of this, but really the idea is not specific to Node; it can apply to any other process that wants to respond to HTTP requests on your server (or even on a different server).
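As a rough sketch (the port 8080 comes from your comment; binding to the loopback interface is my assumption so the app is only reachable through Apache), the Node side just needs to listen on that high port, and Apache's mod_proxy forwards the matching domain's requests to it, much like the ProxyPass example further down this page:
// app.js - minimal sketch of the Node side behind the Apache proxy
var http = require('http');

http.createServer(function (req, res) {
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end('Hello from Node behind Apache\n');
}).listen(8080, '127.0.0.1'); // localhost only, so traffic has to come through the proxy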

Related

How to get haproxy to use a specific cluster computer via the URI

I have successfully set haproxy on my server cluster. I have run into one snag that I can't find a solution for...
TESTING INDIVIDUAL CLUSTER COMPUTERS
It can happen that for one reason or another, one computer in the cluster gets a configuration variation. I can't find a way to tell haproxy that I want to use a specific computer out of a cluster.
Basically, mysite.com (and several other domains) are served up by boxes web1, web2 and web3. And they round-robin perfectly.
I want to add something to the URL to tell haproxy that I specifically want to talk to web2 only because in a specific case, only that server is throwing an error on one web page.
Anyone know how to do that without building a new cluster with a URI filter and only having one computer in that cluster? I am hoping to use the cluster as-is but add something to the URI that will tell haproxy which server to use out of the cluster.
Thanks!
Have you thought about using a different port for this? You could define a new listen section with a different port since, as I understand it, you are free to modify your URL however you need.
Basically, haproxy cannot do what I was hoping. There is no way to add a param to the URL to suggest which host in the cluster to use.
I solved my testing issue by setting up unique ports for each server in the cluster at the firewall. This could also be done at the haproxy level.
To secure this path from the outside world, I told the firewall to only accept traffic from inside our own network.
This lets us test specific servers within the cluster. We did have to add a trap in our PHP app to deal with a session cookie that is too large because we have haproxy manipulating this cookie to keep users on the server they first hit. So when the invalid session cookie is detected, we have the page simply drop the session and reload the page.
This is working well for our testing purposes.

NodeJs instead of Apache

I have a website under a domain (say example.com) hosted on an Amazon Web Services EC2 instance, which has Apache already installed and running on port 80. Now I wish to switch from Apache to Node.js (where Node.js runs on another port, say 8001). How do I change the HTTP port setup on EC2 so that when I go to that URL (example.com) it is served by Node.js instead of Apache? (For now, Node.js runs on example.com:8001.)
How is this possible? Kindly help.
So you cannot direct a standard web address (e.g. www.example.com) to anything other than port 80. By default http is on port 80 and https is on port 443. You can override that default by explicitly giving the port but you cannot change that default.
So your options are:
Replace Apache with Node on port 80. This will involve shutting down Apache (and making sure it doesn't auto restart on reboot), and changing your node port to port 80. This also will probably require running your node service as root (as port 80 is usually protected) and this is not recommended (Apache starts as root to get the port but then usually immediately switches to non-root user).
Have Apache proxy forward requests to Node. This means Apache is still your main webserver and listens on port 80 but certain requests are sent on to Node.
This second option could be done using mod_proxy with config like this:
ProxyPass "/foo" "http://localhost:8001/"
ProxyPassReverse "/foo" "http://localhost:8001/"
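For completeness, a sketch of the Node side those directives would forward to; only the port 8001 comes from the question, the handler body and the loopback binding are my assumptions:
var http = require('http');

http.createServer(function (req, res) {
  // req.url is the path as Apache forwarded it
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end('Served by Node on 8001, path: ' + req.url + '\n');
}).listen(8001, '127.0.0.1'); // localhost only, so traffic must come through Apache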
It all depends on what you want to use your set up for and making best use of the software available to you.
Typical set up is a multi-layered approach involving one or more of these:
LoadBalancer (optional for high load sites or where resiliency is key)
Webserver
Appserver
Database
Yes you could use just Node for all of these layers. However, to me, it is more an application server than a webserver.
A webserver like Apache or Nginx is specifically designed to act as a web server, and by that I mean serving static pages and doing other top-layer work. They have many features, built up over the years, for speed and security. Nearly everything they can do can be done in Node, but not quite as easily, not by default, and often only by pulling in third-party modules.
A webserver then typically offloads dynamic work to other programs. These could be scripts (PHP or Perl) or separate app servers like Tomcat, JBoss or Node. Those are typically very good at specific tasks (e.g. talking to a database and generating dynamic pages) but less good at serving static pages quickly.
The beauty of Node to me is for micro-services, where you can have lots of independent, but potentially interlinked, Node services which are all lightweight and good at one task, with the web server still in front of them. This compares to a bulky multi-tasking J2EE server like Tomcat or JBoss that you would have used in the past, which tried to run every dynamic app under one process (though admittedly often as separate WAR files).
So, without knowing your full use case, I would suggest Apache and Node instead of Node replacing Apache.

Run Ghost as an NPM module on a subdomain using Node.JS

Is there any way to run Ghost on a subdomain using Node.JS? I am able to run it normally on Node.JS like:
App.Modules.Ghost = require('ghost'); /**< Ghost module. */
App.Apps.Ghost = App.Modules.Ghost({ config: '/Assets/Ghost/Config.js'.LocalFilePath }); /**< Create Ghost app. */
I am then able to go to http://example.com/ghost/ and view my blog. Although this works for now, I want to be able to view my blog at http://blog.example.com/ using Node.JS.
Sadly, the way networking works prevents this in the context you desire. In order to achieve that sort of functionality, you would need a proxy server to go in front of the entire application. I would suggest NginX for this ability, due to its speed and wide-spread use.
Why is this not possible?
In this sense, networking is the system where you bind to an IP and a port. When you bind, nothing else can bind to that same IP/port. Since a domain (or subdomain) simply points to an IP address, there is no way to separate these connections at the networking level. This is why the Host HTTP header was added.
How does NginX do it?
NginX parses the Host header and can send the connection on to your Ghost server however you want it forwarded. This also allows you to forward the main domain (http://example.com) to whatever website you like, thereby running different applications on the same IP and port.
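To illustrate what the proxy is keying on, here is a rough sketch in plain Node of reading the Host header; a front end like NginX does essentially this and then forwards the connection to the right backend. The hostnames come from the question, the port 8080 is an arbitrary assumption:
var http = require('http');

http.createServer(function (req, res) {
  // The Host header is how many (sub)domains can share one IP and port
  var host = (req.headers.host || '').split(':')[0];
  if (host === 'blog.example.com') {
    res.end('This request would be routed to the Ghost backend');
  } else {
    res.end('This request would be routed to the main site');
  }
}).listen(8080);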
This answer contains the best directions on how to achieve this functionality.

How to serve different node.js applications using the same port?

I have a site for hosting my dev projects. Each project is a node.js file. The problem is that I can only have one project online at a time, unless I host them on different ports. But suppose I want to publish two projects like this: my_site.com/foo and my_site.com/bar, where the first is managed by "foo.js" and the second by "bar.js". How can I do that?
You need a proxy in front. You assign each separate node process a different port. The proxy forwards traffic on port 80 to the appropriate backend node process.
This can be accomplished with node-http-proxy (https://github.com/nodejitsu/node-http-proxy), or any web server. Nginx and lighttpd make it ridiculously easy, Apache less so but still completely doable.
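For example, a rough sketch with node-http-proxy might look something like this; the backend ports 3001 and 3002 are arbitrary assumptions, so check the project's README for the exact API of the version you install:
var http = require('http');
var httpProxy = require('http-proxy');

// one proxy instance reused for every request
var proxy = httpProxy.createProxyServer({});

http.createServer(function (req, res) {
  // route by path prefix: /foo goes to the foo.js process, /bar to the bar.js process
  if (req.url.indexOf('/foo') === 0) {
    proxy.web(req, res, { target: 'http://127.0.0.1:3001' });
  } else if (req.url.indexOf('/bar') === 0) {
    proxy.web(req, res, { target: 'http://127.0.0.1:3002' });
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(80); // binding to port 80 usually requires root or a firewall redirect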
Set up an Nginx process to reverse proxy to your Node processes. The Nginx process will hold onto the port and send requests for my_site.com/foo to the node foo.js backend process and requests for my_site.com/bar to the node bar.js backend process.
This way your Node processes stay completely independent and can easily be separated out to different servers later if one of them becomes popular.
If you are using express/connect, you can mount each app on its own path, something along the lines of:
var express = require("express"),
    app = express(),
    foo = require("./foo"),   // foo.js is a separate file exporting an express app
    bar = require("./bar");   // bar.js is a separate file exporting an express app
app.use("/foo", foo);         // my_site.com/foo handled by foo.js
app.use("/bar", bar);         // my_site.com/bar handled by bar.js
app.listen(80);
NOTE: Not tested.

Node.js introduction

Please pardon my ignorance about node.js. I have started reading about node.js and have formed some impressions which might be wrong, so I need to clarify them.
When we use the createServer() method, does it create a virtual server? Not sure whether the term "virtual" is appropriate, but it's the best way I can describe it :)
I am confused about how I should deploy my application, which consists of node.js plus other custom js files. If I deploy my application on the main server, does that mean I have two servers?
Thanks for bearing with me.
I will try to answer that:
Q1:
createServer basically creates a server object inside your Node process; once you call listen on it, the process waits on the specified port for requests. So yes, you can think of it as a virtual server that constantly listens for requests on that port.
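A minimal sketch of what that looks like (the port is arbitrary; I picked 8456 to match the example below):
var http = require('http');

// createServer returns a server object; nothing happens until listen() binds it to a port
var server = http.createServer(function (req, res) {
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end('Hello from Node\n');
});

server.listen(8456); // now the process is listening for requests on port 8456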
Q2:
Yes, you can say that you now have two servers.
For example: your server initially had Apache, which listens on port 80 (you can access it as http://example.com/, which by default looks for port 80),
and then you also start the Node service listening on some other port, for example port 8456 (you can access it as http://example.com:8456/, which will look for port 8456).
So yes, you can say there are two servers.
EDIT
Q: So what would be the difference if the page is served by the physical server and the virtual server created by node.js?
The physical server and the Node server are two different things, and a single request never goes to both servers.
For eg:
I use an Apache server to host my website running on PHP. It serves all the HTML content of my website (which involves connecting to MySQL for data).
Some of the requests could be:
http://example.com/reports.php
http://example.com/search.php
At the other end I might be using a Node.js server for an entirely different purpose. For example, I might use it for an API which returns JSON/XML. I can use this API myself for some dynamic content by making AJAX calls with JavaScript or simple cURL commands from PHP, or I might also make the API available to the public.
Some of the requests could be:
http://example.com:8456/getList?apikey=&param1=&param2=
My reason for choosing a Node.js server for the API would be its ability to handle concurrent requests; since its file operations are asynchronous, it can be much faster than PHP for this kind of work.
In this case I have a website which is not just running on PHP but is a combination of two different technologies (PHP on Apache and Node.js); the two servers are totally different, running on the same machine, but each has its own execution space.
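As a rough illustration of that split, the Node side of such an API might be nothing more than the sketch below; the route, parameter names and port come from the example URL above, while the response body is made up:
var http = require('http');
var url = require('url');

http.createServer(function (req, res) {
  var parsed = url.parse(req.url, true); // true = parse the query string too

  if (parsed.pathname === '/getList') {
    // apikey, param1 and param2 arrive in parsed.query, as in the example URL
    res.writeHead(200, { 'Content-Type': 'application/json' });
    res.end(JSON.stringify({ ok: true, params: parsed.query }));
  } else {
    res.writeHead(404, { 'Content-Type': 'application/json' });
    res.end(JSON.stringify({ error: 'not found' }));
  }
}).listen(8456); // the API port from the example, while Apache keeps serving PHP on port 80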
Third Question:
So what would be the difference if the page is served by the physical server and the virtual server created by node.js?
If I might add, it's a virtual server in the sense that Apache is also a virtual HTTP server listening on whatever port. Of course Apache has a lot more modules, plugins and configuration to it, whereas Node's is lighter (kind of like WEBrick for Rails), non-blocking and agile to build on. Then again, Apache is more stable. In other words, it's a choice of software; both sit on the server listening on a particular port set by you.
That said, there are deployment methods that let you place a Node application behind software such as nginx (another piece of server-side software) or HAProxy (a load balancer with a lot of power), so really it's all up to how you choose to configure it.
Maybe I'm getting too far from your question, but I hope this helps!
Also, You should give the answer to the other guy, he came first ;)
