Restart one app from sub-apps - Node.js

I have two apps:

/app1
/app2

I am running both on the same port using:

app
  .use('/app1', require('./app1/app').app)
  .use('/app2', require('./app2/app').app)
  .listen(80);

How can I restart only one app, without affecting the other?
Thanks

You are not running two apps on one port; you are running one app (one Node.js process) that handles two mount paths. So if, by restart, you mean stopping and then starting the process, restarting only one of them is impossible.

Related

Hosting multiple node app.js in a sub-domain

Is there a way I can host multiple Node.js app.js files on a single port for multiple tenancy, either in Docker or on a VPS?
Each tenant can only access his/her folder containing his/her app.js.
I wanted to create a package inside node_modules, create multiple customer workspace folders inside that package, and have the main Node.js app import the customer workspace matching the requested subdomain hostname, thereby enabling a wildcard subdomain like *.mainDomain.com. But I think this might be silly, because keeping customer workspaces inside node_modules might not be safe.
You cannot use the same port for multiple Node.js applications, but you can look at leveraging nginx. You can configure nginx to listen on one port for multiple domain names and then proxy requests to the IP and port of the respective Node.js application.
This link might help you to achieve it:
https://serverfault.com/questions/536576/nginx-how-do-i-forward-a-http-request-to-another-port
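As a sketch, an nginx configuration along these lines listens on one port for two domain names and proxies each to a different Node.js app (the server names and ports here are invented for illustration):

```
server {
  listen 80;
  server_name alice.mainDomain.com;
  location / {
    proxy_pass http://127.0.0.1:9001;
  }
}

server {
  listen 80;
  server_name bob.mainDomain.com;
  location / {
    proxy_pass http://127.0.0.1:9002;
  }
}
```

Each tenant's app.js then listens only on its private port (9001, 9002, ...) on localhost, and nginx owns the public port.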
Is there a way I can host multiple Node.js app.js files on a single port for multiple tenancy, either in Docker or on a VPS?
You cannot have more than one application listening on the same port on a single machine. If you want to route traffic from a single port to more than one application or container, you need one application that listens on that port and decides where to route each incoming request. Each of the other applications then has to listen on a different port of the host machine.
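The routing decision of that front application can be as simple as a lookup from the request's Host header to the tenant's backend port. A sketch in plain JavaScript (the tenant names and ports are invented for illustration):

```javascript
// Hypothetical tenant table: maps a subdomain of mainDomain.com to the
// port where that tenant's app.js listens. All entries are made up.
const tenants = {
  'alice.maindomain.com': 9001,
  'bob.maindomain.com': 9002,
};

// Return the backend port for a request's Host header, or null if the
// subdomain is unknown.
function backendFor(host) {
  // Drop an optional ":port" suffix and normalise the case.
  const name = (host || '').split(':')[0].toLowerCase();
  return Object.prototype.hasOwnProperty.call(tenants, name)
    ? tenants[name]
    : null;
}
```

The front process would call something like backendFor(req.headers.host) and forward the request to that port, returning an error page when the result is null.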
As for the second part of your question, I don't think it is quite clear what you are trying to do. What is the "main nodejs", and why would you create a folder inside node_modules?

Do I need a different server to run node.js

Sorry if this is the wrong question for this forum, but I am simply stuck and need some advice. I have a shared hosting service and a cloud-based hosting server with Node.js installed. I want to host my website as normal, but I also want to add real-time chat and location tracking using Node.js. I am confused by what I am reading in several places, because Node.js is itself a server but is not designed to host websites? So I have to run 2 different servers? One for the website and one to run Node.js? When I set up the cloud one with a Node.js script running, I can no longer access the webpages.
What's the best way for me to achieve this, as I am just going round in circles? Also, is there a way I can set up a server on my PC and run and test both of these together beforehand, so I can see what is needed and get it working? That would stop me ordering servers I don't need.
Many thanks for any help or advice.
Node can serve webpages using a framework like Express, but it can conflict with another web server program (Apache, etc.) if both run on the same port. One solution is to serve your webpages through your web server on port 80 (or 443 for HTTPS) and run your Node server on a different port in order to send information back and forth.
There are a number of ways you can achieve this but here is one popular approach.
You can use NGINX as your front facing web server and proxy the requests to your backend Node service.
In NGINX, for example, you will configure your upstream service as follows:
upstream lucyservice {
  server 127.0.0.1:8000;
  keepalive 64;
}
The 8000 you see above is just an example; you may be running your Node service on a different port.
Further in your config (in the server config section) you will proxy the requests to your service as follows:
location / {
  proxy_pass http://lucyservice;
}
Your Node service can be running under a process manager like forever or pm2. You can have multiple Node services running in a cluster, depending on how many processors your machine has.
So to recap: your front-facing web server will handle all traffic on port 80 (HTTP) and/or 443 (HTTPS) and will proxy the requests to your Node service running on whatever port(s) you define. All of this can happen on one single server, or multiple if you need.

Watch logs from NodeJS on EC2

I have a single EC2 instance on AWS, running HTTPS server with NodeJS.
I'm starting my NodeJS server from the /etc/rc.local, so it will start automatically on every boot.
I have 2 questions:
Is there a better way to start an https server listening on port 443 without using sudo path/to/node myScript.js? What risks do I have if I run this process as root?
Where do I see my logs? When running the script from the shell, I see the output of the process, but now that it runs from rc.local, how do I access the output of the server?
Thanks!
Starting the application using sudo definitely is not good practice. You should not run a publicly accessible service with root credentials. If there is a flaw in your application and someone finds it, there is the danger that they gain access to more services on the machine.
Your application should listen on an unprivileged port (e.g. 5000), with nginx or Apache as a reverse proxy that forwards the traffic internally to your application running on port 5000. pm2 suggests a setup like that as well: http://pm2.keymetrics.io/docs/tutorials/pm2-nginx-production-setup. Searching online you will find tutorials on how to configure nginx to run on HTTPS and how to forward all traffic from HTTP to HTTPS. Your application should not be aware of SSL certificates etc. Remember that the pm2 module should be installed locally within your project, and you should take advantage of the package.json: in there you can define a task that boots your application in production using the local pm2 module. The advantage is that you don't have to install pm2 globally, and you will not mess things up again with permissions and super users.
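For example, the project's package.json could carry a start script that uses the locally installed pm2 (the app name and version range below are placeholders):

```
{
  "dependencies": {
    "pm2": "^5.0.0"
  },
  "scripts": {
    "start": "pm2 start myScript.js --name my-app"
  }
}
```

npm prepends node_modules/.bin to the PATH when running scripts, so "npm start" picks up the local pm2 without a global install and without sudo.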
I don't think the log is saved anywhere unless you tell the rc.local script to do so. How do you spawn the process there? Something like this should redirect stdout and stderr to a file:
path/to/node myScript.js >> /var/log/my-app.rc.local.log 2>&1 & # send stdout and stderr from rc.local to a log file
Don't you use a logger in your application, though? I would suggest picking one (there are many available, like bunyan, winston etc.) and substituting all of your console.log calls with the logger. Then you can define explicitly in your application where the logs are saved, and you get different log levels and more features in general.
Not a direct answer, more a small return on experience here.
We have a heavily used Node.js app in production on AWS, on a non-Docker setup (for now ;) ).
We have a dedicated user to run the node app; I guess that if you start your node process as root, it has root access, and that's not a safe thing to do.
To run the app we use pm2 as a process manager; it allows restarting the node process when it fails (and it will), and scaling the number of workers to match the number of cores of your EC2 instance. You also have access to the logs of all the workers using ./path/to/node ./node_modules/.bin/pm2 logs, and can send them wherever you want (from ELK to Slack).
My2cents.

Parallel deployment of express nodejs application to multiple application servers

I have a mean.io Express/Node.js web application deployed on a Linode stack.
I have my 2 application servers running Ubuntu 14.04, which are accessed behind 2 HAProxy load balancers, again running on Ubuntu 14.04.
Let us call Application server 1 => APP1 and Application server 2 => APP2.
Currently, I deploy manually by:
1. Removing the APP1 entry from haproxy.cfg of both load balancers and restarting them.
2. Updating the code on APP1.
3. Removing the APP2 entry from haproxy.cfg of both load balancers and putting the APP1 entry back.
4. Restarting APP1.
5. Updating the code on APP2.
6. Putting the APP2 entry back in both haproxy.cfg files and restarting.
7. Restarting APP2.
I follow this process so that at any point in time the users of our web application get consistent data, even during deployment, i.e. the two instances of the app server are never running different copies of the code.
I am moving to automated deployment system and the 2 options I have looked at for deployment are Capistrano and Shipit JS.
They both provide ways to specify multiple servers in their configuration, e.g. in Capistrano:
role :app, "APP1", "APP2"
and in Shipit JS
shipit.config.servers = ['APP1', 'APP2']
So, my question is: how do these libraries make sure that both servers are updated in parallel before they are restarted? Is there a way they lock incoming requests to these servers during the update?
Do these deployment systems work only for a simple client - app server architecture, or can they be used in systems where there is a load balancer?
Any explanation would be invaluable. I have tried my best to explain the situation here. If you need more input, please mention it below in the comments.

How to serve different node.js applications using the same port?

I have a site for hosting my dev projects. Each project is a Node.js file. The problem is I can only have 1 project online at a time, unless I host them on different ports. But suppose I want to publish 2 projects like this: my_site.com/foo, my_site.com/bar, where the first is managed by "foo.js" and the second by "bar.js". How can I do that?
You need a proxy in front. You assign each separate node process a different port. The proxy forwards traffic on port 80 to the appropriate backend node process.
This can be accomplished with node-http-proxy (https://github.com/nodejitsu/node-http-proxy), or any web server. Nginx and lighttpd make it ridiculously easy, Apache less so but still completely doable.
Set up an Nginx process to reverse proxy to your Node processes. The Nginx process will hold onto the port and send requests for my_site.com/foo to the foo.js backend process, and requests for my_site.com/bar to the bar.js backend process.
This way your Node processes stay completely independent and can easily be separated out to different servers later if one of them becomes popular.
If you are using Express/Connect, you can mount each project as a sub-app under its own path, along the lines of

var express = require("express"),
    bar = require("./bar"), // bar.js exports an Express app
    foo = require("./foo"), // foo.js exports an Express app
    app = express();

app.use("/bar", bar);
app.use("/foo", foo);
app.listen(80);

where each of bar.js and foo.js is a separate file.
NOTE: Not Tested
