Can we use both NGINX and PM2 for node.js production deployment?

I am new to Node.js. I have built my first Node.js server. I am doing some research on improving the performance of a Node.js server in production, and I have learned about NGINX and the process manager PM2.
NGINX:
It can load balance the incoming requests.
It can act as reverse proxy for our application.
PM2:
It can run our application as a cluster of processes, since it has a built-in load balancer.
We can monitor the application and restart it when it crashes.
Can we use both for production?
Since PM2 already has a load balancer, can I use only PM2?
What is the advantage of using NGINX over PM2?
If I do load balancing with NGINX and clustering with PM2, will that give better performance than using only one of them (NGINX or PM2)?

This is a huge topic but let me help and give you some pointers.
Nginx is much more than just a reverse proxy. It can serve static content, compress the response content, route to multiple apps on different ports on the same VM, and much more.
PM2 essentially helps you scale the throughput of your service by running it in cluster mode and utilizing all the cores of the box. Read this Stack Overflow answer to understand more about this.
Now to answer your question
Can we use both for production?
Yes, and you should. Nginx can run on port 80, and PM2 can run the app on port 3000 (or whatever port) and manage traffic between the instances of the app.
gzip compression alone will make a huge difference in end-user performance.
Here is a good article in case you need code help on how to set it up
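For the PM2 half of that setup, a minimal sketch of the configuration might look like the ecosystem file below (the file name, app name, and port are placeholders, not taken from the question); Nginx would then proxy port 80 to port 3000 and handle gzip and static files in front of it.

```js
// ecosystem.config.js — minimal PM2 cluster-mode sketch (hypothetical values)
module.exports = {
  apps: [
    {
      name: 'my-api',          // placeholder app name
      script: './server.js',   // your Express/Node entry point
      instances: 'max',        // one worker per CPU core
      exec_mode: 'cluster',    // enables PM2's built-in round-robin balancing
      env: {
        NODE_ENV: 'production',
        PORT: 3000             // Nginx proxies port 80 to this port
      }
    }
  ]
};
```

Start it with `pm2 start ecosystem.config.js`; all the workers share port 3000, so Nginx only needs a single upstream.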

Related

Load balancing in node server

I have created a Node server using Express. I am using the following architecture:
-> I am proxying the Node port to my domain using Apache.
-> I am using PM2 to manage the Node processes. I have created two cluster instances and run them individually on different cores. (http://pm2.keymetrics.io/docs/usage/cluster-mode/)
My questions are:
Am I doing this the correct way by production standards?
Do I need load balancing at the Apache level? Because the clusters come into the picture after Apache, am I correct?
Yes, that's a correct architecture.
But Nginx and PM2 go more hand in hand; Apache is okay too.
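Because PM2's cluster mode balances requests across its workers behind one port, Apache (or Nginx) only has to proxy to that single port. As a rough sketch (assuming cluster mode with two instances, as in the question), an Express app that echoes its process PID makes this visible:

```js
// app.js — hypothetical demo to observe PM2's internal round-robin
const express = require('express');
const app = express();

app.get('/', (req, res) => {
  // Each PM2 cluster worker is a separate process, so the PID identifies it
  res.send(`Handled by worker PID ${process.pid}\n`);
});

app.listen(process.env.PORT || 3000, () => {
  console.log(`Worker ${process.pid} listening`);
});
```

Run it with `pm2 start app.js -i 2` and request the proxied URL a few times; the alternating PIDs show that the per-core balancing happens inside PM2, after Apache.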

Deploy node.js in production

What are the best practices for deploying a Node.js application in production?
I would like to know how Node.js APIs are deployed to production today; right now my application is in Docker and running locally.
I wonder whether I should use Nginx inside the container and deploy my server behind it, or just upload the Node image that is already running today.
*I need load balancing
There are a few main types of deployment that are popular today:
Using platform as a service like Heroku
Using a VPS like AWS, Digital Ocean etc.
Using a dedicated server
This list is in order of growing difficulty and control. It's easiest with a PaaS, but you get more control with a dedicated server - though it gets significantly more difficult, especially when you need to scale out and build clusters.
See this answer for more details on how to install Node on a VPS or a dedicated server:
how to run node js on dedicated server?
I can only add from experience on AWS, using a NAT gateway in front of a dedicated Node server with a MongoDB server behind the gateway. (Obviously this is a scalable system and project.)
With or without Docker, you need to control the production environment. This means clearly defining which npm libraries you will need for production, how you handle environment variables, and how you cluster across cores.
I would suggest, very strongly, using a tool like PM2 to handle clustering, server shutdowns and restarts, and logs (workers and slaves too, if you need them and write code for them).
This list can go on and on, but keep in mind this is only from an AWS perspective. Setting up a gateway correctly on AWS is also not an easy process. Be prepared for some gotchas along the way.
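To illustrate the "server shutdowns and restarts" point, here is a minimal, hypothetical sketch of graceful shutdown in an Express app, so a PM2 restart or reload does not drop in-flight requests (PM2 sends SIGINT by default before killing a process, though the exact signal is configurable):

```js
// server.js — minimal graceful-shutdown sketch (illustrative only)
const express = require('express');
const app = express();

app.get('/health', (req, res) => res.send('ok'));

const server = app.listen(process.env.PORT || 3000);

// Stop accepting new connections, finish in-flight requests, then exit
process.on('SIGINT', () => {
  server.close(() => {
    // Close database/Redis connections here, then exit cleanly
    process.exit(0);
  });
});
```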

Load balancing express app instances

This is my first load balancing question.
I have written a simple Express app to figure out how load balancing works, and I have also been looking at something like Docker. If I had to use Nginx to load balance, should I be running four different Express instances in four different Docker containers and then load balance between them using Nginx, where Nginx sits in its own container?
Have I got that right? I am kind of confused.
I provided an answer to a similar post some time ago, but here are the important bits in a nutshell:
Yes, it's possible to use Nginx to load balance requests between different instances of your Node.js services. Each Node.js instance could be running in a different Docker container.
You can then configure Nginx to load balance between the containers, for example using the configuration file mentioned in the link above.
Nginx itself can run perfectly well within a Docker container. For these kinds of setups, Docker Compose can help you orchestrate the configuration so that you can start all the Nginx and Express containers with a single command.
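The Express side of such a setup can be identical in every container; a minimal sketch (all names are placeholders) is an app that reports which container handled the request, so you can watch Nginx rotate between the four instances:

```js
// app.js — the same code runs in each of the four containers (illustrative sketch)
const express = require('express');
const os = require('os');
const app = express();

app.get('/', (req, res) => {
  // Under Docker, os.hostname() defaults to the container ID,
  // so repeated requests show Nginx cycling through the containers
  res.send(`Served by container ${os.hostname()}\n`);
});

app.listen(process.env.PORT || 3000);
```

The Nginx container's upstream block would then list the four Express service names from the Compose file as its backends.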

Redis deployment configuration - master slave replication

Currently I have two servers on which I have deployed a Node.js/Express.js based web services API. I am using Redis for caching JSON strings.
What would be the best option for deploying this setup into production? I see it is advised here to go with a dedicated Redis server. OK, I'll take that and use a dedicated server for running the Redis master. Can I use the existing app servers as slave nodes? Note: these app servers are running a Node/Express application.
What other options do I have?
You can.
It all depends on the load those other servers have; it's a problem of resource sharing. To be honest, my main issue with your architecture is not dedicated vs. non-dedicated servers; it's the fact that you are placing a Redis server (master or not) on a host that will most likely be facing the internet (the Express.js app), meaning it's quite exposed.
If you can simulate HTTP load against your Node/Express servers, compare the results of running some benchmark tests on your dedicated server vs. the non-dedicated ones.
On a running Redis server, type:
redis-benchmark -q -n 100000
If the app servers are being hammered and using all cores frequently, you should see a substantial difference in the benchmarks.
My suggestion is to go ahead with your first setup, add monitoring for the Redis response times, and only act when you have to, which might be right away if the benchmarks show very poor results.
As a side note, consider not sharing hosts between services that you expose to the internet and services that perform internal functions for your application.
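To make the "monitor the Redis response times" suggestion concrete, a rough sketch (using the ioredis client as an example; the host, key, and threshold are made up) could time each cache lookup from inside the Express app:

```js
// cache.js — hypothetical sketch of timing Redis lookups from the app servers
const Redis = require('ioredis');

// Point this at the dedicated Redis master (placeholder host)
const redis = new Redis({ host: 'redis-master.internal', port: 6379 });

async function getCachedJson(key) {
  const start = Date.now();
  const raw = await redis.get(key);
  const elapsed = Date.now() - start;

  // Log slow lookups so you notice when a shared or overloaded host starts struggling
  if (elapsed > 50) {
    console.warn(`Slow Redis GET for ${key}: ${elapsed} ms`);
  }
  return raw ? JSON.parse(raw) : null;
}

module.exports = { getCachedJson };
```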

Expressjs to production

I am new to Express.js and I want to deploy an Express.js app to production. Based on my googling, here's the setup on Rackspace I am thinking of:
1 load balancer + 2 servers + run the app with forever
My questions are:
What engine shall I use to run the app? nginx?
how many app can I run per server?
Thank you.
If you are serving static files or using any of nginx's reverse proxy features, you can use nginx. But if not, since your servers are behind a load balancer, nginx isn't necessary at all.
The rule of thumb is one node.js/express.js process per core. Have a look at cluster to help you manage this. Make sure your load balancer knows about all the node.js processes you are running (and is not just load balancing between one IP/port pair on each server).
Update: Node.js now has cluster built in out of the box.
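A minimal sketch of that built-in cluster module, following the one-process-per-core rule of thumb above, looks roughly like this:

```js
// cluster.js — rough sketch: one Express worker per core with Node's built-in cluster
const cluster = require('cluster');
const os = require('os');

if (cluster.isPrimary) {                       // cluster.isMaster on older Node versions
  for (let i = 0; i < os.cpus().length; i++) {
    cluster.fork();                            // one worker per CPU core
  }
  cluster.on('exit', () => cluster.fork());    // replace a crashed worker
} else {
  const express = require('express');
  const app = express();
  app.get('/', (req, res) => res.send(`Hello from worker ${process.pid}\n`));
  app.listen(process.env.PORT || 3000);        // workers share this port
}
```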
Also, if you are deploying on Ubuntu you can use upstart instead of forever if you like.
You need Node.js installed on your machine to run Node.js. Nginx is a server used as a reverse proxy and a load balancer. Also, you can run the app through PM2 instead of forever, which will handle clustering and running your app in the background.
