Multiple nodejs servers or single? - node.js

For the past year I've started around 40 independent web apps using nodejs (each runs its own server on a custom port using express + socket.io). What really bugs me is that the pm2 process list has a vertical scroll )))
The question is: is it normal to run this many node servers, or is there a better way?

There are no issues with running multiple node servers; in fact, when it comes to a microservices architecture, the more you break things down, the better. There are a lot of pros and cons to this, and you need to figure out whether the cons affect your system more than you're willing to sacrifice. I assume that when you started out you had no idea you were following a microservices architecture, but since your application is now distributed over 40+ services, the following article might give you some insight into managing it properly.
https://derickbailey.com/2016/12/12/making-the-quantum-leap-from-node-js-to-microservices/
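If the apps do stay as 40 separate processes, a pm2 ecosystem file at least keeps them declared, named and started from one place instead of by hand. A minimal sketch (the app names, script paths and ports are placeholders, not taken from your setup):

// ecosystem.config.js - start everything with: pm2 start ecosystem.config.js
module.exports = {
  apps: [
    {
      name: "app-01",
      script: "./apps/app-01/server.js",
      env: { PORT: 3001 },
    },
    {
      name: "app-02",
      script: "./apps/app-02/server.js",
      env: { PORT: 3002 },
    },
    // ...one entry per app
  ],
};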

If they are all independent then you do have to run an individual server for each. But if they talk to each other (I assume they talk to each other via socket.io) then you could use RPC calls instead. There are libraries for this such as grpc (Google) and tchannel (Uber). I hope they can solve your problem.
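If you go the RPC route, here is a minimal sketch of a service exposing one RPC with @grpc/grpc-js and @grpc/proto-loader (the greeter.proto file, service name and port are assumptions for illustration, not something from your setup):

// rpc-server.js - assumes a greeter.proto file next to it containing:
//   syntax = "proto3";
//   package demo;
//   service Greeter { rpc SayHello (HelloRequest) returns (HelloReply); }
//   message HelloRequest { string name = 1; }
//   message HelloReply { string message = 1; }
const grpc = require('@grpc/grpc-js');
const protoLoader = require('@grpc/proto-loader');

const packageDef = protoLoader.loadSync('greeter.proto');
const demo = grpc.loadPackageDefinition(packageDef).demo;

const server = new grpc.Server();
server.addService(demo.Greeter.service, {
  // handler for the SayHello rpc defined in the proto file
  sayHello: (call, callback) => {
    callback(null, { message: 'Hello ' + call.request.name });
  },
});

server.bindAsync('0.0.0.0:50051', grpc.ServerCredentials.createInsecure(), () => {
  // on older @grpc/grpc-js versions you may also need server.start() here
  console.log('gRPC service listening on 50051');
});

Each app would then call the others through generated client stubs instead of ad-hoc socket.io messages.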

You can start several express and socket.io instances in one NodeJS process if you wish, like this (based on the Express Hello World example):
const express = require('express');

// First app
const app1 = express();
app1.get('/', function (req, res) {
  res.send('Hello World!');
});
app1.listen(3000, function () {
  console.log('Example app1 listening on port 3000!');
});

// Second app
const app2 = express();
app2.get('/', function (req, res) {
  res.send('Hello World 2!');
});
app2.listen(3001, function () {
  console.log('Example app2 listening on port 3001!');
});
But there are several aspects to consider:
Performance - if each app is handled by a separate NodeJS process, then each of them gets more CPU time, and high load on one of them will not affect the others.
Logging - if several apps are handled by one NodeJS process, you will need some way to tell which of them produced which output. Otherwise the output will be chaotic.
Fault tolerance - if one NodeJS process handles several apps, then a critical failure in one app will crash all of them simultaneously.
Security - a security issue in one app could potentially affect the other apps handled by the same NodeJS process.

Related

Socket.io + REST API + REACT - is it better to separate socket.io from REST API

My question could be flagged as "opinion based", but I am wondering which approach is best for my application, as I am able to do it both ways.
I am building a chat application in which users and conversations are saved in MongoDB, and my React application will consume the API/APIs. The question is - is it better to have the REST API and Socket.io applications running separately? For example:
Have REST API running on port 3005
Have Socket.io running on port 3006
The React application would consume these 2 separately, and basically they would not know about each other. The REST API endpoints and the socket.io events would be invoked only from the front-end.
On the other hand, I could have my socket.io application and REST API working together in 1 big application. I think it is possible to make that work without problems.
To sum up, at first glance I would take the first approach - it seems cleaner and easier to maintain. But I would like to hear other opinions, or from somebody who has done a similar project. How are things usually done in this kind of project when you have socket.io and a REST API?
I would check the pros and cons of both scenarios. For example, code and resource reusability is better if you have a single application, and you don't have to care about which versions are compatible with each other. On the other hand, one error can kill both applications, so from a fault-isolation (and security) perspective it is better to have separate applications. I think the decision depends on which pros and cons matter most to you.
You can keep the socket.io logic in a separate file, like this:
// socket.mjs file
import { Server } from "socket.io"

const io = new Server()
const socketApi = {
  io: io
}

io.on('connection', (socket) => {
  console.log('client connected:', socket.id)
  socket.join('modbus-room')

  socket.on('app-server', (data) => {
    console.log('**************')
    console.log(data)
    io.to('modbus-room').emit('modbus-client', data)
  })

  socket.on('disconnect', (reason) => {
    console.log(reason)
  })
})

export default socketApi
and add it to your project like this:
// index.js or main file
// ...
import socketApi from "../socket.mjs";
// ...

/**
 * Create HTTP server.
 */
const server = http.createServer(app);
socketApi.io.attach(server);
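With this combined setup, your REST handlers can reuse the same io instance to push updates to connected clients. A minimal sketch (the route path, room and event names below are only for illustration, and it assumes app.use(express.json()) is registered):

app.post('/api/messages', (req, res) => {
  // ...save the message to MongoDB here...
  // then notify connected socket.io clients through the shared instance
  socketApi.io.to('modbus-room').emit('modbus-client', req.body);
  res.status(201).json({ ok: true });
});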

Understanding microservices using Express.js and docker

I am new to node.js and docker, as well as the microservices architecture.
I am trying to understand what a microservices architecture actually is, and theoretically I do understand it. Please see the following implementation.
This is the index.js file:
var express = require("express");
var app = express();
var service1 = require("./service1");
var service2 = require("./service2");
app.use("/serviceonerequest",service1);
app.use("/servicetwo",service2);
app.listen(3000,function(){
console.log("listening on port 3000");
});
The file service1:
var express = require("express");
var router = express.Router();
router.use(express.json());
router.get("/",(req,res)=>{
//perform some service here
res.send("in the get method of service 1");
res.end();
});
router.post("/letsPost",(req,res)=>{
res.send(req.body);
res.end("in post method here");
})
module.exports = router;
The file service2:
var express = require("express");
var router = express.Router();
router.use(express.json());
router.get("/",(req,res)=>{
//perform some service here
res.end("in the GET method for service 2");
});
router.post("/postservice2",(req,res)=>{
res.send(req.body);
});
module.exports = router;
Does the above qualify as a 'microservice architecture', since there are two services and they can be accessed through the 'api-gateway' index.js?
I have read the basic tutorial of Docker. Is it possible to have the above three "modules" in separate containers?
If the above does not qualify as microservices, what should be done to convert this sample into microservices?
This does not really qualify as a microservice architecture.
The whole code you provided is small enough to be considered one single microservice (containing two routes), but this is not an example of a microservice architecture.
According to this definition:
"Microservices are small, autonomous services that work together"
Building Microservices <-- tip: you should read this book
For service1 and service2 to be considered microservices, each should be autonomous, which is not the case when you place them together in the same express app. For example: you can't restart one without affecting the other, you can't upgrade the version of service1 without also having to deploy service2, and they are not distributed in the sense that they can live on separate machines.
Actually I think you are missing the concept of a microservice architecture. Your services must be independent, and if they need to communicate with each other they must use a service discovery mechanism that returns a healthy instance of that service. Another pattern of the microservices architecture is that every single service must have an endpoint (/health) that returns the health status of the service; with this, your service discovery can check whether an instance is healthy and only return healthy instances (see the sketch below).
Microservices are not about technology; they are about the concept and implementing the right patterns. Otherwise you will have a chaotic architecture :D
If you want to understand the concepts I really recommend this book: http://shop.oreilly.com/product/0636920033158.do
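To make that concrete, each service would run as its own process (and eventually its own container) with its own server and a health endpoint. A minimal sketch of service1 as a standalone service (the port and the health route are assumptions; the handler body is carried over from the question):

// service1/index.js - service1 running as its own process, on its own port
var express = require("express");
var app = express();
app.use(express.json());

// health endpoint for service discovery / container health checks
app.get("/health", function (req, res) {
  res.json({ status: "ok" });
});

app.get("/", function (req, res) {
  res.send("in the get method of service 1");
});

app.listen(3001, function () {
  console.log("service1 listening on port 3001");
});

service2 would get the same treatment on its own port, and the index.js gateway (or a reverse proxy) would forward /serviceonerequest and /servicetwo to those processes over the network instead of requiring them in-process.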

NodeJS Express - Two NodeJS instances on same port (vhost)

I'm trying to run 2 instances of NodeJS on the same port and server, from different server.js files (different dir, config etc). My server provider informed me that a vhost is running for the other domain, hence the question: how do I handle this in a NodeJS Express app? I've tried to use vhost from https://github.com/expressjs/vhost like this:
const app = express();
const vhost = require('vhost');
app.use(vhost('example1.org', app));

// Start up the Node server
app.listen(4100, () => {
  console.log(`Node server listening on 4100`);
});
And for the second application like this:
const app = express();
const vhost = require('vhost');
app.use(vhost('example2.org', app));

// Start up the Node server
app.listen(4100, () => {
  console.log(`Node server listening on 4100`);
});
But when I try to run the second instance I get EADDRINUSE ::: 4100, so vhost doesn't work here.
Do you know how to fix it?
You can only have one process listening on a given port, not just in Node.js, but generally (with exceptions that don't apply here).
You can achieve what you need in one of two ways:
Combine the node apps
You could make the apps into one application, listen once, and then forward requests for each host to separate bits of code - if you still wanted code separation, the separate bits of code could be NPM modules that are written and maintained in isolation. For example:
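A minimal sketch of this approach, with one listener and the vhost middleware dispatching to a separate express app per domain (the domains and port come from your question; the route handlers are just placeholders):

const express = require('express');
const vhost = require('vhost');

// one express app per domain
const app1 = express();
app1.get('/', (req, res) => res.send('Hello from example1.org'));

const app2 = express();
app2.get('/', (req, res) => res.send('Hello from example2.org'));

// a single front app owns the port and dispatches on the Host header
const front = express();
front.use(vhost('example1.org', app1));
front.use(vhost('example2.org', app2));

front.listen(4100, () => {
  console.log('Node server listening on 4100');
});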
Use a webserver to proxy the requests
You could run the 2 node processes on some free port, say 5000 and 5001, and use a webserver to forward requests to it automatically based on host. I'd recommend Nginx for this, as its proxying capabilities are both relatively easy to set up, and powerful. It's also fairly good at not using too many system resources. Apache and others can also be used for this, but my personal preference would be Nginx.
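A minimal sketch of such an Nginx configuration, assuming the two Node processes listen on 5000 and 5001 as above (the server names are the domains from your question; the file path and header choices are just conventions):

# e.g. /etc/nginx/conf.d/node-apps.conf - one server block per domain
server {
    listen 80;
    server_name example1.org;

    location / {
        proxy_pass http://127.0.0.1:5000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}

server {
    listen 80;
    server_name example2.org;

    location / {
        proxy_pass http://127.0.0.1:5001;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}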
Conclusion
My recommendation would be to install a webserver and forward requests from the exposed port to the separately running node processes. I'd actually recommend running node behind a proxy by default for a project, and only exposing it directly in exceptional circumstances. You get a lot of configuration, security, and scalability benefits when your app sits behind a well-hardened server setup.

heroku: route subdirectory to a second node.js app?

I have a heroku node.js app running under the domain foo.com. I want to proxy all urls beginning with foo.com/bar/ to a second node.js process - but I want the process to be controlled within the same heroku app. Is this possible?
If not, is it possible to proxy a subdirectory to a second heroku app? I haven't been able to find much control over how to do routing outside of the web app's entry point. That is, I can easily control routing within node.js using Express for example, but that doesn't let me proxy to a different app.
My last resort is simply using a subdomain instead of a subdirectory, but I'd like to see if a subdirectory is possible first. Thanks!
Edit: I ended up solving my problem using http-proxy. I have two express servers listening on different ports, and a third externally facing server that routes to either of the two depending on the URL. Not ideal of course, but I couldn't get anything else to work. The wrap-app2 approach described below had some URL issues that I couldn't figure out.
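For reference, that front server can be something along these lines with node-http-proxy (the /bar prefix comes from the question; the internal ports and the exact routing test are assumptions):

// front.js - externally facing server that proxies by URL prefix
var http = require('http');
var httpProxy = require('http-proxy');

var proxy = httpProxy.createProxyServer({});

http.createServer(function (req, res) {
  // requests under /bar go to the second express server, everything else to the first
  if (req.url === '/bar' || req.url.indexOf('/bar/') === 0) {
    proxy.web(req, res, { target: 'http://127.0.0.1:5001' });
  } else {
    proxy.web(req, res, { target: 'http://127.0.0.1:5000' });
  }
}).listen(process.env.PORT || 3000);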
Just create a new express app and add a middleware in the main one to hand requests off to the secondary app when a request comes in for your desired path:
var app2 = express();

app2.use(function (req, res) {
  res.send('Hey, I\'m another express server');
});

app.use('/foo', app2);
I haven't tried it yet on Heroku, but it's the same process and doesn't create any new TCP binding or process, so it will work. For reference, a modified plain express template.
And if you really want another express process handling the connection, you need to use cluster. Check out the worker.send utility.
app.use('/foo', function (req, res) {
  // note: worker.send can only transfer the raw socket handle, not the res object itself
  worker.send('foo', res.socket);
});
This is possible. The most elegant way I can think of is by using clustering. 1 Heroku Dyno contains four cores, so you can run four worker processes under one node master process.
Here is an introduction to clustering.
What you're looking at is initializing two express apps (assuming you're using express) and serving them from two of those worker processes.
var cluster = require('cluster');

if (cluster.isMaster) {
  // let's make four child processes
  for (var i = 0; i < 4; i++) {
    if (i % 2 == 0) {
      cluster.fork(envForApp1);
    } else {
      cluster.fork(envForApp2);
    }
  }
} else {
  // refer to NODE_ENV and see whether this should be your app1 or app2
  // which should be started. This is passed from the fork() before.
  app.listen(8080);
}

Can I define Express routes in a child process?

So I run a bunch of little chatbots written in node, nothing too exciting. However, I recently decided to give them their own little web page to display information in a graphical manner. To do this, I figured I'd just run express.
However, I'm running my bots with a wrapper file that starts each chatbot as a child process, which makes using express a little tricky. Currently I'm starting the express server in the wrapper.js file like so:
var express = require("express");
var web = express();
web.listen(3001);
And then in the child processes, I'm doing this:
var express = require("express");
var web = express();
web.get("/urlforbot",function (req,res) {
res.send("Working!");
});
However, when I navigate to :3001/urlforbot, I get Cannot GET /urlforbot.
Any idea what I'm doing wrong and how to fix this?
Edit: This is my complete wrapper file: http://snippi.com/s/3vn56m2
Edit 2: This is what I'm doing now. I'm hosting each bot on its own port and storing that information in the configs. This is the code I'm using, and it appears to be working:
web.get("/"+cfg.route, function (req,res) { // forward the data
res.redirect('http://url.com:'+cfg.port+"/"+cfg.route);
});
Since your bots run as separate processes (any particular reason?), you have to treat each one as having to implement its own HTTP server with Express:
var express = require("express");
var web = express();
web.get("/urlforbot",function (req,res) {
res.send("Working!");
});
web.listen(UNIQUE_PORT_NUMBER);
Each bot process needs to listen on a unique port number; the port can't be shared.
Next, you need to map requests coming in on port 3001 in the 'master' process to the correct child process' Express server.
node-http-proxy has a useful option called a ProxyTable with which to create such a mapping, but it requires the master process to know what the endpoint (/urlforbot in your terms) is for each bot. It also requires the master to know which port each bot is listening on.
EDIT: alternatively, you can use child_process.fork to fork a new process for each of your bots, and communicate between them and the master process (port numbers and such, or even all the data required to generate the /urlforbot pages) using the comm channel that Node provides, but that still sounds like an overly complex setup.
Wouldn't it be possible to create a Bot class instead? You'd instantiate the class for each bot you want to run, and each instance would load its specific configuration and add its routes to the Express server, all from the same process. For example:
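A minimal sketch of that idea (the config shape, route names and port are assumptions for illustration):

// wrapper.js - all bots share one Express server in one process
var express = require("express");

function Bot(cfg) {
  this.name = cfg.name;
  this.route = cfg.route;
}

// each bot registers its own routes on the shared Express app
Bot.prototype.register = function (web) {
  var self = this;
  web.get("/" + self.route, function (req, res) {
    res.send(self.name + " working!");
  });
};

var web = express();

var configs = [
  { name: "bot1", route: "urlforbot" },
  { name: "bot2", route: "otherbot" }
];

// one instance per bot config, all in the same process
configs.forEach(function (cfg) {
  new Bot(cfg).register(web);
});

web.listen(3001);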
