Logging Middleware Microservice - node.js

I am required to save a log of each request and response made to the backend into a MySQL database. The issue is that we are migrating to a microservices architecture. The backend was built with Node.js and Express, and it has a middleware that does this task; currently, that middleware is attached to each microservice.
I would like to isolate this middleware as its own microservice, but I don't know how to redirect the traffic that way. This is how I would like to manage it:
I would like to do it this way because we can make changes or add features to the middleware without having to implement them in each microservice. This is the middleware's code:
const connection = require("../database/db");

const viewLog = (req, res, next) => {
  const oldWrite = res.write,
    oldEnd = res.end,
    chunks = [],
    now = new Date();

  // Wrap res.write to capture every chunk of the response body
  res.write = function (chunk) {
    chunks.push(chunk);
    oldWrite.apply(res, arguments);
  };

  // Wrap res.end to capture the last chunk and persist the log entry
  res.end = function (chunk, error) {
    if (chunk) chunks.push(chunk);
    const bodyRes = Buffer.concat(chunks).toString("utf8");
    connection.query("CALL HospitalGatifu.insertLog(?,?,?,?,?)", [
      `[${req.method}] - ${req.url}`,
      `${JSON.stringify(req.body) || "{}"}`,
      bodyRes,
      res.statusCode === 400 ? 1 : 0,
      now,
    ]);
    oldEnd.apply(res, arguments);
  };

  next();
};

module.exports = viewLog;
I think there might be a way to manage this with Nginx, which is the reverse proxy we are using. I would like some guidance on how to restructure the logging middleware.

Perhaps you might want to take a look at the sidecar pattern, which is used in microservice architectures for cross-cutting tasks like logging.
In short, a sidecar runs in a container beside your microservice container. One task of the sidecar can be intercepting network traffic and logging requests and responses (among many other possible tasks). The major advantage of this pattern is that you don't need to change any code in your microservices and you don't have to manage traffic redirection yourself; the latter is handled by the sidecar itself.
The disadvantage is that you are required to run your microservices containerized and use some kind of container orchestration solution. I assume this is the case since you are moving towards a microservices-based application.
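To make the interception idea concrete, here is a minimal sketch of what such a sidecar could look like in plain Node.js. Everything in it is an assumption for illustration (the target port, the listen port, and the console.log standing in for the MySQL insert); in practice you would more often deploy an off-the-shelf proxy next to the service container instead of writing your own.
// sidecar-logger.js — rough sketch: forwards every request to the service
// running next to it and logs method, URL, bodies and status code on the way through.
const http = require("http");

const TARGET_HOST = "127.0.0.1"; // the actual microservice (assumed address)
const TARGET_PORT = 3000;        // assumed port of the microservice container

http.createServer((clientReq, clientRes) => {
  const reqChunks = [];
  clientReq.on("data", (chunk) => reqChunks.push(chunk));

  const proxyReq = http.request(
    {
      host: TARGET_HOST,
      port: TARGET_PORT,
      method: clientReq.method,
      path: clientReq.url,
      headers: clientReq.headers,
    },
    (proxyRes) => {
      const resChunks = [];
      proxyRes.on("data", (chunk) => resChunks.push(chunk));
      proxyRes.on("end", () => {
        // Replace console.log with the MySQL insert from the original middleware
        console.log({
          route: `[${clientReq.method}] - ${clientReq.url}`,
          requestBody: Buffer.concat(reqChunks).toString("utf8"),
          responseBody: Buffer.concat(resChunks).toString("utf8"),
          statusCode: proxyRes.statusCode,
          at: new Date(),
        });
      });
      clientRes.writeHead(proxyRes.statusCode, proxyRes.headers);
      proxyRes.pipe(clientRes);
    }
  );

  clientReq.pipe(proxyReq);
}).listen(8080); // traffic is routed here instead of directly to the service
NGINX (or the orchestrator's service definition) then routes traffic to the sidecar's port instead of the service's, so the service code stays untouched.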
One question about the log service sitting between the web app and the NGINX server: what if the logging service goes down for some reason, is it acceptable for the entire application to go down?

Let me give you not exactly what you requested, but something to think about.
I can think of 3 solutions for the issue of logging in microservices; each one has its own advantages and disadvantages:
1. Create a shared library that handles the logs. I think it's the best choice in most cases. An article I wrote about shared libraries
2. You can create an API gateway; it is a great place for logic shared by all the requests. It will probably be more work, but it can then be reused for other shared logic. Further read (not written by me :) )
3. Create a log microservice (an option I personally don't like) that listens to a LogEvent or something like that; your microservices then publish this event whenever needed. A rough sketch of that follows below.
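For the third option, a minimal sketch of what the event-based flow could look like. The queue name, the RabbitMQ transport (amqplib), and the event shape are assumptions for illustration; any message broker would work the same way.
// log-service.js — consumes LogEvent messages and would write them to MySQL
const amqp = require("amqplib");

async function startLogService() {
  const conn = await amqp.connect("amqp://localhost"); // assumed broker address
  const channel = await conn.createChannel();
  await channel.assertQueue("log-events", { durable: true });

  channel.consume("log-events", (msg) => {
    const event = JSON.parse(msg.content.toString());
    // Here you would call the insertLog stored procedure instead of logging
    console.log("log event received:", event);
    channel.ack(msg);
  });
}

startLogService().catch(console.error);

// somewhere inside a microservice — publish a LogEvent instead of writing to the DB directly
async function publishLogEvent(channel, req, res, bodyRes) {
  const event = {
    route: `[${req.method}] - ${req.url}`,
    requestBody: JSON.stringify(req.body) || "{}",
    responseBody: bodyRes,
    isError: res.statusCode === 400 ? 1 : 0,
    at: new Date(),
  };
  channel.sendToQueue("log-events", Buffer.from(JSON.stringify(event)), { persistent: true });
}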

Related

Is it conceptually right to create a REST API using POST for every request?

I'm developing a REST API where, once you've bought a paid plan and received an apiKey, you can create up to a certain number of apps depending on the plan chosen. I'm using Node.js, and to handle requests I'm using the HTTPS module like this:
https.createServer(options, (req, res) => {
  res.writeHead(200);
  req.on('data', function (data) {
    let command = data.toString();
    var cmd = utils.getCommand(command);
    var cmdResult = "";
    switch (cmd.method) {
      case 'SIGNIN':
        cmdResult = auth.signin(cmd.parameters[0], cmd.parameters[1], cmd.parameters[2]);
        break;
      case 'LOGIN':
        cmdResult = auth.login(cmd.parameters[0], cmd.parameters[1], cmd.parameters[2]);
        break;
    }
    res.end(result.getResult(cmdResult));
  });
}).listen(11200);
For debugging I'm using curl like this:
curl -X POST -d 'SIGNIN|username|password|apikey' https://mycurrentipaddress:11200
The above command performs a sign-in to add a new app that uses the API; authentication works with the apiKey.
The API will offer an authentication service, a NoSQL DB based on JSON, and push notifications.
Is my idea and its current implementation conceptually correct, or does it not make sense?
No, that isn't REST, you should read this for a better understanding of that pattern.
That doesn't mean it won't work, but it's not a conventional pattern, which I think is what you're looking for.
There are any number of tutorials on the internet on how to write a RESTful API; just take the time to follow some. I'd also strongly suggest starting with a framework like Express.js rather than the Node standard libraries. You'll have plenty of opportunity to learn those in the future; starting with a framework is a much better path for learning, imo.
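As a starting point, here is a rough sketch of how the two commands from the question could be exposed as conventional REST endpoints with Express. The route names, the JSON body format, and the X-Api-Key header are assumptions made for illustration; auth is the hypothetical module from the question.
// A rough sketch, not "the" REST design: routes, body shape and header are assumptions
const express = require("express");
const auth = require("./auth"); // hypothetical module from the question

const app = express();
app.use(express.json()); // parse JSON bodies instead of a custom "CMD|a|b|c" format

// POST /users — create an account ("SIGNIN" in the original code)
app.post("/users", (req, res) => {
  const { username, password } = req.body;
  const apiKey = req.get("X-Api-Key");
  const result = auth.signin(username, password, apiKey);
  res.status(201).json(result);
});

// POST /sessions — log in ("LOGIN" in the original code)
app.post("/sessions", (req, res) => {
  const { username, password } = req.body;
  const apiKey = req.get("X-Api-Key");
  const result = auth.login(username, password, apiKey);
  res.json(result);
});

app.listen(11200);
With this shape, the curl call becomes an ordinary JSON request, e.g. curl -X POST -H 'X-Api-Key: yourkey' -H 'Content-Type: application/json' -d '{"username":"u","password":"p"}' https://mycurrentipaddress:11200/sessions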

Socket.io + REST API + REACT - is it better to separate socket.io from REST API

My question could be flagged as "opinion based", but I am wondering which approach is the best for my application, as I am able to do it both ways.
I am building a chat application in which users and conversations are saved in MongoDB. I will have my React application consuming the API/APIs. The question is: is it better to have the REST API and the Socket.io application running separately? For example:
Have the REST API running on port 3005
Have Socket.io running on port 3006
The React application consumes these two separately, and basically they will not know about each other. My REST API endpoints and socket.io events will be invoked only from the front-end.
On the other hand, I can have my socket.io application and REST API working together in one big application. I think it is possible to make that work without problems.
To sum up, at first glance I would take the first approach - cleaner and easier to maintain. But I would like to hear other opinions, or from somebody who had a similar project. How are things usually done in this kind of project when you have socket.io and a REST API?
I would check the pros and cons of both scenarios. For example, code and resource reusability is better if you have a single application, and you don't have to care about which versions are compatible with each other. On the other hand, one error can kill both applications, so from a fault-isolation perspective it is better to have separate applications. I think the decision depends on which pros and cons are important to you.
You can put the socket.io logic in a separate file like this:
// socket.mjs file
import { Server } from "socket.io"

let io = new Server()

const socketApi = {
  io: io
}

io.on('connection', (socket) => {
  console.log('client connected:', socket.id)
  socket.join('modbus-room')

  socket.on('app-server', data => {
    console.log('**************')
    console.log(data)
    io.to('modbus-room').emit('modbus-client', data)
  })

  socket.on('disconnect', (reason) => {
    console.log(reason)
  })
})

export default socketApi
and add it to your project like this:
// index.js or main file
// ...
import socketApi from "../socket.mjs";
// ...

/**
 * Create HTTP server.
 */
const server = http.createServer(app);
socketApi.io.attach(server);
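For the first approach (two separate servers), the front-end simply talks to each one independently. A small sketch, assuming the ports from the question and the event names from the snippet above (the /conversations route is made up):
// client-side sketch (e.g. inside a React effect) — REST and socket.io are consumed separately
import { io } from "socket.io-client";

// REST API on :3005, plain fetch calls
async function loadConversations() {
  const res = await fetch("http://localhost:3005/conversations");
  return res.json();
}

// Socket.io on :3006, a separate connection for real-time events
const socket = io("http://localhost:3006");
socket.on("modbus-client", (data) => {
  console.log("real-time update:", data);
});
socket.emit("app-server", { text: "hello" });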

Nodejs Parallel API Hits

I am trying to develop a Node.js application that acts as an API gateway / facade layer for services developed in Spring Boot.
Is it a good practice or not?
If yes, which Node.js framework should I use (Async / co / Promise / Async-Await, etc.)? I mean, what is mostly used in production environments currently?
"Is it a good practice or not?"
What is your question related to? Using an API gateway/facade? Using Spring Boot? Using async/await...? What exactly is your problem?
I guess you want to develop a Spring Boot based microservice architecture with a Node.js based API orchestrator as a front controller and single entry point?
Don't confuse the technical side of naive routing (load balancing with nginx, round robin, reverse proxy, etc.) to increase capacity, speed, availability, etc. with the semantic business integration of services through URL path mapping.
An API Orchestrator addresses the semantic abstraction and integration of an underlying service landscape. API Gateway vs. API Orchestrator!
In my personal view, using an API Orchestrator is an acceptable solution in conjunction with microservices. It is the easiest and most modest way to integrate and compose an underlying service layer.
Just to state a few positive and negative aspects:
Positive:
Single entry point for standard concerns such as authentication, security, session management, logging, etc.
Can also be started and managed as a microservice. Feel free to use a 3-tier layered architecture for the API orchestrator microservice.
Abstracts the complexity of the underlying microservice layer.
Negative:
Might become a god object.
In the context of microservices, the API orchestrator can end up handling too much of the business logic.
High coupling, complexity...
Design sketch of a Node.js based API orchestrator with HTTP communication:
Evaluate a (web) server (express.js, hapi.js, your-own-node-server)
Evaluate an HTTP-request API (axios, node-fetch, r2, your-own-http-api). The HTTP API should resolve to a promise object!
Example of an express.js based API Orchestrator:
const express = require('express');
const http = require('http');
const path = require('path');

const app = express();
const port = 3000;

// define middleware plugins in express.js for your API gateway, like session management ...
app.use(express.static(path.join(__dirname, 'public')));

// define business/use-case relevant semantic routes or commands, e.g. /getAllUsers, a REST URL, or /whatever
app.get('/whatever', (request, response) => {
  // consumes the whatever service
  const getWhatEverToGet = () => {
    return new Promise((resolve, reject) => {
      // connection data should be read from a service registry or from configuration management (process level, file level, environment level)
      const serviceReq = http.get({
        hostname: 'localhost',
        port: 3001,
        path: `/whatever_service_url`
      }, (res) => {
        // the built-in http.get() uses streams, so buffer the chunks and resolve once the response has ended
        let body = '';
        res.on('data', (chunk) => { body += chunk; });
        res.on('end', () => resolve(body));
      });
      serviceReq.on('error', reject);
    });
  };

  // Here you can consume more services with the same code; when they depend on each other, use async/await to share data in order...
  // consumes whatever2 service, returns a promise
  // consumes whatever3 service, returns a promise
  const respondWhatEverData = async () => {
    const whatEver = await getWhatEverToGet();
    response.send(whatEver);
  };

  // trigger the service composition
  respondWhatEverData();
});

app.listen(port, (err) => {
  if (err) {
    return console.log('Shit happens...', err);
  }
  console.log(`server listens on ${port}`);
});
TL;DR: If your Node.js application is only expected to forward requests to the Spring Boot application, then the Node.js setup would probably not be worth it. You should look at an Nginx reverse proxy, which can do all of that efficiently.
Async / co / Promise / async-await are not frameworks. Promise and async-await are programming constructs in Node.js; async and co are convenience libraries that made asynchronous code manageable before Promises and async-await were introduced. That said, there are multiple REST frameworks you could use to receive and pipe requests to your Spring Boot servers: take a look at Express.js, Restify, or Sails.js - all of them can add REST capabilities to Node.js. You will also need a REST client library (like axios or request, both of which support Promises) to be able to forward your requests to the target server.
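To illustrate that last point, a minimal sketch of such a forwarding route with Express and axios; the Spring Boot address and the /users path are assumptions for illustration:
// gateway.js — thin pass-through from the Node.js gateway to a Spring Boot service
const express = require("express");
const axios = require("axios");

const app = express();
const SPRING_BOOT_URL = "http://localhost:8080"; // hypothetical backend address

app.get("/api/users", async (req, res, next) => {
  try {
    // forward the call and hand the backend's JSON straight back to the caller
    const backendRes = await axios.get(`${SPRING_BOOT_URL}/users`, {
      params: req.query,
    });
    res.status(backendRes.status).json(backendRes.data);
  } catch (err) {
    next(err); // let Express' error handling deal with backend failures
  }
});

app.listen(3000, () => console.log("gateway listening on 3000"));
For anything that stays this thin, the Nginx reverse proxy mentioned above is usually the simpler option.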

Unique configuration per vhost for Micro

I have a few Zeit Micro services. This setup is a RESTful API for multiple frontends/domains/clients.
I need to differentiate between these clients in my configs, which are spread throughout the apps. In my handlers I can set up a process.env.CLIENT_ID, for example, which my config handler can use to know which config to load. However, this would mean launching a new http/micro process for each requesting domain (or whatever method I use - info such as the client id will probably come in a header) in order to maintain process.env.CLIENT_ID throughout the request and not have it overwritten by another simultaneous request from another client.
So each microservice would have to check the client ID, determine whether it has already launched a process for that client, use that one if so, and otherwise launch a new one.
This seems messy, but I'm not sure how else to handle things. Passing the client id around in code calls (i.e. getConfig(client, key)) is not practical in my situation and I would like to avoid it.
Options:
Pass the client id around everywhere
Launch a new process per host
?
Is there a better way, or have I made a mistake in my assumptions?
If the process-per-client approach is the better way, I am wondering if there is an existing solution to manage this? I've looked at http-proxy, micro-cluster, etc. but none of them seem to provide a solution to this issue.
Well I found this nice tool https://github.com/othiym23/node-continuation-local-storage
// Micro handler
const { createNamespace } = require('continuation-local-storage')

let namespace = createNamespace('foo')

const handler = async (req, res) => {
  const clientId = req.headers['client-id'] // placeholder: some header thing or host
  namespace.run(function () {
    namespace.set('clientId', clientId)
    someCode()
  })
}

// Some other file
const { getNamespace } = require('continuation-local-storage')

const someCode = () => {
  const namespace = getNamespace('foo')
  console.log(namespace.get('clientId'))
}
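As a side note, newer Node versions ship a built-in equivalent, AsyncLocalStorage from async_hooks, so the same per-request context can be kept without the external package. A rough sketch under the same header assumption:
const { AsyncLocalStorage } = require('async_hooks')

// In a real project, export clientContext from a shared module so other files can import it
const clientContext = new AsyncLocalStorage()

// HTTP handler: run the rest of the request inside a per-request store
const handler = async (req, res) => {
  const clientId = req.headers['client-id'] // placeholder, same assumption as above
  clientContext.run({ clientId }, () => {
    someCode()
  })
}

// Some other file: read the value for the current request
const someCode = () => {
  console.log(clientContext.getStore().clientId)
}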

Architecture for microservices

I've recently started to work with Node.js and I have to build an architecture that uses multiple Express.js services. Some of these services will have to be located on one server, others on other server machines. I want to build a base service (like an API gateway), but I don't know the proper way to communicate between this gateway and the microservices, or between two microservices.
Currently I'm working with a solution based on this:
# inside the Gateway server I call another service:
http.get('http://127.0.0.1:5001/users', (service_res) ->
  data = ''
  service_res.on 'data', (chunk) ->
    data += chunk
  service_res.on 'end', ->
    # some logic on data
).end()
I have a strong feeling that this approach is not right. What is the proper way to build the communication logic between the API gateway and the microservices?
The logic you have is not incorrect, but what would probably be better is to build a layer of abstraction on top of making requests to another service, e.g. from the API gateway to another microservice. Let's just call that microservice B for this instance (the API gateway makes a request to B).
B in this case should provide its own client describing how another service should interact with it, whether that's through HTTP or WebSockets; the protocol is up to B, because B understands how one should communicate with it. The argument for the client and the service being implemented together is that these two components should have a higher level of cohesion, since technically they are bound by a contract: if a request needs to be made to a service, it needs to adhere to the contract that the service requires.
In simple pseudocode with Express:
// implemented elsewhere, ideally next to the service that it communicates with
function BServiceClient() {
  // ...
}

// the API gateway's calling code
app.get('...', function (request, response, next) {
  // create an instance of the service client
  var bServiceClient = new BServiceClient();
  // retrieving the users from an abstracted endpoint
  bServiceClient.GetUsers();
  // do some processing and then render a response or call next
});
In order for it to be more testable, you might have to write your own wrapper around the app to do the proper dependency injection for injecting the client to make the routes more testable. Otherwise, you might be able to create another function that can inject the client and create the client at the handler level that calls the newly created function. The newly created function could then be tested. However, I prefer the former approach of using the wrapper. Hope this helps!
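For illustration, a rough sketch of what such a client module could look like; the /users endpoint, the JSON response, and the default host/port are assumptions, and in the answer's model B's team would publish this module alongside the service itself.
// b-service-client.js — hypothetical client shipped by service B's team
const http = require("http");

function BServiceClient(baseOptions = { hostname: "localhost", port: 5001 }) {
  this.options = baseOptions;
}

// Returns a promise with the parsed user list, hiding B's transport details
BServiceClient.prototype.GetUsers = function () {
  return new Promise((resolve, reject) => {
    http
      .get({ ...this.options, path: "/users" }, (res) => {
        let body = "";
        res.on("data", (chunk) => { body += chunk; });
        res.on("end", () => resolve(JSON.parse(body)));
      })
      .on("error", reject);
  });
};

module.exports = BServiceClient;
The gateway route then just awaits new BServiceClient().GetUsers() and never touches B's URLs or wire format directly.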
What I would do is:
Create a separate module for each microservice. Depending on which microservice you want to run, just have a route for it in Express.
Inject the modules you want into an instance of express().
Example + shameless plug - https://github.com/swarajgiri/express-bootstrap/blob/master/core/index.js
Disclaimer - The above solution is a highly opinionated way of solving your problem.
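To make that idea concrete, a loose sketch (the users route and the port are made up for illustration):
// users.js — one service's routes packaged as an express.Router module
const express = require("express");

const usersRouter = express.Router();
usersRouter.get("/", (req, res) => res.json([{ id: 1, name: "Ada" }]));

// index.js — inject only the modules this instance should expose
const app = express();
app.use("/users", usersRouter);
app.listen(3000);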
