Reusing Redis Connection: Socket Closed Unexpectedly - node-redis - node.js

First, let me tell you how I'm using my Redis connection in my Node.js application: I'm reusing one connection throughout the app via a singleton class.
class RDB {
  static async getClient() {
    if (this.client) {
      return this.client;
    }
    this.client = createClient({
      url: config.redis.uri
    });
    await this.client.connect();
    return this.client;
  }
}
From time to time - about once or twice a week - my application crashes with an error I can't explain:
Error: Socket closed unexpectedly
Now, my questions:
Is using Redis connections like this alright? Is there something wrong with my approach?
Why does this happen? Why is my socket closing unexpectedly?
Is there a way to catch this error (using my approach) or any other good practice for implementing Redis connections?

I solved this by attaching an 'error' listener. Simply listening for the event keeps the Node application from crashing.
client.on("error", function (error) {
  console.error(error);
  // I report it to a logging service like Sentry.
});
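Beyond logging, node-redis v4 also lets you control how the client recovers from a dropped socket via the `socket.reconnectStrategy` option. A minimal sketch - the helper name `reconnectDelay` and the backoff numbers are my own, not from the answer:

```javascript
// Hypothetical reconnect policy: exponential backoff capped at 5 seconds.
// node-redis v4 calls this with the retry count and waits the returned
// number of milliseconds before reconnecting.
function reconnectDelay(retries) {
  // 100ms, 200ms, 400ms, ... capped at 5000ms
  return Math.min(2 ** retries * 100, 5000);
}

// Wiring it up would look roughly like this:
// const client = createClient({
//   url: config.redis.uri,
//   socket: { reconnectStrategy: reconnectDelay }
// });
// client.on('error', (err) => console.error(err)); // still needed to avoid crashes
```

The 'error' listener is still required even with a reconnect strategy: reconnection handles recovery, the listener prevents the unhandled 'error' event from crashing the process.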

I had a similar "socket closed unexpectedly" issue. It started when I upgraded node-redis from 3.x to 4.x, and it went away after I upgraded my redis-server from 5.x to 6.x.

You should declare a private static member 'client' on the RDB class, like this:
private static client;
In a static method, 'this' may not be what you expect depending on how the method is called, so it is safer to reference the static class member explicitly:
RDB.client
It would also be better to check whether the client's connection is open, rather than simply checking that the client exists (assuming you are using the 'redis' npm library). Like this:
if (RDB.client && RDB.client.isOpen)
After these changes, your code should look like this:
class RDB {
  private static client;

  static async getClient() {
    if (RDB.client && RDB.client.isOpen) {
      return RDB.client;
    }
    RDB.client = createClient({
      url: config.redis.uri
    });
    await RDB.client.connect();
    return RDB.client;
  }
}
Note: the connect() method and isOpen property only exist in node-redis version ^4.0.0.

Related

Node memory leak while using redis brpop

I'm having a memory leak issue in a Node application. The application is subscribed to a topic in Redis, and on receiving a message it pops a message from a list using brpop. There are a number of instances of this application running in production, so any one instance might be blocking on a message in the Redis list. Here is the code snippet that consumes a message from Redis:
private doWork(): void {
  this.storage.subscribe("newRoom", (message: [any, any]) => {
    const [msg] = message;
    if (msg === "room") {
      return new Promise(async (resolve, reject) => {
        process.nextTick(async () => {
          // a promisified version of brpop with a timeout of 5s
          const roomIdData = await this.storage.brpop("newRoomList");
          if (roomIdData) {
            const roomId = roomIdData[1];
            this.createRoom(roomId);
          }
        });
        resolve();
      });
    }
  });
}
I've tried debugging the memory leaks using the Chrome debugger, and I've observed too many closure objects getting created. I suspect this code, as I can see the Redis client object name in the closure objects, but I'm not able to figure out how to fix it. I added process.nextTick, but it didn't help. I'm using the node-redis client for connecting to Redis. I'm attaching an object-retainer-map screenshot from the Chrome debugger tool.
P.S. blk is the redis client object name used exclusively for blocking commands i.e. brpop.
Edit: Replaced brpop with rpop and we're seeing a significant drop in the memory growth rate, but now the distribution of messages between the workers has become skewed.
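One pattern that avoids creating a fresh closure chain for every pub/sub notification is a single long-lived consumer loop on the dedicated blocking client. A rough sketch - all names here (`consumeRooms`, `handleRoom`, `isRunning`) are my own, not from the question, and the storage client is injected:

```javascript
// Hypothetical consumer loop: one closure per loop iteration instead of one
// per incoming "newRoom" message. brpop blocks up to its timeout and resolves
// null when nothing arrived, so the loop naturally idles between messages.
async function consumeRooms(storage, handleRoom, isRunning) {
  while (isRunning()) {
    const roomIdData = await storage.brpop("newRoomList");
    if (roomIdData) {
      handleRoom(roomIdData[1]); // roomIdData is [listName, value]
    }
  }
}
```

With a loop like this, the pub/sub subscription no longer needs to trigger a brpop per message at all; the loop drains the list continuously, which also sidesteps the per-message closures visible in the retainer map.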

ETIMEOUT error with servicebus connection while using topics

I am creating 5 connections to Service Bus and putting them in an array. As new messages keep coming in, I take one connection from the array and use it to send the message. When I start the service and run a load test, it works fine. If I leave the service idle for some time and then run the same load test again, it starts throwing this error:
connect ETIMEDOUT xxx.xxx.xxx.xxx
    at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1191:14)
I am not sure whether caching and reusing the connections like this is a good idea and is what causes the issue, or whether it's something else.
let serviceBusConnectionArray = [];
let executed = false;
let serviceBusService;
let count = 0;
let MAX_CONNECTIONS = 5;

class ServiceBus {
  static createConnections() {
    if (!executed) {
      for (let i = 0; i < MAX_CONNECTIONS; i++) {
        serviceBusConnectionArray.push(
          azure.createServiceBusService(SERVICEBUS_CONNECTION_STRING)
            .withFilter(new azure.ExponentialRetryPolicyFilter())
        );
      }
      executed = true;
    }
  }
  static getConnectionString() {
    ServiceBus.createConnections();
    if (count < MAX_CONNECTIONS) {
      return serviceBusConnectionArray[count++];
    } else {
      count = 0;
      return serviceBusConnectionArray[count];
    }
  }
  static putMessageToServiceBus(topicName, message) {
    return new Promise((resolve, reject) => {
      serviceBusService = ServiceBus.getConnectionString();
      serviceBusService.sendTopicMessage(topicName, message, function (error) {
        if (error) {
          log.error('Error in putting message to service bus, message: %s', error.stack);
          reject(error);
          return;
        }
        resolve('Message added');
      });
    });
  }
}
I am not sure which route I should take now to resolve these timeout errors.
Looking into the source code of azure-sdk-for-node, specifically these lines, in order:
servicebusservice.js#L455
servicebusservice.js#L496
serviceclient.js#L190
The SDK is just performing REST requests against the Service Bus REST API, so I don't think pooling those objects really helps.
The timeout seems to be a genuine timeout raised at that point in time by the request npm module used by the SDK.
You could try the newer SDK, which uses AMQP under the hood to connect to Service Bus. Note that this SDK is in preview.
As PramodValavala-MSFT mentioned in the other answer about the @azure/service-bus SDK, major version 7.0.0 of the @azure/service-bus SDK (which was in preview), which depends on AMQP, has been released recently.
Each instance of ServiceBusClient represents a connection, all the methods under ServiceBusClient use the same connection.
@azure/service-bus - 7.0.0
Samples for 7.0.0
Guide to migrate from @azure/service-bus v1 to v7
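Since each ServiceBusClient owns one AMQP connection shared by all senders created from it, the array of 5 connections can be replaced by a single lazily created client. A sketch of that pattern - the helper names are my own, and the factory is injected for illustration (with @azure/service-bus v7 it would be `() => new ServiceBusClient(SERVICEBUS_CONNECTION_STRING)`):

```javascript
// Hypothetical single-connection helper: the client is created once on first
// use and every subsequent call returns the same instance, so all sends share
// one AMQP connection.
let sharedClient = null;

function getSharedClient(factory) {
  if (!sharedClient) {
    sharedClient = factory(); // created once, reused for all sends
  }
  return sharedClient;
}
```

With the real SDK, sending would then look roughly like `const sender = getSharedClient(factory).createSender(topicName); await sender.sendMessages({ body: message });` - senders can also be cached per topic rather than recreated per message.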

Error: Redis connection to localhost:6379 failed - getaddrinfo EMFILE localhost:6379

I am getting the below error.
Error: Redis connection to localhost:6379 failed - getaddrinfo EMFILE localhost:6379
at Object.exports._errnoException (util.js:870:11)
at errnoException (dns.js:32:15)
at GetAddrInfoReqWrap.onlookup [as oncomplete] (dns.js:78:26)
The application uses Node.js with a MySQL database and Redis.
There are too many requests fetching data from MySQL, so the data is cached for 2 minutes and kept in sync with the DB. When a new request arrives, we first check Redis: if the data is found, it is served from Redis; otherwise it is retrieved from MySQL, cached in Redis, and sent as the response. This keeps happening.
After an hour or two the server crashes with the above error.
As of now pm2 is used, which restarts the server, but I need to know the reason for the crash.
Redis installation followed the instructions from here.
https://www.digitalocean.com/community/tutorials/how-to-install-and-use-redis
Please let me know how to solve this issue.
Redis Connection File Code
var Promise = require('bluebird');
var redisClient; // Global (avoids duplicate connections)

module.exports = {
  OpenRedisConnection: function () {
    if (redisClient == null) {
      redisClient = require("redis").createClient(6379, 'localhost');
      redisClient.selected_db = 1;
    }
  },
  GetRedisMultiConnection: function () {
    return require("redis").createClient(6379, 'localhost').multi();
  },
  IsRedisConnectionOpened: function () {
    if (redisClient && redisClient.connected == true) {
      return true;
    } else {
      if (redisClient)
        redisClient.end(); // End and open once more
      module.exports.OpenRedisConnection();
      return true;
    }
  }
};
What I usually do with code like this is write a very thin module that loads in the correct Redis driver and returns a valid handle with a minimum of fuss:
var Redis = require("redis");

module.exports = {
  open: function () {
    var client = Redis.createClient(6379, 'localhost');
    client.selected_db = 1;
    return client;
  },
  close: function (client) {
    client.quit();
  }
};
Any code in your Node application that needs a Redis handle acquires one on demand, and it's understood that the code must close it no matter what happens. If you're not catching errors, or you're catching errors and skipping the close, you'll "leak" open Redis handles and your app will eventually crash.
So, for example:
var Redis = require('./redis'); // Path to module defined above

function doStuff() {
  var redis = Redis.open();
  thing.action().then(function () {
    redis.ping();
  }).finally(function () {
    // This code runs no matter what, even if there's an exception or error
    Redis.close(redis);
  });
}
Due to the concurrent nature of Node, a single Redis handle shared by many different parts of your code will be trouble, and I'd strongly advise against that approach.
To expand on this template you'd have a JSON configuration file that can override which port and server to connect to. That's really easy to require and use instead of the defaults here. It also means that you don't have to hack around with any actual code when you deploy your application to another system.
You can also expand on the wrapper module to keep active connections in a small pool to avoid closing and then immediately opening a new one. With a little bit of attention you can even check that these handles are in a sane state, such as not stuck in the middle of a MULTI transaction, before handing them out, by doing a PING and testing for an immediate response. This weeds out stale/dead connections as well.
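The pooling-with-validation idea above can be sketched in a few lines. This is a toy illustration, not production code: the `open` and `ping` operations are injected, and `makePool`, `acquire`, and `release` are names I've made up for the sketch:

```javascript
// Hypothetical tiny pool: reuses idle handles, validates each candidate with
// a ping before handing it out, and opens a fresh handle when none pass.
function makePool(open, ping) {
  var idle = [];
  return {
    acquire: function () {
      while (idle.length > 0) {
        var client = idle.pop();
        if (ping(client)) return client; // healthy: reuse it
        // stale/dead handle: drop it and try the next one
      }
      return open(); // nothing usable in the pool: open a new handle
    },
    release: function (client) {
      idle.push(client);
    }
  };
}
```

A real version would make `ping` asynchronous (send PING, wait briefly for PONG) and cap the idle list's size so the pool itself can't hoard file descriptors - which is exactly the resource the EMFILE error says ran out.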

Sharing server port with modules which update client via socket.io

I am implementing a Node server which, in addition to serving pages, consists of a set of sub-modules used to report data via socket.io. Each module is fairly independent of the core server: each has a timer that processes some data and emits the results back to the web client(s). But due to how the code/modules are structured, I'm running into a problem with how the port/connection to the server is used and shared, and I was wondering whether there is a recommended pattern for this.
The server has a very basic setup and then requires the modules:
var app = require('http').createServer(handler);
app.listen(8888);
function handler (req,res) { ... }
// Here's where the sub-processing happens
var module1 = require('./module1.js');
var module2 = require('./module2.js');
...
var moduleN = require('./moduleN.js');
Then each module has the following structure:
// Socket stuff
var io = require('socket.io').listen(port); // Not sure how to share server port???
io.sockets.on('connection', onConnect);
function onConnect(socket) { ... }
function sendUpdateToClients(type, data) {
  io.sockets.emit(type, data);
}

// Timed stuff
setInterval(someProcessing, someInterval);
function someProcessing() {
  ... // process some data here
  sendUpdateToClients(type, data); // Emit the data to clients
}
I currently have this code running separately from my MEAN application while I try to figure out how best to organize the code.
I guess my questions are:
- What are best practices for organizing sub-modules that push updates to clients?
- Should I be passing a socket reference from the server to each module? If so, how would this best be done?
- Or should I be returning something from the modules back to the server, so it does the updating? If so, how would this best be done?
- Should each module use their own port, separate from the server port?
- Or does this whole organization of code suck and is there a better way?
You can export your module as a function.
// module1.js
module.exports = function (port) {
  var io = require('socket.io').listen(port); // Not sure how to share server port???
  io.sockets.on('connection', onConnect);
  function onConnect(socket) { ... }
  function sendUpdateToClients(type, data) {
    io.sockets.emit(type, data);
  }
  // Timed stuff
  setInterval(someProcessing, someInterval);
  function someProcessing() {
    ... // process some data here
    sendUpdateToClients(type, data); // Emit the data to clients
  }
};
Then inside your server file:
// server.js
var module1 = require('./module1.js')(8888);

Correct usage of events in NodeJs - Concerning "this" context

I am designing a communication server in Node that handles incoming messages (sent by client1) and transfers them to someone else (client2), who answers the message and sends the answer back, via the server, to client1.
The communication happens via WebSockets, which implies an open connection from each client to the server.
Thus I implemented a ConnectionManager to which I can register any new connections when a new client comes online. Every connection gets assigned a messageQueue in which all incoming messages are cached before processing.
At the end of processing, I have a ServerTaskManager, which generates output tasks for the server, telling it which message to send and which receiver should get it.
This ServerTaskManager emits a Node event (it inherits from EventEmitter) upon registering a new serverTask, to which the server listens.
Now I would like my ConnectionManager to also listen to the serverTaskManager's event, in order to make it push the next message in the messageQueue into processing.
The problem is that I can catch the ServerTaskManager event within the ConnectionManager just fine, but, of course, "this" within the listener is the ServerTaskManager, not the ConnectionManager. Thus calling any of the ConnectionManager's functions via "this.someFunction()" won't work.
Here is some code:
/**
 * ServerTaskManager - Constructor
 * Implements Singleton pattern.
 */
function ServerTaskManager() {
  var __instance;
  ServerTaskManager = function ServerTaskManager() {
    return __instance;
  };
  ServerTaskManager.prototype = this;
  __instance = new ServerTaskManager();
  __instance.constructor = ServerTaskManager;
  return __instance;
}
util.inherits(ServerTaskManager, EventEmitter);

/**
 * ConnectionManager - Constructor
 * Also implements Singleton pattern.
 */
function ConnectionManager() {
  var __instance;
  ConnectionManager = function ConnectionManager() {
    return __instance;
  };
  ConnectionManager.prototype = this;
  __instance = new ConnectionManager();
  __instance.constructor = ConnectionManager;
  __instance.currentConnections = [];
  // Listen for new serverInstructions on the serverTaskManager
  serverTaskManager.on('newInstruction', function (messageObject, currentReceiver) {
    this.processNextMessage(currentReceiver);
  });
  return __instance;
}
util.inherits(ConnectionManager, EventEmitter);
Now when I run this and the 'newInstruction' event is triggered by the serverTaskManager, Node throws:
TypeError: Object #<ServerTaskManager> has no method 'processNextMessage'
Which is of course true. The function I want to call belongs to the ConnectionManager:
/**
 * Starts processing the next message
 *
 * @param connectionId (int) - The ID of the connection whose next message to process.
 */
ConnectionManager.prototype.processNextMessage = function (connectionId) {
  // Some code...
};
So obviously, when listening to the ServerTaskManager's event, "this" within the listener is the ServerTaskManager. How do I call my ConnectionManager's function from within the listener?
I hope I am not completely misled by how events and listeners and/or prototypical extensions work (in Node). This project is by far the most advanced that I have worked on in JavaScript. Normally I am only coding PHP with a little bit of client side JS.
Thx in advance for any hints!
Worp
Like this.
serverTaskManager.on('newInstruction', function (messageObject, currentReceiver) {
  ConnectionManager.processNextMessage(currentReceiver);
});
Or like this.
serverTaskManager.on('newInstruction', function (messageObject, currentReceiver) {
  ConnectionManager().processNextMessage(currentReceiver);
});
PS: your question is unnecessarily long. When posting code, don't necessarily post your exact code; it is much easier to get quality responses if you boil it down to the simplest form that exhibits the behavior you are seeing.
