SocketIO connection stops sending data after 4-5 hours - node.js

I have developed an application with ReactJS, ExpressJS, MongoDB and SocketIO.
I have two servers: Server A and Server B.
The socket server is hosted on Server A and the application is hosted on Server B.
Server B connects to Server A's socket as a client.
The main job of the Server A socket is to fetch data from Server A's MongoDB database and emit it.
Everything works as expected, but after 4 to 6 hours it stops emitting data even though the socket connection still works.
I have checked using
socket.on('connection', function(){
    console.log("Connected");
});
I can't figure out what is wrong with the code.
My code : https://jsfiddle.net/ymqxo31d/
Can anyone help me out with this?

I had some programming errors.
I was fetching data from MongoDB inside setInterval(), so after a little while it exhausted resources and the database connection started failing every time.
First, I created a single MongoDB connection and reused it everywhere it was needed.
Second, I replaced setInterval with setTimeout as shown below. (NOTE: if I keep using setInterval, it fires on the defined interval with no indication of whether the data was actually emitted [this also causes the heavy resource usage], but I only want to emit the data to the socket once it has been fetched successfully.)
setTimeout(emitData, 1000);

function emitData(){
    db.collection.find({}).toArray(function(err, data){
        if (err) {
            // schedule a retry instead of silently stopping the loop
            return setTimeout(emitData, 1000);
        }
        socket.emit('updateData', data);
        setTimeout(emitData, 1000);
    });
}
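For the first fix, a minimal sketch of what the single shared MongoDB connection could look like; the module layout, connection URL and database name are placeholders rather than the original code, and it assumes the 2.x MongoDB Node driver, where the connect callback yields the db handle:

// db.js - open one MongoClient connection at startup and reuse it everywhere
var MongoClient = require('mongodb').MongoClient;

var db = null;

function connect(callback){
    if (db) return callback(null, db); // reuse the already-open connection
    MongoClient.connect('mongodb://localhost:27017/mydb', function(err, database){
        if (err) return callback(err);
        db = database; // cache the handle for every later caller
        callback(null, db);
    });
}

module.exports = { connect: connect };

emitData can then ask this module for the cached handle instead of opening a new connection on every tick.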

Related

Run socket with firebase function api (same port creating issue)

-> APIs are running on Firebase Functions
-> the socket is running on another server
so I cannot use both on the same port, but I want to run the socket with a Firebase Function.
I am using:
firebase-functions: 3.14.1
node version: 16
socket.io version: 4.5.1
Actually, socket.io does not work with Cloud Functions. Cloud Functions have the following properties that make them incompatible with long-lived socket connections:
The maximum duration of a Cloud Function is 9 minutes. The socket will be forcibly closed after that time, which runs counter to socket.io's normal expectation of keeping a connection alive indefinitely.
Cloud Functions read the entire contents of the request, and only then write the entire contents of the response. There is only one full round trip; a client cannot "chat back and forth" with the function over the connection.

Redis Error "max number of clients reached"

I am running a NodeJS application using the forever npm module.
The Node application also connects to a Redis DB for cache checks. Quite often the API stops working with the following error in the forever log:
{ ReplyError: Ready check failed: ERR max number of clients reached
at parseError (/home/myapp/core/node_modules/redis/node_modules/redis-parser/lib/parser.js:193:12)
at parseType (/home/myapp/core/node_modules/redis/node_modules/redis-parser/lib/parser.js:303:14)
at JavascriptRedisParser.execute (/home/myapp/ecore/node_modules/redis/node_modules/redis-parser/lib/parser.js:563:20) command: 'INFO', code: 'ERR' }
When I execute the client list command on the Redis server, it shows too many open connections. I have also set timeout = 3600 in my Redis configuration.
I do not have any unclosed Redis connection objects in my application code.
This happens once or twice a week depending on the application load; as a stop-gap solution I restart the node server (it works).
What could be the permanent solution in this case?
I have figured out why. This has nothing to do with Redis. Increasing the OS file descriptor limit was just a temporary solution. I was using Redis in a web application, and a connection was created for every new request.
Whenever the server was restarted, all the connections held by the Express server were released.
I solved this by creating a global connection object and reusing it. A new connection is created only when necessary.
You could do so by creating a global connection object, making the connection once, and making sure it is still connected every time before you use it. Check whether there is an already coded solution for your programming language. In my case it was Perl with the Dancer framework, and I used a module called Dancer2::Plugin::Redis.
redis_plugin
Returns a Dancer2::Plugin::Redis instance. You can use redis_plugin to
pass the plugin instance to 3rd party modules (backend api) so you can
access the existing Redis connection there. You will need to access
the actual methods of the plugin instance.
If you are not running a web server and are instead running a worker process or any background-job process, you could use a simple helper function like this to re-use the connection.
perl example
sub get_redis_connection {
    my $redis = Redis->new(server => "www.example.com:6372", debug => 0);
    $redis->auth('abcdefghijklmnop');
    return $redis;
}
...
## when required
unless ($redisclient->ping) {
    warn "creating new redis connection";
    $redisclient = get_redis_connection();
}
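For a Node.js app like the one in the question, the equivalent pattern is a small module that creates one client and hands the same instance to every caller. This is only a sketch against the pre-v4 node redis API, with a placeholder host and port:

// redisClient.js - one shared client for the whole process
var redis = require('redis');

var client = redis.createClient({ host: '127.0.0.1', port: 6379 });

client.on('error', function(err){
    console.error('Redis error:', err);
});

// every require() of this file receives the same client instance
module.exports = client;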
I was running into this issue in my chat app because I was creating a new Redis instance each time something connected rather than just creating it once.
// THE WRONG WAY
export const getRedisPubSub = () => new RedisPubSub({
  subscriber: new Redis(REDIS_CONNECTION_CONFIG),
  publisher: new Redis(REDIS_CONNECTION_CONFIG),
});
and where I wanted to use the connection I was calling
// THE WRONG WAY
getRedisPubSub();
I fixed it by just creating the connection once when my app loaded.
export const redisPubSub = new RedisPubSub({
  subscriber: new Redis(REDIS_CONNECTION_CONFIG),
  publisher: new Redis(REDIS_CONNECTION_CONFIG),
});
and then I passed the one-time initialized redisPubSub object to my createServer function.
It was this article here that helped me see my error: https://docs.upstash.com/troubleshooting/max_concurrent_connections

When should I call done() in node-postgres?

pg 4.4.3
I'm using socket.io to connect the client side to the server. I guessed I was supposed to connect the server to the database on server start, but there are a lot of warnings in the pg "docs": 'use done() or bad things will happen'.
When should I use it? If I open a connection to the db, then create the socket.io server within it, and then call done() after each query, I get the following error after 30 seconds of idle time:
Error: This socket has been ended by the other party
Maybe I should create socket.io and then open a connection to the db within each user session? Or open a connection to the db on each query if one isn't currently open? To be honest, I don't get why I should do this; why can't I just create a single connection to the database on server start and send all queries through it instead of this open-close repetition?
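For reference, the pattern the pg 4.x docs are describing is checking a client out of the built-in pool and handing it back with done() once the query has finished. A minimal sketch, assuming a placeholder connection string:

var pg = require('pg');
var conString = 'postgres://user:pass@localhost/mydb'; // placeholder

// pg.connect checks a client out of the pool; done() returns it
pg.connect(conString, function(err, client, done){
    if (err) return console.error('error fetching client from pool', err);
    client.query('SELECT 1 AS n', function(err, result){
        done(); // release the client back to the pool instead of holding it open
        if (err) return console.error('query error', err);
        console.log(result.rows[0].n);
    });
});

The pool keeps the underlying connections alive between checkouts, so calling done() per query does not mean reconnecting to Postgres each time.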

node-amqp — proper way to handle connection in Express app?

I'm using Express with node-amqp. My goal is to create the amqpConnection and save it to a global object before the server starts, and then use the previously created globals.amqp_connection in Express routes.
### server.coffee
app.use( ... )
...
# create RabbitMQ connection before server starts
connection = require("amqp").createConnection
host: "localhost"
connection.on "ready", ->
console.log "Got connection.on(\"ready\") event from node-amqp..."
# make amqp_connection accessible from any point of application
globals.amqp_connection = connection
server = globals.http.createServer(app).listen(8082)
console.log "Express server is listening 8082..."
The problem is that the connection.on "ready" event fires every time I call a route. I would suppose that's because of the Express way of serving HTTP requests: executing server.js for every route called. So, for every request, a new instance of the connection is created, and on its "ready" event the app tries to create one more instance of the Express server.
How do I make amqp_connection accessible from any point of my app without duplicating require("amqp").createConnection() at every point where I need to push something to RabbitMQ?
UPD: or maybe there is no problem with Express. node-amqp seems to fire the ready event every second after creation. I don't know if that's correct behavior.
Thank you.
The ready event fired multiple times because of a connection error:
https://github.com/postwait/node-amqp/issues/255
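If the duplicate ready events cannot be avoided, one defensive workaround (not taken from the linked issue, just a common guard) is to react to ready only once, so reconnects never try to start the Express server again. A rough plain-JS sketch reusing the globals and app objects from the question:

var amqp = require('amqp');

var connection = amqp.createConnection({ host: 'localhost' });

// only the first "ready" event starts the server and publishes the connection
connection.once('ready', function(){
    console.log('node-amqp connection ready');
    globals.amqp_connection = connection;
    globals.http.createServer(app).listen(8082);
});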

Node JS and Socket.IO. Client getting disconnected. During request

I've run into a fairly difficult-to-debug error with my node web server.
Background
I'm creating a node server with socket.io to provide a RESTful service, connected to MongoDB, and using web sockets (socket.io) for server-client messages.
Issue
In my node app I've used an npm package called node-scheduler, in which I do some processing at set times (these are very dynamic times, but it has worked fairly well to date).
So I'll set off a job using node-scheduler, and when it ends you can provide a callback function.
In this function I emit a web socket message, exactly the way I emit messages in the rest of the application, but my client side never receives the message.
Checking the logs, the client disconnects and then reconnects after the function has completed.
I've debugged a little further: I send two messages to the client in this function, and only one of them is processed by the client. It may be a client issue rather than a server issue.
Any ideas for solutions or suggestions would greatly be appreciated.
Well, generally socket.io is only meant to be used as a "channel". You should have the client exist as a separate entity in memory or something, and update its socket if and when it reconnects. Otherwise you're just sending to past (disconnected) sockets.
Using passport you can identify a client as a user.
app.get('/', function(req, res){
    // req.user;
});
Using passport.socketio you can get the same user in your socket
io.on('connection', function(socket){
    // socket.request.user;
    socket.request.user.socket = socket;
    // this will be updated with the latest socket in case of a future reconnection,
    // so now you can be sure that the user object will always have the latest socket
    nodeScheduler(function(){
        carryOutJobs(function callback(){
            socket.request.user.socket.emit('done');
            // will always emit to the "latest" socket.
        });
    });
});
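For completeness, socket.request.user is only populated when the passport.socketio middleware is wired up. A rough sketch of that setup; the cookie key, secret and session store below are placeholders, not part of the answer:

var passportSocketIo = require('passport.socketio');
var cookieParser = require('cookie-parser');

// expose the same session/passport user on socket.request.user
io.use(passportSocketIo.authorize({
    cookieParser: cookieParser,   // the same cookie parser Express uses
    key: 'connect.sid',           // session cookie name (placeholder)
    secret: 'session-secret',     // session secret (placeholder)
    store: sessionStore           // the Express session store instance
}));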
