Flush all active requests to mongodb from node before closing - node.js

I want to finish all active requests to MongoDB from Node when I need to gracefully shut down the application (Ctrl+C / exceptions). I have tried using the .close() method, which closes the connection just fine, but it does not seem to wait for active requests to complete. The solutions I have seen appear to just keep a counter of live requests. Is there a more elegant solution to this problem?
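For what it's worth, the counter-based approach mentioned above can be wrapped up fairly cleanly. A minimal sketch — the `tracker` and `client` names in the shutdown hook are placeholders, not actual driver API:

```javascript
// Tracks in-flight operations so shutdown can wait for them to drain
// before closing the MongoDB connection. A sketch, not a full library.
class InFlightTracker {
  constructor() {
    this.active = 0;
    this.waiters = [];
  }

  // Wrap any driver call so it is counted while it runs.
  async track(promise) {
    this.active += 1;
    try {
      return await promise;
    } finally {
      this.active -= 1;
      if (this.active === 0) {
        this.waiters.forEach((resolve) => resolve());
        this.waiters = [];
      }
    }
  }

  // Resolves once no tracked operations remain.
  drain() {
    if (this.active === 0) return Promise.resolve();
    return new Promise((resolve) => this.waiters.push(resolve));
  }
}

// Shutdown hook (Ctrl+C); `client` stands in for a real MongoClient:
// process.once('SIGINT', async () => {
//   await tracker.drain(); // let active requests finish
//   await client.close();  // then close the connection
//   process.exit(0);
// });
```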

Related

I am getting a connection reset error when trying to execute two queries at the same time in ArangoDB

I am using arangojs v6.14.1 and ArangoDB version 3.6.4. I also have a Node.js Express app that is intended to serve client requests.
I am experiencing an issue when executing concurrent requests: the database connection hangs up when I process client requests concurrently.
What does the app do when it receives a request?
Open a database connection -
db = new Database(dbConfig.url);
db.useDatabase(dbConfig.name);
db.useBasicAuth(dbConfig.username, dbConfig.password);
There are multiple middleware functions that need database access to perform various functions and accessibility checks. For each middleware I tried to:
open a new database connection ->
perform the action ->
close the connection ->
return the result to the next middleware.
This works fine with a single request at a time. But if I try to call two APIs at the same time, I get a CONNECTIONRESET error, and it also throws a socket hang-up error.
I tried commenting out the close-connection call and it worked fine for a while, but when I increased the number of connections it showed the same CONNECTIONRESET error again.
I have searched the arangojs documentation for anything about connection handling, but I haven't found any information on it.
Any help would be deeply appreciated.
This might be too late for you, but I had a similar issue. After talking with ArangoDB support, it turned out to be an issue with sockets.
Looking at the output from netstat -aplnt I was seeing a lot of connections in the CLOSE_WAIT and TIME_WAIT states, indicating that there were A LOT of connections that were not being closed properly.
The solution was to enable persistent connections (see arangojs config for keep-alive and maxSockets) and then re-using a single connection whenever possible. This was more than just a settings change, requiring some code modifications as well.
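For reference, the persistent-connection setup looks roughly like this in arangojs v6 (`agentOptions` with `keepAlive` and `maxSockets` is from that version's config; the URL, database name, and credentials are placeholders). The key change is creating one Database instance and sharing it, instead of one per middleware:

```javascript
// connection.js - one shared connection for the whole app (arangojs v6).
const { Database } = require('arangojs');

const db = new Database({
  url: 'http://127.0.0.1:8529',
  agentOptions: {
    keepAlive: true, // reuse sockets instead of opening a new one per call
    maxSockets: 16,  // upper bound on concurrent sockets to the server
  },
});
db.useDatabase('myDatabase');
db.useBasicAuth('user', 'password');

module.exports = db; // every middleware requires this same instance
```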

NodeJS Not Exiting, How to find open handles?

I have a fairly simple NodeJS script that is not exiting gracefully when done (this is a worker, and I want to fire it up using a cron job / Heroku scheduler).
Once it has finished its task it just sits there waiting. It doesn't use Express or a web server, and as far as I can tell I have resolved all callbacks.
Is there some way to tell what Node is still working on / what handles are open?
My imported libraries are
Request, Q & Mongoose.
You definitely need to close the mongodb connections that mongoose has open on your behalf by calling mongoose.disconnect(). Have you tried that?
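As for seeing what is keeping the process alive, Node has introspection hooks for this (note the leading underscores — these are undocumented internals, useful for debugging but not a stable API):

```javascript
// Lists the handles and requests currently keeping the event loop
// alive (open sockets, timers, etc.). Undocumented Node internals.
function dumpOpenHandles() {
  const handles = process._getActiveHandles();
  const requests = process._getActiveRequests();
  console.log(`${handles.length} open handle(s):`,
    handles.map((h) => h.constructor.name));
  console.log(`${requests.length} pending request(s)`);
  return { handles, requests };
}
```

Calling this after the worker "finishes" will typically show the sockets that mongoose.disconnect() would close. The wtfnode package wraps the same idea with friendlier output.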

Does MongoDB automatically disconnect when the Node server closes?

I just started using MongoDB. One thing that confuses me: I hear it is good to open your MongoDB connection on initiation and re-use that connection throughout your application.
However, should I ever explicitly close the MongoDB connection eventually? Or does MongoDB implicitly close the connection when the Node server goes down?
Unless explicitly closed, the connection will be kept open in the event loop until the process terminates. So if you intend your app to maintain an open connection to MongoDB throughout its life cycle, there is no need to explicitly close it; that will happen automatically when the process is terminated.
Now, if you're writing a command-line script, you should close the connection explicitly, otherwise the open socket will keep your process from terminating.
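The command-script case can be sketched like this — the `connect` argument is a stand-in for a real driver call such as `MongoClient.connect`, not the actual API:

```javascript
// Runs a one-off job and guarantees the connection is closed even if
// the job throws. Without the close, the open socket would keep the
// process from terminating.
async function runScript(connect, job) {
  const client = await connect();
  try {
    return await job(client);
  } finally {
    await client.close();
  }
}
```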

Detecting if the connection to the browser client is broken

I have a web site which uses a long poll to wait for the server to finish processing some data. However, a timeout might occur, or the user might close his browser, yet the server keeps processing its data.
I want the server to stop processing as soon as the long-poll connection is broken: there is no client left to receive the data, so there is no point in letting this long process keep running. How can I do this?
The server is adding files to a ZIP archive, which takes some time since these are reasonably big files. Once it's done, it sends the final ZIP file and closes the connection. But if the client disconnects before the task is finished, the server should stop its work and discard everything.
You should consider using the SignalR framework. It offers convenient events like OnConnected() and OnDisconnected(). Under the hood it works with
WebSockets
Server Sent Events
Forever Frame
Long polling
It uses whatever is available with the given environment, starting with WebSockets.

Is it possible to pause cherrypy server in order to update static files / db without stopping it?

I have an internal CherryPy server that serves static files and answers XML-RPC requests. All works fine, but 1-2 times a day I need to update these static files and the database. Of course I can just stop the server, run the update, and start the server again, but this is not very clean, since all other code that communicates with the server via XML-RPC will see disconnects, and users will see "can't connect" in their browsers. It also adds complexity: I need some external start / stop / update code, while all updates could perfectly well be done within the CherryPy server itself.
Is it possible to somehow "pause" CherryPy programmatically, so that it serves a static "busy" page while I update the data, without the fear that someone is downloading file A from the server right now and I update file B which he wants next, so he gets mismatched file versions?
I have tried to implement this programmatically, but there is a problem: CherryPy is multithreaded (and this is good), so even if I introduce a global "busy" flag, I need some way to wait for all threads to finish their current tasks before I can update the data. I can't find such a way :(
CherryPy's engine controls such things. When you call engine.stop(), the HTTP server shuts down, but first it waits for existing requests to complete. This mode is designed to allow for debugging to occur while not serving requests. See this state machine diagram. Note that stop is not the same as exit, which really stops everything and exits the process.
You could call stop, then manually start up an HTTP server again with a different app to serve a "busy" page, then make your edits, then stop the interim server, then call engine.start() and engine.block() again and be on your way. Note that this will mean a certain amount of downtime as the current requests finish and the new HTTP server takes over listening on the socket, but that will guarantee all current requests are done before you start making changes.
Alternately, you could write a bit of WSGI middleware which usually passes requests through unchanged, but when tripped returns a "busy" page. Current requests would still be allowed to complete, so there might be a period in which you're not sure if your edits will affect requests that are in progress. How to write WSGI middleware doesn't fit very well in an SO reply; search for resources like this one. When you're ready to hook it up in CherryPy, see http://docs.cherrypy.org/dev/concepts/config.html#wsgi
