Catching HTTP response event - node.js

I am building a Node web application (using Express) and, to keep everything as DRY as possible, I automatically acquire a database connection for each request, but I also need to release it (back to the pool).
I could do that manually in every route, but that would just pollute the code. I figured I should listen for server events and release the connection once the response has finished generating or sending. What options do I have? Which events are triggered when a response is complete, and on which object (app, request, response, or something else)?

res inherits from EventEmitter, so you can use its default events as well as emit your own.
One of the most common is the finish event (in Node 0.8.12 and older, end):
res.on('finish', function() {
// the response has been fully handed off to the OS for sending
});
You can attach that listener for all requests if you want, or only for specific ones, early in their route handlers.
Additionally, you can create middleware that acquires the DB connection and attaches the finish listener (and possibly timeouts). Then all you need to do is add that middleware to your routes, keeping repetitive code to a minimum.
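As a sketch of that middleware idea (the pool's acquire/release API here is hypothetical; real pools such as generic-pool are usually asynchronous):

```javascript
// Middleware that acquires a DB connection per request and releases it
// once the response has been sent. The pool API (acquire/release) is a
// hypothetical stand-in for whatever pool library you use.
function dbMiddleware(pool) {
  return function (req, res, next) {
    const conn = pool.acquire();      // get a connection for this request
    req.db = conn;                    // make it available to route handlers
    res.on('finish', function () {    // fires once the response has been sent
      pool.release(conn);             // return the connection to the pool
    });
    next();
  };
}

// app.use(dbMiddleware(pool));
```

Mounting it with app.use() makes the acquire/release pair automatic for every route, which is exactly the DRY behavior the question asks for.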
In the meantime, I would recommend not creating a DB connection for each request, as it is a big overhead and an unnecessary use of sockets and IO handles. You can freely use one single connection, and it will be just as efficient from a performance point of view.

Related

How do I send multiple request to let the client know that upload and processing has finished?

I'm trying to figure out how I can send multiple requests to let the client know that uploading and processing has finished.
For example: Let the client know that upload has finished, and processing has started. When processing has been finished, send another request to notify the client.
I only want to know the proper functions to use, because it seems that two res.write() calls won't send anything until I call res.end()...
You actually want to inform the client about updates that happen asynchronously on the server / in the backend.
Off the top of my head, there are a few ways to get asynchronous information to the client:
Open a stream and push updates via the stream to the client (for example, using server-sent events)
Open a websocket and push your messages to the client (you need to manage the HTTP and websocket connections to write to the correct one)
Create a new route to let clients subscribe to information about the status of a job (some example code)
I'd select one of these depending on your current client design. If you already have something like websockets available, I'd use that, but it's quite a lot to set up if you don't. Streaming might not work in every browser, but it is quite easy to build. For the third option, you'll probably need more housekeeping to avoid memory leaks when a client disconnects.

Node JS Socket.io: Is it possible to wait for reply from other side of connection after an emit()?

Is anyone familiar with ways to set up Socket.io to wait for a response from the other side of the connection, after emitting a message?
For example:
Client (A) emits an event to Server (B), and waits for an event emitted from Server (B) in response to this, before continuing. If no response is received in X seconds, execute a timeout function or return a default value.
Usage case: similar to socketio-auth, where a client connects to the socket.io server and must authenticate by emitting an 'authenticate' event within X seconds, or gets disconnected. However, I'm seeking a more general case where I can create a function like awaitReply() which will emit an event along with data, and wait for another event emitted by the remote server, and return the data included with that event.
I realize a HTTP request/response cycle may be a better way to accomplish this, but it seems it may be possible with Promises: do a setTimeout() and simply reject the promise if the timeout expires, and create an io.on() listener to resolve the promise when the awaited event fires locally.
One thing I'm not sure about: how to turn off the io.on() listener after the event fires. Yes there's an io.off() function, but I don't want to erase all events associated with that event name - just the one created for that particular call. And there may be multiple calls being awaited simultaneously for the same event - don't want to delete all the listeners when one of them executes.
Any ideas? Thanks!
If you block while waiting for the reply, you are effectively performing a synchronous long poll, which defeats the purpose of socket.io.
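That said, the Promise-based approach described in the question can be done without blocking. A sketch of such an awaitReply() helper: it relies on socket.once() for a one-shot listener (automatically removed after firing) and socket.off(event, handler) to remove only that specific handler on timeout, which addresses the concern about erasing other listeners for the same event name. Event names here are examples; the "socket" only needs the standard EventEmitter-style once/off/emit methods:

```javascript
// Emit requestEvent, then wait for replyEvent or time out.
function awaitReply(socket, requestEvent, replyEvent, data, timeoutMs) {
  return new Promise(function (resolve, reject) {
    function onReply(replyData) {
      clearTimeout(timer);             // reply arrived in time
      resolve(replyData);
    }
    const timer = setTimeout(function () {
      socket.off(replyEvent, onReply); // remove only THIS listener
      reject(new Error('timed out waiting for ' + replyEvent));
    }, timeoutMs);

    socket.once(replyEvent, onReply);  // one-shot: auto-removed after firing
    socket.emit(requestEvent, data);
  });
}
```

Note that socket.io also has built-in acknowledgements (pass a callback as the last argument to emit), which cover exactly this request/response pattern and may be simpler than a hand-rolled helper.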

Why can't I use res.json() twice in one post request?

I've got a chatbot app where I want to send one message, e.g. res.json("Hello"), from Express, then another message later, e.g. res.json("How are you doing"), but I want to run some code between the two.
My code seems to have problems with this: when I delete the first res.json(), the second one works fine and doesn't cause any problems.
Looking in my Heroku logs, I get a lot of gobbledygook from the server, with an IncomingMessage = {} containing readableState and Server objects, when I include both res.json() calls.
Any help would be much appreciated.
HTTP is request/response: the client sends a request, and the server sends ONE response. Your first res.json() is that one response; you can't send another response to the same request. If it's just a matter of gathering data, rethink your code so it collects everything first and then sends the single response.
But, what you appear to be looking for is "server push" where the server can send data to the client continually whenever it wants to. The usual solution for that is a webSocket connection (or socket.io which is built on top of webSocket and adds more features).
In the webSocket/socket.io architecture, the client makes a connection to the server and the connection is kept open indefinitely. Then either side of the connection can send messages to the other end. This is most useful when the server wants to "push" data to the client at any time. In this case, the client establishes the connection, then the server can send data over that connection whenever it wants. The client registers a listener for incoming messages and will be notified any time the server sends it some data.
Both webSocket and socket.io are fully supported in modern browsers and in node.js. I would personally recommend using socket.io because some of the features it adds (a messaging layer, auto-reconnect, etc...) are very useful.
To use a continuously connected socket like this, you will have to make sure your hosting infrastructure is properly configured to allow it.
res.json() always sends the response to the client immediately (calling it again will cause an error). If you need to gradually build up a response then you can progressively decorate a plain old javascript object; for example, appending items to an array. When you are done call res.json() with the constructed response.
But you should post your code so we can see what's happening.
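A sketch of the "collect first, respond once" approach; doSomeWork is a hypothetical stand-in for the code you want to run between the two messages:

```javascript
// Hypothetical async step between the two messages.
function doSomeWork() {
  return Promise.resolve();
}

// Build up a plain object, then send ONE response with res.json().
function chatHandler(req, res) {
  const reply = { messages: ['Hello'] };
  return doSomeWork().then(function () {
    reply.messages.push('How are you doing');
    res.json(reply); // the single response for this request
  });
}

// app.post('/chat', chatHandler);
```

The client then receives both messages in one payload; if the second message must arrive noticeably later, that's the server-push case the other answer covers.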

middleware in mongoose vs replica set for things like sending emails/sms

Which is the better approach for things like sending emails and SMS (for account verification), notifications, etc., from a Node.js application?
As per my knowledge there can be two approaches.
Execute a function after save that does this — one can use mongoose middleware, like a post-save hook.
Simulate triggers with the help of a replica set in MongoDB and run it through background jobs.
I think the second approach is better because it will be executed by some other process in the background. But on the other hand, Node.js is asynchronous, so maybe it handles this kind of thing smartly. Any ideas?
In short: should SMS/email notifications after user registration be sent by Node.js middleware or by a background process?
As far as I know, a background process can be driven by binding a listener to the oplog.
The best approach for triggering an sms/email service request after saving the document in mongoDB would be through some messaging queue.
I would recommend using RabbitMQ. It will separate the process of sending sms/email from your req/res loop. Invoke it after the save succeeds and the message will be added to the queue and handled by a completely different process. You can simply return the success result of the save without waiting for the message worker's response.
The messaging queue comes with many more features like delivery acknowledgement, scalability options, apis and guis for managing and monitoring the status of actions performed.
You can either set it up on the same server or deploy on a separate server according to the traffic.
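A sketch of that decoupling idea; publishToQueue here is an in-memory stand-in for a real broker call such as amqplib's channel.sendToQueue, and the message shape is an example:

```javascript
// In-memory stand-in for a real message broker (e.g. RabbitMQ via amqplib).
const queue = [];
function publishToQueue(message) {
  queue.push(JSON.stringify(message)); // a real broker would persist/route this
}

// Save the user, enqueue a notification job, and return immediately.
// A separate worker process consumes the queue and sends the sms/email.
function registerUser(user, save) {
  return save(user).then(function (saved) {
    publishToQueue({ type: 'account-verification', userId: saved.id });
    return saved; // respond without waiting for the worker
  });
}
```

The request/response cycle only pays for the save plus an enqueue; delivery retries, acknowledgements, and slow SMTP servers all stay in the worker.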
As of MongoDB 3.6, you can do exactly this via Change Streams: a new API that lets you trigger all sorts of actions based on events happening in the database.

Node.js - what exactly happens upon a new client request

I want to understand more exactly what happens when a server receives a client request on a Node.js server. With a more traditional server, a new thread is created to handle the new client session.
But in Node.js and other event-loop style servers, what exactly happens? What part of the codebase first gets executed? With node, I am almost certain something in the http module handles the new request first.
I want to know a little more about the details of how this works in a sort of compare and contrast style between the two types of handling of client connections.
In a nutshell:
Node uses libuv to manage incoming connections and data events
Events are placed in a queue to be handled on the next tick of the event loop
When bytes start arriving, they are fed into the native-code HTTP parser
The parser calls a callback in JS-land with the header contents
The rest of the JS HTTP code dispatches the request to user code, which may be Express
