What is the best way to handle webhook APIs in Hapi.js? - webhooks

A common webhook-style API practice is for the API consumer to receive the webhook request, respond with 204 No Content, close the connection, and then process the request. Hapi.js does not send the reply until nextTick, which means the handler function must return first.
Moving the request processing into a separate function and calling it with nextTick still causes the processing to occur before the reply is sent. setTimeout works, but it has negative performance implications.
Hapi's request extensions seemed like an option, but are attached to all requests, regardless of path, which adds an unnecessary overhead to other requests.
Express.js has a response.send() method that immediately sends the response. Ideally, Hapi would have something like this.

A solution is to use setImmediate for any processing that should happen after the response is sent and connection closed. Hapi sends the reply on nextTick. setImmediate events will be processed after nextTick events.
See example: https://gist.github.com/jeremiahlee/3689e8b4d1513c375b1e
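The ordering guarantee the answer relies on can be verified in plain Node (a minimal sketch; in an actual Hapi handler, the webhook processing would go inside the setImmediate callback):

```javascript
// process.nextTick callbacks (where Hapi flushes the reply) always run
// before setImmediate callbacks (where the webhook processing is deferred).
const order = [];

process.nextTick(() => order.push("reply sent (nextTick)"));
setImmediate(() => order.push("webhook processed (setImmediate)"));

setImmediate(() => {
  console.log(order);
  // → [ 'reply sent (nextTick)', 'webhook processed (setImmediate)' ]
});
```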

Related

Letting an unresolved async function execute on a nodejs server

I have a "recover password" option on a website. Sometimes my sendMail function takes a few seconds to execute. I want to call the async sendMail without awaiting it and return something like "check your inbox in a few seconds".
Will the server keep running the method or once it responds the method will be terminated?
And will whatever happens happen regardless of whether the server is behind Apache, Nginx, etc.?
Yes, the sendMail method will keep running. An incoming http request is just one asynchronous operation that nodejs can process. It can have thousands of others going on too. In this case, your sendMail() operation would be one additional asynchronous operation carrying on and it will finish without any regard for when the http request finishes.
Remember, nodejs is NOT a threaded architecture where each http request is a new thread that gets terminated at some point. Instead, it's a non-blocking, event driven, asynchronous architecture and you can start as many other asynchronous operations as you want and they will run to completion regardless of what happens with the http request.
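A minimal sketch of that fire-and-forget pattern (Express-style handler; `sendMail` here is a stand-in for the real mailer):

```javascript
// Stand-in for the real (possibly slow) mail call.
function sendMail(to) {
  return new Promise((resolve) => setTimeout(resolve, 50));
}

// Express-style handler: kick off sendMail but do not await it.
function handleRecoverPassword(req, res) {
  sendMail(req.body.email).catch((err) => {
    // The response is long gone by now; logging is all that's left to do.
    console.error("sendMail failed:", err);
  });

  res.send("Check your inbox in a few seconds.");
}
```

The `.catch()` matters: since nothing awaits the promise, an unhandled rejection would otherwise crash newer Node versions.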

Firebase functions: how best to await unblocking promises after response is sent?

I'm using Express in Firebase Functions that are on Node 10. My question is: if I have heavy Promises that I need to complete before the function terminates, but I want to do this after I sent the response (res.status(200).send...), how is the best way I could do this?
I have come up with two approaches so far but neither seem great (and I haven't tried the 'finish' approach yet to see if it works):
1. For every route, add a finally... clause that awaits the accumulated promises. This has the major downside that it would be easy to forget to do this when adding a new route, and it might have bugs if the route already needs its own finally clause.
2. Use the Node stream 'finish' event, https://nodejs.org/docs/latest-v10.x/api/stream.html#stream_event_finish ... but I don't know whether Firebase Functions allows this to be caught, or whether it will still be caught if my function runs to its end and the event fires afterwards.
I'm doing this to try to make my functions' responses faster. Thanks for any help.
if I have heavy Promises that I need to complete before the function terminates, but I want to do this after I sent the response (res.status(200).send...), how is the best way I could do this?
With Cloud Functions, it's not possible to send a response before promises are resolved, and have the work from those promises complete normally. When you send the response, Cloud Functions assumes that's the very last thing the function will do. When the response is sent using res.send(), Cloud Functions will terminate the function and shut down the code. Any incomplete asynchronous work might never finish.
If you want to continue work after a response is sent, you will have to first offload that work to some other component or service that is not in the critical path of the function. One option is to send a message to a pubsub function, and let that function finish in the "background" while the first function sends the response and terminates. You could also send messages to other services, such as App Engine or Compute Engine, or even kick off work in other clouds to finish asynchronously.
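An in-process sketch of that offload pattern (illustrative only; in a real deployment, `publish()` would be a Cloud Pub/Sub publish and the subscriber would be a separate pubsub-triggered function):

```javascript
// Toy message queue standing in for Pub/Sub.
const queue = [];
const workers = [];

function publish(message) {
  queue.push(message);
  setImmediate(drain); // hand off; the publisher returns immediately
}

function subscribe(worker) {
  workers.push(worker);
}

function drain() {
  while (queue.length > 0) {
    const msg = queue.shift();
    for (const w of workers) w(msg);
  }
}

// HTTP function: publish, then respond; it does not wait for the work.
function httpHandler(req, res) {
  publish({ task: "heavy", payload: req.body });
  res.status(200).send("accepted");
}
```

The point of the pattern is that the HTTP function's response does not depend on the heavy work finishing; only the hand-off to the queue happens before `res.send()`.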
See also:
Continue execution after sending response (Cloud Functions for Firebase)
How to exec code after res.send() in express server at Firebase cloudFunctions

Node JS Socket.io: Is it possible to wait for reply from other side of connection after an emit()?

Is anyone familiar with ways to set up Socket.io to wait for a response from the other side of the connection, after emitting a message?
For example:
Client (A) emits an event to Server (B), and waits for an event emitted from Server (B) in response to this, before continuing. If no response is received in X seconds, execute a timeout function or return a default value.
Usage case: similar to socketio-auth, where a client connects to the socket.io server and must authenticate by emitting an 'authenticate' event within X seconds, or gets disconnected. However, I'm seeking a more general case where I can create a function like awaitReply() which will emit an event along with data, and wait for another event emitted by the remote server, and return the data included with that event.
I realize a HTTP request/response cycle may be a better way to accomplish this, but it seems it may be possible with Promises: do a setTimeout() and simply reject the promise if the timeout expires, and create an io.on() listener to resolve the promise when the awaited event fires locally.
One thing I'm not sure about: how to turn off the io.on() listener after the event fires. Yes there's an io.off() function, but I don't want to erase all events associated with that event name - just the one created for that particular call. And there may be multiple calls being awaited simultaneously for the same event - don't want to delete all the listeners when one of them executes.
Any ideas? Thanks!
Doing this means you are effectively performing a synchronous long poll, which defeats the purpose of socket.io.

How to handle slow http request in Node JS?

I'm new to Node.js and I'm interested in the event loop, but there's one issue that confuses me.
If one HTTP request takes 10 seconds to handle, must all other requests wait for the slow request?
Short answer: No, other requests will be handled in parallel. They will not wait for the response to the first request.
Long answer
1. Your code makes a new request.
2. Node.js adds a new callback to the pending callbacks list. This callback will run once the response from the server arrives.
3. Node.js processes other code, timers, callbacks, etc.
4. 10 seconds pass. Your code receives the answer and finishes processing.
In NodeJS, most of your asynchronous tasks (like http requests, db access, etc.) will use callbacks or promises. What that means is that the event loop is free to handle other tasks until the callback is invoked or the promise resolves.
You can read more about the event loop here in the NodeJS documentation.
The event loop can only be slowed during an HTTP request when you perform a huge synchronous operation, or when you send a file to a client without using the stream API. Reading a huge file with fs.readFile buffers the entire file in memory before anything can be sent, whereas the stream API sends the file in chunks. To avoid tying up your server, read large files with the stream API, for example fs.createReadStream instead of fs.readFile, depending on the size of the file.

Potential pitfalls in node/express to have callbacks complete AFTER a response is returned to the client?

I want to write a callback that takes a bit of time to complete an external IO operation, but I do not want it to interfere when sending data back to the client. I don't care about waiting for callback completion for purposes of the reply back to the client, but if the callback results in an error, I would like to log it. About 80% of executions will result in this callback executing after the response is sent back to the client and the connection is closed.
My approach works well and I have not seen any problems, but I would like to know whether there are any pitfalls in this approach that I may be unaware of. I would think that node's evented IO would handle this without issue, but I want to make sure before I commit this architecture to production. Any issues that should make me reconsider this approach?
As long as you're not trying to reference that response object after the response is sent, this will not cause any problems. There's nothing special about a request handler that cares one bit about callbacks in its code being invoked after the response is generated.
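A minimal sketch of the pattern the question describes (Express-style handler; `slowExternalIO` stands in for the real external call):

```javascript
// Stand-in for the real external I/O; pretend it takes a while.
function slowExternalIO(payload, cb) {
  setImmediate(() => cb(null));
}

// Express-style handler: reply first, then let the callback finish on
// its own. Its only post-response job is logging errors.
function webhookHandler(req, res) {
  res.sendStatus(204); // response sent; the connection can close

  slowExternalIO(req.body, (err) => {
    if (err) console.error("post-response I/O failed:", err);
    // safe as long as nothing here touches `res`
  });
}
```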
