Firebase functions: how best to await unblocking promises after response is sent? - node.js

I'm using Express in Firebase Functions on Node 10. My question: if I have heavy Promises that need to complete before the function terminates, but I want them to run after I've sent the response (res.status(200).send...), what is the best way to do this?
I have come up with two approaches so far, but neither seems great (and I haven't tried the 'finish' approach yet to see if it works):
For every route handler, add a finally... clause that awaits the accumulated promises (sketched below). This has the major downside that it would be easy to forget when adding a new route, and it might have bugs if the route already needs its own finally clause
Use the Node stream 'finish' event (https://nodejs.org/docs/latest-v10.x/api/stream.html#stream_event_finish), but I don't know whether Firebase Functions lets this be caught, or whether it will still be caught if my function runs to its end and the event fires afterwards.
I'm doing this to try to make my functions respond faster. Thanks for any help.
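For reference, a minimal sketch of the first approach in Express, with hypothetical names (pendingPromises, doHeavyWork); as the answer below explains, Cloud Functions may terminate the instance once the response is sent, so work awaited this way isn't guaranteed to finish there.

    const express = require('express');
    const app = express();
    app.use(express.json());

    // Hypothetical stand-in for the heavy, slow work described above.
    function doHeavyWork(payload) {
      return new Promise((resolve) => setTimeout(() => resolve(payload), 5000));
    }

    app.post('/example', async (req, res) => {
      const pendingPromises = [];
      try {
        pendingPromises.push(doHeavyWork(req.body));
        res.status(200).send({ ok: true }); // respond before the heavy work settles
      } finally {
        // Await the accumulated work after the response has gone out.
        await Promise.all(pendingPromises).catch((err) => console.error(err));
      }
    });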

if I have heavy Promises that need to complete before the function terminates, but I want them to run after I've sent the response (res.status(200).send...), what is the best way to do this?
With Cloud Functions, it's not possible to send a response before promises are resolved and still have the work from those promises complete normally. When you send the response, Cloud Functions assumes that's the very last thing the function will do. Once the response is sent with res.send(), Cloud Functions terminates the function and shuts the instance down, so any incomplete asynchronous work might never finish.
If you want to continue work after a response is sent, you will have to first offload that work to some other component or service that is not in the critical path of the function. One option is to publish a message that triggers a Pub/Sub function, and let that function finish the work in the "background" while the first function sends the response and terminates (sketched below). You could also send messages to other services, such as App Engine or Compute Engine, or even kick off work in other clouds to finish asynchronously.
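A minimal sketch of that Pub/Sub hand-off, assuming the firebase-functions and @google-cloud/pubsub SDKs; the topic name 'heavy-work' and doHeavyWork are hypothetical:

    const functions = require('firebase-functions');
    const { PubSub } = require('@google-cloud/pubsub');
    const pubsub = new PubSub();

    // HTTP function: queue the work, then respond immediately.
    exports.api = functions.https.onRequest(async (req, res) => {
      await pubsub
        .topic('heavy-work')
        .publish(Buffer.from(JSON.stringify(req.body)));
      res.status(200).send({ queued: true });
    });

    // Pub/Sub-triggered function: does the heavy work in the "background".
    exports.heavyWorker = functions.pubsub
      .topic('heavy-work')
      .onPublish(async (message) => {
        const payload = message.json; // parsed from the published JSON
        await doHeavyWork(payload);   // hypothetical slow task, safely awaited here
      });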
See also:
Continue execution after sending response (Cloud Functions for Firebase)
How to exec code after res.send() in express server at Firebase cloudFunctions

Related

Letting an unresolved async function keep executing on a Node.js server

I have a "recover password" option on a website. Sometimes my sendMail function takes a few seconds to execute. I want to call this async sendMail without waiting and return something like "check you inbox in a few seconds".
Will the server keep running the method, or will the method be terminated once the server responds?
And will whatever happens surely happen the same way regardless of whether it's behind Apache, Nginx, etc.?
Yes, the sendMail method will keep running. An incoming http request is just one asynchronous operation that nodejs can process. It can have thousands of others going on too. In this case, your sendMail() operation would be one additional asynchronous operation carrying on and it will finish without any regard for when the http request finishes.
Remember, nodejs is NOT a threaded architecture where each http request is a new thread that gets terminated at some point. Instead, it's a non-blocking, event driven, asynchronous architecture and you can start as many other asynchronous operations as you want and they will run to completion regardless of what happens with the http request.
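A minimal sketch of that pattern in Express (sendMail is hypothetical and assumed to return a Promise): respond right away, and attach a .catch so a failed send still gets logged.

    const express = require('express');
    const app = express();
    app.use(express.json());

    app.post('/recover-password', (req, res) => {
      // Kick off the email without awaiting it; the event loop keeps it running.
      sendMail(req.body.email).catch((err) => {
        console.error('sendMail failed:', err);
      });
      res.send('Check your inbox in a few seconds.');
    });

Note that the earlier answer still applies on Cloud Functions, where the instance may be shut down once the response is sent.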

Is there a way to trigger Firebase onUpdate() inside an HTTP endpoint?

This is my situation:
Client starts a login operation, has no way of knowing status or getting a response
Login operation has a Cloud Function callback so it saves the login status in the Realtime DB
Client polls a different Cloud Function to check whether the login status has been written to a specific node in the Realtime DB (the key is a UUID)
I've been trying to write the last function with promise intervals, but it feels off, and I've started wondering whether I can use onUpdate() inside my HTTP endpoint.
Metacode of my idea:
user = ref.child(uuid)
user.onUpdate((update) => res.send(update.status))
From what I've seen in the docs and tutorials, onUpdate seems like something you use to deploy a function directly (since it returns a CloudFunction), so is there a way to use it as above?
If not, is there a way to do something similar in an HTTP endpoint?
You're trying to make an asynchronous operation synchronous, which is not usually a great idea in Cloud Functions.
I instead would:
Return a unique ID/location in the database to the client in their initial call.
Then have the client wait until a response appears in the database location.
And the Cloud Function responding to the auth completion can then write to that location.
The key difference with your approach is that #2 watches a database location instead of polling a Cloud Function. This approach is shown in this gist, which has code for waiting on a value on various platforms.
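A minimal sketch of that flow (the loginStatus path and field names are hypothetical), with the Admin SDK on the function side and the web SDK's on('value') listener on the client side:

    // Cloud Function (Admin SDK): create a pending record and return its location.
    const functions = require('firebase-functions');
    const admin = require('firebase-admin');
    admin.initializeApp();

    exports.startLogin = functions.https.onRequest(async (req, res) => {
      const statusRef = admin.database().ref('loginStatus').push();
      await statusRef.set({ status: 'pending' });
      res.send({ statusPath: `loginStatus/${statusRef.key}` });
    });

    // The function that reacts to the auth completion later writes the result, e.g.:
    //   admin.database().ref(statusPath).update({ status: 'ok' });

    // Client (browser, Firebase web SDK v8-style API), where statusPath comes from
    // the HTTP response above: watch the location instead of polling.
    firebase.database().ref(statusPath).on('value', (snapshot) => {
      const value = snapshot.val();
      if (value && value.status !== 'pending') {
        // login result is available here
      }
    });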

How to handle slow http request in Node JS?

I'm new to Node.js and I'm interested in the event loop, but I have a confusing question about it.
If one HTTP request takes 10 seconds to handle, must all other requests wait for the slow request?
Short answer: No, other requests will be handled in parallel. They will not wait for the response to the first request.
Long answer
Your code makes a new request.
Node.js adds a new callback to the pending callbacks list. This callback will run after the response from the server arrives.
Node.js goes on processing other code, timers, callbacks, etc.
10 seconds pass. Your code receives the response and finishes processing.
In NodeJS, most of your asynchronous tasks (like http requests, db access, etc.) will use callbacks or promises. What that means is that the event loop is free to handle other tasks until the callback is invoked or the promise resolves.
You can read more about the event loop here in the NodeJS documentation.
The event loop can only be slowed during an HTTP request when you are doing a huge synchronous operation, or when you are trying to send a file to a client without using the stream API. Reading a huge file with fs.readFile effectively pauses things until the entire file is read, whereas the stream API sends the file in chunks. To avoid slowing down your server, you should read files with the stream API, for example fs.createReadStream instead of fs.readFile, depending on the size of the file.
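A minimal sketch of the stream approach with plain Node (the file path is hypothetical):

    const fs = require('fs');
    const http = require('http');

    http.createServer((req, res) => {
      // fs.readFile would buffer the whole file before sending anything;
      // a read stream pipes it to the client chunk by chunk instead.
      const stream = fs.createReadStream('./big-file.bin');
      stream.on('error', () => {
        res.statusCode = 500;
        res.end('Failed to read file');
      });
      stream.pipe(res);
    }).listen(3000);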

What is the best way to handle webhook APIs in Hapi.js?

A common webhook-style API practice is for an API consumer to receive the webhook request, respond with 204 No Content, close the connection, and then process the request. Hapi.js does not send the reply until nextTick, which means the handler function must return before the response goes out.
Separating the request processing into a separate function and calling it with nextTick still causes the processing to occur before the reply object is sent. setTimeout works, but this has negative performance implications.
Hapi's request extensions seemed like an option, but are attached to all requests, regardless of path, which adds an unnecessary overhead to other requests.
Express.js has a response.send() method that immediately sends the response. Ideally, Hapi would have something like this.
A solution is to use setImmediate for any processing that should happen after the response is sent and the connection closed. Hapi sends the reply on nextTick, and setImmediate callbacks are processed after nextTick callbacks.
See example: https://gist.github.com/jeremiahlee/3689e8b4d1513c375b1e
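A minimal sketch, assuming the older callback-style reply interface this answer refers to (server and processWebhook are assumed to exist):

    server.route({
      method: 'POST',
      path: '/webhook',
      handler: function (request, reply) {
        // Reply 204 first; Hapi sends it on nextTick.
        reply().code(204);
        // setImmediate runs after nextTick, i.e. after the response is on its way.
        setImmediate(() => {
          processWebhook(request.payload);
        });
      }
    });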

Potential pitfalls in node/express to have callbacks complete AFTER a response is returned to the client?

I want to write a callback that takes a bit of time to complete an external IO operation, but I do not want it to interfere with sending data back to the client. I don't care about waiting for the callback to complete before replying to the client, but if the callback results in an error, I would like to log it. About 80% of executions will result in this callback executing after the response is sent back to the client and the connection is closed.
My approach works well and I have not seen any problems, but I would like to know whether there are any pitfalls in this approach that I may be unaware of. I would think that node's evented IO would handle this without issue, but I want to make sure before I commit this architecture to production. Any issues that should make me reconsider this approach?
As long as you're not trying to reference the response object after the response is sent, this will not cause any problems. There's nothing special about a request handler that cares whether callbacks in its code are invoked after the response is generated.
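A minimal sketch of that shape in Express (createOrder and writeAuditLog are hypothetical): respond first, then let the external IO finish on its own, logging any error since the client can no longer be told about it.

    const express = require('express');
    const app = express();
    app.use(express.json());

    app.post('/orders', async (req, res) => {
      const order = await createOrder(req.body); // the work the client waits for
      res.status(201).send(order);

      // Fires off the slow external IO; it often completes after the connection closes.
      // Do not touch `res` in here once the response has been sent.
      writeAuditLog(order).catch((err) => {
        console.error('audit log failed:', err);
      });
    });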
