How does a Cloud Function instance handle multiple requests? - node.js

I'm trying to wrap my head around Cloud Function instances and how they work.
I'm asking about an example of an HTTP function, but I think the concept applies to any kind of function.
Let's say I have this cloud function that handles SSR for my app, named ssrApp.
And let's assume that it takes 1 second to complete every time it gets a request.
When Cloud Functions receives the 1st request, it will spin up an instance to respond to it.
QUESTION
How does that instance behave when multiple requests are coming?
From: https://cloud.google.com/functions/docs/concepts/exec
Each instance of a function handles only one concurrent request at a time. This means that while your code is processing one request, there is no possibility of a second request being routed to the same instance. Thus the original request can use the full amount of resources (CPU and memory) that you requested.
Does it mean that during that 1 second when my ssrApp function is running, if somebody hits my app URL, it is guaranteed that Cloud Function will spin up another instance for that second request? Does it matter if the function does only sync calls or some async calls in its execution? What I mean is, could an async call free the instance to respond to another request in parallel?

Does it mean that during that 1 second when my ssrApp function is running, if somebody hits my app URL, it is guaranteed that Cloud Function will spin up another instance for that second request?
That's the general behavior, although there are no guarantees around scheduling.
Does it matter if the function does only sync calls or some async calls in its execution? What I mean is, could an async call free the instance to respond to another request in parallel?
No, that makes no difference. If the container is waiting for an async call, it is still considered to be in use.
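A minimal sketch of that point, assuming the ~1 second of SSR work is just a timer (ssrApp and renderPage are illustrative names, not the asker's actual code):

```js
// Gen1 HTTP function: the instance serves exactly one request at a time.
// Even while the handler is awaiting, the instance counts as in use, so a
// second concurrent request is routed to another (possibly brand-new) instance.
exports.ssrApp = async (req, res) => {
  const html = await renderPage(req.path); // the async wait does NOT free the instance
  res.status(200).send(html);
};

// Stand-in for the ~1 second of SSR work described in the question.
function renderPage(path) {
  return new Promise((resolve) =>
    setTimeout(() => resolve(`<html><body>Rendered ${path}</body></html>`), 1000)
  );
}
```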

2022 Update
For future searchers, Cloud Functions Gen2 now supports concurrency: https://cloud.google.com/functions/docs/2nd-gen/configuration-settings#concurrency
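For example, a hypothetical gen2 deployment of an ssrApp function might raise per-instance concurrency like this (runtime and concurrency values are illustrative):

```
gcloud functions deploy ssrApp \
  --gen2 \
  --runtime=nodejs20 \
  --trigger-http \
  --concurrency=80
```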

Related

AWS Lambda Node and concurrency

I'm developing on AWS Lambda with Serverless for the first time.
I know that my Node.js code is non-blocking, so a Node.js server can handle several requests simultaneously.
My question: does Lambda create an instance for each call? For example, if there are 10 simultaneous connections, will Lambda create 10 Node.js instances?
Currently, in my tests, I have the impression that Lambda creates an instance for each call, because on each call my code creates a new connection to my database, while locally my code keeps the database connection in memory.
Yes, this is a fundamental feature of AWS Lambda (and "serverless" functions in general). A new instance is created for each concurrent request.
If you have multiple parallel executions, all will be separate instances (and thus, each would use its own connection to the DB).
Now, if you are invoking multiple Lambda functions one after another, that's a bit different. It is possible that subsequent invocations of the Lambda function reuse the context. That means there is a possibility of reusing some things, like the DB connection, in subsequent calls to the Lambda function.
There is no exact information about how long a Lambda function keeps the previous context alive. Also, in order to reuse things like the DB connection, you must define and obtain the connection outside of your handler function. If you put it inside the handler function, it will certainly not be reused.
When the context is reused, you get what is called a "warm" start: the Lambda function starts more quickly. If some time has passed and the context can no longer be reused, you get a "cold" start, meaning the Lambda function takes more time to start its execution (it needs to pull in all its dependencies during a cold start).
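A minimal Node.js sketch of that reuse pattern, assuming a PostgreSQL back end via the pg package (connection string variable and query are illustrative):

```js
// The connection lives in module scope, outside the handler, so warm starts
// reuse it instead of reconnecting on every invocation.
const { Client } = require('pg'); // assumed PostgreSQL client

let client; // survives between invocations on the same (warm) instance

exports.handler = async (event) => {
  if (!client) {
    // Only paid on a cold start (or the very first call).
    client = new Client({ connectionString: process.env.DATABASE_URL });
    await client.connect();
  }
  const { rows } = await client.query('SELECT NOW()');
  return { statusCode: 200, body: JSON.stringify(rows[0]) };
};
```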

Best way to start a background process from GCP HTTP function call?

So, according to the docs here https://cloud.google.com/functions/docs/writing/http
Terminating HTTP functions
If a function creates background tasks (such as threads, futures, Node.js Promise objects, callbacks, or system processes), you must terminate or otherwise resolve these tasks before returning an HTTP response. Any tasks not terminated prior to an HTTP response may not be completed, and may also cause undefined behavior.
So, if one needs to launch a long-running background task from within an HTTP function, but still return from the function fast, there is no straightforward way.
I have tried the Pub/Sub approach (calling await topic.publishJSON(pars)), but publishing to a topic looks like quite a time-consuming operation, taking 2-3 seconds.
The Pub/Sub-triggered function itself probably runs fine, but this 2-3 second delay makes the approach useless.
P.S.: the approach of starting a Promise from inside the function actually works, but it seems error-prone since it goes against the docs.
If you need a quick answer, you have 2 types of solutions:
Async
With Cloud Functions, you invoke (perform an HTTP call to) another function (or a Cloud Run or App Engine service) without waiting for the answer, and reply to the requester immediately. The call that you performed will run in the background and will eventually answer a Cloud Function that is no longer listening!
With Pub/Sub, it's similar. Instead of invoking a Cloud Function (or a Cloud Run or App Engine service), you publish a message to a Pub/Sub topic. Then create a subscription to trigger your long-running process, as in the sketch after this list.
Same idea with Cloud Tasks, but you create a task in a queue.
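A minimal Node.js sketch of the Pub/Sub variant, assuming a topic named long-running-work already exists and the two functions are deployed separately (all names are illustrative):

```js
const { PubSub } = require('@google-cloud/pubsub');

const pubsub = new PubSub(); // created once per instance, reused across calls

// HTTP function: hand the work off to Pub/Sub and answer right away.
exports.startWork = async (req, res) => {
  await pubsub
    .topic('long-running-work') // illustrative topic name
    .publishMessage({ json: req.body });
  res.status(202).send('Accepted'); // the subscriber does the real work
};

// Pub/Sub-triggered function bound to a subscription on the same topic.
exports.doWork = async (message) => {
  const pars = JSON.parse(Buffer.from(message.data, 'base64').toString());
  // ... long-running process goes here ...
};
```

Note that the PubSub client is created once in module scope; creating it inside the handler on every request is a common source of the multi-second publish latency the question describes.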
Sync
If you use Cloud Run instead of Cloud Functions, you are able to send partial answers to the requester. That way, you can immediately reply with a partial response that says "OK", continue the processing in the request context, and send another partial response whenever you want, or at the end of the long-running process, to inform the user that their process has finished.
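A minimal Express sketch of such a partial (streamed) response on Cloud Run, assuming CPU stays allocated while the request is open (endpoint and timings are illustrative):

```js
const express = require('express');
const app = express();

app.get('/work', async (req, res) => {
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.write('OK\n'); // partial answer, flushed to the client immediately
  await longRunningProcess(); // keep working inside the request context
  res.end('DONE\n'); // final chunk once the work is finished
});

// Stand-in for the long-running background task.
function longRunningProcess() {
  return new Promise((resolve) => setTimeout(resolve, 30000));
}

app.listen(process.env.PORT || 8080);
```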

Is it possible for multiple AWS Lambdas to service a single HTTP request?

On AWS, is it possible to have one HTTP request execute a Lambda, which in turn triggers a cascade of Lambdas running in serial, where the final Lambda returns the result to the user?
I know one way to achieve this is for the initial Lambda to "stay running" and orchestrate the other Lambdas, but I'd be paying for that orchestration Lambda to effectively do nothing most of the time, i.e. paying for the time it's waiting on the others. If it were non-lambda code, that would be like blocking (and paying for) an entire thread while the other threads do their work.
Unless AWS stops the billing clock while async Lambdas are "sleeping"/waiting on network IO?
Unfortunately, as you've found, only a single Lambda function can be invoked by the HTTP request, and it becomes an orchestrator.
This is not ideal, but it has to be the case if you want to use multiple Lambda functions to serve an HTTP request. You can either use that Lambda to call a number of other Lambdas, or instead create a Step Functions state machine to orchestrate the individual steps. You would still need the initial Lambda to start the execution, and then poll its status before returning the results.
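A minimal sketch of the Step Functions variant using AWS SDK v3, where the state machine ARN comes from an environment variable (all names are illustrative):

```js
const {
  SFNClient,
  StartExecutionCommand,
  DescribeExecutionCommand,
} = require('@aws-sdk/client-sfn');

const sfn = new SFNClient({});

// Hypothetical orchestrating Lambda: starts the state machine, then polls
// its status before returning the result to the HTTP caller.
exports.handler = async (event) => {
  const { executionArn } = await sfn.send(
    new StartExecutionCommand({
      stateMachineArn: process.env.STATE_MACHINE_ARN, // illustrative
      input: JSON.stringify(event),
    })
  );

  // Wait for the cascade of Lambdas behind the state machine to finish.
  for (;;) {
    const exec = await sfn.send(new DescribeExecutionCommand({ executionArn }));
    if (exec.status !== 'RUNNING') {
      return { statusCode: 200, body: exec.output };
    }
    await new Promise((resolve) => setTimeout(resolve, 500));
  }
};
```

If the whole cascade fits within the five-minute limit of an Express workflow, StartSyncExecutionCommand would avoid the polling loop entirely.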

Loading up clients in Azure Functions

I'm creating an Azure Function that will run in consumption mode and will get triggered by messages in a queue.
The function will typically need to make a database call when it gets triggered. I "assume" the function gets launched and loaded into memory when it gets triggered, and that when it's idle it gets terminated because it's running in consumption mode.
Based on this assumption, I don't think I can load a singleton instance of my back-end client, which includes the logic for making database calls.
Is new'ing up my back-end client every time I need to perform some back-end operations the right approach, then?
That is a wrong assumption. Your function will be loaded during the first call, and will be unloaded only after an idle timeout (5 or 10 minutes).
You will not pay for idling, but you will pay for the whole time that your function is running, including the wait time during database calls (or other I/O).
Singletons and statics work just fine, and you should reuse instances like HttpClient between calls.
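The HttpClient example above is .NET-flavored, but the same pattern in a Node.js queue-triggered function would look roughly like this (the MongoDB client and the database/collection names are illustrative):

```js
// Hypothetical queue-triggered Azure Function (Node.js programming model v3).
// The client lives in module scope, so it is created on the first invocation
// and reused for as long as the instance stays warm.
const { MongoClient } = require('mongodb'); // assumed back-end client

let clientPromise; // singleton shared by all invocations on this instance

function getClient() {
  if (!clientPromise) {
    clientPromise = MongoClient.connect(process.env.DB_CONNECTION_STRING);
  }
  return clientPromise;
}

module.exports = async function (context, queueMessage) {
  const client = await getClient(); // new'ed up once, not per message
  const result = await client
    .db('app')
    .collection('items')
    .findOne({ id: queueMessage.id });
  context.log('Processed message', queueMessage.id, result);
};
```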

Node.js + express.js and thread safety

Assume I have an array of items, and each GET call makes a change to this array (it may add, remove, or shift items).
Would that be "thread-safe"? I know that Node.js is single-threaded, yet is there a possibility that two GET requests would be handled "simultaneously"?
As Node is single-threaded, only one piece of code is ever being executed at any time. A callback (such as the callback from a remote HTTP GET request) is added to the end of the event loop's message queue. When there are no more functions on the stack, the program waits for a message to be added to the queue, and runs the message's function (in this case, the request callback function).
If you are making parallel requests to a remote server, you won't get the requests completed in the same order each time unless you run the requests in series. The callback functions will never run at the same time, however; only one function can ever be executed at once.
It would be thread-safe because all operations on arrays are blocking. The only operations in Node.js that are not blocking are I/O operations.
Since you don't have any async operations, there is no problem with your situation. (Except if you need to do something like access a database?)
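A minimal Express sketch of why this is safe (routes are illustrative): each handler below runs to completion before any other request's handler starts, so the synchronous array mutations never interleave.

```js
const express = require('express');
const app = express();

const items = []; // shared state, mutated by every request

app.get('/add', (req, res) => {
  // Runs start-to-finish on the single thread; no other handler can
  // execute in the middle of these synchronous statements.
  items.push(req.query.value);
  res.json({ length: items.length });
});

app.get('/take', (req, res) => {
  const item = items.shift(); // also atomic with respect to other handlers
  res.json({ item: item === undefined ? null : item });
});

app.listen(3000);
```

The guarantee only covers synchronous stretches of code: if a handler awaited a database call between reading and writing the array, another request's handler could run in between.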
