I'm new to Node.js and I'm interested in the event loop, but there's something that confuses me.
If one HTTP request takes 10 seconds to handle, do all the other requests have to wait for that slow request?
Short answer: No, the other requests are handled concurrently. They will not wait for the response to the first request.
Long answer
Your code makes a new request.
Node.js adds a new callback to the pending callbacks list. This callback will run when the response from the server arrives.
Node.js moves on to process other code, timers, callbacks, and so on.
Once the 10 seconds have passed, your code receives the response and finishes processing it.
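A minimal sketch of those steps (the URL is just a placeholder): the request goes out, Node keeps doing other work, and the callback fires only when the response arrives.

```js
const https = require('https');

// Fire the request and register the callback that will run when the
// response arrives, even if that takes 10 seconds.
https.get('https://example.com/slow-endpoint', (res) => {
  console.log('Response arrived with status', res.statusCode);
  res.resume(); // drain the body so the socket is released
});

// This line runs immediately: the event loop is free to serve other
// requests, timers, and callbacks while the response is pending.
console.log('Request sent, moving on to other work');
```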
In NodeJS, most of your asynchronous tasks (like http requests, db access, etc.) will use callbacks or promises. What that means is that the event loop is free to handle other tasks until the callback is invoked or the promise resolves.
You can read more about the event loop here in the NodeJS documentation.
The event loop only gets slowed down while handling an HTTP request when you perform a huge synchronous operation, or when you send a file to a client without using the stream API. Reading a huge file with fs.readFile buffers the entire file in memory before anything is sent, whereas the stream API sends the file in chunks. To avoid slowing down your server, read large files with the stream API, for example with fs.createReadStream instead of fs.readFile, depending on the size of the file.
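As a rough sketch of that advice (the file path and port are placeholders), the streaming version could look like this:

```js
const http = require('http');
const fs = require('fs');

http.createServer((req, res) => {
  // fs.readFile would buffer the whole file in memory before sending anything;
  // createReadStream pushes it to the client chunk by chunk instead.
  const stream = fs.createReadStream('./large-file.bin');
  stream.on('error', () => {
    res.statusCode = 500;
    res.end('Could not read file');
  });
  stream.pipe(res);
}).listen(3000);
```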
Related
I have a "recover password" option on a website. Sometimes my sendMail function takes a few seconds to execute. I want to call this async sendMail without waiting and return something like "check you inbox in a few seconds".
Will the server keep running the method or once it responds the method will be terminated?
Will the same thing happen regardless of whether it's running behind Apache, Nginx, etc.?
Yes, the sendMail method will keep running. An incoming http request is just one asynchronous operation that nodejs can process. It can have thousands of others going on too. In this case, your sendMail() operation would be one additional asynchronous operation carrying on and it will finish without any regard for when the http request finishes.
Remember, nodejs is NOT a threaded architecture where each http request is a new thread that gets terminated at some point. Instead, it's a non-blocking, event driven, asynchronous architecture and you can start as many other asynchronous operations as you want and they will run to completion regardless of what happens with the http request.
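A small sketch of that idea, assuming an Express-style route and a placeholder sendMail that returns a promise:

```js
const express = require('express');
const app = express();
app.use(express.json());

// Placeholder for the real mail-sending code (e.g. a nodemailer call).
function sendMail(to) {
  return new Promise((resolve) => setTimeout(resolve, 5000));
}

app.post('/recover-password', (req, res) => {
  // Deliberately not awaited: the promise keeps running after the response is sent.
  sendMail(req.body.email).catch((err) => console.error('sendMail failed:', err));
  res.send('Check your inbox in a few seconds');
});

app.listen(3000);
```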
I'm using Express in Firebase Functions that are on Node 10. My question is: if I have heavy Promises that I need to complete before the function terminates, but I want to do this after I sent the response (res.status(200).send...), how is the best way I could do this?
I have come up with two approaches so far but neither seem great (and I haven't tried the 'finish' approach yet to see if it works):
For every router, add a finally... clause that awaits the accumulated promises. This has the major downside that it would be easy to forget when adding a new router, and it might cause bugs if the router already needs its own finally clause.
Use the Node stream 'finish' event, https://nodejs.org/docs/latest-v10.x/api/stream.html#stream_event_finish ... but I don't know if Firebase Functions allow this to be caught, or if it'll still be caught if my function runs to its end and the event is fired afterwards.
I'm doing this to try to make my functions' responses faster. Thanks for any help.
if I have heavy Promises that I need to complete before the function terminates, but I want to do this after I sent the response (res.status(200).send...), how is the best way I could do this?
With Cloud Functions, it's not possible to send a response before promises are resolved, and have the work from those promises complete normally. When you send the response, Cloud Functions assumes that's the very last thing the function will do. When the response is sent using res.send(), Cloud Functions will terminate the function and shut down the code. Any incomplete asynchronous work might never finish.
If you want to continue work after a response is sent, you will have to first offload that work to some other component or service that is not in the critical path of the function. One option is to send a message to a pubsub function, and let that function finish in the "background" while the first function sends the response and terminates. You could also send messages to other services, such as App Engine or Compute Engine, or even kick off work in other clouds to finish asynchronously.
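A rough sketch of the pubsub approach, assuming the Firebase Functions SDK and @google-cloud/pubsub; the topic name and payload shape here are made up:

```js
const functions = require('firebase-functions');
const { PubSub } = require('@google-cloud/pubsub');
const pubsub = new PubSub();

exports.api = functions.https.onRequest(async (req, res) => {
  // Publish the heavy work as a message, then respond. The publish itself is
  // awaited so it completes before the function is allowed to terminate.
  await pubsub
    .topic('heavy-work')
    .publish(Buffer.from(JSON.stringify({ userId: req.body.userId })));
  res.status(200).send('Accepted');
});

// A separate function picks up the message and does the slow work,
// outside the request/response path of the first function.
exports.heavyWorker = functions.pubsub.topic('heavy-work').onPublish(async (message) => {
  const data = message.json; // parsed payload
  // ...long-running work happens here
});
```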
See also:
Continue execution after sending response (Cloud Functions for Firebase)
How to exec code after res.send() in express server at Firebase cloudFunctions
I'm really new to Node.js and I have a beginner's question.
I plan to create a Node server that will execute an HTTP request for a JSON file every 1-2 seconds.
The reason for requesting so frequently is that the JSON file I'm requesting changes constantly.
What is the correct way to do that without blocking the event loop?
Is it safe to put the request code in a function and call it with setTimeout()?
Should I run the requests in a child process?
The whole point of asynchronicity is that it is asynchronous. When you issue the HTTP request, it is sent off more or less instantly and node returns to other business, waking up only when the response is received. The only thing that could cause a "blocking" of the event loop in your case is very intensive processing either preparing the request or processing the response.
That is why on the front page of the node website it says "node uses an event-driven, non-blocking I/O model".
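In practice, wrapping the request in a function and rescheduling it with setTimeout() is exactly this non-blocking pattern; a child process is not needed for I/O-bound work like this. A small sketch, with a placeholder URL:

```js
const https = require('https');

function poll() {
  https.get('https://example.com/data.json', (res) => {
    let body = '';
    res.on('data', (chunk) => (body += chunk));
    res.on('end', () => {
      try {
        console.log('Got update:', JSON.parse(body));
      } catch (err) {
        console.error('Bad JSON:', err.message);
      }
      setTimeout(poll, 1000); // schedule the next poll only after this one finished
    });
  }).on('error', (err) => {
    console.error('Poll failed:', err.message);
    setTimeout(poll, 2000); // back off slightly and retry
  });
}

poll();
```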
A common webhook-style API practice is for an API consumer to receive the webhook request, respond with 204 No Content, close the connection, and then process the request. Hapi.js does not send a reply object until nextTick, which means that the handler function must return.
Separating the request processing into a separate function and calling it with nextTick still causes the processing to occur before the reply object is sent. setTimeout works, but this has negative performance implications.
Hapi's request extensions seemed like an option, but they are attached to all requests regardless of path, which adds unnecessary overhead to the other requests.
Express.js has a response.send() method that immediately sends the response. Ideally, Hapi would have something like this.
A solution is to use setImmediate for any processing that should happen after the response is sent and connection closed. Hapi sends the reply on nextTick. setImmediate events will be processed after nextTick events.
See example: https://gist.github.com/jeremiahlee/3689e8b4d1513c375b1e
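A condensed version of that pattern (assuming an older, pre-v17 Hapi handler signature where reply is a callback, an existing server instance, and a placeholder processWebhook function) might look like this:

```js
server.route({
  method: 'POST',
  path: '/webhook',
  handler: function (request, reply) {
    reply().code(204); // queued to be sent on nextTick

    setImmediate(() => {
      // Runs after nextTick callbacks, i.e. after the reply has gone out
      // and the connection has been closed.
      processWebhook(request.payload); // placeholder for the real processing
    });
  }
});
```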
I want to understand more exactly what happens when a server receives a client request on a Node.js server. With a more traditional server, a new thread is created to handle the new client session.
But in Node.js and other event-loop style servers, what exactly happens? What part of the codebase first gets executed? With node, I am almost certain something in the http module handles the new request first.
I want to know a little more about how this works, in a compare-and-contrast between the two ways of handling client connections.
In a nutshell:
Node uses libuv to manage incoming connections and data events
Events are placed in a queue to be handled on the next tick of the event loop
When bytes start arriving, they are fed into the native-code HTTP parser
The parser calls a callback in JS-land with the header contents
The rest of the JS HTTP code dispatches the request to user code, which may be Express
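To make that last step concrete, here is a bare http server: the request callback is the "user code" the parsed request gets dispatched to, and a framework like Express registers its own handler in the same place.

```js
const http = require('http');

const server = http.createServer((req, res) => {
  // This callback is the user code the request is dispatched to,
  // running on the event loop rather than on a per-connection thread.
  console.log(`${req.method} ${req.url}`);
  res.end('handled on the event loop, no new thread per connection');
});

server.listen(3000);
```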