Node.js, delayed response

var http = require('http');

var s = http.createServer(function (req, res) {
  res.writeHead(200, {'Content-Type': 'text/plain'});
  res.write('Hello\n');
  setTimeout(function() {
    res.end(' World\n');
  }, 2000);
  console.log("Hello");
});

s.listen(8080);
After starting the above server, I run:
curl http://127.0.0.1:8080
I get the expected delay. Output:
Hello <2 seconds> World
But in the browser the whole content loads together after 2 seconds:
Hello World <together after 2s>
What am I doing wrong?

This piece of code opens a response stream to the client and writes to it incrementally. So in curl you'll get "Hello" first and "World" after 2 seconds (since you've set a timer of 2000 milliseconds):
res.write('Hello\n');
setTimeout(function() {
  res.end(' World\n');
}, 2000);
But the browser renders it only after the complete response stream is received. That is why you see the whole response after 2 seconds.
This is entirely the browser's behavior: it doesn't use the response stream until the whole response has arrived. Once the stream is closed, the complete response is ready to be rendered. (In PHP, by comparison, there is a way to flush the response stream explicitly if need be.)
If you're looking to stream data to the browser on a frequent basis, though, this isn't the best way to do it. I'd suggest using a Comet technique or WebSockets instead.
I hope this is what you are looking for.
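If the goal is for the browser to render updates as they arrive, Server-Sent Events are one way to get that behavior. A minimal sketch, assuming the same 2-second delay as the question (not part of the original answer):
var http = require('http');

http.createServer(function (req, res) {
  // an event-stream response stays open; the browser's EventSource
  // processes each "data:" event as soon as it arrives
  res.writeHead(200, {
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache'
  });
  res.write('data: Hello\n\n');
  setTimeout(function () {
    res.write('data: World\n\n'); // delivered 2 seconds later, no buffering
    res.end();
  }, 2000);
}).listen(8080);
On the client, new EventSource('http://127.0.0.1:8080') fires onmessage for each event without waiting for the stream to close (note that EventSource reconnects when the stream ends, so a real endpoint usually stays open).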

// middleware that simulates a delayed response:
// every request waits 2 seconds before reaching its handler
app.use((req, res, next) => {
  setTimeout(() => next(), 2000);
});
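For context, a hedged sketch of how this middleware might sit in a minimal Express app (the route and port are assumptions, not part of the original answer):
const express = require('express');
const app = express();

// the delay middleware from the answer above
app.use((req, res, next) => {
  setTimeout(() => next(), 2000);
});

// hypothetical route used only to exercise the delay
app.get('/', (req, res) => res.send('Hello World\n'));

app.listen(8080);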

Browser behavior is different from curl's: the browser will not render the page until you call res.end(). So if you want to load part of a web page after a delay, you need to load that second part separately, via a WebSocket or an AJAX request. I recommend WebSockets; take a look at socket.io, which is a simple way of using WebSockets in Node.js.
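A minimal socket.io sketch of that split, assuming the event name 'delayed-part' and the port below (none of this is from the original answer): the HTTP response ends immediately, and the delayed part travels over the socket.
// server: finish the HTTP response at once, push the rest over the socket
var app = require('http').createServer(function (req, res) {
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end('Hello\n'); // browser renders this immediately
});
var io = require('socket.io')(app);

io.on('connection', function (socket) {
  setTimeout(function () {
    socket.emit('delayed-part', ' World\n'); // client appends this on arrival
  }, 2000);
});

app.listen(8080);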

Related

Sending multiple responses with the same response object in nodejs

I have a long-running process which needs to send data back in multiple responses. Is there some way to send back multiple responses with Express.js?
I want to have "test" first, then after 3 seconds a new response "bar":
app.get('/api', (req, res) => {
  res.write("test");
  setTimeout(function(){
    res.write("bar");
    res.end();
  }, 3000);
});
With res.write I wait about 3 seconds and then get everything back as a single response.
Yes, you can do it. What you need to know about is chunked transfer encoding.
This is one of the older techniques; since WebSockets arrived, I haven't seen it used much.
Presumably you need to send parts of the response at different times, in chunks, perhaps as later events fire. This is actually the default response mode of Express.js.
But there is a catch. When you test this, modern browsers and curl both buffer chunks, so you won't see the expected result. The trick is to fill up the chunk buffer before sending consecutive response chunks. See the example below:
const express = require('express'),
      app = express(),
      port = 3011;

app.get('/', (req, res) => {
  // res.write("Hello\n");
  res.write("Hello" + " ".repeat(1024) + "\n");
  setTimeout(() => {
    res.write("World");
    res.end();
  }, 2000);
});

app.listen(port, () => console.log(`Listening on port ${port}!`));
The first res.write, with 1024 additional spaces, forces the browser to render the chunk; the second res.write then behaves as you expect. You can see the difference by switching to the commented-out first res.write.
At the network level there is no difference. You can even consume this in the browser via the XHR object (the original AJAX implementation); here is the related answer.
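To make that concrete, a hedged browser-side sketch that reads the chunks progressively with XHR (the URL assumes the server above is running on port 3011):
// onprogress fires as chunks arrive, before the response has ended
var xhr = new XMLHttpRequest();
var seen = 0;
xhr.open('GET', 'http://localhost:3011/');
xhr.onprogress = function () {
  // responseText grows with each chunk received so far
  console.log('chunk:', xhr.responseText.slice(seen));
  seen = xhr.responseText.length;
};
xhr.onload = function () { console.log('done'); };
xhr.send();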
An HTTP request and its response have a one-to-one mapping. You can return an HTTP response only once, however long it may be. There are two common techniques:
Comet responses - since the response is a writable stream, you can write data to it from time to time before ending it. Make sure the connection read timeout is properly configured on the client receiving this data.
Simply use WebSockets - these are persistent connections that allow two-way messaging between server and client; see the sketch after this list.
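A minimal sketch of the WebSocket option using the ws package (the 'test'/'bar' messages mirror the question; the port and everything else are assumptions):
// npm install ws
const WebSocket = require('ws');
const wss = new WebSocket.Server({ port: 3012 });

wss.on('connection', (ws) => {
  ws.send('test');      // first message arrives immediately
  setTimeout(() => {
    ws.send('bar');     // second message arrives 3 seconds later
    ws.close();
  }, 3000);
});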

What keeps event loop open when listening for I/O events?

Node.js's website says that the framework "exits the event loop when there are no more callbacks to perform."
What I haven't found clearly explained anywhere is what keeps the event loop running once you initiate an I/O module that is waiting for input. For example, in this canonical "Hello World" HTTP server example, Node.js continues listening indefinitely for incoming HTTP requests:
require('http').createServer((req, res) => {
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end('Hello World\n');
}).listen(8080);
While there is always the potential for a callback to the implicit request event handler, until an actual HTTP request comes in, there's no event being handled. Does this mean that the statement from the Node.js website isn't strictly true? Or is there some nuance here that I am missing?
The nuance is that there are callbacks "under the hood" as well.
In this case, the HTTP server is waiting for incoming connections on a TCP port, and until that port is explicitly closed (server.close()), that listening socket counts as a pending callback.
You don't even have to define a request handler to demonstrate this:
require('http').createServer().listen(8080);
And this shows what happens when you close the port after a few seconds:
var server = require('http').createServer().listen(8080);

setTimeout(function() {
  server.close();
}, 2000);
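As an aside, Node also lets you opt a handle out of this bookkeeping with unref(); a small sketch (assuming nothing else is pending in the process):
var server = require('http').createServer().listen(8080);

// unref() tells the event loop not to count this handle as pending work;
// with nothing else scheduled, the process exits immediately even though
// the port was opened successfully
server.unref();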

Node.js Express block

My problem is that I'm planning to use Express to collect all requests I receive for a certain amount of time, and then send all the responses at once.
But unfortunately I can't receive a second request until I've responded to the first one. So I guess Node/Express is somehow blocking the further processing of other requests.
I built a minimal working example so you can see better what I'm talking about:
var express = require('express');
var app = express();
var ca = [];

app.get('/hello.txt', function(req, res){
  ca.push(res);
  console.log("Push");
});

setInterval(function(){
  while (ca.length) {
    var res = ca.shift();
    res.send('Hello World');
    console.log("Send");
  }
}, 9000);

var server = app.listen(3000, function() {
  console.log('Listening on port %d', server.address().port);
});
When I send just one request to localhost:3000 and wait 9 seconds, I'm able to send a second one. But when I send both without waiting for the interval callback, the second one is blocked until the first interval has fired.
Long story short: why is this blocking happening, and what ways are there to avoid it?
PS: It seems that the plain http package behaves differently: http://blog.nemikor.com/2010/05/21/long-polling-in-nodejs/
Try it with Firefox and Chrome at the same time to prevent serializing the requests...
OK, I've got the solution.
The issue wasn't in my code; it was caused by Chrome. It seems that Chrome serializes all requests that target the same URL. It nevertheless sends both requests, and it won't serve the second request with the response of the first.
Anyway, thanks for your help!
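If you want to verify this from a single browser, one hedged workaround is to make each URL unique so Chrome can't treat the requests as duplicates (this assumes the serialization is keyed on the full URL):
// browser console: two requests to the same route, disambiguated by a query string
for (var i = 0; i < 2; i++) {
  fetch('http://localhost:3000/hello.txt?nocache=' + Date.now() + '-' + i)
    .then(function (res) { return res.text(); })
    .then(function (body) { console.log('got:', body); });
}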

Sending an http response outside of the route function?

So, I have a route function like the following:
var http = require('http').createServer(start);

function start(req, res){
  // routing stuff
}
and below that, I have a socket.io event listener:
io.sockets.on('connection', function(socket){
  socket.on('event', function(data){
    // perform an http response
  });
});
When the socket event 'event' is called, I would like to perform an http response like the following:
res.writeHead(200, {
  'Content-Type': 'application/zip',
  'Content-Disposition': 'attachment; filename=file.zip'
});
var filestream = fs.createReadStream('file.zip');
filestream.on('data', function(chunk) {
  res.write(chunk);
});
filestream.on('end', function() {
  res.end();
});
This last part, when performed within the routing function, works just fine. But when it is called from the socket event, it of course does not work, because there it has no reference to the 'req' or 'res' objects. How would I go about doing this? Thanks.
Hmmm... interesting problem.
It's not impossible to do what you're trying to do; the flow would be something like this:
Receive http request, don't respond, keep res object saved somewhere.
Receive websocket request, do your auth/"link" it to the res object saved earlier.
Respond with file via res.
BUT it's not very pretty, for a few reasons:
You need to keep res objects saved; if your server restarts, a whole bunch of response objects are lost.
You need to figure out how to link WebSocket clients to HTTP request clients. You could do something with cookies/localStorage for this, I think.
Scaling to another server becomes a lot harder: will you proxy clients so they're always served by the same server somehow? Otherwise the linking gets even harder.
I would propose a different solution: you want to do some client/server steps over WebSockets before you let someone download a file, right?
This question has a solution to do downloads via websocket: receive file via websocket and initiate download dialog
Sounds like it won't work on older browsers / IE, but a nice option.
It also mentions downloading via a hidden iframe.
Check here whether this solution is cross-browser enough for you: http://caniuse.com/#feat=datauri
Another option would be to generate a unique URL for the download, and only append it to the browser's window (either as a hidden iframe download or as a simple download button) once you've done your logic via the WebSocket. This option is more available cross-browser and easier to code; a sketch follows below.
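A hedged sketch of that last option, reusing the start() router and the socket listener from the question; the token store, the /download URL pattern, and the 'downloadReady' event name are all made-up:
var fs = require('fs');
var crypto = require('crypto');
var tokens = {}; // token -> filename; in-memory, so lost on restart

io.sockets.on('connection', function(socket){
  socket.on('event', function(data){
    var token = crypto.randomBytes(16).toString('hex');
    tokens[token] = 'file.zip';
    // the client appends this URL as a hidden iframe or download link
    socket.emit('downloadReady', { url: '/download/' + token });
  });
});

function start(req, res){
  var match = /^\/download\/([0-9a-f]+)$/.exec(req.url);
  if (match && tokens[match[1]]) {
    var file = tokens[match[1]];
    delete tokens[match[1]]; // single-use URL
    res.writeHead(200, {
      'Content-Type': 'application/zip',
      'Content-Disposition': 'attachment; filename=file.zip'
    });
    fs.createReadStream(file).pipe(res);
    return;
  }
  // ...the rest of the routing stuff
}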

How to check if connection was aborted in node.js server

I'm doing some long polling with Node.js.
Basically, the Node.js server accepts a request from the user and then checks for updates. If there are no updates, it checks again after a timeout.
But what if the user has closed the tab or gone to another page? In my case, the script keeps working.
Is there a way in Node.js to check, detect, or catch an event when the user has aborted the request (closed the connection)?
You need to use req.on('close', function(err) { ... }); instead of req.connection.on('close', function(err) { ... });.
There is a very important distinction: req.on() adds a listener to this request, while with req.connection.on() you add a listener to the (keep-alive) connection between the client and the server. With req.connection.on(), every time the client reuses the connection, you add one more listener to the same connection; when the connection is finally aborted, all of those listeners fire.
Function scoping typically keeps this from breaking your server logic, but it's a dangerous thing nevertheless. Fortunately, Node.js (at least 0.10.26) is smart enough to warn you about it:
(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace:
at Socket.EventEmitter.addListener (events.js:160:15)
at Socket.Readable.on (_stream_readable.js:689:33)
...
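A small sketch of the difference being described (hedged: exact 'close' semantics vary across Node versions):
var http = require('http');

http.createServer(function(req, res){
  // WRONG: adds one 'close' listener per request to the shared
  // keep-alive socket, which eventually triggers the warning above
  req.connection.on('close', function(){
    console.log('socket closed');
  });

  // RIGHT: scoped to this request only
  req.on('close', function(){
    console.log('request closed');
  });

  res.end('ok\n');
}).listen(8080);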
Thanks to Miroshko's and yojimbo87's answers I was able to catch the 'close' event, but I had to make some additional tweaks.
The reason just catching the 'close' event didn't fix my problem is that when the client sends a request to the Node.js server, the server can't tell whether the connection is still open until it sends something back to the client (as far as I understand, this is due to the HTTP protocol).
So the additional tweak was to write something to the response from time to time.
One more thing that was preventing this from working: I had the 'Content-type' set to 'application/json'. Changing it to 'text/javascript' helped to stream whitespace from time to time without closing the connection.
In the end, I had something like this:
var server = http.createServer(function(req, res){
  res.writeHead(200, {'Content-type': 'text/javascript'});

  req.connection.on('close', function(){
    // code to handle connection abort
  });

  /**
   * Here goes some long-polling handler
   * that performs res.write(' '); from time to time
   */

  // some other code...
});

server.listen(NODE_PORT, NODE_LISTEN_HOST);
My original code is much bigger, so I had to cut it down a lot just to show the relevant parts.
I'd like to know if there are better solutions, but this is working for me at the moment.
Is there a way in node.js to check or detect or to catch an event when user has aborted his request (closed the connection)?
You can try using the http.ServerRequest 'close' event. A simple example:
var http = require("http"),
    util = require("util");

var httpServer = http.createServer(function(req, res) {
  util.log("new request...");

  // notify me when the client connection is lost
  req.on("close", function(err) {
    util.log("request closed...");
  });

  // wait 15 seconds before responding
  setTimeout(function() {
    res.writeHead(200, {'Content-Type': 'text/plain'});
    res.write("response");
    res.end();
    util.log("response sent...");
  }, 15000);
});

httpServer.listen(8080);
util.log("Running on 8080");
I'm using Express.js (~4.10.6) and the following code works fine for me:
// GET request:
app.get('/', function(req, res){
  req.on('close', function(){
    console.log('Client closed the connection');
  });
});
As soon as I close the browser's tab, the browser closes the connection, and the callback function gets executed as expected.
Seems that your question is very similar to this one:
NodeJS HTTP request connection's close event fired twice
Try:
request.connection.on('close', function () {
  // ...
});
