Boys and girls,
I've been messing around with node.js today and I can't seem to reproduce this concurrent magic.
I wrote this rather small server:
var http = require("http");
var server = http.createServer(function(req, res) {
setTimeout(function() {
res.writeHead(200,{"content-type":"text/plain"});
res.end("Hello world!");
}, 10000);
});
server.listen(8000);
But here's what's strange: when I open localhost:8000 in multiple Chrome tabs at the same time, it's as if the requests are 'queued'. The 1st tab takes 10 seconds, the 2nd tab 20 seconds, the 3rd tab 30 seconds, etc...
But when I run this very example with Links, it behaves as I expect (handling the requests concurrently).
P.S. This seems to occur in both Chrome and Firefox.
Bizarre.
Requests for the same URL/hostname get queued client-side by the browser. That has nothing to do with node.js; your code is fine.
If you use different URLs in each tab, your example should work (for a few tabs, anyway).
Also have a look at: Multiple Ajax requests for same URL
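If you want to convince yourself that the server really is concurrent, bypass the browser entirely. Here is a minimal sketch (assuming the server above is still listening on port 8000) that fires three requests in parallel from node itself; all three should finish after roughly 10 seconds, not 10/20/30:

var http = require("http");
var start = Date.now();

// Fire three parallel requests at the same URL; unlike a browser,
// node does not serialize identical in-flight requests.
for (var i = 1; i <= 3; i++) {
    (function(n) {
        http.get({ host: "localhost", port: 8000, path: "/" }, function(res) {
            res.on("data", function() {}); // drain the body
            res.on("end", function() {
                console.log("request " + n + " finished after " + (Date.now() - start) + " ms");
            });
        });
    })(i);
}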
var http = require('http');

http.createServer(function (req, res) {
    setTimeout(function () {
        res.write("hello");
        res.end();
    }, 10000);
}).listen(8080);
This is my simple node server running on localhost.
Now, if I hit localhost:8080 from two different browsers simultaneously, I get the response on both browsers at the same time, i.e. after around 10 secs. But when I do the same from two different tabs of the same Chrome browser, it takes 10 secs for the first tab and another 10 secs for the second.
It seems like the requests are being processed one after another rather than simultaneously.
Can somebody explain?
It's a known browser issue; it only happens when you make two requests to the same URL from the same browser (or browser profile) in separate tabs (XHR requests can actually be made simultaneously).
Sources:
Chrome stalls when making multiple requests to same resource?
Multiple instances of PHP script won't load concurrently in the same browser from the same URL
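Since the answer above notes that XHR requests are not subject to this queueing, you can check it from the browser itself. A minimal sketch (assuming the server above on port 8080), to be pasted into the devtools console of a page served from that origin; if XHRs are indeed exempt, both should finish after roughly 10 seconds:

// Both requests should complete after ~10 seconds, not 10 and then 20.
var start = Date.now();
[1, 2].forEach(function(n) {
    var xhr = new XMLHttpRequest();
    xhr.open("GET", "/", true);
    xhr.onload = function() {
        console.log("XHR " + n + " done after " + (Date.now() - start) + " ms");
    };
    xhr.send();
});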
My problem is that I'm planning to use express to collect all requests I receive for a certain amount of time and then send all the responses at once.
But unfortunately I can't receive a second request until I've responded to the first one. So I guess node / express is somehow blocking the further processing of other requests.
I built a minimal working example so you can see better what I'm talking about.
var express = require('express');
var app = express();
var ca = [];

app.get('/hello.txt', function(req, res) {
    ca.push(res);
    console.log("Push");
});

setInterval(function() {
    while (ca.length) {
        var res = ca.shift();
        res.send('Hello World');
        console.log("Send");
    }
}, 9000);

var server = app.listen(3000, function() {
    console.log('Listening on port %d', server.address().port);
});
When I send just one request to localhost:3000/hello.txt and wait 9 seconds, I'm able to send a second one. But when I send both without waiting for the interval to fire, the second one is blocked until the first interval has triggered.
Long story short: why is this blocking happening, and what ways are there to avoid it?
P.S.: It seems that the default http package shows different behavior: http://blog.nemikor.com/2010/05/21/long-polling-in-nodejs/
Try it with Firefox and Chrome (one request from each) to prevent the browser from serializing the requests...
OK, I've got the solution.
The issue wasn't in my code; it was caused by Chrome. It seems that Chrome serializes all requests that target the same URL. It does eventually send both requests, though, and won't serve the second request with the response of the first.
Anyway, thanks for your help!
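For anyone testing this: one way to sidestep the serialization is to make each request target a formally different URL, e.g. by appending a throwaway query string. This is an assumption based on the same-URL behavior described above, not anything express-specific; query strings don't affect express route matching, so the request below still hits the same handler:

// Hypothetical client-side snippet: each request gets a unique URL,
// so Chrome no longer treats them as requests for the same resource.
var xhr = new XMLHttpRequest();
xhr.open("GET", "/hello.txt?nocache=" + Date.now(), true);
xhr.onload = function() {
    console.log(xhr.responseText);
};
xhr.send();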
Note: this is not a duplicate of the posts about setTimeout; the key answer here is a browser design decision.
I am starting to study node.js.
A simple example to test async behavior:
var http = require('http');

http.createServer(
    function(request, response) {
        response.writeHead(200);
        response.write("Hello, dog is running");
        setTimeout(
            function() {
                response.write("Dog is done");
                response.end();
            },
            10000
        );
    }
).listen(8080);

console.log("Listen on port 8080");
One interesting thing is that its behavior is different on the command line with curl and in the browser:
In Ubuntu 12.10, when I run curl localhost:8080 in two consoles, they both respond after almost the same 10 seconds.
However, when I open the URL in two browser tabs at almost the same time, the whole procedure takes 20 seconds. Why?
Thanks.
It's the browser waiting, not node.js.
If you run the server and request http://localhost:8080/ in two tabs, it takes 20 seconds, because the browser waits for the first request to the same URL to finish before starting the second.
If you run the server and request http://localhost:8080/1 and http://localhost:8080/2 in two tabs, it takes 10 seconds again.
I'm writing a proxy in Node.js + Express 2. The proxy should:
decrypt the POST payload and issue an HTTP request to the server based on the result;
encrypt the reply from the server and send it back to the client.
The encryption-related part works fine. The problem I'm facing is timeouts. The proxy should process requests in less than 15 secs, and most of them are under 500ms, actually.
The problem appears when I increase the number of parallel requests. Most requests complete OK, but some fail after 15 secs plus a couple of millis. ab -n5000 -c300 works fine, but with a concurrency of 500 it fails some requests with a timeout.
I can only speculate, but it seems the problem is the order of callback execution. Is it possible that the requests that come first hang until ETIMEDOUT because node focuses on the latest ones, which are still being processed within 500ms?
P.S.: There is no problem with the remote server. I'm using request for the interactions with it.
upd:
Here is how things work, with some code:
function queryRemote(req, res) {
    var options = {}; // built based on req object (URI, body, authorization, etc.)
    request(options, function(err, httpResponse, body) {
        return err ? send500(req, res)
                   : res.end(encrypt(body));
    });
}

app.use(myBodyParser); // reads hex string in payload
                       // and calls next() on 'end' event

app.post('/', [checkHeaders,   // check Content-Type and Authorization headers
               authUser,       // query DB and call next()
               parseRequest],  // decrypt payload, parse JSON, call next()
    function(req, res) {
        req.socket.setTimeout(TIMEOUT);
        queryRemote(req, res);
    });
My problem is the following: when ab issues, let's say, 20 POSTs to /, the express route handler gets called, like, thousands of times. That doesn't always happen; sometimes 20 and only 20 requests are processed in a timely fashion.
Of course, ab is not the problem. I'm 100% sure that only 20 requests are sent by ab, but the route handler gets called many more times.
I can't find a reason for this behaviour. Any advice?
The timeouts were caused by http.globalAgent, which by default can process up to 5 concurrent requests to one host:port (which isn't enough in my case).
The thousands of requests (instead of tens) were actually sent by ab (a fact confirmed with Wireshark under OS X; I cannot reproduce this under Ubuntu inside Parallels).
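For reference, a minimal sketch of the fix (the limit of 500 is an arbitrary value for illustration; pick one that matches your load):

var http = require('http');

// Raise the per-host limit of the shared agent so that more than
// 5 outgoing requests to one host:port can be in flight at once.
http.globalAgent.maxSockets = 500;

// Alternatively, the request module accepts a pool option per call:
// request({ uri: remoteUri, pool: { maxSockets: 500 } }, callback);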
You can have a look at the node-http-proxy module and how it handles the connections. Make sure you don't buffer any data and that everything works by streaming. And you should try to see where the time is spent for those long requests: try instrumenting parts of your code with console.time and console.timeEnd and see which parts take the most time. If the time is mostly spent in javascript, you should try to profile it. Basically you can use the v8 profiler by adding the --prof option to your node command, which produces a v8.log that can be processed with a v8 tool found in node-source-dir/deps/v8/tools. It only works if you have built the d8 shell via scons (scons d8). You can have a look at this article to help you get this working.
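As a small illustration of the console.time idea (the label and URI here are hypothetical; wrap whichever stage of your pipeline you suspect):

var request = require('request');

// Time the remote call; 'remote' is just a label for this measurement.
console.time('remote');
request({ uri: 'http://localhost:3000/' }, function(err, httpResponse, body) {
    console.timeEnd('remote'); // prints e.g. "remote: 412ms"
    if (err) throw err;
    console.log(body.length + ' bytes received');
});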
You can also use node-webkit-agent, which uses the webkit developer tools to show the profiler results. You can also have a look at my fork, which adds a bit of sugar.
If that doesn't work, you can try profiling with dtrace (this only works on illumos-based systems like SmartOS).
I am trying to create a server-side solution which periodically pushes data to the client (no client-side polling) via node.js. The connection should be open permanently and whenever the server has new data, it pushes it down to the client.
Here is my simple sample script:
var sys = require('sys'),
    http = require('http');

http.createServer(function (req, res) {
    res.writeHead(200, {'Content-Type': 'text/html'});
    sys.puts('Start sending...');
    setInterval(function() {
        res.write("<script type='text/javascript'>document.write('test<br>')</script>");
    }, 10000);
}).listen(8010);
This basically works, but it seems that only one client at a time can be connected.
If I open http://127.0.0.1:8010/ in my browser, I see the new output written every 10 seconds. But when I open another tab with the same URL, it just loads forever. Only when I close the first tab do I get content from the server.
What do I need to do in order to serve multiple clients?
This is definitely a bug. What happens is that the browser re-uses the same connection due to keep-alive and HTTP/1.1, and Node screws up.
You can see this at work in Opera 11: open the page twice and it's the exact same page, using the exact same connection.
Curl and everything else that doesn't set Connection: keep-alive works just fine, but browsers fail to open the same page twice. You can, however, open 'localhost:8010' and 'localhost:8010/foo' and it will work on both pages.
Note: this only affects GET requests; POST requests work just fine, since there's no re-use of the connection.
I've filed an issue on this.
You should use socket.io. It handles all the heavy lifting for you and is a really cool library.
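A minimal sketch of the push pattern using socket.io's classic 0.x API (the event name 'update' and the payload are made up for illustration):

var io = require('socket.io').listen(8010);

io.sockets.on('connection', function(socket) {
    // Push data to this client every 10 seconds; each connected
    // client gets its own interval, so many tabs can listen at once.
    var timer = setInterval(function() {
        socket.emit('update', { msg: 'test', at: Date.now() });
    }, 10000);

    socket.on('disconnect', function() {
        clearInterval(timer); // stop pushing once the client is gone
    });
});

On the client you would include /socket.io/socket.io.js (which socket.io serves itself) and listen for the 'update' event.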
Be careful with this!
node.js is non-blocking, but at the same time it handles only one connection at a time. What you did puts node into a dead state; that's why you only see data on the second client when you close the first.