why subsequent HTTP requests - node.js

My JavaScript makes an AJAX call which retrieves a JSON array.
I am trying to simulate a long-running HTTP REST request that takes a while to return its results.
The way I do it is to delay writing anything to the response object on the server side until 5 minutes have elapsed since the request landed. After that I set the status to 200, write the JSON to the response, and end the stream.
Putting a breakpoint on the server side, I can see the request show up a second time, but the browser's Network tab does not show another request being made.
It may not be relevant, but I am using browser-sync middleware to serve this JSON, writing the bytes and ending the response in setTimeout().
setTimeout(() => {
  // after the delay, finally answer the request
  res.statusCode = 200;
  res.write(data);
  res.end();
}, 5 * 60 * 1000); // 5 minutes
Question:
Does anyone have an explanation for why this is happening? And is there another way to simulate this?

In most cases the browser should retry if the connection is closed before a response arrives. Here is a link to the details => HTTP spec: Client Behavior if Server Prematurely Closes Connection
BTW, it might help to use Chrome's throttling options in the Network section of the dev tools (F12).
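One thing worth checking, assuming a plain Node http server underneath browser-sync: Node's default socket idle timeout (2 minutes in older versions) is shorter than the 5-minute delay, so the server may drop the connection and the browser quietly retries. A minimal sketch of a delayed response with that timeout disabled (port and payload are placeholders):
const http = require('http');

const server = http.createServer((req, res) => {
  // keep this socket open past the default idle timeout
  req.socket.setTimeout(0);

  setTimeout(() => {
    res.statusCode = 200;
    res.setHeader('Content-Type', 'application/json');
    res.end(JSON.stringify([1, 2, 3])); // placeholder payload
  }, 5 * 60 * 1000); // 5 minutes
});

// alternatively, server.setTimeout(0) disables the idle timeout globally
server.listen(8000);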

Related

what happens if neither res.send() nor res.end() is called in express.js?

I have a security issue where someone is trying to call random APIs that are not supported on our server but are commonly used for administration APIs in general. I set up the code below to handle the 404 case by not responding to the attack at all
url-not-found-handler.js
'use strict';

module.exports = function () {
  // 4XX - URLs not found: deliberately never respond
  return (req, res, next) => {
  };
};
What happens to the client is that it waits until the server responds. But I want to know: will this affect the performance of my express.js server, and what happens behind the scenes on the server when neither res.send() nor res.end() is called?
According to the documentation of res.end():
Ends the response process. This method actually comes from Node core,
specifically the response.end() method of http.ServerResponse.
And then response.end
This method signals to the server that all of the response headers and
body have been sent; that server should consider this message
complete. The method, response.end(), MUST be called on each response.
If you leave the request hanging, the HTTP server will keep data about it in memory. This means that if you let many requests hang, your memory use will grow and degrade your server's performance.
As for the client, it will have to wait until the request times out.
The best thing to do with a bad request is to reject it immediately, which frees the memory allocated for it.
You cannot prevent bad requests (maybe a firewall blocking requests from certain IP addresses?). The best you can do is handle them as fast as possible.
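As a sketch of that advice, the hanging handler above could instead reject unknown URLs right away:
'use strict';

// url-not-found-handler.js, rejecting immediately instead of hanging
module.exports = function () {
  return (req, res, next) => {
    // answer right away so the request's memory is freed
    res.status(404).end();
  };
};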

Node.js http GET request takes substantially longer than browser, REST client, et al

I am trying to make a simple GET request in Node.js, and the request takes 3-5 seconds to resolve, whereas the same request in a browser or REST client takes ~400ms. The server to which I am making the request is controlled by our server team, but before I bother them with request/resource monitoring, I wanted to ping the community to see if there are any "hey, check this setting first" kinds of tips you could offer.
The code essentially forwards incoming requests to our server:
http.createServer(function (req, res) {
  // forward the incoming request to the private host
  http.request({
    host: "our.private.host",
    port: 8080,
    path: req.url,
    headers: req.headers
  }, function (upstreamRes) {
    upstreamRes.resume(); // drain the upstream body so the agent socket is freed
    res.end("DONE: " + Date.now());
  }).end();
}).listen(8001);
I open my browser and type in the following URL:
http://localhost:8001/path/to/some/resource
... which gets forwarded on to the final destination:
http://our.private.host:8080/path/to/some/resource
Everything is working fine and I am getting the response I want, but it takes 3-5 seconds to resolve. If I paste the final destination URL directly into the browser or a REST client, it resolves quickly. I don't know much about our server, unfortunately, but I am looking more for node tips at this point. Note that the request pool isn't maxed out, as I am only making one request at a time from my local machine.
The first step is to gather some info on where the request is spending its time by looking at the exact timing of the network activity on your node server. You can do that with a tool that watches all network activity. I personally use Fiddler, but I know that WireShark is popular too.
Once that tool is installed and active, you can see how long the various steps in the life of your request are taking:
- DNS request to resolve the target IP address
- Time to connect to the target server
- Time to send the http request
- Time to receive the http response
- Time to send the response back to the original requester
Understanding which of these operations is much longer than expected will give you an idea of where to look further for the problem.
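If installing a network sniffer is not an option, you can get a rough version of the same breakdown from inside Node itself by timestamping the socket's lifecycle events. A minimal sketch, reusing the upstream host from the question (the mark helper is just for illustration):
var http = require('http');

var t0 = Date.now();
function mark(label) {
  console.log(label + ': ' + (Date.now() - t0) + 'ms');
}

var req = http.request({
  host: 'our.private.host', // the same upstream host as above
  port: 8080,
  path: '/path/to/some/resource'
}, function (res) {
  mark('first byte (headers received)');
  res.on('data', function () {}); // drain the body
  res.on('end', function () { mark('response complete'); });
});

req.on('socket', function (socket) {
  socket.on('lookup', function () { mark('DNS resolved'); });
  socket.on('connect', function () { mark('TCP connected'); });
});

req.end();
mark('request sent');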
FYI, there are pre-built tools such as nginx that can do this type of proxying by just setting some values in a configuration file without any custom coding.

Connecting to a Reliable Webservice with Nodejs

My application needs to receive a result from a Reliable Webservice. Here is the scenario:
First I send a CreateSequence request. Then the server replies with a CreateSequenceResponse message. Next I send the actual request to the webservice.
The webservice then sends a response with a 202 Accepted code and sends the result in a later message. All these messages contain the header Connection: keep-alive.
I made the request with http.ClientRequest. I could capture all the responses except the result; http.ClientRequest fires only one response event.
How can I receive the message which contains the result?
Is there any way to listen on the socket for the remaining data (socket.on('data') did not work)? I checked this with the ReliableStockQuoteService shipped with Apache Synapse. I would appreciate it if someone could help me.
When you get the response event, you are given a single argument, which is an http.IncomingMessage, which is a Readable stream. This means that you should bind your application logic to the data event of the response object, not to the request itself.
req.on('response', function (res) {
  res.on('data', console.log);
});
Edit: Here is a good article on how to make HTTP requests using Node.
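In practice you usually want to accumulate the chunks and act on the complete message once the stream ends; a minimal sketch of that:
req.on('response', function (res) {
  var chunks = [];
  res.on('data', function (chunk) {
    chunks.push(chunk); // each 'data' event delivers one piece of the body
  });
  res.on('end', function () {
    var body = Buffer.concat(chunks).toString('utf8');
    console.log('complete message:', body);
  });
});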

“Proxying” a lot of HTTP requests with Node.js + Express 2

I'm writing a proxy in Node.js + Express 2. The proxy should:
- decrypt the POST payload and issue an HTTP request to the server based on the result;
- encrypt the reply from the server and send it back to the client.
The encryption-related part works fine. The problem I'm facing is timeouts. The proxy should process requests in less than 15 secs, and most of them are under 500ms, actually.
The problem appears when I increase the number of parallel requests. Most requests complete ok, but some fail after 15 secs plus a couple of millis. ab -n5000 -c300 works fine, but with a concurrency of 500 it fails for some requests with a timeout.
I can only speculate, but it seems that the problem is the order of callback execution. Is it possible that the requests that come in first hang until ETIMEDOUT because node is focused on the later ones, which are still being processed in time, under 500ms?
P.S.: There is no problem with the remote server. I'm using request for interactions with it.
upd
Here is how things work, with some code:
function queryRemote(req, res) {
  var options = {}; // built based on req object (URI, body, authorization, etc.)
  request(options, function (err, httpResponse, body) {
    return err ? send500(req, res)
               : res.end(encrypt(body));
  });
}

app.use(myBodyParser); // reads hex string in payload
                       // and calls next() on 'end' event

app.post('/', [checkHeaders,  // check Content-Type and Authorization headers
               authUser,      // query DB and call next()
               parseRequest], // decrypt payload, parse JSON, call next()
  function (req, res) {
    req.socket.setTimeout(TIMEOUT);
    queryRemote(req, res);
  });
My problem is the following: when ab issues, let's say, 20 POSTs to /, the express route handler gets called something like thousands of times. That doesn't always happen; sometimes 20 and only 20 requests are processed in a timely fashion.
Of course, ab is not the problem. I'm 100% sure that only 20 requests were sent by ab. But the route handler gets called multiple times.
I can't find a reason for this behaviour; any advice?
The timeouts were caused by using http.globalAgent, which by default can hold up to 5 concurrent connections to one host:port (which isn't enough in my case).
The thousands of requests (instead of tens) were sent by ab itself (a fact confirmed with Wireshark under OS X; I could not reproduce this under Ubuntu inside Parallels).
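For reference, a minimal sketch of lifting that limit, either globally or per request (the URL and the number are placeholders, and the pool option assumes a request version that supports it):
var http = require('http');
var request = require('request');

// raise the global agent's per-host connection limit (the old default was 5)
http.globalAgent.maxSockets = 500;

// or give this client its own, larger pool
request({
  url: 'http://remote.example/api',
  pool: { maxSockets: 500 }
}, function (err, httpResponse, body) {
  // ...
});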
You can have a look at the node-http-proxy module and how it handles connections. Make sure you don't buffer any data and that everything works by streaming. And you should try to see where the time is spent for those long requests. Try instrumenting parts of your code with console.time and console.timeEnd and see what is taking the most time. If the time is mostly spent in javascript you should try to profile it. Basically you can use the v8 profiler by adding the --prof option to your node command, which produces a v8.log that can be processed with a v8 tool found in node-source-dir/deps/v8/tools. It only works if you have installed the d8 shell via scons (scons d8). You can have a look at this article to help you get this working.
You can also use node-webkit-agent, which uses the webkit developer tools to show the profiler results. You can also have a look at my fork with a bit of sugar.
If that doesn't work, you can try profiling with dtrace (only works on illumos-based systems like SmartOS).

node.js - push data to client - only one client can be connected?

I am trying to create a server-side solution which periodically pushes data to the client (no client-side polling) via node.js. The connection should stay open permanently, and whenever the server has new data, it should push it down to the client.
Here is my simple sample script:
var sys = require('sys'),
    http = require('http');

http.createServer(function (req, res) {
  res.writeHead(200, {'Content-Type': 'text/html'});
  sys.puts('Start sending...');
  // push a new chunk to the client every 10 seconds; the response is never ended
  setInterval(function () {
    res.write("<script type='text/javascript'>document.write('test<br>')</script>");
  }, 10000);
}).listen(8010);
This basically works, but it seems that only one client at a time can be connected.
If I open http://127.0.0.1:8010/ in my browser, I see the new output written every 10 seconds. But when I open another tab with the same url, it just loads forever. Only if I close the first tab do I get content from the server.
What do I need to do in order to serve multiple clients?
This is definitely a bug; what happens is that the browser re-uses the same connection due to keep-alive and HTTP/1.1, and Node screws up.
You can see this at work in Opera 11: open the page twice, and it's the exact same page, using the exact same connection.
Curl and everything else that doesn't set Connection: keep-alive works just fine, but browsers fail when opening the same page twice. Although you can open 'localhost:8010' and 'localhost:8010/foo' and it will work on both pages.
Note: this only affects GET requests; POST requests work just fine, since there's no re-using of the connection.
I've filed an issue on this.
You should use socket.io. It handles all the heavy lifting for you and is a really cool library.
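For instance, a minimal sketch of the same 10-second push done with socket.io (the event name and payload are made up for illustration):
var http = require('http');
var server = http.createServer().listen(8010);
var io = require('socket.io')(server);

io.on('connection', function (socket) {
  // every connected client gets its own timer
  var timer = setInterval(function () {
    socket.emit('update', { msg: 'test', at: Date.now() });
  }, 10000);
  socket.on('disconnect', function () {
    clearInterval(timer); // stop pushing once the client goes away
  });
});

// on the client: var socket = io('http://localhost:8010');
//                socket.on('update', function (data) { ... });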
Be careful with this!
node.js is non-blocking, but at the same time it handles only one connection at a time. What you did puts node into a dead state; that's why you see data on the second client only when you close the first.
