Does the request Node module limit outbound requests?

Running into an issue when load testing a server page where we make outbound requests to an API using the request module and then, on completion or error, redirect the page (using Express). Under stress, the page we are posting to times out, and I'm wondering if it has to do with the maxSockets parameter in that module. Does anyone know the default maxSockets for the request module, and how do I change it to something reasonable?
Thanks

maxSockets for the HTTP agent defaults to 5 per host. Just set it to whatever amount is useful to you, maybe 1,000.
It's documented here: https://nodejs.org/api/http.html#http_agent_maxsockets
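For example, a minimal sketch of both ways to raise the limit (the value 1000 and the URL are placeholders; pool is the request module's documented option for supplying agent settings):
var http = require('http');
var request = require('request');
// Option 1: raise the per-host limit on the shared global agent.
http.globalAgent.maxSockets = 1000;
// Option 2: give the request module its own connection pool with a higher limit.
var apiRequest = request.defaults({ pool: { maxSockets: 1000 } });
apiRequest.post('http://example.com/api', { form: { id: 1 } }, function (err, res, body) {
  if (err) { return console.error(err); }
  console.log('upstream responded with', res.statusCode);
});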

Related

Requests being doubled if Tomcat is slow to respond

We are working with the following stack:
A Node/Express middleware running behind Nginx communicates with an Apache server, which proxies the requests to Tomcat; Apache and Tomcat are located on another server. When an operation takes more than 15 seconds to complete, another identical request is sent, so there is obviously a 15-second retry policy somewhere.
So far I have been unable to detect exactly which component is doing this, and my Google searches have been fruitless. My question is whether anyone has experience with something like this: could it be Node, Nginx, or Apache that is sending the second request?
Any suggestions on where the double requests are coming from and what property I need to adjust to turn them off would be greatly appreciated.
The solution was to set the socket timeout property in Apache's mod_jk to 0.
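For reference, a hedged sketch of that setting in mod_jk's workers.properties (the worker name ajp13 is only a placeholder for whatever worker is actually defined there):
worker.ajp13.socket_timeout=0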

How to send requests through Nginx for rate limiting (throttling) per second?

I have a script in Node.js that makes requests to other APIs. The script can run locally, on a server, anywhere; it does not matter. The target servers have limits on requests per second. How can I send the requests through Nginx, and how should Nginx be configured for this? Should I change the URLs? For example, if I want to send requests to Facebook, should I configure a specific Nginx path such as localhost:8081/facebooklocal, send my requests to that path, and set request limits on it? Is there a better way?
I also need to limit requests that contain a specific parameter.
There are a lot of tutorials about incoming requests, but I am asking about outgoing requests (from a locally running server to some APIs).
How do I redirect requests sent to localhost:8085/api to theremotesite/api?
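A hedged sketch of the kind of Nginx configuration the question describes, using the standard limit_req_zone, limit_req, and proxy_pass directives (the zone name, burst size, and 10 requests per second rate are placeholders):
http {
  limit_req_zone $server_name zone=outbound_api:10m rate=10r/s;
  server {
    listen 8085;
    location /api/ {
      # Queue bursts instead of rejecting them, then forward to the remote API.
      limit_req zone=outbound_api burst=20;
      proxy_pass https://theremotesite/api/;
    }
  }
}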

Throttling express server

I'm using a very simple Express server with PUT and GET routes on an Ubuntu machine, but if I use several clients (around 8) making requests at the same time, it very easily gets flooded and starts to return connect EADDRNOTAVAIL errors. I have found no way to avoid this other than reducing the number of requests per client, but is there a way to throttle responses on the server so that instead of returning errors it queues the requests and serves them in due time?
Maybe it's better to check on the client whether earlier requests have been answered and not issue new ones if they have not been served? The client is here.
Queuing seems to be the wrong approach; you should first check your current ulimit (every connection needs a file handle).
To solve your problem, just raise the ulimit.
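For example, on Linux you can check and raise the limit for the shell that launches Node (65535 is only an illustrative value):
ulimit -n        # show the current open-file limit
ulimit -n 65535  # raise it for processes started from this shell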

How to send the maximum possible number of GET requests using Node.js?

My task is to send as many GET requests as possible using the standard Node.js http module (with http.get) to a remote server (for data import, not a DDoS :) ). But after a certain number of requests, sending either stops or continues very slowly.
I have already set http.globalAgent.maxSockets = Infinity, req.setNoDelay(true), and req.setSocketKeepAlive(true). I also make the requests through an async queue with a concurrency of 10-1000, and varying that only changes how many connections go out before everything grinds to a halt. I increased ulimit -n to its maximum.
Does anybody have advice or similar experience? Maybe I am doing something wrong?
See my related Node issues on GitHub and Stack Overflow.
Maybe the workaround described there works for you too: instead of modifying the globalAgent, I disabled it.
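For reference, a minimal sketch of that workaround (host and path are placeholders); passing agent: false makes each request bypass the pooled agent entirely, so no maxSockets limit applies:
var http = require('http');
http.get({ host: 'example.com', path: '/data', agent: false }, function (res) {
  res.resume(); // drain the response so the socket is released
});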

Why is node.js only processing six requests at a time?

We have a node.js server which implements a REST API as a proxy to a central server which has a slightly different, and unfortunately asymmetric REST API.
Our client, which runs in various browsers, asks the node server to get the tasks from the central server. The node server gets a list of all the task ids from the central one and returns them to the client. The client then makes two REST API calls per id through the proxy.
As far as I can tell, this stuff is all done asynchronously. In the console log, it looks like this when I start the client:
Requested GET URL under /api/v1/tasks/*: /api/v1/tasks/
This takes a couple seconds to get the list from the central server. As soon as it gets the response, the server barfs this out very quickly:
Requested GET URL under /api/v1/tasks/id/:id :/api/v1/tasks/id/438
Requested GET URL under /api/v1/workflow/id/:id :/api/v1/workflow/id/438
Requested GET URL under /api/v1/tasks/id/:id :/api/v1/tasks/id/439
Requested GET URL under /api/v1/workflow/id/:id :/api/v1/workflow/id/439
Requested GET URL under /api/v1/tasks/id/:id :/api/v1/tasks/id/441
Requested GET URL under /api/v1/workflow/id/:id :/api/v1/workflow/id/441
Then, each time a pair of these requests gets a result from the central server, another two lines are barfed out very quickly.
So it seems our node.js server is only willing to have six requests out at a time.
There are no TCP connection limits imposed by Node itself. (The whole point is that it's highly concurrent and can handle thousands of simultaneous connections.) Your OS may limit TCP connections.
It's more likely that you're either hitting some kind of limitation of your backend server, or you're hitting the builtin HTTP library's connection limit, but it's hard to say without more details about that server or your Node implementation.
Node's built-in HTTP library (and obviously any libraries built on top of it, which are most) maintains a connection pool (via the Agent class) so that it can utilize HTTP keep-alives. This helps increase performance when you're running many requests to the same server: rather than opening a TCP connection, making a HTTP request, getting a response, closing the TCP connection, and repeating; new requests can be issued on reused TCP connections.
In node 0.10 and earlier, the HTTP Agent will only open 5 simultaneous connections to a single host by default. You can change this easily: (assuming you've required the HTTP module as http)
http.globalAgent.maxSockets = 20; // or whatever
node 0.12 sets the default maxSockets to Infinity.
You may want to keep some kind of connection limit in place. You don't want to completely overwhelm your backend server with hundreds of HTTP requests in under a second; performance will most likely be worse than if you just let the Agent's connection pool do its thing, throttling requests so as not to overload your server. Your best bet will be to run some experiments to see what the optimal number of concurrent requests is in your situation.
However, if you really don't want connection pooling, you can simply bypass the pool entirely by setting agent to false in the request options:
http.get({host:'localhost', port:80, path:'/', agent:false}, callback);
In this case, there will be absolutely no limit on concurrent HTTP requests.
It's the limit on number of concurrent connections in the browser:
How many concurrent AJAX (XmlHttpRequest) requests are allowed in popular browsers?
I have upvoted the other answers, as they helped me diagnose the problem. The clue was that node's socket limit was 5, and I was getting 6 at a time. 6 is the limit in Chrome, which is what I was using to test the server.
How are you getting data from the central server? "Node does not limit connections" is not entirely accurate when making HTTP requests with the http module. Client requests made in this way use the http.globalAgent instance of http.Agent, and each http.Agent has a setting called maxSockets which determines how many sockets the agent can have open to any given host; this defaults to 5.
So, if you're using http.request or http.get (or a library that relies on those methods) to get data from your central server, you might try changing the value of http.globalAgent.maxSockets (or modify that setting on whatever instance of http.Agent you're using).
See:
http.Agent documentation
agent.maxSockets documentation
http.globalAgent documentation
Options you can pass to http.request, including an agent parameter to specify your own agent
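For example, a hedged sketch using a dedicated agent instead of the global one (the host and the limit of 20 are placeholders):
var http = require('http');
var proxyAgent = new http.Agent({ maxSockets: 20 });
http.get({ host: 'central-server.example', path: '/api/v1/tasks/', agent: proxyAgent }, function (res) {
  // Handle the response as usual; at most 20 sockets will be open to this host at once.
});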
Node.js can handle thousands of incoming requests, yes!
But when it comes to outgoing requests, every request has to deal with a DNS lookup, and DNS lookups, disk reads, etc. are handled by libuv, which is written in C++. The default thread pool size for each Node process is 4 threads.
If all 4 threads are busy with the DNS lookups for outgoing requests, further requests will be queued. That is why, no matter how brilliant your code might be, you sometimes only get 6 (or fewer) concurrent outgoing requests completed per second.
Learn about DNS caching to reduce the number of DNS lookups, and increase the libuv thread pool size. If you use PM2 to manage your Node processes, they have good documentation on their site about environment variables and how to inject them. What you are looking for is the environment variable UV_THREADPOOL_SIZE (default 4).
You can set the value anywhere between 1 and the maximum of 1024, but keep in mind that the libuv limit of 1024 applies across all event loops.
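For example, you could start the process like this (64 is only an illustrative value; the variable must be set before the thread pool is first used):
UV_THREADPOOL_SIZE=64 node server.js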
I have seen the same problem on my server. It was only processing 4 requests at a time.
As explained already, from 0.12 maxSockets defaults to Infinity. That easily overwhelms the server. Limiting the concurrent requests, for example with
http.globalAgent.maxSockets = 20;
solved my problem.
Are you sure it just returns the results to the client? Node processes everything in one thread, so if you do some fancy response parsing or anything else that doesn't yield, it will block all your requests.
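As a hedged illustration of that point (the route and the loop are made up), any synchronous, CPU-bound work like this in one handler stalls every other request until it finishes, because they all share the single JavaScript thread:
var express = require('express');
var app = express();
app.get('/api/v1/tasks/', function (req, res) {
  var total = 0;
  for (var i = 0; i < 1e9; i++) { total += i; } // CPU-bound loop: no other request is served while it runs
  res.json({ total: total });
});
app.listen(3000);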
