I want to test how many concurrent connections my Node.js application can handle. The application is ready; what is the best way to test its capacity?
If your application is using HTTP, I recommend Siege.
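A typical invocation, assuming your app listens on localhost:3000 (adjust the URL), simulates 100 concurrent users hitting the server for 30 seconds:

```
# 100 concurrent users, 30 seconds, benchmark mode (no think-time between requests)
siege -b -c 100 -t 30S http://localhost:3000/
```

Siege then reports transaction rate, throughput, and failed connections, which gives you a first estimate of capacity.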
I have an Erlang app with an API, and I need to call these API functions from a Node.js server and process the response. For example: Node.js sends data to the app, the app processes the data and sends it back, and finally Node.js processes the result.
So far my best idea was launching the app as a child process from a shell, but that is really hard to work with, and when I looked it up, all I found was people suggesting not to use Node.js, which is unfortunately not an option for me.
EDIT: For clarification, my question is: what is the best way to call the Erlang functions from Node.js?
I'm not sure I understand your request completely, but for running a Node.js project on a server, I highly recommend using pm2. pm2 will manage your Node.js application. See http://pm2.keymetrics.io/
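A few typical pm2 commands, as a sketch (app.js is a placeholder for your entry point):

```
pm2 start app.js --name my-app   # launch and daemonize the app
pm2 start app.js -i max          # cluster mode: one process per CPU core
pm2 logs my-app                  # tail the application's logs
pm2 restart my-app               # restart the app
```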
I don't know how large the data you are sending is, but if it is large enough that processing takes longer than ~200 ms, you might want to look into processing it asynchronously.
My suggestion is to implement the Erlang application's API as a RESTful API using one of the Erlang open-source web servers (Cowboy, Mochiweb, Webmachine). In that case you can invoke the Erlang API from Node.js using an HTTP client (there are many HTTP client implementations for JavaScript, and for Node.js especially). This approach is easy to implement and maintain.
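As a minimal sketch of the Node.js side, assuming the Erlang app (e.g. behind Cowboy) exposes a hypothetical POST /api/process endpoint on localhost:8080 that accepts and returns JSON (endpoint, port, and payload shape are all placeholders):

```js
// Call a hypothetical Erlang REST endpoint from Node.js using the built-in http module.
const http = require('http');

function callErlangApi(payload) {
  return new Promise((resolve, reject) => {
    const body = JSON.stringify(payload);
    const req = http.request({
      hostname: 'localhost',
      port: 8080,
      path: '/api/process',
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        'Content-Length': Buffer.byteLength(body),
      },
    }, (res) => {
      let data = '';
      res.on('data', (chunk) => { data += chunk; });
      res.on('end', () => resolve(JSON.parse(data)));
    });
    req.on('error', reject);
    req.write(body);
    req.end();
  });
}

callErlangApi({ numbers: [1, 2, 3] })
  .then((result) => console.log('Erlang returned:', result))
  .catch(console.error);
```

If processing takes long, the Erlang side could instead answer 202 Accepted immediately and let Node.js poll for the result, which also addresses the asynchronous-processing point above.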
There is no easy way to invoke Erlang functions remotely from JavaScript. Erlang has the ability to communicate natively only with C/C++ (via Erlang ports) and Java (via JInterface) applications.
I think the question is pretty explicit. JavaScript is single-threaded, and Node.js still achieves incredible performance. It might seem obvious that multi-threading would push Node.js performance further, but in some cases that is wrong.
For example, I'm currently building a starter project using Next.js. I wonder if handling each request in a separate thread would be worth it.
Thank you!
As far as I know, in production Node.js is usually deployed as:
an nginx server (used as a security layer and as an HTTPS proxy)
a number of child Node.js processes (count === number of cores)
That means that all cores are used, each request is processed by a single core, and the server handles several requests at once.
=== UPDATE ===
If you want to divide the processing of a single request across several threads, just remember that cross-process communication is expensive in Node.js, so only delegate large tasks to other threads/webworkers (see the sketch below).
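As a minimal sketch, using the built-in worker_threads module (Node.js 12+); the Fibonacci function is just a stand-in for whatever CPU-bound work you would offload:

```js
// Delegate a heavy computation to a worker thread so the event loop stays free.
const { Worker, isMainThread, parentPort, workerData } = require('worker_threads');

if (isMainThread) {
  // Main thread: spawn a worker running this same file and wait for its result.
  const worker = new Worker(__filename, { workerData: 40 });
  worker.on('message', (result) => console.log('fib(40) =', result));
  worker.on('error', console.error);
} else {
  // Worker thread: do the expensive work and send the result back.
  const fib = (n) => (n < 2 ? n : fib(n - 1) + fib(n - 2));
  parentPort.postMessage(fib(workerData));
}
```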
If you see the need to split the app into several threads, consider a microservices architecture instead, e.g. using http://senecajs.org/
Does a Node.js app with thousands of concurrent users really need to use a connection-pooling mechanism?
EDITED:
The app could be an e-commerce app that requires a high volume of database reads and writes.
Not necessarily; it depends on the situation. You should be able to handle thousands of concurrent connections, but of course it all depends on what you do in those connection handlers. That is the only answer that can really be given with so few details in the question.
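That said, if those handlers hit a database such as MySQL, a pool usually pays off, because opening a fresh connection per request is expensive. A minimal sketch, assuming the mysql2 package (npm install mysql2) and placeholder credentials/schema:

```js
// A shared MySQL connection pool: thousands of app users reuse a small,
// capped set of database connections instead of opening one each.
const mysql = require('mysql2/promise');

const pool = mysql.createPool({
  host: 'localhost',
  user: 'shop',
  password: 'secret',       // placeholder credentials
  database: 'ecommerce',    // placeholder schema
  connectionLimit: 10,      // at most 10 open connections; extra queries queue
});

async function getProduct(id) {
  const [rows] = await pool.query('SELECT * FROM products WHERE id = ?', [id]);
  return rows[0];
}
```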
I'm programming a server-side application which will manage requests from:
Game client
Website (HTTP requests)
API
As of now I'm using only one (Node.js) application for every type of request; the problem is that with a growing user base, this approach will create a bottleneck.
I would like some advice on how to design the server-side architecture so that it will be scalable.
The only solution I know of is to run multiple servers with the same application, sharing state through the same memory store (a Redis server).
Is it possible in Node.js to split the handling of these types of requests across multiple servers? Maybe one or more servers for each type of request?
Currently I'm using:
NodeJS
Redis
MySQL
Express
Socket.io
Thanks in advance, can you recommend some books on this matter?
To harness a multicore architecture on a single machine, you can use the Node.js Cluster module (https://nodejs.org/api/cluster.html).
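A minimal sketch of the Cluster module, forking one worker per CPU core (the port 3000 is a placeholder):

```js
// Fork one worker per core; the workers share the listening socket,
// so requests are spread across all cores of the machine.
const cluster = require('cluster');
const http = require('http');
const os = require('os');

if (cluster.isMaster) {
  for (let i = 0; i < os.cpus().length; i++) cluster.fork();
  cluster.on('exit', (worker) => {
    console.log(`worker ${worker.process.pid} died, restarting`);
    cluster.fork(); // keep the worker pool full
  });
} else {
  http.createServer((req, res) => res.end(`handled by ${process.pid}\n`))
      .listen(3000);
}
```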
I think it is a good idea to split the API and the website into different applications. If you decide to run multiple Node.js applications on one machine, try using pm2 (http://pm2.keymetrics.io/). You could probably also split your API into a bunch of small applications, which is called a microservice architecture. I personally don't like the microservice approach, but you can check the web for its pros and cons.
Also, if you deploy your application (or bunch of applications) on different virtual/physical machines (which is usual in production), you can use HAProxy for load balancing and fault tolerance (http://www.haproxy.org/).
I am developing a chat application using Node.js/socket.io on the server side.
Now it is time to test how scalable it is.
So I think I can also simulate a large number of socket.io clients effectively using Node.js, but running the client code this time.
The question is: how can I run the socket.io client library on Node.js? Is this possible?
If so, can anyone please provide a simple example?
My code runs fine in the browser under the usual development load; the issue is not whether the code runs. Actually, I am not planning to run the same client code, just to open a large number of connections and send thousands of messages, to get a preliminary figure for scalability and resource consumption.
Also, any suggestions on testing socket.io server scalability will be appreciated.
Thanks a lot.
What you're looking for isn't really going to be helpful. Even if you could simulate client-side socket.io in a Node process, it wouldn't have the same dynamic properties as actual access from browsers. You'd be able to determine things like the maximum number of connections you could handle without running out of resources, but your general performance metrics would be pretty artificial and not generalizable.
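That said, for the narrow goal of counting connections and messages, the socket.io-client package does run under Node.js. A minimal sketch, assuming a server on localhost:3000 that listens for a 'chat message' event (the URL, event name, and counts are all placeholders):

```js
// Open many socket.io connections from one Node process to probe rough
// connection/message limits. Assumes socket.io-client v3/v4 (npm install socket.io-client).
const { io } = require('socket.io-client');

const NUM_CLIENTS = 1000;

for (let i = 0; i < NUM_CLIENTS; i++) {
  const socket = io('http://localhost:3000', { transports: ['websocket'] });
  socket.on('connect', () => {
    // Send a burst of messages per client to exercise the server.
    for (let m = 0; m < 10; m++) {
      socket.emit('chat message', `client ${i} message ${m}`);
    }
  });
  socket.on('connect_error', (err) => console.error(`client ${i}:`, err.message));
}
```

Watch the server's memory and CPU while this runs; as noted above, treat the numbers as a ceiling estimate, not as realistic browser traffic.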