How to do load testing of a Node.js server?

I want to write a web application with Node.js and MongoDB, and I have also been tasked with testing it. I would like to know whether there are any tools like JMeter, or anything else, for load/stress testing of Node.js.
EDIT
My application is going to be an information-extraction kind of application, and the client expects extraction to take no more than 10 seconds per document. Currently I have the same application written in C#, but it is not scaling up to the client's expectations. Then I came across this beautiful and fast Node.js, and I think Node.js can help me a lot.
Please enlighten!

Try nodeload: it's a collection of node.js modules for load testing HTTP services.
As a developer, you should be able to write load tests and get informative reports without having to learn another framework. You should be able to build by example and selectively use the parts of a tool that fit your task. Being a library means that you can use as much or as little of nodeload as makes sense, and you can create load tests with the power of a full programming language. For example, if you need to execute some function at a given rate, just use the 'nodeload/loop' module and write the rest yourself.
I just found out that this package is no longer under development, so here are some active forks:
https://github.com/gamechanger/nodeload
https://github.com/Samuel29/NodeStressSuite

Why couldn't you test a Node server with JMeter? For most load tests it doesn't matter what language your server is written in; you're just hitting it with a bunch of requests.
In any case, you could try loadtest, which is implemented in Node.
Runs a load test on the selected HTTP or WebSockets URL. The API allows for easy integration in your own tests.
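As a rough sketch of that API (the option and result field names below are from memory of the loadtest README, so verify them against the current docs; the URL and numbers are placeholders):

    // Hedged sketch: assumes the 'loadtest' npm package (npm install loadtest).
    const loadtest = require('loadtest');

    const options = {
      url: 'http://localhost:3000/',  // placeholder URL of the server under test
      maxRequests: 1000,              // stop after 1000 requests
      concurrency: 20,                // 20 concurrent clients
    };

    loadtest.loadTest(options, (error, result) => {
      if (error) {
        return console.error('Load test failed:', error);
      }
      console.log('Total requests:', result.totalRequests);
      console.log('Mean latency (ms):', result.meanLatencyMs);
      console.log('Requests per second:', result.rps);
    });

The command-line equivalent is roughly loadtest -n 1000 -c 20 http://localhost:3000/.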
Edit:
This answer provides more options:
NodeJs stress testing tools/methods [closed]

Try Artillery. Here are its features, as described in the documentation:
Multiple protocols: Load test HTTP, WebSocket, Socket.io, Kinesis, HLS and more.
Scenarios: Support for complex scenarios to test multi-step interactions in your API or web app (great for ecommerce, transactional APIs, game servers etc).
Load testing & Functional testing: reuse the same scenario definitions to run performance tests or functional tests on your API or backend.
Performance metrics: get detailed performance metrics (latency, requests per second, concurrency, throughput).
Scriptable: write custom logic in JS, using any of the thousands of useful npm modules.
Integrations: statsd support out of the box for real-time reporting (integrate with Datadog, Librato, InfluxDB etc).
Extensible: write custom reporters, custom plugins, custom protocol engines etc.
and more! HTML reports, nice CLI, parameterization with CSV files.
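As a loose illustration, an Artillery test is usually just a YAML script run from the CLI; the target, endpoint and numbers below are invented, so check the Artillery docs for the exact schema:

    # Hedged sketch of an Artillery script (run with: artillery run load-test.yml)
    config:
      target: "http://localhost:3000"   # placeholder server under test
      phases:
        - duration: 60                  # run for 60 seconds
          arrivalRate: 10               # 10 new virtual users per second
    scenarios:
      - flow:
          - post:
              url: "/extract"           # hypothetical endpoint
              json:
                document: "sample-001"  # hypothetical request body
          - think: 1                    # pause 1 second between steps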

Related

How to use Locust for UI performance testing?

I would like to use Locust for UI performance testing. How can I use Locust for UI performance testing, and how can I get the loading time of HTML elements (img, lists, etc.)?
Thanks
Locust isn't a browser and doesn't parse HTML. It just makes plain HTTP requests, and it will not load things like images based on the response.
If you need something like that, you would need to parse the HTML in the response and do the "dependent" requests in your test script.
Locust is not made for that (as said). There are some other fancy tools that will do it for you, e.g.:
k6.io (https://k6.io/ - previously known as LoadImpact) - lets you run performance checks outside of your environment and report the results back to the pipeline. Easy to configure and integrate, and great for more "clever" testing scenarios such as stress tests, load tests, etc. (a minimal script sketch follows this list).
sitespeed.io (https://www.sitespeed.io/) - my 2nd favorite: a very fun-to-use and easy-to-configure tool for tracking front-end performance and tests (e.g. done with Selenium)
Lighthouse Reports - can also serve as a pointer to the most common issues and be included as PR comments or notifications during the process (there are many GitHub Actions or DevOps packages that do this)
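For reference, a minimal k6 script is plain JavaScript along these lines (the URL, user count and threshold are placeholders):

    // Hedged sketch of a k6 script (run with: k6 run ui-perf.js)
    import http from 'k6/http';
    import { check, sleep } from 'k6';

    export const options = {
      vus: 10,           // 10 virtual users
      duration: '30s',   // run for 30 seconds
    };

    export default function () {
      const res = http.get('https://example.com/');  // placeholder URL
      check(res, {
        'status is 200': (r) => r.status === 200,
        'responded in under 1s': (r) => r.timings.duration < 1000,
      });
      sleep(1);
    }

Note that a plain k6 script also works at the HTTP level; for element-level load times you would still lean on browser-based tools such as sitespeed.io or Lighthouse.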
I've also gathered some of my findings in my recent talk (slides below), which is being converted into a series of blog posts on these topics; the first of them is already published:
Slide deck from my talk on "Modern Web Performance Testing": https://slides.com/zajkowskimarcin/modern-web-performance-testing/
First blog from the series on the same topic: https://wearecogworks.com/blog/the-importance-of-modern-web-performance-testing-part-1

Testing File Streaming in Cucumber

Any idea how to test file streaming in Cucumber?
Note this is a Java microservice with a client and server architecture.
The client talks to the server on a designated port. I just don't know how to do this.
Most of the examples that I have seen are Browser Based Testing with Selenium.
I am writing JUnit test cases for this and wanted to know how this is to be done.
I am new to Behavior Driven Testing and I find this really exciting!
You have to imagine you are the client consuming the service: when you use the service, what do you get back? If you are cuking, you need to think in business terms, i.e. it's about WHAT you are doing and WHY it's important, not HOW it's done. So ask WHAT the point of this service is and what value it gives (something like the feature sketch below).
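As a loose illustration only (the feature name, file name and step wording are all invented, not taken from your service), a business-facing scenario for a file-streaming service might read something like:

    # Hedged sketch: an invented Gherkin feature, to show the WHAT/WHY framing
    Feature: Document download
      As an API client
      I want to stream a document from the service
      So that I can process it without loading it fully into memory

      Scenario: Client streams an existing document
        Given a document "report.pdf" is available on the server
        When the client requests "report.pdf" over the designated port
        Then the client receives the complete file contents
        And the transfer completes without errors

The step definitions behind it would then be the JUnit/Java glue code that actually opens the connection and consumes the stream.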
If you just want to test that it works then I'd use a unit test tool instead.

Test the behavior of a java web service for multiple concurrent requests

How do I test the behavior of a Java RESTful web service in the case of multiple concurrent requests? Is there any 3rd-party tool that can be leveraged?
The service accepts the POST method. It expects a couple of parameters in its request body and produces the response in the form of JSON.
The functionality of the service is to perform database read operations using the request body parameters and populate the fetched data into the JSON response.
I would recommend one of the following:
SoapUI - a superior tool for web service testing, with limited load testing capabilities. However, it does not scale (no clustered mode is available) and has quite poor reporting (all you get are average, min and max response times).
Apache JMeter - a multiprotocol load testing tool that supports web service load testing as well. It has better load capabilities, more ways to define load patterns, and can present load test results via the HTML Reporting Dashboard (a sample non-GUI invocation is sketched below). Check out the Testing SOAP/REST Web Services Using JMeter article to learn how to conduct a web service load test using JMeter.
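As a rough pointer (the .jmx test plan and output paths are placeholders), a non-GUI run that also produces that dashboard looks something like:

    jmeter -n -t webservice-test.jmx -l results.jtl -e -o dashboard/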
You can try Gatling to generate some load.
It has nice documentation and an easy quickstart.
For advanced usage it requires some knowledge of Scala, but it also features a GUI tool for recording simple scenarios, so you can run some requests with Postman or whatever tool you use for debugging, record them, and turn that scenario into an automated one.
After running scenarios it generates nice reports using Graphite, so you can see response times and general stats.
Later you can also use Gatling for load and performance tests of your web service; it's convenient and fast once you start playing with it. It can easily generate up to 5k requests per second from my old Mac, or hold up to 1k connections.
One of the best tools for testing web services is SoapUI.
You can use it for what you want.
Link to SoapUI
You can check this link to see how to use SoapUI for concurrent tests.

What does building an application in Arango Foxx offer beyond a regular node application

I'm learning more about ArangoDB and its Foxx framework, but it's not clear to me what I gain by using that framework over building my own standalone Node.js app for API/access control, logic, etc.
What does Foxx offer that a regular Node.js app wouldn't?
Full disclosure: I'm an ArangoDB core maintainer and part of the Foxx team.
I would recommend taking a look at the webinar I gave last year for a detailed overview of the differences between Foxx and Node and the advantages of using Foxx when you are using ArangoDB. I'll try to give a quick summary here.
If you apply ideas like the Single Responsibility Principle to your architecture, your server-side code has two responsibilities:
Backend: persist and query data using the backend data storage (i.e. ArangoDB or other databases).
Frontend: transform the query results into a format acceptable for the client (e.g. HTML, JSON, XML, CSV, etc).
In most conventional applications, these two responsibilities are fulfilled by the same monolithic application code base running in the same process.
However the task of interacting with the data storage usually requires writing a lot of code that is specific to the database technology. You need to write queries (e.g. using SQL, AQL, ReQL or any other technology-specific language) or use database-specific drivers.
Additionally in many non-trivial applications you need to interact with things like stored procedures which are also part of the "backend code" but live in the database. So in addition to having the application server do two different tasks (storage and rendering), half the code for one of the tasks ends up living somewhere else, often using an entirely different language.
Foxx lets you solve this problem by allowing you to move the logic we identified as the "backend" of your server-side code into ArangoDB. Not only can you hide all the nitty gritty of query languages, edges and collections behind a more application-specific API, you also eliminate the network overhead often necessary to handle requests that would cause more than a single roundtrip to the database.
For trivial applications this may mean that you can eliminate the Node server completely and access your Foxx API directly from the client. For more complicated scenarios you may want to use Node to build external micro services your Foxx service can tap into (e.g. to interface with external non-HTTP APIs). Or you just put your conventional Node app in front of ArangoDB and use Foxx to create an HTTP API that better represents your application's problem domain than the database's raw HTTP API.
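As a very rough sketch of what such an application-specific endpoint can look like (the collection name, route and query are invented for illustration; see the Foxx documentation for the actual API):

    // Hedged sketch of a Foxx service endpoint; 'documents' is a hypothetical collection.
    'use strict';
    const { db, aql } = require('@arangodb');
    const createRouter = require('@arangodb/foxx/router');

    const router = createRouter();
    module.context.use(router);

    // Expose a domain-specific route instead of raw AQL over the database's HTTP API.
    router.get('/documents/recent', (req, res) => {
      const docs = db._query(aql`
        FOR doc IN documents
          SORT doc.createdAt DESC
          LIMIT 10
          RETURN doc
      `).toArray();
      res.json(docs);
    });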
It's also worth keeping in mind that structurally Foxx services aren't entirely dissimilar from Node applications. You can use NPM dependencies and split your code up into modules and it can all live in version control and be deployed from zip bundles. If you're not convinced I'd suggest giving it a try by implementing a few of your most frequent queries as Foxx endpoints and then deciding whether you want to move more of your logic over or not.

What is the latest and greatest socket.io benchmark module currently?

I would like to script benchmark of my socket.io implementation.
After some research I have identified several Node.js modules, but they either have not been updated in years (wsbench), only support the raw WebSocket protocol (wsbench, thor), or test the socket.io project itself rather than a socket.io implementation (socket.io-benchmark).
Since the socket.io project has been highly active over the past year, I wonder what the latest and greatest tool/module to use for benchmarking is.
My requirements:
Easy to script and run the tests
Test reports giving good overview of test runs
Test reports should be easy to save in order to compare with later benchmarking
Just came across this in search of some benchmarking for my Socket.IO project.
I found socket.io-benchmark; however, I had some additional items that I wanted to work through, and found one of the forks nearly there.
https://github.com/slowthinker/socket.io-benchmark
I also forked it to add a cap on messages/second sent, to give it more realistic parameters.
Hope that helps!
I would suggest Artillery: a modern, powerful, easy-to-use, open-source load-testing toolkit: https://github.com/shoreditch-ops/artillery
Here are some of its features:
Multiple protocols: load-test HTTP, WebSocket and Socket.io applications
Scenarios: Specify scenarios to test multi-step interactions in your API or web app
Performance metrics: get detailed performance metrics (latency, requests per second, concurrency, throughput)
Scriptable: write custom logic in JS to do pretty much anything
High performance: generate serious load on modest hardware
Integrations: statsd support out of the box for real-time reporting (integrate with Datadog, Librato, InfluxDB etc)
Extensible: custom reporting plugins, custom protocol engines etc
and more! HTML reports, nice CLI, parameterization with CSV files
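For Socket.io specifically, a scenario sketch looks roughly like this (the event name and payload are invented; check the Artillery Socket.io engine docs for the current schema):

    # Hedged sketch of an Artillery Socket.io scenario (run with: artillery run socketio-test.yml)
    config:
      target: "http://localhost:3000"   # placeholder socket.io server
      phases:
        - duration: 60
          arrivalRate: 20               # 20 new clients per second
    scenarios:
      - engine: "socketio"              # use the Socket.io engine instead of plain HTTP
        flow:
          - emit:
              channel: "chat message"   # invented event name
              data: "hello from artillery"
          - think: 1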
