How to get current memory usage using Node.js?
I have a backend application. When the operating system's RAM usage is greater than 7 GB, I want to decline user requests.
The main tools built into Node.js, without reaching for external programs, are these:
process.memoryUsage()
process.memoryUsage.rss()
You probably want the second one, because the resident set size is the closest measure of the total OS RAM allocated to the process.
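For instance, here's a minimal sketch of declining requests past a threshold (the plain http server, the 503 response, and the exact threshold check are my assumptions, not part of the question):

const http = require('http');
const os = require('os');

const LIMIT_BYTES = 7 * 1024 * 1024 * 1024; // 7 GB

const server = http.createServer((req, res) => {
  // OS-wide RAM currently in use; swap in process.memoryUsage.rss()
  // (Node 14.18+/15.6+) if you only care about this one process.
  const usedByOs = os.totalmem() - os.freemem();
  if (usedByOs > LIMIT_BYTES) {
    res.writeHead(503, { 'Content-Type': 'text/plain' });
    res.end('Under memory pressure, try again later.');
    return;
  }
  res.end('OK');
});

server.listen(3000);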
Related
I have multiple micro-services written in Node and running on pm2. Whenever there is high traffic on any of these micro-services, the memory doesn't exceed 800 MB even though the system has more than 10 GB of memory free. Instead, the system becomes slow. I have used only the command below, with no additional settings, to start the services.
pm2 start app.js --name='app_name'
I have gone through the pm2 docs, but they only mention limiting memory usage with max-memory-restart. Is there a way I can make sure my micro-services use all the available system memory?
Whenever there is high traffic on any of these micro-services, the memory doesn't exceed 800 MB even though the system has more than 10 GB of memory free. Instead, the system becomes slow.
You need to look at CPU metrics too, not just memory. More likely than not, those services aren't starved for memory (if they were, the system would start swapping to disk); they're simply saturating your server's CPUs.
Profiling your services wouldn't hurt either, to find any possible bottlenecks or stalls that occur during high load.
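A quick way to check this with pm2's own tooling:

pm2 monit                # live per-process CPU and memory
pm2 describe app_name    # detailed stats for one service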
Is there a way I can make sure my micro-services use all the available system memory?
Yes, there is: use more memory in those services. There's no intrinsic limit unless you've configured one.
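That said, V8 does have its own default heap ceiling independent of pm2; if you determine that's what you're hitting, you can raise it by passing V8 flags through pm2's --node-args option (the 4096 MB value below is just an illustration):

pm2 start app.js --name='app_name' --node-args='--max-old-space-size=4096'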
I'm running Node.js on a server with only 512MB of RAM. The problem is when I run a script, it will be killed due to out of memory.
By default the Node.js memory limit is 512MB. So I think using --max-old-space-size is useless.
Here is the content of /var/log/syslog:
Oct 7 09:24:42 ubuntu-user kernel: [72604.230204] Out of memory: Kill process 6422 (node) score 774 or sacrifice child
Oct 7 09:24:42 ubuntu-user kernel: [72604.230351] Killed process 6422 (node) total-vm:1575132kB, anon-rss:396268kB, file-rss:0kB
Is there a way to avoid the out-of-memory kills without upgrading the memory (like using persistent storage as additional RAM)?
Update:
It's a scraper that uses the node modules request and cheerio. When it runs, it opens hundreds or thousands of webpages (though not in parallel).
If you're giving Node access to every last megabyte of the available 512 and it's still not enough, then there are two ways forward:

Reduce the memory requirements of your program. This may or may not be possible. If you want help with this, you should post another question detailing your functionality and memory usage.

Get more memory for your server. 512 MB is not much, especially if you're running other services (such as databases or message queues) which require in-memory storage.

There is a third possibility: using swap space (disk storage that acts as a memory backup). This will have a strong impact on performance, but if you still want it, Google how to set it up for your operating system; there are a lot of articles on the topic. This is OS configuration, not Node's.
Old question, but maybe this answer will help people. Using --max-old-space-size is not useless.

Before Node.js 12, the heap size limit depended on the OS architecture (32- or 64-bit). According to the documentation, on 64-bit machines the old generation alone would default to 1400 MB, far beyond your 512 MB.

Since Node.js 12, the default heap size takes the system RAM into account; however, Node's heap isn't the only thing in memory, especially if your server isn't dedicated to it. Setting --max-old-space-size puts a limit on the old-generation heap, and when your application gets close to it, the garbage collector is triggered and tries to free memory.

I've written a post about how I observed this: https://loadteststories.com/nodejs-kubernetes-an-oom-serial-killer-story/
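As a quick sanity check (my own sketch, not something from the post above), you can ask V8 which heap limit it actually applied:

// check-heap.js -- run with: node --max-old-space-size=400 check-heap.js
const v8 = require('v8');

const { heap_size_limit, used_heap_size } = v8.getHeapStatistics();
console.log('heap limit:', Math.round(heap_size_limit / 1024 / 1024), 'MB');
console.log('heap used :', Math.round(used_heap_size / 1024 / 1024), 'MB');

With the flag set, heap_size_limit should come out near the value you passed (V8 adds some overhead for the young generation).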
I'm writing a simple CMS in Node.js, Express, and MongoDB. I'm planning to run a different Node.js process for every site. The problem is that after startup each process takes about 90 MB of RAM, which is too big for me (eight sites would take all the server's RAM). This memory is taken after the first connection to the site, and subsequent connections don't affect it.

Is there a guideline or a list of "best practices" to optimize this memory usage? I'm trying to track where the memory is allocated with process.memoryUsage() or a similar function, but it's not simple to do.

It's not a problem of memory leaks or anything similar, because the memory usage doesn't grow after the first connection; so the optimization could probably lie in loading fewer modules or doing something differently...
The links below may help you to understand and detect memory leaks (if they do exist):
Debugging memory leaks in node.js
Detecting Memory Leaks in Node.js Applications
Tracking Down Memory Leaks in Node.js
These SO questions may also be useful:
How to monitor the memory usage of Node.js?
Node.js Memory Leak Hunting
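As a concrete starting point from those links, Node's built-in inspector lets you take heap snapshots in Chrome DevTools without installing any extra modules:

node --inspect app.js
# then open chrome://inspect in Chrome, attach to the process,
# and take a heap snapshot in the Memory tab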
Here is a quick fix: a Node.js lib that will restart any node process once it reaches a certain memory size.
https://github.com/DoryZi/memory_limiter
Set the --max_old_space_size CLI flag to control the maximum heap size. There's a post that describes Running a node.js app in a low-memory environment.

tl;dr: Try setting this value, in megabytes, to about 80% of the maximum memory footprint you want node to stay under. E.g., to run app.js and keep it under 500 MB of RAM:
node --max_old_space_size=400 app.js
This setting is also described in the Node.js CLI documentation.
Is there a way to find the peak memory usage for the current Node.js process? Platform-independent would be best, but otherwise something Linux-only works. No extra tools allowed, like valgrind for example.
You could take snapshots using the process.memoryUsage() method, and cache them.
Node.js Docs for process module
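A minimal sketch of that snapshot approach (the one-second interval and the choice of RSS are mine):

let peakRss = 0;

const timer = setInterval(() => {
  const { rss } = process.memoryUsage();
  if (rss > peakRss) peakRss = rss;
}, 1000);
timer.unref(); // don't let the sampler keep the process alive

process.on('exit', () => {
  console.log('peak RSS: ' + Math.round(peakRss / 1024 / 1024) + ' MB');
});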
I am working in an embedded environment, where resources are quite limited. We are trying to use node.js, which works well, but typically consumes about 60 megabytes of virtual memory (real memory used is about 5 megabytes.) Given our constraints, this is too much virtual memory; we can only afford to allow node.js to use about 30 megabytes of VM, at most.
There are several command-line options for node.js, such as "--max_old_space_size", "--max_executable_size", and "--max_new_space_size", but after experimentation I find that these all control real memory usage, not maximum virtual memory size.

If it matters, I am working on an Ubuntu Linux variant on an ARM architecture.
Is there any option or setting that will allow one to set the maximum amount of virtual memory that a node.js process is allowed to use?
You can use softlimit to execute node with a limited memory size. Or you can use the Linux setrlimit call directly, though I'm not really sure how to call it from Node.js; see this SO question.
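For example, the shell built-in ulimit (a front end for setrlimit) can cap the virtual address space before node starts; be aware that node may fail to start if the cap is below what V8 wants to reserve (the 30 MB figure mirrors the question's target):

# value is in kilobytes: 30 * 1024 = 30720
ulimit -v 30720
node app.js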