Edited Question
I was trying to understand why there is a memory leak in a simple function call. Why does Node not release the memory once the local scope has ended?
Thanks in advance
function somefunction() {
    var n = 20000;
    var x = {};
    for (var i = 0; i < n; i++) {
        x['some' + i] = { "abc": ("abc#yxy.com" + i) };
    }
}
// Memory Leak
var init = process.memoryUsage();
somefunction();
var end = process.memoryUsage();
console.log("memory consumed: " + ((end.rss - init.rss) / 1024) + " KB");
PREVIOUS ANSWER before the question was edited to correct a code error:
The results are invalid because this code doesn't invoke the function:
(function(){
somefunction();
});
The anonymous function is declared but never invoked, so it does not use much in the way of resources.
You need to invoke the function:
(function(){
somefunction();
}());
@Mohit: both strategies show the same memory consumption. Run each snippet separately and check for yourself.
EDIT:
Wait for the garbage collector. Once GC runs, the memory should be freed. Try calling gc explicitly and then check again.
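For example (a minimal sketch, assuming the script is started with --expose-gc so that global.gc is available):

// run with: node --expose-gc measure.js   (the file name is just an example)
function somefunction() {
    var n = 20000;
    var x = {};
    for (var i = 0; i < n; i++) {
        x['some' + i] = { "abc": ("abc#yxy.com" + i) };
    }
}

var init = process.memoryUsage();
somefunction();
if (global.gc) global.gc(); // force a full collection before the second reading
var end = process.memoryUsage();
console.log("memory consumed: " + ((end.rss - init.rss) / 1024) + " KB");

Note that rss may still not drop much even after a collection, because the operating system does not necessarily reclaim a process's resident pages right away; comparing heapUsed gives a better picture of what V8 itself is holding.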
Related
When I use worker_threads to handle a lot of complex logic that is unrelated to the main thread, I find that memory usage on the server gets very high.
Below is part of my simplified code.
main.js
const { Worker } = require("worker_threads")

const worker = new Worker(process.cwd() + "/worker.js")
// My business will repeat cycle infinitely, this code just an example
for (let i = 0; i < 1000000; i++) {
worker.postMessage(i)
}
worker.js
const { parentPort } = require("worker_threads")

parentPort.on("message", async data => {
    // a lot of logic....
})
When I run this script on the server, the memory keeps increasing. Maybe the thread is not shut down?
I tried using worker.removeAllListeners(), the resourceLimits option, and the third-party library "workerpool", but none of them solved my problem.
What should I do, or what other method can I use to solve this problem? Thanks for answering me!
I've been experimenting a little with Node.js, writing a webscraper that uses proxies.
What I'm trying to do is load all proxies from a proxy file, "proxies.txt" in this case.
What I don't understand is that while the webscraper is running, it gives me the following message:
(node:21470) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 pipe listeners added. Use emitter.setMaxListeners() to increase limit
This does not happen if I manually make an array of several proxies.
Here is the code where I load the proxies from the file into an array:
var fs = require('fs');

var proxy_array = [];
if (!proxy_array.length) {
    var proxy_file = fs.readFileSync('./proxies.txt', 'utf8');
    var split = proxy_file.split('\n');
    for (var i = 0; i < split.length; i++) {
        var trimmed_proxy = split[i].replace('\r', ''); // strip the \r left by Windows-style (CRLF) line endings
        proxy_array.push(trimmed_proxy);
    }
}
console.log(proxy_array); // it does return all proxies
Thanks for any help in advance!
Greetings
I am creating a SailsJS webserver with a background task that needs to run continuously (when the server is idle) - a task that synchronizes a database with some external data and pre-caches data to speed up requests.
I am using Sails version 1.0. The adapter is PostgreSQL (adapter: 'sails-postgresql'), adapter version 1.0.0-12.
Now while running this application I noticed a major problem: it seems that after some time the application inexplicably crashes with an out-of-heap-memory error. (I can't even catch this; the node process just quits.)
While hunting for the memory leak I tried many different approaches, and ultimately I could reduce my code to the following function:
async DoRun(runCount = 0, maxCount = undefined) {
    while (maxCount === undefined || runCount < maxCount) {
        this.count += 1;
        runCount += 1;
        console.log(`total run count: ${this.count}`);
        let taskList;
        try {
            this.active = true;
            taskList = await Task.find({}).populate('relatedTasks').populate('notBefore');
            //taskList = await this.makeload();
        } catch (err) {
            console.error(err);
            this.active = false;
            return;
        }
    }
}
To make it "testable" I reduced the heap size available to the application with --max-old-space-size=100; with this heap size it always crashes after around 2000 runs. However, even with an "unlimited" heap it crashes after a few (ten) thousand runs.
To test this further, I commented out the Task.find() command and implemented a dummy that creates the "same" result:
async makeload() {
    const promise = new Promise(resolve => {
        setTimeout(resolve, 10, this);
    });
    await promise;
    const ret = [];
    for (let i = 0; i < 10000; i++) {
        ret.push({
            relatedTasks: [],
            notBefore: [],
            id: 1,
            orderId: 1,
            queueStatus: 'new',
            jobType: 'test',
            result: 'success',
            argData: 'test',
            detail: 'blah',
            lastActive: new Date(),
            updatedAt: Date.now(),
            priority: 2
        });
    }
    return ret;
}
This runs well (so far), even after 20000 calls, with 90 MB of heap allocated.
What am I doing wrong in the first case? This leads me to believe that Sails has a memory leak. Or is node unable to free the database connections somehow?
I can't see anything that is blatantly "leaking" here. As I can see in the log, this.count is not a string, so it's not even leaking there (same for runCount).
How can I progress from this point?
EDIT
Some further clarifications/summary:
I run on node 8.9.0
Sails version 1.0
using the sails-postgresql adapter (1.0.0-12) (a beta version, as other versions don't work with Sails 1.0)
I run with the flag: --max-old-space-size=100
Environment variable: NODE_ENV=production
It crashes after approx 2000-2500 runs in the production environment (500 in debug mode).
I've created a GitHub repository containing a workable example of the code
here. Once again, to reproduce the crash reasonably "soon", set the flag --max-old-space-size=80 (or something alike).
I don't know anything about sailsJS, but I can answer the first half of the question in the title:
Does V8/Node actually garbage collect during function calls?
Yes, absolutely. The details are complicated (most garbage collection work is done in small incremental chunks, and as much as possible in the background) and keep changing as the garbage collector is improved. One of the fundamental principles is that allocations trigger chunks of GC work.
The garbage collector does not care about function calls or the event loop.
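A quick way to observe this (a rough sketch, not a precise benchmark): allocate garbage inside one long-running loop, with no function returns in between, and watch heapUsed rise and fall as collections are triggered purely by allocation pressure.

// heapUsed goes up and down mid-loop: GC runs without any function ever returning
let keep = null;
for (let i = 0; i < 1000000; i++) {
    keep = { payload: 'x'.repeat(100), i: i }; // the previous object becomes garbage
    if (i % 100000 === 0) {
        console.log((process.memoryUsage().heapUsed / 1024 / 1024).toFixed(1) + ' MB');
    }
}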
All:
I am pretty new to Node async programming, and I wonder how I can write an Express request handler that performs a time-consuming, computation-heavy task without blocking Express from handling the following requests.
I thought setTimeout could do that by putting the job on the event loop, but it still blocks other requests:
var express = require('express');
var router = express.Router();

function heavy(callback) {
    setTimeout(callback, 1);
}

router.get('/', function (req, res, next) {
    var callback = function (req, res) {
        var loop = +req.query.loop;
        for (var i = 0; i < loop; i++) {
            for (var j = 0; j < loop; j++) {}
        }
        res.send("finished task: " + Date.now());
    }.bind(null, req, res);
    heavy(callback);
});
I guess I did not understand how setTimeout works (my understanding was that after the 1 ms delay it would fire the callback in a separate thread/process without blocking other calls of heavy). Could anyone show me how to do this without blocking other requests to heavy()?
Thanks
Instead of setTimeout it's better to use process.nextTick or setImmediate (depending on when you want your callback to be run). But it is not enough to put long-running code into a function, because it will still block your thread, just a millisecond later.
You need to break your code up and run setImmediate or process.nextTick multiple times - for example on every iteration, scheduling the next iteration from the current one. Otherwise you will not gain anything.
Example
Instead of code like this:
var a = 0, b = 10000000;
function numbers() {
    while (a < b) {
        console.log("Number " + a++);
    }
}
numbers();
you can use code like this:
var a = 0, b = 10000000;
function numbers() {
    var i = 0;
    while (a < b && i++ < 100) {
        console.log("Number " + a++);
    }
    if (a < b) setImmediate(numbers);
}
numbers();
The first one will block your thread completely until it finishes, and the second one will not (or, more precisely, it will briefly block your thread 100,000 times, once per 100-number chunk, letting other stuff run in between those moments).
You can also consider spawning an external process, or writing a native add-on in C/C++ where you can use threads.
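For example, here is a minimal sketch of handing the counting loop above to a separate process with child_process.fork (the file names are made up for illustration):

// parent.js - the main process stays responsive while the child counts
var fork = require('child_process').fork;
var child = fork('./counter.js');
child.on('message', function (msg) {
    console.log('child finished, counted to ' + msg.a);
});
child.send({ b: 10000000 });

// counter.js - blocking here is harmless because it runs in a separate process
process.on('message', function (msg) {
    var a = 0;
    while (a < msg.b) { a++; }
    process.send({ a: a });
});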
For more info see:
How node.js server serve next request, if current request have huge computation?
Maximum call stack size exceeded in nodejs
Node; Q Promise delay
How to avoid jimp blocking the code node.js
NodeJS, Promises and performance
We know Node.js provides us with great power, but with great power comes great responsibility.
As far as I know the V8 engine doesn't do any garbage collection. So what are the most common mistakes we should avoid to ensure that no memory is leaking from my Node server?
EDIT:
Sorry for my ignorance; V8 does have a powerful garbage collector.
As far as I know the V8 engine doesn't
do any garbage collection.
V8 has a powerful and intelligent garbage collector built in.
Your main problem is not understanding how closures maintain a reference to the scope and context of outer functions. This means there are various ways you can create circular references, or otherwise create variables that just do not get cleaned up.
This happens because your code is ambiguous and the engine cannot tell whether it is safe to garbage collect it.
A way to force the GC to pick the data up is to null out your variables.
function waitForChange(foo, cb) { // named here for illustration; the original was anonymous
    var bigObject = new BigObject();
    doFoo(foo).on("change", function (e) {
        if (e.type === bigObject.type) {
            cb();
            // bigObject = null;
        }
    });
}
How does V8 know whether it is safe to garbage collect bigObject while the event handler's closure still references it? It doesn't, so you need to tell it the object is no longer used by setting the variable to null.
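An alternative (a sketch under the same assumptions; doFoo and BigObject are placeholders from the snippet above): remove the event handler itself once it has fired, so the whole closure, including bigObject, becomes unreachable.

function waitForChange(foo, cb) {
    var bigObject = new BigObject();
    var emitter = doFoo(foo);
    function onChange(e) {
        if (e.type === bigObject.type) {
            emitter.removeListener("change", onChange); // handler gone, closure collectable
            cb();
        }
    }
    emitter.on("change", onChange);
}

For single-shot handlers, emitter.once("change", ...) achieves the same effect.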
Various articles to read:
http://www.ibm.com/developerworks/web/library/wa-memleak/
I wanted to convince myself of the accepted answer, specifically:
not understanding how closures maintain a reference to scope and context of outer functions.
So I wrote the following code to demonstrate how variables can fail to be cleaned up, which people may find of interest.
If you have watch -n 0.2 'ps -o rss $(pgrep node)' running in another terminal, you can watch the leak occurring. Note how uncommenting either the buffer = null line or the process.nextTick wrapper allows the process to complete:
(function () {
    "use strict";
    var fs = require('fs'),
        iterations = 0,
        work = function (callback) {
            var buffer = '',
                i;
            console.log('Work ' + iterations);
            for (i = 0; i < 50; i += 1) {
                buffer += fs.readFileSync('/usr/share/dict/words');
            }
            iterations += 1;
            if (iterations < 100) {
                // buffer = null;
                // process.nextTick(function () {
                work(callback);
                // });
            } else {
                callback();
            }
        };
    work(function () {
        console.log('Done');
    });
}());
Activate explicit garbage collection with:
node --expose-gc test.js
and trigger it with:
global.gc();
Happy Coding :)