Memory Leak in Express.js with EventSource - node.js

I think I am running into a memory leak with an Express app when connecting x number of EventSource clients to it. After connecting the clients and sending them x messages and disconnecting them, my Express app only releases a small amount of the allocated Heap/RSS.
To confirm this I saved a Heapdump when starting the server and one after connecting 7,000 clients to it and sending x messages to each client. I waited for a while to give the GC a chance to clean up before taking the heap snapshot.
To compare these heap snapshots I loaded them in the Chrome Developer Tools Profile view and chose the "Comparison" mode.
My questions are:
1) How to interpret these numbers?
(For reference see the attached heap snapshot screenshot.)
2) For instance, it looks like the Socket objects hardly get freed at all. Is that correct?
3) Can you give me more tips to investigate the problem?

You could be free from the memory leak and, as a bonus, take pressure off the garbage collector.
All you have to do is object pooling.
You could do something like:
var clientsPool = new Array(1000); // pre-fill this with reusable client objects
var clientsConnected = [];
When a new client connects, you do
var newClient = clientsPool.pop();
//set your props here
clientsConnected.push(newClient);
That's an awesome way to avoid the garbage collector and prevent the memory leak. Sure, there's a little more work to it and you will have to manage the pool carefully, but it's totally worth it for the performance.
There's an awesome talk about it, here you go
https://www.youtube.com/watch?v=RWmzxyMf2cE
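A fuller, runnable sketch of that pooling idea (the `ClientSlot` shape and the `onConnect`/`onDisconnect` handler names are illustrative, not from the original app):

```javascript
// Pre-allocate a fixed pool of client slot objects, so connects and
// disconnects reuse memory instead of churning the garbage collector.
function ClientSlot() {
  this.res = null; // the HTTP response backing the EventSource stream
  this.id = null;
}

var clientsPool = [];
for (var i = 0; i < 1000; i++) {
  clientsPool.push(new ClientSlot()); // fill the pool up front
}
var clientsConnected = [];

function onConnect(res, id) {
  var slot = clientsPool.pop();       // reuse a pre-allocated slot
  if (!slot) slot = new ClientSlot(); // pool exhausted: grow as a fallback
  slot.res = res;
  slot.id = id;
  clientsConnected.push(slot);
  return slot;
}

function onDisconnect(slot) {
  var idx = clientsConnected.indexOf(slot);
  if (idx !== -1) clientsConnected.splice(idx, 1);
  slot.res = null; // drop references so the response itself can be freed
  slot.id = null;
  clientsPool.push(slot); // return the slot for reuse
}
```

The key point is that the slot objects themselves are never thrown away; only the references they hold are cleared, so allocation stays flat no matter how many clients come and go.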

As to my Comment...
JavaScript can't clear up a section of memory while anything is still pointing at it. About two years ago someone found an exploit based on this (it was quickly closed), and it works like this:
var someData = ["THIS IS SOME DATA SAY IT WAS THE SIZE OF A SMALL APPLICATION"];
var somePointer = someData[0];
delete someData;
They then injected an application into somePointer, since it was still a reference to a memory location even though the data was supposedly gone. Hey presto, you have injected memory.
So if there is a reference like somePointer = someData[0] above, you can't free the memory until you delete someData. You have to remove all references to anything you want cleaned up. In your case, ALL_CLIENTS.push(this); on line 64 keeps every client reachable through ALL_CLIENTS, so what you can do is:
Line 157
_.each(ALL_CLIENTS, function(client, i) {
    var u; // holds an undefined value (null, empty, nothing)
    client.close();
    ALL_CLIENTS[i] = u; // clear the slot (instead of delete ALL_CLIENTS[i])
    ALL_CLIENTS.unused++;
});
On another note, this is not strictly a memory leak. A memory leak would be if you closed this server and the memory did not free up after you exited it. If the memory is cleaned up behind itself, it's not a leak, just poor memory management.
Thanks to @Magus for pointing out that delete is not the best thing you could use. I would not normally recommend implementing a limiting structure like this, but you could try:
Line 27:
ALL_CLIENTS.unused = 0;
Line 64:
var u;
if (ALL_CLIENTS.unused > 0) {
    for (var i = 0; i < ALL_CLIENTS.length; i++) {
        if (ALL_CLIENTS[i] == u) { // found an emptied slot
            ALL_CLIENTS[i] = this;
            ALL_CLIENTS.unused--;
            i = ALL_CLIENTS.length; // break out of the loop
        }
    }
} else {
    ALL_CLIENTS.push(this);
}

Related

How to programmatically know when NodeJS application is running out of memory

How can I know when my application is running out of memory?
For me, I'm doing some video transcoding on the server and sometimes it leads to out of memory errors.
So, I wish to know when the application is running out of memory so I can immediately kill the Video Transcoder.
Thank you.
You can see how much memory is being used with the built-in process module.
const process = require("process");
The process module has a method called memoryUsage, which shows info on the memory usage in Node.js.
console.log(process.memoryUsage());
When you run the code, you should see an object with all of the information needed for memory usage!
$ node index.js
{
rss: 4935680,
heapTotal: 1826816,
heapUsed: 650472,
external: 49879,
arrayBuffers: 9386
}
Here is some insight on each property.
rss - (Resident Set Size) Amount of space occupied in the main memory device.
heapTotal - The total size of the heap allocated by the V8 engine.
heapUsed - The amount of heap actually in use by the V8 engine.
external - The memory usage of C++ objects bound to JavaScript objects (managed by V8).
arrayBuffers - The memory allocated for ArrayBuffers and Buffers.
For your question, you might need to use heapTotal and heapUsed. Depending on the value, you can then shut the service down. For example:
const process = require("process");
const mem = process.memoryUsage();
const MAX_SIZE = 50; // Change the value to what you want
if ((mem.heapUsed / 1000000) >= MAX_SIZE) {
videoTranscoder.kill(); // Just an example...
}
The division by one million part just converts bytes to megabytes (B to MB).
Change the code to what you like - this is just an example.

d3dBuffer created with various sizes not released

I noticed a problem with the DirectX api (driver from AMD).
If I create a d3d buffer using CreateBuffer() with incremental sizes and release it in a for loop, the memory diagnostic tools show that the process's private bytes are constantly increasing. I think it might be because GPU-mapped system memory is never released. By the way, the CPU heap size is stable.
For 1000 iterations, the buffer size grows from 1 MB to 1000 MB in 1 MB steps:
CreateBuffer using D3D11_USAGE_STAGING/D3D11_USAGE_DYNAMIC and D3D11_CPU_ACCESS_WRITE
d3dbuffer.Release() and d3dbuffer = nullptr
context ClearState() and Flush() to synchronously release the d3dbuffer
const unsigned long long MB = 1024 * 1024;
for (unsigned long long totalSize = 1 * MB; totalSize <= 1000 * MB; totalSize += 1 * MB)
{
    // create
    CComPtr<ID3D11Buffer> pd3dBuffer;
    D3D11_BUFFER_DESC bufferDesc;
    {
        ZeroMemory(&bufferDesc, sizeof(bufferDesc));
        bufferDesc.ByteWidth = static_cast<UINT>(totalSize);
        bufferDesc.Usage = D3D11_USAGE_DYNAMIC;
        bufferDesc.BindFlags = D3D11_BIND_SHADER_RESOURCE;
        bufferDesc.CPUAccessFlags = D3D11_CPU_ACCESS_WRITE;
        bufferDesc.MiscFlags = 0;
        bufferDesc.StructureByteStride = 0;
    }
    HRESULT hr = pd3dDevice->CreateBuffer(&bufferDesc, NULL, &pd3dBuffer);
    if (FAILED(hr)) break;
    // release
    pd3dBuffer.Release();
    pd3dDeviceContext->ClearState();
    pd3dDeviceContext->Flush();
}
The process memory usage keeps going up until it eventually reaches my 16 GB physical memory limit and the app crashes. This is weird, as I synchronously release each buffer right after creation, so the process memory usage should be relatively stable.
Can anyone explain how DirectX memory management works here?
After some investigation, it turns out this weird behavior is caused by the DirectX implementation. Basically, Microsoft as an OS company only defines the interfaces/specs for the DirectX APIs; third-party GPU vendors have to provide their own implementations of those APIs.
Nvidia provides one, AMD provides one, and if Qualcomm suddenly wanted to support DirectX, it would have to write one as well.
The buffer allocation and release mechanism is not fully defined in the DirectX specs, so it is up to the vendors to provide optimal memory management algorithms. In this case there is a bug in the vendor's implementation: released memory handles are not recycled, so the system memory committed for the buffer is never released, which causes this behavior.

Node.js search for memory leak

I am trying to get rid of a memory leak, but my understanding of things in this area is pretty low and I have nobody to ask for help except you guys. My script is eating the server's RAM and I can't figure out what is wrong with my approach.
I have this function:
function getPages(params) {
    gmail.users.messages.list(params, (err, resp) => {
        for (var message of resp.messages) {
            message['ownerEmail'] = currentUser;
            getMessage(message); // this does something with it later
            var message = null;
        }
        if (resp.nextPageToken) {
            params.pageToken = resp.nextPageToken;
            getPages(params);
        } else {
            // resolve end here...
        }
    }); // gmail.users.messages.list
} // getPages
getPages(params);
Basically it gets messages from the API and should do something with it afterwards. It will execute itself as long as there is more data to fetch. (as long as nextPageToken exists in response).
Now I ran this command:
$ free -lm
        total   used   free   shared   buff/cache   available
Mem:    11935   1808   7643      401         2483        9368
Low:    11935   4291   7643
High:       0      0      0
Swap:    6062      0   6062
As script is running buff/cache is constantly increasing.
What is the buff/cache thing actually and how is it related to my Node script?
How do I manage what is buffered/cached and how do I kill/clear such stuff?
How do I optimize function above to forget everything that is already processed?
How do I make sure that script takes absolutely zero resources once it is finished? (I even tried process.exit at the end of the script)
How do I debug and monitor RAM usage from my Node.js script?
I don't think there is a memory leak. I think you are in an infinite loop with the recursion. The gmail.users.messages.list call returns the response with resp.nextPageToken present (I suppose), and then you call getPages(params) again. Can you put a console.log just before the getPages(params) call? Something like this:
if (resp.nextPageToken) {
    params.pageToken = resp.nextPageToken;
    console.log('token', params.pageToken);
    getPages(params);
}
Then check how many times you print this and whether you ever get out of the recursion. Also, why do you set message to null inside the iteration? That is a redefinition of the variable.
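As a sketch of what such a log-and-cap guard might look like, with `listMessages` as a stand-in mock for `gmail.users.messages.list` that (like a misbehaving API) always returns another page token:

```javascript
// A page counter guards against endless recursion when the API keeps
// returning a nextPageToken.
const MAX_PAGES = 3;

// Mock of gmail.users.messages.list: always hands back another token.
function listMessages(params, cb) {
  cb(null, {
    messages: [{ id: params.pageToken || "start" }],
    nextPageToken: "tok-" + Math.random()
  });
}

let pagesFetched = 0;

function getPages(params, done) {
  listMessages(params, (err, resp) => {
    if (err) return done(err);
    pagesFetched++;
    console.log("page", pagesFetched, "token", params.pageToken);
    if (resp.nextPageToken && pagesFetched < MAX_PAGES) {
      params.pageToken = resp.nextPageToken;
      getPages(params, done); // recurse only while under the cap
    } else {
      done(null, pagesFetched);
    }
  });
}

getPages({}, (err, n) => console.log("fetched", n, "pages"));
```

If the log keeps printing against the real API, the token is genuinely advancing forever (or being resent), and that unbounded recursion, not a leak, is what eats the RAM.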
You can use N|Solid (it's free for development); you launch your app inside its wrapper. It's quite easy to use and it allows you to make a full profile of where the leak occurs.
You can also do it manually with the built-in debugger, checking memory consumption at each step.
Just to answer one of questions within the post:
How do I make sure that script takes absolutely zero resources once it
is finished? (I even tried process.exit at the end of the script)
There has been a misunderstanding:
http://www.linuxatemyram.com/
Don't Panic! Your ram is fine!
What's going on? Linux is borrowing unused memory for disk caching.
This makes it looks like you are low on memory, but you are not!
Everything is fine!

Memory leak updating geometry - ArcGis Runtime .Net

I'm working with the ArcGIS Runtime SDK for .NET v10.2.5.
I have a UDP socket listening and waiting for image data, which fires a function executed on a background thread.
I want to draw the image over an ellipse of arbitrary radius, so I use:
var filestream = System.IO.File.Open(imagepath, FileMode.Open, FileAccess.Read);
MapPoint point = new MapPoint(center.longitude, center.latitude, SpatialReferences.Wgs84);
var polySymbol = new Esri.ArcGISRuntime.Symbology.PictureFillSymbol();
await polySymbol.SetSourceAsync(filestream);
var param = new GeodesicEllipseParameters(point, 25, LinearUnits.Meters);
var ellipse = GeometryEngine.GeodesicEllipse(param);
// HERE IS THE PROBLEM
_graphicsLayer.Graphics.Clear();
_graphicsLayer.Graphics.Add(new Graphic { Geometry = ellipse, Symbol = polySymbol });
This is done ~5 times/second. Although I'm clearing the layer on each iteration, there is a memory leak that keeps increasing memory use until the app crashes.
I have read about memory problems with ArcGIS and geometry processing, so I'm not sure if I'm hitting a wall or just doing things badly.
I also tried overwriting geometry without clear:
//this is the problematic line, if i comment that, memory doesn't increase.
_graphicsLayer.Graphics[0].Symbol = polySymbol;
_graphicsLayer.Graphics[0].Geometry = ellipse;
Even with a using statement, so that filestream is properly closed at the end, the RAM used keeps increasing until the app crashes.
I would store the PictureFillSymbol in a Dictionary keyed by file name and reuse the symbol rather than creating a new one on every update. Changing the Symbol and Geometry on the existing Graphic is likely better than creating a new Graphic every time.
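The caching idea is the same in any language; sketched here in JavaScript (the language used elsewhere in this thread) rather than the .NET API, with `createSymbol` standing in for the expensive PictureFillSymbol + SetSourceAsync work:

```javascript
// Cache expensive symbol objects by file name so each image is loaded
// once and then reused on every subsequent update.
const symbolCache = new Map();

function createSymbol(fileName) {
  // Stand-in for the costly symbol construction and image load.
  return { source: fileName };
}

function getSymbol(fileName) {
  let symbol = symbolCache.get(fileName);
  if (!symbol) {
    symbol = createSymbol(fileName); // pay the cost only once per file
    symbolCache.set(fileName, symbol);
  }
  return symbol;
}
```

At ~5 updates per second, this turns hundreds of allocations per minute into a handful, one per distinct image file.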

TCP receiving and sending buffersizes in node.js

I have been working with node.js for the last 4 months and now want to increase the TCP receiving and sending buffer sizes.
My purpose is to speed up my application; experimenting with buffer sizes may improve performance.
I have searched on Google but haven't found anything useful, except that you can change the default socket buffer sizes on Linux, as shown for example on this website:
http://www.cyberciti.biz/faq/linux-tcp-tuning/
Is there any way to change/set tcp sending and receiving buffersizes for node.js io?
stream_wrap has an allocation callback passed to libuv that receives a suggested_size for the memory allocated to receive the data. Right now it passes 64KB as the suggested size, and there's no way to change this afaik.
Is this along the line of your question?
I found the stream_wrap on git:
git... src/stream_wrap src file
And if you navigate to src/stream_wrap.cc in the node.js src folder and look up the following code:
// If less than 64kb is remaining on the slab allocate a new one.
if (SLAB_SIZE - slab_used < 64 * 1024) {
    slab = NewSlab(global, wrap->object_);
} else {
    wrap->object_->SetHiddenValue(slab_sym, slab_obj);
}
Then you might be able to change the size.
@Trev Norris, do you know anything about this?
