I've been trying to resolve a bug in a Node.js application, and have narrowed it down to the way I've implemented event emitters. The application is an Express.js app and uses classes. There's some critical aspect of Node.js that I must be missing around memory usage and class/object lifecycles. I was hoping someone could point out why what I'm doing is not working as expected.
Here's the code:
// ServiceWrapper.js:
var events = require('events');
var ServiceClient = function(opts) {
this.foobar = "";
this.opts = opts;
this.hasFoo = false;
this.hasBar = false;
}
ServiceClient.prototype = new events.EventEmitter();
ServiceClient.prototype.getFoo = function() {
var self = this;
self.hasFoo = true;
self.foobar += "foo";
self.emit('done','foo');
}
ServiceClient.prototype.getBar = function() {
var self = this;
self.hasBar = true;
self.foobar += "bar";
self.emit('done','bar');
}
var ServiceWrapper = function(){}
ServiceWrapper.prototype.getResponse = function(options, callback) {
var servClient = new ServiceClient({});
servClient.on('done', function(what) {
if (servClient.hasFoo && servClient.hasBar) {
console.log("foo && bar")
callback(servClient.foobar);
}
else {
console.log("Don't have everything: " + servClient.foobar);
}
});
servClient.getFoo();
servClient.getBar();
}
module.exports = ServiceWrapper
And then in my express app:
var ServiceWrapper = require('./ServiceWrapper');
app.get('/serviceResponse', function(req,res) {
var servWrapper = new ServiceWrapper();
servWrapper.getResponse({}, function(ret) {
res.end(ret);
});
});
The behaviour on the web app works as expected: response is set to "foobar". However, looking at the logs, it looks like there's a memory leak - multiple instances of servWrapper. After starting the application, the first request generates:
Don't have everything: foo
foo && bar
However, if I refresh the page, I see this:
foo && bar
Don't have everything: foo
foo && bar
foo && bar
And with every refresh, the listener detects multiple 'done' events: the number of "foo && bar" outputs keeps growing (assuming there are more and more instances of ServiceWrapper persisting in memory).
Why does this happen? (I expect to see the output that I get on the first request from every request).
Thanks to the guys on #node.js on freenode for assisting with this:
sure, but every time you attach listeners, you're attaching them to the same emitter
since you didn't localize the prototype's state to your instance, the prototype methods act upon the state of the prototype object.
I believe you can fix it by simply doing EventEmitter.call(this) in the constructor
See the following link for more info:
http://nodejs.org/api/util.html#util_util_inherits_constructor_superconstructor
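To make the fix concrete, here is a minimal sketch of the corrected constructor (my own illustration of the advice above, using util.inherits as described in the linked docs):

var util = require('util');
var events = require('events');

var ServiceClient = function(opts) {
    // initialize emitter state on this instance, not on a shared prototype object
    events.EventEmitter.call(this);
    this.foobar = "";
    this.opts = opts;
    this.hasFoo = false;
    this.hasBar = false;
};
// set up the prototype chain without instantiating a shared EventEmitter
util.inherits(ServiceClient, events.EventEmitter);
// (attach getFoo/getBar to ServiceClient.prototype after this call)

With this change each ServiceClient gets its own listener list, so refreshing the page no longer accumulates 'done' handlers from previous requests.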
Related: this is not a duplicate of this question, as I'm trying to use the link posted as its answer to solve my problem.
I'm creating a little dummy socket client to help test one of my products. It looks like this:
var ee = require('events').EventEmitter;
var util = require('util');
require('http').globalAgent.maxSockets = 1000;
function Dummy(){
this.config = require('../config/credentials.js');
this.socket = require('socket.io-client')(this.config.socketIO.url);
var self = this;
this.socket.on('task', function(task) {
self.createTask(task);
});
}
util.inherits(Dummy, ee);
module.exports = Dummy;
Dummy.prototype.createTask = function(name){
var self = this;
setInterval(function sendStuff(){
self.socket.emit("msg")
}, 1000);
};
On its own, it works fine. However, when I try to launch many of them, like so:
var fakeClients = [];
for (var i = 0; i < 100; i++) {
fakeClients.push(new Dummy());
}
It appears to pool connections and shows up as one client only.
Based on this link, I thought that by using socket.io-client, I'd avoid the pooling behaviour, yet it doesn't work. Am I doing something wrong?
I've simplified the loop, by the way; I actually make sure there's a delay between creations to avoid synchronized heartbeats.
Ideas?
Found the answer; it goes like this:
function Dummy(){
this.config = require('../config/credentials.js');
this.socket = require('socket.io-client').connect(this.config.socketIO.url, { "force new connection": true });
var self = this;
this.socket.on('task', function(task) {
self.createTask(task);
});
}
By using the connect() function, we can set the force new connection flag to true and avoid the pooling. Simple!
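For what it's worth, newer releases of socket.io-client (1.x and later) spell this option differently. A minimal sketch, assuming a current client where forceNew replaces "force new connection" (the URL is a placeholder):

var io = require('socket.io-client');
// forceNew: true gives each call its own connection instead of multiplexing onto one
var socket = io('http://localhost:3000', { forceNew: true });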
I've been primarily a Perl coder for years, but also have a background in C++, so I'm coming from a "classical" OO background, and now learning node.js. I just read through The Principles of Object-Oriented JavaScript, which did a good job of explaining the JS concept of OO to classically-minded people like me. But I'm left with a question, specifically related to node.js and inheritance. Pardon me if I'm still using "classical" vocabulary to explain my problem.
Let's suppose I have a module lib/foo.js:
function foo() {
console.log('Foo was called');
}
module.exports.foo = foo;
And I want to "subclass" this in another module lib/bar.js:
var foo = require('./foo.js');
// Do some magic here with *.prototype, maybe?
function bar() {
console.log('Bar was called');
}
module.exports.bar = bar;
Such that my main script can do this:
var bar = require('./lib/bar.js');
bar.foo(); // Output "Foo was called"
bar.bar(); // Output "Bar was called"
Is this even possible? If so, what am I missing?
Or is this an anti-pattern? Or plain impossible? What should I do instead?
Here's how I did it, to override one method in the request module. Warning: many node modules are poorly designed for extension, including request, as they do way too much stuff in the constructor. Not just a gazillion argument options, but starting up IO, connections, etc. For example, request does the http connection (eventually) as part of the constructor. There is no explicit .call() or .goDoIt() method.
In my example, I wanted to use querystring instead of qs to format forms. My module is cleverly named "MyRequest". In a separate file named myrequest.js you have:
var Request = require('request/request.js');
var querystring = require('querystring');
MyRequest.prototype = Object.create(Request.prototype);
MyRequest.prototype.constructor = MyRequest;
// jury rig the constructor to do "just enough". Don't parse all the gazillion options
// In my case, all I wanted to patch was for a POST request
function MyRequest(options, callbackfn) {
"use strict";
if (callbackfn)
options.callback = callbackfn;
options.method = options.method || 'POST'; // used currently only for posts
Request.prototype.constructor.call(this, options);
// ^^^ this will trigger everything, including the actual http request (icky)
// so at this point you can't change anything more
}
// override form method to use querystring to do the stringify
MyRequest.prototype.form = function (form) {
"use strict";
if (form) {
this.setHeader('content-type', 'application/x-www-form-urlencoded; charset=utf-8');
this.body = querystring.stringify(form).toString('utf8');
// note that this.body and this.setHeader are fields/methods inherited from Request, not added in MyRequest.
return this;
}
else
return Request.prototype.form.apply(this, arguments);
};
Then, in your application, instead of
var Request = require("request");
Request(url, function(err, resp, body)
{
// do something here
});
you go
var MyRequest = require("lib/myrequest");
MyRequest(url, function(err, resp, body)
{
// do that same something here
});
I'm not a JavaScript guru so there may be better ways...
For reference, the specific solution I came up with to my sample code problem follows:
In lib/foo.js:
var Foo = function() {}
Foo.prototype.foo = function() {
console.log('Foo was called!');
};
module.exports = new Foo;
In lib/bar.js:
var foo = require('./foo.js');
var Bar = function() {}
Bar.prototype = Object.create(foo.__proto__);
Bar.prototype.constructor = Bar;
Bar.prototype.bar = function() {
console.log('Bar was called!');
};
module.exports = new Bar;
Then in my test script:
var bar = require('./lib/bar.js');
bar.foo(); // Output "Foo was called"
bar.bar(); // Output "Bar was called"
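A side note on this solution: foo.__proto__ works, but Object.getPrototypeOf is the standard way to reach the same object. A sketch of the equivalent lines in lib/bar.js:

// equivalent to Object.create(foo.__proto__), without the non-standard property
Bar.prototype = Object.create(Object.getPrototypeOf(foo));
Bar.prototype.constructor = Bar;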
I am using the source at http://blog.symprogress.com/2010/11/ribbon-insert-any-web-part-using-javascript/ to handle the user web part button click event.
The 'addWebPart()' function calls 'SP.Ribbon.WebPartComponent.getWebPartAdder()', which is supposed to return the adder instance, but sometimes it returns undefined.
If I add a while loop to wait for the instance to be returned correctly, the browser in my VM stalls for some time. When an instance is returned, the browser becomes responsive again. This only happens some of the time.
I am using SharePoint 2013 and the section of code I am referring to is:
addWebPart = function (wpCategory, wpTitle) {
var webPartAdder = SP.Ribbon.WebPartComponent.getWebPartAdder();
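// NOTE: the busy-wait below spins on the browser's single JS thread, which is what stalls the page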
while (webPartAdder == undefined)
webPartAdder = SP.Ribbon.WebPartComponent.getWebPartAdder();
// ... Other stuff ...
}
How can this issue be resolved?
For anyone looking for an answer to this question: it turns out you have to call the 'LoadWPAdderOnDemand()' function, wait for the '_spEventWebPartAdderReady' event, and then query 'window.WPAdder':
addWebPartDelayed = function (webPartAdder, wpCategory, wpTitle) {
var webPart = findWebPart(webPartAdder, wpCategory, wpTitle);
if (webPart) {
var zone = WPAdder._zones[0];
var wpid = WPAdder._createWebpartPlaceholderInRte();
WPAdder.addItemToPageByItemIdAndZoneId(webPart.id, zone.id, 0, wpid);
}
else
alert('ERROR: Web part not found! Please try again after some time.');
},
addWebPart = function (wpCategory, wpTitle) {
var webPartAdder = window.WPAdder;
if (webPartAdder == undefined) {
LoadWPAdderOnDemand();
ExecuteOrDelayUntilEventNotified(
function () {
var webPartAdder = window.WPAdder;
addWebPartDelayed(webPartAdder, wpCategory, wpTitle);
},
"_spEventWebPartAdderReady");
}
else
addWebPartDelayed(webPartAdder, wpCategory, wpTitle);
};
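For example, a call might look like this (the category and title here are placeholders, not from the original post):

addWebPart('Media and Content', 'Content Editor');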
So I am working on a project in Node.js and I want to spin up some extra worker processes (via cluster) to handle the processing load more efficiently. But I am using classes with function definitions in them, and when I try to send those objects to the worker, the functions defined on the object disappear and I am left with only the other fields. Is there a way to send the worker an object and preserve the functions so they can be called within the worker?
var cluster = require('cluster');
if(cluster.isMaster){
Monster = function(species){
this.attack = function(){
console.log('CHOMP');
};
this.name = species;
};
var vamp = new Monster('vampire'),
worker = cluster.fork();
worker.send({'monster' : vamp});
}
else{
process.on('message', function(msg) {
console.log(msg.monster); //this logs "{ name: 'vampire' }"
msg.monster.attack(); //TypeError: Object #<Object> has no method 'attack'
});
}
No, there is no way to pass functions between processes. You can pass only plain JS objects (data only) and handle them with functions defined in the current process (for example, by creating a new object from the received data).
Charlie, I realize you asked this question a year ago, but I wanted to do something very similar and came across your question, which you haven't marked an answer for yet. I thought I would take a stab at it and show you what I have done with your code. This different way of organizing code is, for me, a very acceptable workaround in my node.js work. I am pretty sure this gives you a way to accomplish what you want, even though you can't do it in the manner you wanted.
Declare your "class" outside the cluster code, like this:
var cluster = require('cluster');
var Monster = function(species){
this.attack = function(){
console.log('CHOMP!');
};
this.die = function() {
console.log("Oh, what did I eat? I don't feel so good....\r\n");
process.exit(0);
};
this.scare = function() {
console.log("BOO! I am a " + this.name + "!");
};
this.name = species;
};
if(cluster.isMaster){
worker = cluster.fork();
worker.send({'species' : 'Vampire'});
}
else{
process.on('message', function(msg) {
if(typeof msg.species !== "undefined") {
myMonster = new Monster(msg.species);
myMonster.scare();
myMonster.attack();
myMonster.die();
}
});
}
Give that a whirl and see if this is an answer you can accept!
OK, I stumbled upon this answer and found it strange that no one had brought this up, but it might be a more modern feature than the question:
eval
let str = "() => { console.log('test') }"
let func = eval(str)
func()
I think it's obvious what's going on here: you can evaluate any string as JavaScript, and you can send strings to workers, so you can build an object whose functions are stored as strings:
let obj = { a: "() => { ... }" }
and send the object over (JSON.stringify(obj) first; on the receiving side you'll have to parse the object, and then eval each function string separately).
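A minimal sketch of that round trip, assuming the cluster setup from the question (the property names here are illustrative, not from the original answer):

var cluster = require('cluster');

if (cluster.isMaster) {
    var worker = cluster.fork();
    // plain fields pass through as-is; functions travel as source strings
    worker.send({
        name: 'vampire',
        attack: "() => { console.log('CHOMP'); }"
    });
} else {
    process.on('message', function(msg) {
        var attack = eval(msg.attack); // revive the function from its source string
        attack(); // logs "CHOMP"
    });
}

Keep in mind that eval executes whatever it is given, so this is only reasonable when the sender is fully trusted.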
I'm having some issues using Node.js as an HTTP client against an existing long-polling server. I'm using 'http' and 'events' as requires.
I've created a wrapper object that contains the logic for handling the http.ClientRequest. Here's a simplified version of the code; it works exactly as expected, and when I call EndMe it aborts the request as anticipated.
var http = require('http');
var events = require('events');
function lpTest(urlHost,urlPath){
this.options = {
host: urlHost,
port: 80,
path: urlPath,
method: 'GET'
};
this.req = null; // will hold the in-flight http.ClientRequest
events.EventEmitter.call(this);
}
lpTest.super_ = events.EventEmitter;
lpTest.prototype = Object.create(events.EventEmitter.prototype, {
constructor: {
value: lpTest,
enumerable: false
}
});
lpTest.prototype.getData = function getData(){
var self = this;
this.req = http.request(this.options, function(res){
var httpData = "";
res.on('data', function(chunk){
httpData += chunk;
});
res.on('end', function(){
// use self here: inside these callbacks `this` is not the lpTest instance
self.emit('res_complete', httpData);
});
});
this.req.end(); // actually send the request
}
lpTest.prototype.EndMe = function EndMe(){
this.req.abort();
}
module.exports = lpTest;
Now I want to create a bunch of these objects and use them to long-poll a bunch of URLs. So I create an object to contain them all, generate each object individually, initialize it, then store it in my containing object. This works a treat: all of the stored long-polling objects fire events and return the data as expected.
var lpObject = require('./lpTest.js');
var objWatchers = {};
function DoSomething(hostURL, hostPath){
var tempLP = new lpObject(hostURL,hostPath);
tempLP.on('res_complete', function(httpData){
console.log(httpData);
this.getData();
});
objWatchers[hostURL + hostPath] = tempLP;
}
DoSomething('firsturl.com','firstpath');
DoSomething('secondurl.com','secondpath');
objWatchers['firsturl.com' + 'firstpath'].getData();
objWatchers['secondurl.com' + 'secondpath'].getData();
Now here's where it fails... I want to be able to stop a long-polling object while leaving the rest going. So naturally I try adding:
objWatchers['firsturl.com' + 'firstpath'].EndMe();
But this causes the entire Node process to exit and return me to the command line. All of the remaining long-polling objects, which are happily doing what they're supposed to do, suddenly stop.
Any ideas?
Could it have something to do with the fact that you are only calling getData() when the data is being returned?
Fixed code:
function DoSomething(hostURL, hostPath){
var tempLP = new lpObject(hostURL,hostPath);
tempLP.on('res_complete', function(httpData){
console.log(httpData);
});
tempLP.getData();
objWatchers[hostURL + hostPath] = tempLP;
}
I have seemingly solved this, although I'm not entirely happy with how it works:
var timeout = setTimeout(function(){
objWatchers['firsturl.com' + 'firstpath'].EndMe();
}, 100);
By calling the closing function on the object after a delay I seem to be able to preserve the program execution. Not exactly ideal, but I'll take it! If anyone can offer a better method please feel free to let me know :)
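One possibility worth checking (a guess, not something confirmed in this thread): req.abort() typically makes the request emit an 'error' event (a socket hang-up), and an EventEmitter 'error' event with no listener causes Node to throw and exit, which would explain the whole process dying. A minimal sketch of the guard, added inside getData():

lpTest.prototype.getData = function getData(){
var self = this;
this.req = http.request(this.options, function(res){
// ... response handling as above ...
});
// without this handler, an aborted request's 'error' event crashes the process
this.req.on('error', function(err){
self.emit('req_error', err); // 'req_error' is a hypothetical event name
});
this.req.end();
}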