Generators in Node.js - Fiber or pure JavaScript? - node.js

I am trying to implement generators in Node.js. I came across node-fiber and node-lazy. Node-lazy deals with arrays and streams, but does not generate lazy sequences on its own (except numbers).
While using a fiber looks cleaner, it has its downsides, so I prefer pure JavaScript with closures, which is more explicit. My question is: are there memory or performance problems with using a closure to generate an iterator?
As an example, I'm trying to iterate through a tree depth-first, for as long as the caller asks for it. I want to find 'waldo' and stop at the first instance.
Fiber:
var Fiber = require('fibers');

var depthFirst = Fiber(function iterate(tree) {
  tree.children.forEach(iterate);
  Fiber.yield(tree.value);
});

var tree = ...;
var value = depthFirst.run(tree); // runs the fiber until the first yield
while (value !== undefined) {
  if (value === 'waldo') {
    console.log('Found waldo');
    break; // stop at the first instance
  }
  value = depthFirst.run(); // resume the fiber for the next value
}
Pure JavaScript with closures:
function iterate(tree) {
  var childIndex = 0;
  var childIter = null;
  var returned = false;
  return function () {
    // Drain each child's iterator in turn (depth-first).
    while (childIndex < tree.children.length || childIter) {
      if (!childIter)
        childIter = iterate(tree.children[childIndex++]);
      var result = childIter();
      if (result !== undefined)
        return result;
      childIter = null; // this child is exhausted, move to the next one
    }
    // All children visited; emit this node's value once.
    if (!returned) {
      returned = true;
      return tree.value;
    }
    return undefined; // iteration finished
  };
}

var tree = ...;
var iter = iterate(tree);
var value;
while ((value = iter()) !== undefined) {
  if (value === 'waldo') {
    console.log('found waldo');
    break; // stop at the first instance
  }
}
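For reference, newer Node.js versions support native ES6 generators (function*), which express the same lazy depth-first traversal directly; a minimal sketch, assuming the same tree shape as above:

function* iterate(tree) {
  // Visit children first, then the node itself, matching the order of the
  // two examples above (post-order, depth-first).
  for (const child of tree.children) {
    yield* iterate(child);
  }
  yield tree.value;
}

var tree = ...;
for (const value of iterate(tree)) {
  if (value === 'waldo') {
    console.log('found waldo');
    break; // stop at the first instance
  }
}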

Why is the first function call executed two times faster than all other sequential calls?

I have a custom JS iterator implementation and code for measuring its performance:
// `performance` is not a global on older Node versions; pull it in from perf_hooks.
const { performance } = require('perf_hooks');

const ITERATION_END = Symbol('ITERATION_END');

const arrayIterator = (array) => {
  let index = 0;
  return {
    hasValue: true,
    next() {
      if (index >= array.length) {
        this.hasValue = false;
        return ITERATION_END;
      }
      return array[index++];
    },
  };
};

const customIterator = (valueGetter) => {
  return {
    hasValue: true,
    next() {
      const nextValue = valueGetter();
      if (nextValue === ITERATION_END) {
        this.hasValue = false;
        return ITERATION_END;
      }
      return nextValue;
    },
  };
};

const map = (iterator, selector) => customIterator(() => {
  const value = iterator.next();
  return value === ITERATION_END ? value : selector(value);
});

const filter = (iterator, predicate) => customIterator(() => {
  if (!iterator.hasValue) {
    return ITERATION_END;
  }
  let currentValue = iterator.next();
  while (iterator.hasValue && currentValue !== ITERATION_END && !predicate(currentValue)) {
    currentValue = iterator.next();
  }
  return currentValue;
});

const toArray = (iterator) => {
  const array = [];
  while (iterator.hasValue) {
    const value = iterator.next();
    if (value !== ITERATION_END) {
      array.push(value);
    }
  }
  return array;
};

const test = (fn, iterations) => {
  const times = [];
  for (let i = 0; i < iterations; i++) {
    const start = performance.now();
    fn();
    times.push(performance.now() - start);
  }
  console.log(times);
  console.log(times.reduce((sum, x) => sum + x, 0) / times.length);
};

const createData = () => Array.from({ length: 9000000 }, (_, i) => i + 1);
const testIterator = (data) => () => toArray(map(filter(arrayIterator(data), x => x % 2 === 0), x => x * 2));

test(testIterator(createData()), 10);
The output of the test function is very odd and unexpected: the first test run is consistently about two times faster than all the other runs. Here is one of the results, where the array contains the individual execution times and the final number is the mean (I ran it on Node):
[
147.9088459983468,
396.3472499996424,
374.82447600364685,
367.74555300176144,
363.6300039961934,
362.44370299577713,
363.8418449983001,
390.86111199855804,
360.23125199973583,
358.4788999930024
]
348.6312940984964
Similar results can be observed using the Deno runtime; however, I could not reproduce this behaviour on other JS engines. What could be the reason for it on V8?
Environment:
Node v13.8.0, V8 v7.9.317.25-node.28,
Deno v1.3.3, V8 v8.6.334
(V8 developer here.) In short: it's inlining, or lack thereof, as decided by engine heuristics.
For an optimizing compiler, inlining a called function can have significant benefits (e.g.: avoids the call overhead, sometimes makes constant folding possible, or elimination of duplicate computations, sometimes even creates new opportunities for additional inlining), but comes at a cost: it makes the compilation itself slower, and it increases the risk of having to throw away the optimized code ("deoptimize") later due to some assumption that turns out not to hold. Inlining nothing would waste performance, inlining everything would waste performance, inlining exactly the right functions would require being able to predict the future behavior of the program, which is obviously impossible. So compilers use heuristics.
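To make the trade-off concrete, here is a hand-written sketch of what inlining conceptually does (illustrative only, not V8's actual output):

// Before inlining: every element pays the cost of a function call.
const double = (x) => x * 2;
function sumDoubled(array) {
  let sum = 0;
  for (let i = 0; i < array.length; i++) sum += double(array[i]);
  return sum;
}

// After inlining (conceptually): the call disappears and the callee's body
// can be optimized together with the surrounding loop.
function sumDoubledInlined(array) {
  let sum = 0;
  for (let i = 0; i < array.length; i++) sum += array[i] * 2;
  return sum;
}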
V8's optimizing compiler currently has a heuristic to inline a function at a call site only if it has always been the same function that was called there. In this case, that is true for the first iteration. Subsequent iterations then create new closures as callbacks, which from V8's point of view are new functions, so they don't get inlined. (V8 does know some advanced tricks that allow it to de-duplicate function instances coming from the same source and inline them anyway in some cases; but here those are not applicable [I'm not sure why].)
So in the first iteration, everything (including x => x % 2 === 0 and x => x * 2) gets inlined into toArray. From the second iteration onwards, that's no longer the case, and instead the generated code performs actual function calls.
That's probably fine; I would guess that in most real applications, the difference is barely measurable. (Reduced test cases tend to make such differences stand out more; but changing the design of a larger app based on observations made on a small test is often not the most impactful way to spend your time, and at worst can make things worse.)
Also, hand-optimizing code for engines/compilers is a difficult balance. I would generally recommend not to do that (because engines improve over time, and it really is their job to make your code fast); on the other hand, there clearly is more efficient code and less efficient code, and for maximum overall efficiency, everyone involved needs to do their part, i.e. you might as well make the engine's job simpler when you can.
If you do want to fine-tune performance of this, you can do so by separating code and data, thereby making sure that always the same functions get called. For example like this modified version of your code:
// `performance` is not a global on older Node versions; pull it in from perf_hooks.
const { performance } = require('perf_hooks');

const ITERATION_END = Symbol('ITERATION_END');

class ArrayIterator {
  constructor(array) {
    this.array = array;
    this.index = 0;
  }
  next() {
    if (this.index >= this.array.length) return ITERATION_END;
    return this.array[this.index++];
  }
}
function arrayIterator(array) {
  return new ArrayIterator(array);
}

class MapIterator {
  constructor(source, modifier) {
    this.source = source;
    this.modifier = modifier;
  }
  next() {
    const value = this.source.next();
    return value === ITERATION_END ? value : this.modifier(value);
  }
}
function map(iterator, selector) {
  return new MapIterator(iterator, selector);
}

class FilterIterator {
  constructor(source, predicate) {
    this.source = source;
    this.predicate = predicate;
  }
  next() {
    let value = this.source.next();
    while (value !== ITERATION_END && !this.predicate(value)) {
      value = this.source.next();
    }
    return value;
  }
}
function filter(iterator, predicate) {
  return new FilterIterator(iterator, predicate);
}

function toArray(iterator) {
  const array = [];
  let value;
  while ((value = iterator.next()) !== ITERATION_END) {
    array.push(value);
  }
  return array;
}

function test(fn, iterations) {
  for (let i = 0; i < iterations; i++) {
    const start = performance.now();
    fn();
    console.log(performance.now() - start);
  }
}

function createData() {
  return Array.from({ length: 9000000 }, (_, i) => i + 1);
}

function even(x) { return x % 2 === 0; }
function double(x) { return x * 2; }

function testIterator(data) {
  return function main() {
    return toArray(map(filter(arrayIterator(data), even), double));
  };
}

test(testIterator(createData()), 10);
Observe how there are no more dynamically created functions on the hot path, and the "public interface" (i.e. the way arrayIterator, map, filter, and toArray compose) is exactly the same as before; only under-the-hood details have changed. A benefit of giving all functions names is that you get more useful profiling output ;-)
Astute readers will notice that this modification only shifts the issue away: if you have several places in your code that call map and filter with different modifiers/predicates, then the inlineability issue will come up again. As I said above: microbenchmarks tend to be misleading, as real apps typically have different behavior...
(FWIW, this is pretty much the same effect as in "Why is the execution time of this function call changing?".)
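A hedged illustration of that caveat, reusing the definitions from the modified code above: once the same call site sees more than one distinct selector, the single-target heuristic no longer applies.

// Illustrative only: two different selectors flowing through the same
// MapIterator.next() call site make `this.modifier(value)` polymorphic,
// so it will typically no longer be inlined.
const data = createData();
function square(x) { return x * x; }
const doubled = toArray(map(arrayIterator(data), double));
const squared = toArray(map(arrayIterator(data), square));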
Just to add to this investigation, I compared the OP's original code, with the predicate and selector functions declared separately as named functions as suggested by jmrk, to two other implementations. So, this code has three implementations:
OP's code with predicate and selector functions declared separately as named functions (not inline).
Using standard array.map() and .filter() (which you would think would be slower because of the extra creation of intermediate arrays)
Using a custom iteration that does both filtering and mapping in one iteration
The OP's attempt at saving time and making things faster is actually the slowest (on average). The custom iteration is the fastest.
I guess the lesson here is that it's not necessarily intuitive how you make things faster with the optimizing compiler, so if you're tuning performance, you have to measure against the "typical" way of doing things (which may benefit from the most optimizations).
Also, note that in method #3, the first two iterations are the slowest and then it gets faster, which is the opposite effect from the original code. Go figure.
The results are here:
[
99.90320014953613,
253.79690098762512,
271.3091011047363,
247.94990015029907,
247.457200050354,
261.9487009048462,
252.95090007781982,
250.8520998954773,
270.42809987068176,
249.340900182724
]
240.59370033740998
[
222.14270091056824,
220.48679995536804,
224.24630093574524,
237.07260012626648,
218.47070002555847,
218.1493010520935,
221.50559997558594,
223.3587999343872,
231.1618001461029,
243.55419993400574
]
226.01488029956818
[
147.81360006332397,
144.57479882240295,
73.13350009918213,
79.41700005531311,
77.38950109481812,
78.40880012512207,
112.31539988517761,
80.87990117073059,
76.7899010181427,
79.79679894447327
]
95.05192012786866
The code is here:
const { performance } = require('perf_hooks');

const ITERATION_END = Symbol('ITERATION_END');

const arrayIterator = (array) => {
  let index = 0;
  return {
    hasValue: true,
    next() {
      if (index >= array.length) {
        this.hasValue = false;
        return ITERATION_END;
      }
      return array[index++];
    },
  };
};

const customIterator = (valueGetter) => {
  return {
    hasValue: true,
    next() {
      const nextValue = valueGetter();
      if (nextValue === ITERATION_END) {
        this.hasValue = false;
        return ITERATION_END;
      }
      return nextValue;
    },
  };
};

const map = (iterator, selector) => customIterator(() => {
  const value = iterator.next();
  return value === ITERATION_END ? value : selector(value);
});

const filter = (iterator, predicate) => customIterator(() => {
  if (!iterator.hasValue) {
    return ITERATION_END;
  }
  let currentValue = iterator.next();
  while (iterator.hasValue && currentValue !== ITERATION_END && !predicate(currentValue)) {
    currentValue = iterator.next();
  }
  return currentValue;
});

const toArray = (iterator) => {
  const array = [];
  while (iterator.hasValue) {
    const value = iterator.next();
    if (value !== ITERATION_END) {
      array.push(value);
    }
  }
  return array;
};

const test = (fn, iterations) => {
  const times = [];
  let result;
  for (let i = 0; i < iterations; i++) {
    const start = performance.now();
    result = fn();
    times.push(performance.now() - start);
  }
  console.log(times);
  console.log(times.reduce((sum, x) => sum + x, 0) / times.length);
  return result;
};

const createData = () => Array.from({ length: 9000000 }, (_, i) => i + 1);
const cache = createData();

const comp1 = x => x % 2 === 0;
const comp2 = x => x * 2;

const testIterator = (data) => () => toArray(map(filter(arrayIterator(data), comp1), comp2));

// regular array filter and map
const testIterator2 = (data) => () => data.filter(comp1).map(comp2);

// combine filter and map in the same operation
const testIterator3 = (data) => () => {
  let result = [];
  for (let value of data) {
    if (comp1(value)) {
      result.push(comp2(value));
    }
  }
  return result;
};

const a = test(testIterator(cache), 10);
const b = test(testIterator2(cache), 10);
const c = test(testIterator3(cache), 10);

function compareArrays(a1, a2) {
  if (a1.length !== a2.length) return false;
  for (let [i, val] of a1.entries()) {
    if (a2[i] !== val) return false;
  }
  return true;
}
console.log(a.length);
console.log(compareArrays(a, b));
console.log(compareArrays(a, c));

I just started learning Node.js and I implemented some demo code exhibiting event listeners. I got an error.

I am getting this error in my code
TypeError: account.on() is not a function
Where did I go wrong?
Code
var events = require('events');

function Account() {
  this.balance = 0;
  events.EventEmitter.call(this);
  this.deposit = function(amount) {
    this.balance += amount;
    this.emit('balanceChanged');
  };
  this.withdraw = function(amount) {
    this.balance -= amount;
    this.emit('balanceChanged');
  };
}
Account.prototype._proto_ = events.EventEmitter.prototype;

function displayBalance() {
  console.log('Account balance : $%d', this.balance);
}
function checkOverdraw() {
  if (this.balance < 0) {
    console.log('Account overdrawn!!!');
  }
}
function checkgoal(acc, goal) {
  if (acc.balance > goal) {
    console.log('Goal Achieved!!!');
  }
}

var account = new Account();
account.on('balanceChanged', displayBalance);
account.on('balanceChanged', checkOverdraw);
account.on('balanceChanged', function() {
  checkgoal(this, 1000);
});

account.deposit(220);
account.deposit(320);
account.deposit(600);
account.withdraw(1200);
Your example code is not idiomatic Node.js.
I'd strongly recommend you follow best practices when creating new inheritable objects, as in:
var util = require('util');
var EventEmitter = require('events').EventEmitter;

var Account = function() {
  EventEmitter.call(this); // should be first
  this.balance = 0;        // instance var
};

util.inherits(Account, EventEmitter);

Account.prototype.deposit = function(amount) {
  this.balance += amount;
  this.emit('balanceChanged');
};

Account.prototype.withdraw = function(amount) {
  this.balance -= amount;
  this.emit('balanceChanged');
};

var account = new Account();

var displayBalance = function() {
  console.log("Account balance : $%d", this.balance);
};

account.on('balanceChanged', displayBalance);

account.deposit(200);
account.withdraw(40);
// ... etc. ....
Which, when run displays:
Account balance : $200
Account balance : $160
Best practices are there so that
your code can be expressed in a way that is easy for others to understand
you don't run into unexpected problems when you try to replicate functionality that is already defined, possibly complex and difficult to understand.
The reason that util.inherits exists is so you don't have to worry about how the prototype chain is constructed. By constructing it yourself, you will often run into the problem you experienced.
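For reference, util.inherits(Account, EventEmitter) is roughly equivalent to the following sketch (this matches current Node versions; older releases wired up the chain with Object.create instead):

// Roughly what util.inherits does under the hood (sketch, not the actual source):
Object.setPrototypeOf(Account.prototype, EventEmitter.prototype);
Account.super_ = EventEmitter; // convenience reference to the parent constructor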
Also, since the current Node runtime (>6.0) also includes most of the ES6 spec, you can also (and really should) write your code as:
const util = require('util');
const EventEmitter = require('events').EventEmitter;

// Note: the constructor and prototype methods must stay regular functions,
// not arrow functions - arrow functions cannot be called with `new` and do
// not get their own `this`.
const Account = function() {
  EventEmitter.call(this);
  this.balance = 0;
};

util.inherits(Account, EventEmitter);

Account.prototype.deposit = function(val) {
  this.balance += val;
  this.emit('balanceChanged');
};

Account.prototype.withdraw = function(val) {
  this.balance -= val;
  this.emit('balanceChanged');
};
The use of the const keyword assures the variables you create cannot be changed inadvertently or unexpectedly.
And the use of the "fat arrow" function definition idiom (() => {}) is more succinct and thus quicker to type, but also carries the added benefit that it preserves the value of this from the surrounding context so you never have to write something like:
Account.prototype.doSomething = function() {
  var self = this;
  doSomething(val, function(err, res) {
    if (err) {
      throw err;
    }
    self.result = res;
  });
};
which, using the 'fat arrow' construct becomes:
Account.prototype.doSomething = function() {
  // the method itself stays a regular function; the callback is an arrow,
  // so `this` is preserved from the surrounding context
  doSomething(val, (err, res) => {
    if (err) {
      throw err;
    }
    this.result = res; // where 'this' is the instance of Account
  });
};
The "fat arrow" idiom also allows you to do some things more succinctly like:
// return the result of a single operation
const add = (a, b) => a + b;
// return a single result object (the object literal must be wrapped in
// parentheses, otherwise the braces are parsed as a function body)
const getSum = (a, b) => ({ a: a, b: b, sum: a + b });
Another way to create inheritable "classes" in ES6 is to use its class construction notation:
const EventEmitter = require('events');

class Account extends EventEmitter {
  constructor() {
    super();
    this._balance = 0; // start instance vars with an underscore
  }
  get balance() {      // and add a getter
    return this._balance;
  }
  deposit(amount) {
    this._balance += amount;
    this.emit('balanceChanged');
  }
  withdraw(amount) {
    this._balance -= amount;
    this.emit('balanceChanged');
  }
}
It should be noted that both ways of constructing inheritable prototypal objects are really the same, except that the new class construction idiom adds syntactic "sugar" to bring the declaration more in line with other languages that support more classical object orientation.
The ES6 extensions to node offer many other benefits worthy of study.
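For completeness, the class-based version is used the same way as the earlier examples; a quick usage sketch:

const account = new Account();
account.on('balanceChanged', function() {
  // `this` is the emitter (the account), so the balance getter is available here
  console.log('Account balance : $%d', this.balance);
});
account.deposit(200);  // Account balance : $200
account.withdraw(40);  // Account balance : $160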

How to cache outer scope values to be used inside of Async NodeJS calls?

Something like the below code illustrates my intention, if you can imagine how a naive programmer would probably try to write this the first time:
function (redisUpdatesHash) {
  var redisKeys = Object.keys(redisUpdatesHash);
  for (var i = 0; i < redisKeys.length; i++) {
    var key = redisKeys[i];
    redisClient.get(key, function (err, value) {
      if (value != redisUpdatesHash[key]) {
        redisClient.set(key, redisUpdatesHash[key]);
        redisClient.publish(key + "/notifications", redisUpdatesHash[key]);
      }
    });
  }
}
The problem is, predictably, that key has the wrong value in the callback scopes because of the asynchronous nature of the node_redis callbacks: by the time they run, the loop has already finished and key holds its last value. The method of detection is really primitive because of security restrictions out of my control, so the only option for me was to resort to polling the source for its state. So the intention above is to store that state in Redis so that I can compare against it during the next poll to determine whether it changed. If it has, I publish an event and store the new value as the comparison value for the next polling cycle.
It appears that there's no good way to do this in NodeJS... I'm open to any suggestions, whether it's fixing the above code so it can perform this check, or a different way of doing this entirely.
I solved this problem by using function currying to cache the outer values in a closure.
In vanilla JavaScript/Node.js:
asyncCallback = function (newValue, redisKey, redisValue) {
  if (newValue != redisValue) {
    redisClient.set(redisKey, newValue, handleRedisError);
    redisClient.publish(redisKey + '/notifier', newValue, handleRedisError);
  }
};

curriedAsyncCallback = function (newValue) {
  return function (redisKey) {
    // node_redis invokes this callback as (err, reply)
    return function (err, redisValue) {
      asyncCallback(newValue, redisKey, redisValue);
    };
  };
};

var newResults = getNewResults(),
    redisKeys = Object.keys(newResults);

for (var i = 0; i < redisKeys.length; i++) {
  redisClient.get(redisKeys[i], curriedAsyncCallback(newResults[redisKeys[i]])(redisKeys[i]));
}
However, I ended up using HighlandJS to help with the currying and iteration.
var _ = require('highland'),
    //...
    asyncCallback = function (newValue, redisKey, redisValue) {
      if (newValue != redisValue) {
        redisClient.set(redisKey, newValue, handleRedisError);
        redisClient.publish(redisKey + '/notifier', newValue, handleRedisError);
      }
    };

var newResults = getNewResults(),
    redisKeys = Object.keys(newResults),
    curriedAsyncCallback = _.curry(asyncCallback),
    // wrap the node-style redisClient.get into a function that returns a stream
    redisGet = _.wrapCallback(redisClient.get.bind(redisClient));

_(redisKeys).each(function (key) {
  redisGet(key).each(curriedAsyncCallback(newResults[key], key));
});
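For what it's worth, on current Node versions the original loop can also be fixed without currying: declaring the loop variable with let (or using forEach) gives each iteration its own binding, so the callback closes over the right key. A minimal sketch, assuming the same redisClient and handleRedisError as above:

function publishChanges(redisUpdatesHash) {
  // `let` creates a fresh `key` binding per iteration, so each async
  // callback sees the key it was created for.
  for (let key of Object.keys(redisUpdatesHash)) {
    redisClient.get(key, function (err, value) {
      if (err) return handleRedisError(err);
      if (value != redisUpdatesHash[key]) {
        redisClient.set(key, redisUpdatesHash[key]);
        redisClient.publish(key + '/notifications', redisUpdatesHash[key]);
      }
    });
  }
}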

How to properly delete a box2d body in version: Box2dWeb-2.1.a.3, Box2D_v2.3.1r3? Box2D bug?

Update
Since the problem was found, I've also found out that Box2D for web is leaking all over the place :/
To show this I made a simple circle moving in a static polygon, and here is the result after some time.
Notice how the following items are leaking, even though I'm not creating any bodies or changing the world in any way:
b2Vec2
Features
b2ManifoldPoint
b2ContactID
b2Manifold
b2ContactEdge
b2PolyAndCircleContact
Array
...
Original post
I have a problem: I'm profiling my game and the garbage collector doesn't collect my bodies, contacts and other stuff. I looked at what was keeping them from being collected, and it was Box2D itself. That leaves two options: either I'm doing something wrong, or Box2D is leaking. I assume the cause is on my side.
What exactly is keeping them alive?
contact.m_nodeA.other appeared to be the most common reference keeping a body from being GC'd.
Other times it was m_fixtureB in a contact... see image.
You can see that the body has a __destroyed property. That is set manually before deleting it with world.DestroyBody(body).
When I destroy a body, I do it after calling the step method on the world.
As you can see from the Box2D method, it doesn't get rid of the other variable, nor does it point it at another body, so my body is never garbage collected.
Any idea what I'm missing here?
Now I can only avoid the problem if world.Step is not run:
var gravity = new Box2D.Vec2(0, 0);
var doSleep = true;
var world = new Box2D.World(gravity, doSleep);
var step = false;

var fixtureDef = new Box2D.FixtureDef();
fixtureDef.density = 1.0;
fixtureDef.friction = 0.5;
fixtureDef.restitution = 0.2;
fixtureDef.shape = new Box2D.PolygonShape();
fixtureDef.shape.SetAsBox(1, 1);

var bodyDef = new Box2D.BodyDef();
bodyDef.type = Box2D.Body.b2_dynamicBody;
bodyDef.position.x = 0.4;
bodyDef.position.y = 0.4;

var bodies = [];
var fix = [];

window.c = function(){
  for (var i = 0; i < 100; i++) {
    var body = world.CreateBody(bodyDef);
    body._id = i;
    fix.push(body.CreateFixture(fixtureDef));
    bodies.push(body);
  }
  if (step) { world.Step(1/60, 3, 3); world.ClearForces(); }
  console.log('Created', bodies);
  fixtureDef = null;
  bodyDef = null;
};

window.d = function(){
  _.each(bodies, function(body, i){
    body.DestroyFixture(fix[i]);
    world.DestroyBody(body);
    fix[i] = null;
    bodies[i] = null;
  });
  if (step) { world.Step(1/60, 3, 3); world.ClearForces(); }
  bodies = null;
  fix = null;
};
Change step to true and the memory leak appears again.
To reproduce the memory leak:
Code in your file:
var gravity = new Box2D.Vec2(0, 0);
var doSleep = true;
var world = new Box2D.World(gravity, doSleep);
var bodies = [];

window.c = function(){
  for (var i = 0; i < 100; i++) {
    var bodyDef = new Box2D.BodyDef();
    bodyDef.type = 2;
    var shape = new Box2D.PolygonShape();
    shape.SetAsBox(1, 1);
    var fixtureDef = new Box2D.FixtureDef();
    fixtureDef.shape = shape;
    var body = world.CreateBody(bodyDef);
    body._id = i;
    body.CreateFixture(fixtureDef);
    bodies.push(body);
  }
  world.Step(0.3, 3, 3);
  console.log('Created', bodies);
};

window.d = function(){
  _.each(bodies, function(body, i){
    world.DestroyBody(body);
    bodies[i] = null;
  });
  world.Step(0.3, 3, 3);
  bodies = null;
};
Open Google Chrome:
Then open the profiler and take a heap snapshot.
Now run the c() method in your console to create 100 bodies.
Take snapshot 2.
Search the snapshot for b2Body and you'll find an object count of 100.
Now run d() to delete all your bodies.
Force garbage collection by clicking on the garbage-can icon.
Take snapshot 3.
Search for b2Body and you'll still find an object count of 100.
At the last step there should be 0 objects, as they have been destroyed. Instead you'll find this:
Now you can see there are a lot of references from b2ContactEdge. If you remove the world.Step part of the code, you will only see 2 references to the body.
If you remove this line
body.CreateFixture(fixtureDef);
or make the body static, it doesn't leak anymore.
My game loop
...gameLoop = function(o){
  // used a lot here
  var world = o.world;
  // calculate the new positions
  var worldStepSeconds = o.worldStepMs / 1000;

  // step world
  world.Step(worldStepSeconds, o.velocityIterations, o.positionIterations);

  // render debug
  if (o.renderDebug) {
    world.DrawDebugData();
  }

  // always clear forces so they do not accumulate; otherwise some bug may occur
  world.ClearForces();

  // tick all ticking entities
  _.each(o.getTickEntitiesFn(), function(actor){
    if (!actor) return;
    actor.tick(o.worldStepMs, o.lastFrameMs);
  });

  // update PIXI entities
  var body = world.GetBodyList();
  var worldScale = world.SCALE;
  var destroyBody = world.DestroyBody.bind(world);
  while (body) {
    var actor = null;
    var visualEntity = null;
    var box2DEntity = o.getBox2DEntityByIdFn(body.GetUserData());
    if (box2DEntity) {
      visualEntity = o.getVisualEntityByIdFn(box2DEntity.getVisualEntityId());
      if (box2DEntity.isDestroying()) {
        // optimization
        body.__destroyed = true;
        world.DestroyBody(body);
        box2DEntity.completeDestroy();
      }
    }
    if (visualEntity) {
      if (visualEntity.isDestroying()) {
        visualEntity.completeDestroy();
      } else {
        var inverseY = true;
        var bodyDetails = Utils.getScreenPositionAndRotationOfBody(world, body, inverseY);
        visualEntity.updateSprite(bodyDetails.x, bodyDetails.y, bodyDetails.rotation);
      }
    }
    // this delegates out functionality for each body processed
    if (o.triggersFn.eachBody) o.triggersFn.eachBody(world, body, visualEntity);
    body = body.GetNext();
  }

  // when a joint is created, its visual counterpart is also created and set as userData
  var joint = world.GetJointList();
  while (joint) {
    var pixiGraphics = joint.GetUserData();
    if (pixiGraphics) {
      // In order to draw a distance joint we need to know the start and end positions.
      // The joint stores the global (yes) anchor positions for each body.
      // After that we need to scale to our screen and invert the y axis.
      var anchorA = joint.GetAnchorA();
      var anchorB = joint.GetAnchorB();
      var screenPositionA = anchorA.Copy();
      var screenPositionB = anchorB.Copy();
      // scale
      screenPositionA.Multiply(world.SCALE);
      screenPositionB.Multiply(world.SCALE);
      // invert y
      screenPositionA.y = world.CANVAS_HEIGHT - screenPositionA.y;
      screenPositionB.y = world.CANVAS_HEIGHT - screenPositionB.y;
      // draw a black line
      pixiGraphics.clear();
      pixiGraphics.lineStyle(1, 0x000000, 0.7);
      pixiGraphics.moveTo(screenPositionA.x, screenPositionA.y);
      pixiGraphics.lineTo(screenPositionB.x, screenPositionB.y);
    }
    joint = joint.GetNext();
  }

  // render the PIXI scene
  if (o.renderPixi) {
    o.renderer.render(o.stage);
  }

  // render next frame
  requestAnimFrame(o.requestAnimFrameFn);
};
Code from Box2d:
b2ContactManager.prototype.Destroy = function (c) {
  var fixtureA = c.GetFixtureA();
  var fixtureB = c.GetFixtureB();
  var bodyA = fixtureA.GetBody();
  var bodyB = fixtureB.GetBody();
  if (c.IsTouching()) {
    this.m_contactListener.EndContact(c);
  }
  if (c.m_prev) {
    c.m_prev.m_next = c.m_next;
  }
  if (c.m_next) {
    c.m_next.m_prev = c.m_prev;
  }
  if (c == this.m_world.m_contactList) {
    this.m_world.m_contactList = c.m_next;
  }
  if (c.m_nodeA.prev) {
    c.m_nodeA.prev.next = c.m_nodeA.next;
  }
  if (c.m_nodeA.next) {
    c.m_nodeA.next.prev = c.m_nodeA.prev;
  }
  if (c.m_nodeA == bodyA.m_contactList) {
    bodyA.m_contactList = c.m_nodeA.next;
  }
  if (c.m_nodeB.prev) {
    c.m_nodeB.prev.next = c.m_nodeB.next;
  }
  if (c.m_nodeB.next) {
    c.m_nodeB.next.prev = c.m_nodeB.prev;
  }
  if (c.m_nodeB == bodyB.m_contactList) {
    bodyB.m_contactList = c.m_nodeB.next;
  }
  this.m_contactFactory.Destroy(c);
  --this.m_contactCount;
};

b2ContactFactory.prototype.Destroy = function (contact) {
  if (contact.m_manifold.m_pointCount > 0) {
    contact.m_fixtureA.m_body.SetAwake(true);
    contact.m_fixtureB.m_body.SetAwake(true);
  }
  var type1 = parseInt(contact.m_fixtureA.GetType());
  var type2 = parseInt(contact.m_fixtureB.GetType());
  var reg = this.m_registers[type1][type2];
  if (true) {
    reg.poolCount++;
    contact.m_next = reg.pool;
    reg.pool = contact;
  }
  var destroyFcn = reg.destroyFcn;
  destroyFcn(contact, this.m_allocator);
};
I have the same problem, but I think I found out where it comes from.
Instead of the m_* properties, try the accessor functions, like GetFixtureA() instead of m_fixtureA.
Totti, did you ever figure this out? It looks like box2dweb requires manual destruction and memory management.
I think I have found your leaks: unimplemented (static class) destruction functions:
b2Joint.Destroy = function (joint, allocator) {}
b2CircleContact.Destroy = function (contact, allocator) {}
b2PolygonContact.Destroy = function (contact, allocator) {}
b2EdgeAndCircleContact.Destroy = function (contact, allocator) {}
b2PolyAndCircleContact.Destroy = function (contact, allocator) {}
b2PolyAndEdgeContact.Destroy = function (contact, allocator) {}
[UPDATE...]
b2DestructionListener.b2DestructionListener = function () {};
b2DestructionListener.prototype.SayGoodbyeJoint = function (joint) {}
b2DestructionListener.prototype.SayGoodbyeFixture = function (fixture) {}
b2Contact.prototype.Reset(fixtureA, fixtureB)
Called with one or both fixture arguments, it resets the passed-in fixture(s); but pass in NO arguments and it 'nulls' all the b2Contact properties! (UNTESTED:) I suggest setting your YOURcontactListener class up to handle all contact callbacks on every call, with Reset(??) dynamically configurable as your logic requires on every call (there are more of them than you'd imagine on each and every world step).
Also, take Colt McAnlis' clever advice and strategically pre-allocate all the memory the life of your game will need (by creating game and Box2D object pools, now that you know objects can be reset), so the garbage collector NEVER runs until you destroy the object pools at a time of your own convenience, i.e. when you close the tab, or your device needs recharging! ;D [...UPDATE]
// you can define and assign your own contact listener ...via...
YOUR.b2world.b2ContactManager.m_world.m_contactList = new YOURcontactlistener();
// [edit] ...if you don't, it actually does have Box2D.Dynamics.b2ContactListener.b2_defaultListener.
// box2d in the worldStep calls YOURcontactlistener.update() via:
this.b2world.b2ContactManager.m_world.m_contactList.Update(this.m_contactListener) // this.m_contactListener being YOURS || b2_defaultListener;
// which instantiates ALL your listed leaking objects like so:
// {b2Contact which instantiates {b2ContactEdge} and {b2Manifold which instantiates {b2ManifoldPoint {which instantiates m_id.key == ContactID {which instantiates Features}}}}, along with {b2Vec2}, which are instantiated in b2ContactResult ...which I cannot actually find, but assume it must be instantiated in the Solver.
// There is a Contacts.destroyFcn callback which is CREATED in....
b2ContactFactory.prototype.Destroy = function (contact) {...}
// then Contacts.destroyFcn callbacks are privately REGISTERED in....
b2ContactFactory.prototype.InitializeRegisters() {...}
// ...via...
this.AddType = function (createFcn, destroyFcn, type1, type2) {...}
...BUT... the ones privately registered there ARE four of the unimplemented static class functions from above...
b2PolygonContact.Destroy = function (contact, allocator) {}
b2EdgeAndCircleContact.Destroy = function (contact, allocator) {}
b2PolyAndCircleContact.Destroy = function (contact, allocator) {}
b2PolyAndEdgeContact.Destroy = function (contact, allocator) {}
So I haven't tested it yet, but it looks like box2dweb just gives you the Destroy callback/handler functions, and you have to read the source to find all the properties you need to null. [Edit] Use this in combination with b2Contact.prototype.Reset(fixtureA, fixtureB).
But either way, I'm pretty confident the functions above (possibly an incomplete list) are the callbacks/handlers, and they can be used to null your way back to performance, for anyone else who stumbles across this problem. Pretty sure Totti has moved on (don't forget to handle your 'this' scope in callbacks).
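An untested sketch of that idea (an assumption, not verified against box2dweb): wire up the empty static Destroy handlers listed above so that pooled contacts drop their references when the factory recycles them, relying on the observation that Reset() with no arguments nulls the b2Contact properties. The Box2D.Dynamics.Contacts namespace is assumed here.

// Hypothetical, untested: fill in the empty Destroy callbacks so recycled
// contacts release their references to fixtures/bodies and can be GC'd.
var Contacts = Box2D.Dynamics.Contacts; // assumed Box2dWeb namespace
[
  Contacts.b2CircleContact,
  Contacts.b2PolygonContact,
  Contacts.b2EdgeAndCircleContact,
  Contacts.b2PolyAndCircleContact,
  Contacts.b2PolyAndEdgeContact
].forEach(function (ContactType) {
  ContactType.Destroy = function (contact, allocator) {
    contact.Reset(); // no-argument Reset() clears the contact's properties
  };
});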

Writing a javascript library

I want to write a JS library and handle it like this:
var c1 = Module.Class();
c1.init();
var c2 = Module.Class();
c2.init();
And of course, c1 and c2 cannot share the same variables.
I think I know how to do this with objects; it would be:
var Module = {
  Class: {
    init: function(){
      ...
    }
  }
}
But the problem is I can't have multiple instances of Class if I write in this way.
So I'm trying to achieve the same with a function, but I don't think I'm doing it right.
(function() {
  var Module;
  window.Module = Module = {};

  function Class( i ) {
    // How can "this" refer to Class instead of Module?
    this.initial = i;
  }

  Class.prototype.execute = function() {
    ...
  }

  // Public
  Module.Class = Class;
})();
I don't have a clue whether it's even possible, but I accept suggestions of other ways to create this module.
I don't know if it's relevant also, but I'm using jQuery inside this library.
Usage:
var c1 = Module.Class("c");
var c2 = Module.Class("a");
var n = c1.initial(); // equals 'c'
c1.initial("s");
n = c1.initial(); // equals 's'
Module Code:
(function(window) {
  var Module = window.Module = {};

  var Class = Module.Class = function(initial)
  {
    return new Module.Class.fn.init(initial);
  };

  Class.fn = Class.prototype = {
    init: function(initial) {
      this._initial = initial;
    },
    initial: function(v){
      if (v !== undefined) {
        this._initial = v;
        return this;
      }
      return this._initial;
    }
  };

  Class.fn.init.prototype = Class.fn;
})(window || this);
This is using the JavaScript "Module" design pattern, which is the same design pattern used by JavaScript libraries such as jQuery.
Here's a nice tutorial on the "Module" pattern:
JavaScript Module Pattern: In-Depth
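As a simpler alternative sketch (one of the "other ways" asked about above): if you are happy to require new, a plain constructor on the namespace already gives each instance its own this, which also answers the inline question in the original attempt:

var Module = {};
Module.Class = function (initial) {
  this._initial = initial; // `this` is the new instance, not Module
};
Module.Class.prototype.initial = function (v) {
  if (v !== undefined) { this._initial = v; return this; }
  return this._initial;
};

var c1 = new Module.Class('c');
var c2 = new Module.Class('a');
c1.initial('s');
console.log(c1.initial()); // 's'
console.log(c2.initial()); // 'a'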
