Suppose I've got a JavaScript function that I'm treating like a class - that is, I want to make many instances of it:
function Blerg() {
    this._a = 5;
}
Blerg.prototype.getA = function() {
    return this._a;
};
Blerg.prototype.setA = function(val) {
    this._a = val;
};
This class has one attribute, a, which the constructor initializes to a default value of 5. It is accessed with the getter and setter.
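For concreteness, a quick sketch of how instances behave:
var b = new Blerg();
b.getA(); // 5, the default assigned by the constructor
b.setA(9);
b.getA(); // 9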
Now suppose I have such a class but it has 30+ attributes (a, b, c, etc), and suppose, like a, that these all have unique default values, but also that it is uncommon for them to be changed from the default. Suppose also that I need to make 10,000 or more Blerg instances, and so a design goal is to save space.
I'm wondering if it is a good idea to put all of the default values for my 30+ attributes on the prototype of my class instead. That way, when I create 10,000+ instances of my class, none of them have a, b, c, etc attributes, but calling getA() will still return the correct default value.
So I present this modified Blerg function, Blerg2:
function Blerg2() {
    // nothing!
}
Blerg2.prototype._a = 5;
Blerg2.prototype.getA = function() {
    return this._a;
};
Blerg2.prototype.setA = function(val) {
    this._a = val;
};
Are there downsides to taking this approach?
Some notes
The prototype way seems faster to create; see: http://jsperf.com/blergs
For that matter, the worst-case scenario does not look that bad: http://jsperf.com/blergs/2
And creating many of them using the code in the jsperf test (in its own HTML page) and adding:
var a = [];
for (var i = 0; i < 400000; i++) {
    a.push(new Blerg()); // or Blerg2
}
suggests that the heap size for the objects in question is cut in half by using Blerg2.
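An alternative way to check this outside the browser heap profiler: a minimal sketch assuming Node.js run with --expose-gc (global.gc and heapUsed are Node specifics, not part of the original jsperf test; paste the Blerg and Blerg2 definitions above into the same script):
// Rough heap comparison; run with: node --expose-gc blergs.js
function measureHeap(Ctor) {
    global.gc(); // collect garbage for a cleaner baseline
    var before = process.memoryUsage().heapUsed;
    var a = [];
    for (var i = 0; i < 400000; i++) {
        a.push(new Ctor());
    }
    global.gc(); // collect temporaries; `a` keeps the instances alive
    return process.memoryUsage().heapUsed - before;
}
console.log('Blerg:  ' + measureHeap(Blerg) + ' bytes');
console.log('Blerg2: ' + measureHeap(Blerg2) + ' bytes');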
For reference, here is a quick help from v8: pastebin.com/dTk0eegF (this is the ASM generated for this code: pastebin.com/F59x1U15) – Florian Margaine, Jul 10, 2013
1 Answer
Related jsperf: http://jsperf.com/12312412354
Well, this is a really bad idea. Objects are considered to have different classes if they don't have exactly the same set of properties in the same order. So a function that accepts these objects will in the best case be polymorphic and in the worst case megamorphic, all the while you think you are passing it the same class of objects. This is fundamental to all JS engines, although the specifics that follow focus on V8.
Consider:
function monomorphic( a ) {
    return a.prop + a.prop;
}
var obj = {prop: 3};
while( true ) {
    monomorphic( obj );
}
Now, since the passed object a always has the same class, we will get really good code:
; load from stack to eax
15633697 23 8b4508 mov eax,[ebp+0x8]
;; test that `a` is an object and not a small integer
1563369A 26 f7c001000000 test eax,0x1
156336A0 32 0f8485000000 jz 171 (1563372B) ;;deoptimize if it is not
;; test that `a`'s class is as expected
156336A6 38 8178ffb9f7902f cmp [eax+0xff],0x2f90f7b9
156336AD 45 0f857d000000 jnz 176 (15633730) ;;deoptimize if it is not
;; load a.prop into ecx, as you can see it's like doing struct->field in C
156336B3 51 8b480b mov ecx,[eax+0xb]
;; this will untag the tagged pointer so that integer arithmetic can be done to it
156336B6 54 d1f9 sar ecx,1
;; perform a.prop + a.prop
;; note that if it was a.prop + a.prop2
;; then it wouldn't need to do all those checks again
;; so for one time check inside the function we can load all the properties
;; quickly
156336B8 56 03c9 add ecx,ecx
Notice what happened here: V8 saw that we always pass the same class of object to the function monomorphic, and generated really tight code that assumes we will always get that class of object in the future as well.
Now let's do:
function polymorphic( a ) {
    return a.prop + a.prop;
}
var obj = {prop: 3};
var obj2 = {prop: 3, prop2: 4};
while( true ) {
    polymorphic( Math.random() < 0.5 ? obj : obj2 );
}
Now the function must consider 2 different classes of objects. The classes are different, but similar enough that the client code can stay as it is, as both classes contain a field prop.
Let's see:
; load from stack to eax
04C33E17 23 8b4508 mov eax,[ebp+0x8]
;; test that `a` is an object and not a small integer
04C33E1A 26 f7c001000000 test eax,0x1
04C33E20 32 0f8492000000 jz 184 (04C33EB8) ;; deoptimize if not
;; test that `a`'s class is one of the expected classes
04C33E26 38 8178ffb9f7401c cmp [eax+0xff],0x1c40f7b9
04C33E2D 45 0f840d000000 jz 64 (04C33E40) ;; if it is, skip the second check and go to the addition code
;; otherwise check that `a`'s class is the second one of the expected classes
04C33E33 51 8178ff31f8401c cmp [eax+0xff],0x1c40f831
04C33E3A 58 0f857d000000 jnz 189 (04C33EBD) ;; deoptimize if not
;; load a.prop into ecx
;; if you are still reading this you will probably notice that this
;; is actually relying on the fact that both classes declared the prop field
;; first
04C33E40 64 8b480b mov ecx,[eax+0xb]
;; this will untag the tagged pointer so that integer arithmetic can be done to it
04C33E43 67 d1f9 sar ecx,1
;; do the addition
04C33E45 69 03c9 add ecx,ecx
Ok, so the situation is still pretty good, but here we are relying on the fact that the properties are in the same order and that there are only 2 different classes.
Let's do the same in a different order, so that V8 can't use the same instruction (mov ecx,[eax+0xb]) for both objects:
function polymorphic( a ) {
    return a.prop + a.prop;
}
var obj = {prop: 3};
var obj2 = {prop2: 4, prop: 3};
while( true ) {
    polymorphic( Math.random() < 0.5 ? obj : obj2 );
}
And:
06C33E84 36 8b4508 mov eax,[ebp+0x8]
;; small integer check
06C33E87 39 f7c001000000 test eax,0x1
06C33E8D 45 0f84d3000000 jz 262 (06C33F66)
;; class check 1
06C33E93 51 8178ffb9f75037 cmp [eax+0xff],0x3750f7b9
06C33E9A 58 7505 jnz 65 (06C33EA1)
06C33E9C 60 8b480b mov ecx,[eax+0xb]
06C33E9F 63 eb10 jmp 81 (06C33EB1)
;; class check 2
06C33EA1 65 8178ff31f85037 cmp [eax+0xff],0x3750f831
06C33EA8 72 0f85bd000000 jnz 267 (06C33F6B)
06C33EAE 78 8b480f mov ecx,[eax+0xf]
06C33EB1 81 f6c101 test_b cl,0x1
06C33EB4 84 0f851e000000 jnz 120 (06C33ED8)
06C33EBA 90 d1f9 sar ecx,1
06C33EBC 92 89ca mov edx,ecx
06C33EBE 94 03d1 add edx,ecx
Ok, just as expected: it just uses different offsets depending on the class.
So you can see where this is going: if you end up with 10 different classes, then you will get 30 instructions (instead of 3) whenever a function needs to look up a property, which is still much better than a hash table lookup (hundreds of instructions?).
Well, no. It turns out there is a limit of 4 different classes, after which you go into megamorphic mode.
So with this, we should see radically different code output if we use 5 or more different classes:
function megamorphic( a ) {
    return a.prop + a.prop;
}
var objs = [
    {prop: 3},
    {prop3: 4, prop: 3, prop2: 4},
    {prop4: 4, prop2: 4, prop: 3, prop5: 6},
    {prop: 3, prop12: 6},
    {prop7: 15, prop30: 12, prop314: 4, prop34: 15, prop: 3}
];
while( true ) {
    var index = Math.random() * objs.length | 0;
    megamorphic( objs[index] );
}
Indeed:
3D3342E7 39 8b5508 mov edx,[ebp+0x8]
3D3342EA 42 b9f5e2a115 mov ecx,15A1E2F5
3D3342EF 47 e84c6ffeff call LoadIC_Initialize (3D31B240)
3D3342F4 52 8945ec mov [ebp+0xec],eax
3D3342F7 55 8b75f0 mov esi,[ebp+0xf0]
3D3342FA 58 8b5508 mov edx,[ebp+0x8]
3D3342FD 61 b9f5e2a115 mov ecx,15A1E2F5
3D334302 66 e8396ffeff call LoadIC_Initialize (3D31B240)
3D334307 71 8b4dec mov ecx,[ebp+0xec]
3D33430A 74 f6c101 test_b cl,0x1
3D33430D 77 0f8527000000 jnz 122 (3D33433A)
3D334313 83 d1f9 sar ecx,1
3D334315 85 89c2 mov edx,eax
3D334317 87 f6c201 test_b dl,0x1
3D33431A 90 0f8546000000 jnz 166 (3D334366)
3D334320 96 d1fa sar edx,1
3D334322 98 03d1 add edx,ecx
That actually looks a lot like code that loads properties from objects that are in hash table mode. Is it fast? Unfortunately jsperf is down right now, so you have to run it yourself. The results:
Megamorphic 492
Monomorphic 30
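For reference, a hedged sketch of a standalone harness to reproduce the comparison (the iteration count is arbitrary and absolute numbers will vary by machine and engine version; each call gets a fresh copy of the function body so its inline caches start clean):
function bench(label, objs) {
    // a fresh function per run, so its inline caches start unspecialized
    function f(a) { return a.prop + a.prop; }
    var sum = 0;
    var start = Date.now();
    for (var i = 0; i < 1e7; i++) {
        sum += f(objs[i % objs.length]);
    }
    console.log(label + ': ' + (Date.now() - start) + ' ms (checksum ' + sum + ')');
}
bench('Monomorphic', [{prop: 3}]);
bench('Megamorphic', [
    {prop: 3},
    {prop3: 4, prop: 3, prop2: 4},
    {prop4: 4, prop2: 4, prop: 3, prop5: 6},
    {prop: 3, prop12: 6},
    {prop7: 15, prop30: 12, prop314: 4, prop34: 15, prop: 3}
]);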
So the same function body ran 15 times faster because the object passed to it always had the same class. It is also hard to predict whether you will actually save memory if, for instance, all those 10,000 objects end up allocating different classes.
It is also complicated to really survey all the downsides, but having optional properties is one of those things that is easy to call out as very bad.
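To tie this back to the question: the trouble starts once setters run, because each setter adds an own property and the insertion order determines the hidden class. A quick sketch, assuming a second attribute _b with the same pattern (setB is hypothetical, mirroring setA):
Blerg2.prototype._b = 6;
Blerg2.prototype.setB = function(val) {
    this._b = val;
};

var x = new Blerg2();
var y = new Blerg2();
x.setA(10);             // x's own properties: _a
y.setB(7);
y.setA(10);             // y's own properties: _b, then _a - different order
// x and y now have different hidden classes even though both are Blerg2
// instances, so any function reading this._a from both goes polymorphic,
// and with 30+ optional attributes it can easily go megamorphic.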
+1. A more memory-oriented estimation would have been good too, but this is already very interesting. – Denys Séguret, Jul 11, 2013
@SimonSarris I left for a vacation after posting this and didn't have Internet access, so I saw all your chat pings only now :P – Esailija, Jul 15, 2013
Not sure I understand this. Why does putting default values on a prototype cause the instances to have a different set of properties? It seems to me they all have the same properties (i.e., those defined on the prototype). Or am I missing something? – Norswap, Nov 20, 2014
Oh @Norswap, I think I understand. In the default case, where none of the default properties were overridden, the generated code would be efficient. But as soon as you override a default, that new property will be at a different offset. If many objects override the defaults, the new properties could be at a different offset for each object, therefore generating the horrible megamorphic code. It seems like in the case where more than ~4 objects plan to override defaults, it would be better to place those properties on the object instead of the prototype, so they are at the same offset. – Marius, Jun 23, 2015
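Following that suggestion, a minimal sketch of the constructor-initialized variant (Blerg3 is a hypothetical name), which trades the per-instance memory back for a single stable hidden class:
function Blerg3() {
    // every default assigned here, always in the same order,
    // so all instances share a single hidden class
    this._a = 5;
    this._b = 6;
    // ...and so on for the other 30+ attributes
}
Blerg3.prototype.getA = function() {
    return this._a;
};
Blerg3.prototype.setA = function(val) {
    this._a = val; // no shape change: _a already exists on the instance
};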