Tags: javascript, node.js, performance, v8, bluebird

How does Bluebird's util.toFastProperties function make an object's properties "fast"?


Bluebird's util.js file contains the following function:

function toFastProperties(obj) {
    /*jshint -W027*/
    function f() {}
    f.prototype = obj;
    ASSERT("%HasFastProperties", true, obj);
    return f;
    eval(obj);
}

For some reason, there's a statement after the return statement, and I'm not sure why it's there.

It also seems to be deliberate, as the author silenced the JSHint warning about it:

Unreachable 'eval' after 'return'. (W027)

What exactly does this function do? Does util.toFastProperties really make an object's properties "faster"?

I've searched through Bluebird's GitHub repository for any comments in the source code or an explanation in their list of issues, but I couldn't find any.


Solution

  • 2017 update: First, for readers coming today - here is a version that works with Node 7 (4+):

    function enforceFastProperties(o) {
        function Sub() {}
        Sub.prototype = o;
        var receiver = new Sub(); // create an instance
        function ic() { return typeof receiver.foo; } // perform access
        ic();
        ic();
        return o;
        eval("o" + o); // ensure no dead code elimination
    }
    

    Sans one or two small optimizations, everything below is still valid.
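
    For context, a hedged usage sketch of the helper defined above (the object and property names are illustrative, not Bluebird's code):

    var config = { host: "localhost", port: 8080, debug: true };
    delete config.port; // deleting a property can demote `config` to dictionary mode
    config = enforceFastProperties(config); // nudge it back into fast mode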


    Let's first discuss what it does and why that's faster and then why it works.

    What it does

    The V8 engine uses two object representations:

    • Dictionary mode - in which objects are stored as key-value hash maps.
    • Fast mode - in which objects are stored like structs, so property access involves no dictionary lookup.

    Here is a simple demo that demonstrates the speed difference; it uses the delete statement to force one of the objects into slow dictionary mode.
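
    A minimal benchmark sketch along those lines (the property names, iteration count, and console.time approach are illustrative; exact numbers depend on the V8 version):

    // Two otherwise identical objects; deleting a property from one forces V8
    // to back it with a dictionary (hash map) instead of the fast, struct-like
    // representation.
    var fast = { a: 1, b: 2, c: 3 };
    var slow = { a: 1, b: 2, c: 3 };
    delete slow.b; // demote `slow` to dictionary mode

    function sum(obj, iterations) {
        var total = 0;
        for (var i = 0; i < iterations; i++) {
            total += obj.a + obj.c; // repeated property access
        }
        return total;
    }

    console.time("fast mode");
    sum(fast, 1e7);
    console.timeEnd("fast mode");

    console.time("dictionary mode");
    sum(slow, 1e7);
    console.timeEnd("dictionary mode");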

    The engine tries to use fast mode whenever possible, and generally whenever a lot of property access is performed - however, sometimes an object gets thrown into dictionary mode. Being in dictionary mode has a big performance penalty, so it is generally desirable to put objects in fast mode.

    This hack is intended to force the object into fast mode from dictionary mode.

    Why it's faster

    In JavaScript, prototypes typically store functions shared among many instances and rarely change much dynamically. For this reason it is very desirable to have them in fast mode to avoid the extra penalty every time a function is called.

    For this reason, v8 will gladly put objects that are the .prototype property of functions in fast mode, since they will be shared by every object created by invoking that function as a constructor. This is generally a clever and desirable optimization.
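
    You can observe this directly in Node by running with the --allow-natives-syntax flag. Here is a hedged sketch (%HasFastProperties is a v8 internal, and how eagerly prototypes are optimized varies across v8 versions):

    // Run with: node --allow-natives-syntax demo.js
    var obj = { a: 1, b: 2, c: 3 };
    delete obj.b; // demote to dictionary mode
    console.log(%HasFastProperties(obj)); // expected: false

    function f() {}
    f.prototype = obj; // making obj a prototype should trigger the optimization
    console.log(%HasFastProperties(obj)); // expected: true (on the v8 versions discussed here)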

    How it works

    Let's first go through the code and figure what each line does:

    function toFastProperties(obj) {
        /*jshint -W027*/ // suppress the "unreachable code" error
        function f() {} // declare a new function
        f.prototype = obj; // assign obj as its prototype to trigger the optimization
        // assert the optimization passes to prevent the code from breaking in the
        // future in case this optimization breaks:
        ASSERT("%HasFastProperties", true, obj); // requires the "native syntax" flag
        return f; // return it
        eval(obj); // prevent the function from being optimized through dead code
        // elimination or further optimizations. This code is never
        // reached but even using eval in unreachable code causes v8
        // to not optimize functions.
    }
    

    We don't have to find the code ourselves to confirm that v8 does this optimization; we can instead read the v8 unit tests:

    // Adding this many properties makes it slow.
    assertFalse(%HasFastProperties(proto));
    DoProtoMagic(proto, set__proto__);
    // Making it a prototype makes it fast again.
    assertTrue(%HasFastProperties(proto));
    

    Reading and running this test shows us that this optimization indeed works in v8. However - it would be nice to see how.

    If we check objects.cc we can find the following function (L9925):

    void JSObject::OptimizeAsPrototype(Handle<JSObject> object) {
        if (object->IsGlobalObject()) return;
    
        // Make sure prototypes are fast objects and their maps have the bit set
        // so they remain fast.
        if (!object->HasFastProperties()) {
            MigrateSlowToFast(object, 0);
        }
    }
    

    Now, JSObject::MigrateSlowToFast just explicitly takes the Dictionary and converts it into a fast V8 object. It's a worthwhile read and an interesting insight into v8 object internals - but it's not the subject here. I still warmly recommend that you read it, as it's a good way to learn about v8 objects.

    If we check out SetPrototype in objects.cc, we can see that OptimizeAsPrototype is called from it at line 12231:

    if (value->IsJSObject()) {
        JSObject::OptimizeAsPrototype(Handle<JSObject>::cast(value));
    }
    

    Which in turn is called by FunctionSetPrototype, which is what we get with .prototype =.

    Doing __proto__ = or .setPrototypeOf would have also worked, but these are ES6 features and Bluebird runs on all browsers since Netscape 7, so using them to simplify the code here is out of the question. For example, if we check .setPrototypeOf we can see:

    // ES6 section 19.1.2.19.
    function ObjectSetPrototypeOf(obj, proto) {
        CHECK_OBJECT_COERCIBLE(obj, "Object.setPrototypeOf");
    
        if (proto !== null && !IS_SPEC_OBJECT(proto)) {
            throw MakeTypeError("proto_object_or_null", [proto]);
        }
    
        if (IS_SPEC_OBJECT(obj)) {
            %SetPrototype(obj, proto); // MAKE IT FAST
        }
    
        return obj;
    }
    

    Which is installed directly on Object:

    InstallFunctions($Object, DONT_ENUM, $Array(
    ...
    "setPrototypeOf", ObjectSetPrototypeOf,
    ...
    ));
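
    As an aside, a sketch of what the ES6 alternative mentioned above could look like (hypothetical code, not Bluebird's; it relies on %SetPrototype going through OptimizeAsPrototype as shown in the source above):

    function toFastPropertiesES6(obj) {
        // Making obj the prototype of a throwaway object routes through
        // Object.setPrototypeOf -> %SetPrototype -> OptimizeAsPrototype.
        Object.setPrototypeOf({}, obj);
        return obj;
    }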
    

    So - we have walked the path from the code Petka wrote to the bare metal. This was nice.

    Disclaimer:

    Remember this is all implementation detail. People like Petka are optimization freaks. Always remember that premature optimization is the root of all evil 97% of the time. Bluebird does something very basic very often so it gains a lot from these performance hacks - being as fast as callbacks isn't easy. You rarely have to do something like this in code that doesn't power a library.