Tags: c++, optimization, visual-studio-2012

Can MSVC automatically optimize this case?


If I have many classes that use other classes through purely virtual interfaces, can the compiler optimize away the virtual call in Release mode with full optimizations enabled?

For instance, I have a class HardwareBuffer that holds a pointer to an IHardwareResourceManager, which declares a pure virtual method:

virtual void ReleaseBuffer(HardwareBuffer* buffer) = 0;

and in the Release method of HardwareBuffer, I call

m_pHardwareResourceManager->ReleaseBuffer(this);

There is a single class Render that inherits from IHardwareResourceManager and actually implements the virtual ReleaseBuffer method. When I create a HardwareBuffer, I set its m_pHardwareResourceManager to the Render instance itself.
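To make the setup concrete, here is a minimal sketch of what I mean (the constructor wiring and the body of HardwareBuffer::Release are simplified placeholders, not my real code):

class HardwareBuffer;

class IHardwareResourceManager {
public:
    virtual ~IHardwareResourceManager() {}
    virtual void ReleaseBuffer(HardwareBuffer* buffer) = 0;
};

class HardwareBuffer {
public:
    explicit HardwareBuffer(IHardwareResourceManager* pManager)
        : m_pHardwareResourceManager(pManager) {}

    // The call in question: can the compiler prove the dynamic type behind
    // m_pHardwareResourceManager and turn this into a direct call?
    void Release() { m_pHardwareResourceManager->ReleaseBuffer(this); }

private:
    IHardwareResourceManager* m_pHardwareResourceManager;
};

class Render : public IHardwareResourceManager {
public:
    // the single implementation of the interface in the whole program
    virtual void ReleaseBuffer(HardwareBuffer* buffer) { /* free the buffer */ }
};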

Can the call to IHardwareResourceManager::ReleaseBuffer in the Release method of HardwareBuffer be devirtualized?


Solution

  • I don't know in which cases MSVC can actually do this, but I do know that, in general, the compiler would have to trace m_pHardwareResourceManager all the way back to the construction of the Render object. It also has to be conservative: a DLL could always create a new implementation of IHardwareResourceManager and hand it to your application. That makes this quite a daunting task, unless you allocated the Render object on the stack (see the sketch at the end of this answer).

    That being said, indirect lookups from a VTABLE like this are AGGRESSIVELY optimized at the hardware level, because they occur so often. Make sure you profile before assuming the virtual function call is a large cost. For example, I would not be surprised if, on x64, the indirect lookup is cheaper than the prologue and epilogue of the function you are calling.

    For comparison: DirectX uses COM, which performs a comparable indirect lookup on every interface call.
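    As a rough illustration of a case where devirtualization is at least plausible, here is a sketch of my own (not the question's code) in which the dynamic type is visible inside a single function; whether MSVC 2012 actually emits a direct call is something you would have to confirm in the disassembly:

        // Same shape of interface and implementation as in the question,
        // repeated here so the snippet stands alone.
        class HardwareBuffer;

        struct IHardwareResourceManager {
            virtual ~IHardwareResourceManager() {}
            virtual void ReleaseBuffer(HardwareBuffer* buffer) = 0;
        };

        struct Render : IHardwareResourceManager {
            virtual void ReleaseBuffer(HardwareBuffer* buffer) { /* free the buffer */ }
        };

        void Frame(HardwareBuffer* buffer)
        {
            Render renderer;                             // allocated on the stack
            IHardwareResourceManager* pManager = &renderer;
            pManager->ReleaseBuffer(buffer);             // the dynamic type is provably Render,
                                                         // so the compiler may call
                                                         // Render::ReleaseBuffer directly
        }

    Marking Render or its override as final can also help, since it tells the optimizer that no further override can exist, but I would still verify the generated code before relying on it.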