Recently we encountered an interesting effect in our legacy code, which is currently being ported from VS2010 to VS2015. Unfortunately I couldn't create a small example that reproduces the effect, but I'll try to describe it as accurately as I can.
We have 2 DLLs (I'll call them dll A and dll B). The project for dll A defines the interface IFoo and a derived interface IFxFoo:
class __declspec(novtable) IFoo {
public:
    virtual int GetType() = 0;
    virtual ~IFoo() {}
};

class __declspec(novtable) IFxFoo : public IFoo {
public:
    virtual int GetSlot() = 0;
};
In dll B, both interfaces are used:
class CBImpl : public IFxFoo {
public:
    ...
    void processFoo(IFoo* f) {
        ...
        if (f->GetType() == IFXFOO) {
            IFxFoo* fx = static_cast<IFxFoo*>(f); // downcast
            fill(fx);
        }
    }
    void fill(IFxFoo* fx) {
        m_slot = fx->GetSlot();
    }
private:
    int m_slot;
};
processFoo() will be called with different implementations of IFoo, some from dll A and some from dll B.
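For example, dll A might contain an implementation along these lines (CAImpl is a stand-in name and the values are made up; the real classes are much larger):

class CAImpl : public IFxFoo {
public:
    int GetType() override { return IFXFOO; }
    int GetSlot() override { return 7; } // arbitrary slot for the sketch
};

The compiler building dll B never sees this class; it only sees the IFxFoo interface.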
What happened was the following: if we turned on whole program optimization when compiling dll B, the call to the virtual function GetSlot() in fill() was de-virtualized by Visual C++. This caused our program to crash. We can fix this behavior if we either
The questions that I have now are:
Thank you for your help,
Tobias
Using LTO results in the compiler making drastic adjustments to any function for which it is able to see the complete call graph.
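As a self-contained illustration (hypothetical Base/Impl types, not the code from your question), de-virtualization effectively performs the following rewrite when the optimizer believes it has seen every implementation of an interface:

struct Base {
    virtual int get() = 0;
    virtual ~Base() {}
};

struct Impl : Base {
    int get() override { return 42; }
};

int call(Base* b) {
    return b->get(); // what the source says: an indirect call through the vtable
    // After de-virtualization the compiler effectively emits the direct call
    //     return static_cast<Impl*>(b)->Impl::get();
    // which is wrong whenever b actually points to an implementation
    // from another module that the optimizer never saw.
}

In your case the optimizer presumably concluded that CBImpl was the only IFxFoo it would ever encounter, which stops being true once implementations from dll A cross the DLL boundary.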
What you are seeing is expected. Using __declspec(dllexport) or extern on the functions that need to be used from a separate module, or explicitly declaring them in the DLL's .def file, is the expected way to resolve the problem: the compiler will then no longer consider those functions to be internal-only.
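A minimal sketch of the dllexport route, assuming a DLLA_EXPORTS preprocessor symbol that is defined only while building dll A (the usual import/export idiom; the macro names are placeholders, and whether you need to export the interfaces, the implementations, or both depends on which module's optimization pass is making the unsafe assumption):

#ifdef DLLA_EXPORTS
#define DLLA_API __declspec(dllexport)
#else
#define DLLA_API __declspec(dllimport)
#endif

class DLLA_API __declspec(novtable) IFoo {
public:
    virtual int GetType() = 0;
    virtual ~IFoo() {}
};

class DLLA_API __declspec(novtable) IFxFoo : public IFoo {
public:
    virtual int GetSlot() = 0;
};

Once the classes are exported, the compiler has to assume that overrides can exist outside the module and keeps the virtual dispatch. Listing the relevant symbols in a .def file achieves the same external visibility without changing the headers.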