Tags: c++, dll, c++17, ids, ueye

How could just loading a DLL lead to 100% CPU load in my main application?


I have a perfectly working program which connects to a video camera (an IDS uEye camera) and continuously grabs frames from it and displays them.

However, when loading a specific dll before connecting to the camera, the program runs with 100% CPU load. If I load the dll after connecting to the camera, the program runs fine.

#include <windows.h>              // LoadLibrary
#include <thread>

#include <uEye.h>                 // IDS uEye SDK
#include <opencv2/highgui.hpp>    // cv::waitKey

// Originally member variables of the application; declared here so the
// snippet is self-contained.
HIDS m_hCam;
SENSORINFO m_sInfo;
INT m_s32ImageWidth, m_s32ImageHeight;
INT m_nColorMode;
INT m_nBitsPerPixel;
char* m_pcImageMemory;
INT m_lMemoryId;

// Helper that queries the sensor's maximum image size (definition omitted,
// as in the original post).
void GetMaxImageSize(HIDS hCam, INT* pnSizeX, INT* pnSizeY);

int main()
{
    INT nRet = IS_NO_SUCCESS;
    // init camera (open next available camera)
    m_hCam = (HIDS)0;

    // (A) Uncomment this for 100% CPU load:
    // HMODULE handle = LoadLibrary(L"myInnocentDll.dll");

    // This is the call to the 3rdparty camera vendor's library:
    nRet = is_InitCamera(&m_hCam, 0);    

    // (B) Uncomment this instead of (A) and the CPU load won't change
    // HMODULE handle = LoadLibrary(L"myInnocentDll.dll");

    if (nRet == IS_SUCCESS)
    {
        /*
         * Please note: I have removed all lines which are not necessary to
         * reproduce the problem. Therefore this is NOT a full example of how
         * to properly initialize an IDS camera!
         */
        is_GetSensorInfo(m_hCam, &m_sInfo);

        GetMaxImageSize(m_hCam, &m_s32ImageWidth, &m_s32ImageHeight);

        m_nColorMode = IS_CM_BGR8_PACKED;   // alternatively IS_CM_BGRA8_PACKED
        m_nBitsPerPixel = 24;               // 32 bits per pixel for BGRA8
        nRet |= is_SetColorMode(m_hCam, m_nColorMode);

        // allocate image memory.
        if (is_AllocImageMem(m_hCam, m_s32ImageWidth, m_s32ImageHeight, m_nBitsPerPixel, &m_pcImageMemory, &m_lMemoryId) != IS_SUCCESS)
        {
            return 1;
        }
        else
        {
            is_SetImageMem(m_hCam, m_pcImageMemory, m_lMemoryId);
        }
    }
    else
    {
        return 1;
    }

    std::thread([&]() {
        while (true) {
            is_FreezeVideo(m_hCam, IS_WAIT);
            /*
             * Usually, the image memory would now be grabbed via is_GetImageMem(),
             * but as it is not needed to reproduce the problem, I removed it as well.
             */
        }
    }).detach();

    cv::waitKey(0);
}

Independently of the camera driver that is actually used: in what way could loading a DLL change the performance of my application so that it occupies 100% of all available CPU cores? In the Visual Studio Diagnostic Tools, the excess CPU time is attributed to "[External Call] SwitchToThread" and not to myInnocentDll.dll.

Loading just the dll without the camera initialization does not result in 100% CPU load.

My first thought was that some static initializer in myInnocentDll.dll configures some threading behavior, but I did not find anything pointing in that direction. Which aspects should I look for in the code of myInnocentDll.dll?
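
For illustration, this is the kind of construct I was looking for (and did not find): a hypothetical global object inside myInnocentDll.dll whose constructor runs while LoadLibrary is still executing and could already touch a threading runtime. All names below are made up.

// Hypothetical code inside myInnocentDll.dll, only to illustrate what a
// "static initializer configuring threading behavior" would look like.
#include <omp.h>

namespace {

struct ThreadingSetup
{
    ThreadingSetup()
    {
        // Runs during LoadLibrary, before the call returns to main(),
        // and could already configure or start OpenMP worker threads.
        omp_set_num_threads(omp_get_num_procs());
    }
};

ThreadingSetup g_threadingSetup; // constructed at DLL load time

} // namespace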


Solution

  • After a lot of digging I found the answer, and it is both frustratingly simple and frustrating in itself:

    It is Microsoft's poor support of OpenMP. When I disabled OpenMP in my project, the camera driver ran just fine.

    The reason seems to be that Microsoft's OpenMP runtime uses busy waiting while worker threads wait for work. The wait behavior can also be configured manually via the OMP_WAIT_POLICY environment variable (a sketch follows below), but as I was not depending on OpenMP anyway, disabling it was the easiest solution for me.

    I still don't understand why the CPU load only went up when using the camera and not when running the rest of my solution, even though the camera library is pre-built and enabling or disabling OpenMP in my own compilation cannot have any effect on it. I also don't understand why Microsoft bothered to provide a hotfix for VS2010 but has no real fix as of VS2019, which I am using. But the problem is averted.
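
    In case OpenMP cannot simply be switched off, the OMP_WAIT_POLICY route mentioned above looks roughly like the following. This is only a minimal sketch, assuming the OpenMP runtime is pulled in by the loaded DLL: the variable has to be set before the runtime initializes, i.e. before the LoadLibrary call.

    #include <cstdlib>
    #include <windows.h>

    int main()
    {
        // Ask the OpenMP runtime for blocking waits instead of busy waiting.
        // This must happen before the runtime starts up, so it is set before
        // loading the DLL that drags OpenMP in.
        _putenv_s("OMP_WAIT_POLICY", "PASSIVE");

        HMODULE handle = LoadLibrary(L"myInnocentDll.dll");

        // ... camera initialization and capture loop as in the question ...
        return 0;
    }

    Whether the in-process _putenv_s call actually reaches the runtime inside the DLL depends on how that runtime reads the environment; setting the variable in the environment before the process starts is the safer route. Disabling OpenMP altogether, which is what I did, corresponds to removing the /openmp compiler switch ("Open MP Support" under C/C++ > Language in the project properties).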