touchgfx

TouchGFX with hardware encoder wheel


I'm trying to implement TouchGFX on an ARM MCU with the UI controlled by a push-button encoder wheel, as often seen in cars.

The problem I'm encountering is that the low-level encoder wheel handling lives in the main backend application, which is written in C.

TouchGFX itself is C++. The Designer application lets you add a hardware button that is handled directly by the stack, but an encoder wheel needs some custom logic, preferably implemented in the backend.

Usually, the way the backend interacts with TouchGFX is through the model class, where data is polled on every tick (about 60 Hz according to the documentation).
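For context, the polling approach looks roughly like this: the C backend writes into shared state, and the model's tick consumes it once per frame. This is a minimal, self-contained sketch; the names `encoderDelta`, `backend_report_encoder`, and `lastDelta` are illustrative, not part of the TouchGFX API (in a real project, `tick()` lives in `gui/src/model/Model.cpp` and is called by the framework).

```cpp
#include <atomic>

// Shared state written by the C backend (e.g. from the encoder driver)
// and read by the model on every tick. Illustrative name.
static std::atomic<int> encoderDelta{0};

extern "C" void backend_report_encoder(int delta)
{
    // Called from the C side each time the wheel moves.
    encoderDelta += delta;
}

// Stand-in for the generated Model class; TouchGFX calls tick() ~60 times/s.
class Model
{
public:
    void tick()
    {
        int delta = encoderDelta.exchange(0); // consume pending movement
        if (delta != 0)
        {
            // In a real project: modelListener->notifyEncoderMoved(delta);
            lastDelta = delta;
        }
    }
    int lastDelta = 0;
};
```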

For a physical encoder, however, trigger-based communication between the backend and TouchGFX would be preferable. It isn't clear from the documentation or the examples how to obtain the TouchGFX context and integrate triggers from the backend, whether by direct call or by callback, rather than by polling.
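One common trigger-based pattern (a sketch under my own naming, not a TouchGFX API) is to have the C backend expose a registration function and let the C++ side install a handler at startup. All identifiers below (`backend_register_encoder_cb`, `backend_encoder_event`, `gui_on_encoder`) are hypothetical:

```cpp
#include <cstddef>

// Hypothetical backend API; in practice this would sit in a C source
// file with a matching header, callable thanks to extern "C" linkage.
extern "C" {
    typedef void (*encoder_cb_t)(int delta, int pressed);

    static encoder_cb_t g_encoder_cb = NULL;

    // The C++ GUI registers its handler here once, at startup.
    void backend_register_encoder_cb(encoder_cb_t cb) { g_encoder_cb = cb; }

    // The backend's encoder driver invokes this on every wheel event.
    void backend_encoder_event(int delta, int pressed)
    {
        if (g_encoder_cb) { g_encoder_cb(delta, pressed); }
    }
}

// C++ side: the handler that would forward into the model.
static int g_lastDelta = 0;
extern "C" void gui_on_encoder(int delta, int pressed)
{
    (void)pressed;
    g_lastDelta = delta; // in a real app: hand off to the model
}
```

Note that the callback runs in the backend's context (possibly an ISR or another task), so in practice it should only set a flag or push into a queue that the model consumes on its own tick, rather than touching widgets directly.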

I tried to implement a C-to-C++ callback but couldn't find the TouchGFX context.

After digging through the documentation and plenty of posts, I have yet to find a solution.


Solution

  • You could do the following:

    1. Add a function to the frontend application that returns a pointer to the model

    2. Add a C++ source file with a function that can be called from C and that calls a function on the model:

       #include <gui/common/FrontendApplication.hpp>
       #include <gui/model/Model.hpp>
      
       using namespace touchgfx;
      
       #ifdef __cplusplus
       extern "C" {
       #endif
      
       // Callable from C: fetches the model through the application
       // singleton and forwards the event to it.
       void func1()
       {
           Model* model = static_cast<FrontendApplication*>(Application::getInstance())->getModelPtr();
           if (model)
           {
               model->funcOnModel();
           }
       }
      
       #ifdef __cplusplus
       }
       #endif
      
      
    3. Call func1() from your C backend

    From funcOnModel, the TouchGFX widgets can be accessed via the ModelListener interface and the derived presenters.
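    Putting the three steps together, the pattern can be sketched with minimal stand-ins for the generated TouchGFX classes. `func1`, `funcOnModel`, and `getModelPtr` come from the steps above; `encoderChanged` and the `bind` wiring mimic the generated Model/ModelListener pair but are simplified here:

```cpp
#include <cstddef>

// Minimal stand-ins for the generated gui/ classes; only the parts
// relevant to the bridging pattern are shown.
class ModelListener;

class Model
{
public:
    void bind(ModelListener* listener) { modelListener = listener; }
    void funcOnModel(); // called from the C bridge below
private:
    ModelListener* modelListener = NULL;
};

class ModelListener
{
public:
    virtual ~ModelListener() {}
    virtual void encoderChanged(int delta) = 0; // overridden by presenters
};

// Step 1: the frontend application exposes its model.
class FrontendApplication
{
public:
    static FrontendApplication* getInstance()
    {
        static FrontendApplication app;
        return &app;
    }
    Model* getModelPtr() { return &model; }
private:
    Model model;
};

void Model::funcOnModel()
{
    if (modelListener)
    {
        modelListener->encoderChanged(1); // propagate to the active presenter
    }
}

// Step 2: the C-callable bridge (step 3 is calling it from the backend).
extern "C" void func1()
{
    Model* model = FrontendApplication::getInstance()->getModelPtr();
    if (model) { model->funcOnModel(); }
}
```

    In the real project, the presenter implementing ModelListener then updates its view's widgets; since funcOnModel may be invoked from backend context, it is safest to have it only record the event and let the per-tick model/presenter path do the actual widget updates.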