I have an ATI Radeon HD 5770 GPU in a 2012 Mac Pro.
When I run the following code in my program:
std::cout << glGetString(GL_RENDERER) << std::endl;
std::cout << glGetString(GL_VENDOR) << std::endl;
std::cout << glGetString(GL_VERSION) << std::endl;
std::cout << glGetString(GL_SHADING_LANGUAGE_VERSION) << std::endl;
The output I get is as follows:
ATI Radeon HD 5770 OpenGL Engine
ATI Technologies Inc.
2.1 ATI-1.24.35
1.20
But OpenGL Extensions Viewer reports that the GPU supports OpenGL 4.1.
Is there a way I can use 4.1? Why does it keep telling me that the version is 2.1?
When you create the OpenGL context, you can request either a Legacy/Compatibility profile (which on macOS is capped at OpenGL 2.1) or a Core profile (OpenGL 3.2 and up, 4.1 on your hardware).
You need to create a Core profile context to use OpenGL 4.1 functionality. How you do this depends on the library that creates the context (SDL, GLFW, Apple's NSOpenGL/CGL APIs, etc.), but it is usually just a flag or hint passed to the initialization function.
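For example, with GLFW the request is a few window hints set before window creation. This is a minimal sketch, assuming you have GLFW available; note that macOS additionally requires the forward-compatibility flag for Core contexts:

```cpp
// Sketch: requesting a Core profile context with GLFW (assumes GLFW is installed).
#include <GLFW/glfw3.h>
#include <iostream>

int main() {
    if (!glfwInit()) return 1;

    // Ask for at least OpenGL 3.2 Core. On macOS the driver then gives you
    // the highest Core version it supports (4.1 on a Radeon HD 5770).
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 2);
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
    glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE); // required on macOS for Core

    GLFWwindow* window = glfwCreateWindow(640, 480, "Core context", nullptr, nullptr);
    if (!window) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(window);

    // GL_VERSION should now report 4.1 instead of 2.1.
    std::cout << glGetString(GL_VERSION) << std::endl;

    glfwDestroyWindow(window);
    glfwTerminate();
    return 0;
}
```

If you are using SDL instead, the equivalent is `SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_CORE)` plus the version attributes, set before creating the window. Keep in mind that a Core context removes the fixed-function pipeline, so legacy calls such as `glBegin`/`glEnd` will no longer work.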