I am currently running into some strange behavior.
Details:
In the vertex shader, I declare the attribute at location 1 as a uint:
layout (location = 1) in uint VS_In_ID;
On the C side, I declare it as GL_UNSIGNED_INT with a size of 1 component:
glVertexAttribFormat(1, 1, GL_UNSIGNED_INT, GL_FALSE, 0);
I uploaded a bunch of unsigned int data correctly (I ensured this by calling glGetBufferSubData):
glBindBuffer(GL_ARRAY_BUFFER, data.idsVBO);
glBufferData(GL_ARRAY_BUFFER, sizeof(GLuint) * data.numPoints, data.ids, GL_STATIC_DRAW);
glBindVertexBuffer(1, data.idsVBO, 0, sizeof(GLuint));
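For reference, the read-back check was roughly this (a sketch, simplified from my actual code; it assumes <stdlib.h>, <string.h>, and <assert.h>):
/* Read the buffer back and compare byte-wise against the source array;
   glGetBufferSubData copies raw bytes, so a match confirms the upload itself. */
GLuint *check = malloc(sizeof(GLuint) * data.numPoints);
glBindBuffer(GL_ARRAY_BUFFER, data.idsVBO);
glGetBufferSubData(GL_ARRAY_BUFFER, 0, sizeof(GLuint) * data.numPoints, check);
assert(memcmp(check, data.ids, sizeof(GLuint) * data.numPoints) == 0);
free(check);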
The attribute is rendered to a texture (bound to GL_COLOR_ATTACHMENT1) configured by
glTexImage2D(GL_TEXTURE_2D, 0, GL_R32UI, config.bufferWidth, config.bufferHeight, 0,
GL_RED_INTEGER, GL_UNSIGNED_INT, NULL);
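For completeness, the attachment itself is a plain glFramebufferTexture2D call (a sketch; data.fbo and data.idsTex are placeholder names for my actual framebuffer and texture objects):
glBindFramebuffer(GL_FRAMEBUFFER, data.fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT1,
                       GL_TEXTURE_2D, data.idsTex, 0);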
and the data is fetched by
glGetTexImage(GL_TEXTURE_2D, 0, GL_RED_INTEGER, GL_UNSIGNED_INT, data);
But finally, I found that the values I uploaded (all set to 2) were implicitly converted to float: when I fetch a value that should be 2, I get 0x40000000, which is the bit pattern of the float 2.0!
I did not draw this conclusion only from fetching, but also in the shader, using something like
if (VS_In_ID == 0x40000000u)
    gl_Position.x = gl_Position.x + 1.0f;
and watching the displacement of the faces.
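(For what it's worth, the bit pattern is easy to confirm on the C side; a tiny sketch, nothing OpenGL-specific, assuming <string.h>:)
float two = 2.0f;
GLuint bits;
memcpy(&bits, &two, sizeof bits);  /* bits is now 0x40000000 */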
Why and when did this cast happen?
Why are my attribute values implicitly cast to float in OpenGL?
Because that is how it is specified. glVertexAttribFormat specifies how the data in the buffer is to be interpreted as floating-point numbers, while glVertexAttribIFormat specifies how the data is to be interpreted as integers. These functions specify how the data must be converted for the shader program. Since the type of the vertex shader input cannot be guessed, there are different functions for different types. The type argument only specifies the type of the source data, not the type of the target. So if the type of the attribute is uint, you have to use glVertexAttribIFormat.
glVertexAttribFormat(1, 1, GL_UNSIGNED_INT, GL_FALSE, 0);  /* wrong: converts the data to float */
glVertexAttribIFormat(1, 1, GL_UNSIGNED_INT, 0);           /* correct: keeps it as an unsigned integer */
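Putting it together, a minimal sketch of the whole integer-attribute setup (vao is a placeholder for your vertex array object; binding index 1 matches the snippets in the question):
glBindVertexArray(vao);
glEnableVertexAttribArray(1);
glVertexAttribIFormat(1, 1, GL_UNSIGNED_INT, 0);  /* 'I' variant: no float conversion */
glVertexAttribBinding(1, 1);                      /* attribute 1 reads from binding 1 */
glBindVertexBuffer(1, data.idsVBO, 0, sizeof(GLuint));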