I'm not sure what the problem is: reading the pixels fails when writing to multiple renderbuffers. Here is the code:
Setup code:
// gen framebuffer object
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
// gen renderbuffer 0
glGenRenderbuffers(1, &rbo_color0);
glBindRenderbuffer(GL_RENDERBUFFER, rbo_color0);
glRenderbufferStorage(GL_RENDERBUFFER, GL_RGB, w, h);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, rbo_color0);
// gen renderbuffer 1
glGenRenderbuffers(1, &rbo_color1);
glBindRenderbuffer(GL_RENDERBUFFER, rbo_color1);
glRenderbufferStorage(GL_RENDERBUFFER, GL_RED_INTEGER, w, h);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT1, GL_RENDERBUFFER, rbo_color1);
// gen depth buffer
glGenRenderbuffers(1, &rbo_depth);
glBindRenderbuffer(GL_RENDERBUFFER, rbo_depth);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT, w, h);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, rbo_depth);
glBindFramebuffer(GL_FRAMEBUFFER, 0);
Render and read pixel code:
// setup shader and uniforms...
// bind framebuffer and clear color/depth buffer
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glEnable(GL_DEPTH_TEST);
glClearColor(0.f, 0.f, 0.f, 0.1f);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
const GLuint buffers[] = {GL_COLOR_ATTACHMENT0, GL_COLOR_ATTACHMENT1};
glDrawBuffers(2, buffers);
glBindVertexArray(vao);
glDrawElements(GL_TRIANGLES, index_count, GL_UNSIGNED_INT, (void*)0);
glReadBuffer(GL_COLOR_ATTACHMENT0);
glReadPixels(x, y, 1, 1, GL_RGB, GL_FLOAT, data);
glReadBuffer(GL_COLOR_ATTACHMENT1);
glReadPixels(x, y, 1, 1, GL_RED_INTEGER, GL_INT, (int*)(data) + 4);
glBindVertexArray(0);
glBindFramebuffer(GL_FRAMEBUFFER, 0);
I used layout qualifiers in the shader for the outputs, so there shouldn't be any problem there; a simplified sketch of the outputs is below. So I think something is probably wrong with the buffer setup or the render code. I've succeeded in reading the pixels when there is only one renderbuffer (rbo_color0); the main difference is the added second renderbuffer (rbo_color1).
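The fragment shader declares the outputs roughly like this (a simplified sketch, shown here as a C++ string; the names are placeholders, not the actual shader source):
// GLSL fragment shader outputs (simplified sketch; names are placeholders)
const char* fs_outputs = R"(
layout(location = 0) out vec3 frag_color; // goes to GL_COLOR_ATTACHMENT0
layout(location = 1) out int  frag_id;    // goes to GL_COLOR_ATTACHMENT1
)";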
Any help?
GL_RED_INTEGER is not a proper internal format, so it cannot be used for the 2nd parameter of glRenderbufferStorage. A proper enumerator constant for the internal format would be GL_R32I.
See OpenGL 4.6 API Compatibility Profile Specification; 8.26. TEXTURE IMAGE LOADS AND STORES; page 334
If you checked for OpenGL errors (glGetError), you would get a GL_INVALID_ENUM error.
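A minimal sketch of such a check, placed right after the suspect call (GL errors queue up, so loop until GL_NO_ERROR; printf needs <cstdio>):
// drain the GL error queue; with GL_RED_INTEGER as the internal format
// this reports 0x0500 (GL_INVALID_ENUM)
for (GLenum err = glGetError(); err != GL_NO_ERROR; err = glGetError())
{
    printf("GL error: 0x%04x\n", err);
}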
Change the renderbuffer storage specification to solve the issue:
glRenderbufferStorage(GL_RENDERBUFFER, GL_R32I, w, h);
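With GL_R32I storage, the readback in the question already matches the attachment (format GL_RED_INTEGER, type GL_INT), so it can stay as it is:
glReadBuffer(GL_COLOR_ATTACHMENT1);
glReadPixels(x, y, 1, 1, GL_RED_INTEGER, GL_INT, (int*)(data) + 4);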
Note, you should check the framebuffer completeness with glCheckFramebufferStatus:
GLenum fb_status = glCheckFramebufferStatus( GL_FRAMEBUFFER );
if ( fb_status != GL_FRAMEBUFFER_COMPLETE )
{
// error handling
}
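Putting it together, the relevant part of the setup would look like this (identifiers taken from the question; the completeness check goes right before unbinding the framebuffer):
// gen renderbuffer 1, now with a sized integer format
glGenRenderbuffers(1, &rbo_color1);
glBindRenderbuffer(GL_RENDERBUFFER, rbo_color1);
glRenderbufferStorage(GL_RENDERBUFFER, GL_R32I, w, h);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT1, GL_RENDERBUFFER, rbo_color1);
// ... depth attachment as before ...
// verify completeness before unbinding
GLenum fb_status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
if (fb_status != GL_FRAMEBUFFER_COMPLETE)
{
    // error handling
}
glBindFramebuffer(GL_FRAMEBUFFER, 0);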