I have a WebGL2 application in which I'm rendering to a framebuffer and then reading the values back with readPixels. This works fine when I initialize the framebuffer texture and later call readPixels with RGBA/UNSIGNED_BYTE, reading into a Uint8Array. However, if I try the same thing with RGBA16UI/RGBA_INTEGER/UNSIGNED_SHORT and read into a Uint16Array, I get a "Fragment shader output type does not match the bound framebuffer attachment type." error when I call readPixels.
I initialize the framebuffer like this...
// Create and bind the texture that backs the framebuffer's color attachment
gTargetTexture = gWebGL.createTexture();
gWebGL.bindTexture(gWebGL.TEXTURE_2D, gTargetTexture);

const level = 0;
const internalFormat = gWebGL.RGBA16UI;
const border = 0;
const format = gWebGL.RGBA_INTEGER;
const type = gWebGL.UNSIGNED_SHORT;
const data = null;
gWebGL.texImage2D(gWebGL.TEXTURE_2D, level, internalFormat, targetTextureWidth, targetTextureHeight, border, format, type, data);
// Create and bind the framebuffer
gFrameBuffer = gWebGL.createFramebuffer();
gWebGL.bindFramebuffer(gWebGL.FRAMEBUFFER, gFrameBuffer);
// attach the texture as the first color attachment
const attachmentPoint = gWebGL.COLOR_ATTACHMENT0;
gWebGL.framebufferTexture2D(gWebGL.FRAMEBUFFER, attachmentPoint, gWebGL.TEXTURE_2D, gTargetTexture, level);
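A completeness check right after attaching (a minimal sketch, using the same gWebGL global as above) can confirm that the attachment combination itself is accepted:
// Verify the framebuffer/attachment combination before rendering
const status = gWebGL.checkFramebufferStatus(gWebGL.FRAMEBUFFER);
if (status !== gWebGL.FRAMEBUFFER_COMPLETE) {
    console.error('Framebuffer incomplete, status: 0x' + status.toString(16));
}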
Then after rendering I call readPixels like this...
const size = width * height * 4; // 4 components (RGBA) per pixel
const pixels = new Uint16Array(size);
gWebGL.readPixels(0, 0, width, height, gWebGL.RGBA_INTEGER, gWebGL.UNSIGNED_SHORT, pixels);
As soon as I call readPixels I get the "Fragment shader output type does not match the bound framebuffer attachment type." error.
I've also tried combinations using RGBA32UI/UNSIGNED_INT with a Uint32Array, same result. I've followed other articles which seem to suggest that these combinations should work. What am I doing wrong? Thanks for any help or suggestions.
It turns out the problem was that with an RGBA16UI/RGBA_INTEGER/UNSIGNED_SHORT color attachment, the GLSL fragment shader must declare its output as a uvec4, not a vec4. This was unexpected because with an RGBA/UNSIGNED_BYTE texture the shader can output vec4 values in the 0 to 1 range and they are automatically mapped to 0 to 255; integer formats do no such normalization, so the shader has to write raw unsigned integer values.
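For reference, a minimal GLSL ES 3.00 fragment shader that matches an RGBA16UI attachment looks something like this (a sketch; the output name and the values written are just placeholders):
#version 300 es
precision highp float;
precision highp int;

// Integer color attachments require an integer output type:
// uvec4 for RGBA16UI/RGBA32UI, ivec4 for the signed formats.
out uvec4 outColor;

void main() {
    // Raw unsigned values, 0..65535 for RGBA16UI; no 0->1 normalization happens
    outColor = uvec4(12345u, 0u, 0u, 65535u);
}
With the output declared as uvec4, the readPixels call above (RGBA_INTEGER/UNSIGNED_SHORT into a Uint16Array) returns the values as written.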