I'm having trouble using webgl2 to render to an integer renderbuffer. Using the example code at the bottom, I get all zeros in the pixels array and the following error on Firefox:
WebGL warning: readPixels: Format and type RGBA_INTEGER/<enum 0x1400> incompatible with this RGBA8I attachment. This framebuffer requires either RGBA_INTEGER/INT or getParameter(IMPLEMENTATION_COLOR_READ_FORMAT/_TYPE) RGBA_INTEGER/INT.
When I substitute:
gl.readPixels(
0, 0, 256, 256,
gl.getParameter(gl.IMPLEMENTATION_COLOR_READ_FORMAT),
gl.getParameter(gl.IMPLEMENTATION_COLOR_READ_TYPE),
pixels
);
for
gl.readPixels(0, 0, 256, 256, gl.RGBA_INTEGER, gl.BYTE, pixels);
The error becomes:
WebGL warning: readPixels: `pixels` type does not match `type`.
(On Firefox, gl.getParameter(gl.IMPLEMENTATION_COLOR_READ_TYPE) returns gl.INT instead of gl.BYTE.)
I've tried changing the TypedArray for pixels between Uint8Array and Int8Array, but neither option works. I should note that the provided example does work on Chrome. Is there a way to render to an integer buffer on Firefox, or is it entirely bugged?
const gl = document.getElementById('target').getContext('webgl2', { preserveDrawingBuffer: true });
const prog = gl.createProgram();
const vert = gl.createShader(gl.VERTEX_SHADER);
gl.shaderSource(vert, `#version 300 es
in vec2 a_position;
void main() {
gl_Position = vec4(a_position, 0., 1.);
}`);
gl.compileShader(vert);
gl.attachShader(prog, vert);
const frag = gl.createShader(gl.FRAGMENT_SHADER);
gl.shaderSource(frag, `#version 300 es
precision highp int;
out ivec4 color;
void main() {
color = ivec4(255, 0, 0, 255);
}`);
gl.compileShader(frag);
gl.attachShader(prog, frag);
gl.linkProgram(prog);
const vao = gl.createVertexArray();
gl.bindVertexArray(vao);
const buff = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, buff);
gl.bufferData(
gl.ARRAY_BUFFER,
new Float32Array([-1,-1, -1,1, 1,-1, 1,1, -1,1, 1,-1]),
gl.STATIC_DRAW
);
const pos = gl.getAttribLocation(prog, 'a_position');
gl.enableVertexAttribArray(pos);
gl.vertexAttribPointer(pos, 2, gl.FLOAT, false, 0, 0);
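// Create a 256x256 signed-integer (RGBA8I) renderbuffer and attach it to a framebuffer.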
const rbo = gl.createRenderbuffer();
gl.bindRenderbuffer(gl.RENDERBUFFER, rbo);
gl.renderbufferStorage(gl.RENDERBUFFER, gl.RGBA8I, 256, 256);
const fbo = gl.createFramebuffer();
gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
gl.framebufferRenderbuffer(
gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0,
gl.RENDERBUFFER, rbo
);
gl.viewport(0, 0, 256, 256);
// clear() can't be used on integer color buffers; clear draw buffer 0 with clearBufferiv instead.
gl.clearBufferiv(gl.COLOR, 0, new Int32Array([0, 0, 0, 0]));
gl.disable(gl.DEPTH_TEST);
gl.useProgram(prog);
gl.drawBuffers([gl.COLOR_ATTACHMENT0]);
gl.drawArrays(gl.TRIANGLES, 0, 6);
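// Read the pixels back; this is the readPixels call that triggers the warning above on Firefox.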
gl.readBuffer(gl.COLOR_ATTACHMENT0);
const pixels = new Int8Array(256 ** 2 * 4);
gl.readPixels(0, 0, 256, 256, gl.RGBA_INTEGER, gl.BYTE, pixels);
console.log(pixels);
A portable, spec-compliant app that hopes to run on all spec-conformant implementations MUST support a fallback for when readPixels must use RGBA_INTEGER+INT instead of RGBA_INTEGER+BYTE.

In short: the WebGL 2 spec does not guarantee that e.g. an RGBA8I attachment can be read with RGBA_INTEGER+BYTE, and it is NOT portable to rely on this, as spec-conformant implementations and drivers may indeed not support it.

Implementations MAY support it, but it is also spec-conformant for an implementation not to, and to instead require the default readPixels format+type for signed-integer surfaces: RGBA_INTEGER+INT.
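For example, such a fallback can query the implementation-chosen combination and size the destination array to match the reported type, since readPixels rejects a mismatched view. This is a minimal sketch, not a definitive implementation: readIntPixels is a name I made up here, and it assumes the read framebuffer is already bound and complete, and that the reported format has four channels.

function readIntPixels(gl, x, y, w, h) {
  const format = gl.getParameter(gl.IMPLEMENTATION_COLOR_READ_FORMAT);
  const type = gl.getParameter(gl.IMPLEMENTATION_COLOR_READ_TYPE);
  const count = w * h * 4; // assumes a 4-channel format such as RGBA_INTEGER
  let pixels;
  switch (type) {
    case gl.BYTE:          pixels = new Int8Array(count);   break;
    case gl.UNSIGNED_BYTE: pixels = new Uint8Array(count);  break;
    case gl.INT:           pixels = new Int32Array(count);  break; // what Firefox reports for RGBA8I
    case gl.UNSIGNED_INT:  pixels = new Uint32Array(count); break;
    default: throw new Error('unhandled readPixels type 0x' + type.toString(16));
  }
  gl.readPixels(x, y, w, h, format, type, pixels);
  return pixels;
}

With the example above, Firefox reports RGBA_INTEGER/INT, so the read lands in an Int32Array.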
GLES 3.0.6 p191-192 subsection 4.3.2 "Reading Pixels":
Only two combinations of format and type are accepted in most cases. The first varies depending on the format of the currently bound rendering surface. For normalized fixed-point rendering surfaces, the combination format RGBA and type UNSIGNED_BYTE is accepted. For signed integer rendering surfaces, the combination format RGBA_INTEGER and type INT is accepted. For unsigned integer rendering surfaces, the combination format RGBA_INTEGER and type UNSIGNED_INT is accepted.

The second is an implementation-chosen format from among those defined in table 3.2, excluding formats DEPTH_COMPONENT and DEPTH_STENCIL. The values of format and type for this format may be determined by calling GetIntegerv with the symbolic constants IMPLEMENTATION_COLOR_READ_FORMAT and IMPLEMENTATION_COLOR_READ_TYPE, respectively. GetIntegerv generates an INVALID_OPERATION error in these cases if the object bound to READ_FRAMEBUFFER_BINDING is not framebuffer complete (as defined in section 4.4.4.2), or if READ_BUFFER is NONE, or if the GL is using a framebuffer object (i.e. READ_FRAMEBUFFER_BINDING is non-zero) and the read buffer selects an attachment that has no image attached. The implementation-chosen format may vary depending on the format of the selected read buffer of the currently bound read framebuffer.

Additionally, when the internal format of the rendering surface is RGB10_A2, a third combination of format RGBA and type UNSIGNED_INT_2_10_10_10_REV is accepted.

ReadPixels generates an INVALID_OPERATION error if the combination of format and type is unsupported.
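Applied to the example above: RGBA8I is a signed-integer rendering surface, so the always-accepted combination is RGBA_INTEGER+INT. In WebGL 2, type INT must be paired with an Int32Array, even though the attachment only stores 8 bits per channel:

const pixels = new Int32Array(256 ** 2 * 4);
gl.readPixels(0, 0, 256, 256, gl.RGBA_INTEGER, gl.INT, pixels);

That substitution should make the readback work on Firefox as well, since it is the combination the warning text asks for.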
That said, I'll look at whether we can add support for this where possible, which should cover most desktop GL drivers, though there are likely mobile GLES drivers that will remain spec-compliant and never support it.

I'll also work on improving the error text, to make it clearer that the implementation is not out-of-spec in this regard, and that spec-compliant apps MUST accept this as a possible restriction.

I am the WebGL Spec Editor and also the Firefox WebGL engine lead, in case there are further questions!
PS: If Chrome does not always accept readPixels(format: RGBA_INTEGER, type: INT) from an RGBA8I framebuffer attachment, Chrome is out-of-spec.