javascript, webgl, depth-buffer, webgl2, mouse-picking

Can I read a single pixel value from a WebGL depth texture in JavaScript?


In short

I would like to read a single pixel value from a WebGL 2 depth texture in JavaScript. Is this at all possible?

The scenario

I am rendering a scene in WebGL 2. The renderer is given a depth texture to which it writes the depth buffer. This depth texture is used in post-processing shaders and the like, so it is available to us.
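
For concreteness, such a depth texture is typically set up along these lines in WebGL 2 (a sketch with placeholder names such as fbo, depthTexture, width and height, not the actual renderer code):

const fbo = gl.createFramebuffer();
gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);

const depthTexture = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, depthTexture);
gl.texStorage2D(gl.TEXTURE_2D, 1, gl.DEPTH_COMPONENT24, width, height); // depth-only storage
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);

// The renderer writes its depth buffer into this texture via the depth attachment.
gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.DEPTH_ATTACHMENT, gl.TEXTURE_2D, depthTexture, 0);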

However, I need to read my single pixel value in JavaScript, not from within a shader. If this had been a normal RGB texture, I would do

function readPixel(x, y, texture, outputBuffer) {
    // Attach the texture to a temporary framebuffer, read the single pixel, then clean up.
    const frameBuffer = gl.createFramebuffer();
    gl.bindFramebuffer( gl.FRAMEBUFFER, frameBuffer );
    gl.framebufferTexture2D( gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, texture, 0 );
    gl.readPixels(x, y, 1, 1, gl.RGBA, gl.UNSIGNED_BYTE, outputBuffer);
    gl.bindFramebuffer( gl.FRAMEBUFFER, null );
    gl.deleteFramebuffer( frameBuffer );
}

This will write the pixel at x, y into outputBuffer.
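
For illustration, a call would look roughly like this (colorTexture is just a placeholder name for an ordinary RGBA texture, not something from the actual code):

const pixel = new Uint8Array(4);           // room for one RGBA pixel, one byte per channel
readPixel(100, 200, colorTexture, pixel);  // pixel now holds the [r, g, b, a] values at (100, 200)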

However, is it at all possible to do the same with a depth texture? If I just pass a depth texture to my function above, the output buffer contains only zeros, and I get the WebGL warning GL_INVALID_FRAMEBUFFER_OPERATION: Framebuffer is incomplete. Checking the framebuffer status reveals FRAMEBUFFER_INCOMPLETE_ATTACHMENT.
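
The completeness check itself is just a few lines (a sketch; depthTexture stands in for the depth texture the renderer writes to):

const fb = gl.createFramebuffer();
gl.bindFramebuffer(gl.FRAMEBUFFER, fb);
gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, depthTexture, 0);
const status = gl.checkFramebufferStatus(gl.FRAMEBUFFER);
if (status !== gl.FRAMEBUFFER_COMPLETE) {
    // With the depth texture bound to COLOR_ATTACHMENT0 this reports
    // gl.FRAMEBUFFER_INCOMPLETE_ATTACHMENT (0x8CD6).
    console.warn('Framebuffer incomplete, status:', status);
}
gl.bindFramebuffer(gl.FRAMEBUFFER, null);
gl.deleteFramebuffer(fb);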

Naturally, the depth texture is not an RGBA texture, but are there other format and type values we can pass to readPixels to get our depth value, or is it impossible?

Motivation

I am aware that this question has been asked a number of times on Stack Overflow and elsewhere in one form or another, but there is always some variation, which has made it hard for me to get a straight yes-or-no answer to the question as I ask it here. In addition, many of those questions and sources are quite old, cover WebGL 1 only, mention the WEBGL_depth_texture extension making a difference, and so on.

If the answer is no, I'd welcome any suggestions for how else to easily obtain this depth pixel. As this operation is not done every frame, I value simplicity over performance. The use case is picking, and classical ray intersection is not feasible. (I also know that I can encode a scalar depth value into and out of an RGB pixel, but I need to be able to read the pixel from the JavaScript code in the first place.)

I'd welcome any insights.


Solution

  • There is no possibility. WebGL 2.0 is based on OpenGL ES 3.0.
    The OpenGL ES 3.2 Specification - 4.3.2 Reading Pixels clearly specifies:

    [...] The second is an implementation-chosen format from among those defined in table 3.2, excluding formats DEPTH_COMPONENT and DEPTH_STENCIL [...]
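
    The implementation-chosen format referred to there can even be queried from JavaScript, and it is always a color format, never a depth one (a small sketch, assuming gl is the WebGL 2 context):

    const fmt  = gl.getParameter(gl.IMPLEMENTATION_COLOR_READ_FORMAT);
    const type = gl.getParameter(gl.IMPLEMENTATION_COLOR_READ_TYPE);
    // fmt/type come back as color formats such as gl.RGBA / gl.UNSIGNED_BYTE; per the
    // passage quoted above they can never be DEPTH_COMPONENT or DEPTH_STENCIL, so
    // gl.readPixels has no format that returns depth values directly.
    console.log(fmt, type);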