
WebGL2 FBO depth attachment values


I'm simply trying to render the depth values of my scene using WebGL2 like so:

//Texture
depthTexture = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, depthTexture);
 
gl.texImage2D(gl.TEXTURE_2D, 0, gl.DEPTH_COMPONENT24,
                width, height, 0,
                gl.DEPTH_COMPONENT, gl.UNSIGNED_INT, null);
 
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);

//FBO
fb = gl.createFramebuffer();
gl.bindFramebuffer(gl.FRAMEBUFFER, fb);
gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.DEPTH_ATTACHMENT, gl.TEXTURE_2D, depthTexture, 0);
gl.bindFramebuffer(gl.FRAMEBUFFER, null);

Then I render it like so:

gl.bindFramebuffer(gl.FRAMEBUFFER, fb);

gl.useProgram(shaderProgram);

//Just a torus...
gl.bindVertexArray(vao);
 
gl.colorMask(false, false, false, false);
gl.enable(gl.DEPTH_TEST);
gl.clear(gl.DEPTH_BUFFER_BIT);
...
gl.drawElements(gl.TRIANGLES, indices.length, gl.UNSIGNED_SHORT, 0);

//Then I draw a full screen quad that simply samples the depth texture
gl.bindFramebuffer(gl.FRAMEBUFFER, null);
....

It seems to work fine; however, when sampling the attached depth texture I get what looks like linear depth.
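For reference, a minimal fragment shader for that full-screen quad could look like the sketch below (`depthTex` and `uv` are my own names, not from the question). In WebGL2 a `DEPTH_COMPONENT` texture can be sampled with a plain `sampler2D` as long as `TEXTURE_COMPARE_MODE` is left at its default of `NONE` (as it is in the setup above); the depth value comes back in the `r` component:

```glsl
#version 300 es
precision highp float;

uniform sampler2D depthTex; // the FBO's depth attachment (hypothetical name)
in vec2 uv;                 // passed from the quad's vertex shader
out vec4 outColor;

void main() {
  float d = texture(depthTex, uv).r; // depth value in [0, 1], non-linear
  outColor = vec4(vec3(d), 1.0);     // visualize as grayscale
}
```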

The shaders I use to render into the FBO are very basic.

Vert:

#version 300 es
 
layout(location = 0) in vec3 position;

uniform mat4 projection;
uniform mat4 view;
uniform mat4 model;

void main() {
 
  gl_Position = projection * view * model * vec4(position, 1.);
}

Frag:

#version 300 es
precision highp float;


layout(location = 0) out vec4 outColor;

void main() {
  outColor = vec4(1.0, 0.0, 0.0, 1.0);
}

I was expecting the depth buffer to be logarithmic, and while I can't say for sure whether what I currently get is really linear, it doesn't look logarithmic either. If it really is linear, and this is what you are supposed to get from the depth attachment, I'm perfectly fine with it, since I'm probably going to need linear rather than logarithmic depth. But currently I'm not sure whether this is expected behavior or whether I'm doing something wrong (probably the latter).
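For context, a sketch of the standard mapping (my own illustration, not code from the question): with the usual perspective projection and the default [0, 1] depth range, the value written to the depth buffer is hyperbolic in eye-space z, which is why it is often loosely called "logarithmic".

```javascript
// Sketch, assuming a standard OpenGL perspective projection matrix:
// the depth-buffer value is hyperbolic (1/z-shaped) in eye-space z.
function depthBufferValue(zEye, near, far) {
  // NDC depth; zEye is negative in front of the camera.
  const zNdc = (far + near) / (far - near) + (2 * far * near) / ((far - near) * zEye);
  // The default depth range maps NDC [-1, 1] to buffer values [0, 1].
  return (zNdc + 1) / 2;
}

const near = 1.0, far = 1000.0;
// Almost all of the [0, 1] range is spent close to the near plane:
console.log(depthBufferValue(-2, near, far));    // ~0.5005, only 2 units away
console.log(depthBufferValue(-500, near, far));  // ~0.999, halfway to the far plane
```

This bunching near the near plane is exactly why a depth texture looks almost uniformly white unless the geometry sits very close to the camera.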

Cheers


Solution

  • Thank you @LJ for the link. I think I know what is actually happening. My previous conclusions were incorrect, mostly because I did not understand what was going on. What I was seeing was, as expected, logarithmic depth; but since my near and far clipping planes were 1.0 and 1000.0 respectively, and the torus was relatively close to the near clipping plane, it sat in the part of the 1/z curve where values bunch up, giving high precision. As soon as I increased the torus's depth, I saw the big precision loss. After linearizing the depth, using something like How to read depth texture attached to FBO in WebGL2, and using near and far clipping planes of 0.1 and 10 (in order to have visual feedback, since the torus is relatively small in depth), I could see that the depth was linear.
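For completeness, a small sketch of the linearization step (my own helper name, not code from the linked answer; it assumes the standard perspective projection and the default [0, 1] depth range):

```javascript
// Sketch: invert the hyperbolic depth mapping to recover eye-space distance.
// Assumes a standard OpenGL perspective projection.
function linearizeDepth(d, near, far) {
  const zNdc = d * 2 - 1; // buffer value [0, 1] back to NDC [-1, 1]
  return (2 * near * far) / (far + near - zNdc * (far - near));
}

// With the near/far planes from the answer (0.1 and 10), the recovered
// distance runs linearly from near to far; dividing by far gives a
// [near/far, 1] value suitable for grayscale display:
const near = 0.1, far = 10.0;
console.log(linearizeDepth(0, near, far)); // ≈ 0.1 (the near plane)
console.log(linearizeDepth(1, near, far)); // ≈ 10  (the far plane)
```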