Tags: webgl, color-space, gamma

Should WebGL shader output be adjusted for gamma?


Should a WebGL fragment shader output gl_FragColor RGB values which are linear, or raised to some 1/γ power in order to correct for display gamma? If the latter, is there a specific value to use, or must a complete application make it configurable?
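
To make the two alternatives concrete, here is a minimal GLSL sketch (the uniform name and the 2.2 exponent are illustrative assumptions; 2.2 is only the conventional approximation of display gamma):

    precision mediump float;
    uniform vec3 u_linearColor;  // hypothetical linear-light RGB value

    void main() {
        // Alternative A: write the linear value unchanged
        // gl_FragColor = vec4(u_linearColor, 1.0);

        // Alternative B: raise to a 1/gamma power before output
        vec3 corrected = pow(u_linearColor, vec3(1.0 / 2.2));
        gl_FragColor = vec4(corrected, 1.0);
    }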

The WebGL specification does not currently mention “gamma” or “γ”, nor contain a relevant use of “linear”, and the GL_ARB_framebuffer_sRGB extension is not available in WebGL. Is there some other applicable specification? If this is underspecified, what do current implementations do? A well-sourced answer would be appreciated.

(Assume we have successfully loaded or procedurally generated linear color values; that is, gamma of texture images is not at issue.)


Solution

  • This is a tough one, but from what I've been able to dig up (primarily from this email thread) it seems that the current behavior is to gamma correct linear-color-space images (such as PNGs) as they are loaded. Things like JPEGs get loaded without any transformation because they are already gamma corrected. (Source: https://www.khronos.org/webgl/public-mailing-list/archives/1009/msg00013.html) This would indicate that textures may be passed to WebGL in a non-linear space, which would be problematic (a hedged shader-side workaround is sketched at the end of this answer). I'm not sure whether that has changed since late 2010.

    Elsewhere in that thread it's made very clear that the desired behavior is for everything input to and output from WebGL to be in a linear color space. What happens beyond that is outside the scope of the WebGL spec (which is why it's silent on the issue).

    Sorry if that doesn't authoritatively answer your question; I'm just digging up what I can on the matter. As for whether or not you should be doing correction in a shader, I would say that the answer appears to be "no": since the WebGL output is going to be assumed to be linear, attempting to self-correct may lead to a double transformation of the color space.
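
    Following that reasoning, here is a minimal sketch of the "output linear, don't self-correct" approach, with an optional manual decode in case a texture was uploaded already gamma-encoded (the texture name and the 2.2 approximation are assumptions on my part, not anything the spec or that thread mandates):

        precision mediump float;
        uniform sampler2D u_texture;  // hypothetical texture that may hold gamma-encoded data
        varying vec2 v_texCoord;

        void main() {
            vec4 texel = texture2D(u_texture, v_texCoord);

            // If the texture arrived gamma-encoded (e.g. an untouched JPEG),
            // approximately decode it back to linear light:
            vec3 linearRgb = pow(texel.rgb, vec3(2.2));

            // Emit linear values and leave display gamma to the platform.
            gl_FragColor = vec4(linearRgb, texel.a);
        }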