opengl glsl light deferred-rendering deferred-shading

calculate light volume radius from intensity


I am currently having a problem calculating the light volume radius for a deferred renderer. At low light intensities the volume size looks correct, but as the light intensity (and therefore the radius) increases, the light volume becomes increasingly too small.

I am calculating the light volume radius (in world space) like this:

const float LIGHT_CUTOFF_DEFAULT = 50;
// world-space radius, used to scale the light volume box
mRadius = sqrt(color.length() * LIGHT_CUTOFF_DEFAULT);

I then use this value to scale a box.

In my shader, I then calculate the attenuation like this:

float falloff = 5;
float attenuation = max(0, 1.0 / (1+falloff*(distance*distance)));

So obviously I am messing up the math somewhere. The attenuation should be linear, right? But how do I correctly calculate the world-space scale value for the light volume?

P.S. The light color can go beyond (1,1,1) since I am planning to use HDR rendering.


Solution

  • Not with that equation it isn't; the light never reaches zero, it goes on forever.

    plot 1.0 / (1+5*(x*x)) at wolframalpha.com (the curve approaches zero but never reaches it)

    [EDIT] Since your light colour values can go above one, the following 1/255 will need to be divided by the largest RGB component.

    You'll need a threshold. Assuming your monitor can't display anything dimmer than 1/255 before black,

    solve 1.0 / (1+f*(x*x)) = 1/255, x

    which gives x = sqrt(254 / f).

    Where f is your falloff. For f = 5, the effective radius is ~7.
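
    For example, here is a minimal C++ sketch of that radius computation (the function name and the per-channel maximum are assumptions; the 1/255 cutoff and the falloff follow the discussion above):

    #include <algorithm>
    #include <cmath>

    // Solve maxChannel / (1 + falloff * r*r) = cutoff for r.
    // maxChannel is the brightest (possibly HDR) colour channel, so the
    // 1/255 threshold is effectively divided by it, as noted above.
    float lightVolumeRadius(float r, float g, float b, float falloff)
    {
        const float cutoff = 1.0f / 255.0f;              // visibility threshold
        float maxChannel = std::max(r, std::max(g, b));  // HDR scale factor
        float radiusSq = (maxChannel / cutoff - 1.0f) / falloff;
        return std::sqrt(std::max(radiusSq, 0.0f));      // guard very dim lights
    }

    For a white light of (1, 1, 1) and falloff = 5 this gives sqrt(254 / 5) ≈ 7.1, matching the effective radius above.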


    You could probably increase 1/255 a little depending on your application, and you might not notice anything badly wrong. Alternatively, fudge an artificial falloff function which isn't infinite :)

    This issue is also discussed here: https://gamedev.stackexchange.com/questions/51291/deferred-rendering-and-point-light-radius, where the function is adjusted to reach zero at the threshold.
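
    As a rough sketch of that adjustment (the function name is hypothetical; written in C++ here, but it translates to GLSL almost verbatim), the curve is biased and rescaled so it hits exactly zero at the cutoff:

    #include <algorithm>

    // Rescale the attenuation so it reaches exactly 0 at the cutoff instead of
    // approaching 0 asymptotically; inside that radius the shape barely changes.
    float attenuationZeroAtCutoff(float dist, float falloff, float cutoff)
    {
        float raw = 1.0f / (1.0f + falloff * dist * dist);  // original curve
        float scaled = (raw - cutoff) / (1.0f - cutoff);    // cutoff -> 0, 1 -> 1
        return std::max(scaled, 0.0f);                      // clamp beyond the radius
    }

    With cutoff = 1/255 (divided by the largest RGB component for HDR lights, as noted above), the light volume radius computed earlier then encloses everything the light actually touches.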