I have a problem with the following shader function in my Android OpenGL ES project:
highp float calculateShadow(highp float cosTheta, highp vec4 shadowCoord)
{
    highp float bias = 0.005 * sqrt(1.0 - cosTheta * cosTheta) / cosTheta;
    bias = clamp(bias, 0.0, 0.01);
    shadowCoord.z = shadowCoord.z - bias;
    return textureProjOffset(uTextureShadowMap, shadowCoord, ivec2(0, 0));
}
This function determines whether a fragment is in shadow. The first parameter (cosTheta) is calculated as follows:
float cosTheta = dot(theNormal, fragmentToSun);
So cosTheta is the cosine of the angle between the face's normal and the direction from the fragment to the sun (both vectors normalized). It is used to scale the bias with the face's slope, and that works quite well. I got the technique from opengl-tutorial.com.
But here's my problem:
On my Samsung Galaxy S7 with an Exynos chipset I get no shadow acne, but on devices with a Snapdragon 600 or 855 chipset I get a lot of acne.
Why is there a difference? The shader function should calculate roughly the same bias values no matter which chipset it runs on. How can I get the same results on almost all devices?
(I tried glPolygonOffset before, but its behavior differs between almost every chipset.)
The problem may come from the screen resolution of your other devices. If you size the shadow-map texture based on the device's resolution, a device with a lower screen resolution gets a lower-resolution shadow map. Try defining the shadow-map texture size with a constant value in your program.