I am currently writing a physically based ray tracer and I am trying to implement normal mapping. The issue is that normal mapping results in "weird" geometry, so reflections can act very strangely. For example, in the following scenario, where n represents the mapped normal at that point, the light bounces into the geometry of the object and is lost.
I am not sure how to handle this in a way that makes sense; there are many different BSDFs, and I can't seem to find a solution that would work for all of them...
I tried implementing normal mapping, but I get strange behaviour around grazing angles. For example:
The top part of the top-most brick is very dark, which should not be the case.
Does anyone have an idea of how to handle this?
This is an interesting issue that arises because a normal map is an approximation of a physical surface, one that omits the tiny bits of geometry needed to fully specify it.
That means there's no "physically correct" solution to it, but there are still better ways to handle it than coloring the pixel black. For example, one can imagine that if a light ray hit a piece of overhanging geometry that directed it inward toward the object, the ray would intersect the same surface a second time underneath the theoretical overhang as it traveled closer to the material.
Some implementations approximate this by mirroring again at the 90-degree mark from the geometric normal (i.e. at the plane tangent to the geometric surface). The further past 90 degrees the reflection tries to go, the more it's directed back towards the geometric normal. This emulates a double-bounce, to some extent.
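Here's a minimal sketch of that mirroring step, assuming `wo` is the outgoing direction produced by reflecting about the mapped normal and `ng` is the geometric normal; the `Vec3` type and the function name `fixBelowHorizon` are just placeholders for whatever your ray tracer uses:

```cpp
#include <cmath>

// Hypothetical minimal vector type for the sketch.
struct Vec3 { float x, y, z; };

float dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

Vec3 operator-(const Vec3& a, const Vec3& b) {
    return {a.x - b.x, a.y - b.y, a.z - b.z};
}

Vec3 operator*(float s, const Vec3& v) {
    return {s * v.x, s * v.y, s * v.z};
}

Vec3 normalize(const Vec3& v) {
    float len = std::sqrt(dot(v, v));
    return {v.x / len, v.y / len, v.z / len};
}

// If the outgoing direction 'wo' points below the geometric surface
// (negative dot product with the geometric normal 'ng'), reflect it
// across the tangent plane so it ends up above the surface again.
// A direction just barely below the horizon comes back just barely
// above it; one pointing straight into the surface comes back along
// ng itself, which is the "directed back towards the geometric
// normal" behaviour described above.
Vec3 fixBelowHorizon(Vec3 wo, const Vec3& ng) {
    float d = dot(wo, ng);
    if (d < 0.0f) {
        wo = normalize(wo - 2.0f * d * ng);  // mirror across tangent plane
    }
    return wo;
}
```

Note that this adjusts the sampled direction without touching the PDF it was sampled with, so it is (like everything else here) an approximation rather than a physically correct fix.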
But as with all solutions here, this is merely an approximation. In a real physical situation, the bit of geometry with its normal facing the wrong way would have been occluded by some closer piece of geometry, but we don't know where that piece is because it's been flattened into a normal map. So, we do the best we can with the available data.