Tags: swift, graphics, metal, core-image, normals

Calculating a normal vector for each pixel (2D image)


I'm trying to understand how the Core Image Shaded Material filter works and trying to replicate its behavior using Metal.

From the Shaded Material filter documentation:

This filter [...] computes a normal vector for each pixel. It then uses that normal vector to look up the reflected color for that direction in the input shading image.

The normal vector is then used to calculate the look-up coordinate for the shading image:

The input shading image contains the picture of a hemisphere, which defines the way the surface is shaded. The look-up coordinate for a normal vector is:

(normal.xy + 1.0) * 0.5 * vec2(shadingImageWidth, shadingImageHeight)

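For reference, that lookup could be expressed in a Metal shader roughly like this (a sketch only; `shadingImage` is a placeholder name for the hemisphere texture):

```metal
#include <metal_stdlib>
using namespace metal;

// Maps a unit normal to a texel coordinate in the hemisphere shading image,
// following the formula quoted above.
static float2 shadingLookupCoordinate(float3 normal,
                                      texture2d<float, access::sample> shadingImage)
{
    float2 size = float2(shadingImage.get_width(), shadingImage.get_height());
    return (normal.xy + 1.0) * 0.5 * size;
}
```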

Assuming I have a height field texture, how do I calculate the normal vector for each pixel? I do not understand how to convert an RGB pixel to a 2-component normal vector that can be used in the formula above.


Solution

  • I would assume that the normals are approximated by computing the image gradient of the height field.

    I did not test this, but it looks like you might be able to use the CIGaborGradients filter to compute the gradient vector for each pixel, which should serve as a normal when computed on the height field (see the sketch after this list for a manual Metal approach).
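
If you would rather compute the normals yourself in Metal, a common approach is to approximate the gradient of the height field with central differences and build a normal vector from it. The following compute kernel is a minimal sketch under a few assumptions: the height lives in the red channel of `heightField`, the result is written to a `normals` texture, and `strength` is an arbitrary scale factor that controls how pronounced the relief appears:

```metal
#include <metal_stdlib>
using namespace metal;

// Approximates a per-pixel normal from a single-channel height field
// using central differences (a simple image gradient).
kernel void heightFieldNormals(texture2d<float, access::read>  heightField [[texture(0)]],
                               texture2d<float, access::write> normals     [[texture(1)]],
                               uint2 gid [[thread_position_in_grid]])
{
    uint2 size = uint2(heightField.get_width(), heightField.get_height());
    if (gid.x >= size.x || gid.y >= size.y) {
        return;
    }

    // Clamp neighbour coordinates at the texture edges.
    uint xm = gid.x > 0 ? gid.x - 1 : 0;
    uint xp = min(gid.x + 1, size.x - 1);
    uint ym = gid.y > 0 ? gid.y - 1 : 0;
    uint yp = min(gid.y + 1, size.y - 1);

    // Height is assumed to be stored in the red channel.
    float hL = heightField.read(uint2(xm, gid.y)).r;
    float hR = heightField.read(uint2(xp, gid.y)).r;
    float hU = heightField.read(uint2(gid.x, ym)).r;
    float hD = heightField.read(uint2(gid.x, yp)).r;

    // Central-difference gradient, turned into a normal pointing out of the surface.
    const float strength = 2.0;
    float3 n = normalize(float3((hL - hR) * strength, (hU - hD) * strength, 1.0));

    normals.write(float4(n, 1.0), gid);
}
```

The `n.xy` components produced here are in the range [-1, 1], so they can be fed directly into the look-up formula from the question to sample the shading image.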