Although I am relatively new to OpenGL, I have no trouble with matrix algebra, homogeneous coordinates, rotations, and projections. I have set up my MVP matrix to get a good view of a cube, and it renders nicely.
Next, instead of rendering the cube, I am trying to read the color from a background image. As expected, the texture is rendered clipped to where the cube is, but for some reason there is quite some distortion. (As suggested in the comments, I am now dividing by w; I thought OpenGL did this.)

As far as I understand, the xy coordinates in NDC should map linearly onto the window. Where does the distortion come from?
#version 330

uniform mat4 Mvp;

in vec3 in_shape;
out vec2 v_ndc;

void main() {
    gl_Position = Mvp * vec4(in_shape, 1.0);
    v_ndc = vec2(
        gl_Position.x / gl_Position.w,
        -gl_Position.y / gl_Position.w
    ) / 2.0 + 0.5;
}
#version 330

uniform sampler2D Background;

in vec2 v_ndc;
out vec4 f_color;

void main() {
    f_color = vec4(texture(Background, v_ndc).rgb, 1.0);
}
As far as I understand, the xy coordinates in NDC should map linearly onto the window. Where does the distortion come from?
gl_Position is a homogeneous coordinate; you have to do a perspective divide:

    v_ndc = gl_Position.xy / gl_Position.w;

Note, gl_Position.xy is the clip-space coordinate. The coordinates in clip space are transformed to normalized device coordinates (NDC), in the range (-1, -1, -1) to (1, 1, 1), by dividing by the w component of the clip coordinates.
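In the vertex shader, the divide and the subsequent mapping from the NDC range [-1, 1] to the texture range [0, 1] can be written, for example, like this (a sketch ignoring the y flip in the question's code):

```glsl
vec2 ndc = gl_Position.xy / gl_Position.w;  // perspective divide: clip space -> NDC
v_ndc    = ndc * 0.5 + 0.5;                 // map [-1, 1] -> [0, 1] for texture lookup
```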
There is still distortion. This is because the interpolation of v_ndc is perspective-correct: it is based on the barycentric coordinates of the triangle primitive defined by gl_Position, weighted by the perspective.
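For reference, perspective-correct interpolation of an attribute a across a triangle with barycentric coordinates (b0, b1, b2) and clip-space w components (w0, w1, w2) follows this rule (a sketch of the formula in the OpenGL specification):

    a = (b0*a0/w0 + b1*a1/w1 + b2*a2/w2) / (b0/w0 + b1/w1 + b2/w2)

With noperspective, the attribute is instead interpolated as b0*a0 + b1*a1 + b2*a2, i.e. linearly in window space.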
Use the noperspective qualifier to interpolate linearly in window space:
Vertex shader:
noperspective out vec2 v_ndc;
Fragment shader:
noperspective in vec2 v_ndc;
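Putting both fixes together, a sketch of the corrected shader pair (same uniform and attribute names as in the question; the y flip from the original code is kept):

```glsl
#version 330

uniform mat4 Mvp;

in vec3 in_shape;
noperspective out vec2 v_ndc;  // interpolate linearly in window space

void main() {
    gl_Position = Mvp * vec4(in_shape, 1.0);
    // perspective divide (clip space -> NDC), then map [-1, 1] -> [0, 1]
    v_ndc = vec2(gl_Position.x, -gl_Position.y) / gl_Position.w * 0.5 + 0.5;
}
```

```glsl
#version 330

uniform sampler2D Background;

noperspective in vec2 v_ndc;
out vec4 f_color;

void main() {
    f_color = vec4(texture(Background, v_ndc).rgb, 1.0);
}
```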