I am trying to create a simple 3D-array visualisation in OpenGL. The 3D array contains color values. To achieve this, I could just draw lots of cubes, giving them texture coordinates that point to the correct texel. I tried it, and it works. But I need a lot more, and the only way to get what I want is to draw full 2D planes at every grid position along the axis perpendicular to each plane, and to do this in all 6 directions. So for example: I draw XY-planes from -15 to +15, at Z-coordinates -15 to +15, and do this also for the back XY-plane, the YZ-plane, the back YZ-plane, ... This way I can just put 3D texture coordinates at every corner and interpolation should do the rest.
I have taken the array coordinates as texture coordinates, ranging from 0 to 32. The reason for this is that in the shader I can floor the texture coordinate and then divide it by 32 to get the exact texel for every pixel on the grid.
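In the shader it looks roughly like this (a sketch with simplified names, not my exact code):

    #version 330 core

    uniform sampler3D volume;    // the 32x32x32 color array
    in  vec3 arrayCoord;         // interpolated texture coordinate in [0, 32]
    out vec4 fragColor;

    void main()
    {
        // floor to the array index, then normalise to [0, 1]
        vec3 texCoord = floor(arrayCoord) / 32.0;
        fragColor = texture(volume, texCoord);
    }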
But alas... the pixel shader keeps second-guessing its choices: one pixel maps to a different texel than the next, while the pixel after that maps correctly again.
I used to solve this kind of problem by adding 0.5 or 0.35 to the floats to make sure they are rounded correctly on all hardware platforms, but that still seems like a hack to me. Besides, it doesn't work on the 3D texture.
Can anyone tell me how to handle this?
You're running into the infamous fencepost problem. Simply put, texture coordinates 0 and 1 are not pixel centers but pixel borders. Have a look at this simple sketch:
    |   0   |   1   |   2   |   3   |
    ^                               ^
    0.0                           1.0

    0/4     1/4     2/4     3/4     4/4
So to address pixel i (say 0) of a texture that is N (say 4) pixels wide, you must use the texture coordinate
(i/N + (i+1)/N)/2 = i/(2N) + (i+1)/(2N) = (2i + 1)/(2N) = (i + 0.5)/N
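In the shader that amounts to sampling at the texel center instead of the border; for the 32-texel-per-axis case from the question it could look like this (a sketch with illustrative names):

    // arrayCoord is assumed to be the interpolated coordinate in [0, 32],
    // volume the 32x32x32 sampler3D from the question.
    vec3 texCoord = (floor(arrayCoord) + 0.5) / 32.0;   // (i + 0.5)/N per axis
    vec4 color    = texture(volume, texCoord);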
So adding that 0.5 is not a hack. Alternatively, in the shader you can just use texelFetch to address the texture in absolute texel coordinates instead of a normalized sampling coordinate; the downside is that you then have to implement any filtering yourself.
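A minimal sketch of the texelFetch route, again assuming a 32x32x32 texture and a coordinate interpolated over [0, 32]:

    #version 330 core

    uniform sampler3D volume;
    in  vec3 arrayCoord;
    out vec4 fragColor;

    void main()
    {
        // integer texel coordinates, clamped to the valid range 0..31
        ivec3 texel = ivec3(clamp(floor(arrayCoord), 0.0, 31.0));
        // texelFetch reads exactly one texel at mip level 0, unfiltered
        fragColor = texelFetch(volume, texel, 0);
    }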