Tags: c++11, opengl, glsl, shaders, ssao

Weird artifacting and odd triangle shadows with my SSAO OpenGL implementation


I have been working on implementing SSAO in the engine I am writing, and a major problem has come up. Everything was going quite well until I realized that my SSAO was not working correctly. There are two things I can find wrong with it, and I am unable to figure out how to remedy either.

My shader code is at the end of this post; before that, I will describe the problems with images.

Firstly, as seen in the screenshot below, there are some weird artifacts that show up depending on the viewing angle. So far I am assuming the way I am applying the view matrix is wrong. I have done a lot of research about how this should all work, and I understand it in theory; in practice, however, things are not changing as I would expect. (Screenshot: strange artifacting in the SSAO output.)

Secondly, whenever I get close to the blocks, I get very odd triangle shadows around the edges of the screen, as shown in the next screenshot. (Screenshot: odd triangle shadows around the edges of the screen.)

These two images show the main issues I am having. I am using a deferred-style renderer that renders the geometry to a few textures (position, normals, color), then samples those textures to produce the final output. The first two code blocks are the vertex and fragment shaders, respectively, for writing the geometry out to the textures.

Vertex Shader

#version 430 core

layout(location=0) in mat4 modelMatrix;
layout(location=4) in vec4 VertexPosition;
layout(location=5) in vec4 VertexNormal;
layout(location=6) in vec3 VertexColor;
layout(location=7) in vec2 TextureCoords;
out vec4 vNormal;
out vec3 vColor;
out vec4 shaderCoord;
out vec2 texCoords;

layout(location=8) uniform mat4 V;
layout(location=12) uniform mat4 P;

void main()
{
    //View-space position; this is what the position buffer will store.
    shaderCoord = V * modelMatrix * VertexPosition;

    //Normal matrix for the combined model-view transform.
    mat4 normalMatrix = transpose(inverse(V * modelMatrix));
    vNormal = normalMatrix * VertexNormal;

    texCoords = TextureCoords;
    vColor = VertexColor;
    gl_Position = P * shaderCoord;
}

Fragment Shader

#version 430 core

in vec4 vNormal;
in vec3 vColor;
in vec4 shaderCoord;
in vec2 texCoords;
layout (location=0) out vec4 NormalBuffer;
layout (location=1) out vec4 ColorBuffer;
layout (location=2) out vec4 PositionBuffer;
layout (location=3) out vec4 TextureCoordBuffer;
out float fragDepth;

//Start of the main function.
void main()
{   
    NormalBuffer = vec4(normalize(vNormal).xyz, 1.0);
    ColorBuffer = vec4(vColor, 1.0);
    PositionBuffer = vec4(shaderCoord.xyz, 1.0);
    TextureCoordBuffer = vec4(texCoords, 0.0, 1.0);
    fragDepth = gl_FragCoord.z;
}

As you can see, I am translating everything from world space to view space before writing it to the textures. I would much prefer to keep everything in world space, but when I do, the entire screen looks white with occasional hints of shadows, and the background swaps between white and black depending on the camera angle.
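
For context, my understanding is that a world-space G-buffer would need a conversion along these lines at the top of the SSAO shader, before any of the occlusion math or the projection with P (same V uniform, same sampler names as in the shader further down). This is only a sketch of what I think is required, not code I currently run:

//Hypothetical world-space G-buffer: bring position and normal into view space first.
vec3 worldPos    = texture(positionBuffer, texCoords).xyz;
vec3 worldNormal = normalize(texture(NormalBuffer, texCoords).xyz);

vec3 viewPos    = (V * vec4(worldPos, 1.0)).xyz;
vec3 viewNormal = normalize(mat3(V) * worldNormal); //V has no scaling, so mat3(V) is enough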

Next are my SSAO shaders. To implement these I followed a few tutorials, so they will probably look familiar. If the tutorials are correct, the next two shaders should work, but they do not.

Vertex shader that just creates a full-screen quad and applies the final texture to it.

#version 430 core

layout (location=0) in vec3 VertexPosition;
layout (location=1) in vec2 TextureCoords;

out vec2 texCoords;

void main (){
    texCoords = TextureCoords;
    gl_Position = vec4(VertexPosition, 1.0);
}

Fragment shader for SSAO

#version 430 core

in vec2 texCoords;

layout (location=0) out vec4 fColor;

uniform sampler2D NormalBuffer;
uniform sampler2D positionBuffer;

uniform sampler2DArrayShadow shadowMap;
uniform sampler1D SSAOKernelMap;
uniform sampler2D SSAONoiseMap; 

layout(location=12) uniform mat4 P;
layout(location=8) uniform mat4 V;

uniform uint kernelSize;
uniform vec2 windowSize;

//Define Variables for SSAO Processing.
float radius = 0.5;
float SSAOBias = 0.025;
float power = 1.5;
//mat4 biasMatrix = mat4(0.5,0.0,0.0,0.0,0.0,0.5,0.0,0.0,0.0,0.0,0.5,0.0,0.5,0.5,0.5,1.0);

void main()
{   
    //Retrieve from textures
    vec3 shaderCoord = (texture(positionBuffer, texCoords)).xyz;
    vec3 vNormal = normalize((texture(NormalBuffer, texCoords)).rgb);
    //process SSAO
    vec2 NoiseScale = vec2(windowSize.x/4.0, windowSize.y/4.0);
    vec3 randVec = normalize(texture(SSAONoiseMap, texCoords*NoiseScale).xyz);

    vec3 tangent = normalize(randVec - vNormal * dot(randVec, vNormal));
    vec3 bitTangent = cross(vNormal, tangent);
    mat3 TBN = mat3(tangent, bitTangent, vNormal);

    //Begin processing of SSAO with the input kernel samples
    float Occlusion = 0.0;
    for(int i=0; i<kernelSize; i++){
        vec4 kernelSample = texture(SSAOKernelMap, i);
        vec3 TSample = TBN*kernelSample.rgb;
        TSample = shaderCoord + TSample * radius;

        vec4 newCoord = vec4(TSample, 1.0);
        newCoord = P*newCoord;
        newCoord.xyz /= newCoord.w;
        newCoord.xyz = newCoord.xyz * 0.5 + 0.5;


        float sampleDepth = texture(positionBuffer,newCoord.xy).z;
        //float rangeCheck = smoothstep(0.0,1.0, radius / abs(shaderCoord.z-sampleDepth));
        Occlusion += (sampleDepth >= TSample.z+SSAOBias?1.0:0.0);
    }
    Occlusion =  1.0 - (Occlusion/kernelSize);
    fColor = vec4(vec3(Occlusion),1.0f);
}

That is all the information I can think to provide initially. Any help you can provide would be immensely appreciated! If any other information would help, please let me know and I will be happy to provide it.

EDIT: I figured out that one of my issues was the way that I was accessing the 1D texture above. This made all the kernel samples very strange. I fixed that and now I am getting something like the image below, where half the screen is darker and half the screen is lighter on one side and darker on the other. The contrast line moves with the camera. Wierd line
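
For anyone who hits the same thing: texture() on a sampler1D expects a normalized coordinate in [0, 1], so passing the raw loop index i does not step through the kernel texels as intended. The fix is something along these lines, inside the sampling loop:

//Fetch texel i of the kernel directly, with no filtering:
vec4 kernelSample = texelFetch(SSAOKernelMap, i, 0);
//Or, equivalently, sample the centre of texel i with a normalized coordinate:
//vec4 kernelSample = texture(SSAOKernelMap, (float(i) + 0.5) / float(kernelSize));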

Any help with this issue would be immensely appreciated!


Solution

  • I have found two things that were wrong; fixing them mostly resolved the issue this post is about.

    Firstly, the format in which I was passing the kernel map was off, so all the sample values were quite skewed.

    Secondly, I was unable to figure out why, but when I passed the position and normal values to the Lighting fragment shader in world space and applied the view and projection matrices to them there, the results turned out very strangely. However, if I applied the view and projection matrices to the position and normal values in the BaseGeometry shader and then reverted that transform in the Lighting shader (a rough sketch follows below), everything works perfectly.

    If I find out any more information, I will happily post it here for any future searchers.
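
    As a rough sketch of what that second point looks like in the Lighting fragment shader (the sampler and variable names here are only illustrative, borrowed from the SSAO shader above; V is the same view matrix and contains no scaling):

    //G-buffer values were written in view space by the BaseGeometry pass...
    vec3 viewPos    = texture(positionBuffer, texCoords).xyz;
    vec3 viewNormal = normalize(texture(NormalBuffer, texCoords).xyz);

    //...so undo the view transform to get back to world space for the lighting math.
    mat4 invV = inverse(V);
    vec3 worldPos    = (invV * vec4(viewPos, 1.0)).xyz;
    vec3 worldNormal = normalize(mat3(invV) * viewNormal); //valid while V is a pure rotation + translation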