raytracing, vertex-buffer, directx-12, index-buffer

Issue with reading vertex / index buffers through structured buffers in DXR


[Screenshot: rendered Sponza scene]

The scene is the Sponza scene shown in the image above. I think the texture index for each geometry is correct, but the texture coordinates seem wrong.

I currently upload descriptors into the descriptor heap in this layout:

(instance 0 textures (0 to 74)) (vertex buffer for each geometry (75 to 467)) (index buffer for each geometry (468 to 861)) (geometryInfo (862)) (instance 1 textures)...

But currently I'm uploading only one instance (and all of the scene's geometries are in the same single BLAS).
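
To make that layout concrete, the base heap indices for one instance could be computed like this (a minimal sketch with illustrative names, not code from the repository):

// Sketch of the descriptor heap layout described above.
struct InstanceHeapOffsets
{
    UINT TexturesBase;     // first texture SRV (0 for instance 0)
    UINT VertexBufferBase; // first vertex buffer SRV (75 here)
    UINT IndexBufferBase;  // first index buffer SRV (468 here)
    UINT GeometryInfo;     // GeometryInfo SRV (862 here)
};

InstanceHeapOffsets ComputeOffsets(UINT instanceStart, UINT textureCount, UINT geometryCount)
{
    InstanceHeapOffsets o;
    o.TexturesBase     = instanceStart;
    o.VertexBufferBase = o.TexturesBase + textureCount;      // one vertex buffer SRV per geometry
    o.IndexBufferBase  = o.VertexBufferBase + geometryCount; // one index buffer SRV per geometry
    o.GeometryInfo     = o.IndexBufferBase + geometryCount;  // one GeometryInfo buffer per instance
    return o;
}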

The start indices of the vertex / index buffer descriptor ranges are passed in a per-instance constant buffer like this:

cbuffer InstanceCB : register(b1)
{
    uint GeometryInfoIndex : packoffset(c0.x); // heap index of the GeometryInfo buffer
    uint VertexAttribIndex : packoffset(c0.y); // heap index of the first vertex buffer SRV
    uint IndexBufferIndex : packoffset(c0.z);  // heap index of the first index buffer SRV
}
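
On the CPU side this could be mirrored and bound like so (a sketch; the root-constants binding is one possibility, an upload-heap constant buffer works just as well):

// C++ mirror of InstanceCB above; the three uints land in c0.x/y/z.
struct InstanceCB
{
    UINT GeometryInfoIndex; // 862 in the layout above
    UINT VertexAttribIndex; // 75
    UINT IndexBufferIndex;  // 468
};

// rootParamIndex is illustrative: whichever root parameter maps to b1.
InstanceCB cb = { 862, 75, 468 };
cmdList->SetComputeRoot32BitConstants(rootParamIndex, 3, &cb, 0);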

And geometryInfo holds the texture material indices, like this:

struct GeometryInfo
{
    uint AlbedoTextureIndex;
    uint MetalicTextureIndex;
    uint RoughnessTextureIndex;
    uint NormalMapTextureIndex;
    uint OpacityMapTextureIndex;
};
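
On the CPU side the mirror struct must have exactly the same 20-byte layout, since that size becomes the SRV's StructureByteStride (a sketch, assuming the same field names as the HLSL struct):

// Matching C++ struct: five tightly packed uints, 20-byte stride.
struct GeometryInfo
{
    UINT AlbedoTextureIndex;
    UINT MetalicTextureIndex;
    UINT RoughnessTextureIndex;
    UINT NormalMapTextureIndex;
    UINT OpacityMapTextureIndex;
};
static_assert(sizeof(GeometryInfo) == 20, "stride must match the HLSL struct");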

And I'm reading the vertex / index buffers through the dynamic resources feature added in HLSL Shader Model 6.6, like this:

Vertex GetHitSurface(in BuiltInTriangleIntersectionAttributes attr, in uint geometryIdx)
{
    float3 barycentrics = float3(1 - attr.barycentrics.x - attr.barycentrics.y, attr.barycentrics.x, attr.barycentrics.y);

    StructuredBuffer<Vertex> VertexBuffer = ResourceDescriptorHeap[VertexAttribIndex + geometryIdx];
    StructuredBuffer<uint> IndexBuffer = ResourceDescriptorHeap[IndexBufferIndex + geometryIdx];
    
    uint primIndex = PrimitiveIndex();
    
    uint i0 = IndexBuffer[primIndex * 3 + 0];
    uint i1 = IndexBuffer[primIndex * 3 + 1];
    uint i2 = IndexBuffer[primIndex * 3 + 2];
    
    Vertex v0 = VertexBuffer[i0];
    Vertex v1 = VertexBuffer[i1];
    Vertex v2 = VertexBuffer[i2];

    return VertexBarycentricLerp(v0, v1, v2, barycentrics);
}
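
For completeness, ResourceDescriptorHeap indexing needs two pieces of host-side setup, which are worth double-checking whenever indices look right but reads come back wrong (a sketch with illustrative names; presumably already in place here, since the texture lookups work):

// The root signature must opt in to direct heap indexing...
rootSigDesc.Flags |= D3D12_ROOT_SIGNATURE_FLAG_CBV_SRV_UAV_HEAP_DIRECTLY_INDEXED;

// ...and the shader-visible heap must be bound before DispatchRays.
ID3D12DescriptorHeap* heaps[] = { mSrvUavHeap }; // mSrvUavHeap is illustrative
cmdList->SetDescriptorHeaps(_countof(heaps), heaps);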

In NSight, the vertex buffer appears correctly initialized, and I've checked that VertexAttribIndex and IndexBufferIndex are correct, but the result is still wrong.

[Screenshot: vertex buffer contents in NSight]

And the vertex buffer looks fine to me, because the position, normal, texcoord, tangent, and bitangent values are uploaded correctly, as the NSight screenshot above shows.

I've checked everything I can, and every parameter seems fine, but I still can't find why the texture coordinates I get are wrong.

Shader debugging might be helpful, but there's no shader debugging support for ray tracing in any (or almost any) graphics debugger...

[Screenshots: descriptor heap contents in NSight]

And this is why I don't suspect the parameters...

In the screenshots above, you can see that heap indices 75 and 468 are the boundaries of the vertex / index buffer ranges (you can tell from their formats).

Here is my GitHub repository: https://github.com/kcjsend2/Chulsu


Solution

  • Found it.

    The SRV descriptor for the structured buffer was wrong.

    Before the fix, it looked like this:

    D3D12_SHADER_RESOURCE_VIEW_DESC srvDesc{};
    srvDesc.Shader4ComponentMapping = D3D12_DEFAULT_SHADER_4_COMPONENT_MAPPING;
    srvDesc.Format = DXGI_FORMAT_R32_FLOAT;    // wrong: a typed format makes this a typed buffer view
    srvDesc.ViewDimension = D3D12_SRV_DIMENSION_BUFFER;
    
    srvDesc.Buffer.FirstElement = 0;
    srvDesc.Buffer.NumElements = mVerticesCount * 14; // wrong: counts the 14 floats per vertex, not Vertex elements
    srvDesc.Buffer.StructureByteStride = 0;           // wrong: stride 0 means "not a structured buffer"
    srvDesc.Buffer.Flags = D3D12_BUFFER_SRV_FLAG_NONE;
    

    Format, NumElements, and StructureByteStride must instead be set like this:

    D3D12_SHADER_RESOURCE_VIEW_DESC srvDesc{};
    srvDesc.Shader4ComponentMapping = D3D12_DEFAULT_SHADER_4_COMPONENT_MAPPING;
    srvDesc.Format = DXGI_FORMAT_UNKNOWN;      // structured buffers must use UNKNOWN
    srvDesc.ViewDimension = D3D12_SRV_DIMENSION_BUFFER;
    
    srvDesc.Buffer.FirstElement = 0;
    srvDesc.Buffer.NumElements = mVerticesCount;         // element count, not float count
    srvDesc.Buffer.StructureByteStride = sizeof(Vertex); // matches StructuredBuffer<Vertex>
    srvDesc.Buffer.Flags = D3D12_BUFFER_SRV_FLAG_NONE;
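
    The index buffer SRV needs the same treatment to match StructuredBuffer<uint> in the shader. A sketch by analogy (mIndicesCount is assumed, mirroring mVerticesCount):

    D3D12_SHADER_RESOURCE_VIEW_DESC ibDesc{};
    ibDesc.Shader4ComponentMapping = D3D12_DEFAULT_SHADER_4_COMPONENT_MAPPING;
    ibDesc.Format = DXGI_FORMAT_UNKNOWN;                  // structured => no typed format
    ibDesc.ViewDimension = D3D12_SRV_DIMENSION_BUFFER;
    
    ibDesc.Buffer.FirstElement = 0;
    ibDesc.Buffer.NumElements = mIndicesCount;            // number of uint indices
    ibDesc.Buffer.StructureByteStride = sizeof(uint32_t); // 4 bytes per index
    ibDesc.Buffer.Flags = D3D12_BUFFER_SRV_FLAG_NONE;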