Tags: c++, rendering, directx, directx-11, index-buffer

DirectX 11: each triangle is rendered using the first vertex in the buffer


I'm having a problem with my DirectX 11 voxel renderer: whenever a triangle is rendered, it uses the first vertex in the buffer rather than one of the correct vertices.

I have a working version of the same mesh generation algorithm in Unity, and after comparing the indices and vertices from both versions they are exactly the same, so I don't believe the problem is with the algorithm.

The reason I know the first vertex in the buffer is being used is that the first vertex created for each chunk is at the chunk's origin. The world position of each chunk GameObject is (0, 0, 0), and the chunk's position is simply added as an offset to each vertex when constructing that chunk's mesh.

As you can see in the image below, the triangles are stretching to the origin of the chunk (the first vertex). The mountainous shape of the voxel terrain is still visible, so some vertices are being rendered in the correct position. It looks as though every third index is being set to 0, although I can't confirm that is actually what is happening; it just looks like what I imagine that would look like.

Image of the triangles stretching

Here is the code I am using to create the vertex and index buffers for each chunk. m_vertices and m_indices are both vectors, and they are identical to the working Unity version when I compare them.

    VoxelMesh mesh;
    DirectX::VertexPositionNormalTexture* verticesArray = new DirectX::VertexPositionNormalTexture[m_vertices.size()];
    for (unsigned int i = 0; i < m_vertices.size(); i++) {
        verticesArray[i].position = m_vertices[i];
        verticesArray[i].normal = m_normals[i];
        verticesArray[i].textureCoordinate = m_uvs[i];
    }

    //Create vertex buffer
    ID3D11Buffer* vertexBuffer;
    D3D11_BUFFER_DESC bd;
    ZeroMemory(&bd, sizeof(bd));
    bd.Usage = D3D11_USAGE_DEFAULT;
    bd.ByteWidth = sizeof(DirectX::VertexPositionNormalTexture) * m_vertices.size();
    bd.BindFlags = D3D11_BIND_VERTEX_BUFFER;
    bd.CPUAccessFlags = 0;

    D3D11_SUBRESOURCE_DATA InitData;
    ZeroMemory(&InitData, sizeof(InitData));
    InitData.pSysMem = verticesArray;

    device->CreateBuffer(&bd, &InitData, &vertexBuffer);

    mesh.m_VertexBuffer = vertexBuffer;
    mesh.m_VBOffset = 0;
    mesh.m_VBStride = sizeof(DirectX::VertexPositionNormalTexture);


    //Create index buffer
    unsigned int* indicesArray = new unsigned int[m_indices.size()];
    for (unsigned int i = 0; i < m_indices.size(); i++) {
        indicesArray[i] = m_indices[i];
    }

    ID3D11Buffer* indexBuffer;

    D3D11_BUFFER_DESC bd1;
    ZeroMemory(&bd1, sizeof(bd1));
    bd1.Usage = D3D11_USAGE_DEFAULT;
    bd1.ByteWidth = sizeof(WORD) * m_indices.size();
    bd1.BindFlags = D3D11_BIND_INDEX_BUFFER;
    bd1.CPUAccessFlags = 0;

    ZeroMemory(&InitData, sizeof(InitData));
    InitData.pSysMem = indicesArray;
    device->CreateBuffer(&bd1, &InitData, &indexBuffer);

    mesh.m_IndexCount = m_indices.size();
    mesh.m_IndexBuffer = indexBuffer;

    delete[] indicesArray;
    delete[] verticesArray;

VoxelMesh struct:

    struct VoxelMesh {
        ID3D11Buffer* m_VertexBuffer;
        ID3D11Buffer* m_IndexBuffer;
        UINT m_VBStride;
        UINT m_VBOffset;
        UINT m_IndexCount;
    };

Solution

  • Your index buffer is treated as 16-bit: the ByteWidth is computed with sizeof(WORD), so only half of your index data even fits in the buffer, and it is presumably bound with DXGI_FORMAT_R16_UINT, while the data you're providing is 32-bit (unsigned int) indices. Either change indicesArray to unsigned short*, or size the buffer with sizeof(unsigned int) and bind it with DXGI_FORMAT_R32_UINT. The mismatch also explains the stretching: on a little-endian machine each small 32-bit index is read as its low 16 bits followed by a zero high half, so every other index the GPU sees is 0, i.e. the first vertex in the buffer.
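For the 32-bit route, here is a minimal sketch of the two places that need to agree: the ByteWidth used at creation and the DXGI format passed to IASetIndexBuffer (the binding code isn't shown in the question, so the deviceContext pointer below is assumed to be your ID3D11DeviceContext):

    //Size the buffer for 32-bit indices so all of indicesArray fits
    D3D11_BUFFER_DESC bd1;
    ZeroMemory(&bd1, sizeof(bd1));
    bd1.Usage = D3D11_USAGE_DEFAULT;
    bd1.ByteWidth = sizeof(unsigned int) * m_indices.size();
    bd1.BindFlags = D3D11_BIND_INDEX_BUFFER;
    bd1.CPUAccessFlags = 0;

    D3D11_SUBRESOURCE_DATA initData;
    ZeroMemory(&initData, sizeof(initData));
    initData.pSysMem = indicesArray;
    device->CreateBuffer(&bd1, &initData, &indexBuffer);

    //When binding for drawing, the format must match the element size:
    //DXGI_FORMAT_R32_UINT for unsigned int indices
    //(DXGI_FORMAT_R16_UINT would only be correct for unsigned short data)
    deviceContext->IASetIndexBuffer(mesh.m_IndexBuffer, DXGI_FORMAT_R32_UINT, 0);

If your chunk meshes never use more than 65535 vertices, the 16-bit route saves memory instead: build indicesArray as unsigned short, size the buffer with sizeof(unsigned short), and bind it with DXGI_FORMAT_R16_UINT.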