My problem is that the first and final strips of triangles are not being displayed properly. In this example I am loading a sphere from a simple Wavefront OBJ model file. I load it from the OBJ file exactly the same way for both the display list and the VBO, and I have verified that they are getting the exact same numbers from the file.
In this example I have used an STL vector to store the vertices and vertex indices. However, I have tried using float* and int* before as well, and it gave the exact same result.
So here is a screenshot of the sphere being drawn from a VBO:
edit: I removed this hyperlink as I can't post more than 2 as my account is still newb...
As you can see, there is a huge cone being drawn from the far side in to the middle of the sphere. It is hard to see, but the strip of triangles from the closest side to the furthest side is also wrong; this will become apparent in the next few screenshots. Here is the exact same sphere being drawn from the display list:
You can see there that the display list draws it correctly. Here are both being drawn at the same time; you will see the biggest difference where the lines are green, which is how they should be:
Here it is more apparent that the top strip is being drawn wrong by the VBO.
I then tried adding a large number to the y value of each vertex, which shows that there is definitely something wrong with my VBO, because the cone's apex inside the sphere never moves!
OK, so now that you have a visual idea of the problem, I thought I'd throw some code in here. I've probably just missed a step or have a number wrong. Please take a look.
This is how I create the VBO:
void VBOWFModel::Create_VBO()
{
    // Generate and bind the vertex array VBO handle, then upload the vertex data
    glGenBuffers(1, &m_VAVBO_handle);
    glBindBuffer(GL_ARRAY_BUFFER, m_VAVBO_handle);
    glBufferData(GL_ARRAY_BUFFER,
                 m_vertices.size() * sizeof(Vec3f),
                 &m_vertices[0].x, GL_STATIC_DRAW);
}
Where m_vertices is an STL vector of Vec3f, which simply holds x, y, z. I've tried this with a plain float array as well and it gave the same problem, so switching between std::vector and float* isn't the issue.
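For what it's worth, uploading an STL vector of Vec3f this way is only safe if the struct is tightly packed and the vector's storage is contiguous (the latter is guaranteed by the standard). A quick compile-time sanity check, assuming Vec3f is the plain three-float struct described above:

```cpp
#include <cstddef>

// Assumed definition of Vec3f (matching the description above): a plain
// struct of three floats. glBufferData reads
// m_vertices.size() * sizeof(Vec3f) bytes starting at &m_vertices[0].x,
// which is only safe if there is no padding and no extra members.
struct Vec3f { float x, y, z; };

// Compile-time checks that the components are laid out contiguously.
static_assert(sizeof(Vec3f) == 3 * sizeof(float), "Vec3f must be tightly packed");
static_assert(offsetof(Vec3f, z) == 2 * sizeof(float), "unexpected padding in Vec3f");
```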
OK, now onto where I draw it. I have a feeling this is where I'm doing something wrong:
void VBOWFModel::Render()
{
    glEnableClientState(GL_VERTEX_ARRAY);
    glBindBuffer(GL_ARRAY_BUFFER, m_VAVBO_handle);
    // Offset 0 into the bound buffer; one tightly packed Vec3f per vertex
    glVertexPointer(3, GL_FLOAT, sizeof(Vec3f), (float*) NULL);
    glPolygonMode(GL_FRONT_AND_BACK, GL_LINE);
    // m_Vindices are the vertex indices (an STL vector of integers);
    // not the greatest naming convention, but hey...
    glDrawElements(GL_TRIANGLES, m_Vindices.size(), GL_UNSIGNED_INT, &m_Vindices[0]);
    glPolygonMode(GL_FRONT_AND_BACK, GL_FILL);
    glDisableClientState(GL_VERTEX_ARRAY);
}
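One debugging step that narrows this class of bug down quickly is validating the index buffer against the vertex count before calling glDrawElements: any out-of-range index makes GL read past the end of the vertex array, producing exactly this kind of stray-triangle garbage. Here is a hypothetical helper (not part of the code above), sketched on the assumption that the indices are unsigned ints:

```cpp
#include <cstddef>
#include <vector>

// Hypothetical sanity check: every index handed to glDrawElements must
// refer to a valid slot in the vertex array. Returns false if any index
// is out of range.
bool indicesInRange(const std::vector<unsigned int>& indices, std::size_t vertexCount)
{
    for (std::size_t i = 0; i < indices.size(); ++i)
        if (indices[i] >= vertexCount)
            return false;
    return true;
}
```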
I am drawing with glPolygonMode(GL_FRONT_AND_BACK, GL_LINE) simply for debugging purposes, because it shows the problem with the incorrect triangles best.
Can anyone see if I've done something wrong? Have I initialised something incorrectly? Am I missing a step?
I'm not sure if there is any more info I need to give; I'm just including GLEW and freeglut (the latest of both, on Windows).
OK, I figured out my problem. It's quite silly and embarrassing: Wavefront OBJ files store indices starting at 1, not 0.
While I did load the vertices exactly the same way for both the display list and the VBO, the difference was that in my display list I was drawing with m_Vertices[m_Vindices[i] - 1].
Obviously OpenGL doesn't know that it has to subtract 1 from each stored index, and rightly so! It was my own problem. So now I subtract 1 from each index as it is stored in the indices array. Problem solved!
Sorry to waste your time but thank you for taking a look!