I am having a hard time matching up the OpenGL specification (version 3.1, page 27) with the common example usage found all over the internet.
The OpenGL spec version 3.1 states for DrawElements:
The command
void DrawElements(enum mode, sizei count, enum type, void *indices);
constructs a sequence of geometric primitives by successively transferring the count elements whose indices are stored in the currently bound element array buffer (see section 2.9.5) at the offset defined by indices to the GL. The i-th element transferred by DrawElements will be taken from element indices[i] of each enabled array.
I tend to interpret this as follows:
The indices parameter holds at least count values of type type. Its elements serve as offsets into the actual element buffer. Since an element buffer must be bound for every use of DrawElements, we effectively have two obligatory sets of indices here: one in the element buffer and another in the indices array.
This would seem somewhat wasteful in most situations, unless one has to draw a model that is defined with an element array buffer but whose elements need to be sorted back to front due to transparency or the like. But then how would one render with the plain element array buffer (no sorting)?
Now, strangely enough, most examples and tutorials on the internet (here, here half a page down at 'Indexed drawing') pass a single integer as the indices parameter, most often 0, sometimes (void*)0. It is always only a single integer offset - clearly not an array for the indices parameter!
I have used the last variant (passing a single integer cast to a pointer for indices) successfully on some NVIDIA graphics cards, but I get crashes on Intel onboard chips. So I am wondering who is wrong: me, the spec, or thousands of examples. What are the correct parameters and usage of DrawElements? If the single integer is allowed, how does that square with the spec?
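To make it concrete, this is the call pattern from the tutorials that works for me on NVIDIA (a minimal sketch; indexBufferObject and indexCount are my own placeholder names):

/* An element array buffer is bound, and the last argument is
   just the integer 0 cast to a pointer - no client-side array: */
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, indexBufferObject);
glDrawElements(GL_TRIANGLES, indexCount, GL_UNSIGNED_INT, (void*)0);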
You're tripping over legacy baggage glDrawElements has carried ever since OpenGL-1.1. Back then there were no VBOs, just client side arrays, and the program would actually pass a pointer (= an array in C terms) to the indices into the arrays set with the gl…Pointer functions.
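For illustration, a minimal sketch of that old client side style (the vertex data and counts are made up):

/* OpenGL-1.1 client side arrays: indices really is a pointer
   to an array living in the program's own memory. */
GLfloat vertices[] = {
    -1.0f, -1.0f, 0.0f,
     1.0f, -1.0f, 0.0f,
     1.0f,  1.0f, 0.0f,
    -1.0f,  1.0f, 0.0f
};
GLushort indices[] = { 0, 1, 2,  2, 3, 0 };

glEnableClientState(GL_VERTEX_ARRAY);
glVertexPointer(3, GL_FLOAT, 0, vertices);
/* the GL reads 6 indices directly from the client side array */
glDrawElements(GL_TRIANGLES, 6, GL_UNSIGNED_SHORT, indices);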
Now with index buffers the parameter is actually just a byte offset into the server side buffer, passed through the pointer-typed argument for historical reasons. You might be very interested in this SO question: What is the result of NULL + int?
I also gave an exhaustive answer there, which I strongly recommend reading: https://stackoverflow.com/a/8284829/524368
What I wrote about function signatures and typecasts also applies to glDraw… calls.
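To spell it out, here's a minimal sketch of the modern usage (ibo is a placeholder name; BUFFER_OFFSET is the common idiom for the typecast, not an official GL API, and it relies on exactly the NULL + int trick discussed in the linked question):

/* Common helper for turning a byte count into the pointer-typed
   argument the glDraw… signatures demand. */
#define BUFFER_OFFSET(bytes) ((char*)NULL + (bytes))

glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ibo);
/* draw 6 indices starting at the beginning of the bound buffer */
glDrawElements(GL_TRIANGLES, 6, GL_UNSIGNED_SHORT, BUFFER_OFFSET(0));
/* draw the next 6 indices; the offset is in bytes, hence sizeof */
glDrawElements(GL_TRIANGLES, 6, GL_UNSIGNED_SHORT,
               BUFFER_OFFSET(6 * sizeof(GLushort)));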