Index Buffer question

Started by
1 comment, last by MrDoomMaster 16 years ago
Hi,

Currently I'm creating an index buffer in OpenGL via GL_ELEMENT_ARRAY_BUFFER_ARB in a call to glBufferDataARB(). When I'm ready to render my geometry, I call the following:

glDrawElements( GL_TRIANGLES, num_indices, GL_INT, 0 );

I think the above *should* work, but it doesn't. If I change GL_INT to GL_UNSIGNED_INT, it renders. My index array is an array of 'int', not 'unsigned int', so I don't know why OpenGL treats these differently (fundamentally they are the same size anyway). Could someone explain? Thank you.
Check some documentation. It should be pretty clear why GL_UNSIGNED_INT works, but GL_INT doesn't.
Oh, okay. I misunderstood the documentation. It states I cannot use GL_INT. Sorry for the confusion.

This topic is closed to new replies.
