VBOs and glDrawElements...

Hi, I have been working on a project for school that is supposed to load a .obj file exported from Blender3D (a modeling program) and display the model using OpenGL VBOs. I have verified that my custom .obj file parser works correctly and that the vertex and index information is stored correctly in their respective arrays (using the std::vector template). The problem is that the index or vertex information seems to be off somehow, because what gets displayed is a sort of twisted box. An image of this may be found at: VBO Error 1. The .obj file that was exported from Blender3D may be found here. The unit cube (box) should look like what is shown in the middle of the 'VBO Error 1' screenshot. When I use glDrawArrays for the first 4 elements of the array (since they are in order), I get what is shown in this picture. For proof that my file loader works, check out this screenshot of the Visual Studio debugger in action.

My VBO IDs and arrays are declared as follows:

GLuint			m_vertexBufferID;
GLuint			m_vertexIndexBufferID;

vector<GLfloat>		m_vertexBuffer;
vector<GLuint>		m_vertexIndexBuffer;

The code for the initialization of the index and vertex VBOs, as well as the rendering code, is shown below.

Initialization:

glGenBuffers(1, &m_vertexBufferID); //Generate a buffer for the vertices
glBindBuffer(GL_ARRAY_BUFFER, m_vertexBufferID); //Bind the vertex buffer
glBufferData(GL_ARRAY_BUFFER, sizeof(GLfloat) * m_vertexBuffer.size(),
                 &m_vertexBuffer[0], GL_STATIC_DRAW); //Send the data to OpenGL
	
glGenBuffers(1, &m_vertexIndexBufferID);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, m_vertexIndexBufferID);
glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(GLuint) * //sizeof(GLuint), not sizeof(GL_UNSIGNED_INT)
                 m_vertexIndexBuffer.size(), &m_vertexIndexBuffer[0], GL_STATIC_DRAW);


Rendering:

glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glLoadIdentity();
gluLookAt(-10.0, -6.0, 0.1, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0);

glEnableClientState(GL_VERTEX_ARRAY);
glBindBuffer(GL_ARRAY_BUFFER, m_vertexBufferID);
glVertexPointer(3, GL_FLOAT, 0, (char *) NULL);

glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, m_vertexIndexBufferID); // for indices
glIndexPointer (GL_UNSIGNED_INT, 0, (char *) NULL );

glDrawElements(GL_QUADS, 24, GL_UNSIGNED_INT, static_cast< void * >(0));

glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0);
glBindBuffer(GL_ARRAY_BUFFER, 0);

glDisableClientState(GL_VERTEX_ARRAY);


If anyone has any clue as to what might be happening, please post a reply. Thanks!
Your index buffer has no 0 in it. OBJ indices are stored 1-based, so you need to subtract 1 from each of them before using them. The cube should use indices 0 - 7, not 1 - 8 like you have.
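If it helps, the fix is just a subtraction wherever the face lines are parsed. A minimal sketch, assuming the cube is exported as plain quads like "f 1 2 3 4" with no texture/normal indices (the helper name here is made up, not from your loader):

#include <sstream>
#include <string>
#include <vector>

// Hypothetical helper: reads one "f v1 v2 v3 v4" line and appends the
// indices to the index buffer, converting OBJ's 1-based indices to the
// 0-based indices OpenGL expects. GLuint comes from the GL headers the
// loader already includes.
void parseFaceLine(const std::string &line, std::vector<GLuint> &indexBuffer)
{
    std::istringstream iss(line);
    std::string tag;
    iss >> tag;                              // skip the leading "f"

    GLuint objIndex;
    while (iss >> objIndex)
        indexBuffer.push_back(objIndex - 1); // OBJ is 1-based, OpenGL is 0-based
}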
I can't believe I missed that...thank you! I've been literally staring at this for hours XD. It works perfectly now...
You are also calling glIndexPointer. That function has nothing to do with vertex indices; it is for the old color-index rendering mode, so the call should simply be removed.
http://www.opengl.org/wiki/Common_Mistakes#glEnableClientState.28GL_INDEX_ARRAY.29
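In other words, glDrawElements already takes its indices from the buffer bound to GL_ELEMENT_ARRAY_BUFFER, so the render path only needs something like this (same buffer IDs as in your post, glIndexPointer dropped):

glEnableClientState(GL_VERTEX_ARRAY);
glBindBuffer(GL_ARRAY_BUFFER, m_vertexBufferID);
glVertexPointer(3, GL_FLOAT, 0, (char *) NULL);

glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, m_vertexIndexBufferID);
// The indices come straight from the bound element array buffer;
// no glIndexPointer call is needed.
glDrawElements(GL_QUADS, 24, GL_UNSIGNED_INT, (void *) 0);

glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0);
glBindBuffer(GL_ARRAY_BUFFER, 0);
glDisableClientState(GL_VERTEX_ARRAY);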
