Problem with vertex buffer extension

I seem to have a problem with loading a binary model file into an OpenGL vertex buffer object (the vertex buffer extension). The file loads fine into a dynamic array, and using a "for" loop with simple glVertex3f calls the model displays perfectly. But when I try to transfer it over to the buffer, the model displays with random geometry (such as a scrambled sphere, random triangles, or just a splinter). Also, if I go above roughly 10k triangles, the render goes haywire. See for yourself in these pictures: http://www.angelfire.com/space2/artib/VBO.html

I can't figure out what's wrong. Any help would be highly appreciated. I'm using interleaved arrays with the first 9 floats being 3 normals and the next 9 being 3 vertices. Here is my loading code:

#define BUFFER_OFFSET(i) ((char *)NULL + (i))
//Variables:
GLfloat *GlobalArray; //Pointer for the dynamic array
int GlobalArraySize=0; //How many elements in the dynamic array 
GLuint buffer = 0; //Handle for the VBO buffer
This is in the init code:

OpenBinaryFile("c:/mace.jjj");//This inits and loads the DynamicArray
glGenBuffersARB(1, &buffer); //Create the buffer object
glBindBufferARB(GL_ARRAY_BUFFER_ARB, buffer); //Make it the active array buffer
glBufferDataARB(GL_ARRAY_BUFFER_ARB, GlobalArraySize*sizeof(float), GlobalArray, GL_STATIC_DRAW_ARB); //Upload the array into the buffer
This is for the drawing:

glEnableClientState(GL_NORMAL_ARRAY);
glEnableClientState(GL_VERTEX_ARRAY);

static int nOffsetForNormals = 0; //No offset
static int nOffsetForVertices = sizeof(float) * 9; //Offset by 9 floats

glBindBufferARB( GL_ARRAY_BUFFER_ARB, buffer);
glNormalPointer( GL_FLOAT, 0, BUFFER_OFFSET(nOffsetForNormals));
glVertexPointer( 3, GL_FLOAT, 0, BUFFER_OFFSET(nOffsetForVertices));	
glDrawArrays(GL_TRIANGLES, 0, GlobalArraySize);

glDisableClientState(GL_VERTEX_ARRAY);
glDisableClientState(GL_NORMAL_ARRAY);
I've tried everything. Any help would be appreciated!
I'm not seeing any code describing how GlobalArray is getting populated...

Also, how exactly is your buffer filled?

x,y,z,nx,ny,nz,?,?,?

What are the last 3?
I'm surprised it looks as good as it does. When using interleaved arrays you need to have your data in <vertex> <normal> <vertex> <normal> format, not <vertex> <vertex> <vertex> <normal> <normal> <normal>. You also need to specify a stride when you call gl{Normal/Vertex}Pointer (in this case it would be 6 * sizeof(float) for both).
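
For illustration, a minimal sketch of that layout and the matching pointer calls (the triangle data here is made up, and BUFFER_OFFSET is the same macro as in the original post):

//One triangle, interleaved as <vertex> <normal> per point
GLfloat interleaved[] = {
    //vx     vy     vz      nx    ny    nz
    -1.0f, -1.0f,  0.0f,   0.0f, 0.0f, 1.0f,
     1.0f, -1.0f,  0.0f,   0.0f, 0.0f, 1.0f,
     0.0f,  1.0f,  0.0f,   0.0f, 0.0f, 1.0f
};

GLsizei stride = 6 * sizeof(GLfloat); //bytes from one point's data to the next
glVertexPointer(3, GL_FLOAT, stride, BUFFER_OFFSET(0)); //vertices start at byte 0
glNormalPointer(GL_FLOAT, stride, BUFFER_OFFSET(3 * sizeof(GLfloat))); //normals start 3 floats in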

Enigma
glEnableClientState(GL_NORMAL_ARRAY);
glEnableClientState(GL_VERTEX_ARRAY);


If those two states are enabled, shouldn't the stride of both be set to zero?
Not if you're using interleaved arrays but not the glInterleavedArrays function. At the time you specify the normals OpenGL has no idea how many components you'll be specifying for the vertices, so it cannot know what stride to use.
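
As an aside, if you do want OpenGL to work out the layout for you, glInterleavedArrays has a GL_N3F_V3F format for a normal-plus-vertex interleave (note it expects the normal before the vertex within each point, the reverse of the order suggested above). A minimal sketch, assuming the VBO is already bound and numVertices is a hypothetical count of points (total floats / 6):

//GL_N3F_V3F = 3 normal floats, then 3 vertex floats, per point
glInterleavedArrays(GL_N3F_V3F, 0, BUFFER_OFFSET(0));
glDrawArrays(GL_TRIANGLES, 0, numVertices);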

Enigma
^^^ I don't have any of my code available ATM, but I believe you are correct. The strides for both vertex and normal should be equal and greater than zero (unless they are tightly packed, which is not the case with his code).
lol, well I just started experimenting with interleaved arrays about a week ago. The format I was using was {nx,ny,nz, nx,ny,nz, nx,ny,nz, vx,vy,vz, vx,vy,vz, vx,vy,vz}. Let me try out the vertex, normal, vertex format. Thanks for all the replies!
Well it's fixed, thank you guys SO MUCH! But now I have an odd problem with the lighting. Check it out, he looks plastic:
http://www.angelfire.com/space2/artib/Guy.jpg

This is the same diffuse lighting as the other pictures.
Looks like your code is using the vertex data for normals. Did you adjust your offsets to be 0 for the vertices and 3 * sizeof(float) for the normals?
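
For reference, a minimal sketch of the corrected drawing setup that question implies (and note that glDrawArrays takes a vertex count, so if GlobalArraySize counts floats it would need dividing by 6):

glBindBufferARB(GL_ARRAY_BUFFER_ARB, buffer);
glVertexPointer(3, GL_FLOAT, 6 * sizeof(float), BUFFER_OFFSET(0)); //vertices first
glNormalPointer(GL_FLOAT, 6 * sizeof(float), BUFFER_OFFSET(3 * sizeof(float))); //normals 3 floats in
glDrawArrays(GL_TRIANGLES, 0, GlobalArraySize / 6); //6 floats per point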

Enigma
Thank you again Enigma! I need to stop coding at 2:00 AM; I start to make dumb mistakes. Thank you for all your help. It's looking beautiful now ;).

