arasmussen

OpenGL Vertex Buffer Objects Crash


Hi,

I'm trying to get VBOs to work in OpenGL. I've followed a ton of tutorials, and vertex arrays work fine, but I get a runtime crash when using vertex buffer objects. Here's the code:


// global vars:
GLfloat *vertices;
GLfloat *texcoords;
GLuint vbo;

// init vbo
glGenBuffers(1, &vbo);
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glBufferData(GL_ARRAY_BUFFER, sizeof(vertices) + sizeof(texcoords), 0, GL_STATIC_DRAW);
glBufferSubData(GL_ARRAY_BUFFER, 0, sizeof(vertices), vertices);
glBufferSubData(GL_ARRAY_BUFFER, sizeof(vertices), sizeof(texcoords), texcoords);

// draw loop
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_TEXTURE_COORD_ARRAY);

glVertexPointer(3, GL_FLOAT, 0, 0);
glTexCoordPointer(2, GL_FLOAT, 0, (GLvoid*)(sizeof(vertices)));

glBindTexture(GL_TEXTURE_2D, texture);
glDrawArrays(GL_TRIANGLES, 0, 3);

glDisableClientState(GL_VERTEX_ARRAY);
glDisableClientState(GL_TEXTURE_COORD_ARRAY);
glBindBuffer(GL_ARRAY_BUFFER, 0);



When I change the following lines, it works, but then I'm using vertex arrays rather than VBOs.


// These lines cause a crash
glVertexPointer(3, GL_FLOAT, 0, 0);
glTexCoordPointer(2, GL_FLOAT, 0, (GLvoid*)(sizeof(vertices)));

// These lines work, but use vertex arrays rather than vbos
glVertexPointer(3, GL_FLOAT, 0, vertices);
glTexCoordPointer(2, GL_FLOAT, 0, texcoords);


The actual crash is an "unhandled exception" on the line that contains glDrawArrays.

Here are the relevant system specs:
Language: C++
IDE: Visual Studio 2010 Professional
OS: Windows 7 Ultimate
GPU: nVidia GeForce 9650m GT (supports up to GL 3.3)

Hi,
sizeof(vertices)
isn't doing what you are hoping it does, I think. sizeof(vertices) will (probably) be 4 or 8, since it is just a pointer. You probably want something more like:
sizeof(float) * 3 * NUM_VERTICES
which will give you the size in bytes of your vertex array. Some explanation:
(1) using float, because that's the data type of your vertex coordinates
(2) multiplying by 3, because there are three floats per vertex (x, y, z)

Also, keep in mind that the last parameter in glVertexPointer works differently when you have a buffer bound.

http://www.opengl.or...fferSubData.xml
http://www.opengl.or...rtexPointer.xml

I would read those pages over. When you fix the sizeof problems you will be a lot closer to having it work correctly, I think. Hope that helps!


You probably want something more like:
sizeof(float) * 3 * NUM_VERTICES

You're right, that code was wrong. I'm actually using the following:
glTexCoordPointer(2, GL_FLOAT, 0, (GLvoid*)(9 * sizeof(GLfloat)));

Still doesn't work though. If I take the texture coordinates out altogether, and just try to render the vertex coordinates, it still crashes at the same place.

I'm guessing that second snippet isn't right either. glTexCoordPointer's 4th parameter, when you have a buffer bound, is the offset into the buffer where the texture coordinates begin. Unless your vertex array is made up of only three points (3 vertices with 3 floats each), 9 * sizeof(float) is going to be wrong.

Also, in pretty much every place (in your original post) where you are using sizeof, it is being used incorrectly, so make sure to double check them all. If you want to post an updated sample that might help.

edit: I guess you are just drawing three points. It might help to see where you are creating/populating the arrays as well.

Why would it matter what the code that creates my arrays looks like? If I use the vertex array method of passing the array pointers to the GPU every frame, it works, with correct textures and vertices, so clearly the arrays are fine.

I don't think it's relevant, but here ya go:


vertices = new GLfloat[9];
texcoords = new GLfloat[6];

vertices[0] = 0.0f;
...
vertices[8] = 1.0f;

texcoords[0] = 0.0f;
...
texcoords[5] = 1.0f;


I'm also using GLee. My gl.h header is version 1.2 (standard on every Windows machine), and I'm getting all of my buffer functions declared in GLee.h. Not sure if that makes a difference.


Just trying to eliminate possible problems. If you can post your latest vbo setup/drawing code that would be helpful too.


I can post the entire source file in about 5 hours. I think I might pull the VBO-specific code out of the program and see whether it works in a program all by itself. I have a few other things going on, including keyboard/mouse input, displaying the framerate, moving my character, and collision detection. I've commented almost all of that out, though, so I could focus on the switch from vertex arrays to VBOs.

Still, regardless of all that other stuff, why would it crash on glDrawArrays... I wonder if my framerate code is calling glDrawArrays... Can you call that function twice per render loop?

Okay I made the sample program, still doesn't work. Here's the relevant code.



// globals

GLuint vbuffer;
GLuint texture;

GLfloat *vertices;
GLfloat *texcoords;

// called in InitGL to generate the vbo for the triangle
void GenerateTriangle()
{
vertices = new GLfloat[9];
vertices[0] = -0.5f; vertices[1] = -0.5f; vertices[2] = -0.1f;
vertices[3] = -0.5f; vertices[4] = 0.5f; vertices[5] = -0.1f;
vertices[6] = 0.5f; vertices[7] = 0.5f; vertices[8] = -0.1f;

texcoords = new GLfloat[6];
texcoords[0] = 0.0f;
texcoords[1] = 1.0f;
texcoords[2] = 1.0f;
texcoords[3] = 1.0f;
texcoords[4] = 1.0f;
texcoords[5] = 0.0f;

GLuint vbuffer = 0;
glGenBuffers(1, &vbuffer);
glBindBuffer(GL_ARRAY_BUFFER, vbuffer);
glBufferData(GL_ARRAY_BUFFER, 15 * sizeof(GLfloat), 0, GL_STATIC_DRAW);
glBufferSubData(GL_ARRAY_BUFFER, 0, 9 * sizeof(GLfloat), vertices);
glBufferSubData(GL_ARRAY_BUFFER, 9 * sizeof(GLfloat), 6 * sizeof(GLfloat), texcoords);

delete [] vertices;
delete [] texcoords;
}


// Called on startup after the GL window is created
int InitGL(GLvoid)
{
LoadGLTextures("Data/grass.bmp", &texture);
GenerateTriangle();

glEnable(GL_TEXTURE_2D);
glShadeModel(GL_SMOOTH);
glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
glClearDepth(1.0f);
glEnable(GL_DEPTH_TEST);
glDepthFunc(GL_LEQUAL);
glHint(GL_PERSPECTIVE_CORRECTION_HINT, GL_NICEST);

return true;
}


// Render loop.
int DrawGLScene(GLvoid)
{
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glLoadIdentity();

glTranslatef(0.0f, 0.0f, -10.0f);

glBindBuffer(GL_ARRAY_BUFFER, vbuffer);

glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_TEXTURE_COORD_ARRAY);

glBindTexture(GL_TEXTURE_2D, texture);
glVertexPointer(3, GL_FLOAT, 0, 0);
glTexCoordPointer(2, GL_FLOAT, 0, (GLvoid*)(9 * sizeof(GLfloat)));

glDrawArrays(GL_TRIANGLES, 0, 3);

glDisableClientState(GL_TEXTURE_COORD_ARRAY);
glDisableClientState(GL_VERTEX_ARRAY);

glBindBuffer(GL_ARRAY_BUFFER, 0);

return true;
}

Solved. I am an idiot. My issue was on the following line:


GLuint vbuffer = 0;


I was declaring a local variable instead of using the global VBO, so by the time I made it to my render function, the global handle was still 0. FML. Textures still don't work with the fixed code, though, but I'm sure I can figure that one out. Thanks for taking a look, Scott.
