Xenkan

OpenGL VBO problem, first vertex in wrong position


I need help from someone smarter than me with OpenGL, before I go crazy(ier)!

I'm trying to use VBOs to load my vertices into the graphics card's memory, so that I can call glDrawArrays() and the graphics card can draw lots of triangles on its own without much CPU involvement. Note: I am not using shaders, at least not yet.

However, when I try to use VBOs, the vertices are displayed in the wrong positions, not the positions I specified. The first vertex always seems to be wrong.

When I do this:

float Vertices[] = { -10, 0, 10, 10, 0, 10, -10, 0, -10,
-10, 0, -10, 10, 0, 10, 10, 0, -10 };
uint VBO;
glGenBuffers(1, &VBO);
glBindBuffer(GL_ARRAY_BUFFER, VBO);
glBufferData(GL_ARRAY_BUFFER, 4*6*3, Vertices, GL_STATIC_DRAW);
glVertexPointer(3, GL_FLOAT, 0, 0);
// later...
glEnableClientState(GL_VERTEX_ARRAY);
glDrawArrays(GL_TRIANGLES, 0, 6);

It works, and I see my floating square:
VBOProblem-NoMap.png


However, when I do this:

float Vertices[] = { -10, 0, 10, 10, 0, 10, -10, 0, -10,
-10, 0, -10, 10, 0, 10, 10, 0, -10 };
uint VBO;
glGenBuffers(1, &VBO);
glBindBuffer(GL_ARRAY_BUFFER, VBO);
glBufferData(GL_ARRAY_BUFFER, 4*6*3, 0, GL_STATIC_DRAW);
void* Map = glMapBuffer(GL_ARRAY_BUFFER, GL_WRITE_ONLY);
memcpy(Map, Vertices, 4*6*3);
glUnmapBuffer(GL_ARRAY_BUFFER);
glVertexPointer(3, GL_FLOAT, 0, 0);
// later...
glEnableClientState(GL_VERTEX_ARRAY);
glDrawArrays(GL_TRIANGLES, 0, 6);

I see a misshapen square, with the first vertex in the wrong position:
VBOProblem-Map.png

It was my understanding that these two methods should be equivalent. I seem to have a similar problem even with the first method (passing the pointer directly to glBufferData) if I move the buffer-loading code into a separate function, so that my "Vertices" variable goes out of scope afterwards. But I don't understand why that should matter; I thought the vertex data was copied to the graphics card?

I've tried every which way to make this work: studied the documentation, updated my video drivers, tested this code on different PCs. Always the same problem.

Here's the full source code, in case I missed mentioning something important. Any help would be greatly appreciated! Thanks!

Seems to work on my computer; I modified a few lines of the code (just the headers and wglGetProcAddress) so it would compile on Linux.

Hi Sponji, thank you very much for taking the time to test this! I'm glad I used SDL.

Well... maybe the problem has something to do with my build environment or my testing PCs? I'm building on Windows XP using TDM GCC 4.5.1. The two PCs I tested the .exe on were both running Windows XP. I tried testing on a third PC, but that PC didn't seem to support the buffer functions, so that was a FAIL.

Some more information to add:

I've been playing around with gDEBugger, which seems like a great program so far! Here's what my VBO looks like in gDEBugger:

VBOProblem-Debug.png

The x coordinate for my first vertex is NOT correct. It is listed as 1.0089349e-043 instead of -10.

In case anyone searches this later, I finally solved my problem.

I'm loading my function pointers using wglGetProcAddress. However, the function pointers were not declared with the STDCALL calling convention. OpenGL entry points on Windows use stdcall (the callee cleans up the stack), so calling them through a default-cdecl pointer leaves the stack mismatched, which was corrupting my local, i.e. stack-based, variables.

In other words, I was doing this:

void (*glBindBuffer)(int target, uint buffer);
glBindBuffer = (void (*)(int target, uint buffer))wglGetProcAddress("glBindBuffer");


I should've been doing this (the calling convention goes inside the declarator parentheses, next to the *, and it has to appear in the cast as well):

void (STDCALL *glBindBuffer)(int target, uint buffer);
glBindBuffer = (void (STDCALL *)(int target, uint buffer))wglGetProcAddress("glBindBuffer");


Thanks Sponji, for tipping me off that Linux doesn't have wglGetProcAddress, and therefore doesn't have this problem!

Why don't you use one of the existing libraries for dealing with function pointers, such as GLEW (glew.sourceforge.net)?

I guess I feel that using a library to load a library seems superfluous. In retrospect, it obviously would have saved me some headache. I've got it working now, but if I need more than 4 functions in the future, I might use GLEW. Thanks for the tip!

Definitely use GLEW; it seems fine now, but when you get to using functionality beyond 1.1 you'll be incredibly thankful for it (and also thankful for already knowing how to use it when the time comes that you need it NOW).
