Archived

This topic is now archived and is closed to further replies.

Vertex Arrays don't work


Recommended Posts

I can make a demo using vertex arrays, and it will compile with no problems. When I run the .exe, though, it crashes. Are there any known problems with vertex arrays? I'm running Win XP on a GeForce3 Ti 500 with the 21.83 Detonator drivers. Any ideas? As you can imagine, this is a major setback. Thanks in advance.

What is bollux memory? Could you take a look at some sample source I have that won't run? Thanks.

Edited by - executor_2k2 on January 27, 2002 3:59:07 PM

This is exactly what I mean about people trying to get into game/graphics programming before they even understand the basic fundamentals of programming.

The problem is not with OpenGL. Read the documentation on vertex arrays before you start using them, or try using the debug option of VS; that's what it's for.

Anyway, the definition of bollux memory:

bollux memory: you are either trying to access memory the application is not permitted to touch, or the memory is filled with invalid values.

To the vast majority of mankind, nothing is more agreeable than to escape the need for mental exertion... To most people, nothing is more troublesome than the effort of thinking.

I don't think the code is at fault. It came with the "OpenGL Game Programming" book by Kevin Hawkins and Dave Astle. I understand vertex arrays, and the code looks fine by my inspection. I am trying to find out if it's an incompatibility with my machine.

And d00d, this is the "For Beginners" forum, so chill out.

So what happens when you step through the code?


Well, it says there is an access violation. When I get to the line

glDrawElements(GL_TRIANGLE_STRIP, MAP_X * 2, GL_UNSIGNED_INT, &g_indexArray[z * MAP_X * 2]);

I get:

"Unhandled Exception in terrain.exe (NVOGLNT.DLL): 0xC0000005: Access Violation"

Here's my definition of g_indexArray:

GLuint g_indexArray[MAP_X * MAP_Z * 6]; // vertex index array

From that error message I'm naturally thinking memory access violation. The problem is, I don't see where the memory access is going wrong.

Okay, so you are creating an array of size (32 x 32 x 6) = 6144 integers.

You are telling glDrawElements to draw (32 x 2) = 64 vertices.

Now, what is the value of "z" when this access violation occurs?

Edit: 2048 -> 6144; my math was a little off.


Edited by - jenova on January 27, 2002 10:05:22 PM

The value of z is 0. The overall structure for that part of the program is as follows:

BOOL DisplayScene()
{
    static float waterHeight = 154.0f; // height of water
    static bool waterDir = true;       // used to animate water; true = up, false = down

    float radians = float(PI*(g_angle-90.0f)/180.0f);

    // calculate the camera's position
    g_cameraPos[0] = g_lookAt[0] + (float)sin(radians) * g_mouseY; // multiplying by mouseY makes the
    g_cameraPos[2] = g_lookAt[2] + (float)cos(radians) * g_mouseY; // camera get closer/farther away with mouseY
    g_cameraPos[1] = g_lookAt[1] + g_mouseY / 2.0f;

    // calculate the camera look-at coordinates as the center of the terrain map
    g_lookAt[0] = (MAP_X*MAP_SCALE)/2.0f;
    g_lookAt[1] = 150.0f;
    g_lookAt[2] = -(MAP_Z*MAP_SCALE)/2.0f;

    // clear screen and depth buffer
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glLoadIdentity();

    // set the camera position
    gluLookAt(g_cameraPos[0], g_cameraPos[1], g_cameraPos[2],
              g_lookAt[0], g_lookAt[1], g_lookAt[2],
              0.0, 1.0, 0.0);

    // set the current texture to the land texture
    glBindTexture(GL_TEXTURE_2D, g_land);

    // loop through all the triangle strips
    for (int z = 0; z < MAP_Z-1; z++)
    {
        //MessageBox(NULL,"Couldnt Draw Terrain.","TERRAIN ERROR",MB_OK | MB_ICONINFORMATION);
        // draw the triangles in this strip
        glDrawElements(GL_TRIANGLE_STRIP, MAP_X * 2, GL_UNSIGNED_INT, &g_indexArray[z * MAP_X * 2]);
        MessageBox(NULL,"Couldnt Draw Terrain.","TERRAIN ERROR",MB_OK | MB_ICONINFORMATION);
    }

    // enable blending
    glEnable(GL_BLEND);

    // enable read-only depth buffer
    glDepthMask(GL_FALSE);

    // set the blend function to what we use for transparency
    glBlendFunc(GL_SRC_ALPHA, GL_ONE);

    glColor4f(0.5f, 0.5f, 1.0f, 0.7f);     // set color to a transparent blue
    glBindTexture(GL_TEXTURE_2D, g_water); // set texture to the water texture

    // draw water as one large quad surface
    glBegin(GL_QUADS);
        glTexCoord2f(0.0f, 0.0f); // lower left corner
        glVertex3f(g_terrain[0][0], waterHeight, g_terrain[0][2]);

        glTexCoord2f(10.0f, 0.0f); // lower right corner
        glVertex3f(g_terrain[MAP_X-1][0], waterHeight, g_terrain[MAP_X-1][2]);

        glTexCoord2f(10.0f, 10.0f); // upper right corner
        glVertex3f(g_terrain[MAP_X-1 + MAP_X * (MAP_Z-1)][0], waterHeight, g_terrain[MAP_X-1 + MAP_X * (MAP_Z-1)][2]);

        glTexCoord2f(0.0f, 10.0f); // upper left corner
        glVertex3f(g_terrain[MAP_X * (MAP_Z-1)][0], waterHeight, g_terrain[MAP_X * (MAP_Z-1)][2]);
    glEnd();

    // set back to normal depth buffer mode (writable)
    glDepthMask(GL_TRUE);

    // disable blending
    glDisable(GL_BLEND);

    // animate the water
    if (waterHeight > 155.0f)
        waterDir = false;
    else if (waterHeight < 154.0f)
        waterDir = true;

    if (waterDir)
        waterHeight += 0.01f;
    else
        waterHeight -= 0.01f;

    return TRUE;
} // end DisplayScene()

Sorry about the code dump. I really appreciate you guys' help.

Edited by - executor_2k2 on January 27, 2002 10:26:04 PM

Ok guys, I just got an email reply from Dave Astle on the issue.
Here it is from the horse's mouth:

Okay, I just ran both programs on my computer (I'm using 2000) and I got
the crashes too. The only thing that's changed about my system since I
wrote those programs is my video card drivers, so NVIDIA must have
changed something. I was able to get the programs to work by removing
the lines:

if (glLockArraysEXT)
    glLockArraysEXT(0, MAP_X * MAP_Z * 6);

and

// if the compiled arrays extension is available, unlock the arrays
if (glUnlockArraysEXT)
    glUnlockArraysEXT();

I'm not sure why these are causing the crash, so I'll have to do some
digging. It's possible that I wasn't using them correctly.

Dave

Thought I'd ease your guys' minds and post the solution for future reference.

And that, folks, is an example of bollux memory.

With NVIDIA drivers you'll get a message like:

(NVOGLNT.DLL): 0xC0000005: Access Violation

It often happens if you forget something, like:

glEnableClientState(GL_COLOR_ARRAY);
glEnableClientState(GL_VERTEX_ARRAY);
glVertexPointer(...);
glDrawElements(...);

Where's the colour data pointer?
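To spell out the rule being described: every array enabled with glEnableClientState needs a matching gl*Pointer call before glDrawElements, or the driver dereferences whatever stale pointer was set last. A minimal matched setup might look like this (the data sizes and function name are illustrative, not from the thread; this needs a live GL context to actually run):

```c
#include <GL/gl.h>

static GLfloat verts[4 * 3];  /* 4 vertices, xyz each (illustrative) */
static GLfloat colors[4 * 3]; /* 4 colors, rgb each                  */
static GLuint  indices[6];    /* two triangles                       */

void DrawQuad(void)
{
    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_COLOR_ARRAY);

    /* every enabled array gets a pointer -- forgetting one of these
       pairings is the classic cause of the NVOGLNT.DLL access violation */
    glVertexPointer(3, GL_FLOAT, 0, verts);
    glColorPointer(3, GL_FLOAT, 0, colors);

    glDrawElements(GL_TRIANGLES, 6, GL_UNSIGNED_INT, indices);

    /* disable what you enabled so later draws aren't poisoned */
    glDisableClientState(GL_COLOR_ARRAY);
    glDisableClientState(GL_VERTEX_ARRAY);
}
```

Disabling the client states afterward matters for the same reason: a later draw call that doesn't expect GL_COLOR_ARRAY to be enabled would otherwise read through the old colors pointer.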

http://uk.geocities.com/sloppyturds/gotterdammerung.html
