serious_learner07

OpenGL glDrawElements crashing


Below is the OpenGL code for rendering a mesh using glDrawElements. The vertices and indices are read from a file. The application crashes inside glDrawElements. Please check the code (only the relevant parts are included) and pinpoint the cause of the crash.
struct Vertex
{
	float flx;
	float fly;
	float flz;
};
struct Vertex  vertices[500];
struct  Index
{
	int  index1;
	int  index2;
	int  index3;
};
struct Index  indices[1000];
// function pointer to OpenGL extensions
PFNGLDRAWRANGEELEMENTSPROC glDrawRangeElements = 0;
void draw()
{
    glEnableClientState(GL_NORMAL_ARRAY);
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(3, GL_FLOAT, 0, vertices);
    glPushMatrix();
	glTranslatef(0, 0, 2);               // offset the mesh
  	glDrawElements(GL_QUADS, 1000, GL_UNSIGNED_BYTE, indices);
    glPopMatrix();
    glDisableClientState(GL_VERTEX_ARRAY);  // disable vertex arrays
    glDisableClientState(GL_COLOR_ARRAY);
    glDisableClientState(GL_NORMAL_ARRAY);
}
void displayCB()
{
     glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT | GL_STENCIL_BUFFER_BIT);
    // save the initial ModelView matrix before modifying ModelView matrix
    glPushMatrix();
    // transform camera
    glTranslatef(0, 0, cameraDistance);
    glRotatef(cameraAngleX, 1, 0, 0);   // pitch
    glRotatef(cameraAngleY, 0, 1, 0);   // heading
	draw();        // with glDrawElements()
    glPopMatrix();
    glutSwapBuffers();
}

void initGL()
{
    initLights();
    setCamera(0, 0, 10, 0, 0, 0);
	// Read the vertices and indices from a file.
	FILE *fpvert = fopen("C:\\cow-1000vert.txt", "r");
	for(int i = 0; i < 500; i++)
	{
		fscanf(fpvert, "%f", &vertices[i].flx);
		fscanf(fpvert, "%f", &vertices[i].fly);
		fscanf(fpvert, "%f", &vertices[i].flz);
	}
	fclose(fpvert);
	FILE *fpindex = fopen("C:\\cow-1000index.txt", "r"); 
	for(int i = 0; i < 1000; i++)
	{
		fscanf(fpindex, "%d", &indices[i].index1);
		fscanf(fpindex, "%d", &indices[i].index2);
		fscanf(fpindex, "%d", &indices[i].index3);
	}
	fclose(fpindex);
}

///////////////////////////////////////////////////////////////////////////////
int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE | GLUT_DEPTH);
    glutInitWindowSize(400, 300);               // window size
    glutInitWindowPosition(100, 100);           // window location
    int handle = glutCreateWindow(argv[0]);     // param is the title of the window
    glutDisplayFunc(displayCB);
    glutReshapeFunc(reshapeCB);
    initGL();
    // get function pointer to glDrawRangeElements
    glDrawRangeElements = (PFNGLDRAWRANGEELEMENTSPROC)wglGetProcAddress("glDrawRangeElements");
    glutMainLoop(); /* Start GLUT event-processing loop */
    return 0;
}

I'm new to OpenGL and haven't used this function before, so I've only just read about it, but maybe it's because you're using GL_UNSIGNED_BYTE when your indices are ints, and you're trying to draw quads when each Index only contains 3 numbers (so it's trying to read up to 4000 ints when your array only contains 3000).

If you want to draw quads:
make your Index struct hold 4 GLuints, and the function call should be:
glDrawElements(GL_QUADS, 4000, GL_UNSIGNED_INT, indices);

If you want to draw triangles:
make your Index struct hold 3 GLuints, and the function call should be:
glDrawElements(GL_TRIANGLES, 3000, GL_UNSIGNED_INT, indices);

(In both cases the count is the total number of indices, not the number of primitives.)

You enabled the normals array (GL_NORMAL_ARRAY), but never provided the normal pointer using glNormalPointer, so it's probably dereferencing a NULL pointer because of that. Don't enable the normals array if you aren't gonna use it, or provide normal data using glNormalPointer.

That probably has to do with the Index structure: OpenGL supports only one index per vertex and doesn't support different indices for each array. What is struct Index used for?

My vertex file.

0.151632 -0.043319 -0.08824
0.163424 -0.033934 -0.08411
0.163118 -0.053632 -0.080509
0.176307 -0.028912 -0.075048
0.174429 -0.051613 -0.073945
0.186153 -0.032952 -0.063704
0.189315 -0.049201 -0.055717
0.173804 -0.063906 -0.070661
0.161034 -0.067741 -0.076137
0.140389 -0.06919 -0.082181
0.145322 -0.055406 -0.086283
0.157816 -0.075509 -0.073519
0.138192 -0.077096 -0.078394
0.154266 -0.085543 -0.067386
0.135011 -0.089151 -0.072101
0.149841 -0.099483 -0.056737
0.171041 -0.083631 -0.062782
0.168093 -0.09623 -0.053987
.....
.....
Total no of vertices 500

My index file.

3 1 4
4 2 3
2 4 6
4 1 5
7 8 4
8 200 6
200 278 2
1 10 11
11 5 1
13 12 14
15 13 14
19 205 15
205 20 13
20 206 16
22 417 418
.....
( Total no of indices 1000)


 struct Index indices[1000] 
is defined for accessing the indices (see the code above).


Following your suggestion, I changed the code to use one index per vertex:
 struct Index indices[3000] 
...
glDrawElements(GL_TRIANGLES, 3000, GL_UNSIGNED_BYTE, indices);



Now the output looks like this (a cow shape is formed, but with some noise):





[Edited by - serious_learner07 on June 4, 2009 11:40:33 PM]
I don't think changing the indices array to an array of 3000 ints will make any difference, because it has the same memory layout as an array of 1000 structs containing 3 ints each.

By the looks of the function call you are using:
glDrawElements(GL_TRIANGLES, 3000, GL_UNSIGNED_BYTE, indices);

You've still got GL_UNSIGNED_BYTE in there... Are your indices actually unsigned bytes?

Because if your indices are 4-byte ints, each int will be read as 4 separate numbers. So if you had an Index struct containing, say, 4, 2 and 1, this is what it would look like in memory on a little-endian machine (one byte per column):

04 00 00 00
02 00 00 00
01 00 00 00

and if you specified GL_UNSIGNED_BYTE in the call, OpenGL would draw triangles with these indices:
4, 0, 0 (first triangle drawn)
0, 2, 0
0, 0, 1
0, 0, 0

instead of 1 triangle with indices 4, 2 and 1.

I would try changing it back to the way you had it before, but store your indices as unsigned ints:

struct Index
{
    GLuint a;
    GLuint b;
    GLuint c;
};

and call glDrawElements like this:
glDrawElements(GL_TRIANGLES, 3000, GL_UNSIGNED_INT, indices);

Tell me how it goes :)




Oh sorry, that was a mistake; I copied it from my old code.
The last output was based on GL_UNSIGNED_INT.

But I have another question: why are the outputs of
glDrawElements(GL_TRIANGLES, 1000, GL_UNSIGNED_INT, indices); and
glDrawElements(GL_TRIANGLES, 3000, GL_UNSIGNED_INT, indices); different?




So it still isn't working?

Quote:
Original post by serious_learner07
But I have another question: why are the outputs of
glDrawElements(GL_TRIANGLES, 1000, GL_UNSIGNED_INT, indices); and
glDrawElements(GL_TRIANGLES, 3000, GL_UNSIGNED_INT, indices); different?


Going by the man page, count specifies the number of elements (indices) to be rendered, not the number of primitives. So with GL_TRIANGLES, a count of 3000 consumes all 3000 indices and draws 1000 triangles, while a count of 1000 only consumes the first 1000 indices and draws about 333 triangles, which is why the two calls produce different output.

Anyway, if it wasn't that, then maybe you're reading the vertex positions in incorrectly (which would explain the strange large square in the screenshot).
