Sappharos

OpenGL Lighting with GL_TRIANGLE_STRIP


I'm having another go at lighting in OpenGL. My current project renders a heightmap with various lighting modes. Lighting with a normal for each vertex using GL_TRIANGLES primitives looks fine (first screenshot), but results in an abhorrent frame rate. Using GL_TRIANGLE_STRIP instead works wonders with the FPS, but the strips are visible (second screenshot). I doubt it's a problem with the normals, otherwise the first screenshot would surely show a related issue.

To be honest, I don't know where to start looking for errors. I'm posting it here in case anyone recognises the issue without having to go through too much of my code and can explain it. Is it simply a limitation of triangle strips? I am not using any third-party libraries such as SDL or Allegro.

Here's the drawing code. (I won't post the whole source as it consists of 14 .cpp files and 15 .h files! Yes, I overdid splitting the project up.)
case PM_TRIANGLE_STRIP:	// primitive mode: triangle strips
{
	Vector n;	// holds the current normal
	glFrontFace(GL_CW);	// clockwise front faces
	glColor3fv (terrainCol);	// set the terrain colour (grassy green)
	for (int x = 0; x < HMAP_SIZE - 1; x++)	// iterate through x dimension of heightmap
	{
		glBegin (GL_TRIANGLE_STRIP);
			for (int z = 0; z < HMAP_SIZE; z++)	// iterate through z dimension
			{
				// triangle 1
				if (z > 0)	// not one of the first two strip vertices, so specify this triangle's normal
				{
					n = hMap->f_normals[x][z][0];	// load normal from array into local variable
					glNormal3f (n.x, n.y, n.z);	// send normal to OpenGL
				}
				glVertex3f (GLfloat(x), hMap->heights[x][z], GLfloat(z));	// vertex 1

				// triangle 2
				if (z > 0)	// as above
				{
					n = hMap->f_normals[x][z][1];	// load normal from array
					glNormal3f (n.x, n.y, n.z);	// send normal
				}
				glVertex3f (GLfloat(x + 1), hMap->heights[x + 1][z], GLfloat(z));	// vertex 2
			}
		glEnd();
	}
	glFrontFace(GL_CCW);	// return to anticlockwise front faces
	break;
}


Meanwhile, criticism of my coding 'style' is very welcome. Thanks very much for your time.

Looking at the triangle strip image, you are not rendering that many polygons, so the polygon count alone shouldn't be what brings the frame rate down. I would bet your low frame rate is a result of doing something in the main draw scene routine that you shouldn't be doing there, like constantly reloading the map, recalculating the normals, etc. These can be done once up front and then just referred to in the draw scene routine (which you appear to be doing for the most part, so just check the main draw scene routine for anything unnecessary).

The way in which you render the terrain (quads, individual polygons, triangle strips, etc.) should not affect the end result as much as the particular way in which you calculate the normals (say per quad, per polygon, or per vertex, which is best). In other words, the more effort you can put into getting a good normal value for each vertex, the better your scene will look. As long as the normals are calculated correctly (beforehand) and are accessed appropriately by the type of render method in use, then you should see consistent results with any render method. I would bet your triangle strip approach is not accessing the right normals for the particular polygon being rendered.
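
For what it's worth, one common way to get a good per-vertex normal on a heightmap is to derive it from the neighbouring heights. Below is only a rough sketch along those lines, reusing HMAP_SIZE, hMap->heights and the Vector type from your posted code; the hMap->v_normals member, the unit grid spacing and the sqrtf call (from <cmath>) are my own assumptions, so adapt the names to whatever you actually have:

for (int x = 0; x < HMAP_SIZE; x++)
{
	for (int z = 0; z < HMAP_SIZE; z++)
	{
		// clamp neighbour indices at the edges of the map
		int xl = (x > 0) ? x - 1 : x;
		int xr = (x < HMAP_SIZE - 1) ? x + 1 : x;
		int zd = (z > 0) ? z - 1 : z;
		int zu = (z < HMAP_SIZE - 1) ? z + 1 : z;

		// central differences give the slope of the surface at this vertex
		float dx = hMap->heights[xr][z] - hMap->heights[xl][z];
		float dz = hMap->heights[x][zu] - hMap->heights[x][zd];

		// unnormalised normal of the local tangent plane: (-dx, 2 * spacing, -dz)
		Vector n;
		n.x = -dx;
		n.y = 2.0f;
		n.z = -dz;

		// normalise before storing so lighting gets unit normals
		float len = sqrtf (n.x * n.x + n.y * n.y + n.z * n.z);
		n.x /= len;  n.y /= len;  n.z /= len;

		hMap->v_normals[x][z] = n;	// hypothetical per-vertex normal array
	}
}

An alternative that uses the data you already store is to average the face normals of all the triangles sharing each vertex; either way the goal is one smooth normal per vertex rather than one per face.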

Hope that helps
F451

Thanks for the replies. The map is generated only once, when the scene is created, and the same goes for the normals. Okay, I will look into VBOs next; they seem very promising (rough sketch of my understanding at the end of this post).
Quote:
As long as the normals are calculated correctly (beforehand) and are accessed appropriately by the type of render method in use, then you should see consistent results with any render method. I would bet your triangle strip approach is not accessing the right normals for the particular polygon being rendered.

Going over it again in my head, I think you're right. There would hardly be such a limitation in a widely-used feature like triangle strips. The normals have been calculated correctly, judging from the first image, so all that leaves is the way in which I'm accessing them, as you suggest. With this in mind, I'll attack it with a spanner and hopefully solve it. Thanks again. :)
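
If I've understood VBOs correctly, the idea would look roughly like this under the fixed-function pipeline. This is only a sketch of my understanding, not working code from my project: it assumes an OpenGL 1.5+ context with the buffer functions (glGenBuffers and friends) available, and that the terrain has been flattened into a single std::vector<GLfloat> called vertexData holding interleaved x, y, z, nx, ny, nz per vertex (for example one long strip joined with degenerate triangles).

// one-off setup: upload the interleaved vertex data to the GPU
GLuint vbo = 0;
glGenBuffers (1, &vbo);
glBindBuffer (GL_ARRAY_BUFFER, vbo);
glBufferData (GL_ARRAY_BUFFER,
              vertexData.size() * sizeof(GLfloat),
              &vertexData[0], GL_STATIC_DRAW);

// per frame: point the fixed-function arrays at the buffer and draw
glBindBuffer (GL_ARRAY_BUFFER, vbo);
glEnableClientState (GL_VERTEX_ARRAY);
glEnableClientState (GL_NORMAL_ARRAY);
glVertexPointer (3, GL_FLOAT, 6 * sizeof(GLfloat), (void*) 0);
glNormalPointer (GL_FLOAT, 6 * sizeof(GLfloat), (void*) (3 * sizeof(GLfloat)));
glDrawArrays (GL_TRIANGLE_STRIP, 0, GLsizei (vertexData.size() / 6));
glDisableClientState (GL_NORMAL_ARRAY);
glDisableClientState (GL_VERTEX_ARRAY);
glBindBuffer (GL_ARRAY_BUFFER, 0);

The appeal is that the vertex and normal data only cross to the GPU once at start-up, instead of being resubmitted one glVertex3f/glNormal3f call at a time every frame.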

Perfect, it's working now. I was generating the vertex normals as intended, but I was using the face normals on the vertices. As usual, a staring-you-in-the-face mistake. I can't believe I doubted the OpenGL call over my own incompetence. Thanks for the help. :)
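
For anyone who finds this thread later, the fix boils down to sending the per-vertex normals to the strip instead of the per-face ones. A sketch of what the corrected inner loop looks like in my case, where v_normals stands in for whatever array holds the precomputed per-vertex normals (the name here is just illustrative):

for (int z = 0; z < HMAP_SIZE; z++)
{
	n = hMap->v_normals[x][z];	// vertex normal, not the face normal
	glNormal3f (n.x, n.y, n.z);
	glVertex3f (GLfloat(x), hMap->heights[x][z], GLfloat(z));

	n = hMap->v_normals[x + 1][z];
	glNormal3f (n.x, n.y, n.z);
	glVertex3f (GLfloat(x + 1), hMap->heights[x + 1][z], GLfloat(z));
}

With one normal per vertex the z > 0 guard is no longer needed; every vertex simply carries its own normal, adjacent strips share the same normals along their common edge, and smooth shading interpolates across them, which is what hides the strip boundaries.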
