Laz

OpenGL Floor problem


Okay, I'm honestly not sure which forum I should post this in: OpenGL, Math, or Game Programming. I believe there's a problem with this code, I just can't find where (though it could be my reasoning). What I'm trying to do is draw a floor and have the camera move. I want the floor to stay in the same place (obviously) so I can see that the camera is in fact moving. I realize this should be an easy task, but I'm new to the whole camera thing, and my matrix math isn't up to par.

The problem is that the camera's matrix indicates the camera is actually moving (the z value is decreasing), but nothing on screen seems to happen. It's leading me to believe that the floor is still being rendered in camera space. Here's the code I'm using:
bool CApp::Loop()
{
	float time;
	Matrix4f grndmat;
	grndmat.SetIdentity();
 
	timer.Update();
	time = timer.GetTime();
 
	fpstimer.Update();
	fps++;
 
	camera->Update(time);
 
	// Render the 3d stuff.
	glEnable(GL_LIGHTING);
	glPushMatrix();
		camera->Apply();
 
		glPushMatrix();
			glMultMatrixf(grndmat.matrix);
 
			float startx = -50.0f;
			float startz = -100.0f;
			float endx = 50.0f;
			float endz = 100.0f;
 
			glEnable(GL_LINE_SMOOTH);
			glLineWidth(2.0f);
			glBegin(GL_LINE_STRIP);
				glColor3f(0.0f, 0.5f, 1.0f);
				glNormal3f(0.0f, 1.0f, 0.0f);
 
				for (float z = startz; z < endz; z++) {
					glVertex3f(startx, 0.0f, z);
 
					for (float x = startx; x <= endx; x++) {
						glVertex3f(x, 0.0f, z);
						glVertex3f(x, 0.0f, z + 1.0f);
					}
 
					glVertex3f(endx, 0.0f, z + 1.0f);
				}
			glEnd();
		glPopMatrix();
	glPopMatrix();
 
	// Swap the buffers
	gl->Swap();
 
	// Reset the timer
	timer.Reset();
 
	if (fpstimer.GetTime() >= 1) {
		fpstimer.Reset();
		fts = fps;
		fps = 0;
	}
 
	// Return
	return bStillGoing;
}
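As a side note, the GL_LINE_STRIP walk above re-emits many shared vertices; a grid on the y=0 plane is usually simpler to build as independent GL_LINES pairs. A minimal sketch of that idea (BuildGridLines and Vec3 are hypothetical helpers, not part of the thread's code, and the function is kept free of GL calls so it can be checked standalone):

```cpp
#include <vector>

struct Vec3 { float x, y, z; };

// Build a grid of line segments on the y=0 plane, returned as vertex
// pairs suitable for submission with GL_LINES (one pair per segment).
std::vector<Vec3> BuildGridLines(float startx, float endx,
                                 float startz, float endz, float step)
{
	std::vector<Vec3> v;

	// Lines running parallel to the x axis, one per z value.
	for (float z = startz; z <= endz; z += step) {
		v.push_back({startx, 0.0f, z});
		v.push_back({endx,   0.0f, z});
	}

	// Lines running parallel to the z axis, one per x value.
	for (float x = startx; x <= endx; x += step) {
		v.push_back({x, 0.0f, startz});
		v.push_back({x, 0.0f, endz});
	}

	return v;
}
```

The result would then be drawn inside a single glBegin(GL_LINES)/glEnd() pair, calling glVertex3f once per stored vertex.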

In case anyone needs, here is the CCamera::Apply() code:
void CCamera::Apply()
{
	Matrix4f tmpMat = matrix;
 
	tmpMat.Transpose();
	tmpMat.SetTranslation(0, 0, 0);
 
	glLoadMatrixf(tmpMat.matrix);
	glTranslatef(-fPosition.x, -fPosition.y, -fPosition.z);
}

While running, the floor matrix (grndmat) stays at identity the whole time, and the camera matrix is:
[-1.0f, 0.0f,  0.0f, 0.0f]
[ 0.0f, 1.0f,  0.0f, 0.0f]
[ 0.0f, 0.0f, -1.0f,   z ]
[ 0.0f, 0.0f,  0.0f, 1.0f]
I've only put 'z' there because that value is constantly decreasing (which tells me the camera SHOULD be moving). I've discussed this with a good friend of mine, who's been pretty much mentoring me, and he's baffled. Any help would be greatly appreciated, and please excuse the messy code. I hope I've not confused anyone with my banter.
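For what it's worth, the transpose-then-translate order in Apply() matches the standard algebra for inverting a rigid transform: if the camera's world transform is M = T(p) * R (rotation followed by a translation to position p), then the view matrix is M^-1 = R^T * T(-p), which is exactly "load the transposed rotation, then translate by the negated position". A minimal standalone check of that identity, using OpenGL's column-major layout (Mat4, Mul, and the other helpers are hypothetical, not the thread's Matrix4f):

```cpp
#include <cmath>

// 4x4 matrix in OpenGL's column-major layout: m[col*4 + row].
struct Mat4 { float m[16]; };

Mat4 Identity() {
	Mat4 r{};
	for (int i = 0; i < 4; ++i) r.m[i * 4 + i] = 1.0f;
	return r;
}

Mat4 Mul(const Mat4& a, const Mat4& b) {
	Mat4 r{};
	for (int c = 0; c < 4; ++c)
		for (int row = 0; row < 4; ++row)
			for (int k = 0; k < 4; ++k)
				r.m[c * 4 + row] += a.m[k * 4 + row] * b.m[c * 4 + k];
	return r;
}

Mat4 Translation(float x, float y, float z) {
	Mat4 r = Identity();
	r.m[12] = x; r.m[13] = y; r.m[14] = z;
	return r;
}

Mat4 RotationY(float a) {
	Mat4 r = Identity();
	r.m[0] = std::cos(a);  r.m[8]  = std::sin(a);
	r.m[2] = -std::sin(a); r.m[10] = std::cos(a);
	return r;
}

// Transpose only the upper-left 3x3 rotation block.
Mat4 TransposeUpper3x3(const Mat4& a) {
	Mat4 r = a;
	for (int c = 0; c < 3; ++c)
		for (int row = 0; row < 3; ++row)
			r.m[c * 4 + row] = a.m[row * 4 + c];
	return r;
}

bool NearIdentity(const Mat4& a, float eps = 1e-5f) {
	Mat4 i = Identity();
	for (int k = 0; k < 16; ++k)
		if (std::fabs(a.m[k] - i.m[k]) > eps) return false;
	return true;
}
```

Multiplying the view matrix built this way against the camera matrix should give (near) identity, since R^T * T(-p) * T(p) * R = R^T * R = I. That is the invariant Apply() relies on, so anything that disturbs it (like zeroing a translation that belongs in the matrix) undoes the camera movement.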

Apparently I didn't need tmpMat.SetTranslation(0, 0, 0); in my CCamera::Apply() function.

FIXED.
