Super Choppy MD2 Animation

My MD2 model's animation sucks. I wrote the animation code just today, and damn if it isn't choppy. I've been screwing with it all morning. The code is relatively short, and I can't seem to find anything wrong with it. In my class definition, I have some data members that store info about the animation.
class MD2Model
{
  // You don't need the whole class, so here is what is pertinent
  // Animation Info
  KeyFrameAnimation Animations_[20];
  float AnimationSpeed_;
  float LastFrameSwitchTime_;
  int CurrentFrame_;
  int CurrentAnimation_;
};
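
KeyFrameAnimation just holds the frame range for each animation; it's basically a struct along these lines (the exact definition may have more in it, but these are the two fields that get used below):
struct KeyFrameAnimation
{
	int StartFrame;   // first key frame of this animation in the big vertex buffer
	int EndFrame;     // last key frame; the loop wraps back to StartFrame after this
};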

In the constructor, I set the values:
AnimationSpeed_ = 200.0f;       // Time in milliseconds before a frame switch (1000 ms = 1 sec)
LastFrameSwitchTime_ = 0.0f;
CurrentFrame_ = 0;
CurrentAnimation_ = 0;

Then I have a couple of functions from the class. This one makes sure the animation always runs at the same speed, no matter the frame rate:
float MD2Model::CalculateT(void)
{
	float t = 0.0f;
	float ElapsedTimeSinceFrameSwitch = 0.0f;

	// GetTickCount() returns the system uptime in milliseconds
	float CurrentTime = GetTickCount();

	ElapsedTimeSinceFrameSwitch = CurrentTime - LastFrameSwitchTime_;
	t = ElapsedTimeSinceFrameSwitch / AnimationSpeed_;

	// Once a full frame interval has passed, step to the next key frame
	// (wrapping back to the animation's start frame) and restart the timer
	if(ElapsedTimeSinceFrameSwitch >= AnimationSpeed_)
	{
		if(CurrentFrame_ == Animations_[CurrentAnimation_].EndFrame)
			CurrentFrame_ = Animations_[CurrentAnimation_].StartFrame;
		else
			CurrentFrame_ = CurrentFrame_ + 1;

		LastFrameSwitchTime_ = CurrentTime;
	}
	return t;
}
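
Just to illustrate the timing, here's the same logic pulled out into a throwaway standalone snippet (made-up 50 ms steps instead of GetTickCount, frame wrapping left out); it prints how t ramps from 0 toward 1 between switches and when the frame index steps:
#include <cstdio>

int main()
{
	const float AnimationSpeed = 200.0f;  // same value as in the constructor above
	float lastSwitch = 0.0f;
	int frame = 0;

	// pretend the render loop runs every 50 ms
	for (float now = 0.0f; now <= 450.0f; now += 50.0f)
	{
		float elapsed = now - lastSwitch;
		float t = elapsed / AnimationSpeed;

		if (elapsed >= AnimationSpeed)
		{
			frame = frame + 1;    // wrapping back to StartFrame left out here
			lastSwitch = now;
		}
		printf("now = %3.0f ms   frame = %d   t = %.2f\n", now, frame, t);
	}
	return 0;
}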

Lastly, here is the code that displays the animated model. I keep all my vertices in one big buffer, which is why it looks the way it does and why I always have to multiply by the number of vertices per frame.
void MD2Model::DisplayModelAnimated(void)
{
	float t;
	int NextFrame;
	Vertex CV1, CV2, CV3;
	Vertex NV1, NV2, NV3;

	// How far along we are between the current key frame and the next one
	t = CalculateT();

	if(t > 1.0f)
		t = 0.0f;

	if(CurrentFrame_ == Animations_[CurrentAnimation_].EndFrame)
		NextFrame = Animations_[CurrentAnimation_].StartFrame;
	else
		NextFrame = CurrentFrame_ + 1;

	// Draw the interpolated model
	glBindTexture(GL_TEXTURE_2D, textures[0]);
	glBegin(GL_TRIANGLES);
	for(int i = 0; i < NumTriangles_; i++)
	{
		// Fetch the triangle's vertices for the current frame and the next frame
		CV1 = Vertices_[Triangles_[i].VertexIndex[0] + (CurrentFrame_ * NumVerticesPerFrame_)];
		CV2 = Vertices_[Triangles_[i].VertexIndex[1] + (CurrentFrame_ * NumVerticesPerFrame_)];
		CV3 = Vertices_[Triangles_[i].VertexIndex[2] + (CurrentFrame_ * NumVerticesPerFrame_)];
		NV1 = Vertices_[Triangles_[i].VertexIndex[0] + (NextFrame     * NumVerticesPerFrame_)];
		NV2 = Vertices_[Triangles_[i].VertexIndex[1] + (NextFrame     * NumVerticesPerFrame_)];
		NV3 = Vertices_[Triangles_[i].VertexIndex[2] + (NextFrame     * NumVerticesPerFrame_)];

		// Interpolate each vertex between the two key frames by t
		glTexCoord2f(TexCoords_[Triangles_[i].TextureIndex[0]].s, TexCoords_[Triangles_[i].TextureIndex[0]].t);
		glVertex3f(CV1.x + t * (NV1.x - CV1.x),
		           CV1.y + t * (NV1.y - CV1.y),
		           CV1.z + t * (NV1.z - CV1.z));
		glTexCoord2f(TexCoords_[Triangles_[i].TextureIndex[2]].s, TexCoords_[Triangles_[i].TextureIndex[2]].t);
		glVertex3f(CV3.x + t * (NV3.x - CV3.x),
		           CV3.y + t * (NV3.y - CV3.y),
		           CV3.z + t * (NV3.z - CV3.z));
		glTexCoord2f(TexCoords_[Triangles_[i].TextureIndex[1]].s, TexCoords_[Triangles_[i].TextureIndex[1]].t);
		glVertex3f(CV2.x + t * (NV2.x - CV2.x),
		           CV2.y + t * (NV2.y - CV2.y),
		           CV2.z + t * (NV2.z - CV2.z));
	}
	glEnd();
}
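
All three glVertex3f calls do the same thing: a straight linear interpolation between the vertex in the current key frame and the matching vertex in the next one. Pulled out into a little helper (just for illustration, it isn't actually in the class), it would look like this:
Vertex LerpVertex(const Vertex& a, const Vertex& b, float t)
{
	Vertex out;
	out.x = a.x + t * (b.x - a.x);   // t = 0 gives the current frame,
	out.y = a.y + t * (b.y - a.y);   // t = 1 gives the next frame,
	out.z = a.z + t * (b.z - a.z);   // anything in between blends the two
	return out;
}

And since every frame's vertices sit back to back in the one big buffer, the vertex for frame F is at VertexIndex[k] + F * NumVerticesPerFrame_, which is where the multiply comes from.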

I hope somebody can spot a reason why it's all choppy, because I've been looking at it all day and it looks correct to me. It's rather simple code. The model displays fine, and I can display any single frame statically. The problem only shows up when I run an animation: it's all choppy and nasty. I hope somebody here can see what I can't.

This is really odd, but I seem to have fixed it by doing something that seems wrong. I have a check to make sure t doesn't exceed 1 in my interpolation:

if(t > 1.0f) t = 0.0f;

I simply changed it to:

if(t >= 1.0f) t = 0.0f;

and it works. I thought 1 was a perfectly valid value for the interpolation? Thinking about it, though, by the time t comes back as exactly 1.0, CalculateT has already advanced CurrentFrame_, so drawing with t = 1.0 blends all the way into the frame after the new one, and the next render snaps back a whole frame. Resetting t to 0 at that point draws the new current frame exactly, so I guess that's what was causing it.
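
With made-up positions for one vertex (frame N at x = 0, N + 1 at x = 10, N + 2 at x = 20), a throwaway snippet shows what each version of the check ends up drawing at the instant the frame flips:
#include <cstdio>

int main()
{
	float frameX[] = { 0.0f, 10.0f, 20.0f };   // x of the same vertex in frames N, N+1, N+2

	// Just before the switch: CurrentFrame_ is still N and t is nearly 1
	float t = 0.99f;
	printf("before switch : x = %.2f\n", frameX[0] + t * (frameX[1] - frameX[0]));   // ~9.9

	// At the switch, CalculateT has already moved CurrentFrame_ to N+1 and returned t = 1.0.
	// With "if(t > 1.0f)" the reset doesn't fire, so we draw all the way at frame N+2:
	t = 1.0f;
	printf("old check     : x = %.2f  (a whole frame ahead)\n", frameX[1] + t * (frameX[2] - frameX[1]));

	// With "if(t >= 1.0f)" t becomes 0, so we draw frame N+1 exactly, right where we left off:
	t = 0.0f;
	printf("new check     : x = %.2f  (continuous)\n", frameX[1] + t * (frameX[2] - frameX[1]));

	return 0;
}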

It's weird: I'll search for something all day and can't fix it, but two minutes after I give up and post it to gamedev, I figure it out. Seriously, I've been screwing with this for 8 hours. This happens to me all the time. I post it and then fix it right after. Go figure.
