
SDL and Frame Timing


Hi there!

I've been working on writing my own game engine. The sad part is, while I've figured out libraries to load models and I have 3D sound working, I just don't know how to freaking handle frame timing.

Basically, my graphics card starts to shriek when running at a high frame rate, which VSync or a frame limiter fixes. Or maybe I'm just a terrible programmer, I dunno, perhaps both.

Regardless, to the code!

Here is the basic loop I have set up at the moment:

while (!m_bQuit)
{
    Time->Think();

    while (SDL_PollEvent(&event))
    {
        if ((event.type == SDL_KEYDOWN && event.key.keysym.sym == SDLK_ESCAPE) || event.type == SDL_QUIT)
            Terminate();
    }

    if (Time->ShouldUpdate())
    {
        AMBehavior::MasterUpdate();

        Renderer->Draw();
    }
}


AMBehavior's MasterUpdate() iterates through a list of objects that inherit from it and calls their respective update functions. Time inherits from AMBehavior, but I also gave it a Think() that runs outside of MasterUpdate(), so time can be managed and I can print out a frame count.

Here are the 3 relevant functions I have for my time manager:

bool TimeMgr::ShouldUpdate()
{
    if (ticks >= 1000.0f / AMEngine->GetMaxFPS() / 1000.0f)
    {
        ticks = 0.0f;
        return true;
    }

    return false;
}

void TimeMgr::Update()
{
    FPS++;

    if (frameTicks >= 1.0f)
    {
        frameTicks = 0.0f;
        std::cout << "FPS: " << FPS << std::endl;
        FPS = 0;
    }
}

void TimeMgr::Think()
{
    elapsedTime = (float)SDL_GetTicks();
    ticks += deltaTime = (elapsedTime - prevElapsedTime) / 1000.0f;
    frameTicks += deltaTime;

    prevElapsedTime = elapsedTime;
}


Aside from some of the calculations being inefficient (that 1000.0f / AMEngine->GetMaxFPS() / 1000.0f threshold is just 1.0f / GetMaxFPS()), this does work... sort of.

You see, when I have the engine's max FPS set to 60, it works as I would expect: the FPS prints as 59.

However, when I set the engine's max FPS to, say, 300, my calculated FPS is only about 250.

If I set my max to 3000, my calculated value is only about 1000, even though I know I can get more frames than that. What gives? Am I using deltaTime incorrectly here? What silly thing am I doing wrong?

Edited by Spirrwell


Well there's a reason I made my avatar here of Homer Simpson. D'oh.

Regardless, I was coming at this wrong. All I should be throttling is the actual rendering, because that's what matters, not how often my game updates (though a fixed-rate update function makes sense too).

For anybody interested in yet another one of these threads, here ya go, here's the code that made this work:

Loop:

while (!m_bQuit)
{
    while (SDL_PollEvent(&event))
    {
        if ((event.type == SDL_KEYDOWN && event.key.keysym.sym == SDLK_ESCAPE) || event.type == SDL_QUIT)
            Terminate();
    }

    AMBehavior::MasterUpdate();

    if (Time->ShouldRender())
        Renderer->Draw();
}


Time Code (No Traveling Allowed (that's a joke)):

bool TimeMgr::ShouldRender()
{
    if (ticks >= 1.0f / AMEngine->GetMaxFPS())
    {
        ticks -= 1.0f / AMEngine->GetMaxFPS();
        return true;
    }

    return false;
}

void TimeMgr::Update()
{
    elapsedTime = (float)SDL_GetTicks();
    ticks += deltaTime = (elapsedTime - prevElapsedTime) / 1000.0f;

    prevElapsedTime = elapsedTime;
}


Then finally, where I calculate the FPS, in my renderer!:

void RenderingEngine::Update()
{
    frameTicks += Time->deltaTime;
}

void RenderingEngine::Draw()
{
    FPS++;

    if (frameTicks >= 1.0f)
    {
        frameTicks = 0.0f;
        std::cout << "FPS: " << FPS - 1 << std::endl;
        FPS = 0;
    }

    ...