Currently, I determine frames per second like this:
void GameTimer::tick()
{
    frameCount_++;
    fpsFrameCount_++;

    currentTime_ = clock.now();
    durationDouble time_span = currentTime_ - previousTime_;
    deltaTime_ = time_span.count();
    totalTime_ += deltaTime_;
    fpsTimeCount_ += deltaTime_;
    previousTime_ = currentTime_;

    // Count the number of frames displayed each second.
    if (fpsTimeCount_ > 1.0f)
    {
        fps_ = fpsFrameCount_;
        fpsFrameCount_ = 0;
        fpsTimeCount_ = 0.0f;
    }
}
tick() is called every iteration of the game loop and increments fpsFrameCount_. After one second has elapsed, I store the frame count in fps_ and reset the counter.
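For context, here is a stripped-down, self-contained sketch of the same technique, assuming durationDouble aliases std::chrono::duration<double> and the clock is a std::chrono::steady_clock (which the time_point subtraction and the .count() call suggest); frameCount_, totalTime_, and deltaTime_ are omitted for brevity:

#include <chrono>
#include <cstdio>

// Minimal sketch of the frame counter above; names are illustrative.
class GameTimer
{
public:
    void tick()
    {
        fpsFrameCount_++;

        const auto currentTime = Clock::now();
        const std::chrono::duration<double> time_span = currentTime - previousTime_;
        fpsTimeCount_ += time_span.count(); // elapsed seconds since last tick
        previousTime_ = currentTime;

        // Once a full second has accumulated, publish and reset the count.
        if (fpsTimeCount_ > 1.0)
        {
            fps_ = fpsFrameCount_;
            std::printf("fps: %d\n", fps_);
            fpsFrameCount_ = 0;
            fpsTimeCount_ = 0.0;
        }
    }

private:
    using Clock = std::chrono::steady_clock;
    Clock::time_point previousTime_ = Clock::now();
    double fpsTimeCount_ = 0.0;
    int fpsFrameCount_ = 0;
    int fps_ = 0;
};

int main()
{
    GameTimer timer;
    // Stand-in for the real game loop; each iteration is one "frame".
    for (int i = 0; i < 100000000; ++i)
        timer.tick();
}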
I am confused, however, because it typically ends up being over 1000 fps. If it drops below 400, I start to notice lag, and if it ran at 60 fps it would be cripplingly slow. But isn't 60 fps what most games run at? That is my understanding, anyway. Why don't the numbers I'm seeing match up with that, and is this method incorrect?
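For reference, the per-frame budget each of those rates implies is just frame time = 1000 ms / fps; a quick sketch of that arithmetic:

#include <cstdio>

int main()
{
    // Time available per frame at each rate: frame time = 1000 ms / fps.
    const double rates[] = { 60.0, 400.0, 1000.0 };
    for (const double fps : rates)
        std::printf("%7.0f fps -> %6.2f ms per frame\n", fps, 1000.0 / fps);
    // 60 fps -> 16.67 ms, 400 fps -> 2.50 ms, 1000 fps -> 1.00 ms
}

So at the rates I'm measuring, each frame takes about 1 ms, whereas a 60 fps game spends roughly 16.7 ms per frame.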