MattieMan
Hello everyone, this is my first post on the forums! I'm currently designing a game engine in C#, using OpenTK. Note that I'm ditching the GameWindow methods (Load, Update, Render, etc.) and creating my own. Because of this, I need a way to manage time in my engine (FPS, frame counting, delta time, and so on).

So far I've created a class that can sort of handle time correctly. If I run a game at 60 FPS, everything works fine. However, if I set it to 120 FPS, I only get around 111 FPS (note: I'm not rendering anything to the screen). Another funny thing is that if I set the FPS even higher, say 300, I only get around 250. I'm not sure why this is happening; maybe it's the inaccuracy of the Stopwatch class.

Here is the Update method of my game time class:

public void Update(float max_updates_per_second)
{
    /*
     * watch:      a System.Diagnostics.Stopwatch object
     * canProcess: tells the game engine whether we're allowed to update and render the next frame
     * wait_ticks: the number of milliseconds to wait until we can process the next frame
     */
    float wait_ticks = SECOND / max_updates_per_second;

    delta = watch.ElapsedMilliseconds - (float)lastTimeUpdatedFrames;

    if (watch.ElapsedMilliseconds >= lastTimeUpdatedFrames + wait_ticks)
    {
        frames++;
        canProcess = true;
        lastTimeUpdatedFrames = watch.ElapsedMilliseconds;
    }
    else
    {
        canProcess = false;
    }

    if (watch.ElapsedMilliseconds >= lastTimeUpdatedFPS + SECOND)
    {
        fps = frames;
        frames = 0;
        lastTimeUpdatedFPS = watch.ElapsedMilliseconds;
    }
}

Any help on why my game time isn't accurate would be great!
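
[For reference, one possible explanation, not confirmed in the post: Stopwatch.ElapsedMilliseconds only has whole-millisecond resolution, and snapping lastTimeUpdatedFrames to "now" each time the check passes discards the fractional overshoot. A 120 FPS target (8.33 ms per frame) then effectively becomes a 9 ms step (about 111 FPS), and a 300 FPS target (3.33 ms) becomes a 4 ms step (250 FPS), which matches the numbers reported above. Below is a minimal sketch of an accumulator-based timer that measures time from Stopwatch.ElapsedTicks instead; the class and member names (GameTime, CanProcess, Fps, etc.) are illustrative, not taken from the original engine.

using System;
using System.Diagnostics;

// Hypothetical sketch: an accumulator-based game timer that uses Stopwatch
// ticks rather than ElapsedMilliseconds, so sub-millisecond frame periods
// are not rounded away. All names here are illustrative only.
public class GameTime
{
    private readonly Stopwatch _watch = Stopwatch.StartNew();
    private double _accumulator;   // time (in seconds) owed to the simulation
    private double _lastTime;      // timestamp of the previous Update call, in seconds
    private double _fpsTimer;      // time since the FPS counter was last reset
    private int _frames;           // frames processed during the current FPS window

    public bool CanProcess { get; private set; }
    public float Delta { get; private set; }   // seconds handed to each processed frame
    public int Fps { get; private set; }

    public void Update(float maxUpdatesPerSecond)
    {
        double step = 1.0 / maxUpdatesPerSecond;

        // ElapsedTicks / Frequency gives seconds at the timer's full
        // resolution (typically well under a microsecond).
        double now = (double)_watch.ElapsedTicks / Stopwatch.Frequency;
        double elapsed = now - _lastTime;
        _lastTime = now;

        _accumulator += elapsed;
        _fpsTimer += elapsed;

        if (_accumulator >= step)
        {
            // Subtract the step instead of resetting to the current time,
            // so the fractional overshoot carries into the next frame.
            _accumulator -= step;
            Delta = (float)step;
            CanProcess = true;
            _frames++;
        }
        else
        {
            CanProcess = false;
        }

        if (_fpsTimer >= 1.0)
        {
            Fps = _frames;
            _frames = 0;
            _fpsTimer -= 1.0;
        }
    }
}

The two differences from the posted method are measuring with ElapsedTicks so an 8.33 ms period is not rounded up to 9 ms, and carrying the leftover time forward in the accumulator instead of discarding it.]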