Game Time Management (C#)


Hello everyone, this is my first post on the forums!

I'm currently designing a game engine in C# using OpenTK. Note that I'm ditching the 'GameWindow' methods, like load, update and render, and creating my own.

Because of this, I need to create a way to manage time in my engine (FPS, frames, delta, etc.). So far I've created a class which sort of handles time correctly.

Currently, if I run a game at 60fps, everything works fine. However, if I run it at 120fps, I only get around 111fps (note: I'm not rendering anything to the screen). Another funny thing is that I can set the FPS even higher, say 300fps, and I only get around 250.

I'm not sure why this is happening; maybe it's the inaccuracy of the Stopwatch class. Here is the update method of my game time class:


        public void Update(float max_updates_per_second)
        {
            /*
             * watch: a System.Diagnostics.Stopwatch object
             * canProcess: tells the game engine whether we're allowed to update and render the next frame
             * wait_ticks: the number of milliseconds to wait before the next frame may be processed
             */

            float wait_ticks = SECOND / max_updates_per_second;

            // Milliseconds elapsed since the last processed frame.
            delta = watch.ElapsedMilliseconds - (float)lastTimeUpdatedFrames;

            if (watch.ElapsedMilliseconds >= lastTimeUpdatedFrames + wait_ticks)
            {
                frames++;
                canProcess = true;
                lastTimeUpdatedFrames = watch.ElapsedMilliseconds;
            }
            else
            {
                canProcess = false;
            }

            // Once a full second has passed, publish the frame count as the FPS.
            if (watch.ElapsedMilliseconds >= lastTimeUpdatedFPS + SECOND)
            {
                fps = frames;
                frames = 0;
                lastTimeUpdatedFPS = watch.ElapsedMilliseconds;
            }
        }
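
For reference, here is a minimal sketch of reading the stopwatch at sub-millisecond resolution via ElapsedTicks and Stopwatch.Frequency (just an illustration with placeholder names, not what my class currently does):

    using System.Diagnostics;

    public class GameTime
    {
        private readonly Stopwatch watch = Stopwatch.StartNew();

        // Elapsed milliseconds with sub-millisecond precision;
        // Stopwatch.Frequency is the number of stopwatch ticks per second.
        public double ElapsedMs => watch.ElapsedTicks * 1000.0 / Stopwatch.Frequency;
    }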

Any help on why my game time isn't accurate would be great!


            if (watch.ElapsedMilliseconds >= lastTimeUpdatedFrames + wait_ticks)
            {
                frames++;

You only ever detect and count 1 frame, even if multiple frames have passed.
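
A rough sketch of the catch-up form, reusing your names purely for illustration (and note the cast still truncates fractional intervals such as 8.33 ms):

            // Run one logical step per elapsed interval, not at most one per call.
            while (watch.ElapsedMilliseconds >= lastTimeUpdatedFrames + wait_ticks)
            {
                frames++;
                // Advance by the interval instead of snapping to "now",
                // so accumulated time isn't thrown away.
                lastTimeUpdatedFrames += (long)wait_ticks;
            }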


Anyway, your whole concept is a bug.
There is no such thing as "can the engine render?"; the game renders every single frame. You don't tell the engine to speed up to a specific speed; you limit it to a specific speed, optionally.
The game loop runs as fast as it can (unless in special cases you have systems to preserve battery life etc.) and checks how many logical updates need to be made, and then renders once.
You reach target framerates via hardware support from v-blanks, and you only apply artificial limiters to the number of logical updates that can be made in a frame.
It should never be possible to call that function without producing a render, and you haven’t separated logic from rendering.
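
A bare-bones sketch of that shape, where the 60-updates-per-second rate, the delegates, and the names are all placeholders rather than code from any particular engine:

    using System;
    using System.Diagnostics;

    static void RunLoop(Func<bool> running, Action<double> updateLogic, Action render)
    {
        const double UpdatesPerSecond = 60.0;           // placeholder logic rate
        const double FixedStep = 1.0 / UpdatesPerSecond; // seconds per logical update

        var clock = Stopwatch.StartNew();
        double previous = clock.Elapsed.TotalSeconds;
        double accumulator = 0.0;

        while (running())
        {
            double now = clock.Elapsed.TotalSeconds;
            accumulator += now - previous;
            previous = now;

            // Catch up on as many fixed logical updates as have accumulated.
            while (accumulator >= FixedStep)
            {
                updateLogic(FixedStep);
                accumulator -= FixedStep;
            }

            // Render exactly once per loop iteration; v-sync (the swap interval)
            // is what actually caps the visible framerate.
            render();
        }
    }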


L. Spiro

I restore Nintendo 64 video-game OSTs into HD! https://www.youtube.com/channel/UCCtX_wedtZ5BoyQBXEhnVZw/playlists?view=1&sort=lad&flow=grid

