How would I set a "maximum FPS"?

Started by ursus; 17 comments, last by TDragon 18 years, 10 months ago
Quote: Original post by ursus
...I now consider setting a fixed frame rate for my next game.


I hope that's a fixed world update rate, not a fixed redraw rate...

Sure, it's really the world update rate! I don't think any human being would be able to function with a frame rate over 60!
I like to provide frame limiting as an option in my games. I personally don't like being locked into a specific framerate.

I would cache the value you are using to limit (1000 / 60) so you don't perform a divide each frame.
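Something along these lines, say; a minimal sketch using std::chrono (the cached constant plays the role of that 1000 / 60, and the sleep hands any spare time back to the OS):

[code]
#include <chrono>
#include <thread>

// Cached once: roughly 16.7 ms per frame for a 60 FPS cap.
const std::chrono::microseconds FRAME_TIME(1000000 / 60);

// Call at the bottom of the frame loop, with the time sampled at the top,
// and only when the user has the limiter switched on.
void limitFrame(std::chrono::steady_clock::time_point frameStart)
{
    auto elapsed = std::chrono::steady_clock::now() - frameStart;
    if (elapsed < FRAME_TIME)
        std::this_thread::sleep_for(FRAME_TIME - elapsed);
}
[/code]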
Quote: Original post by intrest86
Everything is going 60 because that is as fast as they can run. They are not arbitrarily limiting their games to 60; they are making games go as quickly as they can and being able to go 60.

What is going to happen to your game when the framerate dips to 30? That is still in the playable range, but if you are updating everything based on a "change per frame" method everything is going half as fast!


No, that's not as fast as they can run. My computer, not even a top-of-the-line machine, can get hundreds of FPS when unleashed on 3D-intensive tasks. The main reason 60 FPS is aimed for (in the US) is that NTSC signal transmission is 60 Hz. In any case, I intended that more as a side note to the fact that it's best to use the v-sync method. Of course you'll notice that I said "at most once for every vertical retrace". My poor little laptop can only get 30-40 FPS on the engine I'm designing right now, so of course rendering occurs less often and of course I have to account for that. Actually, you can't tell much of a difference, since the movement is still the same per unit time and for the most part is still less than a pixel's difference.

Oh, and @helix - Any compiler worth its salt will do the divide at compile time for you...not that it makes a nanosecond difference on a P4/Athlon...
JohnE, Chief Architect and Senior Programmer, Twilight Dragon Media | GCC/MinGW | Code::Blocks IDE | wxWidgets Cross-Platform Native UI Framework
Why would you ever want to artificially limit the FPS? You're going to have to account for non-constant frame rates anyway if any of the computers you plan to run the software on ever has a chance of falling below your predetermined rate. If the end user wants to limit their FPS, they can just turn on v-sync themselves.
By limiting the FPS to the refresh rate, you are making sure the program doesn't consume unnecessary CPU time, instead returning it to the OS where it can be used for other processes running in the background. Rule 1 of Programming: Always play nice with multitasking :) (well, maybe not rule 1, but pretty important nonetheless). And of course I'm not advocating that you code with a constant game state update frame rate in mind, but with a constant delta vs. time in mind. Two different cats, and a million ways of skinning each.
JohnE, Chief Architect and Senior Programmer, Twilight Dragon Media | GCC/MinGW | Code::Blocks IDE | wxWidgets Cross-Platform Native UI Framework
There is nothing wrong with limiting frame rate. Sometimes it's necessary: when the game runs too fast, you get FP inaccuracies (mostly in the delta time) severe enough to break the game. I'd limit it to 100 Hz personally. [grin]
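A sketch of one way to do that cap, by flooring the delta itself (plain C++; the numbers are just the 100 Hz choice above):

[code]
#include <chrono>
#include <thread>

// Never let a frame's delta drop below 10 ms (a 100 Hz cap), so it
// stays large enough for float precision to behave.
const float MIN_DT = 1.0f / 100.0f;

float nextDelta(float measuredDt)
{
    if (measuredDt < MIN_DT)
    {
        // Sleep off the excess so the cap also returns CPU time to the OS.
        std::this_thread::sleep_for(std::chrono::duration<float>(MIN_DT - measuredDt));
        return MIN_DT;
    }
    return measuredDt;
}
[/code]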

Everything is better with Metal.

Quote: Original post by oliii
There is nothing wrong with limiting frame rate. Sometimes it's necessary: when the game runs too fast, you get FP inaccuracies (mostly in the delta time) severe enough to break the game. I'd limit it to 100 Hz personally. [grin]


Which is why you fix the world update rate, which should be independent of the frame redraw rate; see my earlier post for an example of this.

Quote: Original post by TDragon
By limiting the FPS to the refresh rate, you are making sure the program doesn't consume unnecessary CPU time, instead returning it to the OS where it can be used for other processes running in the background. Rule 1 of Programming: Always play nice with multitasking :) (well, maybe not rule 1, but pretty important nonetheless). And of course I'm not advocating that you code with a constant game state update frame rate in mind, but with a constant delta vs. time in mind. Two different cats, and a million ways of skinning each.


Define 'unnecessary CPU time'?
If a game is running fullscreen, then the user has already given their agreement that it can run as fast as possible. V-syncing, blocking on IO, and other stalls will allow the pre-emptive OS you are running on to swap the task out as and when it wants to, so other processes won't starve. If a user turns off v-sync, they have effectively told you 'hey, render as fast as you can!'

So, my game, when it has focus, will use as much CPU time as required. Options will be given to allow the user to restrict frame rate if need be (thinking mostly of laptop users), and of course v-sync lets you limit frame rate as well (a blocking call, which can cause your thread to be swapped out, again freeing time on the CPU). However, all of these are user options, not enforced by the programmer in some draconian manner, but optionally available if the end user wants them for some reason. If the end user wants the program to throw out frames as fast as it can, who are we to stop them?

Hence my advocacy of a fixed world update step: it lets you render as fast as possible while stopping your calculations from becoming unstable when you update the world.
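The shape of that loop, sketched with the standard accumulator pattern (running(), updateWorld(), and render() are placeholders for your own engine hooks, and 100 Hz is just an example step):

[code]
#include <chrono>

bool running();              // placeholders for your own engine hooks
void updateWorld(float dt);
void render();

void gameLoop()
{
    using clock = std::chrono::steady_clock;
    const std::chrono::milliseconds step(10); // fixed 100 Hz world update
    auto previous = clock::now();
    std::chrono::nanoseconds accumulator(0);

    while (running())
    {
        auto now = clock::now();
        accumulator += now - previous;        // bank the real time that passed
        previous = now;

        while (accumulator >= step)           // consume it in fixed-size steps
        {
            updateWorld(0.01f);               // always the same, stable dt
            accumulator -= step;
        }
        render();                             // redraw as often as you like
    }
}
[/code]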

Frankly, if you can't see the advantages of this system then I despair; it's not like I don't leave a good chunk of code in every thread to help, either [smile]
Meh, I guess what I said didn't come out quite right. What I mean to say is, framerate can be limited by the programmer not to be higher than the user's refresh rate. There would be absolutely no difference in what is displayed onscreen at a framerate of 75 and at a framerate of 105 if the user has a 75 Hz refresh rate from their video card to their monitor, so anything above 75 FPS (in this case) is "unnecessary CPU time". That's not to say that every draw should necessarily be v-synced, because if you're running BELOW 75 FPS there will be a difference. So all I'm saying is, have the refresh rate as a maximum; in my case, by implementing a 3-state system that only renders if the v-sync indicator is 1 (which means we're getting max FPS equal to or greater than the refresh rate) or 2 (which means we're not matching the refresh rate). Each v-sync EVENT increments the state (up to 2); each render decrements it.
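In code, that gate looks something like this (names are illustrative, and how onVsyncEvent() gets called depends on whatever retrace notification your platform provides):

[code]
// 0 = already rendered this retrace, 1 = keeping pace with the refresh
// rate, 2 = running behind it.
int vsyncState = 0;

void render();                // placeholder draw call

void onVsyncEvent()           // invoked once per vertical retrace
{
    if (vsyncState < 2)
        ++vsyncState;
}

void maybeRender()            // invoked from the main loop
{
    if (vsyncState > 0)       // 1 or 2: a retrace has passed since the last draw
    {
        render();
        --vsyncState;
    }
    // At 0 we've already drawn for this retrace, so skip the frame and
    // let the spare CPU time go back to the OS.
}
[/code]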

Oh, and always keep in mind that the end user is a complete idiot who will destroy their computer if you make it easy enough for them. Not that this is something that could destroy a computer...but what I mean is, if render FPS above and beyond the refresh rate is completely meaningless, help the user out with more CPU time for other processes instead.

World update time, as we all know, is a completely different story. It can be fixed, tied to the renderer, or in a whole different thread, or whatever. Nobody really cares, as long as it works correctly. Keeping it in the same loop as the renderer is the easiest. Just have to make sure you calculate everything based on frame time.
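If you do keep it in the same loop, the frame-time scaling is the whole trick; a bare-bones sketch (Player, running(), and render() are again just placeholders):

[code]
#include <chrono>

struct Player { float x; float speed; };      // speed in pixels per second

bool running();                               // placeholder engine hooks
void render();

void gameLoop(Player& player)
{
    using clock = std::chrono::steady_clock;
    auto previous = clock::now();

    while (running())
    {
        auto now = clock::now();
        float dt = std::chrono::duration<float>(now - previous).count();
        previous = now;

        player.x += player.speed * dt;        // per second, not per frame
        render();
    }
}
[/code]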

Ta,
Twilight Dragon
JohnE, Chief Architect and Senior Programmer, Twilight Dragon Media | GCC/MinGW | Code::Blocks IDE | wxWidgets Cross-Platform Native UI Framework
