frame rates and refresh rates


On the PC, if your FPS is higher than the monitor's refresh rate, you only get the refresh rate as the number of images presented to the player per second - right?

So if I'm at 172 fps (pulling a number out of thin air), and the monitor is at 60Hz, the player sees 60 images per second - right?

So there's really no need to go faster than the refresh rate? You're writing data to vidram that never gets sent to the monitor? Overwritten before it ever gets read?

And refresh rates are typically some multiple of 30Hz?

I'm running Caveman 3.0 on the new i7-6700K at 4GHz and a GTX 1080. With the framerate limiter off, I show 62 fps - due to vsync, no doubt.

I'm trying to figure out how to take advantage of these new speeds.

A standard game loop framerate-limited to 60Hz or so would be nice: less work than fix-your-timestep, and no temporal aliasing.

The game is currently coded for a 15Hz update, so I'd have to mod all update code to run at a different speed, such as 30Hz or 60Hz (or 120Hz?).

Norm Barrows

Rockland Software Productions

"Building PC games since 1989"

rocklandsoftware.net

PLAY CAVEMAN NOW!

http://rocklandsoftware.net/beta.php

On the PC, if your FPS is higher than the monitor's refresh rate, you only get the refresh rate as the number of images presented to the player per second - right?

So if I'm at 172 fps (pulling a number out of thin air), and the monitor is at 60Hz, the player sees 60 images per second - right?


Well, yes and no.
The screen will only do a redraw every 1/60th of a second, but the player might well see parts of three different frames as the source data gets updated while the drawing is happening - and it'll be a torn mess.

So there's really no need to go faster than the refresh rate? You're writing data to vidram that never gets sent to the monitor? Overwritten before it ever gets read?


Not if you want a clean image, no.
The data will still get sent to the monitor; you'll just see artefacts in the image.
Example: the monitor is drawing one new frame every 16ms but you are presenting new data every 8ms, so halfway down the screen the new frame of data will be presented, which will likely be noticeable as a tear.
(Things won't get 'overwritten' at the GPU stage, because you'll be presenting a different back buffer each time.)
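
As a minimal sketch of how this is controlled in code - assuming a D3D11/DXGI-style swap chain, which the posts here don't actually specify - the sync interval passed to Present() decides whether the swap waits for the vertical blank:

    #include <dxgi.h>

    // Minimal sketch: present with or without vsync. The swap chain is
    // assumed to already exist (creation omitted for brevity).
    void PresentFrame(IDXGISwapChain* swapChain, bool vsync)
    {
        // SyncInterval 1: wait for the next vertical blank - no tearing,
        // frame rate capped at the monitor's refresh rate.
        // SyncInterval 0: present immediately - in exclusive fullscreen the
        // buffer swap can land mid-scanout, producing the tearing described above.
        swapChain->Present(vsync ? 1 : 0, 0);
    }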

And refresh rates are typically some multiple of 30Hz?


No.
Some monitors can run at 72Hz, others at 144Hz, for example; others, such as FreeSync or G-Sync monitors, have no fixed update rate at all and will refresh on demand.

I'm running Caveman 3.0 on the new i7-6700K at 4GHz and a GTX 1080. With the framerate limiter off, I show 62 fps - due to vsync, no doubt.


Do not assume; profile.
Check what the profiler says you are doing, both CPU- and GPU-wise.
Never guess.
Always use tools.

And if your framerate limiter is off, then you'd see 60fps if you were vsync-locked to a monitor with a 60Hz max refresh.
If you have vsync off as well and are only seeing 62fps, then the limit isn't vsync.

The game is currently coded for a 15Hz update, so I'd have to mod all update code to run at a different speed, such as 30Hz or 60Hz (or 120Hz?).


This is why games decouple their updates so that everything can run at different speeds and adapt to the system they are on, which is part of the reason why Fix Your Timestep is such widely given advice.

The games which don't do that tend to be console titles, where they tune to the hardware and then limit the PC version in some manner; such as games which update at 30Hz and rely on vsync to keep things in check... and then go wrong when vsync is disabled.
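
As a minimal sketch of that decoupled pattern - the Update/Render names here are placeholders, not anything from the actual game - the simulation advances in fixed steps while rendering runs as fast as the machine (or vsync) allows:

    #include <chrono>

    void Update(double dt);     // advance simulation by a fixed step
    void Render(double alpha);  // draw, blending by alpha between ticks

    void GameLoop(bool& quit)
    {
        using clock = std::chrono::steady_clock;
        const std::chrono::duration<double> dt(1.0 / 60.0); // fixed 60Hz step

        auto previous = clock::now();
        std::chrono::duration<double> accumulator(0.0);

        while (!quit)
        {
            auto now = clock::now();
            accumulator += now - previous;
            previous = now;

            // Run as many fixed simulation steps as the elapsed time demands.
            while (accumulator >= dt)
            {
                Update(dt.count());
                accumulator -= dt;
            }

            // Render at whatever rate the machine (or vsync) allows,
            // passing the leftover fraction of a tick for interpolation.
            Render(accumulator / dt);
        }
    }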

The game is currently coded for a 15Hz update, so I'd have to mod all update code to run at a different speed.


Running at low update rates is often fine, depending on how twitch-heavy the game is. You can use interpolation to get smooth graphics at arbitrary framerates without touching much of the game code. The downside of interpolation is that it creates a gameplay lag of up to the length of one game tick (so in this case, at 15Hz, up to ~67ms of lag). Games will often then break up gameplay systems to run at different update rates, e.g. physics + player input at 60Hz or higher, while AI and other systems might run as low as 10Hz, and even the HUD/UI might update at a different rate than the main graphics.

You could start with adding interpolation and seeing if the lag is meaningful for your game. If it is, consider what it'd take to split out physics (and input / player controller) from the rest of your game logic's update rate, since that's really all you need to make the game feel more responsive to (most) controls.
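
A minimal sketch of that interpolation - the Entity fields here are illustrative, not from Caveman's actual code - keeps the previous and current tick's state and blends between them at render time:

    struct Vec3 { float x, y, z; };

    struct Entity
    {
        Vec3 prevPos;  // position at the previous simulation tick
        Vec3 currPos;  // position at the most recent tick
    };

    // alpha in [0,1]: how far render time has advanced between the last
    // simulation tick and the next one.
    Vec3 GetRenderPosition(const Entity& e, float alpha)
    {
        return { e.prevPos.x + (e.currPos.x - e.prevPos.x) * alpha,
                 e.prevPos.y + (e.currPos.y - e.prevPos.y) * alpha,
                 e.prevPos.z + (e.currPos.z - e.prevPos.z) * alpha };
    }

The same blend applies to orientation (ideally as a quaternion slerp) and animation frame; the rendered state trails the simulation by at most one tick, which is the lag described above.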

Sean Middleditch – Game Systems Engineer – Join my team!

Keep in mind that things are a bit different between windowed and non-windowed modes on all recent versions of Windows (Vista+). In windowed mode your app doesn't directly present to the screen: it presents to the compositor (DWM), and the compositor combines your window's contents with other windows and draws the whole thing to the screen. So if you run unlocked with no VSYNC in windowed mode you won't see tearing, but you may see some juddering due to uneven frame times. In exclusive fullscreen mode your app "owns" the display, in which case the front buffer gets scanned out right to the screen. In this scenario you will get tearing if you don't enable VSYNC, and you can also see juddering due to uneven frame times.
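
As a minimal sketch of requesting the exclusive fullscreen path - assuming an existing DXGI swap chain, with error handling omitted:

    #include <dxgi.h>

    // In windowed mode, Present() hands the frame to the DWM compositor.
    // Requesting exclusive fullscreen lets the swap chain scan out directly
    // to the display, where tearing becomes possible with vsync off.
    void EnterExclusiveFullscreen(IDXGISwapChain* swapChain)
    {
        swapChain->SetFullscreenState(TRUE, nullptr); // nullptr = default output
    }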

One possible advantage of running faster than the refresh rate is that there may be less latency between input->display, at least if you're polling input and updating game state at the same rate that you're rendering. But that latency will be variable since you're effectively decoupled from the display rate of the monitor.

I'm trying to figure out how to take advantage of these new speeds.

If you were able to render at 600 fps, you could blend 10 frames together for both motion blur and antialiasing.

Adding fake screen-space motion blur to each frame before the blend would then come very close to ground truth.
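
The blend itself is just a per-pixel weighted average. A minimal CPU-side sketch of the math (in practice you'd accumulate on the GPU, e.g. by additive blending into a render target):

    #include <cstddef>
    #include <vector>

    // Average N rendered frames into one displayed frame, e.g. 10 frames
    // at 600 fps -> one 60Hz frame with motion blur. Assumes at least one
    // frame and equal-sized pixel buffers.
    std::vector<float> BlendFrames(const std::vector<std::vector<float>>& frames)
    {
        std::vector<float> result(frames[0].size(), 0.0f);
        const float weight = 1.0f / frames.size();   // 1/10 for 10 frames
        for (const auto& frame : frames)
            for (std::size_t i = 0; i < frame.size(); ++i)
                result[i] += frame[i] * weight;      // accumulate weighted pixels
        return result;
    }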

Sounds impractical for now, but I think this will soon start making sense in combination with object-space / multi-resolution shading, variable resolution per tile, foveated rendering...

In another thread you made the statement that 24 fps is enough for movies, so also enough for the eye in general, but that's wrong for games because they have no 'free' motion blur.

A frame of a movie captures a time span of up to 1/24 s, while a game shows only a single instant of time 60 times per second, so a movie can still look smoother even though it has fewer fps.

(This said without further discussion about camera shutter settings, movies targeting 48 fps for good reasons, how differently people notice low fps, and so forth.)

If you have vsync off as well and are only seeing 62fps, then the limit isn't vsync.

I'm pretty sure vsync is on.

Sounds like you still need vsync on to avoid tearing - correct?

And you can't really count on any specific refresh rate.

So the choices are: decouple (a la fix-your-timestep), or limit the rate to "fast enough" - whatever your definition of that is (30fps, 60fps, etc.). No other options, right? I can't think of any...
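
For the "fast enough" option, a minimal limiter sketch - the names and the sleep-based approach are illustrative, and note that sleep granularity on Windows is only about a millisecond at best:

    #include <chrono>
    #include <thread>

    // Cap the loop at targetHz by sleeping off the remainder of each frame.
    void LimitFrameRate(std::chrono::steady_clock::time_point frameStart,
                        double targetHz)
    {
        const std::chrono::duration<double> frameTime(1.0 / targetHz);
        auto elapsed = std::chrono::steady_clock::now() - frameStart;
        if (elapsed < frameTime)
            std::this_thread::sleep_for(frameTime - elapsed);
    }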


Norm Barrows

Rockland Software Productions

"Building PC games since 1989"

rocklandsoftware.net

PLAY CAVEMAN NOW!

http://rocklandsoftware.net/beta.php

What about detecting the refresh rate, then using a variable framerate limiter to match the refresh rate, or half the refresh rate in the case of high refresh rates that the game just can't do?

That would let render run as fast as refresh (assuming it could do that), without the need for fix-your-timestep. You get a simpler loop, and no temporal aliasing.

With fixed timestep and vsync on, a game would run at vsync. With vsync off it would run at max render speed with possible tearing. I don't want tearing, so I guess keep vsync on.

Keeping vsync on, it seems a basic game loop framerate-limited to the refresh rate yields the same results as fixed timestep, with less work. You have a multiplier for update based on refresh rate - sort of like ET in fixed timestep - but that's it. No current and previous states, no lerping location, orientation, and tween. And what you render is always the current state - no temporal aliasing. And you process input every render, and render after every update, so the player experiences minimum lag.

Guess the big question is: can DX determine the refresh rate with certainty?
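
One route - a minimal sketch using plain Win32 rather than DirectX proper (DXGI can also report the mode the swap chain was created with) - queries the primary display's current mode. The caveats above stand: the reported value can be rounded (59 vs 59.94Hz), and variable-refresh monitors blur the whole notion of "the" refresh rate:

    #include <windows.h>

    // Returns the primary display's current refresh rate in Hz, or 0 on failure.
    // Note: a result of 0 or 1 means "hardware default", not a real rate.
    int GetPrimaryRefreshRate()
    {
        DEVMODE mode = {};
        mode.dmSize = sizeof(mode);
        if (EnumDisplaySettings(nullptr, ENUM_CURRENT_SETTINGS, &mode))
            return static_cast<int>(mode.dmDisplayFrequency);
        return 0;
    }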

Norm Barrows

Rockland Software Productions

"Building PC games since 1989"

rocklandsoftware.net

PLAY CAVEMAN NOW!

http://rocklandsoftware.net/beta.php

It sounds like you're describing, in very long terms, what is actually the simplest game loop possible:

  1. update
  2. render
  3. wait for vsync
  4. goto 1

If the refresh rate is 60Hz, and your game loop concludes in under 1/60th of a second, you'll run at 60fps. If your game loop is between 1/60th of a second and 1/30th, you'll run at 30fps, because you miss odd-numbered vsyncs but are ready for every even-numbered one.

You still need to consider the timestep because you don't know, without trying, whether you are going to be able to run 1 frame in 1/60th of a second or 1/30th of a second, regardless of the hardware refresh rate. But you can work with multiples of the frame rate. And yes, DirectX can query the current display mode's refresh rate.
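
In code, that loop is about as short as it sounds. A minimal sketch, assuming a D3D11-style swap chain and placeholder Update/Render functions - Present(1, 0) supplies the "wait for vsync" step once the driver's frame queue fills:

    #include <dxgi.h>

    void Update();
    void Render();

    void SimpleVsyncLoop(IDXGISwapChain* swapChain, bool& quit)
    {
        while (!quit)
        {
            Update();                  // 1. update
            Render();                  // 2. render
            swapChain->Present(1, 0);  // 3. wait for vsync
        }                              // 4. goto 1
    }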

It sounds like you're describing, in very long terms, what is actually the simplest game loop possible:

Yeah - basically.

I've always used loops of the form:

  while (!quitgame)
  {
      start_timer();   // record the frame start time
      render();
      input();
      update();
      wait_for_et();   // wait until the desired ET has elapsed,
                       // e.g. ~66ms for 15Hz, or ~33ms for 30Hz
  }

There was never any thought about "render as fast as possible", just "keep it from running too fast".

And frankly, I'm still not worried about rendering as fast as possible. I'd much prefer a lockstep loop that's "fast enough", especially given that the extra speed of fixed timestep seems to get limited by vsync and tearing issues anyway.

In the end, neither implementation is that difficult to add: an extra location and orientation, and 3 lerps (location, orientation, animation frame) per entity. A variable framerate limiter would need an ET value, just like fixed timestep. Something hard-coded to 60Hz would be simplest.

I think my first step, while I wait for further opinions, will be to make the framerate limiter variable and add a multiplier to update, so I can test it at 15, 30, 60Hz, etc. I may just code it to run at 60 or 30Hz and get on with life.
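
A minimal sketch of that plan - the names and the 15Hz-baseline multiplier are illustrative, not Caveman's actual code:

    #include <chrono>
    #include <thread>

    void render();
    void input();
    void update(double mult);   // mult scales the 15Hz-tuned movement rates

    double g_targetHz = 60.0;   // variable framerate limiter target

    void RunFrame()
    {
        auto frameStart = std::chrono::steady_clock::now();

        render();
        input();
        update(15.0 / g_targetHz);  // 1.0 at 15Hz, 0.5 at 30Hz, 0.25 at 60Hz

        // variable framerate limiter: sleep off the rest of the frame
        const std::chrono::duration<double> frameTime(1.0 / g_targetHz);
        auto elapsed = std::chrono::steady_clock::now() - frameStart;
        if (elapsed < frameTime)
            std::this_thread::sleep_for(frameTime - elapsed);
    }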

The ironic thing about the new 6700K / GTX 1080 running both Skyrim SE and Caveman 3.0 is that the number one thing I notice is that I need better plant models! <g> I don't really notice that one is vsync-limited fixed timestep at 60Hz refresh and the other is locked at 15Hz. But I don't plan on keeping Caveman locked at 15Hz - 30 or 60 maybe, or fixed timestep, or variable framerate. Actually 30 sounds easiest: it should be fast enough, and it gives you 33ms to do everything, which is better than just 16ms to do everything.

Norm Barrows

Rockland Software Productions

"Building PC games since 1989"

rocklandsoftware.net

PLAY CAVEMAN NOW!

http://rocklandsoftware.net/beta.php

I don't really notice that one is vsync-limited fixed timestep at 60Hz refresh and the other is locked at 15Hz

That varies a lot between individuals; watch out that you don't get biased by your own perception of it. Many people are very picky about this and really want 60 fps.

I, for instance, want 60fps for a 3D game with mouse-based view control, but I don't care about high or low display resolution. Others are just the other way around.

You should probably give the user control over all this. Frame interpolation for fixed timestep is the main requirement, but it's not much work.
