30fps versus 60fps

I have a hard time understanding how 60 frames per second versus 30 fps can be noticeable in a game. The human eye detects a little under 30 frames per second. I can understand that it could appear choppy if there were no motion blur, but if your game supports motion blur then visually there should be no perceived difference. The response time to push a button on a controller is much, much slower than the human eye; it's different for all ages, but for someone in their 20s I think it's about 180 ms off the top of my head. What, then, makes a game appear slower? Could it be more of a psychological effect on the user? -= Dave
Graphics Programmer - Ready At Dawn Studios
Motion blur in games (generally) works differently than motion blur on film.

Film catches the movement between two frames as motion blur.
Games (usually) generate motion blur by accumulating data from _previous_ frames, possibly masking "choppiness", but certainly not adding any information about what happened between two frames like film does.
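As a rough sketch of what that accumulation approach looks like (simplified; the buffer struct and blend weight are made up for illustration, not any particular engine's code):

#include <cstddef>
#include <vector>

// One RGB float image: width * height * 3 channels.
struct FrameBuffer {
    int width = 0;
    int height = 0;
    std::vector<float> rgb;
};

// Blend the freshly rendered frame into a running accumulation buffer.
// The "blur" is built only from frames that were already drawn, so it smears
// what happened before; it never reconstructs what happened between two
// frames the way a film exposure does.
void accumulateMotionBlur(const FrameBuffer& current, FrameBuffer& accum, float blend)
{
    if (accum.rgb.size() != current.rgb.size()) {
        accum = current;  // first frame (or a resize): nothing to blend with yet
        return;
    }
    for (std::size_t i = 0; i < current.rgb.size(); ++i) {
        // e.g. blend = 0.25f keeps 75% of the history every frame
        accum.rgb[i] = blend * current.rgb[i] + (1.0f - blend) * accum.rgb[i];
    }
}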

Secondly, those famous 24fps for film are the bare minimum to perceive animation as fluid. The human brain can and will process information at a much higher framerate.

Edit:
I found a pretty thorough discussion of that topic here.
I think it's because FPS is not constant. Having an average FPS of 30 means that at times the FPS is less than 30, and at times it is greater than 30. A game averaging 30 FPS could therefore dip below the minimum frame rate necessary to look fluid.
Placeholder for better sig.
I've noticed this too. A game running at an even 30 fps does not look nearly as smooth as 60 fps. I could be wrong, but I think it works similarly to sound sampling rates. The ear can't perceive much above 20 kHz, but CDs are sampled at 44.1 kHz, and the difference from lower sample rates is noticeable. If you sample at a rate equal to the highest frequency your hearing extends to, it doesn't necessarily mean your ear is receiving the full signal; there's still room for higher frequencies to occur between the samples. Therefore, in sound engineering, you need roughly DOUBLE the highest perceptible frequency as your sampling rate to get a clean sound.

I'm assuming the same goes for visual media. I've never looked into it so I could be wrong. It does seem to make sense that, like audio media, you need double the frame rate to achieve smooth animation.
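For what it's worth, here's a tiny aliasing demo of the doubling rule I have in mind (my own toy numbers, nothing to do with any real audio hardware): a 4 Hz cosine sampled at only 5 samples per second produces exactly the same samples as a 1 Hz cosine, so the sampler literally cannot tell them apart.

#include <cmath>
#include <cstdio>

int main()
{
    const double pi = 3.14159265358979323846;
    const double fs = 5.0;  // samples per second, deliberately below 2 * 4 Hz

    for (int n = 0; n < 6; ++n) {
        double t = n / fs;
        std::printf("t=%.1fs  4 Hz: %+.3f   1 Hz: %+.3f\n",
                    t,
                    std::cos(2.0 * pi * 4.0 * t),
                    std::cos(2.0 * pi * 1.0 * t));
    }
    // Both columns print identical values: the 4 Hz tone has aliased down to 1 Hz.
    return 0;
}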
Quit screwin' around! - Brock Samson
It's true that for pure reaction times people can't react in much under 100 ms without training (fighter pilots and FPS gamers, for example); however, people can predict moving things with roughly 1 ms accuracy. In most action games people don't respond to sudden random stimuli (i.e. random stuff popping out of the air like in psych tests), but rather react to continuous stimuli over time (i.e. tracking a bad guy moving through 3D space, trying to return a ball, etc.).

In those cases 60 fps vs 30 fps has a perceptible difference in terms of rate of interaction and visual quality of motion (irrespective of motion blurring). At 60 fps the user has 2x the chances to make corrections to their aim and tracking, and the feedback loop is half as long over the same time period as in a 30 fps game. This is why the best action games aim for 60 fps (the Call of Duty series, the Quake series, etc.). Motion blurring helps increase visual quality for fast-moving objects, but by its very nature it also obscures an object's exact position, which could be an issue for fast-moving games.

Enjoy!

-ddn
Quote:Original post by Rattenhirn
Motion blur in games (generally) works differently than motion blur on film.

Film catches the movement between two frames as motion blur.
Games (usually) generate motion blur by accumulating data from _previous_ frames, possibly masking "choppiness", but certainly not adding any information about what happened between two frames like film does.

Secondly, those famous 24fps for film are the bare minimum to perceive animation as fluid. The human brain can and will process information at a much higher framerate.

Edit:
I found a pretty thorough discussion of that topic here.


Great link! Explained a lot and I was able to go from there and do a little more research in the area.

Another interesting tidbit: at another studio I talked to, they had three TVs set up side by side, one running at 60 fps with motion blur, one at 30 fps with motion blur, and one at 30 fps without. Only a handful of developers in the entire studio could identify which was which.

-= Dave
Graphics Programmer - Ready At Dawn Studios
It's a gross oversimplification to say "The human eye detects a little under 30 frames per second". The mechanics of the human eye don't work at all like discrete frames. The human eye and optical nervous system contain many different receptors and nerves that connect to different parts of the brain, and many of these different mechanisms can operate at different speeds.

Take the wagon wheel effect, for example. This is an optical effect that exercises the motion detection aspects of the visual cortex. At low speeds (below 5 Hz) a wagon wheel appears to the human eye to be turning in the direction it is actually turning. Higher than that, starting in the 7 to 12 Hz range depending on the person, the wagon wheel will start to appear more or less stationary, and starting at 30 to 35 Hz the wagon wheel will appear to be rotating against its actual direction of rotation (in this range the wagon wheel effect is called Illusory Motion Reversal (IMR)). Higher still, starting around 50 Hz, the wagon wheel appears as a solid disk with little discernible motion, though this effect can be delayed until the 100 Hz range for exceptional people. Nothing above 7 Hz is processed accurately by the human brain, but it's capable of recognizing differences in input all the way up to the 60 Hz range, and for some people past that. The kicker is that studies on IMR have shown that different objects moving at the same speed in different areas of the field of view can show IMR effects at different rates. That last part is actually pretty recent research, published in the last year.
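As an aside (my own tangent, not part of the perception research above): the film/strobe version of the wagon wheel effect is plain temporal aliasing, and the arithmetic is easy to play with. The continuous-light effect described above is a separate, neural phenomenon, but the folding behaviour is a useful reference point.

#include <cmath>
#include <cstdio>

// Apparent rotation (rev/s) of a wheel with `spokes` identical spokes spinning
// at revPerSec, when sampled at fps frames per second. The image repeats
// `spokes` times per revolution, and sampling folds that pattern frequency
// into (-fps/2, +fps/2]. Negative means it appears to spin backwards.
double apparentRevPerSec(double revPerSec, int spokes, double fps)
{
    double patternHz = revPerSec * spokes;
    double folded = std::fmod(patternHz, fps);  // into [0, fps)
    if (folded > fps / 2.0)
        folded -= fps;                          // into (-fps/2, fps/2]
    return folded / spokes;
}

int main()
{
    const double fps = 24.0;
    const double rates[] = {1.0, 5.0, 6.0, 7.0};
    for (double rev : rates)
        std::printf("%.0f rev/s with 4 spokes at %.0f fps looks like %+.2f rev/s\n",
                    rev, fps, apparentRevPerSec(rev, 4, fps));
    // 1 rev/s -> +1.00 (correct), 5 -> -1.00 (backwards), 6 -> +0.00 (stationary), 7 -> +1.00 (too slow)
    return 0;
}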

Of course, that's only a single aspect of vision. Another part of human vision is that even if you think you're staring at a fixed position, the eye actually makes a number of small movements called micro-saccades that occur at about 30 to 70 Hz with a range of motion of about 20 arcseconds. This is necessary because the rods and cones only fire in response to changes in luminance. If you stare at a red dot you don't have a nerve constantly signaling hFF0000. Instead you've got a nerve saying "hey, things have gotten lighter... wait, darker, now lighter again", another nerve saying "hey, things have gotten redder... wait, now less red, now more red again", and next to those a bunch saying "no change here". The brain synthesizes these impulses into an image of the red dot.

Quote:Original post by ddn3
It's true that for pure reaction times people can't react in much under 100 ms without training (fighter pilots and FPS gamers, for example); however, people can predict moving things with roughly 1 ms accuracy. In most action games people don't respond to sudden random stimuli (i.e. random stuff popping out of the air like in psych tests), but rather react to continuous stimuli over time (i.e. tracking a bad guy moving through 3D space, trying to return a ball, etc.).

In those cases 60 fps vs 30 fps has a perceptible difference in terms of rate of interaction and visual quality of motion (irrespective of motion blurring). At 60 fps the user has 2x the chances to make corrections to their aim and tracking, and the feedback loop is half as long over the same time period as in a 30 fps game. This is why the best action games aim for 60 fps (the Call of Duty series, the Quake series, etc.). Motion blurring helps increase visual quality for fast-moving objects, but by its very nature it also obscures an object's exact position, which could be an issue for fast-moving games.

Enjoy!

-ddn


While I do see the previous poster's point about motion blur in games not adding correct information, I don't follow your logic.

I don't understand how the feedback would be faster. The feedback is still limited by the visual response time.

At 30 frames per second
Frame 0: Player tracks head movement [visual]
Frame 3: Player responds to head movement (with shot) [response]
..
Frame 29: Player sees missed shot [visual]
Frame 32: Player responds to head movement (with shot) [response]

At 60 frames per second
Frame 0: Player tracks head movement [visual]
Frame 6: Player responds to head movement (with shot) [response]
..
Frame 59: Player sees missed shot [visual]
Frame 65: Player responds to head movement (with shot) [response]

More frames have elapsed, yes, but the same number of milliseconds has gone by before the player can respond to the head movement.
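To put rough numbers on that (my own arithmetic, assuming a ~100 ms visual reaction time; the "worst-case extra display lag" column is just my note on how long an event that happens right after a frame has to wait before it can show up on screen):

#include <cmath>
#include <cstdio>

int main()
{
    const double reactionMs = 100.0;  // assumed visual reaction time
    const double rates[] = {30.0, 60.0};

    for (double fps : rates) {
        double frameMs = 1000.0 / fps;
        long framesToReact = std::lround(reactionMs / frameMs);
        std::printf("%2.0f fps: frame = %5.2f ms, reaction ~ %ld frames, "
                    "worst-case extra display lag = %5.2f ms\n",
                    fps, frameMs, framesToReact, frameMs);
    }
    // 30 fps: frame = 33.33 ms, reaction ~ 3 frames, worst-case extra display lag = 33.33 ms
    // 60 fps: frame = 16.67 ms, reaction ~ 6 frames, worst-case extra display lag = 16.67 ms
    return 0;
}

So the reaction itself costs the same number of milliseconds either way; the only per-event difference is the up-to-one-frame wait before anything new can be displayed at all.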

EDIT: Though my whole point becomes moot if the player can detect visual response times faster than that!
:)
Graphics Programmer - Ready At Dawn Studios
Quote:Original post by SiCrane
It's a gross oversimplification to say "The human eye detects a little under 30 frames per second". The mechanics of the human eye don't work at all like discrete frames. ...
Yes, the more I read on the subject the more that becomes apparent. Fascinating stuff!

-= Dave

Graphics Programmer - Ready At Dawn Studios
Well, in the first example with:

At 30 frames per second
Frame 0: Player tracks head movement [visual]
Frame 3: Player responds to head movement (with shot) [response]
..

3 frames have passed from initial tracking to response. At most the player has a chance to make 3 frames of input (since FPS runs in sync with the input rate in most games). At most the user can consciously correct their aim for 2 frames after the initial movement; assuming some parabolic error dropoff, this gives you the possible error in targeting from user input alone.

Take the 2nd example:

Frame 0: Player tracks head movement [visual]
Frame 6: Player responds to head movement (with shot) [response]

Again the player responds to the element at frame 6, or ~100 ms after the initial contact point (which, btw, isn't realistic at all; people usually track and engage targets within 1-2 seconds of contact in most games). In this case the user has 6 input frames, or 2x the number of input frames of a 30 fps game. The user can visually correct their tracking and aiming with 2x the number of inputs, but note that the error dropoff isn't linear (since people are pretty good at tracking moving elements with their custom wetware); the error is probably 4x less than at the 30 fps update rate.

Ultimately a 60 fps game will allow users to track and respond to moving targets better than a 30 fps game; input is just one factor. Especially in first-person shooters, where the user controls the yaw of the camera, a 60 fps game gives the user much finer control over yaw than a 30 fps game.

There are a lot of factors at work, but it isn't just psychological perception that makes 60 fps better than 30 fps; there are qualitative reasons.
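A back-of-the-envelope version of the yaw/input-granularity point (my own made-up numbers for turn speed and tracking window, nothing from a real game):

#include <cstdio>

int main()
{
    const double yawRateDegPerSec = 360.0;  // assumed full-deflection turn rate
    const double trackingWindowS  = 0.5;    // assumed time spent tracking a target
    const double rates[] = {30.0, 60.0};

    for (double fps : rates) {
        // With input sampled once per rendered frame, the finest correction is
        // one frame's worth of turn, and the number of chances to correct
        // inside the window scales directly with the frame rate.
        double degPerFrame  = yawRateDegPerSec / fps;
        int    inputChances = static_cast<int>(trackingWindowS * fps + 0.5);
        std::printf("%2.0f fps: %4.1f deg minimum correction, %d input frames in %.1f s\n",
                    fps, degPerFrame, inputChances, trackingWindowS);
    }
    // 30 fps: 12.0 deg minimum correction, 15 input frames in 0.5 s
    // 60 fps:  6.0 deg minimum correction, 30 input frames in 0.5 s
    return 0;
}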

Good Luck!

-ddn
