
X-Man

Minimum Frame Rate


I'm kinda curious as to what people perceive to be the minimum acceptable frame rate. A lot of posts boast 60+ fps and such, but a really complex game can quickly bog down to 10 or less. Since the programmer has to compromise on meshes, textures, etc. to achieve a target fps on a minimum machine, what should that rate be? If I'm not mistaken, US TV runs at 30 fps, movies at 24 fps, and old 16mm movies at 16 fps. I had based my opinion on the latter, plus game-playing experience, to conclude that about 15 fps is a minimum acceptable number. What do you all think...? Edited by - X-Man on 7/24/00 9:06:40 AM

30 fps is the minimum. If you're on a fast computer, though, it looks jittery because it has to wait for the timer. I don't lock the frame rate, but rather update everything based on how much time has passed since the last frame. Like this:
    
DWORD last = GetTickCount();
while(!done)
{
    drawstuff();                        // render the frame
    DWORD now = GetTickCount();
    DWORD gametime = now - last;        // ms elapsed since the last frame
    last = now;
    moveobjects(gametime);              // move everything by the elapsed time
}

You basically multiply 'gametime' by a constant like 0.02 to get the object to move the same amount no matter what the frame rate is.
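For example, moveobjects() might look something like this (just a rough sketch; the Object struct and the object array are made-up names, and the per-millisecond speeds play the role of that 0.02 factor):

typedef struct
{
    float x, y;     /* position */
    float vx, vy;   /* speed in units per millisecond */
} Object;

Object objects[64];
int numobjects;

void moveobjects(DWORD gametime)
{
    int i;
    for(i = 0; i < numobjects; i++)
    {
        /* scale each speed by the elapsed milliseconds, so motion
           covers the same distance regardless of frame rate */
        objects[i].x += objects[i].vx * (float)gametime;
        objects[i].y += objects[i].vy * (float)gametime;
    }
}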

JoeMont001@aol.com www.polarisoft.n3.net

Oh man, does that question ever bring back memories... On comp.graphics.algorithms that very same question was asked, and months of discussion followed.

Basically, people said that TV and movies get by with such a low frame rate because of two factors:
1) The way the image is displayed, i.e. the TV is not as precise a display as a computer monitor, due to the different technologies used.
2) Temporal anti-aliasing (motion blur): because film opens a shutter for a specific amount of time, all information in that time period is recorded onto that frame of film, so as you play back the film, you aren't really seeing 24 discrete instants of time, as you do on a computer screen (of course 3dfx's T-buffer lets you do this on a limited basis).

For these reasons, computers need to have a higher frame rate, or less temporal aliasing.

Keep in mind this is only a very basic analysis of why low frame rates on computers are more noticeable than on TV.

The discussion was in 1998 sometime... a very interesting read if you have the time.

quote:
Original post by Julio

30 fps is the minimum. If you're on a fast computer, though, it looks jittery because it has to wait for the timer. I don't lock the frame rate, but rather update everything based on how much time has passed since the last frame. Like this:

DWORD last = GetTickCount();
while(!done)
{
    drawstuff();                        // render the frame
    DWORD now = GetTickCount();
    DWORD gametime = now - last;        // ms elapsed since the last frame
    last = now;
    moveobjects(gametime);              // move everything by the elapsed time
}

You basically multiply 'gametime' by a constant like 0.02 to get the object to move the same amount no matter what the frame rate is.

If you did this, would you be open to the problem that your game won't stay the same speed across different computer speeds? Your game might run fine on a P3, but when the P7s come out, it might be unplayably fast. Also, what is GetTickCount? I thought most timers are notoriously low-res unless you reprogram them. Personally, I wait for the vertical blank, since it seems to be constant across CPU clock speeds and gives a pretty high refresh rate at the same time. But I'm just a novice too, so if anyone has a better way, let me know.
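For what it's worth, Win32 does seem to have a higher-resolution timer than GetTickCount: QueryPerformanceCounter. A minimal sketch of how it could be used (assuming the hardware supports a performance counter; I haven't tested whether it beats waiting for the vertical blank):

#include <windows.h>

/* Returns the milliseconds elapsed since the previous call, using the
   high-resolution performance counter instead of GetTickCount. */
double elapsed_ms(void)
{
    static LARGE_INTEGER freq, last;    /* statics start zeroed */
    LARGE_INTEGER now;
    double ms;

    if(freq.QuadPart == 0)              /* first call: initialize */
    {
        QueryPerformanceFrequency(&freq);
        QueryPerformanceCounter(&last);
    }
    QueryPerformanceCounter(&now);
    ms = (double)(now.QuadPart - last.QuadPart) * 1000.0 / (double)freq.QuadPart;
    last = now;
    return ms;
}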

Ut


Geez...that was quick!

Hey, I agree, don't wait for a timer...move objects proportional to the elapsed frame time! My concern is balancing graphic detail and game complexity against frame rate!

Also, re: movies - monitors don't have a "shutter closed" period (flicker) between frames. Shouldn't this allow pushing the minimum a bit?

Premandrake: I do appreciate the fact that film blurs movement and I'll have to look into t-buffers more...

Edited by - X-Man on July 24, 2000 12:45:10 PM

I gotta agree with Julio on this one, that's how you should go about it.
The movement is all based on the elapsed time. So on slower machines, let's say it took 200 ms to do drawstuff(); then it'll move the units some factor * 200 ms. If on a P7 it takes 20 ms to do drawstuff(), then the units will move in increments ten times smaller than on the other machine, giving the units the same movement no matter how fast the processor is, but you will get more fps from the better comp.

I do it that way myself.



ByteMe95::~ByteMe95()

Hi all.

What are we talking about here? The minimum rate at which the video card should refresh the monitor, or the rate at which the application should redraw the frame to create the effect of animation?

If it is the latter, then the human eye cannot see any difference between rates above 30 FPS or so, so 30 is the maximum, not the minimum. Also, I made two games in DOS which run at the default 18.2 Hz timer rate (so they are 18.2 FPS games), and the animation looks just fine.

By the way, increasing the frame rate in movies would also increase the amount of film needed, which would increase the cost (considerably) of making the movie, so I thought the 24 FPS in film was a compromise of cost vs. quality.

Please correct me if I'm wrong.

Topgoro



Edited by - Topgoro on July 24, 2000 3:05:17 PM

30 fps is NOT the maximum a human can detect, it's really something like 32.

There was a post about this not too long ago, and I was referred to an article on the web that went into a whole discussion about this exact topic. Do a search for it on the message boards; I'm sure you'll find it.

Anyway, the conclusion was that a human can detect much more than 30 fps. The only reason TVs and movies are able to look good at those low frame rates is because movies are shown in dark rooms, so whatever you see leaves an afterimage on your retina because of the dark room vs. the bright screen.
TVs also have some catch I can't really remember exactly. So if you're real curious about this, go hunt down that article.

- Rob



ByteMe95::~ByteMe95()

Guest Anonymous Poster
Whether the difference is noticeable or not, any game that maintains at least 24 FPS is quite playable. One thing that annoys me about some modern games: some games periodically decide to do a lot of memory rearrangement, often involving disk access, causing the display to freeze for 1/4 to 1/2 second. An example of this is Descent 3, which does 60 FPS most of the time but periodically dips down to 4 FPS. So whatever you do, beware of complex memory management and disk access.
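One way to sidestep those hitches is to allocate everything up front and never touch malloc() or the disk during play. A rough sketch of the idea (the Particle type, the pool size, and the function names here are all made up):

typedef struct
{
    float x, y, z;
    float life;
} Particle;

#define MAX_PARTICLES 4096

/* One fixed pool, allocated at load time, plus a stack of free slot
   indices. Grabbing or releasing a particle is O(1) and never allocates. */
static Particle particle_pool[MAX_PARTICLES];
static int free_slots[MAX_PARTICLES];
static int free_top;

void init_particle_pool(void)
{
    int i;
    for(i = 0; i < MAX_PARTICLES; i++)
        free_slots[i] = i;
    free_top = MAX_PARTICLES;
}

Particle *alloc_particle(void)
{
    if(free_top == 0)
        return NULL;                    /* pool exhausted: drop or reuse */
    return &particle_pool[free_slots[--free_top]];
}

void free_particle(Particle *p)
{
    free_slots[free_top++] = (int)(p - particle_pool);
}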

- JRod

Ok, the information I've seen says that:

The human eye cannot see detail above 40 fps
-however-
it can see large movement up to 70 fps.

I'm sure it varies slightly from person to person, but the main thing you guys are missing is that your eyes don't mind a low fps so much as it annoys them to have a varying fps. That is why you want to either lock the fps (not a good idea for a 3D game) or try to stay at or above 30 fps at all times...

The constant fps of TV, video, and film is what makes them look smooth... and because of TV's separate fields, the actual fps is only half of the 24 or 29.97 fps quoted for TV, video, and film. As for your question, though...
For 2D animation you can go as low as 12 fps without bugging the heck out of a person's eyes, but in a 3D game, where the fps isn't constant, try to stay >= 30 fps.

-mike

BTW, I don't claim all (if any) of this information is reliable, but it is, to the best of my knowledge, accurate.

Edited by - thr33d on July 24, 2000 4:02:57 PM

I think the problem is that what we can see and what we can feel (i.e. mouselook) are two completely different things. I mean, getting 30 fps or 90 fps in a game looks pretty much the same, but 90 FEELS so much smoother. That's why a lot of hardcore gamers want triple-digit frame rates; not to make the game look better, but to make it feel smoother.

Overall, though, I think it's more important to maintain a constant frame rate. I would rather play with a consistent 30 fps than a frame rate that jumps around constantly.

Just my opinion....

Martee
Magnum Games

quote:
Original post by ByteMe95

30 fps is NOT the maximum a human can detect, it's really something like 32.

There was a post about this not too long ago, and I was referred to an article on the web that went into a whole discussion about this exact topic. Do a search for it on the message boards; I'm sure you'll find it.

Anyway, the conclusion was that a human can detect much more than 30 fps. The only reason TVs and movies are able to look good at those low frame rates is because movies are shown in dark rooms, so whatever you see leaves an afterimage on your retina because of the dark room vs. the bright screen.
TVs also have some catch I can't really remember exactly. So if you're real curious about this, go hunt down that article.

- Rob



ByteMe95::~ByteMe95()


Well, according to common wisdom I was right, but one million wrongs don't make one right. ByteMe is right, according to this article (which might even be the one he was recommending we go find in the message boards).

Thank you for the info ByteMe.

Topgoro

Hmmm...

All this is really interesting, but I wasn't really trying to match the eyeball refresh rate, or even the monitor refresh rate. What I was trying to get at was a minimum game frame rate - which runs quasi-asynchronously with the others. Agreed, if you can keep up with the monitor, that's great, but assume for a moment you are targeting a low-end Pxxx as your advertised "minimum system" and you have a full-360 flight sim for sale.

Also suppose that testing reveals a worst case of 8 fps on a P133, 12 fps on a P233, and 20 fps on a P400. Where do you draw the line on an advertised minimum system (I can't wait to see the answers on this one...)?

Personally, I've found combat sims (like Jane's USNF) playable down to 10-12 fps on my old P90. Below that it's impossible to target the enemy. Except during quick maneuvers, most flight sims have minimal screen motion, so a low fps isn't really noticed.

A flash of light that is microseconds in duration (not milliseconds, but microseconds) is quite visible to the human eye. I know this because I work with strobing hardware that can provide flashes of light with durations ranging from 15 to 250 microseconds, and those flashes are quite visible. Admittedly, when those flashes occur at extremely high speeds, they can blur into a seemingly constant light source (each flash also produces an audible snap or popping sound, which makes these units moderately annoying to use, but that's unavoidable).

I also know that it is quite possible to notice the difference between 60 fps and 120 fps in a fast-paced 3D action game. In particular, performing a quick 180-degree about-face in a fraction of a second can make even 60 fps too slow for smooth movement, since only a few frames are used to render this rather quick motion. At higher frame rates, our eyes will blur those frames together, but they will blur them into a more realistic representation of high-speed motion (just as they blur those extremely short flashes into a seemingly constant source of light).

And I'd draw the line at 20 fps, even for a flight simulator. Unless your game will have NO close-range dogfighting, 8-15 fps is going to be too slow. As you said, it'd be too slow for close-range targeting (which is needed for using chain guns or the like). Min specs should never render any expected game feature unusable (excluding graphical enhancements).

Thanks for the input...and yeah, I agree. 20-30 fps is probably a reasonable expectation with today's technology. I guess I'm a little disappointed with the first-cut results, but I'm confident I can tune it up - that's the fun part, right?

(whimper, choke, sniff...)

BTW, profiling shows about 49% of the time in PeekMessage(), so using DirectInput should nearly double my fps - my first target. Curiously, another nearly 49% is in DrawPrimitive() (as expected) - but all my AI and scene-update processing is only about 2%!
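In the meantime, this is roughly the message-pump shape I'm aiming for: a minimal sketch that drains every pending message each frame instead of interleaving single PeekMessage calls with drawing (drawstuff() is just a stand-in for my real render call, so treat this as an outline, not my actual code):

MSG msg;
BOOL done = FALSE;

while(!done)
{
    /* Drain every pending message before rendering, rather than
       paying for one PeekMessage round trip per frame. */
    while(PeekMessage(&msg, NULL, 0, 0, PM_REMOVE))
    {
        if(msg.message == WM_QUIT)
            done = TRUE;
        TranslateMessage(&msg);
        DispatchMessage(&msg);
    }
    drawstuff();    /* render one frame */
}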

I think 25 fps is acceptable...

What I was thinking of, in the game I'm developing with a group, is to let the player set the frame rate he is aiming at... and then the game automatically adjusts the detail... so if you are aiming at 60 FPS you always get 60 FPS, but the level of detail jumps up and down. I'm still thinking of how to implement this, but wouldn't that be great?
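Something like this, maybe - a very rough sketch along the lines of the loops above, where detail_level and TARGET_MS are made-up names, drawstuff() is pretended to take the detail level, and GetTickCount is coarse (as was pointed out earlier), so take the numbers loosely:

#define TARGET_MS 16                    /* roughly 60 FPS */

int detail_level = 5;                   /* 0 = lowest detail, 10 = highest */

while(!done)
{
    DWORD start = GetTickCount();
    drawstuff(detail_level);            /* render at the current detail */
    DWORD frame_ms = GetTickCount() - start;

    if(frame_ms > TARGET_MS && detail_level > 0)
        detail_level--;                 /* over budget: drop detail */
    else if(frame_ms < TARGET_MS - 4 && detail_level < 10)
        detail_level++;                 /* headroom: raise detail */

    if(frame_ms < TARGET_MS)
        Sleep(TARGET_MS - frame_ms);    /* hold the target frame rate */
}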

What I hate most in 3D action shooters is when I turn off V-sync and the edges start to jump (tear)... aaarrrghhhh...
(I always have to look at the jumping wall then, and can't look at where my enemies are...) So since then I've always had V-sync turned on, and it seems to work fine...

If somebody has ideas on how to implement the fixed FPS, email me. Thanks.

Phil

Edited by - phueppl1 on July 26, 2000 2:00:27 PM
