Minimum Frame Rate

Started by X-Man
17 comments, last by X-Man 23 years, 8 months ago
I'm kinda curious as to what people perceive to be the minimum acceptable frame rate. A lot of posts boast 60+ fps and such, but a really complex game can quickly bog down to 10 or less. Since the programmer has to compromise meshes, textures, etc. to achieve a target fps on a minimum machine, what should that rate be? If I'm not mistaken, US TV runs at 30 fps, movies at 24 fps, and old 16mm movies at 16 fps. I had based my opinion on the latter, plus game-playing experience, to conclude that about 15 fps is the minimum acceptable number. What do you all think...?

Edited by - X-Man on 7/24/00 9:06:40 AM
30 fps is the minimum. If you're on a fast computer, though, it looks jittery because it has to wait for the timer. I don't lock the frame rate, but rather update everything else based on how much time it took to draw the objects, like this:
    while (!done)
    {
        DWORD stime = GetTickCount();
        drawstuff();
        DWORD gametime = GetTickCount() - stime;  // how long this frame took to draw
        moveobjects(gametime);                    // gametime is basically how much to move the objects
    }

You basically multiply 'gametime' by 0.02 to get the objects to move the same amount no matter what the frame rate is.
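To make that concrete, here is a minimal sketch in C of that kind of loop. The names SPEED and player_x are illustrative, not from the post above; the point is just that movement is scaled by the elapsed milliseconds, so objects cover the same distance per second at any frame rate.

    #include <windows.h>

    #define SPEED 0.02f              /* units per millisecond -- the 0.02 factor above */

    float player_x = 0.0f;           /* one object's position, as an example */
    int done = 0;

    void drawstuff(void);            /* rendering is assumed to live elsewhere */

    void game_loop(void)
    {
        while (!done) {
            DWORD stime = GetTickCount();
            drawstuff();
            DWORD gametime = GetTickCount() - stime;   /* ms spent drawing this frame */
            player_x += SPEED * (float)gametime;       /* move by elapsed time, not per frame */
        }
    }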

JoeMont001@aol.com www.polarisoft.n3.net
My Homepage | Some shoot to kill, others shoot to maim. I say clear the chamber and let the lord decide. - Reno 911
Oh man, does that question ever bring back memories... On comp.graphics.algorithms that very same question was asked, and months of discussion followed.

Basically, people said that TV and movies get by with such a low frame rate because of two factors:
1) The way the image is displayed: a TV is not as precise a display as a computer monitor, due to the different technologies used.
2) Temporal anti-aliasing (motion blur): because film opens a shutter for a specific amount of time, all information in that period is recorded onto that frame of film. So as you play back the film, you aren't really seeing 24 discrete instants of time, as you do on a computer screen (of course, 3dfx's T-buffer lets you do this on a limited basis).

But for these reasons, computers need to have a higher frame rate, or less temporal aliasing, i.e. some motion blur.
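Not from the thread, but a minimal sketch in C of that accumulation idea, assuming a grayscale frame buffer and a hypothetical render_subframe() callback: render several instants inside one display frame and average them, which is roughly what film exposure (and the T-buffer's motion blur) does.

    #include <string.h>

    #define WIDTH     320
    #define HEIGHT    240
    #define SUBFRAMES 4

    /* hypothetical renderer: draws the scene at time t into an 8-bit buffer */
    extern void render_subframe(float t, unsigned char *dst);

    void render_motion_blurred(float frame_start, float frame_len,
                               unsigned char *final_frame)
    {
        static unsigned char sub[WIDTH * HEIGHT];
        static unsigned int  accum[WIDTH * HEIGHT];
        memset(accum, 0, sizeof(accum));

        for (int s = 0; s < SUBFRAMES; ++s) {
            /* sample the scene at several instants inside the "exposure" window */
            float t = frame_start + frame_len * (s + 0.5f) / SUBFRAMES;
            render_subframe(t, sub);
            for (int i = 0; i < WIDTH * HEIGHT; ++i)
                accum[i] += sub[i];
        }
        /* average the samples to get the blurred frame */
        for (int i = 0; i < WIDTH * HEIGHT; ++i)
            final_frame[i] = (unsigned char)(accum[i] / SUBFRAMES);
    }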

Keep in mind this is only a very basic analysis of why low frame rates seem to be more noticeable on computers than on TV.

The discussion was sometime in 1998... a very interesting read if you have the time.
quote: Original post by Julio
I don't lock the frame rate, but rather update everything else based on how much time it took to draw the objects... you basically multiply 'gametime' by 0.02 to get the objects to move the same amount no matter what the frame rate is.
-----------------------------------------------------------------
If you did this, wouldn't you be open to the problem that your game doesn't stay the same speed across computers of different speeds? Your game might run fine on a P3, but when the P7s come out, it might be unplayably fast. Also, what is GetTickCount? I thought most timers are notoriously low-res unless you reprogram them. Personally I wait for the vertical blank, since it seems to be constant across CPU clock speeds and gives a pretty high refresh rate at the same time. But I'm just a novice too, so if anyone has a better way, let me know.

Ut
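On the timer-resolution question: GetTickCount is a Win32 call that returns milliseconds since boot, but its granularity is typically on the order of 10-16 ms. A minimal sketch of a higher-resolution alternative using QueryPerformanceCounter (not something posted in the thread, just one common option):

    #include <windows.h>

    static LARGE_INTEGER freq;               /* counter ticks per second */

    void timer_init(void)
    {
        QueryPerformanceFrequency(&freq);
    }

    /* milliseconds elapsed since 'start' */
    double elapsed_ms(LARGE_INTEGER start)
    {
        LARGE_INTEGER now;
        QueryPerformanceCounter(&now);
        return (double)(now.QuadPart - start.QuadPart) * 1000.0 /
               (double)freq.QuadPart;
    }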


No, it stays exactly the same speed. It's called frame-based modeling. You don't think Quake 3 or Unreal Tournament lock the frame rate, do you?


JoeMont001@aol.com www.polarisoft.n3.net
My Homepage | Some shoot to kill, others shoot to maim. I say clear the chamber and let the lord decide. - Reno 911
Geez...that was quick!

Hey, I agree, don't wait for a timer... move objects proportional to the elapsed frame time! My concern is balancing graphic detail and game complexity against frame rate!

Also, re: movies - monitors don't have a "shutter closed" period (flicker) between frames. Shouldn't this allow pushing the minimum a bit?

Premandrake: I do appreciate the fact that film blurs movement and I'll have to look into t-buffers more...

Edited by - X-Man on July 24, 2000 12:45:10 PM
I gotta agree with Julio on this one; that's how you should go about it.
The movement is all based on the elapsed time. So on a slower machine, let's say it took 200 ms to do drawstuff(); then it'll move the units some factor * 200 ms. If on a P7 it takes 20 ms to do drawstuff(), then the units will move in increments ten times smaller than on the other machine, giving the units the same overall movement no matter how fast the processor is, but you will get more fps from the better computer.

I do it that way myself.
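A tiny standalone check of that arithmetic (the 200 ms / 20 ms numbers above, with a hypothetical speed of 0.02 units per millisecond):

    #include <stdio.h>

    #define SPEED 0.02f   /* units per millisecond */

    int main(void)
    {
        float slow = SPEED * 200.0f;      /* one 200 ms frame on the slow machine */
        float fast = 0.0f;
        for (int i = 0; i < 10; ++i)
            fast += SPEED * 20.0f;        /* ten 20 ms frames on the fast machine */
        printf("slow machine: %.2f units, fast machine: %.2f units\n", slow, fast);
        return 0;
    }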



ByteMe95::~ByteMe95()
My S(h)ite
Hi all.

What are we talking about here? The minimum rate at which the video card should refresh the monitor, or the rate at which the application should redraw the frame to create the effect of animation?

If it is the latter, then the human eye cannot see any difference between rates above 30 or so FPS, so 30 is the maximum needed, not the minimum. Also, I made two games in DOS which run at the default 18.2 Hz clock rate (so they are 18.2 FPS games) and the animation looks just fine.

By the way, increasing the frame rate in movies would also increase the amount of film needed, which would considerably increase the cost of making the movie, so I thought the 24 FPS of film was a cost/quality compromise.

Please correct me if wrong.

Topgoro



Edited by - Topgoro on July 24, 2000 3:05:17 PM
We emphasize "gotoless" programming in this company, so constructs like "goto hell" are strictly forbidden.
30 fps is NOT the maximum a human can detect, it's really something like 32.

There was a post about this not too long ago, and I was referred to an article on the web that went into a whole discussion about this exact topic. Do a search for it on the message boards; I'm sure you'll find it.

Anyway, the conclusion was that a human can detect much more than 30 fps. The only reason TVs and movies look good at those low frame rates is because movies are shown in dark rooms, so whatever you see leaves an afterimage on your retina because of the dark room versus the bright projected image.
TVs also have some catch, but I can't really remember it exactly. So if you're really curious about this, go hunt down that article.

- Rob



ByteMe95::~ByteMe95()
My S(h)ite
Errr, error in my last post.
I meant to say it's really something like 72, not 32.



ByteMe95::~ByteMe95()
My S(h)ite

This topic is closed to new replies.
