FPS

31 comments, last by Stonicus 21 years, 1 month ago
Ok, I have to ask this because it has been bothering me for some time. Do people *really* get 800 frames/sec? For starters, I don't believe they do at all. That said, I have a dual-processor PIII 1GHz with 1 GB of RAM and a GeForce4 MX420 AGP card. I never ever get 800 frames/sec. I sometimes get 100 or 120 on a very simple scene with nothing being drawn. Do the math: to get 800 frames/sec you have to finish a frame in about 1.25 milliseconds. How do you people with 800 frames/sec calculate your frames? How do you get your frame time? Do you use QueryPerformanceCounter or just timeGetTime()? If anyone claiming 400, 500, 600, etc. frames/sec could answer these questions I'd be interested in knowing the answers. And to top it off, I have tried many sample/tutorial programs and found many of them to have buggy fps counters, so I was curious whether people just blindly copy them...
Well, if you aren't drawing anything, what should limit your FPS? I just did a quick test, and with frame/depth buffer clears only (nothing gets drawn) I get 2000 fps and more. Try using a profiler to find where your bottleneck is.

-lev
Might possibly be your monitor refresh rate...
It's your graphics card. If you were to try it on a GF3 or GF4 you would get >2000 fps (without vsync enabled).
It could be that you have processes running in the background that are sucking up CPU cycles (a friend of mine had that problem; he would barely get 30 fps in simple games, and he had a GeForce Ti4600).

I've got a GeForce4 Ti4600 and I've gotten well over 800 fps in some apps and demos. For my game, I use QueryPerformanceCounter because it's faster than timeGetTime().

Look up some tutorials on calculating fps; there is one on www.gametutorials.com.

-------------------
Realm Games Company
I think gametutorials uses timeGetTime()

Is QueryPerformanceCounter in WINMM.lib, too? If not, where is it?
Well, the problem is I get 70 fps with nothing on screen, and I can dump something like 50,000 triangles per frame and still get 50 fps... so I dunno... maybe it's my fps counter that's buggy.
This is a good thread.. Something that has been on my mind a lot lately is: what is the best and most efficient way to calculate the FPS in any demo or game? I've seen it done several different ways; I just thought I would take a stab at asking someone for the best way. Thanks

heh
Is GetTickCount inadvisable?

[Hugo Ferreira][Positronic Dreams]
Visit my WebSite!

Firstly, someone stated that they use QueryPerformanceCounter because it's faster than timeGetTime() - this is not true. However, it is far more precise.

There's nothing wrong with using GetTickCount for a simple FPS counter - just remember that it isn't very accurate. It's most useful if you use it in some way like this:

#include <windows.h> // GetTickCount

unsigned int t1 = GetTickCount( ); // start of the current measuring interval
unsigned int t2;                   // milliseconds elapsed since t1
unsigned int f = 0;                // frames counted since t1
float fps = 0.0f;

// ... game loop ...
f++;                               // one more frame rendered
t2 = GetTickCount( ) - t1;
if( t2 >= 1000 )                   // roughly one second has passed
{
    fps = f / float( t2 ) * 1000.0f; // frames per second over the interval
    f = 0;
    t1 = GetTickCount( );            // restart the interval
}
// ... end game loop ...

That's one (pseudocode) example of how you can use GetTickCount as a cheap FPS counter. It'll only update the frame rate once every second or so - but it's good for an average.

Ultimately it depends on your requirements - whether or not you need to know the exact amount of time between frames (for time-based movement etc.) and so on.

Regards

[edited by - Shag on February 20, 2003 6:36:34 PM]

This topic is closed to new replies.
