
dirkduck

how to display the frames per second?

In your main loop, do something like this:

while( true )
{
    DWORD firstTime = timeGetTime();
    // ... do stuff ...

    DWORD secondTime = timeGetTime();
    float frameTime = (secondTime - firstTime) / 1000.0f; // elapsed time, in seconds
    float fps = 1.0f / frameTime; // assumes the frame took a measurable amount of time
}


Then just display fps however you normally draw text in your application, perhaps using wglUseFontBitmaps() and glCallLists()...
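A minimal sketch of that approach, assuming a current OpenGL rendering context and an fps value computed as above (the variable names are illustrative):

#include <windows.h>   // wglUseFontBitmaps(), SelectObject()
#include <GL/gl.h>
#include <stdio.h>
#include <string.h>

// build the display lists once, after creating the GL context
HDC    hdc  = wglGetCurrentDC();
GLuint base = glGenLists(96);              // lists for ASCII 32..127
SelectObject(hdc, GetStockObject(SYSTEM_FONT));
wglUseFontBitmaps(hdc, 32, 96, base);      // one bitmap per character

// each frame, print the fps value
char text[64];
sprintf(text, "FPS: %.1f", fps);
glRasterPos2f(-0.9f, 0.9f);                // position in current coordinates
glListBase(base - 32);                     // map char code -> list number
glCallLists((GLsizei)strlen(text), GL_UNSIGNED_BYTE, text);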

Note: timeGetTime() is defined in winmm.lib (I think), so be sure to link that into your project. You can also use QueryPerformanceCounter() for greater accuracy.
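A minimal sketch of the QueryPerformanceCounter() version (no extra library needed; it lives in kernel32):

#include <windows.h>

LARGE_INTEGER freq, t0, t1;
QueryPerformanceFrequency(&freq);          // ticks per second

QueryPerformanceCounter(&t0);
// ... render the frame ...
QueryPerformanceCounter(&t1);

float frameTime = (float)(t1.QuadPart - t0.QuadPart) / (float)freq.QuadPart; // seconds
float fps = (frameTime > 0.0f) ? 1.0f / frameTime : 0.0f;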



War Worlds - A 3D Real-Time Strategy game in development.

If you want a really accurate FPS counter, then you will need to do something besides timeGetTime() and GetTickCount() and such, because if you have one "spike" in your program, it will throw off the entire FPS count.

------------------------------
Trent (ShiningKnight)
E-mail me
OpenGL Game Programming Tutorials

Average your frames, but not across the entire runtime of the application. Average, say, 10 or 20 frames at a time. Then if you must, average the averages. This way a single spike won't affect it too much.
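A minimal sketch of that idea (the names are illustrative): keep the last N frame times in a ring buffer and report frames divided by their summed time.

// ring buffer of the last N frame times, in seconds
const int N = 20;
float g_frameTimes[N] = { 0 };
int   g_slot = 0, g_filled = 0;

// call once per frame with the measured frame time; returns the averaged FPS
float AverageFps(float frameTime)
{
    g_frameTimes[g_slot] = frameTime;
    g_slot = (g_slot + 1) % N;
    if (g_filled < N) ++g_filled;

    float sum = 0.0f;
    for (int i = 0; i < g_filled; ++i)
        sum += g_frameTimes[i];

    return (sum > 0.0f) ? g_filled / sum : 0.0f;   // frames / total seconds
}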

Seeya
Krippy

There was an interesting demo on the nvidia site the other day concerning timings. On my Celeron 433, the order from slowest to fastest was:

QueryPerformanceCounter - slowest
timeGetTime
GetTickCount
Pentium internal high-frequency counter (rdtsc) - fastest

http://members.xoom.com/myBollux

I do:


  
frame_end_time = GetTickCount();
glPrint("Frame rate = %3.5f", 1.0f/((frame_end_time - frame_start_time)/1000),"\n");
frame_start_time = GetTickCount();


but get a result saying "Frame rate = 1.#INFO". Why is that happening?

Hm... as far as I can tell, both timeGetTime() and GetTickCount() have a lousy resolution (about 1 ms), which won't help you if your app is quite fast (and even averaging over 100 frames won't help, as 100*0 is still 0). That is also why the snippet above prints "1.#INF": (frame_end_time - frame_start_time)/1000 is integer division, so it truncates to zero whenever the frame took less than a second, and 1.0f/0 is infinity.

QueryPerformanceCounter() might be the slowest function to call, but it has a much higher resolution. rdtsc has great resolution and is the fastest, but don't expect it to work on old machines.
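For what it's worth, a minimal fix for that snippet (keeping glPrint, which I'm assuming is a printf-style text routine as in the post above): do the division in floating point and guard against a zero delta.

DWORD frame_end_time = GetTickCount();
DWORD delta_ms = frame_end_time - frame_start_time;
if (delta_ms > 0) // the integer (delta/1000) truncated to 0, hence the 1.#INF
    glPrint("Frame rate = %3.5f", 1000.0f / (float)delta_ms);
frame_start_time = GetTickCount();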

The basic idea is to init your FPS timer and your frame counter:

DWORD start_time = timeGetTime();
int   frame_counter = 0;
float fps = 0.0f;

// then, every frame, increase the frame counter

frame_counter++;

// and check for elapsed time

DWORD elapsed_time = timeGetTime() - start_time;

if (elapsed_time > 2000) // or 1000 or 5000...
{
    // elapsed_time is in ms -> *1000!
    fps = (float)1000 * frame_counter / elapsed_time;
    frame_counter = 0;
    start_time = timeGetTime();
}


As you can see, it's very simple... however, I suggest you implement separate classes or functions to manage timers, FPS timers, and freeze timers (a sketch of such a class follows below)... the idea is the same!

Of course, the precision of timeGetTime() depends on your OS, and in some cases it's not very accurate, but... as you can see, even an error of about 50 ms (I think that's the maximum) creates an error of only 50/2000 = 2.5% in your FPS value!
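Following that suggestion, a minimal sketch of such a timer class (the class and member names are illustrative, not from any library), wrapping the same logic as above:

#include <windows.h>   // timeGetTime(); link with winmm.lib

class FpsTimer
{
public:
    FpsTimer(DWORD interval_ms = 2000)
        : m_interval(interval_ms), m_start(timeGetTime()),
          m_frames(0), m_fps(0.0f) {}

    // Call once per frame; recomputes the rate when the interval elapses.
    void Tick()
    {
        ++m_frames;
        DWORD elapsed = timeGetTime() - m_start;
        if (elapsed > m_interval)
        {
            m_fps    = 1000.0f * m_frames / elapsed; // ms -> frames per second
            m_frames = 0;
            m_start  = timeGetTime();
        }
    }

    float Fps() const { return m_fps; }

private:
    DWORD m_interval;
    DWORD m_start;
    int   m_frames;
    float m_fps;
};

Each frame you would just call timer.Tick() and draw timer.Fps().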


