how to display the frames per second?

Started by
9 comments, last by dirkduck 20 years, 12 months ago
Just a quick question: how would you display the FPS (frames per second) in an OpenGL program? Thanks
http://www.labino.net
In your main loop, you have this:

while( true )
{
    float firstTime = (float)timeGetTime();    // time at start of frame, in ms

    // do stuff

    float secondTime = (float)timeGetTime();   // time at end of frame
    float frameTime = secondTime - firstTime;  // frame duration in ms
    float fps = 1000.0f / frameTime;           // ms per frame -> frames per second
}


Then, just display fps however you normally display text in your application. Perhaps using wglUseFontBitmaps() and glCallLists()...
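For example, a quick sketch of that bitmap-font route (assuming you already have a valid HDC and a current GL context; the names here are just illustrative):

#include <windows.h>
#include <GL/gl.h>
#include <stdio.h>
#include <string.h>

// Sketch: build display lists for the printable ASCII chars once at startup.
GLuint fontBase = glGenLists(96);
SelectObject(hDC, GetStockObject(SYSTEM_FONT));
wglUseFontBitmaps(hDC, 32, 96, fontBase);      // chars 32..127

// Each frame, after computing fps:
char text[64];
sprintf(text, "FPS: %.1f", fps);
glRasterPos2f(-0.95f, 0.90f);                  // where the text appears
glListBase(fontBase - 32);                     // map ASCII codes onto the lists
glCallLists((GLsizei)strlen(text), GL_UNSIGNED_BYTE, text);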

Note: timeGetTime() is defined in winmm.lib (I think) so be sure to link that library into your project. You can also use QueryPerformanceCounter() for greater accuracy.
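A rough sketch of the QueryPerformanceCounter() version, if you go that route (variable names are mine):

#include <windows.h>

// Sketch: per-frame timing with the high-resolution performance counter.
LARGE_INTEGER freq, t0, t1;
QueryPerformanceFrequency(&freq);      // counter ticks per second (hardware dependent)

QueryPerformanceCounter(&t0);
// ... render the frame ...
QueryPerformanceCounter(&t1);

double seconds = (double)(t1.QuadPart - t0.QuadPart) / (double)freq.QuadPart;
double fps = (seconds > 0.0) ? 1.0 / seconds : 0.0;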


War Worlds - A 3D Real-Time Strategy game in development.
For a more accurate reading, you can average the frames. You can do this by keeping a FrameCounter variable that increments every frame, and a StartTime for when it all started.
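Something like this, say (just a sketch of the idea; timeGetTime() again needs winmm.lib):

#include <windows.h>

// Sketch: average FPS since startup, using the counter described above.
DWORD StartTime    = timeGetTime();    // when measurement began, in ms
int   FrameCounter = 0;

// then, every frame:
FrameCounter++;
DWORD elapsed = timeGetTime() - StartTime;                       // total ms so far
float avgFps  = (elapsed > 0) ? FrameCounter * 1000.0f / elapsed : 0.0f;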

HHSDrum@yahoo.com
Polarisoft Home Page
My Homepage
Some shoot to kill, others shoot to maim. I say clear the chamber and let the lord decide. - Reno 911
If you want a really accurate FPS counter... then you will need to do something besides timeGetTime() and GetTickCount() and such, because if you have one 'spike' in your program, it will screw up the entire FPS count.

------------------------------
Trent (ShiningKnight)
E-mail me
OpenGL Game Programming Tutorials
Average your frames, but not across the entire runtime of the application. Average like 10 or 20 frames at a time. Then, if you must, average the averages. This way a single spike won't affect it too much.
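Here's a sketch of that windowed average, with a little ring buffer of recent frame times (the names are made up):

// Sketch: average the last N frame times in a ring buffer.
const int N = 20;
float frameTimes[N] = { 0.0f };        // recent frame durations, in ms
int   slot = 0;

void OnFrameTime(float frameMs)        // call once per frame with the measured time
{
    frameTimes[slot] = frameMs;
    slot = (slot + 1) % N;

    float totalMs = 0.0f;
    for (int i = 0; i < N; ++i)
        totalMs += frameTimes[i];

    float avgMs = totalMs / N;
    float fps   = (avgMs > 0.0f) ? 1000.0f / avgMs : 0.0f;
    // ... display fps ...
}

Note the average reads low until the buffer fills, so you might ignore the first N frames.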

Seeya
Krippy
There was an interesting demo on the nvidia site the other day concerning timings.
On my Celeron 433, the order from slowest to fastest:
QueryPerformanceCounter - slowest
timeGetTime
GetTickCount
Pentium internal high-frequency counter (rdtsc) - fastest
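For reference, a sketch of reading that Pentium counter through the compiler intrinsic (the frequency below is just an assumption; you'd have to calibrate it against one of the other timers):

#include <intrin.h>   // __rdtsc() on MSVC; GCC has it in <x86intrin.h>

// Sketch: frame timing with the CPU timestamp counter (rdtsc).
unsigned __int64 t0 = __rdtsc();
// ... render the frame ...
unsigned __int64 t1 = __rdtsc();

const double cpuHz = 433.0e6;          // assumed CPU frequency -- measure, don't hardcode
double seconds = (double)(t1 - t0) / cpuHz;
double fps = (seconds > 0.0) ? 1.0 / seconds : 0.0;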

http://members.xoom.com/myBollux
I do:


frame_end_time = GetTickCount();
glPrint("Frame rate = %3.5f", 1.0f/((frame_end_time - frame_start_time)/1000), "\n");
frame_start_time = GetTickCount();


but I get a result saying "Frame rate = 1.#INFO". Why is that happening?
Apparently (frame_end_time - frame_start_time)/1000 comes out to 0: that's integer division, so any frame shorter than a second truncates to zero (and with GetTickCount()'s coarse resolution, frame_end_time and frame_start_time may even be exactly equal), causing a divide by 0.
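One possible fix (a sketch, assuming glPrint takes printf-style arguments as in the snippet above): keep the math in floating point and put the newline in the format string.

frame_end_time = GetTickCount();
float frameMs = (float)(frame_end_time - frame_start_time);   // no integer division
if (frameMs > 0.0f)                                           // guard against a 0 ms reading
    glPrint("Frame rate = %3.5f\n", 1000.0f / frameMs);
frame_start_time = GetTickCount();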

BTW, did you necro this on purpose?



//email me.//zealouselixir software.//msdn.//n00biez.//
miscellaneous links


Hm... as far as I can tell, both timeGetTime and GetTickCount have a lousy resolution (about 1 ms at best, and often worse), which won't help you if your app is quite fast (and even averaging over 100 frames won't help, as 100 * 0 is still 0).

QueryPerformanceCounter might be the slowest function to call, but it has a higher resolution. rdtsc has great resolution and is the fastest, but don't expect it to work on old (pre-Pentium) machines.
f@dz
http://festini.device-zero.de
The basic idea is to init your fps-timer and your frame counter


DWORD start_time;
DWORD elapsed_time;
int   frame_counter;
float fps;

// init
start_time    = timeGetTime();
frame_counter = 0;
fps           = 0.0f;

// then every frame increase the frame counter
frame_counter++;

// and check for elapsed time
elapsed_time = timeGetTime() - start_time;
if( elapsed_time > 2000 ) // or 1000 or 5000...
{
    // elapsed_time is in ms -> *1000!
    fps = (float)1000 * frame_counter / elapsed_time;
    frame_counter = 0;
    start_time    = timeGetTime();
}


As you see, it's very simple... however, I suggest you implement separate classes or functions to manage timers, FPS timers, and freeze timers... the idea is the same!

Of course the precision of timeGetTime() depends on your OS, and in some cases it's not very accurate, but... as you see, an error of about 50 ms (I think that's the max error) creates an error of only 50/2000 = 2.5% in your FPS value!


[edited by - blizzard999 on April 25, 2003 7:06:56 AM]

[edited by - blizzard999 on April 25, 2003 8:13:54 AM]

This topic is closed to new replies.
