The correct way to measure frame rate...

Hi GameDev people,

I was just wondering how to measure frame rate properly in my games. I see a lot of people getting frame rates of 100-200 FPS, and I only get something like 60. My computer is an AMD Athlon XP 1600, a 128 MB GeForce 4 MX SE, and 128 MB of DDR RAM, so I'm thinking it's the way I calculate my FPS. So far it's not really a game as such, just a program which shows a bitmap.

The way I calculate it is like so. I have these variables:

1) fps (obvious)
2) frameCount (incremented every time I make a call to update the screen, to count how many frames I have drawn)

Then, once per frame, I check whether a second has passed:
DWORD lastTime   = GetTickCount(); // start of the current one-second window
int   frameCount = 0;              // frames drawn so far this second
int   fps        = 0;              // frames drawn during the last full second

// ... each time the screen is updated:
frameCount++;

// has ONE second passed?
if (GetTickCount() - lastTime >= 1000)   // GetTickCount() returns milliseconds
{
    lastTime = GetTickCount();

    // save the frame count as this second's FPS, then reset it
    fps = frameCount;
    frameCount = 0;
}

fps is then shown on the screen along with the bitmap on each frame. Generally what I'm doing is counting each time I update the screen, and when one second has passed I record that count as the FPS (i.e. how many frames were drawn that second) and then reset the frame counter. Is this the correct way?

DarkStar UK
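One thing worth knowing about your timer: GetTickCount() typically only updates every 10-16 ms on Windows, which is fine for a once-per-second counter like yours but too coarse for anything finer. Here is a minimal sketch of the same counter built on QueryPerformanceCounter instead (the function and variable names are mine, purely for illustration):

#include <windows.h>

LARGE_INTEGER g_freq;       // counter ticks per second, fixed at boot
LARGE_INTEGER g_lastTime;   // counter value when the current second started
int g_frameCount = 0;       // frames drawn so far this second
int g_fps        = 0;       // frames drawn during the last full second

void InitFpsCounter()
{
    QueryPerformanceFrequency(&g_freq);
    QueryPerformanceCounter(&g_lastTime);
}

void OnFrameDrawn()         // call once per screen update
{
    ++g_frameCount;

    LARGE_INTEGER now;
    QueryPerformanceCounter(&now);

    // has one full second elapsed?
    if (now.QuadPart - g_lastTime.QuadPart >= g_freq.QuadPart)
    {
        g_fps        = g_frameCount;
        g_frameCount = 0;
        g_lastTime   = now;
    }
}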

If you always get something like 60 FPS, I would guess you've forgotten to disable V-Sync... if everything is synchronized to the monitor refresh rate, you can't expect it to go any faster than that. (Yes, I know you might tell me you've set a much higher refresh rate, but Windows often sets it to 60 Hz in OpenGL by default.)

You get 60 FPS because your game is matching the refresh rate of your monitor. People who are getting a higher frame rate (like 100-200) are either running in windowed mode (where frames are not matched to the monitor refresh) or have a monitor that supports a very high refresh rate and an amazing video card.

If you are consistently getting 60, though, it's OK, because by using V-Sync you prevent tearing, which looks really messy.

If you want a better measure of how well your game is performing, measure the frame time rather than the frame rate. Your frame rate is going to be locked at 60 anyway, and since FPS is a non-linear measure it is not the most accurate benchmark. Measure the time it takes for a frame to complete and you know exactly how efficient your game really is.
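To make the frame-time idea concrete, here is a minimal sketch of a per-frame timer using QueryPerformanceCounter (again, the names are illustrative rather than anything from this thread):

#include <windows.h>

LARGE_INTEGER g_freq;       // counter ticks per second
LARGE_INTEGER g_prevFrame;  // counter value at the start of the previous frame

void InitFrameTimer()
{
    QueryPerformanceFrequency(&g_freq);
    QueryPerformanceCounter(&g_prevFrame);
}

// Call once per frame; returns how long the last frame took, in milliseconds.
double FrameTimeMs()
{
    LARGE_INTEGER now;
    QueryPerformanceCounter(&now);

    double ms = 1000.0 * (now.QuadPart - g_prevFrame.QuadPart)
                       / (double)g_freq.QuadPart;
    g_prevFrame = now;
    return ms;
}

At 60 FPS your frame budget is about 16.7 ms, so if FrameTimeMs() shows your own work takes, say, only 3 ms and the rest is spent waiting for the vertical blank, the frame time exposes that headroom while the FPS counter just reads a flat 60.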

Actually, that's incorrect: the people who get 100 to 200 FPS disable V-Sync on their cards. I do this at the very beginning of my code, and that's how I get a good frame rate (it can be done with both OpenGL and D3D; it just takes some research to find out how, and there's a sketch of the OpenGL route below). However, it does lead to tearing, which gets annoying sometimes (especially if the frame rate dips below 40). The best thing to do is let the user choose whether or not to disable V-Sync.

quote:
Original post by Alex
You get 60 FPS because your game is matching the refresh rate of your monitor. People who are getting a higher frame rate (like 100-200) are either running in windowed mode (where frames are not matched to the monitor refresh) or have a monitor that supports a very high refresh rate and an amazing video card. [...]
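For what it's worth, here is a minimal sketch of one way to turn V-Sync off under OpenGL on Windows, using the WGL_EXT_swap_control extension (the DisableVSync wrapper is my own illustrative name, not something from this thread, and it assumes a current OpenGL rendering context). Under Direct3D the rough equivalent is to ask for D3DPRESENT_INTERVAL_IMMEDIATE in the present parameters.

#include <windows.h>

// Function pointer type for the wglSwapIntervalEXT entry point
// exposed by the WGL_EXT_swap_control extension.
typedef BOOL (WINAPI *PFNWGLSWAPINTERVALEXTPROC)(int interval);

void DisableVSync()
{
    // Look the function up at runtime; it only exists if the driver
    // exports the extension, so check the pointer before calling it.
    PFNWGLSWAPINTERVALEXTPROC wglSwapIntervalEXT =
        (PFNWGLSWAPINTERVALEXTPROC)wglGetProcAddress("wglSwapIntervalEXT");

    if (wglSwapIntervalEXT)
        wglSwapIntervalEXT(0);  // 0 = swap immediately, 1 = wait for vblank
}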

