Finding out FPS

6 comments, last by pixeltasim 11 years, 1 month ago

I just started working with SDL and with some help from the internet got deWitter's game loop working, but I want to find out what my fps is. I'm not exactly sure how to go about doing this, and I haven't really understood any answer I have found on the internet. Any answers would be appreciated.


#include <SDL.h>

int main( int argc, char *argv[] )
{
    const int TICKS_PER_SECOND = 25;
    const int SKIP_TICKS = 1000 / TICKS_PER_SECOND;
    const int MAX_FRAMESKIP = 5;

    float interpolation;
    int loops;

    SDL_Init( SDL_INIT_VIDEO );

    SDL_Event event;
    SDL_Surface *screen = SDL_SetVideoMode( 640, 480, 32, SDL_HWSURFACE );

    SDL_Surface *bmp = SDL_LoadBMP( "cb.bmp" );
    SDL_Rect source;
    source.x = 0;
    source.y = 0;
    source.h = 150;
    source.w = 120;

    SDL_Rect dest;
    dest.x = 120;
    dest.y = 120;
    dest.h = 150;
    dest.w = 120;

    bool running = true;
    unsigned int next_game_tick = SDL_GetTicks();

    while ( running )
    {
        while ( SDL_PollEvent( &event ) )
        {
            if ( event.type == SDL_QUIT )
                running = false;

            // keydown handling belongs inside the poll loop,
            // otherwise it checks a stale event
            if ( event.type == SDL_KEYDOWN )
            {
                SDLKey keyPressed = event.key.keysym.sym;

                switch ( keyPressed )
                {
                    case SDLK_ESCAPE:
                        running = false;
                        break;
                    default:
                        break;
                }
            }
        }

        loops = 0;
        while ( SDL_GetTicks() > next_game_tick && loops < MAX_FRAMESKIP )
        {
            // update_game();

            next_game_tick += SKIP_TICKS;
            loops++;
        }

        interpolation = float( SDL_GetTicks() + SKIP_TICKS - next_game_tick ) / float( SKIP_TICKS );

        // display_game( interpolation );
        SDL_BlitSurface( bmp, &source, screen, &dest );
        SDL_Flip( screen );
    }

    SDL_FreeSurface( bmp );
    SDL_Quit();
    return 0;
}


Lazy Foo has a tutorial on calculating frame rate you could check out. The tutorials just prior to it cover timing functions and regulating frame rate if you need more clarification on the functions involved.

Does that help? :)

- Jason Astle-Adams
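For reference, the counting logic in that kind of tutorial boils down to something like this - a minimal sketch with the tick source passed in as a parameter so it isn't tied to SDL_GetTicks(); the names here are my own, not the tutorial's:

```cpp
#include <cstdint>

// Counts rendered frames and reports an average FPS once per interval.
// Feed it a millisecond clock (SDL_GetTicks() in a real program).
struct FpsCounter {
    uint32_t interval_ms;   // how often to report, e.g. 1000
    uint32_t start_ms = 0;  // start of the current interval
    uint32_t frames = 0;    // frames counted in the current interval

    explicit FpsCounter(uint32_t interval) : interval_ms(interval) {}

    void begin(uint32_t now_ms) { start_ms = now_ms; frames = 0; }

    // Call once per rendered frame; returns an FPS figure when an
    // interval has elapsed, or -1 while still accumulating.
    int frame(uint32_t now_ms) {
        ++frames;
        uint32_t elapsed = now_ms - start_ms;
        if (elapsed < interval_ms)
            return -1;
        int result = static_cast<int>(frames * 1000u / elapsed);
        start_ms = now_ms;   // reset for the next interval
        frames = 0;
        return result;
    }
};
```

In the game loop above you would call `counter.frame(SDL_GetTicks())` right after `SDL_Flip(screen)` and print whenever it returns something other than -1.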

Also note that a much better measure than frames/s is ms/frame. This tells you how long, in milliseconds, it takes to generate one frame on average. There is a great explanation of why FPS is a poor measure for programmers: http://www.mvps.org/directx/articles/fps_versus_frame_time.htm.

See especially the part comparing non-linear FPS against linear ms/frame - it shows how bad a measure FPS is and how little it actually tells you. Leave FPS to GPU marketers and sellers trying to impress customers; it doesn't make sense for measuring changes in performance on a per-frame basis.
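To put my own numbers on that non-linearity (these figures are mine, not from the article): dropping from 900 FPS to 450 FPS costs only about 1.1 ms per frame, while the much smaller-looking drop from 60 FPS to 50 FPS costs about 3.3 ms per frame. A quick sketch of the arithmetic:

```cpp
// Milliseconds per frame for a given frames-per-second rate.
double ms_per_frame(double fps) { return 1000.0 / fps; }

// Real cost, in ms/frame, of dropping from one FPS figure to another.
double frame_time_cost(double fps_before, double fps_after) {
    return ms_per_frame(fps_after) - ms_per_frame(fps_before);
}
```

`frame_time_cost(900, 450)` is about 1.11 ms while `frame_time_cost(60, 50)` is about 3.33 ms, even though the FPS drop looks far more dramatic in the first case.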


Where are we and when are we and who are we?
How many people in how many places at how many times?

Alright, thank you! I will try that out.

I followed the tutorial and used their timer class, but I have an absolutely massive fps, and I'm not sure if that is correct.


#include <SDL.h>
#include "Timer.h"

int main( int argc, char *argv[] )
{
    const int TICKS_PER_SECOND = 25;
    const int SKIP_TICKS = 1000 / TICKS_PER_SECOND;
    const int MAX_FRAMESKIP = 5;

    float interpolation;
    int loops;
    int frame;
    Timer fps;
    Timer update;

    SDL_Init( SDL_INIT_VIDEO );

    SDL_Event event;
    SDL_Surface *screen = SDL_SetVideoMode( 640, 480, 32, SDL_HWSURFACE | SDL_DOUBLEBUF );

    SDL_Surface *bmp = SDL_LoadBMP( "cb.bmp" );
    SDL_Rect source;
    source.x = 0;
    source.y = 0;
    source.h = 150;
    source.w = 120;

    SDL_Rect dest;
    dest.x = 120;
    dest.y = 120;
    dest.h = 150;
    dest.w = 120;

    bool running = true;
    unsigned int next_game_tick = SDL_GetTicks();
    fps.start();
    update.start();
    while ( running )
    {
        while ( SDL_PollEvent( &event ) )
        {
            if ( event.type == SDL_QUIT )
                running = false;

            if ( event.type == SDL_KEYDOWN )
            {
                SDLKey keyPressed = event.key.keysym.sym;

                switch ( keyPressed )
                {
                    case SDLK_ESCAPE:
                        running = false;
                        break;
                    default:
                        break;
                }
            }
        }

        loops = 0;
        while ( SDL_GetTicks() > next_game_tick && loops < MAX_FRAMESKIP )
        {
            // update_game();

            next_game_tick += SKIP_TICKS;
            loops++;
        }

        interpolation = float( SDL_GetTicks() + SKIP_TICKS - next_game_tick ) / float( SKIP_TICKS );

        // display_game( interpolation );
        SDL_BlitSurface( bmp, &source, screen, &dest );
        SDL_Flip( screen );
        frame++;
        if ( update.get_ticks() > 1000 )
        {
            int fpsnum = frame / ( fps.get_ticks() / 1000.f );
            printf( "Rendering at %u frames per second\n", fpsnum );
            update.start();
        }
    }

    SDL_FreeSurface( bmp );
    SDL_Quit();
    return 0;
}

Standard output:
Rendering at 2683765 frames per second
Rendering at 895185 frames per second
Rendering at 537183 frames per second
Rendering at 383724 frames per second
Rendering at 298462 frames per second
Rendering at 244201 frames per second
Rendering at 223834 frames per second
Is that normal?
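That falling pattern suggests a running average, though I can't be certain without running the code: `frame` is never initialized, and neither it nor the `fps` timer is reset when a report is printed, so each report divides every frame counted since startup by the total elapsed time. A sketch of the two calculations, with a made-up inflated starting count standing in for the uninitialized variable:

```cpp
#include <cstdint>

// What the posted code effectively computes: all frames since startup
// divided by total elapsed time (a cumulative running average).
int cumulative_fps(uint32_t total_frames, uint32_t total_ms) {
    return static_cast<int>(total_frames * 1000u / total_ms);
}

// What a per-second report computes: only the frames rendered during
// the last interval, divided by that interval's length.
int interval_fps(uint32_t interval_frames, uint32_t interval_ms) {
    return static_cast<int>(interval_frames * 1000u / interval_ms);
}
```

With an inflated starting count, the cumulative figure keeps shrinking toward the true rate as real seconds accumulate, while the per-interval figure stays steady - which matches the declining output above.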

Alright, so I figured out how to display FPS properly, but now I'm having an issue with SDL_GetTicks(): whenever I assign its result to a variable, the variable's value remains zero, yet when I check SDL_GetTicks() directly, it works fine.


float Timer::timeSinceCreation()//calling this method returns zero as well
{
   return SDL_GetTicks(); 
}

float Timer::timeSinceLastFrame()
{
   float thisTime = SDL_GetTicks();//thistime is always zero
   float deltaTime = thisTime - m_timeOfLastCall;//, thus every other value remains zero
   m_timeOfLastCall = thisTime;
   printf("%u ms \n",thisTime );
   return deltaTime;
}

The program ran fine once, but then it stopped working.

Just realized this seems to be part of a larger problem in this class: none of the variables get assigned, so when I changed deltaTime to 5, it stayed at zero.

SDL_GetTicks returns an unsigned integer (a number of milliseconds), so you shouldn't assign it to a float or return it as a float either. I can't say whether that is the cause of your problem, though.
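Concretely, printing a float with "%u" (as the snippet above does) is undefined behavior in C and C++, which can make values appear stuck at zero. Keeping the timestamps unsigned to match SDL_GetTicks(), and matching every printf specifier to its argument type, avoids the whole class of bug. A small sketch:

```cpp
#include <cstdint>
#include <cstdio>

// Millisecond timestamps stay unsigned, matching what SDL_GetTicks()
// returns; convert only the final result if fractional math is needed.
uint32_t delta_ms(uint32_t now, uint32_t last) {
    return now - last;  // unsigned subtraction also survives wraparound
}

// Matched format specifiers: %u for an unsigned int, %f for a double.
void print_delta(uint32_t ms) {
    std::printf("%u ms (%f s)\n", ms, ms / 1000.0);
}
```

Note that `delta_ms(10, 4294967290u)` still gives the right answer (16) even though the 32-bit tick counter wrapped around in between.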

Thank you! It was part of the problem, actually - I was using %u when I should have been using %f. I've switched my variables over to Uint32 though, because that's what SDL_GetTicks() returns. I've got my FPS working now with a modified timer and a Min/Avg/Max framerate system going.
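For anyone following along, a Min/Avg/Max frame-time tracker can be as small as this - my own sketch, not pixeltasim's actual code:

```cpp
#include <algorithm>
#include <cstdint>

// Tracks the shortest, longest, and average frame time in milliseconds.
struct FrameStats {
    uint32_t min_ms = UINT32_MAX;  // shortest frame seen so far
    uint32_t max_ms = 0;           // longest frame seen so far
    uint64_t total_ms = 0;         // sum of all frame times
    uint32_t frames = 0;           // number of frames recorded

    // Record one frame's duration (e.g. the delta between two
    // SDL_GetTicks() calls).
    void add(uint32_t frame_ms) {
        min_ms = std::min(min_ms, frame_ms);
        max_ms = std::max(max_ms, frame_ms);
        total_ms += frame_ms;
        ++frames;
    }

    double avg_ms() const {
        return frames ? static_cast<double>(total_ms) / frames : 0.0;
    }
};
```

Feeding it the per-frame delta each iteration gives the three figures directly, and ms/frame converts back to FPS as 1000 / avg_ms() when a rate is still wanted.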

