
Frame Rates
Posted by Jason2Jason

Hi, I've just been looking at a reply to a post I made a few months back about frame rates. This person said that to find out how long a frame lasts, do this:

// Cheap example
DWORD lastTime = GetTickCount();
DoRenderStuff();
DWORD change = GetTickCount() - lastTime;
printf("DoRenderStuff() took %lu milliseconds.\n", change);

But I want a frame counter. Since I'm not too good at math, I'd like to ask you guys how to work out how fast my game is running (in frames per second). Thanks for any help, J

PS: coded in C++

Edited by - jason2jason on July 21, 2001 8:43:28 AM

another cheap example then..

// Execute this ONCE, right before entering the main rendering loop:
int frames = 0;
DWORD startTime = GetTickCount();

// Inside the loop, once per frame:
DoRenderStuff();
frames++;

// This can happen whenever you want a reading (average fps since startTime):
float framerate = (float)( 1000 * frames ) / (float)( GetTickCount() - startTime );


Edited by - snowmoon on July 21, 2001 9:13:26 AM

Snowmoon: your code calculates the average framerate since the start of the rendering loop. That isn't very good for measuring the framerate of the moment, because the instantaneous rate can swing widely, for example when you turn around or are surrounded by many enemies. For an exact measurement of the current fps you should use the following:

main loop
{
    lasttime = GetTickCount();

    AI();
    Physics();
    Animation();
    Whatever();
    RenderFrame();

    fps = 1000 / (GetTickCount() - lasttime);
}

This assumes that (GetTickCount() - lasttime) returns a value in milliseconds.
Hope this is correct and helps!

Edited by - Vaporisator on July 21, 2001 9:31:47 AM

How did you test my code?
If the loop content needs less than 1 ms to execute, then (GetTickCount() - lasttime) will surely be zero, and 1000 / 0 will blow up!
You can avoid this problem by enhancing the code like this:

main loop
{
    lasttime = GetTickCount();

    AI();
    Physics();
    Animation();
    Whatever();
    RenderFrame();

    change = GetTickCount() - lasttime;
    if (change != 0)
    {
        fps = 1000 / change;
    }
    else
    {
        fps = 1001; // frame took less than 1 ms; clamp instead of dividing by zero
    }
}

However, it is very unlikely that the framerate will climb above 1000 fps.
Hope at least this helps!
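A small variant, sketched here as an assumption rather than taken from the thread: clamp the elapsed time to a minimum of one millisecond and do the division in floating point, which avoids both the divide-by-zero and the arbitrary 1001 sentinel.

change = GetTickCount() - lasttime;
if (change < 1)
    change = 1;                    // GetTickCount() can't resolve below ~1 ms anyway
fps = 1000.0f / (float)change;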

Two things. First, I would recommend adding the line "lasttime = GetTickCount();" at the bottom of the loop as well; this is more accurate, since the time between iterations of the game loop is then also included. Second, I would recommend timeGetTime over GetTickCount because of its higher resolution.
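A minimal sketch of the loop with both suggestions applied, keeping the pseudocode style of the earlier posts (timeGetTime needs winmm.lib linked and returns milliseconds, just like GetTickCount):

#include <windows.h>
#pragma comment(lib, "winmm.lib") // timeGetTime lives in winmm

lasttime = timeGetTime();

main loop
{
    AI();
    Physics();
    Animation();
    Whatever();
    RenderFrame();

    now = timeGetTime();
    change = now - lasttime;
    if (change != 0)
        fps = 1000 / change;
    lasttime = now; // reset at the bottom so time between iterations is counted too
}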

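The post below uses a CTimer class without showing its declaration. A minimal declaration that the methods assume, reconstructed from the member names they use, would be something like this:

class CTimer
{
public:
    CTimer();
    ~CTimer();
    void Init();
    void GetStartTime();                   // call at the top of the game loop
    void GetEndTime();                     // call at the bottom of the game loop
    void Wait(unsigned int iMilliseconds);
    void GetFPS();

    bool          UseQPC;                  // true if QueryPerformanceCounter is available
    __int64       Frequency;               // ticks per second (1000 in the timeGetTime fallback)
    float         fResolution;             // seconds per tick
    float         StartTime;               // in ms, set once in Init()
    float         FrameStartTime;          // in ms, set each frame
    float         FrameEndTime;            // in ms, set each frame
    float         fDelta;                  // ms elapsed this frame
    unsigned long FrameCount;              // frames counted since Init()
    float         FramesPerSecond;         // result of GetFPS()
};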
CTimer::CTimer()
{
    Init();
}

CTimer::~CTimer()
{
}

void CTimer::Init()
{
    if (!QueryPerformanceFrequency((LARGE_INTEGER *) &Frequency))
    {
        // No high-resolution counter; fall back to timeGetTime (1 ms resolution)
        UseQPC = false;
        fResolution = 1.0f / 1000.0f;
        Frequency = 1000;
        timeBeginPeriod(1);
        StartTime = (float)timeGetTime();
        timeEndPeriod(1);
    }
    else
    {
        __int64 time;
        UseQPC = true;
        fResolution = (float)(1.0 / (double)Frequency);
        QueryPerformanceCounter((LARGE_INTEGER *) &time);
        StartTime = (float)time * fResolution * 1000.0f; // convert ticks to milliseconds
    }
    FrameCount = 0;
}

void CTimer::GetStartTime() // Place at the beginning of the game loop
{
    __int64 Time;

    if (UseQPC)
    {
        QueryPerformanceCounter((LARGE_INTEGER *)&Time);
        FrameStartTime = (float)Time * fResolution * 1000.0f;
    }
    else
    {
        timeBeginPeriod(1);
        FrameStartTime = (float)timeGetTime(); // already in milliseconds
        timeEndPeriod(1);
    }
}

void CTimer::GetEndTime() // Place at the end of the game loop
{
    __int64 Time;

    if (UseQPC)
    {
        QueryPerformanceCounter((LARGE_INTEGER *)&Time);
        FrameEndTime = (float)Time * fResolution * 1000.0f;
    }
    else
    {
        timeBeginPeriod(1);
        FrameEndTime = (float)timeGetTime();
        timeEndPeriod(1);
    }

    FrameCount++;

    fDelta = FrameEndTime - FrameStartTime;
}

void CTimer::Wait(unsigned int iMilliseconds) // 33 ms = ~30 fps
{
    if (fDelta > 0) // Make sure they called the Start and EndTime functions
    {
        while (fDelta < iMilliseconds)
        {
            GetEndTime();
            FrameCount--; // GetEndTime() counts a frame; undo that while busy-waiting
        }
    }
}

void CTimer::GetFPS() // Be sure to call after GetEndTime()
{
    // FramesPerSecond = 1.0f / (fDelta / 1000.0f); // instantaneous: convert ms to seconds

    FramesPerSecond = FrameCount / ((FrameEndTime - StartTime) / 1000.0f); // average since Init()
}
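A sketch of how the class is presumably meant to be driven; the 33 ms wait is the poster's own example value for roughly 30 fps:

CTimer timer; // Init() runs in the constructor

main loop
{
    timer.GetStartTime();

    DoRenderStuff();

    timer.GetEndTime();
    timer.Wait(33);  // optional: hold the frame time to ~33 ms
    timer.GetFPS();  // updates timer.FramesPerSecond
}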


HHSDrum@yahoo.com
Polarisoft Home Page

// globals
int frames = 0;
float framerate = 0.0f;
float startTime = (float)GetTickCount();

// each frame:
DoRenderStuff();

frames++;
if (frames >= 10)
{
    float time = (float)GetTickCount();
    if (time == startTime)
        time++; // guard against a zero interval
    framerate = (float)( 1000 * frames ) / ( time - startTime );
    frames = 0;
    startTime = (float)GetTickCount();
}

This will give you a more accurate reading, since it takes the average frame rate over the last 10 frames instead of from the beginning of the program, and it makes hotspots easier to spot because the readout isn't updated so fast that you can't read the differences.

Seeya
Krippy

quote:
Original post by Krippy2k (quoted in full above)

I don''t know what the price of an int to float is, but I know it costs something-and you''re doing it multiple times. If you''re comparing avg frame rate every 10 frames, the frame rate would drop, giving you an incorrect fps. Better keep the var frames as a float and keep the casts to a minimum.

I'd like to think that whatever rendering he is doing will cost a magnitude more than any fps calculations that might need to be made. This being the case, there would have to be a lot more programming before it made any difference.

So the cost of a few casts is nothing, especially at the lower framerates, where there is a big difference between 9 and 9.99999999. At the higher framerates it will matter even less.

If you only want to update the readout at a specific interval, I recommend

if ( frames % 10 == 0 )

so it only executes once every 10th frame.
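A sketch of that in context, combining the modulo test with Krippy2k's counters; lastUpdate is an added variable, not from the thread:

// setup, once:
unsigned int frames = 0;
DWORD lastUpdate = GetTickCount();

// each frame:
frames++;
if (frames % 10 == 0)
{
    DWORD now = GetTickCount();
    if (now != lastUpdate)
        framerate = 10000.0f / (float)(now - lastUpdate); // 10 frames * 1000 ms per second
    lastUpdate = now;
}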
