
# Frame Rates


## Recommended Posts

Hi, I've just been looking at a reply to a post I made a few months back about frame rates. This person said that to get how long a frame lasts, do this:

// Cheap example
DWORD lastTime = GetTickCount();
DoRenderStuff();
DWORD change = GetTickCount() - lastTime;
printf("DoRenderStuff() took %lu milliseconds.\n", change);

But I want a frame counter. Since I'm not too good at math, I'd like to ask you guys how to work out how fast my game is running (in frames per second). Thanks for any help, J

PS: coded in C++

Edited by - jason2jason on July 21, 2001 8:43:28 AM
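A portable sketch of that "cheap example" using C++11's std::chrono instead of the Win32-only GetTickCount (the function names and the placeholder workload below are illustrative, not from the thread):

```cpp
#include <chrono>
#include <cstdio>

// Stand-in for whatever rendering work you want to measure.
void DoRenderStuff() {
    volatile long sink = 0;
    for (long i = 0; i < 100000; ++i) sink += i;   // placeholder workload
}

// Time one call to DoRenderStuff() and report the duration in milliseconds.
double timeOneFrameMs() {
    auto lastTime = std::chrono::steady_clock::now();
    DoRenderStuff();
    double changeMs = std::chrono::duration<double, std::milli>(
        std::chrono::steady_clock::now() - lastTime).count();
    std::printf("DoRenderStuff() took %f milliseconds.\n", changeMs);
    return changeMs;
}
```

steady_clock is preferable to a wall clock here because it never jumps backwards, so the measured delta can't go negative.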

##### Share on other sites
another cheap example then..

// Execute ONCE, right before the main rendering loop
frames = 0;
startTime = GetTickCount();

// Each iteration of the rendering loop
DoRenderStuff();
frames++;

// This can happen whenever you want a reading
framerate = (float)( 1000 * frames ) / (float)( GetTickCount() - startTime );

Edited by - snowmoon on July 21, 2001 9:13:26 AM
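The same average-since-start counter can be sketched portably with std::chrono in place of GetTickCount (assuming C++11; the function name is mine, and the zero-elapsed guard anticipates the divide-by-zero issue discussed below):

```cpp
#include <chrono>

using FrameClock = std::chrono::steady_clock;

// Average frames-per-second since startTime, in the spirit of the
// GetTickCount version above: fps = 1000 * frames / elapsed-milliseconds.
double averageFps(long frames, FrameClock::time_point startTime) {
    double elapsedMs =
        std::chrono::duration<double, std::milli>(FrameClock::now() - startTime).count();
    if (elapsedMs <= 0.0)          // avoid dividing by zero on the very first call
        return 0.0;
    return (1000.0 * frames) / elapsedMs;
}
```

Typical use: record `startTime = FrameClock::now()` once before the loop, increment `frames` every iteration, and call `averageFps(frames, startTime)` whenever you want a reading.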

##### Share on other sites
Snowmoon: your code calculates the average framerate since the start of the rendering loop! That isn't ideal for measuring the framerate of the moment, because the instantaneous rate can vary widely, for example when you turn around or are surrounded by many enemies. For an exact measurement of the current fps, you could use the following:

main loop
{
lasttime = GetTickCount();

AI();
Physics();
Animation();
Whatever();
RenderFrame();

fps = 1000 / (GetTickCount() - lasttime);
}

This assumes that (GetTickCount() - lasttime) returns a value in milliseconds.
Hope this is correct and helps!

Edited by - Vaporisator on July 21, 2001 9:31:47 AM

##### Share on other sites
Vaporisator, your code gives a divide-by-zero error, but thanks anyway.

##### Share on other sites
How did you test my code?
If the loop content needs less than 1 ms to execute, then (GetTickCount() - lasttime) will indeed return zero!
You can avoid this problem by extending the code like this:

main loop
{
lasttime = GetTickCount();

AI();
Physics();
Animation();
Whatever();
RenderFrame();

change = (GetTickCount() - lasttime);
if (change != 0)
{
fps = 1000 / change;
}
else
{
fps = 1001;
}
}

However, it is very unlikely that the framerate will climb above 1000 fps.
Hope this helps, at least.

##### Share on other sites
Two things: I would recommend adding the line 'lasttime = GetTickCount();' at the bottom of the loop as well. This is more accurate, since the time between calls to the game loop is then also included. Also, I would recommend timeGetTime over GetTickCount because of its higher resolution.
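A sketch of that suggestion (struct and member names are mine; std::chrono stands in for timeGetTime as a portable high-resolution clock): resetting the timestamp at the bottom of the loop means each frame's delta also covers time spent between iterations, not just the measured calls.

```cpp
#include <chrono>

// Instantaneous fps meter. One timestamp is kept and reset at the *bottom*
// of the loop, so the next delta includes input handling, vsync waits, and
// anything else that happens between frames.
struct FpsMeter {
    using Clock = std::chrono::steady_clock;
    Clock::time_point lasttime = Clock::now();
    double fps = 0.0;

    // Call once at the end of every frame.
    void tick() {
        Clock::time_point now = Clock::now();
        double changeMs =
            std::chrono::duration<double, std::milli>(now - lasttime).count();
        if (changeMs > 0.0)
            fps = 1000.0 / changeMs;   // instantaneous fps for this frame
        lasttime = now;                // reset at the bottom, per the advice above
    }
};
```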

##### Share on other sites
CTimer::CTimer()
{
    Init();
}

CTimer::~CTimer()
{
}

void CTimer::Init()
{
    if (!QueryPerformanceFrequency((LARGE_INTEGER *) &Frequency))
    {
        UseQPC = false;
        fResolution = 1.0f / 1000.0f;
        Frequency = 1000;
        timeBeginPeriod(1);
        StartTime = (float)timeGetTime();
        timeEndPeriod(1);
    }
    else
    {
        __int64 time;
        UseQPC = true;
        fResolution = (float)(((double)1.0f) / ((double)Frequency));
        QueryPerformanceCounter((LARGE_INTEGER *) &time);
        StartTime = (float)time * fResolution * 1000.0f;
    }
    FrameCount = 0;
}

void CTimer::GetStartTime() // Place at the beginning of the game loop
{
    __int64 Time;
    if (UseQPC)
    {
        QueryPerformanceCounter((LARGE_INTEGER *)&Time);
        FrameStartTime = (float)Time * fResolution * 1000.0f;
    }
    else
    {
        timeBeginPeriod(1);
        FrameStartTime = (float)timeGetTime();
        timeEndPeriod(1);
        FrameStartTime = FrameStartTime * fResolution * 1000.0f;
    }
}

void CTimer::GetEndTime() // Place at the end of the game loop
{
    __int64 Time;
    if (UseQPC)
    {
        QueryPerformanceCounter((LARGE_INTEGER *)&Time);
        FrameEndTime = (float)Time * fResolution * 1000.0f;
    }
    else
    {
        timeBeginPeriod(1);
        FrameEndTime = (float)timeGetTime();
        timeEndPeriod(1);
        FrameEndTime = FrameEndTime * fResolution * 1000.0f;
    }
    FrameCount++;
    fDelta = FrameEndTime - FrameStartTime;
}

void CTimer::Wait(unsigned int iMilliseconds) // 33 ms = 30 fps
{
    if (fDelta > 0) // Make sure they called the Start and EndTime functions
    {
        while (fDelta

HHSDrum@yahoo.com

##### Share on other sites
btw, I don't use the 'wait' function. Some people like to lock the frame rate; that's why it's there.

HHSDrum@yahoo.com

##### Share on other sites
// global
int frames = 0;
float startTime = GetTickCount();

// each frame
DoRenderStuff();
frames++;

if (frames >= 10)
{
    float time = (float)GetTickCount();
    if (time == startTime)
        time++;
    framerate = (float)( 1000 * frames ) / ( time - startTime );
    frames = 0;
    startTime = GetTickCount();
}

This will give you a more accurate reading, since it takes the average frame rate every 10 frames instead of from the beginning of the program. It also makes hotspots easier to spot, because the value is not updated so fast that you can't read the differences.

Seeya
Krippy

##### Share on other sites
quote:
Original post by Krippy2k

// global
int frames = 0;
float startTime = GetTickCount();

// each frame
DoRenderStuff();
frames++;

if (frames >= 10)
{
    float time = (float)GetTickCount();
    if (time == startTime)
        time++;
    framerate = (float)( 1000 * frames ) / ( time - startTime );
    frames = 0;
    startTime = GetTickCount();
}

This will give you a more accurate reading, since it takes the average frame rate every 10 frames instead of from the beginning of the program. It also makes hotspots easier to spot, because the value is not updated so fast that you can't read the differences.

Seeya
Krippy

I don't know what the price of an int-to-float conversion is, but I know it costs something, and you're doing it multiple times. If you recompute the average frame rate every 10 frames, those conversions add overhead, lowering the very frame rate you are measuring. Better to keep the frames variable as a float and keep the casts to a minimum.
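The every-N-frames averaging from the quoted post can be sketched portably like this (struct and member names are mine; integer frame counts are kept and only the elapsed time is floating-point, which addresses the cast concern; N = 10 is the window used in the thread):

```cpp
#include <chrono>

// Recompute the average fps once every windowSize frames, then restart the
// window, as in the Krippy2k post above.
struct AveragedFps {
    using Clock = std::chrono::steady_clock;
    int frames = 0;
    Clock::time_point windowStart = Clock::now();
    double fps = 0.0;

    // Call once per frame.
    void tick(int windowSize = 10) {
        if (++frames < windowSize)
            return;
        Clock::time_point now = Clock::now();
        double elapsedMs =
            std::chrono::duration<double, std::milli>(now - windowStart).count();
        if (elapsedMs > 0.0)                       // guard against a zero window
            fps = (1000.0 * frames) / elapsedMs;
        frames = 0;                                // start the next window
        windowStart = now;
    }
};
```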
