
AdamDawes

Timing methods


I was wondering what approaches you take to timing your OpenGL apps -- by which I mean, ensuring they run at the same speed on all systems, regardless of CPU speed and graphics card capabilities.

I assumed the easiest way to do this would be to call GetTickCount() at the start of my rendering code. This works, but on a faster machine I'm finding that the code runs at a higher frequency than GetTickCount() updates. So some of my frames appear identical to the previous frame, making everything look very blocky.

The only workaround I've found so far is to render the scene for one second without moving anything and count the frames drawn. That count can then be used to calculate the speed at which the scene should be updated. The two main downsides, though, are that I have to pause at the start of rendering, and that if the rendering complexity changes once I've started, I can't detect how much faster or slower the scene is being displayed.

What other methods are available for this? I've seen some references in MSDN to "high resolution timers" and "multimedia timers" but haven't had a chance to investigate them yet -- are these what I'm looking for?

Many thanks,
Adam.
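[Editor's note: the general technique the replies describe -- measure the elapsed time each frame and scale all movement by it -- can be sketched portably with std::chrono instead of the Win32 timers used in the code that follows. This is an illustration of the idea, not the poster's code; FrameTimer and MoveBy are made-up names.]

```cpp
#include <chrono>

// Measures wall-clock time between successive Tick() calls, in seconds.
// Scaling all per-frame movement by this "delta time" makes the scene
// advance at the same real-world rate on fast and slow machines.
class FrameTimer
{
public:
    using Clock = std::chrono::steady_clock;

    FrameTimer() : last(Clock::now()) {}

    // Seconds elapsed since the previous call (or since construction).
    float Tick()
    {
        Clock::time_point now = Clock::now();
        float dt = std::chrono::duration<float>(now - last).count();
        last = now;
        return dt;
    }

private:
    Clock::time_point last;
};

// Distance moved this frame = speed (units/second) * elapsed seconds.
inline float MoveBy(float unitsPerSecond, float dtSeconds)
{
    return unitsPerSecond * dtSeconds;
}
```

Each frame you would call Tick() once and multiply every velocity by the result, instead of moving objects by a fixed amount per frame.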

OK, I'll just rip this out of my code. I wrote it ages ago, so it's very ugly (classes within classes -- *shudder*) and I'll be rewriting it soon anyway.

It chooses a multimedia timer or a performance timer depending on which is available.

The only thing you need to know is that the multimedia timer requires linking against winmm.lib.

I don't have the includes listed here (they were in the stdafx precompiled header), so you'll have to look the functions up in your docs to find which headers they need.

Timer.h


  

class CTimer
{
public:
    CTimer();
    virtual ~CTimer();

    // Returns the length of the last frame in seconds.
    float GetFrameTickLength();

protected:
    void SetTimersToIdentity();
    bool Identity;

    class CMultiMediaTimer
    {
    public:
        friend class CTimer;
    protected:
        unsigned long iTimerInit;
        unsigned long iTimerFrameTime;
        unsigned long iTimerElapsed;
    } MultiMediaTimer;

    class CPerformanceTimer
    {
    public:
        friend class CTimer;
    protected:
        __int64 iPerformanceTimerInit;
        __int64 iPerformanceTimerFrameTime;
        __int64 iPerformanceTimerElapsed;
        __int64 iTimerFrequency;
    } PerformanceTimer;

    bool bPerformanceCounterSupported;
};




Timer.cpp

  
// These lived in the stdafx precompiled header:
#include <windows.h>
#include <mmsystem.h>   // timeGetTime(); link with winmm.lib

CTimer::CTimer()
{
    MultiMediaTimer.iTimerInit = 0;
    MultiMediaTimer.iTimerFrameTime = 0;
    MultiMediaTimer.iTimerElapsed = 0;

    PerformanceTimer.iPerformanceTimerInit = 0;
    PerformanceTimer.iPerformanceTimerFrameTime = 0;
    PerformanceTimer.iPerformanceTimerElapsed = 0;

    // QueryPerformanceFrequency() returns non-zero when a high-resolution
    // counter is available; fall back to the multimedia timer otherwise.
    if (QueryPerformanceFrequency((LARGE_INTEGER *) &PerformanceTimer.iTimerFrequency))
    {
        bPerformanceCounterSupported = true;
        QueryPerformanceCounter((LARGE_INTEGER *) &PerformanceTimer.iPerformanceTimerInit);
        PerformanceTimer.iPerformanceTimerElapsed = PerformanceTimer.iPerformanceTimerInit;
    }
    else
    {
        bPerformanceCounterSupported = false;
        MultiMediaTimer.iTimerInit = timeGetTime();
        MultiMediaTimer.iTimerElapsed = MultiMediaTimer.iTimerInit;
    }
    Identity = false;
}

CTimer::~CTimer()
{
}

// Returns the time in seconds since the previous call.
float CTimer::GetFrameTickLength()
{
    if (!Identity)
    {
        // First call: prime the frame-time members so the first real
        // measurement has a valid starting point.
        Identity = true;
        SetTimersToIdentity();
    }

    if (bPerformanceCounterSupported)
    {
        __int64 iCurrentTick;
        QueryPerformanceCounter((LARGE_INTEGER *) &iCurrentTick);
        float ret = (float)(iCurrentTick - PerformanceTimer.iPerformanceTimerFrameTime);
        PerformanceTimer.iPerformanceTimerFrameTime = iCurrentTick;
        return ret / (float)PerformanceTimer.iTimerFrequency;
    }
    else
    {
        unsigned long iCurrentTick = timeGetTime();
        float ret = (float)(iCurrentTick - MultiMediaTimer.iTimerFrameTime);
        MultiMediaTimer.iTimerFrameTime = iCurrentTick;
        return ret * 0.001f;   // timeGetTime() reports milliseconds
    }
}

void CTimer::SetTimersToIdentity()
{
    GetFrameTickLength();
}
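[Editor's note: whichever timer ends up selected, the returned frame length is used the same way -- multiply every per-second rate by it each frame. A small usage sketch; the Ship struct and Update function are made up for illustration, and frameSeconds stands in for the value a timer such as CTimer::GetFrameTickLength() returns, so the sketch stays portable.]

```cpp
// Hypothetical game object -- not part of the original post.
struct Ship
{
    float x;       // position in world units
    float speed;   // velocity in units per second
};

// Advance the simulation by one variable-length frame.
// frameSeconds is the delta time a per-frame timer would return.
void Update(Ship &ship, float frameSeconds)
{
    ship.x += ship.speed * frameSeconds;
}
```

Because movement is proportional to real elapsed time, a machine drawing 30 frames per second and one drawing 300 both move the ship the same distance per second.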

In your program's init code call
timeBeginPeriod( 1 );

Now, on most Windows OSs, timeGetTime() gives you a timer accurate to 1 ms. (Call the matching timeEndPeriod( 1 ) when your program shuts down.)

http://uk.geocities.com/sloppyturds/gotterdammerung.html
