Concentrate

Timer class


In my previous timer class I used clock() to get the time. Now I am trying QueryPerformanceCounter, but something is giving me trouble. Here is my incomplete class so far:
class HighTimer
{
public:        
	typedef LARGE_INTEGER TimeType;	
	typedef double	ReturnType;
private:
	//helpers
	void _GetTime(TimeType * timeVar){
		QueryPerformanceCounter(timeVar);
	}
	void _calcFrequency(TimeType* timeVar){
		QueryPerformanceFrequency(timeVar);
	}
	
private:
	TimeType _startTime;
	TimeType _endTime;
	TimeType _ticksPerSecond;
public:
	HighTimer(){
		_startTime.QuadPart = 0;
		_endTime.QuadPart = 0;	
		_ticksPerSecond.QuadPart = 0;
		_calcFrequency(&_ticksPerSecond);		
	}
	void start(){
		_GetTime(&_endTime);
	}
	LONGLONG getRawDiff(){
		return (_endTime.QuadPart - _startTime.QuadPart);
	}	
	ReturnType getDiffSec(){
		return getRawDiff()/ReturnType(_ticksPerSecond.QuadPart);
	}
	ReturnType getDiffMilli(){
		return ( getDiffSec() * 1000 ) * 0.001;
	}
};

The output is not what I expected. For example, this trial:
 
int main()
{
	HighTimer ht; //uses QueryPerformanceCounter
	LowTimer  lt; //uses clock()

	lt.start();
	ht.start();

	for(int i = 0; i < 100*100; i++){ cout << ""; } //waste time

	cout.precision(20);
	cout << fixed;

	cout<<"Clock result :\n";
	cout << lt.getDiffMilli() << endl;
	cout << lt.getDiffSec() << endl;
	

	cout<<"Query result : \n";
	cout << ht.getRawDiff() << endl;
	cout << ht.getDiffMilli() << endl;
	cout << ht.getDiffSec() << endl;

	return 0;
}

That produces this result:
Clock result :
2.00000000000000000000 //milliseconds
0.00200000009499490260 //seconds
Query result :
187245824782 //raw
110894.44495692936000000000 //supposedly milliseconds??
110894.44495692935000000000 //supposedly seconds??
So the seconds value in the Query result is wrong. Can someone help?

There are a few things wrong with your timer class. The obvious ones:

- In start() you set the end time. Shouldn't that be the start time?
- getRawDiff() uses the end time, but it is never set anywhere (judging by the function above). You must call _GetTime(&_endTime) first.
- When calculating milliseconds in getDiffMilli(), multiply the number of seconds by 1000, not by 1000 and then by 0.001 (which cancels out and just gives you seconds again).

Emiel1

This is how to use QueryPerformanceCounter:

timer.h

class Timer
{
protected:
	double m_frequency;
	__int64 m_startClock;

	float m_frameStart;
	float m_frameEnd;
	float m_frameTime;

public:
	float getFrameTime() { return m_frameTime; }
	float getFps() { return 1 / m_frameTime; }

	double getTime();

	void init();
	void update();
};




timer.cpp

#include <windows.h>
#include "timer.h"

double Timer::getTime()
{
	__int64 endClock;

	QueryPerformanceCounter((LARGE_INTEGER*)&endClock);

	return static_cast<double>(endClock - m_startClock) * m_frequency;
}

void Timer::init()
{
	__int64 rate;

	QueryPerformanceFrequency((LARGE_INTEGER*)&rate);
	m_frequency = 1.0 / static_cast<double>(rate);

	QueryPerformanceCounter((LARGE_INTEGER*)&m_startClock);

	m_frameStart = static_cast<float>(getTime());
	m_frameEnd = 0.0f;
	m_frameTime = 0.0f;
}

void Timer::update()
{
	m_frameEnd = static_cast<float>(getTime());
	m_frameTime = m_frameEnd - m_frameStart;
	m_frameStart = m_frameEnd;
}


Can someone help? I am trying this new timer class, but it does not work properly: it does not update my draw function as it should.

Here is the Timer class,

class HighTimer
{
public:
	typedef LARGE_INTEGER TimeType;
	typedef double ReturnType;
private:
	//helpers
	void _GetTime(TimeType* timeVar){
		QueryPerformanceCounter(timeVar);
	}
	void _calcFrequency(TimeType* timeVar){
		QueryPerformanceFrequency(timeVar);
	}
	ReturnType ticksPerSecond(){
		return _ticksPerSecond.QuadPart;
	}

private:
	TimeType _startTime;
	TimeType _endTime;
	TimeType _ticksPerSecond;
public:
	HighTimer(){
		_startTime.QuadPart = 0;
		_endTime.QuadPart = 0;
		_ticksPerSecond.QuadPart = 0;
		_calcFrequency(&_ticksPerSecond);
	}
	void start(){
		_GetTime(&_startTime);
	}
	void reset(){
		_startTime = _endTime;
	}
	LONGLONG getRawDiff(){
		_GetTime(&_endTime);
		return (_endTime.QuadPart - _startTime.QuadPart);
	}
	ReturnType getDiffSec(){
		return getRawDiff() / ticksPerSecond();
	}
	ReturnType getDiffDeci(){
		return getDiffSec() * 10;
	}
	ReturnType getDiffCenti(){
		return getDiffSec() * 100;
	}
	ReturnType getDiffMilli(){
		return getDiffSec() * 1000;
	}
	bool hasSecondPassed(){
		return getDiffSec() > 1;
	}
};



I don't know if I did something wrong. Here is how I am using it.

GLvoid disp()
{
	update.start();

	static float accumlator = 0.0f;
	static const float deltaTime = 0.01f; //millisecond

	if(update.getDiffSec() >= deltaTime)
	{
		cout << update.getDiffSec() << endl; //does not get in here as often as it should

		accumlator += update.getDiffSec();

		update.reset();

		glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
		glLoadIdentity();
		glTranslatef(0,0,-30);

		while(accumlator >= deltaTime){
			spring.drawSpring(deltaTime);
			accumlator -= deltaTime;
		}

		glutPostRedisplay();
		glutSwapBuffers();
	}
}



It does not get inside the if statement very often, so my spring rarely moves; in fact it looks still.

Well first, a millisecond is 0.001 seconds, not 0.01 like in your disp() function.

Second, you're starting your timer immediately before checking how long has elapsed. It would be extremely rare for two back-to-back calls to QueryPerformanceCounter() to differ by even 1 ms, to say nothing of the 10 ms in your code. Consider calling update.start() outside of disp(), perhaps immediately before whatever loop is calling disp().
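Stripped of the GL calls, the fixed-timestep pattern the code is reaching for looks like this (pure logic only; the function and variable names are illustrative). The timer is started once before the loop; each frame you read the elapsed time, restart the timer, then run as many fixed steps as fit:

```cpp
// One frame of a fixed-timestep loop: the frame's elapsed time is added to
// the accumulator, then as many fixed-size simulation steps as fit are run.
// Returns the number of steps taken so the caller can observe the pacing.
int stepSimulation(double frameSeconds, double& accumulator, double deltaTime)
{
	accumulator += frameSeconds;
	int steps = 0;
	while (accumulator >= deltaTime) {
		// spring.drawSpring(deltaTime) would go here
		accumulator -= deltaTime;
		++steps;
	}
	return steps;
}
```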

I was messing around with it, and it works fine now, but one problem remains when the application first starts: it stalls for a few seconds with an empty transparent screen, then works normally. I debugged it and found that initially the accumulator gets the value 4000 because update.getDiffSec() returns a number around 4000, and it then gets stuck in the while(accumlator >= deltaTime) loop. I would appreciate it if someone could help me find out why getDiffSec initially returns a number around 4000.

GLvoid disp()
{
	update.start();

	static float accumlator = 0.0f;
	static const float deltaTime = 0.01f; //10 milliseconds

	if(update.getDiffSec() >= deltaTime)
	{
		cout << update.getDiffSec() << endl; //does not get in here as often as it should

		accumlator += update.getDiffSec(); //first returns around 4000

		update.reset();

		glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
		glLoadIdentity();
		glTranslatef(0,0,-30);

		while(accumlator >= deltaTime)
		{
			spring.drawSpring(deltaTime);
			accumlator -= deltaTime;
		}
		glBegin(GL_LINES);
		glVertex3f(spring.getRestLength().getX(),-5,0);
		glVertex3f(spring.getRestLength().getX(),5,0);
		glEnd();
		glutPostRedisplay();
		glutSwapBuffers();
	}
}



The timer class :

class HighTimer
{
public:
	typedef LARGE_INTEGER TimeType;
	typedef double ReturnType;
private:
	//helpers
	void _GetTime(TimeType* timeVar){
		QueryPerformanceCounter(timeVar);
	}
	void _calcFrequency(TimeType* timeVar){
		QueryPerformanceFrequency(timeVar);
	}
	ReturnType ticksPerSecond(){
		return _ticksPerSecond.QuadPart;
	}

private:
	TimeType _startTime;
	TimeType _endTime;
	TimeType _ticksPerSecond;
public:
	HighTimer(){
		_startTime.QuadPart = 0;
		_endTime.QuadPart = 0;
		_ticksPerSecond.QuadPart = 0;
		_calcFrequency(&_ticksPerSecond);
	}
	void start(){
		_GetTime(&_endTime);
	}
	void reset(){
		_startTime = _endTime;
	}
	LONGLONG getRawDiff(){
		_GetTime(&_endTime);
		return (_endTime.QuadPart - _startTime.QuadPart);
	}
	ReturnType getDiffSec(){
		return getRawDiff() / ticksPerSecond();
	}
	ReturnType getDiffDeci(){
		return getDiffSec() * 10;
	}
	ReturnType getDiffCenti(){
		return getDiffSec() * 100;
	}
	ReturnType getDiffMilli(){
		return getDiffSec() * 1000;
	}
	bool hasSecondPassed(){
		return getDiffSec() > 1;
	}
};


From what you've posted, you still aren't initializing your HighTimer's _startTime. The call to update.start() does not set it, so it remains 0 until the call to update.reset(). So when you call update.getDiffSec(), you're basically doing "_endTime.QuadPart - 0", where _endTime.QuadPart is whatever was inside your CPU's performance counter. The result is a huge number, and your program stalls.

The performance counter is a part of your CPU, and does not reset to 0 each time you start your application. You must initialize _startTime with a value from the performance counter before using it to calculate a time difference from _endTime.

The only reason it appears to work after it gets through the stall is that you call update.reset(), which sets _startTime to something reasonable. Try calling update.reset() outside of disp(), before whatever loop you're calling disp() from, and your problem will go away.
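Here is a toy model of why the first getDiffSec() is around 4000 (the numbers are made up, not real counter readings):

```cpp
// The performance counter is already huge when the program starts. With a
// zero start time, the first difference is the counter's absolute value;
// after sampling the start time once, differences become small again.
struct CounterModel
{
	long long now;    // pretend current QueryPerformanceCounter reading
	long long start;  // stays 0 until the timer is properly started
	long long freq;   // pretend ticks per second

	void startTimer() { start = now; }
	double diffSec() const {
		return static_cast<double>(now - start) / static_cast<double>(freq);
	}
};
```

With freq of 1,000,000 and a counter that has been running for 4000 seconds, diffSec() reports 4000 until startTimer() is called; afterwards it tracks only the newly elapsed ticks.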
