Arcibald Wearlot

QueryPerformanceCounter() buggy? Please run this test


I had some jerky movement in my game and tried to track down the cause. The problem is timing: my DeltaTime value sometimes jumps. I wrote this small test and ran it:
#include <windows.h>
#include <iostream>
using namespace std;

int main()
{
	__int64 Freq;
	__int64 CurrentTicks;
	__int64 LastTicks;

	QueryPerformanceFrequency((LARGE_INTEGER*)&Freq);
	QueryPerformanceCounter((LARGE_INTEGER*)&CurrentTicks);
	LastTicks=CurrentTicks;

	for(int i=0;i<1000;++i)
	{
		LastTicks=CurrentTicks;
		QueryPerformanceCounter((LARGE_INTEGER*)&CurrentTicks);

		// Seconds since the previous QueryPerformanceCounter() call
		// (spans the Sleep and the output of the last iteration).
		float Delta=((float)(CurrentTicks-LastTicks))/((float)Freq);

		Sleep(1);

		cout<<(DWORD)CurrentTicks<<"  "<<Delta;

		if(Delta>0.02f)
		{
			cout<<"......................JUMP";
		}
		cout<<endl;
	}
	return 0;
}

I get quite regular jumps, every 65 iterations or so. This is an example of the output:
1953944988  0.0156953
1954000706  0.0155657
1954056766  0.0156612
1954112508  0.0155724
1954169047  0.015795
1954262938  0.0262299......................JUMP
1954281876  0.00529062
1954337144  0.01544
1954392172  0.0153729
1954451183  0.0164856
1954504206  0.0148128
1954559991  0.0155844
1954616100  0.0156749
See? The current tick count (and consequently the delta time) advances too much, but on the following iteration the value is very small, "compensating" for the previous one. This is really weird. Could you please run this little test on your computer? Thanks! You will probably need to change the 0.02 value, because the loop may run at a different speed on another PC.
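As an aside (an editor's sketch, not something suggested in the thread): a common way to stop a single oversized delta like the one above from producing a visible jump is to clamp the raw value into a plausible range before using it. The function name and the limits below are illustrative:

```cpp
// Illustrative workaround: clamp one raw frame delta into a plausible
// range so a single long tick cannot produce a visible jump in movement.
// Suitable MinDelta/MaxDelta values depend on the game's frame rate.
float ClampDelta(float Raw, float MinDelta, float MaxDelta)
{
	if (Raw < MinDelta) return MinDelta;
	if (Raw > MaxDelta) return MaxDelta;
	return Raw;
}
```

With the output above, a clamp at 0.02 would turn the 0.0262 spike into a normal-length frame while leaving the regular ~0.0156 deltas untouched.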

Just one more thing: QueryPerformanceCounter() is the only way for me to do timing, because both GetTickCount() and timeGetTime() have a resolution of 15 ms on my PC!
Since a game frame takes 5-8 ms, I get DeltaTime == 0 on two frames out of three.
Is this normal?
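(Editor's note, a sketch assuming a C++11-capable compiler, which postdates this thread: std::chrono::steady_clock is a portable high-resolution alternative to GetTickCount()/timeGetTime(); on Windows it is typically implemented on top of QueryPerformanceCounter, so it does not suffer from the ~15 ms tick granularity.)

```cpp
#include <chrono>

// Seconds elapsed since Start, with sub-millisecond granularity on
// mainstream platforms -- fine for 5-8 ms game frames, unlike the
// ~15 ms tick of GetTickCount()/timeGetTime().
double SecondsSince(std::chrono::steady_clock::time_point Start)
{
	return std::chrono::duration<double>(
		std::chrono::steady_clock::now() - Start).count();
}
```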

Funny you should mention this; there is actually a fantastic article here on GameDev that should give you some insight into why there are so many timing issues on Windows (including some useful references).

The article can be found here: Timing Pitfalls and Solutions.

Edit: P.S. Although I believe this is mentioned in the article, a warning to refute the Anonymous Poster's suggestion: while you can indeed change the interrupt period of the standard Windows timer (i.e. raise its resolution), this can have adverse effects on performance (unusually frequent context switching). Usually the period is about 10 ms, give or take, but as you noticed this isn't always the case.

Hope that helps.

Sleep() is nondeterministic with respect to scheduling: Sleep(1) only guarantees the thread sleeps for at least 1 ms, and the actual wake-up time depends on the scheduler. This explains why you get varying values.

Also, if I were using this function, I would always compare the current time against a fixed origin of some sort; that way, the per-frame delta error does not propagate.
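(Editor's note: the point about measuring from an origin can be illustrated without any Windows API. Summing per-frame float deltas compounds rounding error, while dividing the total tick count once does not. The frequency below is just a typical QPC rate, chosen for illustration; the frame count and function names are hypothetical.)

```cpp
#include <cstdint>

// Simulate N fixed-length frames and keep time by summing per-frame
// float deltas -- each addition rounds, and the error compounds.
float SumOfDeltas(std::int64_t TicksPerFrame, std::int64_t Freq, int Frames)
{
	float Accumulated = 0.0f;
	for (int i = 0; i < Frames; ++i)
		Accumulated += (float)TicksPerFrame / (float)Freq;
	return Accumulated;
}

// Keep time by subtracting a fixed origin and dividing once:
// a single rounding step, so per-frame error never propagates.
float FromOrigin(std::int64_t TicksPerFrame, std::int64_t Freq, int Frames)
{
	std::int64_t TotalTicks = TicksPerFrame * (std::int64_t)Frames;
	return (float)TotalTicks / (float)Freq;
}
```

Calling both with the same inputs (e.g. a 3,579,545 Hz counter, ~16.6 ms frames, 100,000 frames) shows the accumulated version drifting away from the origin-based one, which stays close to the exact elapsed time.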

