Trouble with int64s

Started by
6 comments, last by Spoonbender 18 years, 7 months ago
Good evening! I'm using __int64s to hold values from the high-frequency counter for timing purposes in my game, but I'm having trouble getting accurate results when debugging my code. I have the following tidbit of code:

	__int64 comp_frequency;
	__int64 start, end;

	QueryPerformanceFrequency((LARGE_INTEGER *) &comp_frequency);
	QueryPerformanceCounter((LARGE_INTEGER *) &start);

	Sleep(3000);
	QueryPerformanceCounter((LARGE_INTEGER *) &end);

	float time = static_cast<float>((end - start) / comp_frequency);
	cout << time << endl;
The output should be 3.xxx seconds, but it says 2. And when I make the program sleep for 1500 ms, the output should be 1.5 s, but it says 1. Why isn't time being assigned a fractional value, and why isn't Sleep working right? I eventually want to work on my own simple profiler, but I can't begin that until I figure out how my timing functions can return time slices in fractions of a second. Thanks to anybody who can help me out.
I'm not in Windows at the moment, but try this:

	__int64 comp_frequency;
	__int64 start, end;

	QueryPerformanceFrequency((LARGE_INTEGER *) &comp_frequency);
	QueryPerformanceCounter((LARGE_INTEGER *) &start);

	Sleep(3000);
	QueryPerformanceCounter((LARGE_INTEGER *) &end);

	float time = static_cast<float>(end - start) / static_cast<float>(comp_frequency);
	cout << time << endl;
Cocalus is right about the casting, but I can't say what's wrong with your sleep function.
Yes, Cocalus is right, though I still don't understand why his cast worked over mine. Basically my timer is always about ~10 ms off, and this makes sense, because Sleep probably relies on GetTickCount or a similar timer function, which has a 10 ms resolution and is therefore always going to be off by ~10 ms or so. It would be cool if we could get some other game developers to try it out on their machines and compare results.

By the way, I'm using Visual C++ 6.0, which is probably one of the reasons. :)
timeBeginPeriod
timeGetTime
timeEndPeriod

works for me as a high resolution timer down to 1ms on all the machines I've tried so far.
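For reference, a minimal sketch of that pattern (Windows-only, link against winmm.lib; the 1 ms period is an assumption here, and the value your machine actually supports should be checked with timeGetDevCaps):

```cpp
// Windows-only sketch: raise the multimedia timer resolution to ~1 ms,
// take two timeGetTime() readings around the work, then restore it.
#include <windows.h>
#include <mmsystem.h>   // timeBeginPeriod, timeGetTime, timeEndPeriod
#include <iostream>

int main()
{
    // Request 1 ms timer resolution for the duration of the measurement.
    timeBeginPeriod(1);

    DWORD start = timeGetTime();   // milliseconds since system start
    Sleep(500);                    // the work being timed
    DWORD end = timeGetTime();

    // Every timeBeginPeriod call must be paired with a timeEndPeriod.
    timeEndPeriod(1);

    std::cout << (end - start) << " ms elapsed" << std::endl;
    return 0;
}
```

Note that the raised resolution is system-wide while it is active, which is why pairing timeBeginPeriod with timeEndPeriod matters.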
The reason the cast works is that in your original code you're doing an integer (albeit 64-bit) division, which throws away any remainder; only the result of the division is cast to float.

If you cast one of the operands to float before the division, you'll get the "real" fractional value.

VC6 is old, but it's not that bad. There won't be any difference from VC7 for that piece of code.


Sweet, thanks for clarifying, Endurion.
Quote: Original post by ordered_disorder
Sleep probably relies on GetTickCount, or a GetTime function, which has a 10 ms resolution, and therefore is always going to be off ~10 ms or so.

That's part of it. Another is that Sleep only determines how long your process is held in a sleep queue. Once that time has expired, Windows puts it back in the ready queue, and *then* it gets activated... when it gets its turn.
That might be instantly, or it might be after 10 (or more) ms.

