ordered_disorder

Trouble with int64s


Good evening. I'm using __int64s to hold values from the high-frequency counter for timing purposes in my game, but I'm having trouble getting accurate results when debugging my code. I have the following bit of code:

	__int64 comp_frequency;
	__int64 start, end;

	QueryPerformanceFrequency((LARGE_INTEGER *) &comp_frequency);
	QueryPerformanceCounter((LARGE_INTEGER *) &start);

	Sleep(3000);
	QueryPerformanceCounter((LARGE_INTEGER *) &end);

	float time = static_cast<float>((end - start) / comp_frequency);
	cout << time << endl;

The output should be 3.xxx seconds, but it says 2. And when I make the program sleep for 1500 ms, the output should be 1.5 seconds but it says 1. Why isn't time getting a fractional value, and why isn't Sleep working right? I eventually want to write my own simple profiler, but I can't start on that until I figure out how to get fractional time slices out of my timing functions. Thank you to anybody who can help me out.

I'm not in Windows at the moment, but try this:

__int64 comp_frequency;
__int64 start, end;

QueryPerformanceFrequency((LARGE_INTEGER *) &comp_frequency);
QueryPerformanceCounter((LARGE_INTEGER *) &start);

Sleep(3000);
QueryPerformanceCounter((LARGE_INTEGER *) &end);

float time = static_cast<float>(end - start) / static_cast<float>(comp_frequency);
cout << time << endl;



Cocalus is right about the casting, but I can't say what's wrong with your sleep function.

Yes, Cocalus is right, though I still don't understand why his cast worked and mine didn't. Basically my timer is always about ~10 ms off, and this makes sense, because Sleep probably relies on GetTickCount or a similar timing function, which has roughly 10 ms resolution, and is therefore always going to be off by ~10 ms or so. It would be cool if some other game developers could try it out on their machines and compare results.

By the way, I'm using Visual C++ 6.0, which is probably one of the reasons :)

timeBeginPeriod
timeGetTime
timeEndPeriod

works for me as a high-resolution timer down to 1 ms on all the machines I've tried so far.
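
For what it's worth, here's a minimal sketch of that approach; the 1 ms period request, the surrounding main, and the Sleep(1500) test are my own additions, not part of the suggestion above:

#include <windows.h>
#include <mmsystem.h>   // timeBeginPeriod / timeGetTime / timeEndPeriod
#include <iostream>
#pragma comment(lib, "winmm.lib")

int main()
{
    // Ask the multimedia timer for 1 ms resolution for the duration of the measurement.
    timeBeginPeriod(1);

    DWORD start = timeGetTime();   // milliseconds since system start
    Sleep(1500);
    DWORD end = timeGetTime();

    std::cout << (end - start) << " ms elapsed" << std::endl;

    // Always pair timeEndPeriod with the matching timeBeginPeriod.
    timeEndPeriod(1);
    return 0;
}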

The reason the cast works is that in your version you're doing an integer (albeit 64-bit) division, which throws away any remainder. Only the result of that division is then cast to float.

If you cast one of the operands to float before dividing, you get the "real" fractional value.

VC6 is old, but it's not that bad. There won't be any difference from VC7 for that piece of code.
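
A small illustration of the difference (the tick and frequency numbers below are made up purely for the example):

#include <iostream>

int main()
{
    __int64 ticks = 4500;   // hypothetical elapsed ticks
    __int64 freq  = 3000;   // hypothetical counter frequency

    // Integer division happens first, the remainder is discarded, then 1 becomes 1.0f.
    float a = static_cast<float>(ticks / freq);

    // Casting an operand first forces a floating-point division: 1.5f.
    float b = static_cast<float>(ticks) / static_cast<float>(freq);

    std::cout << a << " vs " << b << std::endl;   // prints "1 vs 1.5"
    return 0;
}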

Quote:
Original post by ordered_disorder
Sleep probably relies on GetTickCount or a similar timing function, which has roughly 10 ms resolution, and is therefore always going to be off by ~10 ms or so.

That's part of it. The other part is that Sleep only determines how long your process is held in the sleep queue. Once that time has expired, Windows puts it back in the active queue, and *then* it gets to run... when it gets its turn.
That might be immediately, or it might be after 10 (or more) ms.
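
If you want to see that scheduling slack directly, a sketch like this (my own, not from anyone in the thread) measures how far Sleep overshoots using the performance counter:

#include <windows.h>
#include <iostream>

int main()
{
    LARGE_INTEGER freq, start, end;
    QueryPerformanceFrequency(&freq);

    for (int i = 0; i < 5; ++i)
    {
        QueryPerformanceCounter(&start);
        Sleep(10);                      // ask for 10 ms
        QueryPerformanceCounter(&end);

        double elapsed_ms =
            1000.0 * static_cast<double>(end.QuadPart - start.QuadPart) /
            static_cast<double>(freq.QuadPart);

        // Anything above 10 ms is scheduling latency on top of the requested sleep.
        std::cout << "requested 10 ms, got " << elapsed_ms << " ms" << std::endl;
    }
    return 0;
}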
