QueryPerformanceFailure()


I tried to switch over to the QueryPerformance set of functions from timeGetTime() and it won't work. I've narrowed the problem down to dividing RenderTime by timer_freq. (Please excuse the inconsistent variable notation.)

// global variables
__int64 timer_freq = 0, RenderTime = 0, current_time = 0, start_time = 0;

// at the beginning of the program
if (!QueryPerformanceFrequency((LARGE_INTEGER*)&timer_freq))
    MessageBox(0, "This sucks.", 0, 0);

// at the beginning of Render()
QueryPerformanceCounter((LARGE_INTEGER*)&current_time);
start_time = current_time;

// at the end of Render()
QueryPerformanceCounter((LARGE_INTEGER*)&current_time);
RenderTime = current_time - start_time;
RenderTime = RenderTime / timer_freq;

I believe the problem occurs at the last line, where I divide RenderTime by timer_freq. If I leave this line out, the program works, but really fast. If I put it in, the program doesn't work at all. RenderTime is used like this later, though I don't think it matters:

RotateYBy += RenderTime * (2.0f * D3DX_PI) / 2500.0f;

Unless your time between frames is 1 second or longer, RenderTime/timer_freq will be 0 because you are storing it in an integer. You should store the result of the division as a floating point.
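A minimal sketch of the truncation (the tick values here are made-up examples, not from the original code):

__int64 timer_freq = 3579545;  // example: a 3.58 MHz performance counter
__int64 ticks      = 59659;    // roughly one 60 Hz frame worth of ticks

// 59659 / 3579545 truncates to 0 in integer math, because the elapsed
// time is well under one second.
__int64 seconds = ticks / timer_freq;  // seconds == 0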

Quote:
Original post by KrazeIke
Unless your time between frames is 1 second or longer, RenderTime/timer_freq will be 0 because you are storing it in an integer. You should store the result of the division as a floating point.


And make sure you cast at least one of RenderTime or timer_freq to a double/float before dividing [smile]
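For example (same made-up tick values as above), note that just storing the result in a double is not enough:

double wrong = RenderTime / timer_freq;                  // integer division still
                                                         // happens first: wrong == 0.0
double right = (double)RenderTime / (double)timer_freq;  // ~0.0167 seconds

Casting either operand is enough; the other is then promoted to double automatically.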

Quote:
Original post by Providence
I thought timer_freq would be a smaller number than RenderTime, and that the result of this operation would be x number of milliseconds.


timer_freq is how many ticks in one second, so it should be larger than RenderTime.
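To get milliseconds rather than seconds, multiply by 1000 alongside the floating-point division (a sketch using the variable names from the original post):

// RenderTime here holds the raw tick delta (current_time - start_time)
double elapsed_ms = (double)RenderTime * 1000.0 / (double)timer_freq;

For instance, with a 3,579,545 Hz counter, a delta of 59,659 ticks gives about 16.7 ms.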

PS

On some hardware (VIA chipsets; I can't remember the others), QueryPerformanceCounter may occasionally return a timestamp that is WAAAAAY off from the surrounding readings:


10000
10013
10030
10039
2000340343
10060


like that.
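One common workaround (a sketch of the general idea, not something from this thread; last_good_delta is a hypothetical variable saved from the previous frame) is to reject deltas that are implausibly large:

__int64 delta = current_time - start_time;
if (delta < 0 || delta > timer_freq)  // negative or over 1 second: likely a glitch
    delta = last_good_delta;          // reuse the previous frame's delta instead
else
    last_good_delta = delta;
double elapsed = (double)delta / (double)timer_freq;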

Quote:
Original post by Dave Hunt
timer_freq is how many ticks in one second, so it should be larger than RenderTime.


Thank you, that makes everything clear. This is what I get for assuming and not reading my docs carefully :(

