Hello again.
The timing of my game events (mostly interpolation/extrapolation and movement tracking) relies on a steady tick rate/heartbeat. I am struggling to find an approach whose error doesn't keep growing the longer the program runs.
At the moment, I am using the high-resolution performance counter through the Windows API functions QueryPerformanceCounter and QueryPerformanceFrequency (which may or may not be backed by the HPET, depending on the system). I have also tried a similar method using clock().
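For reference, here is roughly how I'm reading the time in both cases (a minimal, stripped-down sketch; Sleep(500) just stands in for real work):

#include <windows.h>
#include <ctime>
#include <cstdio>

int main() {
    LARGE_INTEGER freq, qpcStart, qpcNow;
    QueryPerformanceFrequency(&freq);   // counts per second, fixed at boot
    QueryPerformanceCounter(&qpcStart);
    clock_t clkStart = clock();         // wall-clock time on MSVC

    Sleep(500);  // stand-in for some work

    QueryPerformanceCounter(&qpcNow);
    double qpcSeconds = double(qpcNow.QuadPart - qpcStart.QuadPart) / double(freq.QuadPart);
    double clkSeconds = double(clock() - clkStart) / CLOCKS_PER_SEC;
    printf("QPC: %f s, clock(): %f s\n", qpcSeconds, clkSeconds);
}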
I have added code that prints how many seconds (nearly exactly) the application has been running; I then compare this to the number of ticks passed multiplied by the duration of one tick. I understand that rounding issues will exist, but something odd seems to be happening.
Here is a simplified, compilable C++ version of my server's tick timing:
#include <windows.h>
#include <cstdio>

int main() {
    const LONGLONG tickrate = 30;
    bool running = true;

    LARGE_INTEGER freq, counter;
    QueryPerformanceFrequency(&freq);   // theoretical resolution of 200000 counts/sec
    QueryPerformanceCounter(&counter);  // counter value at app start
    const LONGLONG start = counter.QuadPart;

    // counts per tick: 200000/30 = 6666.67, rounded to 6667
    const LONGLONG countsPerTick = (freq.QuadPart + tickrate / 2) / tickrate;
    LONGLONG nextTickDue = start + countsPerTick;  // when the next tick should start
    LONGLONG tick = 0;
    int minutesPassed = 0;

    while (running) {
        Sleep(1);  // do something to take up time, or don't - doesn't change the outcome either way

        QueryPerformanceCounter(&counter);
        LONGLONG gap = counter.QuadPart - nextTickDue;
        if (gap >= 0) {  // a new tick is due
            ++tick;
            nextTickDue = counter.QuadPart - gap + countsPerTick;
            if (tick % (tickrate * 60) == 0) {  // a minute's worth of ticks has passed
                ++minutesPassed;
                // exact seconds passed, measured by the counter itself
                double seconds = double(counter.QuadPart - start - gap) / double(freq.QuadPart);
                printf("%d minute(s) of ticks, %f seconds passed\n", minutesPassed, seconds);
            }
        }
    }
}
Every time a minute's worth of ticks has passed, the printed seconds value drifts upwards by 0.09, which shows that the tick count doesn't accurately represent the time passed: each tick is in fact taking slightly more time than it should.
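To put a number on the rounding contribution alone (a back-of-the-envelope check using the theoretical figures from above; the real counter frequency may differ):

#include <cstdio>

int main() {
    // Drift caused purely by the rounded counts-per-tick, using the
    // theoretical figures from above (the real frequency may differ).
    const double freq = 200000.0;         // assumed counts per second
    const double countsPerTick = 6667.0;  // 200000/30 = 6666.67, rounded up
    const double ticksPerMinute = 30.0 * 60.0;
    printf("%f s scheduled per minute of ticks\n", ticksPerMinute * countsPerTick / freq);  // 60.003
    // The per-tick error is a fixed amount, so the total error grows
    // linearly with the tick count - it never averages out.
}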
Here is another strange thing that happened. I originally wrote a class to calculate ticks on the client app using clock(), and used the performance counter on the server. Tick signals reaching the client would be behind the ticks the client was on (when it should have been the opposite). In an attempt to debug the server, I started using that class on the server too. The inaccuracy is the same as my printed information reports, but the synchronization with the client started behaving as I expected in the first place (the client is always about one or two ticks behind the server). That seems quite paradoxical to me, but I won't worry about it just yet.
Here is a screenshot (using clock() on the server, but the command window showed the same results as I got using the performance counter):
Green underlined text is printed based on the number of ticks passed; in the same conditional statement it prints the calculated number of seconds passed. As you can see, this drifts upwards.
Essentially, I need to fix my code so that the inaccuracy doesn't keep growing as more time passes.
Have I done something horribly wrong in my logic, or do I need to measure the inaccuracy at runtime and compensate for it? I understand that rounding errors will occur, but I can't think of an approach that doesn't suffer from them.
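The only alternative I can think of is to derive each tick's deadline directly from the starting counter value instead of adding the rounded counts-per-tick to the previous deadline, so the rounding error stays below one counter unit instead of accumulating. Something like this (an untested sketch; the function name is mine):

#include <windows.h>

// Deadline (in counter units) for tick n, computed fresh from the start
// value each time, so rounding never accumulates across ticks.
LONGLONG deadlineForTick(LONGLONG startCount, LONGLONG counterFreq, LONGLONG n, LONGLONG tickrate) {
    return startCount + (n * counterFreq) / tickrate;
}

Would that be the right way to go about it, or is there a more standard approach?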