[quote name='ApochPiQ' timestamp='1314265316' post='4853573']
Especially when dealing with PCs, it is not possible to guarantee that one machine will elapse time at the same fixed rate as another. Even on a single machine you won't get perfect n-Hz cycles; there will be variances between frames/ticks. The purpose of using a time value instead of a tick count is to help absorb these variances.
I was wondering something... Assuming we are talking about time variance, not hardware saturation, how does the time variance matter exactly? Can't you simply compute the time of a given frame as TickCount * FixedTimestep (i.e. TickCount / TickRate) and ignore the variance, which gets reset every frame? It's not as if the variance would stack up and make the game drift.
[/quote]
I was thinking the exact same thing in this instance, since the _only_ reason we sync time is to make sure we don't perform tick X+2 before tick X+1, and the variance gets cancelled out every frame anyway. But maybe there is more at play here.
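Here is a minimal sketch (not from the thread) of what I understand the idea to be: a fixed-timestep loop where the simulation time passed to each tick is reconstructed purely from the tick count, so jitter in when a tick actually fires never accumulates into drift. The 60 Hz timestep and the update/render calls are just placeholders for illustration.

[code]
#include <chrono>
#include <cstdint>

int main() {
    using clock = std::chrono::steady_clock;
    constexpr double kTickSeconds = 1.0 / 60.0;   // assumed fixed timestep (60 Hz)

    std::uint64_t tickCount = 0;
    double accumulator = 0.0;                     // wall-clock time not yet simulated
    auto previous = clock::now();

    bool running = true;
    while (running) {
        auto now = clock::now();
        accumulator += std::chrono::duration<double>(now - previous).count();
        previous = now;

        // Run as many fixed ticks as the elapsed wall-clock time allows.
        while (accumulator >= kTickSeconds) {
            // Simulation time comes from the tick count alone, i.e. the
            // "TickCount * timestep" formula discussed above.
            double simTime = tickCount * kTickSeconds;
            // update(simTime, kTickSeconds);     // hypothetical game update
            (void)simTime;
            ++tickCount;
            accumulator -= kTickSeconds;
        }
        // render(accumulator / kTickSeconds);    // hypothetical interpolation factor
        running = (tickCount < 600);              // stop after ~10 s for the sketch
    }
}
[/code]

In that setup the per-frame variance only affects *when* a tick runs and how the leftover accumulator is used for interpolation; the simulated time itself never drifts, which is exactly the point being questioned.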