Members - Reputation: 372
Posted 28 June 2012 - 10:27 AM
So, I'm working on multiplayer (just another day!) and I've got a system that I believe works quite well. However, I understand that when I implement this in a real game, I can't expect everyone to play at 60 frames per second.
Thus, I am a little stuck with how to proceed:
I can no longer get away with using game ticks as a base measure of elapsed time, because if the tick rate changes, so does the duration of time that a "tick" represents. Therefore I cannot treat a server tick as a client tick.
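One common way around this (the classic fixed-timestep approach) is to decouple ticks from frames entirely: a tick is always a fixed duration, and each frame runs however many ticks the elapsed wall-clock time allows. A minimal sketch, with illustrative names of my own:

```python
# Fixed-timestep accumulator loop: a simulation tick always represents
# TICK_DT seconds, regardless of how fast or slow frames render.
# All names here are illustrative, not from any particular engine.

TICK_DT = 1.0 / 60.0  # one tick = 1/60 s by convention, not by frame rate

def run(frame_times):
    """Advance the simulation given a list of per-frame elapsed times."""
    accumulator = 0.0
    tick = 0
    for frame_dt in frame_times:
        accumulator += frame_dt
        # Run as many whole fixed ticks as the accumulated time allows.
        while accumulator >= TICK_DT:
            tick += 1  # simulate_one_tick() would go here
            accumulator -= TICK_DT
    return tick

# A 30 fps client and a 120 fps client cover the same wall-clock second
# with different frame counts, but both end on the same tick number.
slow = run([1 / 30.0] * 30)    # 30 frames of ~33 ms
fast = run([1 / 120.0] * 120)  # 120 frames of ~8 ms
```

With this structure, server and client can agree on what "tick X" means even if their render rates differ, because the tick duration never changes.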
However, if I were to use a purely time-based approach, it would be unlikely that I could compare events processed on ever so slightly differing timings. I can compare gamestates at tick X, but not at time X, because the two machines' time X values are very unlikely to be identical, so I'd have to interpolate or something.
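The interpolation mentioned above is usually just a linear blend between the two ticks that bracket the query time. A sketch, assuming per-tick states are stored in a list and using names of my own:

```python
# Sketch: to evaluate state at an arbitrary time t, lerp between the two
# simulated ticks that bracket it. Names and storage are illustrative.

TICK_DT = 1.0 / 60.0  # assumed fixed tick duration

def state_at_time(t, states):
    """states[i] is a scalar gamestate value at tick i; t is in seconds."""
    exact = t / TICK_DT
    lo = int(exact)                       # tick at or just before t
    hi = min(lo + 1, len(states) - 1)     # next tick (clamped at the end)
    alpha = exact - lo                    # fraction of the way to tick hi
    return states[lo] + (states[hi] - states[lo]) * alpha

# Positions recorded once per tick; query halfway between ticks 2 and 3.
positions = [0.0, 1.0, 2.0, 3.0, 4.0]
p = state_at_time(2.5 * TICK_DT, positions)
```

This only works cleanly for continuous quantities like positions; discrete events still have to be pinned to a specific tick rather than interpolated.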
How do commercial games circumvent such issues?
Members - Reputation: 3708
Posted 28 June 2012 - 11:04 AM
Edited by SimonForsman, 28 June 2012 - 11:07 AM.