Synchronizing timers

4 comments, last by hplus0603 15 years ago
Hi all,

Following my last post about extrapolation and time-based anti-lag techniques, I've made some good progress in AAsteroids and got a basic algorithm working. I'd like to thank everybody who shared tips and info with me in that thread. I have now reached the point where the basic algorithm works but is crude, and I'd like to refine it.

The problem: consider the following snippets from the game's logs.

Client
00:06:10.0766 [debug]   Turning right, Angle: -6.167618
00:06:10.0766 [debug]   Angle: -6.129788
00:06:10.0772 [debug]   Angle: -6.092988
00:06:10.0777 [debug]   Angle: -6.0561
00:06:10.0783 [debug]   Angle: -6.028537
00:06:10.0789 [debug]   Angle: -6.001168
00:06:10.0795 [debug]   Angle: -5.974089
00:06:10.0801 [debug]   Angle: -5.946125
00:06:10.0807 [debug]   Angle: -5.918099
00:06:10.0813 [debug]   Angle: -5.890485
00:06:10.0818 [debug]   Angle: -5.863604
00:06:10.0824 [debug]   Angle: -5.835265
00:06:10.0830 [debug]   Angle: -5.808373
00:06:10.0836 [debug]   Stopped turning

Server
00:06:34.0170 [debug]   Angle after rewind: -6.167618
00:06:34.0170 [debug]   Turning right, Angle: -6.167618
00:06:34.0170 [debug]   Fastforwarding 27 ticks
00:06:34.0170 [debug]   Angle after fastforward: -5.173763
00:06:34.0170 [debug]   Angle: -5.136973
00:06:34.0176 [debug]   Angle: -5.100628
00:06:34.0182 [debug]   Angle: -5.06374
00:06:34.0188 [debug]   Angle: -5.026403
00:06:34.0193 [debug]   Angle: -4.990155
00:06:34.0199 [debug]   Angle: -4.952763
00:06:34.0205 [debug]   Angle: -4.915812
00:06:34.0211 [debug]   Angle: -4.879279
00:06:34.0217 [debug]   Angle: -4.842765
00:06:34.0223 [debug]   Angle: -4.805939
00:06:34.0229 [debug]   Angle: -4.768663
00:06:34.0235 [debug]   Angle: -4.730613
00:06:34.0241 [debug]   Angle: -4.694594
00:06:34.0246 [debug]   Angle: -4.657986
00:06:34.0252 [debug]   Angle: -4.621612
00:06:34.0258 [debug]   Angle: -4.585176
00:06:34.0264 [debug]   Rewinding 27 ticks
00:06:34.0264 [debug]   Angle after rewind: -5.541834
00:06:34.0264 [debug]   Stopped turning
00:06:34.0264 [debug]   Fastforwarding 27 ticks
00:06:34.0264 [debug]   Angle after fastforward: -5.541834

To explain a bit about my logs:

* Time is given in hours:minutes:seconds.milliseconds.
* Each line that details the angle is produced by the tick function.
* The lines about turning, rewinding and fastforwarding are produced by the function that processes user events.
* On the server the sequence of events is: rewind, apply turn, rerun ticks. These correspond to lines 1, 2 and 3-4 respectively in the server log. The client only performs the "apply turn" logic, which corresponds to line 1 in its log.
* The number of ticks to go back is calculated as [connection roundtrip time] / 2 / 5ms (half the round trip divided by the tick time).
* My tick time is 5ms, but of course it's not accurate.
* I am using Lidgren.Network with a simulated roundtrip of 200ms-300ms.
* This is not directly relevant, but I'm also applying relaxation techniques on top of the time-based lag compensation.

With these details you can clearly see from the logs what happens:

1) The client performs a turn that lasts 64ms and results in a 20.6-degree clockwise turn.
2) The server receives the turn event, applies the difference in the spin variable, then fastforwards 27 ticks. It actually starts with an angle that is beyond what the client reached altogether (which makes sense, since the client turned for 64ms, which is less than half the latency).
3) The server receives the turn-stop event and again rewinds, processes, and fastforwards.
4) The resulting turn on the server lasted 88ms and produced a 35.5-degree turn.

The reasons for the crudeness are:

1) I am not calculating which tick in the history I should rewind to based on each tick's actual time, but on the crude assumption that each entry in the history buffer equals 5ms. This is something I can easily fix, however:
2) The amount of time to compensate for is calculated crudely from the current latency. This is the part I'm looking for some help with.

What I had in mind is to add timestamps to user input events sent over the network. That way the server can calculate much more accurately exactly which tick to go back to (rough sketch in the P.S. below). However, to accomplish this I need an accurate method of finding the time difference between the server and the client during the connection phase. From there on everything else gets much easier. Ideally I would like to reach an accuracy of 20ms - 50ms.

I have come across this: http://www.mine-control.com/zack/timesync/timesync.html which I think is a good starting point. However, it says that the accuracy level is 100ms.

My questions are:

1) Am I even headed in the right direction? Is it sane to expect to reach a time-sync accuracy of 50ms or less?
2) Does anyone know of resources that could be of use, be it working code examples, libraries or articles?
3) In folks' opinion/experience, what is the best time-sync-over-lag technique?

(This is one brain cruncher! I'm really enjoying it.. :-))

Thanks,
A
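
P.S. To make the timestamp idea concrete, here is a rough C# sketch of what I have in mind. None of this is my actual code; the type names, the clock-offset field and the 64-tick history size are made up for illustration, and the offset is assumed to come from whatever time-sync handshake happens at connection time.

using System;

// Hypothetical input event carrying the client's clock reading.
public struct InputEvent
{
    public byte Kind;          // e.g. TurnRight, StopTurning
    public double ClientTime;  // client clock (seconds) when the key was pressed
}

public class LagCompensator
{
    public const double TickLength = 0.005;   // nominal 5ms tick
    public const int MaxHistoryTicks = 64;    // made-up history buffer length

    // Estimated (server clock - client clock), obtained during the connection phase.
    public double ClockOffset;

    public int CurrentTick;                   // the server's latest simulated tick
    public double CurrentTickTime;            // server time of that tick, in seconds

    // Convert an event's client timestamp into the server tick it should be
    // applied at, instead of assuming "RTT/2 worth of 5ms history entries".
    public int TickForEvent(InputEvent ev)
    {
        double serverTime = ev.ClientTime + ClockOffset;
        double secondsAgo = CurrentTickTime - serverTime;

        // Clamp so a bad timestamp can't rewind past the history buffer
        // or land in the future.
        int ticksAgo = (int)Math.Round(secondsAgo / TickLength);
        ticksAgo = Math.Max(0, Math.Min(ticksAgo, MaxHistoryTicks));
        return CurrentTick - ticksAgo;
    }
}

The point is just that each input event carries the client's clock reading and the server converts it into one of its own history ticks, instead of guessing from the current round-trip time.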
I've found this as well: http://www.gamedev.net/reference/programming/features/clocksync/page2.asp

However, it basically details the same algorithm: averaging half the latency over several samples (roughly sketched at the end of this post).

I think the part I don't understand is this: I was under the impression that the client-to-server latency can be very different from the server-to-client latency, which means that using ([recv time] - [sent time]) / 2 can be very inaccurate. So I guess what I'm looking for is to know whether this is a fair assumption, or whether there is a more accurate technique than that?
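
For reference, the technique both pages describe boils down to roughly the following (my own paraphrase in C#, untested, names made up): collect a batch of ping samples, sort them by round-trip time, take the median, discard the high-latency outliers and average the offsets of whatever is left.

using System;
using System.Collections.Generic;
using System.Linq;

public struct TimeSyncSample
{
    public double Rtt;     // measured round trip, in seconds
    public double Offset;  // serverTime + Rtt / 2 - clientTimeAtReceive
}

public static class TimeSync
{
    // Returns the estimated (server clock - client clock) offset.
    public static double EstimateOffset(IList<TimeSyncSample> samples)
    {
        // Sort by round-trip time; the median sample is the most trustworthy.
        var sorted = samples.OrderBy(s => s.Rtt).ToList();
        double medianRtt = sorted[sorted.Count / 2].Rtt;

        // Discard samples whose round trip is more than one standard deviation
        // above the median; those likely hit a resend or a latency spike.
        double meanRtt = sorted.Average(s => s.Rtt);
        double stdDev = Math.Sqrt(sorted.Average(s => (s.Rtt - meanRtt) * (s.Rtt - meanRtt)));
        var kept = sorted.Where(s => s.Rtt <= medianRtt + stdDev).ToList();

        // Average the remaining offsets; at least the lowest-latency sample survives.
        return kept.Average(s => s.Offset);
    }
}

The offset in each sample is serverTime + RTT/2 - clientTimeAtReceive, which is exactly where the symmetric-latency assumption sneaks in, hence my question.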
It doesn't matter if your clock is off within the window that you get when the transmission time is asymmetric. The clock is used to make sure that events happen in the right sequence. As long as that is true (i.e., the clock is monotonically increasing, and the server gets event commands for time T before T is executed), it doesn't matter how exact your clock is (except that jitter can lead to a jittery experience).

Btw: most timing asymmetry comes from satellite networks, where the downstream is fast but has high latency, and the upstream is dial-up with lower latency but very little bandwidth. Typically, those connections don't work well for gaming anyway, for reasons other than timing asymmetry.
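
Concretely, the server doesn't have to trust the client's clock at all; it only has to map whatever time the client claims into a bounded, non-decreasing tick. Something like this sketch (made-up names, not from any real codebase):

using System;

public class ClientEventClock
{
    const int MaxRewindTicks = 50;        // how far back lag compensation may reach
    int _lastAppliedTick = int.MinValue;  // last tick an event was applied at

    // currentTick is the server's own tick counter; claimedTick is whatever
    // tick was derived from the client's timestamp, however inexact the sync is.
    public int SanitizeTick(int claimedTick, int currentTick)
    {
        // Never rewind further than the history allows, never into the future.
        int tick = Math.Max(currentTick - MaxRewindTicks, Math.Min(claimedTick, currentTick));

        // Never let a later event from this client land before an earlier one.
        tick = Math.Max(tick, _lastAppliedTick);
        _lastAppliedTick = tick;
        return tick;
    }
}

As long as that holds, an offset error of a few tens of milliseconds just shifts where the rewind lands; it doesn't break anything.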
enum Bool { True, False, FileNotFound };
Quote: Original post by hplus0603
It doesn't matter if your clock is off within the window that you get when the transmission time is asymmetric. The clock is used to make sure that events happen in the right sequence. As long as that is true (i.e., the clock is monotonically increasing, and the server gets event commands for time T before T is executed), it doesn't matter how exact your clock is (except that jitter can lead to a jittery experience).


This is spot on. You're not going to have perfectly synchronized time, and you need to accept that and write code that takes this into account. It's one of the fundamental complexities of network programming. You won't ever have two computers completely, reliably in sync with each other.

Quote: Btw: most timing asymmetry comes from satellite networks, where the downstream is fast but has high latency, and the upstream is dial-up with lower latency but very little bandwidth. Typically, those connections don't work well for gaming anyway, for reasons other than timing asymmetry.


True, but it's still very possible to have asymmetric routing which can impact ping times. Still, halving the round-trip is generally good enough for gubmint work, especially if you've accepted that you'll never actually have totally synchronized clocks anyway.
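
To put a number on it, here's a toy calculation (hypothetical one-way latencies, plain C#): even on a fairly lopsided link, the RTT/2 estimate is only off by half the asymmetry, and that error is a constant bias rather than jitter.

using System;

// Toy numbers: 30ms up, 70ms down, and the two clocks actually agree exactly.
double up = 0.030, down = 0.070;
double clientSend = 100.000;                 // client clock when the ping leaves
double serverStamp = clientSend + up;        // server clock when it stamps the reply
double clientRecv = clientSend + up + down;  // client clock when the reply arrives
double rtt = clientRecv - clientSend;        // 0.100
double estimatedOffset = serverStamp + rtt / 2 - clientRecv;

// estimatedOffset is -0.020, i.e. (up - down) / 2: a constant 20ms bias,
// so the "synced" clock is simply shifted by 20ms and nothing else changes.
Console.WriteLine(estimatedOffset);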
Hi,

Thanks for the tips.

Just to clarify: I don't expect to have perfect time sync, I just wanted to reduce the impact of asymmetric latency to a minimum. Am I correct in understanding that you are basically telling me not to bother with asymmetric latency, since it almost never occurs in gaming?
No. I'm telling you to not bother with asymmetric latency because it doesn't *matter*.
enum Bool { True, False, FileNotFound };

This topic is closed to new replies.
