Mogwaii

Time Synchronization between Client and Server Method


Recommended Posts

I plan to implement a variation of the time synchronization technique explained in this paper for a game I am working on, and I understand it for the most part, but I have a few questions before I begin.

In general for games, does one need to continually synchronize clocks (between server and client) throughout the life of the application, or only at the beginning of the connection?

I know the answer to this question may be dependent on the project in question.

So to simplify, maybe someone could tell me whether the clocks of different machines run at the same rate. For example, is 1 second on machine A identical to 1 second on machine B? I have searched for the answer to this but have not been able to find one. I know that computer clocks are driven by the frequency of a crystal oscillator, but how accurate is this in comparison to other machines?

Any and all help is greatly appreciated. Thanks in advance.

Looking at the article, it doesn't say why you would want to use time sync. If you are using dead reckoning to cover network latency, it doesn't matter whether the clocks are in sync or not: you just extrapolate from the previous position until your next position packet arrives, then correct.

Edit, to answer your question: PC clocks are pretty bad. They can drift by a minute a day. Variations in temperature cause the crystal to vibrate at different speeds, so drift varies quite a bit with PC temperature.

Thanks for the answer, definitely good to know about the CPU.

I do plan to extrapolate the positions of objects to cover lag, but I'm also planning to implement a system similar to the Half-Life lag-compensation methods explained here, in which the client's view of the world is actually lagged by 50-100 ms (so that object positions can more often just be interpolated between two known positions). For this reason, I think it's very necessary to synchronize clocks (unless I'm missing something).
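For what it's worth, here is a minimal sketch (Python, with names and values I invented, not anything from the Half-Life paper) of that buffered interpolation: the client renders at server time minus a fixed delay, so it usually falls between two known snapshots.

```python
INTERP_DELAY = 0.1   # render 100 ms in the past, as described above

def interpolated_position(snapshots, server_now):
    """snapshots: list of (server_time, position) pairs, oldest first.
    Linearly interpolate the position at server_now - INTERP_DELAY."""
    render_time = server_now - INTERP_DELAY
    for (t0, p0), (t1, p1) in zip(snapshots, snapshots[1:]):
        if t0 <= render_time <= t1:
            a = (render_time - t0) / (t1 - t0)
            return p0 + a * (p1 - p0)
    return snapshots[-1][1]   # no bracketing pair: fall back to newest
```

Note that this only works if `server_now` and the snapshot timestamps are on the same clock, which is exactly why the sync question matters.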

In that case, I assume it's pretty necessary to continually synchronize clocks throughout the life of the connection. If so, what method is generally used to match the client to the server once it deviates beyond a certain threshold? Does the client's time instantly get switched to match, or does it slowly converge to minimize the effect?
Tell me I'm not way out to lunch on this one.

Edit: it seems like one would want to block other traffic while making the measurements to synchronize clocks. Does this become a problem in games?

[Edited by - Mogwaii on July 11, 2010 7:44:50 PM]

From the Latency compensation page linked by the one you linked to:
Quote:
The msec field corresponds to the number of milliseconds of simulation that the command corresponds to (it's the frametime).

That is to say, you need relative time, not absolute time, and the delta can easily be tracked from when the player joined the game. There is no need to sync the actual clocks on the various computers.
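A tiny sketch of what tracking the delta from join time looks like, assuming a monotonic local clock on each machine (names are mine):

```python
import time

join_time = time.monotonic()   # recorded once, when the player joins

def relative_time():
    """Seconds of game time since this player joined. Each machine
    tracks its own elapsed time, so no wall-clock sync is needed."""
    return time.monotonic() - join_time
```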

This is a quote from that same page, from footnote 6.

Quote:

It is assumed in this paper that the client clock is directly synchronized to the server clock modulo the latency of the connection. In other words, the server sends the client, in each update, the value of the server's clock and the client adopts that value as its clock. Thus, the server and client clocks will always be matched, with the client running the same timing somewhat in the past (the amount in the past is equal to the client's current latency). Smoothing out discrepancies in the client clock can be solved in various ways.


So while I agree that you do not need to send the timestamp of the command (because the server can simply derive that from the latency, or half the ping), when they refer to "the number of milliseconds of simulation that the command corresponds to (it's the frametime)", they simply mean the duration in ms of the command (I think). But it is still important for the client to be synced to the server, again unless I'm missing some important detail.

If they need to be synced, I'm still wondering what methods to use to continually sync the client to the server throughout the life of the connection, or how to "smooth out discrepancies in the client clock".

Quote:
But it is still important for the client to be synced to the server, again unless I'm missing some important detail.

How about this detail: if they needed to sync clocks, why don't they talk about the method they used, or even the need to do so, since they tell you exactly how they calculate other times?

I don't know; I thought that the last quote from footnote 6 pretty much stated that the clocks need to be synchronized. I guess they don't talk about it because how to sync clocks isn't the focus of the paper (my link seems to have died, so here is another link to the same paper). If the clocks don't need to be synchronized, then I'm very confused about why they would state that
Quote:
the client clock is directly synchronized to the server clock


In either case, I would still like to know how to keep client time fixed to server time as the connection goes on. More specifically, if the client's time has drifted ahead of the server by some amount, how should the client be rewound or brought back to server time? Likewise, if the client's time drifts behind the server, should the server send a world snapshot to the client, or should the client just simulate faster for a while?

I can't seem to find any documentation out there on this subject. Thanks in advance.

What I've used in the past, and it has worked OK:

1) When connecting, the server sends a baseline value, and the client uses this value.
2) Periodically (with each heartbeat, say, or with each packet), the server sends a new timestamp.
3) The client compares this timestamp with the calculated timestamp. If it is within some allowable jitter range (say, +/- 20 milliseconds) then it's assumed to still be in sync.
4) If two successive server timestamps appear to be out of sync, then adjust the client clock offset, and make a slight adjustment to the client clock rate (skew).
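The four steps above might look something like this in Python (a sketch with invented names; the 20 ms jitter window and the two-successive-samples rule come straight from the list, while the exact adjustment rule in step 4 is my own guess):

```python
JITTER_MS = 20.0     # allowable jitter range from step 3
SKEW_STEP = 1e-5     # small skew nudge, applied in step 4

class ClockSync:
    def __init__(self, baseline_offset_ms):
        self.offset = baseline_offset_ms   # step 1: baseline from the server
        self.skew = 1.0
        self.misses = 0                    # successive out-of-range samples

    def estimate(self, client_ms):
        # serverTime = (clientTime - offset) * skew
        return (client_ms - self.offset) * self.skew

    def on_server_timestamp(self, server_ms, client_ms):
        # Steps 2-3: compare each periodic server timestamp to our estimate.
        error = server_ms - self.estimate(client_ms)
        if abs(error) <= JITTER_MS:
            self.misses = 0                # still in sync
            return
        self.misses += 1
        if self.misses >= 2:               # step 4: two bad samples in a row
            self.offset -= error / self.skew   # close most of the gap now
            self.skew *= (1.0 + SKEW_STEP) if error > 0 else (1.0 - SKEW_STEP)
            self.misses = 0
```

Here the offset absorbs the measured error immediately and the skew takes up residual drift; a gentler variant would leave the offset alone and let the skew converge over time.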

If you need to adjust the clock backwards, you may want to make that adjustment using a large amount of skew, rather than a time jump backwards, to avoid double execution of timed events.

To apply skew, you calculate server time as:

serverTime = (clientTime - offset) * skew

The trick is that you have to adjust the "offset" at the same time you adjust the "skew" to avoid jumping the time estimate at the time of adjustment.
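As a concrete sketch of that trick (the function name is mine): when changing the skew, recompute the offset so the formula above gives an identical estimate at the moment of the switch.

```python
def adjust_skew(offset, skew, new_skew, client_time):
    """Return the new offset to use alongside new_skew so that
    (clientTime - offset) * skew is unchanged at client_time."""
    current_estimate = (client_time - offset) * skew
    return client_time - current_estimate / new_skew

offset, skew = 5.0, 1.0
t = 105.0
before = (t - offset) * skew              # estimated server time: 100.0
offset = adjust_skew(offset, skew, 0.99999, t)
skew = 0.99999                            # clock now runs slightly slow
after = (t - offset) * skew               # still 100.0: no visible jump
```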

If you want a better algorithm (which basically takes this, and makes it more statistically robust, and uses round-trip-time estimates), then look into the NTP protocol and how it's constructed.

Thanks for the reply. Skew makes sense. I don't think I'll look into more elaborate methods; this seems like a good way to fix the time. At this point, at least for time syncing, I think the simpler the better. Thanks again.

Just make sure that you change the skew by very small increments.
For example, if you find that you have to set the clock back, multiply the skew by 0.99999 or so (which is 1.0 - 1e-5). This will shift the clock by about 36 milliseconds every hour.
Same thing, when you have to set the clock forward, multiply the skew by something like 1.00001 (1.0 + 1e-5).
And, of course, don't change the skew just because you have to make the initial adjustment after you connect :-)
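The 36 milliseconds per hour figure is easy to check: a skew change of 1e-5 scales one hour of client time by that factor.

```python
# One hour of client time, scaled by the suggested skew nudge of 1e-5:
skew_delta = 1e-5
hour_ms = 3600 * 1000
drift_per_hour_ms = hour_ms * skew_delta
print(drift_per_hour_ms)   # roughly 36.0 ms of correction per hour
```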
