Queries on a synchronization model


I'm sure this has a name but I'm not entirely sure what it is:

  • Server and client(s) run the same deterministic simulation
  • Clients send input to server
  • Server aggregates input from all clients over period of time
  • Simulation at the server [and, after propagation, the clients] integrates time/input (roughly as sketched below).
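
For concreteness, here's a minimal sketch of what the server side of that loop might look like. Everything here -- the types, the tick rate, the BroadcastInputFrame call -- is a placeholder for illustration, not code from the actual project:

```cpp
#include <cstdint>
#include <map>
#include <utility>
#include <vector>

// Hypothetical types -- stand-ins for whatever the real game uses.
struct PlayerInput { uint32_t playerId; uint32_t commands; };
struct Simulation  { void Step(const std::vector<PlayerInput>& inputs) { /* deterministic update */ } };

// Input received from clients, keyed by the simulation tick it belongs to.
std::map<uint32_t, std::vector<PlayerInput>> pendingInput;

// Server loop body, called at a fixed tick rate (e.g. 25 Hz).
void ServerTick(Simulation& sim, uint32_t tick)
{
    // 1. Collect whatever input arrived for this tick (it may well be empty).
    std::vector<PlayerInput> frame = std::move(pendingInput[tick]);
    pendingInput.erase(tick);

    // 2. Advance the shared deterministic simulation by exactly one step.
    sim.Step(frame);

    // 3. Send the same input frame to every client; each client feeds it to
    //    the same Step() call to stay in lockstep.
    // BroadcastInputFrame(tick, frame);   // network send, not shown
}
```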

The reasons I've chosen this model are:

  • To reduce bandwidth requirements and thus also latency (the game involves a lot of densely packed units)
  • It's easier to extend and, frankly, more elegant than a dumb-client model (assuming determinism can be assured)
  • Enables a more thorough anti-cheating system

There are however three main issues I foresee which I'd like some input on:

Firstly, as the client-side simulation will only advance upon receiving a 'frame' of input, it will be highly sensitive to network jitter. I believe this could be solved with the usual latency compensation/hiding techniques.

Secondly, the entire model is predicated on what is essentially a perfect-information game, thus enabling unscrupulous players to gain information that would otherwise be hidden. Obviously this is not unique to this model, but it's exacerbated here.

Lastly, filtering client input with respect to its relevance to another client (e.g. proximity) would be desirable (for both bandwidth and security); however, it will clearly raise problems with synchronization. Possibly partitioning the world-space and having clients only run the simulation for those sectors within proximity would be a solution.
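
To illustrate the sector idea, here is a rough sketch of proximity-based relevance filtering on a uniform grid. The cell size, the adjacency rule, and the types are all assumptions for the sake of the example:

```cpp
#include <cmath>
#include <cstdint>
#include <cstdlib>

// Hypothetical 2D position and grid-cell coordinates.
struct Vec2    { float x, y; };
struct CellKey { int32_t cx, cy; };

constexpr float kCellSize = 64.0f;   // assumed size of one sector in world units

CellKey CellOf(const Vec2& p)
{
    return { static_cast<int32_t>(std::floor(p.x / kCellSize)),
             static_cast<int32_t>(std::floor(p.y / kCellSize)) };
}

// Relay input from 'sender' to 'receiver' only if they are in the same
// sector or an adjacent one; anything further away is filtered out, which
// also limits what the receiving client can learn about distant players.
bool IsRelevant(const Vec2& sender, const Vec2& receiver)
{
    CellKey a = CellOf(sender), b = CellOf(receiver);
    return std::abs(a.cx - b.cx) <= 1 && std::abs(a.cy - b.cy) <= 1;
}
```

Of course, once clients only simulate their own sectors, anything that crosses a sector boundary has to be handed over carefully or the simulations will diverge.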


It seems like you have most of the basics covered. You can also hide jitter by simply using a big enough jitter buffer (which introduces artificial lag as a cost, to pay for always seeing a correct simulation.)
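
For what it's worth, a jitter buffer in this context is just a fixed delay between receiving an input frame and executing it. A minimal sketch, with the delay expressed in frames (the InputFrame type is assumed):

```cpp
#include <cstddef>
#include <cstdint>
#include <deque>
#include <optional>

// Hypothetical input frame as received from the server.
struct InputFrame { uint32_t tick; /* aggregated player inputs */ };

class JitterBuffer
{
public:
    explicit JitterBuffer(std::size_t delayFrames) : delay_(delayFrames) {}

    void Push(const InputFrame& frame) { frames_.push_back(frame); }

    // Hand out a frame to simulate only once we are 'delay_' frames behind
    // the newest received data: a fixed amount of added latency buys
    // smooth playback under network jitter.
    std::optional<InputFrame> Pop()
    {
        if (frames_.size() <= delay_)
            return std::nullopt;          // not enough buffered yet; hold the simulation
        InputFrame f = frames_.front();
        frames_.pop_front();
        return f;
    }

private:
    std::size_t delay_;
    std::deque<InputFrame> frames_;
};
```

Tuning the delay trades added latency against how much jitter you can absorb without stalling.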

If the draw-backs you list are not acceptable to you, you'll have to change something about your model. For example, if you are making a poker game, a shared consistent simulation is a poor model, because the effects of cheating will kill the game.

enum Bool { True, False, FileNotFound };

You're right; a jitter buffer would be a good solution.

There's one optimization to this model that I'd like to implement, though I suspect it may be a problem. Given that a large proportion of the input frames are likely to be empty, there would be a considerable amount of wasted bandwidth (e.g. at a rate of 25 Hz, 1 KB/s of protocol overhead (IP+TCP) alone). Would it be possible to omit this redundant traffic? Assuming minimal jitter, I figure the client could set a time threshold for receiving input per frame and, if it's exceeded, assume there was no input during that frame and optimistically advance the simulation. If the assumption is wrong, however, it would require reversing the simulation, integrating the missed input, and then somehow fixing the discrepancy between the current (erroneous) state and the correct state. I haven't heard of this being done, so I'd be interested in hearing about any experiences with such a method.
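
To make that concrete, here is a rough sketch of the optimistic-advance idea, assuming the simulation exposes SaveState/LoadState calls (those names, and everything else here, are made up for illustration):

```cpp
#include <cstdint>
#include <map>
#include <vector>

struct PlayerInput { uint32_t playerId; uint32_t commands; };

// Hypothetical simulation interface with full-state snapshots.
struct Simulation
{
    void Step(const std::vector<PlayerInput>& inputs) { /* deterministic update */ }
    std::vector<uint8_t> SaveState() const { return {}; }   // ~250 KB in this game
    void LoadState(const std::vector<uint8_t>& state) {}
};

std::map<uint32_t, std::vector<uint8_t>> snapshots;   // tick -> state *before* that tick

// Called when the per-frame timeout expires with nothing received for 'tick':
// assume the frame was empty and advance anyway.
void AdvanceOptimistically(Simulation& sim, uint32_t tick)
{
    snapshots[tick] = sim.SaveState();
    sim.Step({});                          // guessed-empty input frame
}

// Called if input for an already-simulated tick arrives late. 'currentTick' is
// the next tick we were about to simulate. Rewind to the snapshot taken before
// the late tick, apply the real input, then re-run the frames we had guessed.
void Rollback(Simulation& sim, uint32_t lateTick, uint32_t currentTick,
              const std::vector<PlayerInput>& lateInput)
{
    sim.LoadState(snapshots[lateTick]);
    sim.Step(lateInput);
    for (uint32_t t = lateTick + 1; t < currentTick; ++t)
        sim.Step({});   // replayed as empty here; real code would replay whatever inputs it has for t
}
```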

I should point out that the game in question is effectively a carbon copy of Diablo II; the simulation's computation requirements are minimal, so it would be quite feasible to dump the entire game state (~250KB client-side) per frame (which is something I'm considering for implementing the reversal).
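
If I do go the full-state-dump route, snapshots only need to cover however far back a rollback can reach, so a small ring buffer should be enough. A sketch using the ~250KB figure (the one-second window is just an example):

```cpp
#include <array>
#include <cstddef>
#include <cstdint>
#include <utility>
#include <vector>

// Fixed-depth history of full-state snapshots for rewinding.
// At ~250 KB per snapshot and 25 Hz, one second of history costs
// roughly 25 * 250 KB, i.e. a bit over 6 MB of memory.
constexpr std::size_t kHistoryFrames = 25;

struct SnapshotRing
{
    std::array<std::vector<uint8_t>, kHistoryFrames> states;

    void Store(uint32_t tick, std::vector<uint8_t> state)
    {
        states[tick % kHistoryFrames] = std::move(state);
    }

    const std::vector<uint8_t>& Get(uint32_t tick) const
    {
        return states[tick % kHistoryFrames];   // only valid within the last kHistoryFrames ticks
    }
};
```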

Can you omit entirely empty frames? Possibly. The question is: What do you do if there is a temporary network glitch, and a packet finally comes in saying "here's the data from frame X," when you already timed out receiving frame X and have moved on to frame X+1 (or X+N)?

In general, I don't think 1 kB/sec/player is particularly bad overhead. Compared to the cost of bulk bandwidth ($2/Mbit/month,) that's about $0.02 per simultaneous-online-player (CCU) per month. But if you want to optimize that, you can easily cut down on the packet rate. Send the input for four steps, eight times a second. The end result will be that you have higher apparent latency, as there is more buffering going on. Or send frequent packets when there is data, but bunch up "empty" packets until you reach some limit -- 500 ms or whatever. The drawback there is that the game will "stutter" between periods of activity and non-activity.
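
The second option (bunching up empty frames) might look something like this on the sending side; the constants and the SendPacket call are placeholders, not a recommendation:

```cpp
#include <cstdint>
#include <vector>

struct PlayerInput { uint32_t playerId; uint32_t commands; };

// Sender-side coalescing: send immediately when a frame actually has input;
// otherwise just count empty frames and flush them in a single packet once
// a cap is reached (500 ms at 25 Hz is about 12 frames).
constexpr uint32_t kMaxCoalescedEmptyFrames = 12;

uint32_t emptyFramesPending = 0;

void OnFrameReady(uint32_t tick, const std::vector<PlayerInput>& frame)
{
    if (!frame.empty())
    {
        // Real input: flush now, telling the receiver how many empty frames
        // preceded it so it can advance its simulation accordingly.
        // SendPacket(tick, emptyFramesPending, frame);
        emptyFramesPending = 0;
    }
    else if (++emptyFramesPending >= kMaxCoalescedEmptyFrames)
    {
        // Nothing but empties for a while: send a single "N empty frames" marker.
        // SendPacket(tick, emptyFramesPending, frame);
        emptyFramesPending = 0;
    }
}
```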

For Diablo-style games, I don't think 1/8 second of latency would really be a problem, though -- 25 Hz seems way high for that style of click-to-walk gameplay.

enum Bool { True, False, FileNotFound };
