SquareDanceSteve

Smooth Multiplayer


My game (www.squaredancestudios.com) is a naval wargame. The server and the client use identical game engines, and I am using UDP. The client is a viewer for the game running on the server: it receives updates with each boat's position, direction of movement, rudder angle, and current heading, so the client can continue to predict the movement of each ship between updates from the server. The movement on the client end, even over LAN, is a tad choppy, so I added a packet numbering system that drops packets arriving out of order. Unfortunately, it appears that the client is always a little behind the server, or perhaps the client is ahead of the packets that are arriving; I'm not sure. I was wondering if anyone had any tips on how to smooth out the integration of the server's updates with the simulation running on the client.

I've been banging my head over this one a bit too, but with spaceships =). From what I can gather, the best approach is this: when you get input on the client, immediately move the ship on the client and send the move off to the server. The server then also calculates the move and sends the result back to the client. The client compares its own past movement, which it has kept in a history buffer, to what it just got from the server, and decides whether it has drifted too far from what the server is thinking. If it has, it rewinds to the move where they started disagreeing and re-runs all the moves from there up to the present to get back to where it should be, possibly using an averaging or interpolation scheme to smooth between what the server says and what the player thought it should be.

If you find any good tutorials for it, besides Gaffer's, I'd love to know, as I'm still a little foggy on some of the details.
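The rewind-and-replay scheme described above can be sketched roughly like this. This is a minimal 1-D sketch with made-up names (`PredictedShip`, `Move`); a real engine's physics step would replace the trivial `pos += input` update:

```cpp
#include <cassert>
#include <cmath>
#include <cstdint>
#include <deque>

// One predicted move, kept in a history buffer until the server acknowledges it.
struct Move {
    uint32_t seq;    // client-assigned sequence number
    double   input;  // throttle applied for this tick
    double   result; // predicted position after applying the input
};

class PredictedShip {
public:
    // Apply input locally right away and remember it for possible replay.
    void applyInput(uint32_t seq, double input) {
        pos_ += input;  // stand-in for the real physics step
        history_.push_back({seq, input, pos_});
    }

    // Server reports its authoritative position after processing move `seq`.
    void onServerState(uint32_t seq, double serverPos) {
        // Discard moves older than the acknowledged one.
        while (!history_.empty() && history_.front().seq < seq)
            history_.pop_front();
        if (history_.empty() || history_.front().seq != seq)
            return;  // stale or duplicate acknowledgement
        double predicted = history_.front().result;
        history_.pop_front();
        if (std::fabs(predicted - serverPos) > 0.001) {
            // Misprediction: rewind to the server's state and replay the
            // not-yet-acknowledged inputs on top of it.
            pos_ = serverPos;
            for (Move& m : history_) {
                pos_ += m.input;
                m.result = pos_;
            }
        }
    }

    double position() const { return pos_; }

private:
    double pos_ = 0.0;
    std::deque<Move> history_;
};
```

Instead of snapping to the replayed position, you could also blend towards it over a few frames, as discussed elsewhere in this thread.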

Either you decide to run the client some amount behind the server (say, one RTT), or you have to extrapolate. When you extrapolate, you will guess wrong; your goal is to hide this wrong-guessing. The easiest route is not to snap the position immediately when you get a new update, but instead to smoothly move the ship towards where it "should" be over several successive frames.

Here's a simple example of what you might do:

/// Call Entity::receivePacket() when you receive a new position/entity update.
/// Only pass it in-order data (i.e., drop out-of-order packets).
/// Call Entity::posAtTime() to estimate the position of an entity at a given
/// simulation time, based on the latest available data.
/// Entity does piecewise linear interpolation to smooth movement towards an
/// estimate of the actual server movement, without any sharp jumps in position.

class Entity {
    // interpolation anchor: where we were when the last packet arrived
    Vector3 lastPos;
    Vector3 lastVel;
    Time lastTime;
    // extrapolated target from the most recent packet
    Vector3 targetPos;
    Vector3 targetVel;
    Time targetTime;
  public:
    /// @note "time" means game/simulation time (synced between client and server)
    /// @param pos is the position of the entity in the packet
    /// @param vel is the velocity of the entity in the packet
    /// @param forTime is the time of the pos/vel reading, out of the packet
    /// @param rtt is the estimated round-trip-time for the connection
    /// @param curTime is the current simulation time on the client
    void receivePacket( Vector3 pos, Vector3 vel, Time forTime, Time rtt, Time curTime ) {
        // Extrapolate the reported position forward by one RTT, so the target
        // is where the entity should be once this data is "current" here.
        targetPos = pos + vel * rtt;
        targetVel = vel;
        targetTime = forTime + rtt;
        // Re-anchor at the position we are currently displaying, so there is
        // no visible snap, then head for the target from there.
        lastPos = posAtTime( curTime );
        lastTime = curTime;
        lastVel = (targetPos - lastPos) / (targetTime - lastTime);
    }
    /// @param time is the simulation time for which you want a position
    Vector3 posAtTime( Time time ) {
        if( time < targetTime ) {
            return lastPos + lastVel * (time - lastTime);
        }
        return targetPos + targetVel * (time - targetTime);
    }
};

Guest Anonymous Poster
How often does the server update the clients?
We tried a few frequencies in our (commercial) game, and settled on 12 updates per second, and we scale it down when a lot more players join in.
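One way to scale the rate down as players join is to divide a fixed outgoing-update budget among them. A small sketch; the budget and cap values here are assumptions, not the commercial game's actual numbers:

```cpp
#include <algorithm>
#include <cassert>

// Per-client update rate: cap at a nominal maximum, and divide a fixed
// server-wide budget of updates/second among the connected players.
int updatesPerSecond(int playerCount) {
    const int kMaxRate = 12;   // nominal rate with few players (assumed)
    const int kBudget  = 240;  // total updates the server sends per second (assumed)
    return std::max(1, std::min(kMaxRate, kBudget / std::max(1, playerCount)));
}
```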

Right now I am updating the boat positions 4 times a second (boats are neat: they go slow and don't change direction a lot). It's smooth even at once a second, unless the player is doing some crazy stuff (which they wouldn't in battle).

The simulation appears to be right on; it's just that the packets are out of date by the time they arrive on the client, so the ship lurches a few steps every time a packet arrives at a slightly different frequency.

I believe I have a couple of options:

A:
I can use a clock value to adjust how many steps the packet should be advanced, and smooth it out (like hplus suggests).

B:
I could move the camera farther away from the boat (the lurching is undetectable at a little distance).

C:
I could use some sort of rate control to adjust when arriving packets are applied, to compensate for the unreliable frequency.


For the sake of getting this project running as soon as possible, I will go with B for now.
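Option A might look something like the following. `BoatState`, `stepBoat`, and the fixed-step loop are hypothetical names for illustration; the game's real step function would go in place of `stepBoat`:

```cpp
#include <cassert>
#include <cmath>

// Toy 1-D boat state for illustration.
struct BoatState {
    double pos;
    double vel;
};

// Stand-in for the real fixed-timestep simulation step.
void stepBoat(BoatState& b, double dt) {
    b.pos += b.vel * dt;
}

// Advance a just-received state by however many fixed simulation steps
// elapsed while the packet was in flight (curTime - packetTime).
BoatState advanceToNow(BoatState fromPacket, double packetTime,
                       double curTime, double dt) {
    int steps = static_cast<int>((curTime - packetTime) / dt);
    for (int i = 0; i < steps; ++i)
        stepBoat(fromPacket, dt);
    return fromPacket;
}
```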

hplus, could you describe the time values in your code a little more? Like who is generating them (player or server), and are the times unique on each client for every object in the game? That's where I'm getting stuck with this stuff. Thanks for your patience explaining it.

I'm assuming that client and server attempt to keep one synchronized time (call this logical time). This logical time is a measurement of how far the simulation has progressed since start (or some assumed start, sometime back in time).

For a simple implementation of this, the client can add its physical time (the actual clock) to each outgoing packet, and the server, when receiving the packet, can remember that value, as well as the server logical clock at time of receipt. It puts both those values, as well as the server logical clock at send, in the next packet to the client. The client can then use these values to calculate both RTT (ping) and offset from its physical time to get logical time.


Client Logical Time = Client Physical Time + Client Offset to Logical Time

Client->Server:
CP1 = clientPhysicalClock()
send( CP1 )

recv( CP1 )
SL1 = serverLogicalClock()

...processing...

Server->Client:
SL2 = serverLogicalClock()
send( CP1, SL1, SL2 )

recv( CP1, SL1, SL2 )
CP2 = clientPhysicalClock()
ServerProcessingTime = SL2-SL1
RTT = CP2 - CP1
OneWayPing = (RTT - ServerProcessingTime)/2
ClientOffsetToLogicalTime = ((SL2+OneWayPing-CP2)+(SL1-OneWayPing-CP1))/2


You might want to smooth your estimation of the actual offset, to avoid too-abrupt changes and too much jitter caused by a single delayed packet.
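One common way to smooth the offset is an exponentially weighted moving average over the raw samples from the exchange above. The struct name and the alpha value here are my own choices, not part of the derivation:

```cpp
#include <cassert>
#include <cmath>

// Smoothed clock-offset estimator. Each round trip yields one raw offset
// sample; an exponentially weighted moving average damps the jitter a
// single delayed packet would otherwise cause.
struct OffsetEstimator {
    double smoothed = 0.0;
    bool   primed   = false;
    double alpha    = 0.1;  // smaller = smoother but slower to converge

    // Raw offset from one exchange, exactly as in the derivation above.
    double rawOffset(double CP1, double SL1, double SL2, double CP2) {
        double rtt        = CP2 - CP1;
        double processing = SL2 - SL1;
        double oneWay     = (rtt - processing) / 2.0;
        // Average the two independent estimates (one per direction).
        return ((SL2 + oneWay - CP2) + (SL1 - oneWay - CP1)) / 2.0;
    }

    double update(double CP1, double SL1, double SL2, double CP2) {
        double sample = rawOffset(CP1, SL1, SL2, CP2);
        if (!primed) { smoothed = sample; primed = true; }
        else         { smoothed += alpha * (sample - smoothed); }
        return smoothed;
    }
};
```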

There are fancier ways to calculate client/server time synchronization; for example, look at the way that NTP calculates time.

Ok, one more question for you hplus. What goes on in the serverLogicalClock() function, and how are any times it uses initialized? I know you said the logical time was the time elapsed since the simulation started, but is that the server's simulation, or the client's, or what, when we're figuring out the server logical clock?

BTW you should write up a tutorial covering this stuff. It'd be a tremendous learning tool to have it all in one place, instead of having to jump from tutorial to tutorial in the FAQ and trying to figure out how they fit together. Or better yet a book; I haven't found one that goes this in-depth yet, they only seem to cover Winsock basics.

Thanks again.

Here's a typical implementation of a clock:


#include <sys/time.h>  // gettimeofday()

class Clock {
  public:
    Clock() : offset_( 0 ) {}
    double time();
    void adjustTime( double dt );
  private:
    double offset_;
};

double Clock::time() {
    struct timeval tv;
    gettimeofday( &tv, 0 );
    return (double)tv.tv_sec + 1e-6*(double)tv.tv_usec - offset_;
}

void Clock::adjustTime( double dt ) {
    offset_ -= dt;
}


At start-up, the server will call clock->adjustTime(-clock->time()) to reset it to 0. serverLogicalClock() would then return clock->time() after that adjustment.

On Windows, you typically use QueryPerformanceCounter() instead, although it has some bugs. In fact, all the PC hardware timers have bugs of one kind or another.
