Multiplayer Movement

I'm having a bit of a hard time deciding the best way to do a version of dead reckoning that lets players move smoothly. I was thinking of just having each player send their current velocity and transform matrix (it's LAN only, so bandwidth is no real concern right now, although it's 4 players tops so I'm not going too nuts; I think it came out to about 5 KB/s).

I take a time reading before sending that packet; when the server gets the packet it takes another time reading, creates a scaled velocity from that information, adds it to the player's reported velocity, and applies it to the transformation matrix. Each player does this, and every 1/20th of a second (20 matrix updates to all players every second) the server updates all players with the current transformation matrix of every other player, as well as their own, plus each player's last reported velocity. The client is basically just reporting and running its own simulation, then getting the updated matrix from the server and making any corrections needed.

My concern is that this seems like it's going to cause jerky motion even at 20 updates a second. I would like this to be as smooth as possible, and I'm also sure there is a much better way than what I'm currently doing. Any opinions or suggestions?
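In rough C++-ish pseudocode, the server-side step I have in mind looks something like this (the packet layout and names are just for illustration, and it assumes a column-major 4x4 matrix with the translation in elements 12-14 and roughly synchronized clocks):

struct MovePacket
{
    float clientSendTime;   // client clock reading taken just before sending
    float velocity[3];      // units per second
    float transform[16];    // column-major 4x4, translation in elements 12-14
};

void OnMovePacket(const MovePacket& pkt, float serverReceiveTime, float* playerTransform)
{
    // Estimate how long the packet was in flight.
    float transit = serverReceiveTime - pkt.clientSendTime;

    // Take the reported transform, then push the translation forward by the
    // velocity scaled by the transit time, so the server-side position catches
    // up to where the client should be by now.
    for (int i = 0; i < 16; ++i)
        playerTransform[i] = pkt.transform[i];
    for (int axis = 0; axis < 3; ++axis)
        playerTransform[12 + axis] += pkt.velocity[axis] * transit;
}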
You need to either interpolate between received values (which will put your entities back in time), or interpolate between extrapolated values. The key is to never "snap" the displayed user location.

Each time you receive a packet, you update your extrapolation values. Each time you render, you call the extrapolator to figure where you should be rendering a packet interval into the future, and you look at where you actually rendered the last time you received a packet, then interpolate to the current time based on that data.

For position, it would look something like this (in pseudo-code):

onPacketReceive( Packet packet ) {
    oldTargetPos = curTargetPos;
    oldTargetTime = curTargetTime;
    curTargetPos = packet.pos;
    curTargetTime = packet.time;
    prevPos = curPos;
    prevPosTime = curTime;
}

onEntityRender() {
    futurePos = oldTargetPos + (curTargetPos-oldTargetPos) *
        (curTime+packetInterval-oldTargetTime) /
        (curTargetTime-oldTargetTime);
    curPos = prevPos + (futurePos-prevPos) * (curTime-prevPosTime) /
        (curTime+packetInterval-prevPosTime);
    renderEntityAtPos( curPos );
}


edit: breaking a long line

[Edited by - hplus0603 on October 4, 2004 11:34:18 AM]
enum Bool { True, False, FileNotFound };
To simplify my networking structure, I basically handle each frame the way HPlus brought up, just simplified at this point: I take the entity's last known velocity and keep applying it each frame until I receive something different from the server.

So, if a client was moving forward and strafing right at time x, then from time x until time y (when I get a "gamestate" update from the server) I continue to move the entity forward and right, something like the sketch below. This can cause some error should the entities get too far off, but at this point I'm not too worried about it as my timeline is kind of strict.
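In its simplest form it's just something like this (the entity structure and names are made up for illustration):

struct RemoteEntity
{
    float pos[3];
    float lastKnownVel[3];  // most recent velocity received from the server
};

// Called every frame: between gamestate updates, just keep integrating the
// last reported velocity.
void UpdateRemoteEntity(RemoteEntity& e, float dt)
{
    for (int i = 0; i < 3; ++i)
        e.pos[i] += e.lastKnownVel[i] * dt;
}

// Called when a gamestate update arrives: take the server's state and carry on.
void OnGamestateUpdate(RemoteEntity& e, const float serverPos[3], const float serverVel[3])
{
    for (int i = 0; i < 3; ++i)
    {
        e.pos[i] = serverPos[i];
        e.lastKnownVel[i] = serverVel[i];
    }
}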

If you have the time I'd recommend HPlus' solution, which is slightly more elaborate than the one I'm using :).

Permafried-
I've actually got characters in-game and running around without any lag compensation right now, and it's not too bad at all.

I've basically decided to measure the latency between the client and server, and on every update simply scale that player's velocity by their latency and update their location server-side with that scaled vector.
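As a rough sketch, the correction is just this (names are illustrative, not my actual code):

// Push the player forward by the distance they should have covered while the
// packet was in transit, using the measured latency in milliseconds.
void ApplyLatencyCompensation(float playerPos[3], const float velocity[3], unsigned int latencyMs)
{
    float latencySeconds = latencyMs / 1000.0f;
    for (int i = 0; i < 3; ++i)
        playerPos[i] += velocity[i] * latencySeconds;
}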

The only problem is I'm getting corrupted values from my basic ping. It works like this: every second the server sends the client a request that tells it to get its tick count and send it to the server. When the server receives it, it sends another request telling the client to do it again. The server then takes the difference of the two values and uses that as the latency value in milliseconds.

Here is my Client side code:

Ping request is received
char tempBuffer[10];
tempBuffer[0] = (unsigned char) PING;
unsigned int tempPing = (unsigned int)GetTickCount();
memcpy(&tempBuffer[1], &tempPing, sizeof(unsigned int));
send(connection, (char*)&tempBuffer[0], 10, NULL);


Server gets it and stores it:
memcpy(&traverse->lagTime, &recvBuffer[1], sizeof(unsigned int));
char tempBuffer[5];
tempBuffer[0] = (unsigned char) ACK;
memset(&tempBuffer[1], 0, sizeof(unsigned int));
send(traverse->connection, (char*)&tempBuffer[0], 5, NULL);


Client gets the acknowledgement
char tempBuffer[10];
tempBuffer[0] = (unsigned char) ACK;
unsigned int tempPing = (unsigned int)GetTickCount();
memcpy(&tempBuffer[1], &tempPing, sizeof(unsigned int));
send(connection, (char*)&tempBuffer[0], 10, NULL);


Server gets it and updates the client's lag time
unsigned int newTime = 0;
memcpy(&newTime, &recvBuffer[1], sizeof(unsigned int));
traverse->lagTime = newTime - traverse->lagTime;


I'm getting corrupted data when I try to read the data from the recv buffer and I'm not quite sure why. It all seems fine on the client side when it's packaged and sent, but when I receive it I get junk. Any ideas?
Are both client and server using the same endianness (i.e. running on the same family of processor)? And are you running on a 32-bit or a 64-bit processor?

Could you post your receive code on the server?

Gizz
Quote:Original post by Gizz
Are both client and server using the same endianness (i.e. running on the same family of processor)? And are you running on a 32-bit or a 64-bit processor?

Could you post your receive code on the server?

Gizz


They are the exact same machines.
// Clear the receive buffer.
memset(&recvBuffer[0], 0, 1024);

// Get all pending data.
recv(wSocket, (char*)&recvBuffer[0], 1024, NULL);

// What packet type?
switch(recvBuffer[0])
{
case ACK:
    {
        unsigned int newTime = 0;
        memcpy(&newTime, &recvBuffer[1], sizeof(unsigned int));
        traverse->lagTime = newTime - traverse->lagTime;
    }
    break;
case PING:
    {
        memcpy(&traverse->lagTime, &recvBuffer[1], sizeof(unsigned int));
        char tempBuffer[5];
        tempBuffer[0] = (unsigned char) ACK;
        memset(&tempBuffer[1], 0, sizeof(unsigned int));
        send(traverse->connection, (char*)&tempBuffer[0], 5, NULL);
    }
    break;
.....
The corruption might be in your algorithm:

traverse->lagTime = newTime - traverse->lagTime


It's unclear whether you store absolute times (tick values, on the order of 1,000,000 in size) or relative times (deltas, on the order of 10 in size) in "lagTime". I assume that "newTime" is an absolute tick-count value.

However, if "lagTime" is supposed to be relative, then you subtract a relative value from an absolute, to get an absolute in the past. Then you assign it to lagTime, which is supposed to be relative. If "newTime" is relative, then you subtract a current relative value from the new relative, yielding a value with illegal type. If "newTime" is already relative, then you should just assign it to lagTime without subtraction.

If you keep track of absolute values from the other end, but also want a relative value, then you should store the absolute in one variable, and the relative in another. Say, create a new variable "lastTime" to store "newTime" in each time you get a new reading, and another variable "lagTime" that always stores the difference between "newTime" and "lastTime".

If you use lagTime for temporary storage while your lag query is outstanding, i.e. the type flip-flops between "absolute" and "relative" with each query in order to save 4 bytes of storage, then that's a really bad habit. Someone might want to read the value while the query is outstanding, and will get the wrong (absolute) value.
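A minimal sketch of keeping the two separate (names are just illustrative):

struct ClientTiming
{
    unsigned int lastTime;  // last absolute tick count reported by the client
    unsigned int lagTime;   // relative delta between the last two reports
};

void OnTickReport(ClientTiming* t, unsigned int newTime)
{
    // lagTime always holds a relative delta and lastTime always holds an
    // absolute tick count, so a reader never sees a half-updated value.
    t->lagTime  = newTime - t->lastTime;
    t->lastTime = newTime;
}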
enum Bool { True, False, FileNotFound };
Quote:Original post by hplus0603
The corruption might be in your algorithm:


traverse->lagTime = newTime - traverse->lagTime


It's unclear whether you store absolute times (tick values, on the order of 1,000,000 in size) or relative times (deltas, on the order of 10 in size) in "lagTime". I assume that "newTime" is an absolute tick-count value.

However, if "lagTime" is supposed to be relative, then you subtract a relative value from an absolute, to get an absolute in the past. Then you assign it to lagTime, which is supposed to be relative. If "newTime" is relative, then you subtract a current relative value from the new relative, yielding a value with illegal type. If "newTime" is already relative, then you should just assign it to lagTime without subtraction.

If you keep track of absolute values from the other end, but also want a relative value, then you should store the absolute in one variable, and the relative in another. Say, create a new variable "lastTime" to store "newTime" in each time you get a new reading, and another variable "lagTime" that always stores the difference between "newTime" and "lastTime".

If you use lagTime for temporary storage while your lag query is outstanding, i.e. the type flip-flops between "absolute" and "relative" with each query in order to save 4 bytes of storage, then that's a really bad habit. Someone might want to read the value while the query is outstanding, and will get the wrong (absolute) value.


I actually found the problem after a fresh look at it today. There were three issues:
1) the receive buffer was char and not unsigned char
2) I was storing a DWORD in an unsigned int
3) memcpy needed me to cast the receive buffer segment holding the DWORD value to a DWORD*

After I did that it all works great, and movement is much smoother than it already was; very few hiccups, if any.
1) Char vs uchar should not matter at all -- they're the same bytes in memory.
2) Unsigned int and DWORD are the same size on all current gaming platforms, so this shouldn't matter.
3) Cast a DWORD to a DWORD*? If you provided the wrong type, then the compiler wouldn't have accepted the code in the first place. If you somehow used a value instead of pointer-to-value, I'd have expected you to crash, not just run "laggy".

I don't think that any of these "cures" would, by themselves, fix any real cause of a problem like you describe, unless there's something else also involved. But whatever. You don't have a problem right now -- yay you.
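For what it's worth, reading the value back out of the receive buffer needs no cast at all, since memcpy takes void pointers; something like this (names illustrative) works the same whether the buffer is char or unsigned char:

#include <cstring>

void ReadTickCount(const char* recvBuffer, unsigned int* newTime)
{
    // Copies 4 bytes on Win32, the same size as a DWORD.
    std::memcpy(newTime, &recvBuffer[1], sizeof(*newTime));
}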
enum Bool { True, False, FileNotFound };

