Ozymandias42

Two more client prediction questions


Those of you following my amazing adventures in writing a small space shooter game may notice this is probably my third or fourth question on the subject. This stuff is hard. Happily, though, I've actually got client prediction more or less working now. Major thanks to hplus and the gaffer tutorial on networked physics. I've got some more questions, though.

First, timing of inputs. Say the client's time to the server is about 15 states' worth, so at time t, the client is rendering time t+15. The user presses the shoot button, and the client sends a message to the server that it has fired. If the lag drops suddenly and the server gets the message at time t+5 instead of near t+15, should the server hold off on recognizing that input for another 10 states or so? I could fix it by having the server buffer incoming inputs until the right time, but that adds quite a bit of complexity to the server, and in 95% of cases a client who presses the fire key would want the bullet to go a little earlier anyway.

Second, rotations. If there are two clients, A and B, and A begins to rotate, then after a trip to the server and a trip to B, B will see A begin to rotate. Because of the prediction and the smoothing, A will get a short initial burst of turn speed, which is fine. But when A stops turning, B won't know about it for a while, so A will appear to keep turning until it suddenly reverses its turn a little. This can be really annoying, because every single turn another ship makes ends with that little hook of reverse-turn. I considered not predicting the direction a ship is facing at all, which would add a bit of roughness to the turns but never overguess; however, the direction a ship is facing affects its position when it's accelerating, which could introduce position prediction errors.

My question is whether there is another, better solution to this, and if not, which is better: an annoying small correction at the end of every turn (which grows with lag), not predicting rotations at all, or some middle ground?
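To pin the timing down a bit: the idea is that each input carries the tick the client is currently predicting, so the server can tell "early" from "on time" no matter how the one-way latency drifts. A minimal sketch (the names are made up, not my actual code):

```python
# Sketch: the client stamps each input with the simulation tick it is
# predicting, so an early-arriving packet is still unambiguous.
# All names here are illustrative.

def stamp_input(server_tick_estimate, predicted_offset, command):
    """Attach the tick the client is currently predicting (t + offset)."""
    return {"tick": server_tick_estimate + predicted_offset,
            "command": command}

# Client believes the server is at tick 100 and is predicting 15 ahead:
msg = stamp_input(100, 15, "fire")   # {"tick": 115, "command": "fire"}
```

With the tick on the wire, "should the server wait?" becomes a policy decision rather than a guess about latency.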

1. I would say just process the command a bit earlier than expected. Unless you have a very volatile lag profile this isn't going to be a big issue anyway, and if you wanted to 'hold back' commands you'd need some way of determining what the 'real lag' was – seems a bit unnecessary to me.

2. Hmm, this is the eternal dilemma of prediction, right :P? How have you solved it for linear motion (i.e. a craft stopping: does that jump back)? It seems to me that rotation ought to be analogous. I think I would just let the view of the rotation lag, so it didn't have to jump back; effectively this is no prediction for rotation.
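Concretely, "letting the view lag" could just mean moving the displayed heading toward the last confirmed one and never past it, so there is nothing to snap back. A rough sketch (the rate limit and names are my own invention):

```python
import math

def approach_angle(displayed, target, rate, dt):
    """Move `displayed` toward `target` by at most rate*dt radians,
    taking the short way around the circle, never overshooting."""
    # Wrap the difference into (-pi, pi] so we turn the short way.
    diff = math.atan2(math.sin(target - displayed),
                      math.cos(target - displayed))
    # Clamp the step to the turn-rate budget for this frame.
    step = max(-rate * dt, min(rate * dt, diff))
    return displayed + step

# Remote ship last confirmed facing 90 degrees; we're displaying 0.
shown = 0.0
for _ in range(20):
    shown = approach_angle(shown, math.pi / 2, rate=2.0, dt=0.05)
# `shown` creeps up to pi/2 and stops there: no reverse-turn hook.
```

The cost is that the displayed heading runs a little behind reality; the benefit is it only ever corrects forward.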

nPawn - I was actually planning on it. Once it works pretty well, I think I would in fact like to write a nice tutorial on how to implement it. For now, though, I've gotta seriously recommend this tutorial as a starting point.

Bob - I haven't actually "solved" this problem for motion. The ships don't really burst in directions, since unlike FPS games, they're not either moving or not, they're accelerating. It's much easier to smooth the motion out.

But you're right, they're the same problem. Anyone know why in Counterstrike or that sort of game, you don't see the other players take a jump back after they run forwards a little bit? Perhaps it just leaves them in the wrong spot if it's pretty close? Anybody?
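My best guess at what those games do is interpolation rather than prediction for other players: render them slightly in the past, between two snapshots you've actually received, so there's never anything to snap back. A sketch of what I mean (the snapshot numbers are made up):

```python
def interpolate_position(snapshots, render_time):
    """Linearly interpolate an entity's position at `render_time`
    between the two received snapshots that bracket it.
    `snapshots` is a time-sorted list of (timestamp, position)."""
    for (t0, p0), (t1, p1) in zip(snapshots, snapshots[1:]):
        if t0 <= render_time <= t1:
            alpha = (render_time - t0) / (t1 - t0)
            return p0 + alpha * (p1 - p0)
    # Ran out of snapshots: hold the last known position.
    return snapshots[-1][1]

# Client clock is at t=1.00 but renders remote players 0.1s in the past,
# so it only ever shows states it has really been told about:
snaps = [(0.85, 10.0), (0.90, 12.0), (0.95, 14.0)]
pos = interpolate_position(snaps, render_time=1.00 - 0.1)
```

The trade is a fixed extra viewing delay on other players in exchange for never mispredicting them.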

So, I've been thinking about this problem some more and tinkering with my algorithms. At this point, my prediction works nearly flawlessly as long as the latency between the server and the client remains pretty constant and consistent between packets. The problems turn up when I use the Internet to test between my computer and a friend's. Somewhere along the line, something seems to hold onto several UDP packets for 20-30ms and release them as a block.

Now, it should be the case that a good prediction system can handle packets arriving a little earlier or later than they were supposed to without causing a problem, but I haven't really seen a discussion on how to handle this particular issue.

My current best idea for a solution is that if the server receives an update early, for instance if it's in state 5 and receives an input command for state 10, the server could just hold onto it for 5 states before processing it. Then I'd bump up the client's prediction a little to minimize packets arriving late, and that should wipe out most of the other sort of error. This seems like a pretty obvious solution, but I haven't heard it mentioned anywhere, so perhaps it's crazy. Thoughts? Ideas? Input?
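In code, the hold-until-due idea is just a small jitter buffer on the input stream, keyed by tick. Something like this sketch (all names are mine, not from any engine):

```python
import heapq

class InputBuffer:
    """Holds inputs that arrive before their intended tick and releases
    them once the server simulation reaches that tick. Late inputs come
    out immediately, and the server can decide to apply or drop them."""

    def __init__(self):
        self._pending = []  # min-heap of (tick, command)

    def receive(self, tick, command):
        heapq.heappush(self._pending, (tick, command))

    def due(self, current_tick):
        """Pop every buffered input whose tick has arrived."""
        ready = []
        while self._pending and self._pending[0][0] <= current_tick:
            ready.append(heapq.heappop(self._pending))
        return ready

# Server is at tick 5 and receives an input stamped for tick 10:
buf = InputBuffer()
buf.receive(10, "fire")
held = buf.due(5)        # [] -- held back, not due yet
released = buf.due(10)   # [(10, "fire")] -- released on its tick
```

The buffer stays tiny because it only ever holds inputs for the few ticks between arrival and their stamp.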

