Epic Interpolator accuracy

Lo board,

I'm currently trying to improve the displayed position of computer-simulated cars in our driving simulator (the positions are sent over the network to rendering clients). I stumbled upon the EPIC Extrapolator by hplus0603, which according to the posts seems to work really well. That is not the case for me, though: the extrapolation is really jerky. The problem might be my "globally synchronized clock", which is a client-side hack at the moment. What I do is this:

* Use the time reported by the first 32 messages received from the traffic simulator to set up the variable dExpectedTimediff, which represents the difference between our local timer and the traffic simulator's local time.
* When a sample is received from the traffic simulator, map the received time into our (client) local time by:

    dTimeDiff   = dTimeLocal - dTimeTraffic;
    dAdjustment = C - (dTimeDiff - dExpectedTimediff);
    dTimePacket = dTimeLocal - dAdjustment;

Here C is a constant allowed latency, set to something like 0.05 when I expect 30 Hz updates. This is to ensure that the adjustment is always (well, more or less) positive; otherwise the last line could end up with a time that is ahead of our own clock. I then add the sample:

    extrapolator.AddSample(dTimePacket, dTimeLocal, rgdPos, rgdVel);

And this causes some very jerky extrapolation. I tried the same thing with just a hardcoded constant latency (which is more or less what it is on this LAN anyway):

    extrapolator.AddSample(dTimeLocal - 0.01, dTimeLocal, rgdPos, rgdVel);

This is just as bad, and the part I find most amusing is that the extrapolation is actually MUCH worse than just setting the position to whatever I get in. I would appreciate any ideas.

Greetz David
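A minimal C++ sketch of the clock mapping described above (the ClockMapper class and its method names are mine; only the formulas and the 32-message calibration come from the post):

    class ClockMapper {
    public:
        // Average the clock offset seen in the first 32 messages from the
        // traffic simulator; this average becomes dExpectedTimediff.
        void Calibrate(double dTimeLocal, double dTimeTraffic)
        {
            if (m_nSamples < 32) {
                m_dSum += dTimeLocal - dTimeTraffic;
                if (++m_nSamples == 32)
                    m_dExpectedTimediff = m_dSum / 32.0;
            }
        }

        // Map a traffic-simulator timestamp into client time, padded by a
        // constant allowed latency C so the result stays behind local time.
        double MapToLocal(double dTimeLocal, double dTimeTraffic, double C = 0.05) const
        {
            double dTimeDiff   = dTimeLocal - dTimeTraffic;
            double dAdjustment = C - (dTimeDiff - m_dExpectedTimediff);
            return dTimeLocal - dAdjustment;   // dTimePacket
        }

    private:
        double m_dSum = 0.0;
        double m_dExpectedTimediff = 0.0;
        int    m_nSamples = 0;
    };

With that in place, adding a sample would look like:

    extrapolator.AddSample(mapper.MapToLocal(dTimeLocal, dTimeTraffic), dTimeLocal, rgdPos, rgdVel);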
Are you using the time stamp of the received packets, or are you using time step numbers actually put in the packet by the sender?

Also, are you stepping your physics using a constant frame time, or variable frame time?

If the extrapolation doesn't work right for you, chances are that you're putting the wrong numbers into the time variables. One thing you could do is log all the times you put data in and take data out to a file, and then examine that file in a tool like Excel to plot jitter and trends.
enum Bool { True, False, FileNotFound };
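A rough sketch of that kind of logging, meant to sit next to the AddSample call; the file name and helper function are mine, not part of EPIC:

    #include <cstdio>

    // Open once, e.g. at client start-up.
    static std::FILE* g_pTimeLog = std::fopen("extrapolator_times.csv", "w");

    // Call this alongside every AddSample so packet time, local time and
    // the apparent latency can be plotted over time in a spreadsheet.
    void LogSample(double dTimePacket, double dTimeLocal)
    {
        if (g_pTimeLog) {
            std::fprintf(g_pTimeLog, "%f,%f,%f\n",
                         dTimePacket, dTimeLocal, dTimeLocal - dTimePacket);
            std::fflush(g_pTimeLog);
        }
    }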
I'm using time numbers actually put in the packet by the sender. What I explained above is that I start by estimating a "difference between clocks", which I then use for all later messages to measure the jitter (dExpectedTimediff - dTimeDiff), and I also add a "constant" latency C to ensure that the timestamp will not be ahead of the time we want to extrapolate to.

The reason for doing this rather than adding a global timer in the network layer is that I don't have access to the server-side (traffic) code at this time. We will have it in the near future, but I wanted to try out your code first anyway.

The AI and physics of the drivers are done server-side, so my only concern is interpolating/extrapolating the actual positions.

The spreadsheet idea is not a bad one; it might make me catch on to a trend of some sort, and it will only take a few seconds to implement. Will do this on Monday.

Greetz David
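Since the client only reads positions back for rendering, the per-frame read side might look roughly like this. It assumes EPIC's Extrapolator offers a ReadPosition(forTime, pos, vel) counterpart to AddSample; treat the exact names and template parameters as assumptions rather than quotes from the header:

    #include "Extrapolator.h"   // hplus0603's EPIC template (header name assumed)

    Extrapolator<3, double> extrapolator;   // one per simulated car

    // Called once per render frame with the client's current local time.
    void UpdateCarForRendering(double dTimeLocal, double rgdPosOut[3])
    {
        double rgdVelOut[3];
        // Ask the extrapolator where the car should be drawn right now.
        if (extrapolator.ReadPosition(dTimeLocal, rgdPosOut, rgdVelOut)) {
            // rgdPosOut now holds the extrapolated position for this frame.
        }
    }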
After writing my own dummy traffic server I found out it was the traffic simulation that was sending me bogus values at times. After getting the developer of that part to fix it, your extrapolator now seems to work perfectly. Great little template :)

Greetz David

