
Running a simulation loop on the server: fixed or variable delta time?


Recommended Posts

So, I've implemented a generic server (generic in the sense that the server core is not tied to the game itself and can be re-used) for a game I'm developing. It has a simulation loop which is timed using the multimedia timers from [b]winmm.dll[/b] (though I am using C#, so I'm P/Invoking). They give me very good accuracy, about +/- 1ms. But even at this accuracy the measured time will sometimes drift backward or forward, and depending on the tick rate of the server this may (or may not) add up to a whole tick of drift.

Now I've figured out two ways to deal with this. [b]The first one[/b] is to, even though I have a fixed tick rate of say 20Hz, calculate the delta between each tick, and by doing so let the game automatically compensate for the sway of the timer. Sometimes the delta will be 1ms behind at ~49ms, sometimes 1ms ahead at ~51ms, and sometimes spot on at ~50ms. This seems like the best approach, as the simulation will constantly be at the correct time. In pseudo code this would look like a normal frame-based game loop:

[source lang="python"]while true:
    waitForTick()

    now = currentTime()
    delta = now - prev
    prev = now

    update(delta)[/source]

This is the version I'm currently using. The reason I like it is that it smooths out the discrepancies in the clock over a longer period, giving no sudden jumps. But it's also possible to get into a situation where you have run one tick fewer than you should have, with that tick's "time" spread out over all the previous ticks - so I don't know how much this matters.

[b]The second solution[/b] I've come up with is to fix the delta time between ticks at the set tick interval, so 50ms (assuming a tick rate of 20Hz), and then detect when I have drifted far enough to perform one extra tick to catch up again. This is basically running the server as a fixed-time-step physics simulation; in pseudo code it would look like this:

[source lang="python"]while true:
    waitForTick()

    while currentTick < expectedTicks:
        currentTick += 1
        update(0.05)[/source]

This feels more robust in theory, but the fact that you can slip almost a full tick behind before you catch up worries me; it feels like it would cause a sudden jump in the world when the "extra" catch-up tick runs.

Yeah, deltas can be a pain. How about this: Maintain what tick you should be on based on the start of the game, then delta against that? e.g. game start = '21/05/2012 05:00:00', time now = '21/05/2012 06:12:13', elapsed time = '01:12:13', there have been x ticks executed since the start of the game, but there should be x+2 ticks within a time of 01:12:13, so do 2 steps.
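A minimal sketch of this idea in Python (the thread's pseudocode language; the function name and tick length are illustrative, not from anyone's actual code) reduces it to a pure calculation: given the time elapsed since game start and the number of ticks already executed, how many catch-up steps are owed?

```python
def ticks_owed(elapsed_seconds, ticks_done, tick_len=0.05):
    """How many catch-up steps to run, given time elapsed since game start.

    tick_len = 0.05 corresponds to the 20Hz tick rate discussed above.
    """
    expected = int(elapsed_seconds / tick_len)  # ticks there *should* have been
    return max(0, expected - ticks_done)

# Each pass of the server loop would then run ticks_owed(...) update steps,
# so timer sway never accumulates: the schedule is anchored to game start.
```

Because the expected tick count is derived from the absolute start time rather than from the previous frame, small per-frame timer errors cannot accumulate into long-term drift.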

Yeah, I would keep track of the absolute time at which you started your game and each frame's absolute time value.
I also usually pass both the delta and the absolute time into my simulation, as some things lend themselves to being more easily updated with one measurement or the other.
e.g. a count-down can be implemented as:
[code]if( start ) m_countDown = duration;
bool complete = (m_countDown -= deltaTime) <= 0;[/code]
or
[code]if( start ) m_countEnd = absoluteTime + duration;
bool complete = (m_countEnd <= absoluteTime);[/code]
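The two count-down styles above can be sketched side by side in Python (class names are hypothetical; this just restates the delta-driven vs. absolute-time-driven distinction):

```python
class DeltaCountdown:
    """Count-down driven by per-step delta times."""
    def __init__(self, duration):
        self.remaining = duration

    def tick(self, delta_time):
        self.remaining -= delta_time
        return self.remaining <= 0  # True once the count-down completes


class AbsoluteCountdown:
    """Count-down driven by the absolute simulation time."""
    def __init__(self, start_time, duration):
        self.end_time = start_time + duration

    def tick(self, absolute_time):
        return self.end_time <= absolute_time
```

The absolute-time version is immune to accumulated floating-point error from many small subtractions, while the delta version needs no notion of a shared clock; which fits better depends on the thing being updated.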

[code]u64 systemTicksPerSecond = Os_timer_frequency();
u64 systemTicksPerFrame = systemTicksPerSecond / 20; // 20 ticks per second
u64 startTicks = Os_timer_function();
u64 lastFrameTicks = startTicks;
while(true)
{
    u64 thisFrameTicks = Os_timer_function();
    u64 deltaTicks = thisFrameTicks - lastFrameTicks;
    while( deltaTicks >= systemTicksPerFrame )
    {
        deltaTicks -= systemTicksPerFrame;
        lastFrameTicks += systemTicksPerFrame;
        Update(lastFrameTicks/(double)systemTicksPerSecond,
               systemTicksPerFrame/(double)systemTicksPerSecond);
    }
}[/code]

The winmm timers are *terrible*. They skip, they jump, and at the end of the day, the best they can attempt to give you is milliseconds.
In C#/.NET, you should use System.Diagnostics.Stopwatch. Internally, it uses QueryPerformanceCounter, which is a much better timer in all ways than the old Windows-95-era winmm library.
If for some reason you don't like System.Diagnostics, use P/Invoke to call QueryPerformanceCounter and QueryPerformanceFrequency to build your own.
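The same principle applies outside .NET: prefer a monotonic, high-resolution clock over wall-clock or millisecond timers. As a sketch, here is the Stopwatch-style pattern in Python, where time.perf_counter plays the role QueryPerformanceCounter does on Win32 (the class is illustrative, not any library's API):

```python
import time


class FrameTimer:
    """Delta timing from a monotonic high-resolution clock.

    time.perf_counter() is monotonic and high resolution, so deltas
    computed from it never go negative and don't jump when the system
    wall clock is adjusted.
    """
    def __init__(self):
        self._prev = time.perf_counter()

    def delta(self):
        now = time.perf_counter()
        dt = now - self._prev
        self._prev = now
        return dt
```

Note the timer only ever computes differences between consecutive readings; the absolute value of the counter is meaningless on its own.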

[b]hplus0603[/b]: Maybe I should have explained myself better: I do use System.Diagnostics.Stopwatch for keeping time. But it's just a timer, as you obviously know, and I need a recurring event to drive the game ticks. That is what I use the [b]winmm.dll[/b] functionality for, since it can fire with 1ms accuracy and, to my knowledge, it's the only timed, recurring event mechanism on Win32 with that kind of accuracy. I'll post an example of my simulation loop below.

[b]Hodgman: [/b]Thanks for this, it's pretty much what I ended up with after tinkering a lot yesterday. Here's how a simplified version of my simulation loop looks:

[source lang="csharp"]long stepLength = 50;     // milliseconds
long currentStepTime = 0; // milliseconds

float deltaTime = 0; // seconds
float prevTime = 0;  // seconds
float now = 0;       // seconds

while(true)
{
    // Wait for any networking event or a simulation tick event
    switch(WaitHandle.WaitAny(new WaitHandle[] { networkEvent, tickEvent }))
    {
        case 0:
            processNetworkInput();
            break;

        case 1:
            while((elapsedTimeInMS() - currentStepTime) >= stepLength)
            {
                now = elapsedTimeInSeconds();
                deltaTime = now - prevTime;
                currentStepTime += stepLength;

                update(now, deltaTime);

                prevTime = now;
            }
            break;
    }
}[/source]

This seems to give me the best of both worlds:[list]
[*]I get a fixed step simulation
[*]I get a delta time between steps
[*]I can run several steps to catch up if I fall behind, but I still get a proper delta time (albeit small, as they run back-to-back) between each step.
[/list]
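The fixed-step catch-up logic at the heart of the loop above can be isolated as a pure function (Python sketch with illustrative names; times in milliseconds, matching the C# version):

```python
def steps_to_run(elapsed_ms, current_step_time_ms, step_length_ms=50):
    """Fixed-step catch-up: which step times are owed so far.

    Returns the list of step boundary times to simulate, plus the
    advanced step clock. Falling behind simply yields more entries,
    which the caller runs back-to-back to catch up.
    """
    steps = []
    while elapsed_ms - current_step_time_ms >= step_length_ms:
        current_step_time_ms += step_length_ms
        steps.append(current_step_time_ms)
    return steps, current_step_time_ms
```

Making this a pure function of (elapsed time, step clock) also makes the catch-up behaviour easy to unit-test without any real timer.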

I generally don't use the OS's timed events/waits, as they're pretty unreliable. For frame limiting in a game, I'd prefer a simple spin loop.
e.g.[code]// Takes the predicate by value so a temporary can be passed in.
template<class Fn>
void SpinUntil( Fn fn )
{
    while(true)
    {
        for( int i=0; i!=10; ++i )
        {
            for( int j=0; j!=10; ++j )
            {
                if( fn() )
                    return;
                YieldProcessor();
            }
            SwitchToThread();
        }
        if( fn() )
            return;
        SleepEx(1, TRUE);
    }
}

struct TimedWait
{
    TimedWait(u64 waitUntil) : waitUntil(waitUntil) {}
    u64 waitUntil;
    bool operator()()
    {
        return GetTicks() > waitUntil;
    }
};

void UpdateLoop()
{
    u64 nextFrameTime = GetTicks();
    while(!quit)
    {
        update();
        nextFrameTime += delta; // delta = timer ticks per frame
        SpinUntil( TimedWait(nextFrameTime) );
    }
}[/code]

GetTicks() is also a terrible timer -- it's even worse than timeGetTime(). Also, at some point, your program will just freeze, and never recover, because you compare absolute tick times (which wrap) rather than differences of time stamps.
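The wrap hazard described here can be sketched with 32-bit tick arithmetic (Python, masking to emulate an unsigned 32-bit counter like timeGetTime()'s; the function name is illustrative):

```python
MASK = 0xFFFFFFFF  # emulate an unsigned 32-bit tick counter


def ticks_elapsed(later, earlier):
    """Difference of two time stamps; correct even across a wrap.

    Unsigned modular subtraction gives the true elapsed tick count
    as long as the real interval is under 2**32 ticks.
    """
    return (later - earlier) & MASK


# Near a wrap: 'later' has wrapped past zero while 'earlier' has not.
earlier = MASK - 10  # 10 ticks before the counter wraps
later = 5            # 5 ticks after the wrap
```

The naive absolute comparison `later > earlier` is false here, so a loop waiting on it would freeze forever, while the subtraction-based difference still yields the correct 16 ticks.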

[quote name='hplus0603' timestamp='1329845446' post='4915217']GetTicks() is also a terrible timer -- it's even worse than timeGetTime().[/quote]Wait, GetTicks() is a real function? I just meant it as a placeholder for whatever the reader's actual timer function is called ;)

