Lockstep RTS: Time Scale



My lockstep RTS is coded to update the world ten times per second. If the player's frame rate can't keep up, the world will be updated at longer intervals (e.g. 5 FPS = 200 ms/frame = 5 world updates per second). The length of each world update ("tick") is dynamically calculated from the elapsed time since the last update. If the world is being updated ten times per second, then 100 ms of simulation time elapses on each tick.

So what happens when I want to speed up time, like some RTS games[1] allow you to do? For example, I could allow the player to speed up game time to 1000%, or ten times faster than normal (100%). The player's main loop would then need to run 100 times per second. What happens if it can't? What if the player's computer can only manage 30 FPS? I could run multiple ticks per frame, but I suspect that would lead to weird issues like jittery entity movement.

I can think of a few solutions, and I'd like to see what others have done and why:
1. Ignore the problem. If the player chooses a time scale of 1000%, but his game is only running at 50 FPS, the world will be updated at 500% time scale. I could restrict the player's maximum time scale based on system performance, but that seems hacky.
2. Don't allow 1000% time scale. Have the fastest possible time scale be what I currently consider "normal" speed. In other words, the setting "Very Fast" would update the world ten times per second (100 ms), "Normal" would run once per second (1000 ms), and "Very Slow" would only update the world once every ten seconds (10000 ms). In reality, I would probably use less extreme values (e.g. "very fast" = 20 ms, "normal" = 100 ms, "very slow" = 500 ms), but hopefully you get the idea.
Regardless of how often the world is updated, I will be using extrapolation each frame to keep entity movement smooth. I'm hoping to keep variations in tick length to a minimum, so that the player can watch a game's replay and select a time scale independent of what the original game was played at. Any thoughts or wild revelations?

[1] Starcraft (Very Slow, Slow, Normal, Fast, and Very Fast, IIRC), Kohan (25%, 50%, 100%, 200%, ... 1600%), Dawn of War, and Age of Empires come to mind.
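For concreteness, the "multiple ticks per frame" approach with a safety cap (i.e. option #1's degrade-gracefully behaviour) might look like this sketch. The function names, the cap value, and the overall structure are hypothetical, not from any real engine:

```python
# Hypothetical sketch: fixed 100 ms ticks, a time-scale multiplier, and a cap
# on ticks per frame so a slow machine degrades gracefully (option #1).
TICK_MS = 100.0           # simulated time advanced per tick, at 100% speed
MAX_TICKS_PER_FRAME = 8   # arbitrary cap; past it, the game runs slower than requested

def advance(accumulator_ms, frame_ms, time_scale, run_tick):
    """Accumulate scaled real time and run as many fixed ticks as allowed.

    Returns the leftover accumulator (usable for extrapolation when rendering).
    """
    accumulator_ms += frame_ms * time_scale
    ticks = 0
    while accumulator_ms >= TICK_MS and ticks < MAX_TICKS_PER_FRAME:
        run_tick()                # each tick always advances the world by TICK_MS
        accumulator_ms -= TICK_MS
        ticks += 1
    if ticks == MAX_TICKS_PER_FRAME:
        accumulator_ms = 0.0      # fell behind; drop the debt instead of spiraling
    return accumulator_ms
```

At 30 FPS and 1000% time scale, each 33 ms frame accumulates 330 ms and runs three ticks; if the backlog ever exceeds the cap, the effective speed simply drops below the requested one.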

If your replay file is formatted like this:

World state
State diffs
State diffs
...
State diffs
World state
State diffs

then you could have a fast mode that just skips from one world state to the next one.
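A sketch of that skip-ahead idea — the snapshot/diff layout and the `apply_diff` signature are hypothetical, just to show the seek-then-replay shape:

```python
# Hypothetical replay layout: periodic full snapshots with diffs in between.
# Fast-forwarding seeks to the last snapshot at or before the target tick,
# then applies only the remaining diffs.
def fast_forward(snapshots, diffs, target_tick, apply_diff):
    """snapshots: {tick: world_state}; diffs: {tick: diff} for every tick."""
    base_tick = max(t for t in snapshots if t <= target_tick)
    state = dict(snapshots[base_tick])       # start from the nearest keyframe
    for t in range(base_tick + 1, target_tick + 1):
        state = apply_diff(state, diffs[t])  # replay only the tail of diffs
    return state
```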

Quote:
Original post by doctorsixstring: My lockstep RTS is coded to update the world ten times per second. If the player's frame rate can't keep up, the world will be updated at longer intervals (e.g. 5 FPS = 200 ms/frame = 5 world updates per second). The length of each world update ("tick") is dynamically calculated from the elapsed time since the last update. If the world is being updated ten times per second, then 100 ms of simulation time elapses on each tick.

If you're using lockstep updates then the amount of game time elapsed per tick is a constant. I'm not going to go into graphics framerates because that can be an entirely separate issue; I'm just talking about logic ticks. If the system can't keep up with the logic ticks then it simply doesn't meet the minimum requirements (for that game speed).

To implement different game speeds, just use a different number of ticks per second (resulting in more game time elapsing per real second). In Starcraft I think there's a 16x replay mode that slower systems can have trouble keeping up with, so it just runs slower than 16x. And I think 16x would be sort of unplayable if the 1x speed were already reasonably fast.
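One way to read this as code (a sketch; the speed names and tick rates below are invented, not taken from any real game): the game time advanced per tick stays constant, and the speed setting only changes the real-time interval between ticks.

```python
# Game time per tick is a constant -- essential for lockstep determinism.
GAME_MS_PER_TICK = 100.0  # always 100 ms of *game* time, at every speed

# Speed settings just change how many ticks are attempted per real second.
SPEED_TICKS_PER_SEC = {"Slow": 5, "Normal": 10, "Fast": 20, "Ludicrous": 100}

def real_ms_between_ticks(speed):
    # At "Fast", a tick fires every 50 real ms but still advances 100 game ms.
    return 1000.0 / SPEED_TICKS_PER_SEC[speed]
```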

pTymN, that looks pretty close to what I'm planning. Time scale during replay file playback isn't really an issue. During playback I could simply skip frames if the player's FPS is lower than the tick rate. My issue is related more to tick execution while the game is running.

Quote:
Original post by Vorpy: If you're using lockstep updates then the amount of game time elapsed per tick is a constant.

Ah, that's the kind of info I was hoping to hear. I've been debating this with myself quite a bit. My thoughts are that a constant tick length would make the game speed feel uneven when framerate fluctuates. Although, I suppose that's probably not something the player would notice, right?

Quote:
Original post by Vorpy: If the system can't keep up with the logic ticks then it simply doesn't meet the minimum requirements (for that game speed).

It sounds like you're basically saying I should go with option #1, right? If the player chooses the "Ludicrous" speed setting on a low-performance machine, the game will simply run as fast as possible, even if the full requested speed is faster?

Quote:
Original post by doctorsixstring: It sounds like you're basically saying I should go with option #1, right? If the player chooses the "Ludicrous" speed setting on a low-performance machine, the game will simply run as fast as possible, even if the full requested speed is faster?

Probably the best idea. It also avoids potential issues where things work slightly differently if timesteps are skipped, which could lead to unanticipated cheat scenarios. By not advertising the exact speedup you can make this a non-issue.

BTW, if the computer can't keep up with "Ludicrous speed", the screen should go plaid.

Thanks for the replies, guys!

my understanding of lockstep += 1
ideas for easter eggs += 1    # thanks Sneftel!

Any chance you can run graphics and game code in separate threads?

AFAIK, traditional lockstep RTS games are single-threaded. I can't think of any reason why the rendering couldn't be done in a separate thread, though. I'll think about this more in the future, but for now I'm going to stick with my single-threaded design. I'd welcome any thoughts on this, either way.

Quote:
Original post by doctorsixstring: My thoughts are that a constant tick length would make the game speed feel uneven when framerate fluctuates. Although, I suppose that's probably not something the player would notice, right?

They will notice if you go about it the wrong way.
This is something that puzzled me for ages and I just recently figured out.
Also, you seem to have confused lockstep with a fixed logic rate.

You want your logic update steps to be fixed and independent of FPS to get consistent, repeatable behaviour. You also want a logic rate that's synchronized to the system clock, so slowdowns just skip frames. You also want to know what to display at a given moment, so you can show what's happening.

To do this you'll have:
- A system clock. This tells you the elapsed time since the program's start.
- A logic clock accumulator. This accumulates spare time not yet consumed by a logic step.
- A timestamped duplicate of the last updated visual data for each visible object.

You use these like so:
- Check the system clock to see how much time has passed since your last loop.
- Add this difference to the logic clock accumulator.
- If the logic clock accumulator is bigger than the fixed step timespan, loop through the steps marked * until it isn't:
  * Copy the current visual data to the last visual data buffer.
  * Do a logic step and subtract the fixed step timespan from the accumulator (i.e. if you do 20 updates per second, the fixed step timespan is 1/20 second).
- Now for your display: using the clock accumulator divided by the fixed step timespan as a blend factor, interpolate between the last visual data and the current visual data.

This will give you smooth visuals (visually interpolating between logic steps) and frame-rate-independent logic (reliable collisions and physics).
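The steps above can be sketched as a runnable loop. The names and the 20 Hz rate are illustrative; a real game would read a monotonic system clock each loop rather than being handed `frame_dt` directly:

```python
# Fixed logic rate with interpolated rendering, as described above.
FIXED_STEP = 1.0 / 20.0   # 20 logic updates per second

def frame(state, frame_dt):
    """One render frame: run 0..n logic steps, then return an interpolated position."""
    state["acc"] += frame_dt                       # accumulate real elapsed time
    while state["acc"] >= FIXED_STEP:
        state["prev_pos"] = state["pos"]           # save last visual data
        state["pos"] += state["vel"] * FIXED_STEP  # one fixed logic step
        state["acc"] -= FIXED_STEP                 # consume one step's worth of time
    alpha = state["acc"] / FIXED_STEP              # 0..1 blend factor for display
    return state["prev_pos"] + (state["pos"] - state["prev_pos"]) * alpha
```

Note this interpolates between the last two known states, so the rendered position trails the simulation by up to one step; the original poster's extrapolation approach trades that latency for occasional mispredictions.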

Tadaaaa (I may have got some stuff wrong, feel free to correct me if I did).

This is called a fixed logic rate. Lockstep (at least to my knowledge) refers to doing this across the network, where all players do every logic step together.

That sounds a bit different from the way I'm doing things, but I think the final result is similar. I think the big difference is that you execute multiple world updates in a single game frame if the frame length is long enough. That should help the world updates keep up with real time, even when the frame rate is low.

I probably could have been clearer with my terminology, too. I generally use the term "tick" to mean both a single update of the world simulation AND the time between network-synchronized calls to the world update function.