gsgeek

How to devise a "game clock/calendar" for a strategy game?


Recommended Posts

Hi,

I'm starting to prototype a medieval strategy/simulation game with pygame. I want the core of the game to be built around a "game clock/calendar" (for lack of a better term), that is, an object tracking the passage of in-game time. I want the player to be able to pause/unpause this "game clock" and change its speed.

 

I want the different actors in the game to take their actions effectively simultaneously (though I guess it would be sensible for each agent to only evaluate a given action over a certain period, e.g. check whether to marry off his daughters only once every in-game month).
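One way to sketch that "evaluate only every in-game month" idea: each agent remembers the game tick at which its next expensive evaluation is due. All names here (`Agent`, `TICKS_PER_MONTH`, the marriage check) are illustrative assumptions, not part of any existing design.

```python
# Sketch: agents that only re-evaluate expensive decisions at fixed
# in-game intervals. All names and tick sizes are illustrative.

TICKS_PER_DAY = 24
TICKS_PER_MONTH = 30 * TICKS_PER_DAY  # assumed 30-day months

class Agent:
    def __init__(self, name):
        self.name = name
        self.next_marriage_check = 0   # game tick of the next evaluation
        self.marriage_checks = 0

    def update(self, game_tick):
        # Cheap per-tick logic would go here.
        if game_tick >= self.next_marriage_check:
            self.consider_marriages()
            self.next_marriage_check = game_tick + TICKS_PER_MONTH

    def consider_marriages(self):
        self.marriage_checks += 1      # placeholder for the expensive decision

agents = [Agent("lord_a"), Agent("lord_b")]
for tick in range(TICKS_PER_MONTH * 3):  # three in-game months
    for agent in agents:
        agent.update(tick)
```

Staggering each agent's initial `next_marriage_check` would also spread the expensive evaluations across frames instead of running them all on the same tick.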

 

So my problem is how to code this in a way that fulfills these requirements and makes the passage of time uniform (i.e. every "tick" of the "clock" happens every n real-time seconds, depending on the speed setting).

 

The main interface will be a Risk-style map that the player can zoom in and out of and pan around. I figure it will not be graphics-intensive, but I don't want the AI calculations to make the game stutter.

 

So, do you have any advice on how to approach this problem?

 

Sorry if these explanations come across as vague or ambiguous. If so, tell me and I'll try to explain better whatever I may have failed to convey.


Every game needs a concept of time. Old arcade games had the hardware update at a fixed real-world rate, so they simply moved things by amounts matching the desired real-world elapsed time. Later we moved to using timers to measure the passage of time between points. Game time is just a virtual construct; you should be thinking about everything in real-world time.

 

How you approach the problem depends a lot on the specifics. For instance, think of the old SimCity games: they had ways to let you pass time, but when time ran at an accelerated rate, the wandering visual characters onscreen still generally moved at the same rate. Clearly the visual update rate was decoupled from the in-game world time.

 

On the other hand, a game like The Sims couples them together completely: when you speed up time, the characters all move, think, and interact in half the time it would normally take them, or less. Your game loop is conceivably still updating at the same rate, so how are they moving faster?

 

One simple way to see it: you may have 100 ms or so pass between game update loops, but the code for, say, simulating a unit moving across the screen may only take 0.01 ms to run. You could then run it dozens of times while consuming very little real time, and the units would appear to move at an accelerated rate. So it stands to reason that some of your game code should be scaled by the game time multiplier and some of it should not. You may want the units to move quickly when time is sped up, but you probably don't want the UI to start doing things 2-3x as fast.
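That split between scaled and unscaled code can be sketched by handing the world a scaled delta and the UI the raw one. The `World` and `Ui` classes are stand-ins for whatever systems the game actually has:

```python
# Sketch: split frame time into a scaled "game" delta and an unscaled
# "real" delta, so the world speeds up while the UI does not.
# World and Ui are illustrative stand-ins, not a real API.

class World:
    def __init__(self):
        self.elapsed = 0.0
    def update(self, dt):
        self.elapsed += dt          # simulation advances by game time

class Ui:
    def __init__(self):
        self.elapsed = 0.0
    def update(self, dt):
        self.elapsed += dt          # tooltips, animations run in real time

time_scale = 2.0                    # player-selected game speed
world, ui = World(), Ui()

for _ in range(10):                 # ten frames of 100 ms real time each
    dt_real = 0.1
    dt_game = dt_real * time_scale  # only the simulation delta is scaled
    world.update(dt_game)
    ui.update(dt_real)
```

Pausing then falls out for free: set `time_scale` to 0 and the world freezes while the UI keeps animating.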

Which leads me to my point: it kind of depends on what requirements you have for things moving at accelerated time. For the general case, you maintain a counter of the current "game time ticks." Say the game clock starts at 0 and every increment represents one second of game-world time, and you want one second of game-world time to equal 100 ms of real-world time. All you have to do then is accumulate real-world time into a counter and "consume" it, translating it to game time. I.e. if 314 ms have passed, a little math gives you 3 elapsed game seconds with 14 ms left over. You might notice that in this scenario you are technically "in between" game seconds. On the bright side, a scheme like this is easy to scale: to run the game at double speed, you just multiply the incoming time.
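That accumulate-and-consume scheme is small enough to show directly. This is a minimal sketch, assuming 100 ms of real time per game second as in the paragraph above; the class and attribute names are made up for the example:

```python
# Sketch of the accumulate-and-consume scheme: real milliseconds pour
# into an accumulator and are consumed in 100 ms chunks, each worth one
# game second. Changing speed just multiplies the incoming real time.

MS_PER_GAME_SECOND = 100

class GameClock:
    def __init__(self):
        self.accumulator_ms = 0
        self.game_seconds = 0
        self.speed = 1.0            # 2.0 = double speed, 0.0 = paused

    def advance(self, real_ms):
        self.accumulator_ms += real_ms * self.speed
        while self.accumulator_ms >= MS_PER_GAME_SECOND:
            self.accumulator_ms -= MS_PER_GAME_SECOND
            self.game_seconds += 1  # fire once-per-game-second logic here

clock = GameClock()
clock.advance(314)   # 3 game seconds elapse, 14 ms left over
```

The leftover 14 ms is exactly the "in between game seconds" state described above; it simply waits in the accumulator for the next frame.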

 

Since game time is a simple value, it is then easy to have in-game events happen after a certain amount of game time has elapsed instead of real-world time. However, you have to make sure you update things that depend on each other in "blocks." If you had a platformer and could speed up time, you would want the hero to run into spikes and die twice as fast at double speed; you wouldn't want him to take a double-distance step, appear in the middle of the spike block, and THEN suddenly get hit twice by it and die instantly. The difference is that in the correct case the whole set of events is run together in a loop that happens twice, rather than each event being run multiple times by itself.
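The "blocks" point can be sketched as running the complete fixed-size update once per speed multiplier, so collision checks stay interleaved with movement. The spike setup and step sizes below are invented for illustration:

```python
# Sketch contrasting the two approaches: at double speed, run the whole
# fixed-size update block twice rather than taking one double-length
# step. Positions, step size, and the spike hazard are illustrative.

STEP = 1             # fixed movement per update, in world units
SPIKE_AT = 5         # a hazard sits at this position

def run(speed_multiplier, steps):
    pos, hits = 0, 0
    for _ in range(steps):
        # Run the complete update block once per multiplier, so the
        # collision check happens after every single movement step.
        for _ in range(speed_multiplier):
            pos += STEP
            if pos == SPIKE_AT:
                hits += 1
    return pos, hits

pos, hits = run(speed_multiplier=2, steps=5)  # reaches 10, hit once at 5
```

With the naive alternative (`pos += STEP * speed_multiplier` in a single block), the hero would jump from 4 to 6 and the check `pos == SPIKE_AT` would never fire, which is exactly the tunneling problem described above.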

So my problem is how to code this in a way that fulfills these requirements and makes the passage of time uniform (i.e. every "tick" of the "clock" happens every n real-time seconds, depending on the speed setting).

Caveman 3.0 does everything you describe: it has a game clock, and it has accelerated time.

The game clock is incremented at the beginning of update_all(). The game clock has frames/turns/updates (at 15 Hz), seconds, minutes, hours, days, and years. Update runs at 15 Hz.

For accelerated time, it varies the number of updates per render. At normal game speed, the game runs in true real time: 1 game second = 1 real-world second. By comparison, Skyrim runs at 20x-30x, and The Sims 3 runs at ~60x.

Note that the update multiplier does not equal the real-time speedup. I.e. if you update 2x per render, the game does not run at twice the speed; it runs at somewhat less than twice the speed, because you cut out a render but not an input poll or an update. For a true doubling of the game speed, you'd multiply all update deltas by the acceleration rate. Caveman also does something like this: it has a user-defined target FPS and a variable framerate limiter. These determine the desired frame time and thus the update multiplier used to scale all update deltas, which means the game runs at the same speed at different user-defined frame rates.
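The "scale all update deltas" approach can be illustrated in a few lines. This is a sketch of the general idea only, not of Caveman's actual code; the 15 Hz base rate is taken from the post above:

```python
# Sketch: scaling every update delta by the acceleration rate gives an
# exact speedup, unlike merely running more updates per render (which
# still pays input/update overhead). Function names are illustrative.

BASE_DT = 1.0 / 15.0     # 15 Hz base update rate, as described above

def simulate(accel, updates):
    game_time = 0.0
    for _ in range(updates):
        dt = BASE_DT * accel   # every update delta is scaled
        game_time += dt        # world_update(dt) would go here
    return game_time

normal = simulate(accel=1.0, updates=15)  # one real second of updates
double = simulate(accel=2.0, updates=15)  # same wall time, 2x game time
```

The same number of updates (and thus the same CPU cost per real second) yields exactly twice the elapsed game time, which is the "true doubling" the update-multiplier trick only approximates.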

 

Update takes the form of:

do everything that gets done each frame.

if frame==0 { do once per second stuff }

if frame==0 && seconds==0 { do once per minute stuff }

and so on.
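The cascading calendar described above can be made concrete as a tick with carry, roughly like an odometer. This is a sketch, not Caveman's code; the 360-day year is an arbitrary assumption for the example:

```python
# Runnable sketch of a cascading game calendar: 15 frames per second,
# with carry into seconds, minutes, hours, days, and years. The 360-day
# year is an assumption made for the example.

FPS = 15

class Calendar:
    def __init__(self):
        self.frame = self.second = self.minute = 0
        self.hour = self.day = self.year = 0

    def tick(self):
        self.frame += 1
        if self.frame < FPS:
            return
        self.frame = 0
        self.second += 1           # once-per-second work hooks in here
        if self.second < 60:
            return
        self.second = 0
        self.minute += 1           # once-per-minute work hooks in here
        if self.minute < 60:
            return
        self.minute = 0
        self.hour += 1
        if self.hour < 24:
            return
        self.hour = 0
        self.day += 1
        if self.day < 360:
            return
        self.day = 0
        self.year += 1

cal = Calendar()
for _ in range(FPS * 61):          # 61 game seconds of updates
    cal.tick()
```

Each early `return` is the equivalent of the `frame==0 && seconds==0 && ...` chains above: the rarer work only runs when every faster counter has just rolled over.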

 

PS: you don't say it explicitly, but it appears you're talking about a real-time game, not a turn-based one, right?

Edited by Norman Barrows


Hi gsgeek,

 

Usually when you program a game, you have the main game loop on a timer. This lets you lock the game loop to a specific frame rate (30 fps, 60 fps, etc.). So if you had a timer in your main game loop and allowed the player to change the fps-lock variable, you could speed up or slow down the game simulation. The only limit would be how fast the player's computer can display graphics. Hope that helps.
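A minimal sketch of that frame-rate-limited loop, with the fps as a variable the player could change: in pygame this role is usually played by `pygame.time.Clock().tick(fps)`, but a plain-Python stand-in is shown here so the example is self-contained.

```python
# Sketch of a frame-rate-limited loop whose rate could be exposed to
# the player. A pygame version would call pygame.time.Clock().tick(fps)
# each frame; this stand-in limiter shows the same idea directly.

import time

def run_frames(fps, n_frames, sleep=time.sleep, now=time.monotonic):
    frame_time = 1.0 / fps
    start = now()
    for i in range(n_frames):
        # ... update and draw the simulation here ...
        next_frame = start + (i + 1) * frame_time
        delay = next_frame - now()
        if delay > 0:
            sleep(delay)           # cap the loop at the requested fps
    return now() - start

elapsed = run_frames(fps=60, n_frames=6)  # roughly 0.1 s of wall time
```

Note the caveat raised elsewhere in this thread: speeding the simulation up by raising the fps lock also speeds up rendering and input, so it works best for simple games where the whole loop scales together.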

 

Derek

Usually when you program a game, you have the main game loop on a timer.

Most folks around here prefer fix-your-timestep to framerate limiters.

In the case of fix-your-timestep, you'd simply divide DT by your acceleration rate: 2x speed = DT / 2. Elapsed time (ET) is consumed in DT-sized chunks; cut DT in half and the physics runs twice as fast.

That may be how SGTM (set global time multiplier) works in Skyrim.
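The DT-division trick can be sketched as a consume loop. This is a minimal illustration of the idea, with an assumed 30 Hz base step; it is not a full fix-your-timestep loop (no interpolation, no spiral-of-death guard):

```python
# Sketch: fix-your-timestep with acceleration applied by shrinking DT.
# Elapsed real time is consumed in DT-sized chunks, so halving DT makes
# the physics step run twice as often per real second.

BASE_DT = 1.0 / 30.0        # assumed 30 Hz base physics rate

def consume(elapsed, accel):
    dt = BASE_DT / accel    # 2x speed => DT / 2
    steps = 0
    while elapsed >= dt:
        elapsed -= dt
        steps += 1          # physics_step(dt) would go here
    return steps, elapsed   # leftover time carries into the next frame

steps_1x, _ = consume(elapsed=0.1, accel=1.0)
steps_2x, _ = consume(elapsed=0.1, accel=2.0)
```

For the same 100 ms of real time, doubling the acceleration doubles the number of physics steps taken, each with a smaller dt, so simulation accuracy is preserved while game time advances twice as fast.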

Edited by Norman Barrows
