C++: Easiest way to implement an in-game Time/Date system?...

Started by
32 comments, last by Sean_Seanston 9 years, 3 months ago

Right now I want to put a time system in a game that will allow me to implement a day/night cycle as well as keep track of in-game days, years etc. passing by. Probably something like the game starts in a near future year, then maybe a real-life second is an in-game minute or some such, and time acceleration allows it to progress faster, with accurate months/years/weekdays etc. being displayed.

I'm pretty sure I've used something quite simple like this in Java... but this is definitely one of those areas where C++ is less immediately accessible...

I've looked around and I see Boost has a date/time library, though there's also ctime.h which allows dates in some way, but I suspect it's probably not adequate for my purposes.

TL;DR: Is using Boost pretty much the most practical way to implement an in-game time/date system?

I have used Boost before for a few things, but it might take some time to wade through the unnecessary stuff to find what I really need, so I thought I'd ask before delving in. Some of the examples also refer to a header file that doesn't exist in the version of Boost that came with the Unofficial OpenGL SDK for some reason... but I'll see if I can get around that...

C++ chrono library (part of the standard)

If you want to measure the passage of time while your game is running (for day/night cycles), get the current session time with std::chrono::steady_clock::now() (probably; check that its resolution is suitable for your game mechanics, and otherwise go with std::chrono::high_resolution_clock). When your user's session begins, get the start time, and when you need to calculate how much time has passed, get the current time. The amount of time passed is (currentTime - startTime).

If you want to measure bigger jumps of time, for example the passage of time while the user is logged off, use something like std::chrono::system_clock to get the real-life year/month/day/hour/minute time once, when the user very first starts a new game ("originalNewGameStartTime"), and then get it again each additional time he loads the game ("currentSessionStartTime").


amountOfRealLifeTimeThatPassed = (currentSessionStartTime - originalNewGameStartTime);
currentInGameTime = SomeFormula(amountOfRealLifeTimeThatPassed);
The key point to recognize in both these situations is that you don't actually care about (and don't want to know) the absolute values of your time measurements. You only want to know the duration between the two measurements.
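As a concrete sketch of that pattern (the function name and the 50 ms "work" are my own illustration, not from any particular codebase), measuring a session's elapsed time with std::chrono might look like:

```cpp
#include <chrono>
#include <thread>

// Measure elapsed wall-clock time with a monotonic clock. The sleep is
// just a stand-in for "some gameplay happening".
long long measureElapsedMs() {
    using Clock = std::chrono::steady_clock;

    const Clock::time_point startTime = Clock::now();   // session start

    std::this_thread::sleep_for(std::chrono::milliseconds(50));

    const Clock::time_point currentTime = Clock::now(); // later reading
    // Only the difference matters; the absolute time_point values are opaque.
    return std::chrono::duration_cast<std::chrono::milliseconds>(
               currentTime - startTime).count();
}
```

Because steady_clock is monotonic, this duration can never go backwards even if the user changes the system clock mid-session.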

For example, if you start your game and try to measure the number of milliseconds, the raw measurement gives you a seemingly arbitrary value like 735343434. It doesn't matter what value it gives you; what matters is the amount of time between startOfMeasurements and endOfMeasurements.
(end - start) = interval

start = doesn't matter
end = doesn't matter
interval = what you care about 

One elapsed second == 1 simulator minute is a fairly common value in life simulators. You'll find that value in major games like The Sims. One simulated day lasts just under a half hour, giving enough time for players to control their characters.

But be very, very careful about figuring out real life time. DO NOT USE THE CURRENT CLOCK TIME, the OS provides elapsed time stopwatch values you should use instead.

Real life time skips around. It is not always consecutive, sometimes jumping a few seconds or microseconds, sometimes jumping hours or even days. Sometimes it moves backwards. The most obvious examples are things like daylight saving time and leap seconds to adjust the clock. Slightly less obvious are the usually minor clock adjustments on your system clock when it re-synchronizes with time servers. A user could alt-tab out of their game, adjust their system clock forward or back several years, and resume playing the game.

So be careful in converting real life time into simulator elapsed time. A person playing at 2:00 AM on daylight saving adjustment day may be upset when their clock switches, either because now they need to wait for two simulator days for the real life clock to catch up, or because the game suddenly launched forward two simulator days.

But assuming you correctly compute the elapsed real-world time, keeping simulator time is straightforward.

Use a simple counter preserved in the simulation. The epoch, the beginning of time, is zero. Every minimum time unit is +1. Do not tie that to real-life time. Simulator time 0 could be midnight starting the Sunday of week 0 of the game, but you could start a new game at 7:30 AM on Tuesday of week 1, a simulator clock time of 804600 (9 days and 7.5 hours, in seconds). A player might speed up time, making a simulator minute take one second, two seconds, or ten seconds; a player might slow down time by making a simulator minute take twenty seconds, or even a full clock minute.

Make sure animations, effects, motion, and other elements are based on simulator time, not real-life clock time. If the player fast-forwards the game, you want animations to play completely but quickly; if they slow down time, you want them to move correspondingly slowly. The same goes for other game content, potentially even sound: base it on simulator time rather than the player's wall clock.

So if you were building a life simulator and decided a simulator tick equals one second, all it takes is a bit of division to get time of day and other factors. You've got:

CurrentSecondOfDay = SimTime % 86400;

CurrentHour = CurrentSecondOfDay / 3600;

CurrentMinute = (CurrentSecondOfDay % 3600)/60;

etc.

You can build a bunch of similar functions to get the current day of the week or the current year of the calendar. I strongly recommend you also simplify time to a 28 day month for easy calculation, but you can make it as complex as you want.
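Putting those divisions together under the simplified calendar (28-day months, 7-day weeks; all constant and function names here are illustrative, not from the thread's code), the helpers might look like:

```cpp
// Calendar helpers built on a simulator tick counter where one tick equals
// one simulated second, using the simplified 28-day-month calendar.
const long long kSecondsPerDay   = 86400;
const long long kDaysPerMonth    = 28;
const long long kMonthsPerYear   = 12;
const long long kSecondsPerMonth = kSecondsPerDay * kDaysPerMonth;
const long long kSecondsPerYear  = kSecondsPerMonth * kMonthsPerYear;

long long currentSecondOfDay(long long simTime) { return simTime % kSecondsPerDay; }
long long currentHour(long long simTime)   { return currentSecondOfDay(simTime) / 3600; }
long long currentMinute(long long simTime) { return (currentSecondOfDay(simTime) % 3600) / 60; }
long long currentDayOfWeek(long long simTime) { return (simTime / kSecondsPerDay) % 7; } // 0 == Sunday
long long currentMonth(long long simTime)  { return (simTime / kSecondsPerMonth) % kMonthsPerYear; }
long long currentYear(long long simTime)   { return simTime / kSecondsPerYear; }
```

For instance, a simulator time of 804600 (midnight Sunday of week 0 as the epoch) works out to 7:30 AM on the 9th day, a Tuesday.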

Then you can work with whatever time-related simulation values you need. Light of the sky may change based on the current second of the day, certain triggered events may happen at certain minutes, in-game events can happen on computed days of the week or computed weeks of the year.

C++ chrono library (part of the standard)

Interesting... I didn't know about that. Though when I do "#include <chrono>" it says it can't find it... is it something new with C++11, by any chance? I'm still using MSVC++ 2008 Express on this old computer...

Seems you'd still have to use ctime to get date functionality, if I've read that page correctly. That seems to mostly be useful for telling the current date. Maybe it could be offset somehow to tell a future date, but I feel it might become fiddly and probably a more complex system would be in order ideally...

As for boost, I tried to use the boost::gregorian::date class, and while it included ok and intellisense picked it up... I get an error when I try to run/compile it:

1>LINK : fatal error LNK1104: cannot open file 'libboost_date_time-vc90-mt-gd-1_53.lib'

Hmmmmm. There isn't such a file in the boost folder that came with glsdk, where the include files are.

Also hate how the documentation for the class doesn't appear to list all the required includes/libs etc.:

http://www.boost.org/doc/libs/1_57_0/doc/html/date_time.html

(If indeed it's normal that I need that lib file at all... perhaps something else is wrong somewhere else...)

I'll go see if I can find more information on how boost's date_time works.

EDIT: And actually, now that I think about it, I almost certainly will need something rather sophisticated because my idea is a strategy game involving a large amount of traveling between various places in the world. So I may need some degree of time zone support, depending on how I eventually decide to actually implement time. A fictional near-future setting could go far towards simplifying things for the sake of both development and gameplay convenience, so it wouldn't have to follow the real-world necessarily, but it would definitely be good to have the flexibility for later on.

Interesting... I didn't know about that. Though when I do "#include <chrono>" it says it can't find it... is it something new with C++11, by any chance? I'm still using MSVC++ 2008 Express on this old computer...


Yes.

You can use Boost to get a work-alike.

As for boost, I tried to use the boost::gregorian::date class, and while it included ok and intellisense picked it up... I get an error when I try to run/compile it:
1>LINK : fatal error LNK1104: cannot open file 'libboost_date_time-vc90-mt-gd-1_53.lib'


Not all of Boost is header only. You'll have to build some of Boost's libraries. The date_time library is one such. The Boost documentation has information on how to build it with their custom-ish build system.

So I may need some degree of time zone support, depending on how I eventually decide to actually implement time.


Remember that you're making a game and not a simulation (I think). Time zones just piss people off in real life. Ask yourself very hard if they'll add to the fun or hinder it. If you're aiming hard for simulation over gameplay, ask yourself whether any piece of complexity will actually make a difference in the simulation to the point that the user will notice it and appreciate its addition. Personally, I'd think time zones or the like would be an anti-feature and unnecessary complexity.

Sean Middleditch – Game Systems Engineer – Join my team!

As for boost, I tried to use the boost::gregorian::date class, and while it included ok and intellisense picked it up... I get an error when I try to run/compile it:
1>LINK : fatal error LNK1104: cannot open file 'libboost_date_time-vc90-mt-gd-1_53.lib'


Not all of Boost is header only. You'll have to build some of Boost's libraries. The date_time library is one such. The Boost documentation has information on how to build it with their custom-ish build system.

Found some information about that here...

http://www.boost.org/doc/libs/1_57_0/more/getting_started/windows.html

Though it says date_time's binary component is only needed "if you're using its to_string/from_string or serialization features, or if you're targeting Visual C++ 6.x or Borland."... I guess the Gregorian stuff must use to/from string functionality somewhere and not work at all if you don't use the binary.

That build system thingy... is that the bcp program I've heard about? Apparently it tells you everything a library needs by copying it into a folder or some such... which should be useful.

Time zones just piss people off in real life. Ask yourself very hard if they'll add to the fun or hinder it. If you're aiming hard for simulation over gameplay, ask yourself whether any piece of complexity will actually make a difference in the simulation to the point that the user will notice it and appreciate its addition. Personally, I'd think time zones or the like would be an anti-feature and unnecessary complexity.

I think it's something worth experimenting with in the context of what I have in mind. And some very full-featured library like presumably boost date_time would make that kind of experimentation a lot easier. I may settle on some kind of universal time (could even be shoehorned in as a plot point), or just a few time zones implemented in an unintrusive way that merely serves to make things more intuitive, e.g. so it isn't dark where a character is when the only clock reads 1pm.

Could be madness too, but I think only iteration can tell me for sure. Definitely not going to invest too much time into it or make it some kind of core mechanic where the player has to do complicated time maths in their head or anything, but some things may not make sense otherwise and it may add to the strategy.

Your game time shouldn't really be coupled in any direct way to real-life time unless you're writing a game like Animal Crossing or something, where real-life time is the basis of the game. But that would require different code to account for it.

In general you'll end up thinking of games as having their own timelines, and all your internal code does is pass around delta time. At a simple level, think of time passing in your game: say you want every second of real time to advance your in-game clock by an hour. If you pause the game, the two clocks will immediately become desynced.

Usually you just want to take a delta time at each iteration of your game loop and then use that as the basis to advance your game time. I.e., if you take the time at 15 ms since your program started and it has been 5 ms since the last loop iteration, you essentially "pretend" all your game logic is now happening at the 15 ms mark, even though the code will actually take real time to complete.

The important part comes down to how and when you pass that time to your internal code, and whether you manipulate it at all. That's how you can have effects like making a game run in slow motion: by distorting the "real life" delta time.
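A minimal sketch of that idea (GameClock, timeScale, and tick are illustrative names, not a real engine API):

```cpp
// The game loop measures a real delta time, scales it, and only the
// scaled value ever reaches game logic.
struct GameClock {
    double timeScale;   // 0.5 == slow motion, 2.0 == fast forward
    double simTime;     // accumulated simulator time in seconds

    GameClock() : timeScale(1.0), simTime(0.0) {}

    // Called once per loop iteration with the measured real delta time.
    double tick(double realDeltaSeconds) {
        const double simDelta = realDeltaSeconds * timeScale;
        simTime += simDelta;
        return simDelta;    // pass this to update(), not the real delta
    }
};
```

Game logic only ever sees the value returned by tick(), so slow motion or fast forward is just a matter of changing timeScale.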


... maybe a real-life second is an in-game minute or some such, and time acceleration allows it to progress faster...
Is it just me, or are the posts here assuming the in-game time will increase at a more or less constant rate?

If you plan to run the simulation for a long time with "time scale" changes, I suggest against using a single reference point in time. In the past I've had some nasty problems with accumulation errors, and I cannot be completely sure the new frameworks solved them. Keep around a list of "time speed" changes and reset your time reference at each time-multiplier change.
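One way to implement that suggestion, sketched with made-up names: bank the simulated time accumulated so far whenever the speed multiplier changes, so each measurement only spans the segment since the last change rather than the whole session.

```cpp
// Scaling the full distance from a single fixed start point magnifies
// floating-point error as the session grows; closing out each segment
// when the speed changes keeps the spans short.
struct ScaledTimeline {
    double bankedSimSeconds;  // sim time up to the last speed change
    double referenceReal;     // real time of the last speed change
    double speed;

    ScaledTimeline() : bankedSimSeconds(0.0), referenceReal(0.0), speed(1.0) {}

    void setSpeed(double newSpeed, double nowReal) {
        bankedSimSeconds += (nowReal - referenceReal) * speed; // close out old segment
        referenceReal = nowReal;
        speed = newSpeed;
    }

    double simTime(double nowReal) const {
        return bankedSimSeconds + (nowReal - referenceReal) * speed;
    }
};
```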

Previously "Krohm"


... maybe a real-life second is an in-game minute or some such, and time acceleration allows it to progress faster...
Is it just me, or are the posts here assuming the in-game time will increase at a more or less constant rate?

If you plan to run the simulation for a long time with "time scale" changes, I suggest against using a single reference point in time. In the past I've had some nasty problems with accumulation errors, and I cannot be completely sure the new frameworks solved them. Keep around a list of "time speed" changes and reset your time reference at each time-multiplier change.

That is an (important) implementation detail.

If you have simulation clock based events, you need to make sure all the simulation clock events trigger. It doesn't matter if the rate is one simulator tick per wall-clock second, or a thousand simulator ticks per wall-clock second, or a "skip two weeks" debugging cheat code, your implementation needs to ensure that anything tied to the clock is handled "properly", for whatever your game's definition of that means.

For example, if you've got an event that a player's buff gets cancelled after 525600 ticks, and you jump your simulator clock forward by a million ticks through a cheat code, you will need to ensure that important action is still triggered. Other things, like ensuring a particle system has its particles updated at a constant rate, are probably not important events, and those could be skipped over. That is all game-specific implementation detail.
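One way to guarantee that (a sketch with illustrative names, not a prescription) is a due-time priority queue: advancing the clock, by any amount, pops and fires everything that came due inside the jump, in order.

```cpp
#include <cstdint>
#include <functional>
#include <queue>
#include <vector>

// An event scheduled for a specific simulator tick.
struct TimedEvent {
    std::int64_t dueTick;
    std::function<void()> action;
    bool operator>(const TimedEvent& other) const { return dueTick > other.dueTick; }
};

struct EventScheduler {
    // Min-heap ordered by dueTick, so the next event due is always on top.
    std::priority_queue<TimedEvent, std::vector<TimedEvent>,
                        std::greater<TimedEvent> > queue;
    std::int64_t simTick;

    EventScheduler() : simTick(0) {}

    void schedule(std::int64_t dueTick, std::function<void()> action) {
        queue.push(TimedEvent{dueTick, action});
    }

    // Advance the clock by any amount; every event that falls inside the
    // jump still fires, in due-tick order.
    void advance(std::int64_t ticks) {
        simTick += ticks;
        while (!queue.empty() && queue.top().dueTick <= simTick) {
            queue.top().action();
            queue.pop();
        }
    }
};
```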

Ok, well I've been trying to get my head around actually using boost::posix_time and while I'm not 100% clear on various best practices, I think I know enough to at least botch together something reasonable once I have the right idea about the game logic.

Here's my current critical issue:

- I've been able to get a time and date displayed, and I can seemingly set it to whatever time/date I want and progress it by so many hours/days etc. without too much trouble (hopefully date problems don't arise later on, but I'll deal with that then...). I even got the seconds ticking forward like it was an actual clock.

HOWEVER... while I've timed it and it seemed to me that e.g. 10 seconds was indeed taking about 10 seconds to pass, the rate at which a second was added to the timer wasn't always constant. Sometimes a number was clearly displaying for longer than others, then soon after a number would go by more quickly and compensate. Always seemed to average out alright but it looked ugly and distracting.

I've messed around but I haven't been able to fix it. I've looked around at my game loop and thought about how many times the logic update() and render() functions are being called, and the ratios between them and blah blah but I seem to be stumped at this rather basic problem. I guess it doesn't have to be EXACT and it does seem to average out to the right time, but I assume the human eye is going to notice differences of what might be 250 - 500 ms or more between ticks.

My game loop is based around the one in this well-known article here:

http://www.koonsolo.com/news/dewitters-gameloop/

(With the same 25 ticks per second, 40 skip ticks and max frameskip of 5 used in the example)

I guess the problem is that the gap between every logical frame will likely be different, and the timer is presumably only showing the increase of whole seconds, so one time it might do something like... 1946ms + 54ms = 2 seconds, then perhaps go 22 ticks without quite going over the 3000ms threshold, before doing 2994ms + 43ms = 3037ms, thereby making each second come up at a different rate.

Rendering of game objects is solved by using the interpolation value to draw between 2 logical frames, but if that's the obvious proper solution here then I haven't been able to get it to work... I may have missed something, or it could be that I'm misusing the library, though I think I'm using it okay, if a little sloppily.

What's the obvious method I'm missing here? It must be simple in theory to display a clock with apparently perfect consistency. Or is 25 updates per second just not enough? I suspect it would be plenty to at least be imperceptible to the human eye.
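For what it's worth, one common way to get a steady display with a fixed-timestep loop like deWitters' (a sketch of the general technique, not necessarily the right fix here; names and constants are illustrative): advance a game-time counter by exactly the tick length each logic update, and at render time add the interpolation fraction of a tick before truncating to whole seconds, the same way object positions are interpolated.

```cpp
// Fixed-timestep clock: gameTimeMs only ever grows by exact kMsPerTick
// steps, so the displayed second advances at a perfectly even rate
// regardless of when the render passes happen to land.
const long long kMsPerTick = 40;   // 25 logic updates per second

struct SimClock {
    long long gameTimeMs;

    SimClock() : gameTimeMs(0) {}

    void update() { gameTimeMs += kMsPerTick; }   // once per logic tick

    // interpolation in [0,1): fraction of the next tick already elapsed,
    // the same value used to interpolate rendered object positions.
    long long displaySeconds(double interpolation) const {
        const double ms = static_cast<double>(gameTimeMs) + interpolation * kMsPerTick;
        return static_cast<long long>(ms / 1000.0);
    }
};
```

The point is that the clock display is derived from accumulated simulator time, never read off the wall clock directly, so jitter in when updates run can't make a second linger or flick past.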

Also: I just did some testing with OutputDebugString(). I notice I often get 3 or 4 rendering passes in a row with the same interpolation value. Is that normal...? Then again, maybe when GetTickCount() is being added to another value and having the result divided, that's perfectly expected since the effect of a few small ticks on such a result might be minimal.

Just poking around trying to understand better what's going on...

This topic is closed to new replies.
