ic0de

What's a reasonable timestep?


You are making some common mistakes I have outlined here: [url="http://lspiroengine.com/?p=378"]Fixed-Time-Step Implementation[/url].

The first is that you should never handle absolute time with floating-point numbers; [color=#000080]float[/color] is only useful for delta values, the time since the last update.
So, first, always start your game time from 0 and store it in an unsigned long long (a 64-bit unsigned integer) with a resolution in microseconds (as Hodgman mentioned, milliseconds are too coarse). This is explained in more detail in my link.


Secondly, your method for pausing is very much a hack. I used essentially the same thing when I was first starting out, and only on my next project did I realize how lucky I had been that it worked at all; making it work correctly in all areas took a huge amount of effort.
Mentioned in my link are virtual timers.
Your time class (another problem is that you don't have one of these) should maintain both the actual time and the virtual time. The virtual time is the time that does not advance when the game is paused: when the class is told to update and advance time, it skips its virtual timers if the game is paused. Since they don't update while paused, the time now and the time of the last update are the same, so the delta is 0, and the physics simulation moves objects by t=0; in other words, they don't move.

This is basically the industry-standard way to implement pausing in games.
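A minimal sketch of that idea (the class and member names here are illustrative, not L. Spiro's actual engine code), keeping all absolute times as 64-bit microsecond counts and only exposing the per-step delta as a float:

[code]
#include <cstdint>

class GameTime {
public:
    // Call once per frame with the absolute time in microseconds since the game started.
    void Update(uint64_t nowUs, bool paused) {
        uint64_t delta = nowUs - m_realTimeUs;   // real delta; always advances
        m_realTimeUs = nowUs;
        if (!paused) {
            m_virtualTimeUs += delta;            // virtual time is frozen while paused
            m_virtualDeltaUs = delta;
        } else {
            m_virtualDeltaUs = 0;                // delta of 0 -> physics moves nothing
        }
    }

    uint64_t VirtualTimeUs() const { return m_virtualTimeUs; }
    float    VirtualDeltaSeconds() const { return m_virtualDeltaUs * 1.0e-6f; }

private:
    uint64_t m_realTimeUs     = 0;   // advances even while paused (menus, UI animations)
    uint64_t m_virtualTimeUs  = 0;   // game/physics time; frozen while paused
    uint64_t m_virtualDeltaUs = 0;
};
[/code]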


The main issue is that you are using too many floating-point values.
Change these to 64-bit unsigned integers with a microsecond resolution and report back. You will likely see a smoother, stutter-free result from this change alone.
Once again, this is all outlined in the link above.


L. Spiro
[quote name='L. Spiro' timestamp='1353726549' post='5003649']
The main issue is that you are using too many floating-point values.
Change these to 64-bit unsigned integers with a microsecond resolution and report back. You will likely see a smoother, stutter-free result from this change alone.
Once again, this is all outlined in the link above.
[/quote]

Is there any library you can suggest for measuring time really accurately? Bullet has a btClock class, but that only gives time as an unsigned long int, not an unsigned long long like you suggested.
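(For what it's worth, if a C++11 compiler is available, the standard <chrono> header can provide microsecond timestamps without any third-party library. A minimal sketch, assuming C++11:)

[code]
#include <chrono>
#include <cstdint>

// Microseconds elapsed since the first call; this also starts the game time from 0,
// as recommended above.
uint64_t TimeUS()
{
    using namespace std::chrono;
    static const steady_clock::time_point start = steady_clock::now();
    return (uint64_t)duration_cast<microseconds>(steady_clock::now() - start).count();
}
[/code]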
L. Spiro:

A question to clarify things: if the game logic is running at 30 fps, for example, but the drawn frame rate is something like 60, is it acceptable to wait ~33 milliseconds before you get a visual response from the game that your input has had an effect?

Best regards!

[edit] When I said "as soon as possible", I didn't mean something like "asynchronous handling of input".

Edited by kauna
kauna, it depends on the game.
It was mentioned earlier that when you send commands to the GPU, the driver is allowed to buffer them for several frames before executing them. e.g.
[code]
CPU            GPU
Draw Frame 1
Draw Frame 2
Draw Frame 3   Draw Frame 1
               Draw Frame 2
               Draw Frame 3
[/code]
So even if you do process inputs and update the game immediately, you still might not receive a visual change for ~100ms if the driver feels like it!

To test this, you need a high-speed camera filming your screen and your input device, then use a robot to hit an input button suddenly. You can then count the frames between the button being pressed and a change appearing on-screen. Even in 60Hz games, sometimes this delay is as high as 66ms, and players don't notice too much.

Typically, "twitch" action games like Counter-strike should put a lot of effort into reducing this latency, while slower-paced games don't really have to worry.

Edited by Hodgman
Hodgman, you are right that naturally there are different scenarios and types of games where input latency has a bigger or smaller effect.

Best regards!
To answer the OP's question:

[b]Will increasing the Hz update cause a "smoother/more responsive" simulation?[/b]
YES.
There is a limit though. If the hz is too high, the simulation delta will be too low. For example at 10000hz, the delta time in seconds will be 0.0001, which can't even be represented accurately using base-2 floating point.
The lack of precision can cause several artifacts, for example your objects not moving at all or, in very rare cases, appearing to move backwards (or back and forth) due to how the integration resolves the frame.


You can resort to using doubles, but that is considered very bad practice in game development (see [url="http://home.comcast.net/~tom_forsyth/blog.wiki.html"]Tom Forsyth's blog[/url], the article named "A matter of precision", for an explanation).
Probably the best option would be to resort to integers and/or fixed point in those cases.

[b]Is stutter & responsiveness a direct cause of low simulation update frequency?[/b]
If you're above 30Hz, probably not. The problem is more likely a bad physics configuration, an inappropriate world scale, or a lag between the frame you're processing and the one you're showing (see Hodgman's explanation).
Also check there are no NaNs being passed to Bullet.
Another possibility is that your timer code for waiting until the next update is flawed. QueryPerformanceCounter, Sleep, timeGetTime, even RDTSC all have serious pitfalls because of Cool 'n Quiet/SpeedStep, multi-core CPUs, and even OS/HW/BIOS bugs. Each timer method has its own quirks. Google how to implement them properly; it's very tricky.

[b]What do most people use as their timestep?[/b]
Most games use 30hz (FPS & RPG games), some use 25hz (RTS games, slow-paced games, a few games limited by HW like Zelda OoT), and some use 60hz (car driving simulations, music games, fast action games, fighting games). Other values are possible, e.g. some cellphone games update at 5hz or only when the screen needs to be refreshed, and there are 120hz games, etc.
Note that on consoles, some companies tweak the hz for PAL games (e.g. 60hz becomes 50hz, 30hz becomes 25hz, etc).

Try to keep the Hz a multiple of the screen's refresh rate (unlike what L. Spiro suggested, 48hz is NOT a good idea, 45 or 50 is probably better)

With a monitor refreshing at 60hz, a 67hz game will probably feel less responsive than a 60hz game. John Carmack does a lot of intensive research about this; you should check his website and articles, and follow his Twitter account. Note that he complains a lot about LCD/LED monitors adding input lag because of unnecessary post-processing, which is another problem of its own.
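To make the 67hz-vs-60hz point concrete, here is a tiny illustrative snippet (not from the thread) that counts how many new simulation steps each display frame gets to show:

[code]
#include <stdio.h>

// Simulation at 67hz displayed at 60hz: most display frames show 1 new simulation step,
// but every so often a frame shows 2, in an irregular pattern. That uneven cadence is
// part of the perceived stutter when the update rate doesn't divide the refresh rate.
int main()
{
    int prev = 0;
    for (int frame = 1; frame <= 10; ++frame)
    {
        int steps = frame * 67 / 60;   // sim steps completed by this display frame
        printf("frame %2d: %d new step(s)\n", frame, steps - prev);
        prev = steps;
    }
}
[/code]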

Some games (DMC 4, Sonic Heroes) [b]default to 60hz but include an option to change the frequency to other predefined values, like 15-30-45. I personally find this method the best one[/b], because if the PC can't handle your "ideal" update rate, the user can tweak it down, and he'll know that the gameplay experience will be affected and will blame his old PC instead of the game. But at least he will be able to enjoy it without going into slow motion.

Cheers
Dark Sylinc

Edited by Matias Goldberg
Ok, here is my new code using long longs and microseconds. It feels noticeably more stable and consistent. However, I don't know why, but it just seems off in a way that I cannot describe, like I'm moving across sandpaper or something. I know it's not the graphics, because I can rotate independently of the physics engine and that's smooth as butter.

[CODE]
unsigned long long TimeUS()
{
    return physicsTimer.getTimeMicroseconds(); // physicsTimer is a btClock
}

float dt = 1.0f/60.0f;                               // fixed step of 1/60 of a second
unsigned long long dtInUS = dt*1000000.0f;           // fixed step in microseconds (~16666)
unsigned long long prevtime = 0, accumulator = 0, frameTime = 0;
unsigned long long gameTimeInUS = 0;
int substeps = 0;

void integrate(unsigned long long pausedtime)
{
    gameTimeInUS = TimeUS() - pausedtime; // how much time has been spent "in game" and not paused
    frameTime = gameTimeInUS - prevtime;  // microseconds elapsed since the previous update
    accumulator += frameTime;             // this is the amount of time that needs to be simulated
    while (accumulator >= dtInUS)         // compute the number of substeps
    {
        substeps += 1;                    // (assumed to be consumed and reset by the caller each frame)
        accumulator -= dtInUS;
    }
    prevtime = gameTimeInUS;              // store this time for the next update
}
[/CODE]

I'm using btClock::getTimeMicroseconds().

Edited by ic0de
[quote name='Matias Goldberg' timestamp='1353774478' post='5003765']
[b]Will increasing the Hz update cause a "smoother/more responsive" simulation?[/b]
YES.
There is a limit though. If the hz is too high, the simulation delta will be too low. For example at 10000hz, the delta time in seconds will be 0.0001, which can't even be represented accurately using base-2 floating point.
[/quote]
Define "accurately," please.
[code]
#include <stdio.h>

int main()
{
    float f = 0.0001f;
    printf("0.0001 -> %.50e\n", f);
    f = 1.0f / 60.0f;
    printf("1/60 -> %.50e\n", f);
}
//0.0001 -> 9.99999974737875163555145263671875000000000000000000e-05
//1/60 -> 1.66666675359010696411132812500000000000000000000000e-02
[/code]
Looks to me like 0.0001 is represented just as accurately as 1/60. IMO, the issue of updating too quickly comes from mixing large numbers and small numbers (i.e. 12345678.0 + 0.1 is, effectively, 12345678.0), not from the actual timestep's representation being inaccurate.
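A quick way to see that large-plus-small effect (a small illustrative snippet, not part of the original post):

[code]
#include <stdio.h>

int main()
{
    float big = 12345678.0f;      // exactly representable, but the float spacing here is 1.0
    float sum = big + 0.1f;       // 0.1 is far below the precision available at this magnitude
    printf("%.1f\n", sum);        // prints 12345678.0: the small increment is lost entirely
    printf("%d\n", sum == big);   // prints 1
}
[/code]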
Wow, this is embarrassing: it turns out that Bullet does most of this heavy lifting for me; all I needed to do was pass the time since the last update. My efforts to control the timestep seemed to do nothing but compound the problem.

http://bulletphysics.org/mediawiki-1.5.8/index.php/Stepping_The_World
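For reference, the pattern that wiki page describes boils down to something like this (a sketch only; dynamicsWorld is assumed to be an already-created btDiscreteDynamicsWorld):

[code]
#include <btBulletDynamicsCommon.h>

// Call once per rendered frame with the real time elapsed since the last call, in seconds.
// Bullet accumulates it internally, runs up to maxSubSteps fixed steps of fixedTimeStep each,
// and uses any remaining fraction when updating the motion states for rendering.
void updatePhysics(btDiscreteDynamicsWorld* dynamicsWorld, btScalar deltaSeconds)
{
    const int      maxSubSteps   = 7;                            // clamp to avoid a spiral of death
    const btScalar fixedTimeStep = btScalar(1.) / btScalar(60.);
    dynamicsWorld->stepSimulation(deltaSeconds, maxSubSteps, fixedTimeStep);
}
[/code]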
[quote name='Cornstalks' timestamp='1353780715' post='5003785']
Define "accurately," please.
[...]
Looks to me like 0.0001 is represented just as accurately as 1/60. IMO, the issue of updating too quickly comes from mixing large numbers and small numbers (i.e. 12345678.0 + 0.1 is, effectively, 12345678.0), not from the actual timestep's representation being inaccurate.
[/quote]
Yeah, my bad. Thanks for clearing that up.

However, 9.999..e-05 sounds like it's a rounded approximation rather than the actual representation in the PC, because 0.0001 has a periodic representation which is bigger than 0.0001.
[quote name='Matias Goldberg' timestamp='1353787258' post='5003806']
However, 9.999..e-05 sounds like it's a rounded approximation rather than the actual representation in the PC, because 0.0001 has a periodic representation which is bigger than 0.0001.
[/quote]


What do you think 0.0001f is, if not what was posted? I don't know what you mean when you say the periodic representation of 0.0001 is bigger than 0.0001.
[quote name='Matias Goldberg' timestamp='1353787258' post='5003806']
However, 9.999..e-05 sounds like it's a rounded approximation rather than the actual representation in the PC, because 0.0001 has a periodic representation which is bigger than 0.0001.
[/quote]
Not sure what you mean by that last part, but 9.99999974737875163555145263671875e-05 is the actual representation in the computer of the computer's approximation of 0.0001...
Just a small side note about Bullet (or at least the 2.80 version I've been using): AFAIK it currently uses the following order of operations:

1) detect collisions
2) apply forces/constraints
3) extrapolate forward motion

This means that after each step it will extrapolate based on the latest velocity & acceleration. That may result in interpenetration until the next collision detection. The effect will be smaller with smaller timesteps, but it is especially noticeable when scaling down the real-world elapsed time for a "bullet-time" effect.
Something tells me that the simulation frequency vs. display frequency ratio is an important aspect, as in the [b]sampling theorem[/b], aliasing artifacts and such, which may well be a 'nice' contributor to the perceived jerkiness of movement.

After all, the change in position over time generated by the physics engine is a signal, which is output by the physics engine and sampled again by the graphics engine at a possibly different frequency. Now if the resampling method used is not good, the end result, i.e. the graphics output, will have artifacts.
From my experience I'd say, as others have noted, that odd ratios like 67 Hz : 60 Hz make simple resamplers produce especially bad results.

Upping the simulation rate, i.e. the sample rate of the "source" (the physics engine), certainly is one approach that works, but it's a low-tech one that imposes a high computational cost, or so goes the common wisdom in DSP land, I think (I'm no expert, but I've been dabbling in that sort of stuff recently).
The alternative would be some sort of filtering / interpolation, as in the sketch below.

Edited by UnshavenBastard
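One common form of that filtering is render-side interpolation between the last two fixed physics states (a sketch under the assumption that you keep the previous and current states around; the State type and names here are made up for illustration):

[code]
#include <cstdint>

struct State { float x, y, z; };   // hypothetical physics state (position only, for brevity)

// 'accumulatorUs' is the leftover time in the fixed-step accumulator (always < dtUs), so alpha
// is the fraction of a step that the display time sits past the most recent physics state.
// Blending the previous and current states resamples the physics signal at the display rate
// without the stairstep aliasing.
State interpolateForRender(const State& previous, const State& current,
                           uint64_t accumulatorUs, uint64_t dtUs)
{
    float alpha = (float)accumulatorUs / (float)dtUs;   // in [0, 1)
    State out;
    out.x = previous.x + (current.x - previous.x) * alpha;
    out.y = previous.y + (current.y - previous.y) * alpha;
    out.z = previous.z + (current.z - previous.z) * alpha;
    return out;
}
[/code]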