What argument are you using?
For starters: that your gameplay logic will always behave correctly
e.g. we had an issue in the past with a game that ran at 30Hz on consoles, but 60Hz on our development PCs. Gameplay programmers and designers mostly played the PC build as it was easier, and tuned the gameplay until they were happy with it... but when they finally tested it on a console (running at half the frame-rate, double the delta time), the AI were... stupid! A lot of the AI code just so happened to be responsive and challenging with a ~17ms delta, but was unresponsive and easily defeated when it operated with a ~33ms delta!
They fixed this at the time by adding an option on PC to cap the frame-rate to 30FPS, and then re-tuning the entire game when this option was enabled...
However, that didn't fix the underlying problem, it just swept it under the rug -- if a user's PC can only muster 20FPS, is the AI going to be "stupid" for them? If a user runs the game at 120Hz, will the AI be too "smart"?
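That kind of tuning trap usually comes from per-frame smoothing code that never looks at dt. A minimal sketch of the failure mode (the 10% blend factor and the one-second chase are illustrative assumptions, not the actual game's values):

```python
def lerp(a, b, t):
    """Linear interpolation from a to b by fraction t."""
    return a + (b - a) * t

def chase_target(frames_per_second, seconds=1.0, alpha=0.1):
    """Per-frame smoothing toward a target with no dt term -- the kind
    of AI code that 'just happens' to feel right at one frame-rate."""
    value = 0.0
    for _ in range(int(frames_per_second * seconds)):
        value = lerp(value, 1.0, alpha)  # blends a fixed 10% per FRAME
    return value

# After the same wall-clock second, the AI has closed very different
# fractions of the gap depending purely on frame-rate:
pc_60hz = chase_target(60)       # ~0.998 of the way to the target
console_30hz = chase_target(30)  # ~0.958 of the way to the target
```

The 60Hz build reacts noticeably faster than the 30Hz build even though the AI "logic" is identical -- which is exactly how a game tuned on PC ends up stupid on console.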
If they'd been using a fixed time-step, this would never have been a problem, as everyone would always experience the same AI.
Another example is Call of Duty. The physics of jumping approximates a curved arc via lots of little linear steps. The higher your framerate, the closer it matches that perfect arc; the lower your framerate, the more approximate it is. COD4 had an issue in multiplayer (which is crazy, as the server should be authoritative over this!) where it was possible to jump over certain "impassable" walls, but only if your PC was running the game at about 600FPS, so that your jumping motions were just that tiny fraction smoother than everyone else's...
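You can see that framerate-dependent arc with a few lines of semi-implicit Euler integration (the 5m/s jump velocity and 9.81 gravity here are made-up numbers for the sketch, not COD's actual values):

```python
def jump_apex(dt, v0=5.0, g=9.81):
    """Integrate a jump as lots of little linear steps (semi-implicit
    Euler) and return the highest point reached."""
    y, vy, apex = 0.0, v0, 0.0
    while y >= 0.0:
        vy -= g * dt   # velocity bends first...
        y += vy * dt   # ...then position moves along a straight segment
        apex = max(apex, y)
    return apex

exact = 5.0 ** 2 / (2 * 9.81)  # the "perfect arc" apex: ~1.274m
print(jump_apex(1 / 30))       # ~1.19m at 30FPS
print(jump_apex(1 / 600))      # ~1.27m at 600FPS
```

At 600FPS the apex is both higher and much closer to the perfect arc than at 30FPS -- a tiny per-step error that adds up to "I can clear this wall and you can't".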
Another reason can be performance. For some games, a fixed time-step may increase your CPU time per frame, because you may end up with >1 updates per rendered frame, plus the overhead of game-state interpolation... e.g. my racing game simulates vehicles at 600Hz (and tires at 1200Hz) for accuracy, which costs me a lot of CPU time!
However, for other games, fixed-time-step may actually decrease your CPU time per frame, because you may end up with <1 updates per rendered frame!!
e.g. in the first example, our minimum-spec hardware could only run it at 30Hz, so we could have chosen a 30Hz fixed simulation rate. On high-spec PCs, the simulation would continue to run at 30Hz, but the renderer would be free to run at 60Hz, or 144Hz, etc., and it would do so more efficiently due to the simulation rate being fixed...
Say that updating is 20ms and rendering is 7ms:
* with a variable time-step, one rendered frame takes 27ms, two frames take 54ms, three take 81ms, four take 108ms -- four frames in 108ms is ~37FPS.
* with a fixed 30Hz update, the first frame takes 27ms (update+render), but the second frame skips the update (a full 33ms step hasn't elapsed yet) and just renders, bringing us to 34ms. The third frame updates+renders (27ms), bringing us to 61ms, and the fourth frame just renders, bringing us to 68ms. Four frames in 68ms is ~59FPS -- near enough 60FPS.
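That schedule can be walked through with a toy model (the 20ms/7ms costs and the 30Hz step are the example's numbers; this deliberately ignores interpolation and assumes rendering never starts mid-update):

```python
def frame_timeline(n_frames, update_cost=0.020, render_cost=0.007,
                   step=1.0 / 30.0):
    """Each frame: run one fixed 30Hz update only if one is due (wall
    time has caught up with simulated time), then render."""
    wall = 0.0   # wall-clock time spent so far
    sim = 0.0    # game time the simulation has advanced
    stamps = []
    for _ in range(n_frames):
        if wall >= sim:           # a fixed step is owed
            wall += update_cost   # run one 30Hz update
            sim += step
        wall += render_cost       # render every frame regardless
        stamps.append(round(wall * 1000))
    return stamps

print(frame_timeline(4))  # [27, 34, 61, 68] -> 4 frames in 68ms, ~59FPS
```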
It's common for RTS games (with thousands of units to simulate) to use a very low fixed simulation rate -- as low as 10Hz -- while rendering at a variable rate, such as 30-60Hz.
Ya - don't know how I feel about slow-mo - seems almost as "physics-breaking" as objects behaving unpredictably in variable-step under a large dt.
You should implement the same slow-mo effect in a variable time-step system too, as there will surely be a limit to how large a time-step can get before the results are completely full of garbage errors!
e.g. if a player's jump takes one second, and your physics engine is integrating via "small" linear steps, then jumping at 1FPS will be equivalent to... not jumping at all...
At this limit, you can either choose between letting the simulation continue on in this ridiculously approximate state, or you can slow down the game to a point where the user's PC is able to keep up with the simulation...
A fixed time-step is exactly the same, but as well as choosing a "minimum acceptable framerate", you've also set the "maximum simulation rate" to that same number.
In any case, this slow-mo effect should only ever occur if the user's PC isn't actually capable of running your simulation at the minimum frame rate. It's your job to specify the game's minimum hardware requirement and then make sure that it does run at an acceptable framerate on that hardware.
Edited by Hodgman, 07 August 2016 - 08:59 AM.