Quote:Original post by Mike2343
Articles please, I've tried it on about 30 systems now and nothing but more accurate time happens. 95/98/ME was defaulted to 1ms resolution, only 2k/XP changed it. I also do my best to use QPC/QPF unless it's an unpatched AMD dual core processor. Then I roll back to timeGetTime().
Quote:From Guidelines For Providing Multimedia Timer Support
The obvious drawback of this implementation is that a large clock interrupt period could result in long delays. For an application to be notified with more precision on today's systems, it must request a smaller clock interrupt period. In the previous example, if the multimedia application wanted its code executed precisely on time, it would request a clock interrupt period of 1 millisecond. Then, the system would check every millisecond to see if there was work to do, as is illustrated in Figure 2.
While this allows the multimedia application to execute its code and play a sound on time, it also degrades overall system performance. Microsoft tests have found that, while lowering the timer tick frequency to 2 milliseconds has a negligible effect on system performance, a timer tick frequency of less than 2 milliseconds can significantly degrade overall system performance. On faster systems, the cost of lowering the clock interrupt period below 2 milliseconds may become affordable, but the subtle effect of the increased interrupt frequency on cache consistency and power management may not be desirable.
Quote:From General-Purpose Timing: The Failure of Periodic Timers
We have calibrated an empty loop (a computation phase) to finish after 1 ms, and ran it a million times on a Pentium-IV 2.8GHz Linux machine with 1000 Hz ticks, saving a cycle-resolution timestamp after each phase. No other user processes were executing. At the end of the benchmark we computed the duration
of each phase by subtracting successive measurements.
...
Instrumenting the kernel to log all interrupts revealed that the only activity present in the system while the measurements took place were about a million ticks and 3,000 network interrupts, indicating ticks are probably the main cause of the problem. This was verified by repeating the measurements with kernels compiled with 100 and 10 Hz ticks, which experienced far smaller time variability, respectively. But measuring direct overhead of the tick handler indicated that it only accounts for 0.8% of available cycles (using the data from Fig. 3, indirect overhead is found to be about 14%, significant even for a uniprocessor). We therefore concluded that most of the effect is indirect overhead, due to cache misses. This was verified by repeating the experiment with the cache disabled.
(Note: 1000Hz is the same as 1ms resolution)
This was running Linux, but I doubt matters are much different on Windows. Of course it might not have been your average game code, but any code depending on cache consistency is likely to be slowed down significantly, even if 14% is probably higher than what occurs under realistic circumstances.
Quote:If you're making the next Oblivion, Unreal 2037, Doom 12 or whatever, you'll likely be using 100% of all CPUs to blow people's minds. Take the extra time and be kind to the user. If your game kills their batteries in 15 minutes they won't be playing your game that much.
But you still can't expect that Doom 12 will be able to fully utilize the user's computer, because by the time Doom 14 is out most computers will be able to run Doom 12 smoothly on full settings at 100FPS and still only use 10% CPU. Both GTA2 and Age of Empires could easily use 100% CPU without being wasteful when they were released, but today we have much faster hardware.
I somewhat agree with the desktop assertion that we can claim the whole CPU, but I still prefer to save CPU power if I know the extra processing won't improve the gaming experience. Taking up 800W when we can achieve the same with 400W is stupid.
Also it depends on the game: if some of your users will run in windowed mode to use other applications at the same time, then you should consider that. It is somewhat rare that users do other things while playing, but I know at least one person who uses an Excel spreadsheet to compute his chances while playing MMORPGs in windowed mode. Even in fullscreen mode you might need to consider this, because the player might have a dual screen setup and run other applications on the other screen.