What if a client can't run at desired fixed time step?


if your game has a competitive aspect 60MHz is probably too low for graphics


Even if you divide that by 1,000,000, I think you're buying into a little too much hype ;-)
(I'm assuming you meant 60 Hz frame rate here.)

Console first person shooters (with competitive online matches) often run graphics/physics at 30 Hz, and networking at 12 Hz or 10 Hz, and still seem to be mightily popular.

Also, if it's competitive, then the competitive players will have nice graphics cards and CPUs, and thus the min-spec experience isn't necessarily what they will compare to.

When it comes to graphics, you can also scale back a LOT on quality in favor of speed -- simpler shaders, opaque instead of translucent foliage, lower display resolution, fewer (or no) shadows, lower-resolution textures, etc. It all depends on what your actual bottleneck is.
enum Bool { True, False, FileNotFound };


Console first person shooters (with competitive online matches) often run graphics/physics at 30 Hz, and networking at 12 Hz or 10 Hz, and still seem to be mightily popular.

I think he means esport, not just competitive multiplayer.

On console, you are limited by the sluggish control scheme anyhow, and as long as everyone has the same limitation, it's OK.

Sure, the game doesn't break down the way it would if the physics were updated less often, but a high frame rate does a lot for almost any title, imo...

Even Hearthstone plays nicer and is nicer to look at on the "high" setting, where it's not capped to 30 fps. (No other difference that I can see between those settings.)

I'm a bit sad that so many in the industry seem to think that "30 fps is enough"... It really isn't, imho.

The only reason movies get away with very low frame rates like that is massive amounts of motion blur...

Motion gets so much more fluid at 60 fps+.

a high frame rate does a lot for almost any title, imo


Oh, I agree! In this thread, though, we're talking about how to support old laptops with single-core CPUs and Intel GMA 950 graphics, and other such low-end devices. There's lots of them out there.

If you want to fix your time step to 60 Hz for simulation, but render faster, you can do that by treating the simulation as an input to an interpolator or extrapolator for rendering. Or you can run simulation at 100 or 120 Hz -- as long as you have a well-optimized physics system, and don't run crazy n-squared algorithms between thousands of NPCs, it's likely that most PCs used for playing games these days will keep up. (Not true if you're using JavaScript in web browsers on Chromebooks / phones.)
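Here's a minimal C++ sketch of that interpolation approach -- the classic fixed-timestep loop with render interpolation. State, Simulate, Render, and Interpolate are hypothetical stand-ins for your own systems, not any particular engine's API:

#include <chrono>

struct State { double x = 0.0, vx = 1.0; };    // toy state: position + velocity

State Simulate(State s, double dt)             // one fixed tick of "physics"
{
    s.x += s.vx * dt;
    return s;
}

State Interpolate(const State& a, const State& b, double alpha)
{
    State out;
    out.x  = a.x  + (b.x  - a.x)  * alpha;     // blend the two snapshots
    out.vx = a.vx + (b.vx - a.vx) * alpha;
    return out;
}

void Render(const State&) { /* submit to your renderer */ }

int main()
{
    using Clock = std::chrono::steady_clock;
    const double dt = 1.0 / 60.0;              // fixed 60 Hz simulation step
    double accumulator = 0.0;
    auto previous = Clock::now();
    State prev, curr;

    for (;;)                                   // runs forever in this sketch
    {
        auto now = Clock::now();
        accumulator += std::chrono::duration<double>(now - previous).count();
        previous = now;

        while (accumulator >= dt)              // step whole ticks we owe
        {
            prev = curr;
            curr = Simulate(curr, dt);
            accumulator -= dt;
        }

        // alpha in [0, 1): how far we are between the last two ticks.
        Render(Interpolate(prev, curr, accumulator / dt));
    }
}

The key point is that the simulation only ever advances in whole dt-sized ticks; the renderer never sees a partial step, it just blends the last two snapshots, so a fast display rate stays smooth on top of the fixed simulation rate.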

As part of making a game, you need to decide what kind of experience you want to deliver, then what kind of platform will support that experience, and tune those selections based on the technology, time, and skill available for implementation, to reach a cohesive deliverable. You can do a whole lot of tuning and optimizing here, with various fallback paths, etc. -- at the cost of not getting to spend that time on other things that might make your game "better" for whoever the "core player" of your game is. Your choice!
enum Bool { True, False, FileNotFound };
Ah, yes, I was referring to the sport-style environment, not the "friends on LIVE" environment. When I write of the competitive environment, I am referring to the people (including one of my brothers) who routinely compete in semi-pro game leagues and competitions. These are often run by the various companies, sponsored by hardware vendors, and usually end up inviting all the semi-final competitors to Las Vegas from around the globe for the final few rounds of web-broadcast events. They also involve prizes of cash and hardware. (My brother just finished one such tournament that offered a $100,000 cash prize to the winning team; his team took fourth, with each player getting $5,000 in addition to the all-expenses-paid vacation.) The competition rigs that people bring to these events are rather impressive pieces of hardware, often with every piece of non-essential software removed and the entire machine dedicated to that one game's optimum performance. The networks provided are also often rather intense, with very-low-latency network switches and similar cards in most boxes.


The main takeaways from my post were basically that (1) reduced coupling between unrelated components is usually a good thing, and (2) the update frequencies of all the systems should be based on what is best for the game, not on old TV standards.

Update as fast as you need to give a good simulation, whatever that means for your game.
Draw as fast as reasonable for your game. Obviously a network simulation of go or chess will have different display requirements than a twitch shooter.
Many game styles benefit if you decouple players from each other; some work well with completely independent simulations, with just a few key pieces kept synced.

Finally, figure out update rates that make sense for your game. If 60 Hz makes sense for your simulation, then great. If instead 15 Hz makes sense for your simulation, or 85, or some other number, then that is what is best for you. If your game has AI that runs at 4 Hz, player input that is processed at 30 Hz, and a physics world that gets simulated at 120 Hz, and that works for you, then great.
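As a sketch of what that kind of decoupling can look like in code -- one accumulator per system, each ticking at its own fixed rate -- here's a toy C++ loop. UpdateAI, UpdateInput, and UpdatePhysics are hypothetical placeholders for your own systems:

#include <chrono>

struct FixedRate
{
    double period;                    // seconds per tick
    double accumulator = 0.0;

    // Returns how many whole ticks are due after 'elapsed' more seconds.
    int Consume(double elapsed)
    {
        accumulator += elapsed;
        int ticks = 0;
        while (accumulator >= period) { accumulator -= period; ++ticks; }
        return ticks;
    }
};

void UpdateAI(double /*dt*/) {}       // 4 Hz brain tick
void UpdateInput(double /*dt*/) {}    // 30 Hz input processing
void UpdatePhysics(double /*dt*/) {}  // 120 Hz physics step

int main()
{
    using Clock = std::chrono::steady_clock;
    FixedRate ai{1.0 / 4.0}, input{1.0 / 30.0}, physics{1.0 / 120.0};
    auto previous = Clock::now();

    for (;;)                          // runs forever in this sketch
    {
        auto now = Clock::now();
        double elapsed = std::chrono::duration<double>(now - previous).count();
        previous = now;

        for (int i = physics.Consume(elapsed); i > 0; --i) UpdatePhysics(physics.period);
        for (int i = input.Consume(elapsed);   i > 0; --i) UpdateInput(input.period);
        for (int i = ai.Consume(elapsed);      i > 0; --i) UpdateAI(ai.period);

        // Render here once per pass, as fast as the display allows.
    }
}

Each system still gets a fixed, deterministic dt; the only thing that differs is how often the loop pays each one out.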

FWIW, it's possible to play 'professional' Counter-Strike with a client doing rendering/input/prediction at 120 Hz, but a server that's only simulating and forwarding network data at 20 Hz ;)

They do constantly rewind their simulation to inject inputs at the appropriate times though...
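A rough C++ sketch of that rewind-and-replay idea -- keep a short history of per-tick snapshots, and when an input arrives stamped with an earlier tick, roll back to that tick, insert the input, and re-simulate to the present. State, Input, and Simulate are hypothetical stand-ins, not Valve's actual implementation:

#include <map>
#include <vector>

struct State { /* world snapshot */ };
struct Input { int tick = 0; /* player commands */ };

// Stub: advance one tick given all inputs for that tick.
State Simulate(const State& s, const std::vector<Input>& /*inputs*/) { return s; }

class RewindBuffer
{
public:
    void Store(int tick, const State& s) { history_[tick] = s; }

    // A late input arrives stamped with an earlier tick: roll back to that
    // tick's snapshot, add the input, and re-simulate up to 'currentTick'.
    State InjectAndReplay(const Input& late, int currentTick,
                          std::map<int, std::vector<Input>>& inputsByTick)
    {
        inputsByTick[late.tick].push_back(late);
        State s = history_.at(late.tick);        // rewind (tick must be in history)
        for (int t = late.tick; t < currentTick; ++t)
        {
            s = Simulate(s, inputsByTick[t]);    // replay with the late input included
            history_[t + 1] = s;                 // overwrite the stale snapshots
        }
        return s;
    }

private:
    std::map<int, State> history_;               // bounded to a short window in practice
};

In a real server the history buffer is bounded to a short window (a few hundred milliseconds), and inputs older than the buffer are simply dropped or applied at the present tick.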
