Arcade racing game network strategy.

Started by oliii. 10 comments, last by oliii 18 years, 4 months ago
First off, I have experience with online FPS games; our method was very similar to Counter-Strike/Quake clones. I'm mainly concerned about control protocols, and whether a server-client architecture is actually of any use in car racing games.

With an FPS, the client runs the controls through its own physics simulation and sends the results, as well as the controller data, back to the server, which verifies the legality of the move. If the move is invalid, the server sends a corrected position back to the client, and the client's position is adjusted. All this runs through interpolators, but even then, the local player would teleport and jitter when hitting anything that isn't static or path-planned. You can see this effect in Counter-Strike when you run into another player who is also moving: you get a small jitter. This is due to discrepancies between the other player's position on your machine and on the server, caused by latency. You run physics on your machine, but on the server, when verification and correction come around, the object is actually slightly elsewhere, and a correction message is sent.

I wonder how this 'problem' might affect a car racing game, since you'll mostly be colliding with other cars, which also move fast and unpredictably. So I can see problems with FPS-style controls. I wonder if you guys have any magical ideas, or useful experiences designing online racing games. ta.

Everything is better with Metal.

You have to design your game to hide those problems, typically by making small collisions client-side only. There's no getting around the problem of transmission latency.

You can forward-predict the other players using some algorithm smarter than extrapolation; for example, run the "car AI" on the last known position/velocity of the player, so that he'll turn into corners and whatnot. This may give an on-screen location somewhat closer to what the player is actually doing.
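
A minimal sketch of that kind of AI-assisted extrapolation, assuming a hypothetical trackTangentAt() query that returns the direction of the racing line at a point; all names and constants here are illustrative, not from the thread:

#include <algorithm>
#include <cmath>

struct CarState { float x, y, vx, vy, heading; };

// Hypothetical track query: direction (in radians) of the racing line at a point.
float trackTangentAt(float x, float y);

// Plain dead reckoning: just integrate the last known velocity.
CarState deadReckon(CarState s, float dt) {
    s.x += s.vx * dt;
    s.y += s.vy * dt;
    return s;
}

// "Car AI" prediction: turn the last known heading toward the racing line at a
// bounded rate, so the predicted car follows corners instead of flying off in
// a straight line, then integrate as usual.
CarState predictWithAI(CarState s, float dt) {
    const float kTurnRate = 2.0f;   // rad/s, made-up tuning constant
    float diff = std::remainder(trackTangentAt(s.x, s.y) - s.heading,
                                2.0f * 3.14159265f);   // wrap to [-pi, pi]
    s.heading += std::clamp(diff, -kTurnRate * dt, kTurnRate * dt);
    float speed = std::sqrt(s.vx * s.vx + s.vy * s.vy);
    s.vx = std::cos(s.heading) * speed;
    s.vy = std::sin(s.heading) * speed;
    return deadReckon(s, dt);
}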

For large wipe-out type collisions, you just have to bite the bullet, spend a few hundred milliseconds putting cars where you want them on screen, and then play the big wipe-out, if the server decides that that's what happened. In fact, if all collisions are either just local (jerk the steering wheel, play a crunch sound, deform the model) or big-wipe-out, then the design may be simpler -- it's the kinetic-but-not-stopping collisions which are really susceptible to latency.
enum Bool { True, False, FileNotFound };

I had a problem like this about two years ago, but with spaceships instead of cars.

The big problem with fast-moving objects is that a little difference in their velocity vector translates to huge position differences after just a few frames... the most notorious example I've seen of this was the classic X-Wing games, where the ships were constantly jittering due to continuous direction changes... they were little jumps, but very noticeable.

The best solution I found at the time was to blend the position of the objects between where they are and where they should be.

Basically, you have two position sets... the visual and the real one... when there's a warp you update the "real" position immediately, but, instead of updating the "visual" position with the correct value, you just make it converge to the "real" position over several frames.
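
A minimal sketch of that real/visual split, assuming a simple exponential convergence toward the corrected position (names and the convergence constant are made up for illustration):

#include <cmath>

struct Vec2 { float x, y; };

struct NetObject {
    Vec2 real;    // authoritative / corrected position
    Vec2 visual;  // what actually gets rendered
};

// Warp the "real" position as soon as the network says so.
void applyCorrection(NetObject& o, Vec2 serverPos) {
    o.real = serverPos;
}

// Each frame, close a fixed fraction of the remaining gap, so the rendered car
// slides onto the corrected position over several frames instead of popping.
void updateVisual(NetObject& o, float dt) {
    const float kConvergeRate = 10.0f;   // per second, made-up tuning constant
    float t = 1.0f - std::exp(-kConvergeRate * dt);
    o.visual.x += (o.real.x - o.visual.x) * t;
    o.visual.y += (o.real.y - o.visual.y) * t;
}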

The advantage of this method is much smoother movement, which is important for games where the feeling of speed matters... and the disadvantage is a bit less precision in collision detection... but hey, you're not going to aim for the driver's head, are you?

Hope it helps

Vicviper
Quote:Original post by oliii
I'm mainly concerned about control protocols, and whether a server-client architecture is actually of any use in car racing games.
How about a hybrid system based on a standard client/server architecture, but with direct p2p connections (if possible) to improve the predictions/interpolations?
In theory this can reduce the latency between clients by more than half, without all of the complexities behind a true p2p system.

Quote:Original post by vicviper
blend the position of the objects between where they are and where they should be.
Kinda reminds me of this..
Quote:Original post by doynax
Kinda reminds me of this..


Please, ... shoot me now....

I kinda know about interpolation and blending techniques, dead reckoning and stuff. The problem is not just rendering. The object has to be positioned precisely where the server tells it to be, or you'll run into corrections virtually every frame, or much more often than you should.

The way the system works uses the reproducibility of computer calculations. If the objects have the same velocity and position, then the simulation will be EXACTLY the same, and there will be no deviation. This could be corrected by increasing the tolerance (which is tiny). But if you increase the tolerance, when the correction kicks in, it corrects from a much bigger difference.
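
As a rough illustration of that trade-off, a tolerance-based correction might look something like this (the snap-to-server behaviour and kTolerance are assumptions for the sketch, not oliii's actual code):

#include <cmath>

struct SimState { float x, y, vx, vy; };

// Only correct when the local and server states have drifted further apart
// than kTolerance. A tiny tolerance means corrections fire almost every
// frame; a large one means each correction is a visible jump.
void reconcile(SimState& local, const SimState& server, float kTolerance) {
    float dx = server.x - local.x;
    float dy = server.y - local.y;
    if (std::sqrt(dx * dx + dy * dy) > kTolerance)
        local = server;   // hard snap; the bigger the tolerance, the bigger the jump
}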

It's hard to predict without a practical case. Which I don't have right now :)

Also, car physics are much more complicated than FPS physics, so I don't think you can keep the objects in sync within a small margin.

Maybe I need to loosen up the system. A peer-to-peer system can be interesting. The problem with full P2P is that when two clients collide, each client will have a slightly different representation of the world: one may collide while the other may not. Car-car collisions are too much of a headache. I do think server-client is still better.


....Thinking back to why we got a jitter: we did something wrong, which was correcting the player using the position of objects AT THE TIME OF THE SERVER, whereas obviously the client is lagging behind.

Finding the position of the objects on the server at the time the controls were issued would give a better approximation, and less of a jerk.
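
One way to implement that rewind, sketched under the assumption that the server keeps roughly a second of timestamped snapshots and that client commands carry the time they were issued (all names here are hypothetical):

#include <cstdint>
#include <deque>

struct Snapshot {
    uint32_t timeMs;
    // ... positions/velocities of every dynamic object at that time
};

class SnapshotHistory {
    std::deque<Snapshot> history;   // roughly the last second of server frames
public:
    void push(const Snapshot& s) {
        history.push_back(s);
        while (s.timeMs - history.front().timeMs > 1000)
            history.pop_front();
    }
    // Latest snapshot taken at or before the time the command was issued;
    // validate the client's move against this state, not the current one.
    const Snapshot& at(uint32_t commandTimeMs) const {
        const Snapshot* best = &history.front();
        for (const Snapshot& s : history)
            if (s.timeMs <= commandTimeMs)
                best = &s;
        return *best;
    }
};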

Everything is better with Metal.

Quote:In theory this can reduce the latency between clients by more than half, without all of the complexities behind a true p2p system.


If someone is serving from their closet, then this might be the case. However, if the servers are on a well-connected backbone, the hop into the server, processing, and out to the greater internet again will only add a few dozen milliseconds at most (unless you have degenerate cases, such as the only two players being in New York but servers in San Francisco).

Quote:The way the system works uses the reproducibility of computer calculations.


If you want exact collisions, you can't really do it with such a system. When you race cars in There, you will always be ahead of your competitor on your screen, whereas he will be ahead of you on his screen. This is because it uses an input-synchronous system, and there is input latency between the sending and the receiving thereof. Time trials work much better in this kind of situation :-/

If I were to build an arcade racing system where people race head-to-head across the internet, first I would see if the game design could be changed; if not, then I would build a networking system that uses baseline updates (a la an FPS) rather than input-synchronous updates. I would forward-extrapolate other players on a player's machine using the computer driver AI (not just dead reckoning), and I would split collisions into "small touches", which just affect the current user's input, and "large collisions", which make all of the involved parties wipe out, tumble end over end, etc.
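
That small-touch / big-wipe-out split could be as simple as a threshold on relative impact speed; a tiny illustrative sketch (the threshold is a made-up tuning constant, not anything from the post):

// Below the threshold it's a purely client-side "small touch" (nudge the
// steering, play a crunch); above it the server declares a wipe-out for all.
enum class CollisionClass { SmallTouch, BigWipeout };

CollisionClass classifyCollision(float relativeImpactSpeed) {
    const float kWipeoutSpeed = 15.0f;   // m/s
    return relativeImpactSpeed > kWipeoutSpeed ? CollisionClass::BigWipeout
                                               : CollisionClass::SmallTouch;
}
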
enum Bool { True, False, FileNotFound };
Quote:Original post by hplus0603
Quote:In theory this can reduce the latency between clients by more than half, without all of the complexities behind a true p2p system.


If someone is serving from their closet, then this might be the case. However, if the servers are on a well-connected backbone, the hop into the server, processing, and out to the greater internet again will only add a few dozen milliseconds at most (unless you have degenerate cases, such as the only two players being in New York but servers in San Francisco).
Eh?
Considering that we're talking about a racing game, it seems likely that the server will run on one of the clients, instead of a centralized server (but perhaps using one as a matchmaker).
Obviously any input sent by one client to another in a pure client/server system has to be sent through the server, whereas a P2P architecture would send it directly instead. So assuming that the latencies are similar between everyone involved and that the server forwards everything as soon as possible (instead of once per frame), we get half the latency.

If we were talking about modem users playing some MMORPG things would be different, but nowadays most DSL connections seem to have about the same latency as commercial servers anyway.
In my experience, a well-connected co-lo (close to the backbone) is significantly lower latency than DSL or cable circuits.

Anyway, yes, if this game does not use hosted servers, then the latency problem becomes worse. The problem with P2P is that your bandwidth starts being significantly compromised (especially since lots of DSL still only has 128 kbps upstream), and the task of deciding what packets to trust, and how to get a player back in sync, becomes even trickier.

Plus the normal problems with P2P such as NAT gateways, firewalls, etc.

If you're using only a little bit of bandwidth, and you're willing to go through with it, it could probably be done. However, you still need to be prepared to handle multiple frames worth of latency, so from a user experience point of view, it'd be a smaller optimization. I'm wondering whether it would be worth the effort.
enum Bool { True, False, FileNotFound };
DISCLAIMER: I don't have any experience with this

This paper/thread seems to have some interesting ideas, but a lag before you see yourself move in the first place seems kind of annoying..

[Lag in a racing game seems more of a problem than elsewhere... notably, when the race starts, you will always see yourself go forward but none of your opponents until, say, half a second later...]

Well anyway, storing the input and location/rotation of all objects back, say, a second and interpolating forward a similar amount of time should allow you to at least approach the ideal solution.
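
A rough sketch of that history-and-replay idea, assuming the client keeps about a second of its own inputs and re-runs them on top of each authoritative state it receives (Input, CarFrame and simulate() are placeholders, not anything from the thread):

#include <cstdint>
#include <deque>

struct Input    { uint32_t timeMs; float steer, throttle; };
struct CarFrame { uint32_t timeMs; float x, y, vx, vy, heading; };

// Hypothetical physics step: advance a state by dt seconds under the given input.
CarFrame simulate(CarFrame s, const Input& in, float dt);

struct InputHistory {
    std::deque<Input> inputs;   // roughly the last second of local input

    // When an authoritative state arrives for an old timestamp, drop the inputs
    // it already covers and replay the rest on top of it to reach the present.
    CarFrame reconcile(CarFrame serverState) {
        while (!inputs.empty() && inputs.front().timeMs <= serverState.timeMs)
            inputs.pop_front();
        CarFrame s = serverState;
        for (const Input& in : inputs) {
            float dt = (in.timeMs - s.timeMs) / 1000.0f;
            s = simulate(s, in, dt);
            s.timeMs = in.timeMs;
        }
        return s;
    }
};
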
Quote:Original post by hplus0603
In my experience, a well-connected co-lo (close to the backbone) is significantly lower latency than DSL or cable circuits.
I don't have much information to back up my statement, but I've never seen a significant difference in the past (at least not for 'our' server, hosted at a nearby ISP). In any event, judging from a 2-minute test I just performed on my ICQ list and a few public servers, I tend to have a 30 ms roundtrip pretty much regardless of who I ping *within* the country (for the record, I'm on a low-end DSL connection).
Quote:Original post by hplus0603
Anyway, yes, if this game does not use hosted servers, then the latency problem becomes worse. The problem with P2P is that your bandwidth starts being significantly compromised (especially since lots of DSL still only has 128 kbps upstream), and the task of deciding what packets to trust, and how to get a player back in sync, becomes even trickier.
I seriously doubt that anyone would waste their money on setting up a common server for a racing game, and even if they did I'd be truly surprised if we got one over here. Additionally, it doesn't strike me as the kind of game where you'd get many peer-run dedicated servers.
But I agree that a full P2P system is a mess to develop. However, my suggestion was that the OP simply broadcast the input from all players as an optimization to improve the quality of the prediction, instead of actually trusting anything. Verifying whether a player is 'cheating' (by sending different input to different players) would be as simple as comparing what you get from the client to what the server forwards, and ignoring the peer connection if there's a mismatch (roughly as in the sketch below).
And, unlike traditional P2P systems, you don't need to communicate with all players that way. So a basic optimization to conserve bandwidth would be to only use it for the nearest couple of players. Thus even at 40 bytes per packet (you only need to send the input, after all) for 8 players at 25 Hz you only get 64 kb/s, safely below the capability of a 128 kb/s connection.
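
A sketch of that verification idea: use the peer packet immediately for prediction, then compare it byte-for-byte against the copy the server forwards and stop trusting the peer link on a mismatch (the packet layout and names are invented for illustration):

#include <cstdint>
#include <cstring>
#include <map>
#include <utility>
#include <vector>

struct InputPacket {
    uint32_t playerId;
    uint32_t frame;
    uint8_t  payload[32];   // steering/throttle etc., ~40 bytes on the wire
};

class PeerVerifier {
    std::map<std::pair<uint32_t, uint32_t>, InputPacket> pending;   // (player, frame)
    std::vector<uint32_t> distrusted;
public:
    void onPeerPacket(const InputPacket& p) {
        pending[{p.playerId, p.frame}] = p;   // feed this to the predictor right away
    }
    void onServerPacket(const InputPacket& s) {
        auto it = pending.find({s.playerId, s.frame});
        if (it != pending.end() &&
            std::memcmp(it->second.payload, s.payload, sizeof s.payload) != 0)
            distrusted.push_back(s.playerId);   // mismatch: ignore that peer from now on
        pending.erase({s.playerId, s.frame});
    }
};
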
Quote:Original post by hplus0603
Plus the normal problems with P2P such as NAT gateways, firewalls, etc.
I'll admit that this is a serious problem. Judging from the (again, not so scientific) tests we conducted on our NAT punch-through, I'd say that it works in around 75% of cases. But hopefully you'll be smart enough to let someone else's library take care of the messy details anyway.
Quote:Original post by hplus0603
If you're using only a little bit of bandwidth, and you're willing to go through with it, it could probably be done. However, you still need to be prepared to handle multiple frames worth of latency, so from a user experience point of view, it'd be a smaller optimization. I'm wondering whether it would be worth the effort.
Well, it certainly won't give you half the latency for all connections in practice (hence the 'in theory' in my original post), but it would definitely help for a reflex-based game with lag issues. And with the right network library it should be relatively easy to implement.

This topic is closed to new replies.
