I've just implemented some new networking code in my multiplayer RTS/action game 'Gang War' ( homepage ). I made a video of a quick test session I had earlier today, and comments would be appreciated. I use Newton for the physics and RakNet for the networking; I wrote the rest of the engine from scratch, using Direct3D for rendering and OpenAL for sound.
This footage is raw, sorry about that [grin].
Here is the download link (24MB) - GangWar_NetPhysics.mpg
Here is a preview image -
The game now supports full multiplayer physics: vehicle->vehicle collisions, object physics, actor physics, etc. Each city consists of ~500 vehicles, ~800 actors, and thousands of streetside objects, so obviously there were some issues that had to be overcome to keep everything running smoothly.
Before this code was implemented, I was doing some things client side for simplicity's sake; now everything is done server side in multiplayer. Only the client's input/keypresses are sent across the network, and the server sends back only the bare minimum needed to render the scene.
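To illustrate the asymmetry, here is a rough sketch of what the two packet directions could look like. The struct layouts, field names, and sizes are illustrative assumptions on my part, not the actual Gang War wire format:

```cpp
#include <cstdint>

// Hypothetical packet layouts: clients upload only input changes,
// while the server returns only what is needed to render.
#pragma pack(push, 1)
struct ClientInputPacket {        // client -> server, sent on input change
    uint8_t  packetId;            // e.g. a RakNet user packet id
    uint32_t clientFrame;         // frame the input was sampled on
    uint8_t  keyBits;             // bitmask: accelerate/brake/handbrake/fire...
    int8_t   steer;               // analog steering, -127..127
};

struct VehicleStatePacket {       // server -> client, ~20 times a second
    uint8_t  packetId;
    uint16_t vehicleId;
    float    pos[3];              // world position
    float    quat[4];             // orientation as a quaternion
    float    vel[3];              // linear velocity, for extrapolation
};
#pragma pack(pop)
```

At 20 snapshots a second per visible vehicle, keeping the per-entity payload this small is what makes the low bandwidth numbers in the video plausible.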
The obvious downside, the delay between user input and the visible result, can be combated by doing client side input prediction. In this video, and for the time being, all client side input prediction (not interpolation) is disabled.
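For anyone unfamiliar with the technique, here is a minimal sketch of how client side input prediction works: the client remembers every input it has sent, and when an authoritative state for frame N arrives, it snaps to that state and re-simulates the still-unacknowledged inputs on top of it. The names and the toy one-dimensional "physics" are illustrative only:

```cpp
#include <cstdint>
#include <deque>

struct Input { uint32_t frame; float accel; };
struct State { uint32_t frame; float pos;   };

struct Predictor {
    std::deque<Input> pending;        // inputs the server hasn't acked yet
    State predicted{0, 0.0f};

    // Stand-in for one physics tick; the real game would run Newton here.
    static State step(State s, const Input& in) {
        s.pos  += in.accel;
        s.frame = in.frame;
        return s;
    }

    void localInput(const Input& in) {        // apply immediately, remember it
        pending.push_back(in);
        predicted = step(predicted, in);
    }

    void serverState(const State& authoritative) {
        // Drop inputs the server has already consumed...
        while (!pending.empty() && pending.front().frame <= authoritative.frame)
            pending.pop_front();
        // ...then replay the rest on top of the authoritative state.
        predicted = authoritative;
        for (const Input& in : pending)
            predicted = step(predicted, in);
    }
};
```

With this in place the local player sees their input take effect immediately, and server corrections only show up as (usually small) snaps.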
In this particular test I have a ~50ms ping to the server, and there are only 2 clients connected (out of a possible 6). Ignore the visuals/sound/GUI for the moment, and focus on the netgraph and the vehicle physics.
Another interesting point to keep in mind when watching the video: currently I don't send ANY wheel information from server->client. The vehicle's wheel positions are inferred client side by a raycast and a few calculations, and the 'wheel roll' value is determined client side from the vehicle's longitudinal acceleration. Server side (and offline), though, each wheel is represented as a true rigid body, connected to the chassis by a set of springs.
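The roll part of that can be sketched in a few lines. Note this is an assumed implementation using the standard rolling-without-slipping relation (angular velocity = longitudinal speed / wheel radius), integrated each frame; the actual game derives roll from longitudinal acceleration, but the shape of the code is the same:

```cpp
#include <cmath>

// Advance a wheel's visual roll angle client side, with no wheel data
// on the wire. All parameter names are illustrative.
float advanceWheelRoll(float rollAngle,         // current roll, radians
                       float longitudinalSpeed, // m/s along the chassis forward axis
                       float wheelRadius,       // metres
                       float dt)                // frame time, seconds
{
    rollAngle += (longitudinalSpeed / wheelRadius) * dt;
    // Wrap into [0, 2*pi) so the angle never grows without bound.
    rollAngle = std::fmod(rollAngle, 6.2831853f);
    if (rollAngle < 0.0f) rollAngle += 6.2831853f;
    return rollAngle;
}
```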
I've also implemented a netgraph, which is used to debug the network. Each client frame, all received packets are added to a list and rendered as a single vertical line. Packets are color coded: vehicle data is red, actor data is green, and uploaded input/misc. packets are white. The last 150 frames are stored in the netgraph.
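The bookkeeping behind something like this is just a small ring buffer of per-frame packet lists. The 150-frame window and the three colors match what's described above; the class and method names are assumptions for illustration:

```cpp
#include <array>
#include <cstdint>
#include <vector>

enum class PacketColor : uint8_t { Vehicle /*red*/, Actor /*green*/, Misc /*white*/ };

class NetGraph {
public:
    static constexpr int kFrames = 150;   // history window, as in the post

    void beginFrame() {                   // call once per client frame
        head_ = (head_ + 1) % kFrames;
        columns_[head_].clear();
    }
    void addPacket(PacketColor c) {       // call for every received packet
        columns_[head_].push_back(c);
    }
    // Column i, counting back from the newest frame (0 = current frame).
    const std::vector<PacketColor>& column(int framesAgo) const {
        return columns_[(head_ - framesAgo % kFrames + kFrames) % kFrames];
    }
private:
    std::array<std::vector<PacketColor>, kFrames> columns_{};
    int head_ = 0;
};
```

Rendering is then just one vertical line per column, one colored segment per entry.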
Here is what the netgraph looks like -
Some notes about the video -
- Bandwidth usage in the video ranged from 1.0KB/sec to 3.0KB/sec downstream, and <= 0.7KB/sec up, which should be good enough for 56k users. In some situations though, large gang battles with 50 members in each gang for instance, a 56k connection would get saturated. As a result I think I'm going to require a broadband connection for the game.
- Game was limited to 25 FPS by the video recording process.
- The delta value for positions/rotations was lowered to 0.00001 for this video, which results in more bandwidth usage but better results. Raising the value to 0.001 cuts the bandwidth in half, but makes things look more jittery. A delta value is simply the amount an element must change before it will be sent to the client again.
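The delta check itself is tiny; a sketch of the idea (the two thresholds are the ones quoted above, the function is an assumed implementation):

```cpp
#include <cmath>

// Resend an element only once it has drifted more than `delta`
// from the value the client last received.
bool shouldResend(float lastSentValue, float currentValue, float delta) {
    return std::fabs(currentValue - lastSentValue) > delta;
}
```

Run per component (or on a distance/angle metric) every server tick, this is what turns a constant 20Hz stream into the sparse, mostly-idle netgraph shown in the video.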
Some more notes about the multiplayer physics -
- All physics code is executed server side @ 150FPS using Newton.
- RakNet is used for the networking library.
- Client sends input changes to server 30 times a second.
- Server sends back image of game state ~20 times a second.
- 3x3 Newton rotation matrices are converted to quaternions before being sent across the network.
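That conversion drops nine floats down to four (or fewer, once quantized). A sketch of the standard matrix-to-quaternion branching, assuming a row-major m[row][col] rotation matrix; this is the textbook Shepperd-style conversion, not Gang War's actual code:

```cpp
#include <cmath>

struct Quat { float w, x, y, z; };

// Convert a 3x3 rotation matrix to a unit quaternion, branching on the
// largest diagonal term for numerical stability.
Quat matrixToQuat(const float m[3][3]) {
    Quat q;
    float trace = m[0][0] + m[1][1] + m[2][2];
    if (trace > 0.0f) {
        float s = std::sqrt(trace + 1.0f) * 2.0f;                       // s = 4*w
        q.w = 0.25f * s;
        q.x = (m[2][1] - m[1][2]) / s;
        q.y = (m[0][2] - m[2][0]) / s;
        q.z = (m[1][0] - m[0][1]) / s;
    } else if (m[0][0] > m[1][1] && m[0][0] > m[2][2]) {
        float s = std::sqrt(1.0f + m[0][0] - m[1][1] - m[2][2]) * 2.0f; // s = 4*x
        q.w = (m[2][1] - m[1][2]) / s;
        q.x = 0.25f * s;
        q.y = (m[0][1] + m[1][0]) / s;
        q.z = (m[0][2] + m[2][0]) / s;
    } else if (m[1][1] > m[2][2]) {
        float s = std::sqrt(1.0f + m[1][1] - m[0][0] - m[2][2]) * 2.0f; // s = 4*y
        q.w = (m[0][2] - m[2][0]) / s;
        q.x = (m[0][1] + m[1][0]) / s;
        q.y = 0.25f * s;
        q.z = (m[1][2] + m[2][1]) / s;
    } else {
        float s = std::sqrt(1.0f + m[2][2] - m[0][0] - m[1][1]) * 2.0f; // s = 4*z
        q.w = (m[1][0] - m[0][1]) / s;
        q.x = (m[0][2] + m[2][0]) / s;
        q.y = (m[1][2] + m[2][1]) / s;
        q.z = 0.25f * s;
    }
    return q;
}
```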
What's next on the chopping block?
- Stress testing (both client #s and active object #s)
- Lag compensation ( > 150ms pings)
- Client side input prediction ( > 150ms pings)
- Networked streetside object physics
- Sync some things, like engine sound / RPM, particle effects, and collision sounds.
Anyways, comments on all this would be appreciated.