virtual worlds... are they CPU or bandwidth bound?

5 comments, last by hplus0603 15 years, 6 months ago
Quick and simple question... in commercial, distributed, high-volume games, are the servers typically CPU bound or bandwidth bound? Are you distributing your servers because they are eating up too many CPU cycles, or because they can't put out enough bandwidth? Obviously I'm sure it's both, but in general one is surely worse than the other...

NOTE: this may be a stupid question, as I'm not sure how bandwidth limitations work in data centers... (which is related to my next question...)

Also... I'm working on a 2D virtual world with very simple physics. In my case, running from my desktop computer on my cable connection, I will sure as hell run out of network bandwidth way before I run out of CPU speed. How would this scale if I had dedicated hosting, though? About how much bandwidth per machine do you typically get? Obviously every data center has different hardware... but ballpark... Thanks!
FTA, my 2D futuristic action MMORPG
For servers, there is no such thing as bandwidth bound. You can always just pull a fatter pipe. For game designs, it's important to not require more bandwidth than the client can sustain, though.
If the server does physical simulation and/or collision detection, plus the n-squared work of figuring out which of the other players each player should/would see, then those parts will tend to dominate.
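As a minimal sketch of that n-squared pass (the distance-based visibility rule, VIEW_RADIUS, and all names here are illustrative, not from any particular engine):

    // Brute-force interest management: for each player, find every other
    // player within view range. O(N^2) in the player count.
    #include <cstddef>
    #include <vector>

    struct Player {
        float x, y;                   // 2D position
        std::vector<int> visible;     // indices of players this one can see
    };

    const float VIEW_RADIUS = 50.0f;  // assumed "can see" cutoff

    void updateVisibility(std::vector<Player>& players) {
        for (std::size_t i = 0; i < players.size(); ++i) {
            players[i].visible.clear();
            for (std::size_t j = 0; j < players.size(); ++j) {
                if (i == j) continue;
                float dx = players[i].x - players[j].x;
                float dy = players[i].y - players[j].y;
                // Compare squared distances to avoid a sqrt per pair
                if (dx * dx + dy * dy <= VIEW_RADIUS * VIEW_RADIUS)
                    players[i].visible.push_back((int)j);
            }
        }
    }

Spatial partitioning (grids, quadtrees) is the usual way to cut this down once the player count gets large, but the brute-force version shows why it tends to dominate server CPU.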
Regarding your data center question: you can get as much bandwidth as you want to pay for. Typically, at high volumes you'll pay $10 per megabit per second per month, plus power and cooling for the server hardware, plus perhaps some floor space for the cage. When you're just worrying about one box or two, though, there are other costs that dominate.
A typical self-managed server plan might be $129 per month and include something like an Athlon X2 CPU and 2000 GB/month of transfer. If you divide that out over 8 hours of active playing time per day, you get about 8 GB/hour, which is roughly 20 megabits per second. (To actually sustain that, you need to upgrade from the standard 10 Mbit port to a 100 Mbit port, for an additional $10/month.)
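To check that arithmetic (the 2000 GB/month figure is the plan's; the 8 hours/day duty cycle is the assumption above):

    // Back-of-the-envelope check of the transfer math above.
    #include <cstdio>

    int main() {
        double gb_per_month = 2000.0;
        double active_hours = 8.0 * 30.0;            // 240 active hours/month
        double gb_per_hour  = gb_per_month / active_hours;          // ~8.3 GB/hour
        double mbit_per_sec = gb_per_hour * 8.0 * 1000.0 / 3600.0;  // ~18.5 Mbit/s
        printf("%.1f GB/hour = %.1f Mbit/s sustained\n", gb_per_hour, mbit_per_sec);
        return 0;
    }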
The particular plan I'm looking at is the lowest cost mid-tier plan from ServerBeach.com; other providers are in the same category.
enum Bool { True, False, FileNotFound };
I had a feeling that's how bandwidth worked in a data center in a commercial environment...

What about RAM vs CPU though. Are you distributing your app because you run out of memory or because of CPU?

I'm guessing adding a 1 GB stick of RAM to an existing PC node is cheaper than adding a completely new node (which I'm guessing you do before you add new processors; otherwise, what would be the point of a distributed environment?). Speaking of which, are all your machines single-CPU?

Also, where is your break point for sending more data downstream to the client? e.g. what is the 'minimum' and 'recommended' bandwidth for an end user to expect a good game experience?

Thanks again... BTW good to see you again hplus, it's been a while [grin]. I'm still waiting for you to write a book that explains all the different networking models for all different game types (or at least one for virtual worlds in great detail)!!!
FTA, my 2D futuristic action MMORPG
Quote:Are you distributing your app because you run out of memory or because of CPU?


For us, it's because of CPU. With 64-bit processors being standard these days, and RAM being so cheap, running out of RAM just isn't the problem. Think about it: if a CPU has 10 GB/s memory throughput, and you have 8 GB of RAM, it'll take you almost a second just to touch all of that! Your working set can't be bigger than, say, 300 MB if you want to run at 30 fps (and ideally it should be smaller still by a factor of 10).
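Spelled out, with the 10 GB/s and 30 fps figures from the paragraph above:

    // Worked version of the memory-throughput argument above.
    #include <cstdio>

    int main() {
        double bandwidth_gb_s = 10.0;   // assumed memory throughput
        double ram_gb         = 8.0;
        double fps            = 30.0;

        // Time to stream through all of RAM once:
        printf("touch all RAM: %.2f s\n", ram_gb / bandwidth_gb_s);   // 0.80 s

        // Data you can touch per frame at 30 fps:
        double mb_per_frame = bandwidth_gb_s * 1000.0 / fps;          // ~333 MB
        printf("per-frame budget: %.0f MB\n", mb_per_frame);
        return 0;
    }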

There may be systems with different limitations -- I would imagine that an Instant Messaging server, serving 10,000 users or more, might have a different trade-off. In fact, I would consider multi-homing a single box with multiple IP addresses to avoid running out of port numbers for TCP connections in that case...
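As a rough sketch of what that multi-homing could look like at the socket level (plain POSIX sockets; the addresses, port, and function name are placeholders, and error handling is trimmed), you would bind one listening socket per local IP:

    // Sketch: one listening socket per local IP on a multi-homed box.
    // Each extra local IP multiplies the distinct (local IP, local port,
    // remote IP, remote port) tuples available for TCP connections.
    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <string.h>
    #include <sys/socket.h>
    #include <unistd.h>
    #include <vector>

    std::vector<int> listenOnAll(const std::vector<const char*>& localIps, int port) {
        std::vector<int> fds;
        for (const char* ip : localIps) {
            int fd = socket(AF_INET, SOCK_STREAM, 0);
            sockaddr_in addr;
            memset(&addr, 0, sizeof(addr));
            addr.sin_family = AF_INET;
            addr.sin_port = htons(port);
            inet_pton(AF_INET, ip, &addr.sin_addr);  // bind to this specific IP
            bind(fd, (sockaddr*)&addr, sizeof(addr));
            listen(fd, SOMAXCONN);
            fds.push_back(fd);
        }
        return fds;  // poll/select across all of them in the accept loop
    }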
enum Bool { True, False, FileNotFound };
Hi,

I'm also very interested in graveyard's question:

Quote:
Also, where is your break point for sending more data downstream to the client? e.g. what is the 'minimum' and 'recommended' bandwidth for an end user to expect a good game experience?


I am also interested in the acceptable upstream bandwidth for an average client. I was wondering how useful it is to allocate server work to clients (safety and cheating aside). Would it be beneficial to trade bandwidth for CPU cycles by using the clients as a distributed server? If CPU cycles are the overall bottleneck for the server, it seems a reasonable trade.

Dietger
Quote:Original post by dietepiet
I was wondering how useful it is to allocate server work to clients (safety and cheating aside).

Given that safety and cheating are paramount, the answer would be "Not useful". Instead, you do something else to gain more CPU time, should you need it.
The bandwidth requirements for game data range from about 1 kbps per user on the low end (synchronous, modem-targeted systems) to 1,500 kbps per user on the high end (geometry-streaming protocols like Second Life).

A multiplayer game on Xbox Live is supposed to work well with 64 kbps up and downstream networking to achieve certification, which means 8 kbps per user in an 8-player match. I think that's a pretty good goal to shoot for until you have a better handle on what, specifically, your game will require.
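To turn that 8 kbps budget into packet sizes, here's a rough sketch (the 10 packets/second update rate and UDP/IPv4 framing are my assumptions, not from the post):

    // Rough per-user packet budget at the 8 kbps target above.
    #include <cstdio>

    int main() {
        double kbps            = 8.0;                  // per-user budget
        double bytes_per_sec   = kbps * 1000.0 / 8.0;  // 1000 bytes/s
        int    packets_per_sec = 10;                   // assumed update rate
        int    header_bytes    = 28;                   // UDP (8) + IPv4 (20)

        double per_packet = bytes_per_sec / packets_per_sec;  // 100 bytes
        printf("payload budget: %.0f bytes/packet\n",
               per_packet - header_bytes);                    // ~72 bytes
        return 0;
    }

At that budget, every state update has to fit in about 72 bytes of payload, which is why compact, delta-encoded messages matter so much at this scale.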
enum Bool { True, False, FileNotFound };
