Real-time game servers on a VM on your dedicated server.

Hi.

Some time ago I had an issue with my FPS having constant lag spikes when played online. With hplus's help I figured out it was because I was hosting the server from a rented VM. The lag spikes went away when I moved the server to a rented dedicated server (because of "time slicing", or whatever it was called).

Now it's still happily running on my dedicated server along with a few other servers. But with all my tinkering my CentOS installation is becoming messy, so I was thinking of creating a couple of VMs on my dedicated server, since VMs are easy to back up and I can't break my entire server and take everything offline (including my website) in one go. That has happened recently :D. However, this of course takes me back to my original problem...

My question is: before I start setting up VMs (which will probably take me a while, like everything else I attempt in Linux), is there a way for me to make a VM run in "real time" so it can host real-time game servers just as well as I'm currently hosting them? If there isn't, then I probably won't bother with the idea of VMs.

Thnx in Advance!
A good VM solution should let you dedicate one or more CPU cores entirely to the important VM(s), and that should prevent most issues. (The big problem with cheap virtual hosts is that they squeeze multiple VMs onto each CPU core on the assumption that they won't all do heavy work at the same time. A single VM can then fully occupy multiple cores for short periods to get through a piece of work quickly, which is great when you primarily host, for example, webservers, but really awful if you also host things that need the CPU all the time, such as a game server.)

In your case I'd recommend dedicating a few cores to the important VMs (the game servers). You could possibly run multiple game servers per core, but things like the webserver VM cannot be allowed to borrow the game server cores. (If the webserver gets hit by heavy traffic it should slow down; it shouldn't steal from the game servers.)
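To illustrate the core-dedication idea, here's a minimal Python sketch of Linux CPU affinity (assuming Linux and Python 3.3+; the core numbers are made up). With a hypervisor such as KVM/libvirt you'd apply the same idea to the guest's vCPUs instead (e.g. with virsh vcpupin), so treat this as the concept rather than the exact mechanism.

import os

# Pin the current process to cores 2 and 3 (illustrative core numbers) so the
# Linux scheduler never migrates it elsewhere. A hypervisor applies the same
# idea to a guest's vCPUs (e.g. "virsh vcpupin <domain> <vcpu> <host-core>"
# with KVM/libvirt).
os.sched_setaffinity(0, {2, 3})   # 0 = this process

print("now restricted to cores:", os.sched_getaffinity(0))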
I don't suffer from insanity, I'm enjoying every minute of it.
The voices in my head may not be real, but they have some good ideas!
Thnx! That's ideal.
Note that, even with dedicated cores and RAM, the VM will have a higher scheduling jitter than "bare metal." If your game is action oriented, you may have to run the actual game server on bare metal, and perhaps you can move the other servers to VMs?
If the needs are only "soft action," and/or you can compensate for jitter with increased lag compensation, then it's possible to run the game server, too, as a VM, with dedicated resources, but it just isn't as optimal as bare metal according to what I've measured.
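For what it's worth, this jitter is easy to observe yourself. Below is a minimal Python sketch (the 20 ms tick and sample count are just illustrative numbers): it sleeps for a fixed tick and records how late each wakeup is. Run it on bare metal and inside the VM and compare the worst-case numbers.

import time

TICK = 0.020       # 20 ms target tick, roughly what a game loop might use
SAMPLES = 500

worst = 0.0
total = 0.0
for _ in range(SAMPLES):
    start = time.perf_counter()
    time.sleep(TICK)
    late = (time.perf_counter() - start) - TICK   # how far past the target we woke up
    worst = max(worst, late)
    total += late

print(f"avg oversleep: {total / SAMPLES * 1000:.3f} ms, worst: {worst * 1000:.3f} ms")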
enum Bool { True, False, FileNotFound };

Note that, even with dedicated cores and RAM, the VM will have a higher scheduling jitter than "bare metal." If your game is action oriented, you may have to run the actual game server on bare metal, and perhaps you can move the other servers to VMs?
If the needs are only "soft action," and/or you can compensate for jitter with increased lag compensation, then it's possible to run the game server, too, as a VM, with dedicated resources, but it just isn't as optimal as bare metal according to what I've measured.


This.

Anything requiring decent CPU resources from a server is going to suffer badly on a VM. Hypervisors just aren't quite as great as we'd all like them to be. Scheduling can jitter very badly, and additions of CPU features like hyperthreading can make VM performance even more flaky. If you really are CPU heavy on a game server, run bare metal.

Support/auxiliary servers (such as login, rankings, web pages, etc.) can all handle VM jitter pretty effectively, so they'd be my first candidate for offloading.

Also, a good host should have a way to image your dedicated server directly, so you might look into that.

Wielder of the Sacred Wands
[Work - ArenaNet] [Epoch Language] [Scribblings]

Though choosing the "right" virtualization technique (what's right, and what's wrong?) should let you avoid this scheduling jitter.

OpenVZ and more recently LXC do virtualization simply by using chroot and by putting processes in each "virtual machine" into a separate namespace. It is not "real" virtualization, if you want to look at it that way, but in my opinion, this is virtualization how it should be.

All processes on the machine run from one scheduler, they all pull their memory pages from one pool, they're all running just like on one machine. Except... they're isolated by their namespaces and jails, and you can give them quotas (if you want). No emulation crap, no obscure hacks, no overhead.
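As a concrete illustration of the quota side, here's a sketch of the cgroup v1 cpuset interface that container tools of this kind drive under the hood (assumptions: cgroup v1 mounted at /sys/fs/cgroup, run as root; the "gameserver" group name and core list are made up).

import os

CG = "/sys/fs/cgroup/cpuset/gameserver"   # made-up group name

os.makedirs(CG, exist_ok=True)
with open(os.path.join(CG, "cpuset.cpus"), "w") as f:
    f.write("2-3")              # this group may only run on cores 2 and 3
with open(os.path.join(CG, "cpuset.mems"), "w") as f:
    f.write("0")                # and only allocate from memory node 0
with open(os.path.join(CG, "tasks"), "w") as f:
    f.write(str(os.getpid()))   # confine this process (and its future children)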

Of course, a malicious process subverting (or even crashing) the kernel will affect all virtual machines, but who cares. When it comes to that, you're kind of lost anyway.
Like others have said, running on the host OS will give you the highest performance and the thinnest margin for latency and throughput variance. But the biggest problem with your initial VM experience was probably that your VM provider was over-subscribing their host machines (say, putting 20 VMs on hardware that can only *really* support 12) on the assumption that not everyone is going to be doing heavy lifting at once. The other issue, also mentioned, is that VMs are usually optimized for "bursty" workloads -- ones that go through brief periods of intense work rather than a constant load of moderate to heavy work. Often they only guarantee you the equivalent of a few hundred megahertz, but allow you to burst to several times that when resources are available, to make up for it. It's a perfectly acceptable strategy for latency-tolerant activities, but bad for things that need a latency guarantee.
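That "small guaranteed slice plus bursting" behaviour is typically enforced with scheduler weights. As a sketch (assuming the cgroup v1 cpu controller, run as root; the group name and numbers are made up): a host can give a cheap guest a small cpu.shares weight, so under contention it only gets its proportional slice, but it may still soak up idle CPU, which is exactly the bursty pattern described above.

import os

CG = "/sys/fs/cgroup/cpu/cheap_guest"     # made-up group name

os.makedirs(CG, exist_ok=True)
with open(os.path.join(CG, "cpu.shares"), "w") as f:
    f.write("256")              # a quarter of the default weight (1024) under contention
with open(os.path.join(CG, "tasks"), "w") as f:
    f.write(str(os.getpid()))   # everything in this group shares that weight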

I don't think it would be a wasted effort to experiment with hosting the game on a VM on a platform that you have direct control over; it might work out better, and if the management benefits are worthwhile, it might be worth taking a small performance hit. I'd imagine there must be a way to do it, because there are hosts that are optimized for gaming, and I can't imagine that's a manageable/profitable business without virtualization.

For what it's worth, most bigger games separate out a lot of the sub-systems involved in getting into an online game -- login, match-making, load-balancing, purchases, and persistent storage are the typical candidates. These all serve as the front and back end to instances of the game server running on real hardware. Note also that you can run several *game instances* on one physical machine; just make sure to insulate them from one another so that if one goes down, they don't all go down (assuming the one that goes down doesn't take out the kernel itself).
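Here's a minimal Python sketch of that "several instances, insulated from one another" setup (the ./gameserver command, ports, and core assignments are placeholders, not a real server): a small supervisor launches each instance as its own process, pins it to its own core, and restarts only the instance that died.

import os
import subprocess
import time

INSTANCES = {            # instance name -> (core, command); placeholders only
    "game-1": (2, ["./gameserver", "--port", "7777"]),
    "game-2": (3, ["./gameserver", "--port", "7778"]),
}

def launch(name):
    core, cmd = INSTANCES[name]
    # preexec_fn runs in the child before exec, so the affinity applies
    # to the game server process itself, not the supervisor.
    return subprocess.Popen(cmd, preexec_fn=lambda: os.sched_setaffinity(0, {core}))

procs = {name: launch(name) for name in INSTANCES}

while True:
    time.sleep(5)
    for name, proc in procs.items():
        if proc.poll() is not None:                  # this instance died...
            print(f"{name} exited with {proc.returncode}, restarting")
            procs[name] = launch(name)               # ...restart only this one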

throw table_exception("(? ???)? ? ???");

Thnx for all the replies. I'm going to start by moving the web + email server to its own VM, and I'll create a "sandbox" VM for messing around with. I'll keep my game servers + database on the bare metal for the time being. I'm not 100% sure about the database, since it's used by both the game servers and the webserver, but my game servers never "need" real-time database access. If there's a delay in database requests it would have no impact on the players, so maybe I can move the DB to the webserver + email VM as well. Although at this point it probably gets a little case specific and I'll have to experiment :D

Thnx for all the help!

