Cloud Gaming

I was thinking about cloud gaming (OnLive, ...) and wanted to know how it is really done. I am assuming the following about it:

- The client is a dumb rendering machine that sends keystrokes to the server.
- The server feeds the keystrokes as emulated keyboard input to a game instance and gets back a frame buffer; the buffer is compressed and sent to the client, which decompresses and renders it.

The problems with the scenario above:

- Sending key input and waiting for the result introduces a delay.
- The server runs one game instance per player, a 1-to-1 ratio, which is not practical (and is expensive).
- Compressing the frame buffer (at least 24 times a second) adds a delay.

Also, according to the above, to show decent 24 fps video the server has to actually send the frame buffer 24 times every second. At a resolution of 800x600, and assuming one byte per pixel, that is 24 * 800 * 600 bytes, or around 11 MB of data every second. Unless OnLive and the others have figured out a way to compress a video stream and keep the quality, I do not think it is playable, not even if every household had a 100 Mbps connection. And that is only the video; there is also audio. I've always thought video/audio compression is lossy, i.e. it will be hard to deliver a 720p experience, but OnLive is promoting it as feasible on a 5 Mbps line, which is around 640 KB/s at its peak.

I am reading some articles about the subject:
http://www.techradar.com/blogs/article/cloud-gaming-is-broken-and-unplayable-626487
http://news.softpedia.com/news/Crytek-Attempted-Cloud-Gaming-Way-Before-OnLive-110232.shtml
http://www.shacknews.com/featuredarticle.x?id=1090

I do not see it working even on a LAN, or am I missing some key info?
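A quick back-of-the-envelope sketch of those numbers (the resolution, the one-byte-per-pixel simplification, the frame rate and the 5 Mbps line are just the assumptions from the post above):

```python
# Rough bandwidth estimate for streaming raw frames vs. a 5 Mbps line.
# All figures are the assumptions from the post above, not OnLive specifics.

WIDTH, HEIGHT = 800, 600      # assumed resolution
BYTES_PER_PIXEL = 1           # the post's one-byte-per-pixel simplification
FPS = 24                      # assumed frame rate
LINE_MBPS = 5                 # OnLive's advertised minimum line speed

raw_bytes_per_second = WIDTH * HEIGHT * BYTES_PER_PIXEL * FPS
line_bytes_per_second = LINE_MBPS * 1_000_000 / 8

print(f"raw stream:  {raw_bytes_per_second / 1e6:.1f} MB/s")    # ~11.5 MB/s
print(f"5 Mbps line: {line_bytes_per_second / 1e6:.3f} MB/s")   # ~0.625 MB/s
print(f"compression needed: ~{raw_bytes_per_second / line_bytes_per_second:.0f}x")
```

Even with the optimistic one-byte-per-pixel assumption, the raw stream needs roughly 18x compression to fit the advertised line; at 24-bit color it would be closer to 55x, which is the kind of ratio lossy video codecs are built to deliver.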
We'll see once the services go live.

Raw numbers are far from a problem; the only question is latency. And that can only be tested subjectively in practice.

Quote:It will be hard to deliver a 720p experience, but OnLive is promoting it as feasible on a 5 Mbps line, which is around 640 KB/s at its peak.


As far as video goes, HD-like quality can realistically be streamed; you can test it with a Silverlight example (up to 3 Mbit/second) or a random YouTube example (around 1 Mbit/second).

Yes, the above examples are right, but I was talking about latency: the video sent over a cloud gaming service has to be interactive, since the client's input determines the content of the next part of the stream.
Dedicated hardware can do realtime compression on the order of 1 ms per frame; that's what they are claiming anyway, and it seems feasible. You can buy a USB device that compresses video streams at 30 fps, so it's not an impossible technology.

As for latency, most people will accept about 100-200 ms of latency as imperceptible in most games, which gives them a working window well within technological limits.

Sending key input: ~1 ms delay at most (the client isn't doing anything but waiting for input, so it's not limited to a normal update loop; it can send the input as soon as it is generated)

Key travels to server: ~50 ms (assuming a 100 ms ping between server and client, for instance)

Frame is processed on the server: ~33 ms (assuming a 30 fps update rate for the game; some games update at 60 fps, which cuts this in half)

Frame is compressed: ~1 ms (with dedicated hardware)

Frame is sent to client: ~50 ms

Frame is decompressed on the client: ~15 ms (OnLive specifically states their codec can do 60 fps decompression)

Total time from input to decompressed frame: ~150 ms, about as fast as you can blink.
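A tiny sketch that just adds up that budget (every figure is one of the rough assumptions listed above, not a measurement):

```python
# Sum an assumed input-to-frame latency budget for a streamed game.
# Every number comes from the rough estimates above, not measured data.

latency_budget_ms = {
    "client sends input":         1,   # client forwards input immediately
    "input travels to server":    50,  # half of an assumed 100 ms round trip
    "server simulates + renders": 33,  # one frame at an assumed 30 fps
    "hardware video encode":      1,   # claimed dedicated-hardware figure
    "frame travels to client":    50,  # other half of the round trip
    "client video decode":        15,  # codec claimed to decode at 60 fps
}

for step, ms in latency_budget_ms.items():
    print(f"{step:26s} ~{ms:3d} ms")
print(f"{'total input-to-frame delay':26s} ~{sum(latency_budget_ms.values()):3d} ms")  # ~150 ms
```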

Technologically this is feasible, but from a business model standpoint that has yet to be shown (and that is where their real challenge will be imo)


-ddn
I'm not sure what this thing is doing -- is the same rendered frame sent to many watchers, or is each frame going to each client significantly different (a custom view that has to be rendered separately for each)?

Depending on what goes into the picture being generated, might it be better to compose the picture on the client from sub-feeds and render directives instead of sending the entire bitmap (even if compressed)?

Also, I would think that besides single-frame compression, some kind of delta compression could be done to (possibly significantly) cut down the data size. Even if there are crunch intervals (like when full sync frames are sent), video presentation usually buffers ahead some frames, and the average savings from delta compression could still be significant.
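As a rough illustration of that idea (just a sketch, not how OnLive actually encodes anything), a naive per-frame delta plus a general-purpose compressor already shrinks mostly-static frames dramatically:

```python
# Naive delta compression between consecutive frames - a sketch only.
# Real video codecs use motion-compensated prediction rather than a plain
# XOR, but the underlying idea of "send only what changed" is the same.

import zlib

def compress_delta(prev_frame: bytes, curr_frame: bytes) -> bytes:
    """XOR the new frame against the previous one, then deflate the result.
    Unchanged pixels become zero bytes, which compress extremely well."""
    delta = bytes(a ^ b for a, b in zip(prev_frame, curr_frame))
    return zlib.compress(delta)

def apply_delta(prev_frame: bytes, payload: bytes) -> bytes:
    """Inverse: inflate the delta and XOR it back onto the previous frame."""
    delta = zlib.decompress(payload)
    return bytes(a ^ b for a, b in zip(prev_frame, delta))

# Toy 800x600 greyscale frames where only a small region changes.
frame0 = bytes(800 * 600)
frame1 = bytearray(frame0)
frame1[1000:1100] = b"\xff" * 100           # pretend 100 pixels changed
frame1 = bytes(frame1)

payload = compress_delta(frame0, frame1)
assert apply_delta(frame0, payload) == frame1
print(f"full frame: {len(frame1)} bytes, delta payload: {len(payload)} bytes")
```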
Your understanding of the client and server seems to be correct for OnLive. There are interviews that explain the concept, and their site explains it very well. The main thing is that they have a compression algorithm that is apparently quite fancy.

I signed up for the beta a long time ago. I was hoping to get in and test it from my work computer in Michigan. I get a 20 ms ping to Virginia, which is where one of their servers is located, and I have a fiber line, so I can see this being very promising.

I could see myself playing with HD video quality. I can stream HD videos online just fine at work.
Actual problems that I see, which will remain true even if the bandwidth/latency problems turn out to be truly solved:

- Server farms still have to be located very close to the clients (as far as network topology goes, this is still very closely related to spatial relationships). The result would be that everyone gets off work and wants to play games at the same time, slamming the same server farm at once, while the server farm across the country would be completely empty because everyone's still at work. This is a HUGE waste of potential server time.

- You basically need at least $500 worth of server equipment per player. The most expensive components of a gaming PC are the CPU, GPU, and RAM, and each client needs to use most of each of those components.

- Hardware becomes obsolete too quickly; you will have to buy brand-new servers every 2-4 years. This is a big reason why consoles are as popular as they are compared to PCs for the average gamer.

- You would have to license the actual software you want to make available. Game developers barely break even as it is (if you take the average of the entire industry), so I can't see why a publisher would want to provide some kind of bulk discount when the service would detract from their own retail sales.

- Subscription costs to make a profit under the previous constraints would need to be too high to attract very many customers.

Quote:Original post by Nypyren
- You basically need at least $500 worth of server equipment per player. The most expensive components of a gaming PC are the CPU, GPU, and RAM, and each client needs to use most of each of those components.


RAM is cheap; I'm not sure why you're saying it's one of the most expensive parts. My power supply cost a bit more than my 4 GB of RAM, and my motherboard was a lot more. If I had gotten the case I really wanted, that would also have been much more than the cost of the RAM, but I can see why you left that off the list ;)


Quote:Original post by Azh321
RAM is cheap; I'm not sure why you're saying it's one of the most expensive parts. My power supply cost a bit more than my 4 GB of RAM, and my motherboard was a lot more. If I had gotten the case I really wanted, that would also have been much more than the cost of the RAM, but I can see why you left that off the list ;)


Sometimes RAM is cheap. Assuming they try to do the whole "put multiple CPUs and GPUs in the same box", they will start running into something that most PC game developers have forgotten even exists: BUS CONTENTION.

Can you imagine the poor system trying to cope with 4 quad-core CPUs and 4+ high-end GPUs? You are obviously going to need some kind of ridiculously badass motherboard and RAM; otherwise the box will starve itself. Although the motherboard (and in particular the bus/memory controllers) would cost more than usual, I suppose the RAM might not need to be that much better.

If they instead decide to give each player a conventional, high-end gaming PC rather than trying to run multiple sessions on a huge beast of a server, then the PSU, motherboard, and hard drive costs start taking a higher percentage of the overall cost, RAM can be a bit cheaper, and CPU and GPU costs more or less stay the same.


As far as the other hardware stuff is concerned:

- Power supplies have to be pretty beefy, but nothing out of the ordinary compared to what people have to put in triple SLI gaming rigs.

- Large/high-performance hard drives would be on separate networked file servers instead of on each gaming host itself, so those costs would scale far better than any other bit of hardware involved.

- Most components in server farms have been optimized for less power / less heat. However, traditional server farms don't typically need ANY gaming-class GPUs. High end GPUs are by far the worst offenders for both heat and power consumption. The cloud gaming server farm's power and cooling would have to be better than usual, and better = more expensive.

Quote:Original post by ddn3
As for latency, most people will accept about 100-200 ms of latency as imperceptible in most games, which gives them a working window well within technological limits.
[...]
Total time from input to decompressed frame: ~150 ms, about as fast as you can blink.


Correct me if I am wrong, but 100-200 ms on a networked game is acceptable only if the client is using some kind of client-side prediction, and in this case there is none: the client has to remain still for 200 ms and wait for the next frame from the server.
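A sketch of that difference, using the numbers assumed earlier in the thread (the ~60 fps local frame time is my own assumption, purely for illustration):

```python
# Compare perceived input delay with and without client-side prediction.
# All numbers are the rough assumptions discussed earlier in the thread.

INPUT_SEND_MS = 1     # client forwards input immediately
HALF_RTT_MS = 50      # one-way network delay (assumed 100 ms round trip)
LOCAL_FRAME_MS = 17   # one frame of local rendering at ~60 fps (assumed)
SERVER_FRAME_MS = 33  # one server frame at an assumed 30 fps
ENCODE_MS = 1         # assumed hardware encode time
DECODE_MS = 15        # assumed client decode time

# Traditional networked game: input is applied locally right away, so the
# player sees a predicted response after roughly one local frame.
predicted_response = LOCAL_FRAME_MS

# Streamed game: nothing changes on screen until the input has crossed the
# network, the server has rendered, and the frame has come back and been decoded.
streamed_response = (INPUT_SEND_MS + HALF_RTT_MS + SERVER_FRAME_MS
                     + ENCODE_MS + HALF_RTT_MS + DECODE_MS)

print(f"with client-side prediction:   ~{predicted_response} ms")  # ~17 ms
print(f"video-streamed, no prediction: ~{streamed_response} ms")   # ~150 ms
```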
I think it is feasible only if there are localized servers for every city in the world and the city has connections of 5 Mbps or more. As mentioned, it will cost the providers a lot of money: the servers that run games at full resolution have to be amazing, and the hardware would need to be updated at least once a year.

