
Themonkster

P2P games: is this the way forward?


Hi guys, I have been following the discussion of people trying to create games that run on p2p networks and can support a large number of players. I seem to recall the main issue is trusting the client. I think I might have the answer, although it does involve a central server; this server does not do anything in the game itself. The idea is to use web services to validate the client via encrypted messages. Using web services would be a very cheap way of running a server, as most web hosts allow web services to run on them. I'm not sure exactly how the validation would work, but I thought I'd get the idea out there to see if anyone could make use of it. I'm thinking of writing an article on it for GameDev.
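
Something like this minimal Python sketch, maybe (the service URL, endpoint, and JSON response format are all hypothetical, just to show the shape of the idea):

    import json
    import urllib.request

    # Hypothetical central web service that does nothing in the game
    # itself -- it only answers "is this session ticket genuine?"
    VALIDATE_URL = "https://validate.example.com/check"

    def ticket_is_valid(ticket):
        # Before accepting game traffic from a peer, ask the central
        # web service to vouch for that peer's session ticket.
        req = urllib.request.Request(
            VALIDATE_URL,
            data=json.dumps({"ticket": ticket}).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.load(resp).get("valid", False)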

Mmm, thanks for the feedback.

How about saying something more than one word, and the reasons behind it?

[edited by - themonkster on March 19, 2004 4:48:26 AM]

If I'm reading your post right, then:
The problem isn't validating the client; that's relatively simple with any sort of public/private key encryption. The problem lies in trusting what the client is sending you.

It wouldn't be too hard to replace the data in the packets being sent; once you can do that and figure out which messages do what, you can wreak havoc online. In a p2p situation you've got no way of running simple sanity checks the way a server might. You could still check through a group decision or something similar, but it would be far more complex and resource-hungry to do so.
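
To make the distinction concrete, here's a small Python sketch (using the third-party "cryptography" package, assuming it's installed): the signature proves who sent a packet, not that its contents are honest.

    # Requires the third-party "cryptography" package.
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    player_key = Ed25519PrivateKey.generate()
    public_key = player_key.public_key()   # shared with the other peers

    # An honest packet and a cheating packet, both signed with the same
    # legitimate key. Both verify -- the signature cannot tell them apart.
    honest = b'{"action": "move", "speed": 5}'
    cheat = b'{"action": "move", "speed": 9999}'

    for packet in (honest, cheat):
        signature = player_key.sign(packet)
        public_key.verify(signature, packet)   # raises InvalidSignature if forged
        print(packet, "verified OK")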

I haven't really looked into p2p that much, so feel free to jump in and correct me.

One of the big problems, from my own perception, is validating what actually happens. There has to be a single process that decides who hit whom first, which weapons missed, and so on. For this reason, it seems best to have a single machine that detects all game events and sends them to all clients, including who takes damage from whom.

You may be able to trust what comes from a client, but from a timing standpoint (lag, lost packets, ping time), it is too easy for two machines to resolve the same close call differently.

Battlezone 1 was a p2p multiplayer deathmatch game. When a guided missile was fired, the client controlling the craft it was locked onto was put in charge of deciding whether or not the missile hit. The problem is that this type of missile can be redirected to a different target if a heat signature passes in front of it, and the deciding client might not detect that when another one does. In this way, a single missile could take out two targets, baffling all three players (the player who fired, the intended target, and the player whose craft the missile reacquired and struck). For reasons like this, it is best for a single machine, the server, to make these kinds of decisions.

As for validating what is sent, I think most network games try to send only controls to the server, and only controls that have on/off states and weapons that fire with a recharge time, so that clients can't spam the server with fire and move messages to gain an advantage over the others. The clients can approximate their own game states for a time, but the server is the authority and has to keep the clients in sync. This is what Battlezone 2 does, but it still suffers from game states that suddenly change as the server corrects the clients.
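
As a rough sketch of that kind of server-side check in Python (the cooldown value and bookkeeping are invented for illustration), the server can enforce the recharge time itself, so a client that spams fire messages gains nothing:

    import time

    WEAPON_COOLDOWN = 0.5        # seconds between shots; illustrative value
    last_fire_time = {}          # player id -> time of last accepted shot

    def accept_fire(player_id):
        # The authoritative server enforces the recharge time itself,
        # so a hacked client spamming "fire" messages gains nothing.
        now = time.monotonic()
        last = last_fire_time.get(player_id, float("-inf"))
        if now - last < WEAPON_COOLDOWN:
            return False         # weapon hasn't recharged; ignore the message
        last_fire_time[player_id] = now
        return True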

The ability to have a single judge for disputed situations becomes VERY important in sponsored tournaments. If your network paradigm can't handle the inconsistencies, your multiplayer will have a very short and sporadic lifetime.


[edited by - Waverider on March 19, 2004 2:06:08 PM]

[edited by - Waverider on March 19, 2004 2:07:49 PM]

quote:
Original post by Themonkster
Ah cool. What if (and it's a big if) the messages were verified via the web service before being sent to the other client?




Nope. As long as something is running on the client machine, it can be hacked and modified. What's to stop me from coding a fake web service that "validates" my packets and sends you correct-looking ones? All that matters in the end is what the other client receives; as long as those packets "look" right, I can send you whatever I want to send you. That's why you need a centralized server in an MMPOG setting that validates both the packets AND the actions of the players.

So in a p2p setting you would need to have every client validate every other player's actions. That's a lot of overhead, and solving that problem will be a significant challenge for p2p MMPOGs.
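
As a hedged sketch of what "every client validates everyone else" could look like in Python (the validation rule and state layout are invented for illustration):

    def action_is_legal(action, world_state):
        # Placeholder rule: a real game would check movement speed,
        # weapon cooldowns, line of sight, and so on.
        return action.get("speed", 0) <= world_state["max_speed"]

    def quorum_accepts(action, peer_states):
        # Every peer re-simulates the action against its own copy of the
        # world state and votes; the action stands only with a strict
        # majority. This is the overhead described above: one validation
        # per peer for every single action.
        votes = sum(1 for state in peer_states if action_is_legal(action, state))
        return votes > len(peer_states) / 2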

Ah yeah, but we could encrypt it, and then someone would have to crack the encryption... though I should imagine that's fairly easy with today's computers.

Maybe web services could be used in other ways, like delivering XML levels or central scoreboards, maybe even a trading game of sorts.
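
For example, a small Python sketch of the level-delivery idea (the URL and the XML schema are made up):

    import urllib.request
    import xml.etree.ElementTree as ET

    def download_level(name):
        # Fetch an XML level description from a hypothetical web service
        # and pull out the entity placements.
        url = "https://games.example.com/levels/" + name + ".xml"
        with urllib.request.urlopen(url) as resp:
            root = ET.fromstring(resp.read())
        return [
            (e.get("type"), float(e.get("x")), float(e.get("y")))
            for e in root.iter("entity")
        ]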

Maybe a p2p network could be set up as a meeting place for gamers who want to play games of a p2p nature.

Then developers of these sorts of games could publish them there.



quote:
Original post by Themonkster
Maybe web services could be used in other ways, like delivering XML levels or central scoreboards, maybe even a trading game of sorts.
Fine for any "intermittent play" game; no good for rapid-response action types.

XML may be flexible and generic, but all that genericity and flexibility come at a significant performance cost. There's a reason that relational databases typically use fixed-length fields.
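
A quick Python illustration of that cost (the field layout is invented): the same position update as XML text versus a fixed-length binary record.

    import struct

    # The same position update, encoded two ways.
    player_id, x, y, z = 17, 103.5, 42.0, 8.25

    xml_msg = '<update player="%d" x="%f" y="%f" z="%f"/>' % (player_id, x, y, z)
    bin_msg = struct.pack("<Ifff", player_id, x, y, z)  # four fixed-length fields

    print(len(xml_msg.encode()), "bytes as XML")  # dozens of bytes, plus parsing
    print(len(bin_msg), "bytes fixed-length")     # always 16 bytes, trivial to decode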

This is a bit confusing. P2P means both pay-to-play, the MMORPG model that most of them use (play for $10 a month), and peer-to-peer, but I don't know if the latter is really relevant in games.
Doom was peer-to-peer; as a result, it moved only as fast as the slowest player's PC, and after that everyone abandoned this model.

I believe you're confusing peer-to-peer with a naive implementation of lockstep mode. DOOM had the latter as well as the former.

A typical Quake server is also peer-to-peer. Yeah, it's called a "server," but that server is a "peer" on the Internet, and needs to be visible to the other "peers" (the players). Pretty much all non-pay-to-play service is peer-to-peer (as are most file-sharing services).

Or perhaps you're confusing it with (logical) topology. A Quake server is peer-to-peer in a star topology. A fully connected topology is also sometimes called "peer-to-peer" just because anyone can send to anyone else in one (logical) hop.
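
For concreteness, the connection counts those two topologies imply, in a couple of lines of Python:

    n = 16                        # players in a session
    star = n - 1                  # every client connects to the one server
    full_mesh = n * (n - 1) // 2  # every peer connects to every other peer
    print(star, full_mesh)        # 15 vs 120 connections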

Trusted clients don't work, as they can always be compromised.

An interesting concept from a few years back involved breaking a file down into a series of recursive codes, of which only a certain percentage needed to be downloaded to recompose the original file. I think they were called tornado codes. Anyway, the idea applied to a static file; but what if the file were the state of a machine running a massively distributed simulation, say an MMORPG world state? You could have literally thousands of sources to download from when recomposing the state of the world. A hacker could compromise a single machine or a set of machines, but the way the tornado codes worked, they were self-validating due to their recursive nature. Thus the eventual choice of which code was valid could be easier to determine, based upon percentages.
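
Tornado codes themselves are involved, but a toy Python sketch of the underlying erasure-coding idea (a single XOR parity chunk, so any one of three pieces can be lost and the data still rebuilt) looks like this:

    def make_chunks(a, b):
        # Two data chunks plus one XOR parity chunk. Any single chunk can
        # be lost and rebuilt from the other two -- the simplest possible
        # erasure code. Real tornado codes layer many such checks.
        parity = bytes(x ^ y for x, y in zip(a, b))
        return a, b, parity

    a, b, parity = make_chunks(b"worldsta", b"te-part2")
    # Suppose chunk "a" is lost: rebuild it from "b" and the parity chunk.
    recovered = bytes(x ^ y for x, y in zip(b, parity))
    assert recovered == b"worldsta"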

Of course, the protocols for validating the various state transformations of the world would also be an issue, as the world is dynamic and its state is constantly changing.

It's an interesting concept.

-ddn

"naive implementation" has a pretty well defined meaning in academia and software engineering circles. I''m assuming most people here know at least that much -- without a common language, communication is impossible.

quote:
Original post by hplus0603
A typical Quake server is also peer-to-peer. Yeah, it's called a "server," but that server is a "peer" on the Internet, and needs to be visible to the other "peers" (the players). Pretty much all non-pay-to-play service is peer-to-peer (as are most file-sharing services).

Or perhaps you're confusing it with (logical) topology. A Quake server is peer-to-peer in a star topology. A fully connected topology is also sometimes called "peer-to-peer" just because anyone can send to anyone else in one (logical) hop.
What you have described is not a real peer-to-peer system. Nodes are considered peers when they are equal in terms of the tasks they can perform and the responsibility they have. The server node can't be considered an equal of the clients; hence it cannot be considered one of their peers.

A peer-to-peer system doesn't need to be fully connected either; in fact, many of them use phenomena such as the small-world effect to achieve as wide a reach as possible without a fully connected network. Take a look at Gnutella: it isn't fully connected, but it is peer-to-peer.

All your base are belong to us

@bob: I think we're talking about different layers. The networking transport layer (IP) is clearly peer-to-peer. The application structure layer ("Doom") is client-server in this specific case.

When you use a loosely connected web, your latency increases. It's really hard to get a good game going with multi-hop, client-based routing. That's why an application-level client/server star topology is pretty much what everyone chooses if you can't use a fully connected peer-to-peer topology.
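
A back-of-the-envelope Python comparison (hop counts and per-hop latency are made-up numbers) shows why:

    per_hop_ms = 40                # illustrative one-way latency per hop

    routes = {
        "full mesh (direct)": 1,   # peer sends straight to peer
        "star (via server)": 2,    # client -> server -> client
        "multi-hop routing": 5,    # message relayed across several peers
    }
    for name, hops in routes.items():
        print(name, hops * per_hop_ms, "ms")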
