The Cost of Insecurity - Griefing: from Anonymity to Accountability

Published April 21, 2005 by Steven B. Davis

Copyright (C) 2005, IT GlobalSecure Inc.

The dark side of the explosive growth of online gaming has been an infestation of griefing and cheating. Griefing is truly an "Internet" problem. The relative anonymity of Internet communications has led to a decline in civility, and the massive scale of online games has resulted in a loss of "adult supervision". After all, if you were playing a face-to-face role-playing game, you would likely not "grief" for fear of being slugged or snubbed by your soon-to-be-former friends. First, let us define our terms somewhat formally:

Griefing - an act, action, or communication that is technically legal under the rules of the game (as implemented and enforced in software), but is disruptive to the game experience of others. Such activities may be prohibited by Terms of Service and are often beyond the pale of ordinary human decency and common sense.

Cheating - an act or action that violates the rules of the game (as implemented and enforced by software), typically by circumventing those rules through means such as altering game state, tampering with communications, exploiting race conditions, buffer overruns, etc.

Griefing and cheating are of concern to game developers, publishers, and operators because they cost money.

Customer Impact:

  • Players quit because they don't like the negative experience
  • New players don't join because they don't want the hassle
  • Ethically ambivalent players get expelled (they wouldn't have cheated or griefed had they known they would be caught)

Company Impact:

  • Increased customer service support costs to deal with complaints, real and perceived
  • Increased in-game customer service costs
  • Increased technical support costs to identify threats
  • Increased technical support costs to counter threats
  • Increased technical support costs to identify and remove malicious players

Of course, game companies can and do ignore security problems - sometimes the cost to fix a problem is just too high. Also, because of the structure of the games industry, developers, who are best positioned to address these problems, have little to no financial incentive to do so. They get paid for delivering a working game on time and on budget. There is often no reward for reducing or even planning for "lifecycle support". Publishers have not addressed these issues since most revenues come within the first several months of a product's release - long before annoying security problems (and bugs) become visible.

Online gaming has changed this dynamic. Simply adding network play can add 10% to 40% to a title's sales. After all, if you want to play with a friend online, you can't borrow her copy; you have to buy one yourself (see Battle.net, Counter-Strike for Half-Life, and the forthcoming Guild Wars). This effect is magnified for expansions and follow-on products. If subscription or pay-for-play revenue is added, the revenue opportunities for a single title (and the corresponding impact of security flaws and bugs) multiply. In January 2005, Bungie banned thousands of Halo 2 players for cheating and griefing - costing Microsoft tens to hundreds of thousands of dollars in revenue per year [1].

This article will focus on one of the most pernicious aspects of griefing - harassment and abusive communications. We will explore existing solutions and a sample alternative from both a technical and business perspective.

No "Right to Anonymity"

Insults and harassment are virtually routine in many online games. The anonymity afforded to online game players has given rise to widespread and increasingly aggressive harassment. The hardcore gamers who are otherwise prized by game companies are often the worst offenders and taunt "n00bs" [2]. This, of course, is the riskiest time for a game operator - new players are liable to abandon a game that they find hostile. There is often a perception of a "right to anonymity". There is no legal basis for this; in fact, the Privacy Act in the US did not come into being until 1974 [3]. Sexual and racial harassment also, regrettably, occur too often.

In-Game, Community, and Customer Support

To fight these problems, as well as to address other issues, game companies provide customer support and community features. Apparently as many as 25% of customer support calls are due to griefers. This direct avenue for complaint is also a direct cost to the game company. The most common griefer counter-measure is a strong community system. Depending on the game, these community services provide clan features, friends lists, reputation stats, and other features that both tie players more closely to the game and create an environment with less anonymity for misbehaving players. One of the best aspects of a strong community system is that it can provide substantial security benefits at negligible marginal cost, since the game developer is putting the system into place for business growth and marketing purposes anyway.

There are two main limitations of community systems. First, malicious players can often create new accounts (especially on free game services), removing the social stigma of griefing. Second, malicious players can use the anti-griefing system itself to cause further grief by wrongly accusing other players of griefing - an excellent and unfortunate example of griefers turning the game system against itself and other players. Some online game services, such as Battle.net, Xbox Live, and Valve's Steam, can tie a license key or other unique tag to a player. Xbox Live has apparently banned several thousand players for griefing.

The other major form of support is found in persistent-world games, where live monitoring of game play by Game Masters is often part of the environment. This gives the game provider the ability to respond in near real time to griefing incidents. This solution is quite powerful, but it comes at a cost: the game operator must staff the live team with enough Game Masters to handle griefing as well as their other functions. Considering that online games are 24x7x365 operations, each in-game position may require 5 full-time employees. Salary and benefits for such an employee can range from $25,000 to $50,000, so the cost of one live position dedicated to griefing problems will range between $100,000 and $300,000 per year.
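To make the staffing arithmetic explicit: round-the-clock coverage is 24 x 7 = 168 hours per week, while one full-time employee covers roughly 40, so a single seat needs about 4.2 people before accounting for vacation, sick leave, and training - hence the rule of thumb of 5 full-time employees. At $25,000 to $50,000 each, that is $125,000 to $250,000 in salary alone, which lands in the $100,000 to $300,000 range once overhead is factored in.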

The cost of managing griefing can grow rapidly for a game service provider - causing the problem to be neglected, redirecting staff from other assignments, or increasing the total staff cost for the game. For a small game, these costs can be the difference between success and failure. For a large game, these costs are a continual drag on the bottom line.

Given these numbers, a game company can decide whether a new security solution is needed. If griefing is at all a problem in a game, it is probably costing hundreds of thousands of dollars at a minimum. The question is: are there solutions, and what do they cost?

Answers

A security solution is unlikely to stop griefers outright, but it should detect them and, ideally, deter them. Stopping griefing through "dirty word" lists is largely impractical - though Disney's Toontown Online architecture, which eliminates "chat" except among trusted friends [4], is an ingenious exception. That solution worked excellently in Toontown's unique environment, but for most games a general-purpose communication capability - either text chat or voice - is integral to the game experience. In practice, "dirty word" lists are highly vulnerable to "misspelling" attacks that thwart the filter while effectively conveying the harassing message. License keys are fairly effective, but they provide no strong binding to an individual message. Similarly, credit-card-controlled accounts for massively multiplayer games can strongly identify a player during a session, but they too cannot be bound to a potentially offending message. The best tool to bind a message to an individual is a digital signature.
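To see why "misspelling" attacks are so effective, consider a minimal sketch of a naive word filter (the blocked-word list and messages below are invented for illustration):

```python
# Hypothetical illustration of why naive "dirty word" filters fail.
# The word list and messages are invented for this sketch.
BLOCKED_WORDS = {"noob", "loser"}

def naive_filter(message: str) -> bool:
    """Return True if the message passes the filter."""
    words = message.lower().split()
    return not any(w in BLOCKED_WORDS for w in words)

# The literal insult is caught...
print(naive_filter("what a noob"))      # False - blocked
# ...but trivial misspellings sail through, and the meaning survives.
print(naive_filter("what a n00b"))      # True - not blocked
print(naive_filter("what a n.o.o.b"))   # True - not blocked
```

A filter can chase each variant, but griefers can generate variants faster than the list can grow. Signatures sidestep this arms race: instead of trying to recognize bad content, they make the sender accountable for whatever is sent.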

While there are numerous sources that explain how digital signatures work in detail, the important property they provide is non-repudiation: only one individual could have signed a given message. This works by exploiting the critical characteristic of public key cryptography, namely that knowing the public key (P) "half" of a public-private key pair does not allow reconstruction of the private (secret) key (S). Thus, I can broadcast my public key to everyone, and anyone can use it to verify (decrypt) what I have signed, but only I can produce (encrypt) those signatures.

I can then use my private key to "sign" a message by encrypting a hash of the message (or the message itself). Then, anyone can use my public key to validate my signature:

The signed message:

S(message) or message, S(hash(message)) - where only I know S( ).

The verification process:

P(S(message)) = message or P(S(hash(message))) = hash(message) - where everyone knows P( ) and hash( ).
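In the notation above, signing is encryption with S( ). Below is a minimal sketch of the same sign-and-verify flow using a modern signature scheme (Ed25519) via the Python cryptography package - one possible choice among many; nothing here requires this particular algorithm or library:

```python
# A minimal sketch of sign/verify; the choice of Ed25519 and the
# Python "cryptography" package is illustrative, not prescribed.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

private_key = Ed25519PrivateKey.generate()   # S( ) - kept secret by the player
public_key = private_key.public_key()        # P( ) - published to everyone

message = b"meet at the east spawn point"
signature = private_key.sign(message)        # conceptually S(hash(message))

try:
    public_key.verify(signature, message)    # anyone holding P( ) can check
    print("valid - only the private key holder could have produced this")
except InvalidSignature:
    print("invalid signature or altered message")
```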

Now, if we build our communication system (either voice or text) with a digital signature system added in, players will not be able to deny their messages. If a player can store and forward these messages to the game operator, the operator will have an undeniable record of the conversation. This has several benefits. First, potential griefers are deterred by knowing that their actions are neither deniable nor spoofable. This is probably the most important characteristic of the system: deterring griefing (like crime) has a much better return on investment than hunting down and catching griefers, and the same technique can be extended to regular in-game actions to deter spawn camping and other griefing problems. Second, the game operator can reduce the live team and customer support staffing for grief management, because the reliable capture of grief communications makes real-time response less critical. Players can post messages for reliable adjudication, and both sides of a conversation can be used as reliable evidence - which leads to the third benefit: fewer customer disputes and complaints.
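To make the store-and-forward idea concrete, here is one possible shape for a signed chat "envelope" that a client could retain and forward as evidence. The field names and JSON canonicalization are illustrative choices, not a prescribed format; the keys are the Ed25519 pair from the sketch above:

```python
# Illustrative signed chat "envelope"; field names and serialization
# are invented for this sketch, not a prescribed wire format.
import json
import time

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)

def make_signed_chat(private_key: Ed25519PrivateKey,
                     sender_id: str, text: str) -> dict:
    body = {"sender": sender_id, "sent_at": int(time.time()), "text": text}
    # Sign a canonical serialization so sender, time, and text are all bound.
    payload = json.dumps(body, sort_keys=True).encode()
    return {"body": body, "sig": private_key.sign(payload).hex()}

def verify_chat(public_key: Ed25519PublicKey, envelope: dict) -> bool:
    payload = json.dumps(envelope["body"], sort_keys=True).encode()
    try:
        public_key.verify(bytes.fromhex(envelope["sig"]), payload)
        return True
    except InvalidSignature:
        return False

# A client signs a line of chat; the operator can verify it much later.
key = Ed25519PrivateKey.generate()
envelope = make_signed_chat(key, "player_42", "u r a n00b")
assert verify_chat(key.public_key(), envelope)          # authentic
envelope["body"]["text"] = "hello"                      # tampering...
assert not verify_chat(key.public_key(), envelope)      # ...is detected
```

Because the signature covers the sender, timestamp, and text together, the operator can verify either side of a conversation offline, long after the session has ended.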

This quick introduction to an anti-griefing system has, of course, short-changed many details. For example, a public key infrastructure needs to be put in place to make this system work (to issue, distribute, and maintain public and private keys). Also, there needs to be a way to ensure that the signatures are actually bound to an individual. Finally, the infrastructure for reporting and handling alleged griefing incidents must be created.

There are many tools to help you implement such a system, and some of them are free. One I like is Tom St. Denis's LibTomCrypt (http://libtomcrypt.org/), though there are many others. Companies like RSA (http://www.rsasecurity.com) and Certicom (http://www.certicom.com), among others, provide commercial cryptographic toolkits. Our company, IT GlobalSecure, can easily support signed messaging as part of our anti-cheating product, SecurePlay (http://www.secureplay.com).

Implementing security systems securely is a distinct engineering discipline - good technology can be undermined by poor system implementation and business processes. Also, export controls and other regulatory issues may be a factor, depending on your markets and the specific security technologies you choose.

Conclusions

Security needs to be judged by the same standard as any other product or service: does it reduce costs or gain customers and revenue? In the discussion above, effectively reducing harassment griefing could yield direct cost savings of between $75,000 and $200,000 per position per year simply by moving from 24x7 anti-griefing coverage to 8x5 coverage - roughly the difference between staffing a seat around the clock with five employees and staffing it with one. Even if the size of the problem remained the same, eliminating shift work alone would save 10-20%. And if, as cited, griefing accounts for 25% of support calls, a substantial portion of those resources can be reallocated elsewhere.

This article has reviewed the cost of insecurity due to griefing in online games, from both a business and a technology perspective. Online gaming is unique in that cheating and griefing problems are directly reflected in your profits: the problems are real, and their impact on your revenue can be measured. As you have seen, some security problems can be addressed without technical solutions, but a compelling case can be made for investing in protecting your game and your bottom line.

[1] http://www.bungie.net/News/TopStory.aspx?story=weeklywhatsjan14. Given the $50 ($49.99) yearly subscription to Xbox Live, the cost per thousand banned players is $50,000 (US). One should commend Microsoft, Blizzard, Bungie, and others for being willing to disclose security problems and to actively address them.

[2] "Inflicting Pain on Griefers", David Becker, http://news.com.com/Inflicting+pain+on+griefers/2100-1043_3-5488403.html

[3] Surprisingly, the US actually has quite weak privacy laws. Online game developers should be aware that in many places in Europe and Asia, privacy laws are quite strict and should be considered in the design of an online service. They may affect both the collection and storage of individual data and marketing information.

[4] Mike Goslin, "Postmortem: Disney Online's Toontown", Gamasutra/Game Developer Magazine, 1/28/2004.
