
GDC 2011

Modular Component-Based AI Systems

Posted in Summits, AI · 08 March 2011
Tags: components, AI summit
The GDC 2011 AI Summit opened up with three heavy-hitters from the AI world (Brett Laming, Joel McGinnis, and Alex Champandard) discussing the merits and motivations behind component-based architectures.

Although the term has gained popularity in recent years, and most people in the room expressed at least some familiarity with the concepts, there remains a substantial amount of uncertainty as to what exactly a component architecture entails. To address this, the lecturers presented an outline of how and why component architecture gained the spotlight in modern games engineering, and provided some tips and important rules on how to approach component-based designs.

Historical Trends
As object-oriented programming took hold and languages like C++ finally gained enough traction in the games industry to see widespread adoption, the typical design methodology involved creating rich, deep hierarchies of inter-derived classes. This quickly ran into issues such as multiple inheritance's "diamond problem," brittle structure, and questions of how to deal with "non-inheritance" situations (where some but not all functionality of a branch of the inheritance tree is desired in a particular leaf class).

One reaction to this was to push functionality towards the roots of the inheritance tree, essentially creating "fat base classes." This is even more problematic in practice: code bloat in the base classes decreases readability, clarity, and maintainability, while data bloat in the base classes leads to immense memory waste and overhead, which becomes thoroughly unacceptable as games grow in scale.

A more promising direction was to make the entire engine core highly light-weight and extremely data-driven, where virtually all of the behaviour and richness of the game simulation was accomplished in data rather than directly modeled in code. This approach still has its proponents, but suffers from a critical weakness: it lacks natural hooking points and specificity by which one can drill into the running simulation and inspect or modify its state. Put simply, offloading the complexity into data (away from code) deprives us of all the benefits of code-modeled introspection and manipulation.

Enter the Component Model
A central observation behind the introduction (and indeed the widespread adoption) of component-based architectures is that there are fundamentally four things in a simulation which need to be elegantly captured:

  • Classification of entities (Is this a weapon? An item? A door? A sharp weapon? etc.)
  • Key properties (How much damage does this weapon do? How much does it weigh? Which direction does the door open, and what key(s) does it require?)
  • Defined mappings of inputs to outputs (Weapon damage values modify health values; keys modify door lock states; etc.)
  • Interchangeability (Can I use this weapon in place of that one?)
Component architectures provide a modeling tool for all four areas; although other approaches can say the same, components provide a compact and highly elegant manner in which to reach these goals.
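
To make those four areas concrete, here is a minimal sketch in C++ of an entity as a bag of components. The names (Entity, DamageSource, Health, applyHit) are hypothetical illustrations rather than anything presented in the session: classification lives in tags, key properties in small data components, input-to-output mappings in functions (or systems) that only see the components involved, and interchangeability falls out of the fact that any entity carrying a DamageSource can stand in for any other.

```cpp
#include <memory>
#include <string>
#include <typeindex>
#include <unordered_map>
#include <vector>

// A component is just data plus a small amount of behaviour.
struct Component { virtual ~Component() = default; };

// "Key properties" live in small, focused components.
struct DamageSource : Component { float damage = 10.0f; };   // any weapon-like thing
struct Health       : Component { float current = 100.0f; }; // anything that can be hurt

// An entity is little more than a bag of components plus tags for classification.
class Entity {
public:
    template <typename T, typename... Args>
    T& add(Args&&... args) {
        auto comp = std::make_unique<T>(std::forward<Args>(args)...);
        T& ref = *comp;
        components_[std::type_index(typeid(T))] = std::move(comp);
        return ref;
    }
    template <typename T>
    T* get() {
        auto it = components_.find(std::type_index(typeid(T)));
        return it == components_.end() ? nullptr : static_cast<T*>(it->second.get());
    }
    std::vector<std::string> tags; // classification: "weapon", "sharp", "door", ...
private:
    std::unordered_map<std::type_index, std::unique_ptr<Component>> components_;
};

// "Defined mappings of inputs to outputs": logic that only cares about the
// components involved, so any entity with a DamageSource is interchangeable.
void applyHit(Entity& attacker, Entity& target) {
    if (auto* dmg = attacker.get<DamageSource>())
        if (auto* hp = target.get<Health>())
            hp->current -= dmg->damage;
}
```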

The main difference between the component mode of thought and older, less desirable approaches is the notion of systems. Indeed, it is worth noting that proper application of a component architecture demands rich use of systems; anything less will essentially collapse back into the same kind of fat architecture we were trying to escape in the first place. Moreover, in a systems-oriented model, granularity of functionality becomes desirable rather than problematic.

Systems are, fundamentally, the "glue" by which components are organized and compartmentalized. Moreover, systems formalize the interactions between components and other systems. This drives reusability in several key ways:

  • Inheritance can be used (sparingly!) to reuse logic and data relationships directly
  • The structure of interrelated components can be reused modularly
  • Data flow between components and systems can be interchanged as needed
  • Compartmentalization separates reusable elements into neat packages
  • As a bonus, parallelization can easily be accomplished between systems
Careful use of class inheritance, along with factory methods, serialization, and run-time type information (RTTI) frameworks, can provide a highly data-driven model without sacrificing the specificity and hooks of a richer code model. In addition, the deployment of systems can help identify dependencies and functional structure within the simulation itself, allowing for easier maintenance and iteration on existing code.
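
As a rough illustration of the factory-plus-RTTI point, the sketch below registers component types under string names so that serialized data can instantiate them while code still gets typed objects back; the names and structure here are my own assumptions, not the speakers' code.

```cpp
#include <functional>
#include <memory>
#include <string>
#include <unordered_map>

struct Component { virtual ~Component() = default; };

// A registry mapping type names (as they appear in data files) to factory functions.
class ComponentFactory {
public:
    using Creator = std::function<std::unique_ptr<Component>()>;

    static ComponentFactory& instance() {
        static ComponentFactory f;
        return f;
    }
    void registerType(const std::string& name, Creator creator) {
        creators_[name] = std::move(creator);
    }
    std::unique_ptr<Component> create(const std::string& name) const {
        auto it = creators_.find(name);
        return it == creators_.end() ? nullptr : it->second();
    }
private:
    std::unordered_map<std::string, Creator> creators_;
};

// Each concrete component registers itself once; data files can then refer to
// "Health" and friends, keeping the core lightweight and data-driven without
// losing the hooks of a typed code model.
struct Health : Component { float current = 100.0f; };

namespace {
const bool healthRegistered = [] {
    ComponentFactory::instance().registerType("Health",
        [] { return std::make_unique<Health>(); });
    return true;
}();
}
```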

Another potential win of systems over game graphs and similar structures is the elimination of redundant searches. A system can keep track of all the components/entities which are relevant to it directly, thereby avoiding the need to constantly traverse the game universe looking for those entities. This in turn starkly highlights the lifetime relationships between various entities, which can be a major advantage when it comes time to do dependency analysis on the simulation itself.
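
A minimal sketch of that idea, with hypothetical names rather than anything shown in the session: the system owns the list of components it cares about, so its per-frame update never traverses the wider game world, and component lifetime is visible in one place.

```cpp
#include <algorithm>
#include <vector>

// A component holding only the data this system cares about.
struct Steering { float heading = 0.0f; float turnRate = 0.5f; };

// The system tracks every Steering component directly: nothing walks the whole
// game universe at update time, and registration/removal makes lifetime explicit.
class SteeringSystem {
public:
    void add(Steering* s)    { components_.push_back(s); }
    void remove(Steering* s) {
        components_.erase(std::remove(components_.begin(), components_.end(), s),
                          components_.end());
    }
    void update(float dt) {
        for (Steering* s : components_)
            s->heading += s->turnRate * dt;
    }
private:
    std::vector<Steering*> components_;
};
```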

Last but not least, components allow for late binding and re-binding of type information. Have a set of logic that relies on park benches, which suddenly needs to be rewritten to use dumpsters instead? The code change amounts to tweaking a single "tag" within the appropriate system, rather than making large numbers of tedious and fragile changes to raw code dependent on the actual "park bench" or "dumpster" classes. The data-driven aspects of component architectures become a major advantage in this sort of scenario.
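
Something like the following sketch captures that tag-driven rebinding; the TagIndex type and tag strings are assumptions for illustration, not the presenters' implementation. The loitering logic asks a system for entities carrying a tag rather than depending on a ParkBench class, so retargeting it to dumpsters is a one-string change, ideally made in data.

```cpp
#include <string>
#include <unordered_map>
#include <vector>

struct Entity { /* components, position, etc. */ };

// A simple tag index maintained by some system.
class TagIndex {
public:
    void tag(Entity* e, const std::string& t) { byTag_[t].push_back(e); }
    const std::vector<Entity*>& withTag(const std::string& t) const {
        static const std::vector<Entity*> empty;
        auto it = byTag_.find(t);
        return it == byTag_.end() ? empty : it->second;
    }
private:
    std::unordered_map<std::string, std::vector<Entity*>> byTag_;
};

// Logic depends on a tag, not on a ParkBench or Dumpster class. Switching the
// behaviour to dumpsters means changing this one string (or the data it loads from).
void updateLoiterBehaviour(const TagIndex& index) {
    const std::string targetTag = "park_bench"; // swap for "dumpster" and nothing else moves
    for (Entity* target : index.withTag(targetTag)) {
        (void)target; // steer NPCs toward 'target', play idle animations, etc.
    }
}
```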

It is worth reinforcing the fact that component models are not "an architecture" but rather a paradigm in which architectures can be created. As with virtually everything in the engineering and architectural realms, the exact details will depend highly on the specific game or simulation we are setting out to make.

Component models can be a very powerful tool on modern platforms where concurrently-executing code is a central aspect of engine design. One important observation is that AI work (and indeed simulation work in general) essentially consists of reading and writing properties of entities in the simulation, and potentially rearranging the logical structure of those entities (moving objects, creating new NPCs, recycling old assets, etc.). Envisioning this as a sort of circuit diagram is a useful technique; data flows "downstream" between systems each frame. Any mutation of game state which can be passed downstream to later systems can be accomplished using just the execution stack space, since later systems will always have safe access to that memory. However, any "upstream" communication needs to be delayed by a frame by queuing a "message" which is read by the appropriate system in a subsequent tick. This decomposes nicely into a job/task system, which is a (deservedly) popular means of handling parallelism in modern engines.
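
One way to realize the "upstream messages wait a frame" rule is a double-buffered queue, sketched below with hypothetical names (FrameDelayedQueue, DamageEvent): systems push messages during the current tick, and the receiving system drains last tick's buffer at the start of the next one, so earlier systems never observe state mutated later in the same frame.

```cpp
#include <utility>
#include <vector>

// Messages flowing "upstream" are buffered and only become visible next frame.
template <typename Message>
class FrameDelayedQueue {
public:
    void push(Message m) { incoming_.push_back(std::move(m)); }

    // Called once per frame, before the receiving system updates.
    void beginFrame() {
        current_.clear();
        std::swap(current_, incoming_);
    }
    const std::vector<Message>& messages() const { return current_; }

private:
    std::vector<Message> incoming_; // written this frame
    std::vector<Message> current_;  // read this frame (written last frame)
};

struct DamageEvent { int targetId; float amount; };

// Usage sketch: a combat system late in the frame pushes DamageEvents; the
// health system early in the frame reads them on the following tick.
void healthSystemUpdate(FrameDelayedQueue<DamageEvent>& queue) {
    queue.beginFrame();
    for (const DamageEvent& e : queue.messages()) {
        (void)e; // apply e.amount to the entity identified by e.targetId
    }
}
```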

As with any other parallelization tasks, a few fundamental rules apply:

  • Minimize the volume of data propagated throughout the system
  • Further, minimize the lifetime of any data that does need to be passed around
  • When possible, derive data rather than duplicating it; no need to store mass, volume, and pressure when any two will suffice
  • Locality of reference is key; custom allocation is, as always, a major win here (see the sketch after this list)
  • NULL checks can be eliminated by using dummy non-operative objects instead of empty pointers
  • Propagate RTTI information along with pointers in order to avoid duplicate virtual-table lookups
  • Vectorize component update operations via SIMD instruction sets
  • Perform jobs in batches across cores (helps with cache/false sharing issues)
  • Interleaved allocation is a powerful tool for leveraging SIMD and other parallelization techniques
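
As a rough illustration of the locality, SIMD, and batching points above (my own assumption of a typical layout, not material from the session), the sketch below stores a movement component structure-of-arrays style, so the update loop streams over contiguous memory that a compiler or explicit SIMD code can vectorize.

```cpp
#include <cstddef>
#include <vector>

// Structure-of-arrays storage for a movement "component": each field is its own
// contiguous array, so the update loop touches tightly packed streams of floats.
struct MovementStorage {
    std::vector<float> x, y;   // positions
    std::vector<float> vx, vy; // velocities

    std::size_t add(float px, float py, float velX, float velY) {
        x.push_back(px);   y.push_back(py);
        vx.push_back(velX); vy.push_back(velY);
        return x.size() - 1;
    }
};

// Cache-friendly and trivially vectorizable; handing disjoint index ranges to
// worker threads in batches also keeps cores from false-sharing cache lines.
void integrate(MovementStorage& m, float dt) {
    for (std::size_t i = 0; i < m.x.size(); ++i) {
        m.x[i] += m.vx[i] * dt;
        m.y[i] += m.vy[i] * dt;
    }
}
```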

Concluding Thoughts
The session was a great way to open up the AI summit, cramming in vast amounts of valuable advice and information in the one time-slot when everyone's mind was guaranteed to not already be turned into jelly. Although not necessarily new details to many of those experienced with component architectures, there were plenty of nuggets to guide the decision-making process of both novice and veteran architects alike. An informal poll of the audience suggested that a substantial portion of those in attendance learned at least something valuable to take back to their own individual design efforts - the hallmark of a truly successful session.

IGDA Business and Legal SIG Made Official

Posted in Sessions, Summits, Education, Business/Management · 03 March 2011
Tags: IGDA, Business, Legal, Education
A group of IGDA industry professionals and attorneys gathered yesterday at GDC to formalize the new Business and Legal SIG. The meeting was moderated by Dan Rosenthal; approximately 20-30 IGDA members and industry professionals were in attendance. Topics included goals for the new SIG, events, and organizational structure. In addition to informational white papers on business and legal issues in the games industry, the members proposed increased political involvement as well as a greater focus on globalization and international issues for the SIG.

Members were particularly interested in seeing specific case studies for "freemium", free-to-play, and pay-to-play business models. Additional proposals included a one day business start-up summit and an entrepreneur track for future IGDA events to address the business and legal issues developers face when starting new projects and new studios. The Business and Legal SIG hopes to provide a wide variety of resources for industry professionals, including member-managed information compilation for regional and country tax credit incentives for studios and publishers.

Specifically, the new SIG discussed the possibility of creating a Business and Legal Wiki or other resource management tool that will tie into IGDA's Educational SIG. SIG members also proposed educational outreach in addition to resource compilation. Many members expressed the concern that valuable information doesn't reach academic programs related to game development.

Approximately six IGDA members volunteered to head the SIG's steering committee. If you would like to learn more about the new SIG, visit http://www.igda.org and sign up to join the mailing list.

How to Win the IGF in 15 Weeks or Less

Posted in Summits, Indie Games, Design, Production · 01 March 2011

Andy Schatz, the developer of Monaco, took the stage to preface his session by saying he isn't going to talk about how to make an IGF winner but, rather, just to tell his story.

"What's [important] is your inspiration and motivation when making games," Schatz says as he recalls September 29, 2009, adding "I was depressed." He talked about being in a rut and making indie games for five years and having it go nowhere. He was working on the third title in his Venture sim series and "it sucked" (as Schatz recalls). He had reached the end of the time he was giving his independent break from AAA game development and was depressed. So, next, Schatz took on board games because they're "all about mechanics" and says it's a very good way to get your brain flowing.

Andy Schatz talked about wanting desperately to make a game about "stealing shit," but was concerned that the fanbase he had earned from the Venture sim series (largely kids) would conflict with the goal of a game about stealing things. Despite that, though, Schatz made a Monaco board game during a break from his full-time project. On his next break, he decided to make Monaco as an XNA game in a week, and that would be his last break from the third entry in his Venture series. Schatz said he wanted to make this heist game like a roguelike.

After tackling some of the technical details of prototyping Monaco, Schatz moved on to talking about what type of tools to use when making a game. He talked about using Torque for the Venture series, citing it as a mistake; about Unity, which (he feels) would have enforced a certain look on the game; and about customizing the look of Monaco using the "just enough of a framework" XNA toolset.

An interesting side-note about Andy Schatz's presentation is his use of old Facebook status updates, which he uses to bind the session to a narrative spine and function as an ad hoc "digital archeology" (taking the term from GDC cohort Ben Abraham).

Schatz then talked about Ventura Dinosauria (the third entry in his Venture series) and that DINOSAURS ARE AWESOME. And he is correct. Unfortunately, Schatz couldn't make the game fun. "If you want to have a takeaway from this, this is it: I made sure I worked on one cool thing every day. [...] And I never worked on something that took me longer than one day." I'm editorializing here, but: this is awesome and a completely true and valid approach to independent development. "When you think of game development as a holistic thing," Schatz continued, "you get much farther when you're enjoying yourself."

"The number two thing that you can take away from this," Schatz says, "is that you should have people playing your game from day two." He cited his experience from the recent PAX expo. Schatz also makes a crucial difference between "advisors" and people who are just playing ("people who don't know shit about games"). "You can't have too many advisors," just people who align with your general goal and can give you good, pointed advice. The people who don't know about games, regardless of how bad their opinions are, their impressions are crucial. "There are three questions I ask every one of these people: 'What did you like?,' 'What did you not like?,' and 'What confused you?," and when they tell me what I should change, I ignore them."

In talking about how he financed his independent operation, Schatz cited contract work as the best way to make a lot of money, but "it's not fun." "If you're working on a project that just makes money you're going to make money or you go out of business. And if you're working on a project to make recognition you're going to make recognition or you're going to just make money," Schatz said of his independent development philosophy.

Moving Monaco from its tile-based visibility and movement to his new lighting and visibility algorithm (which went from a pretty but distracting mosaic look to a more vector-based approach) took Schatz two months. This was his first and, by far, his longest feature to work on, and it broke his one-cool-thing-a-day work goal. "Even though it did take me two months, it was something I felt that was cool and interesting."

Schatz then turned to a discussion about game mechanics vs. "experience" and the difficulty in marrying these two things. "As an indie, you're never going to get over that uncanny valley hump, [...] but there are areas" in which independent developers can bridge the realism gap such as, as Schatz points out, sound. He creates a very complex soundscape in Monaco to help the game's overall "experience."

Andy Schatz ends the talk with a near-final Facebook status update that shows how much his mood had improved and how much more productive he was when he was doing something he enjoyed. "And fifteen weeks later, I had won the IGF."

Some general thoughts on the social games summit

Posted in Summits · 01 March 2011
I spent most of today and yesterday at the social games summit, at various tracks. I was going to write them all up as individual posts, but there were a lot of themes that seemed to cut across talks, so it makes sense to also write about them as a group.

I have to admit that a large part of the reason why I went to the social games summit was because I don't really get the appeal of social games. I don't have particularly strong feelings against them, mind you, I've just never been able to understand the audience they appeal to. In that regard, I think Eric Zimmerman and Naomi Clark's talk on "The Fantasy of Labor" was by far the best attempt to explain the popularity of these games, and I should have some coverage up on it soon.

While I'm coming from a place of curiosity, I think the more general sentiment is contempt. For instance, during Patricia Pizer's talk on "Putting Social in Social Network Games", she asked the audience how many people in the room actually enjoyed Facebook games. There were about two hands, out of a few hundred people. [1] I think that's pretty telling. A similar experiment was conducted during the "Are Social Games Legitimate/Evil" debate with similar results.

Speaking of that debate... wow. The audience was quite packed with self-identified social game developers, but the vibe seemed distinctly against social games. I'm not sure if that's just because the anti-social (har!) crowd was more vocal, or if there's just a lot of dissent within the social ranks.

During that debate, Ian Bogost offered an interesting metaphor. He compared social games to the situation we have with ADM and high-fructose corn syrup. Like HFCS, social games (and the relationships they encourage) are cheap, convenient, and ubiquitous; but they're also a poor substitute for other activities that could be healthier. Evil? Probably an overstatement, but they don't seem to be doing much to make society (or relationships) healthier.

Another journalist suggested that perhaps part of the vitriol towards social games might stem from the perceived threat to the current AAA model. The thinking is that if you can beat the sales of a $60 million game with something far cheaper and simpler to make, even if the quality is vastly poorer, at some point people are going to question why we're working on these very big, expensive games. I think that's an interesting point, but I disagree. The gulf between the audiences is huge; I can't think of many hardcore gamers who are also social gamers, or vice-versa, so it seems both can easily coexist. The attitude I've mostly gleaned from both the AAA developers and indies is more a feeling of contempt than actual anger. They don't like what the social crowd is doing because they think it's sketchy.

[1] An astute reader might point out that there could be other explanations, such as people simply being distracted. During the Q&A someone brought up this exact point, and so the question was asked again. On the retally, the numbers ended up being pretty much exactly the same. The baseline question of "who here likes computer games in general" obviously got a much more favorable and unanimous response.

A Debate: Are Social Games Legitimate?

Posted in Sessions, Summits, Social & Online Games · 01 March 2011

The "A Debate: Are Social Games Legitimate?" panel opened with moderator Margaret Robertson (Moderator), and then went on to allow each panelist a small amount of time to make an opening statement on their pre-existing opinions on the matter of social games. This started with Ian Bogost, then Daniel James, Nabeel Hyatt, and finished with Curt Bererton. The order of these panelists seemed, intentionally or unintentionally, to take the order of "most negative" to "most positive" feelings towards social games.

Ian "Cow Clicker" Bogost is, in fact, in the house.

Moderator Margaret Robertson opened with a discussion of the panel's title, musing about the use of the word "legitimate": instead of "legitimate," are these things... "evil"? She then polled the audience on a variety of social game-related questions. All told, the audience of the panel largely consists of people who make social games. When asked who in the audience plays social games, a majority raised their hands; when further asked who played these games for fun, a majority of those hands dropped.

Ian Bogost starts by illustrating the amount of high-fructose corn syrup in a variety of food products (and points out its presence in unexpected foodstuffs like bread). Bogost wonders if Facebook is doing to friendship what the leading maker of high-fructose corn syrup is doing to food: homogenizing it. Bogost then tosses Zynga into the mix, insisting "you can toss any company into this mix but, you know, the colors matched." Bogost asks whether "this is the way we want to bring this infrastructure" to dealing with friends.

Daniel James started his bit by pointing out that this panel is partially for entertainment, so the things said should not be taken to completely reflect the panelists' opinions. He says "it's interesting that games like Farmville can be considered 'virtual world' games." "It's up to all of us to make ethical decisions about how we spend our time," and to consider how the output of a creative work will be used by the end-user. James went on to discuss the validity of making gambling games (slot machines were his reference), saying that doing so would cause him a great deal of personal distress if his games relied too heavily on gambling tricks.

"First things, these games are very fun to play... but they're also a lot of fun to make," says Nabeel Hyatt to open up his opening dialogue on the panel. He relays an anecdote about a woman playing Cafe World and getting together with her like-aged and -gendered friends who all get together with their laptops and play Cafe World together while talking. Hyatt then goes on to talk about Brian Reynolds (who is, largely, responsible for FrontierVille at Zynga) who said that social games were the only area that Reynolds could go where he could "be a game designer." He concluded by saying a lot of people are not in social games "for the money, they just want to hone their craft."

The final opening statement came from Curt Bererton, who started by calling himself and his coworkers "the indie evil" (for becoming a small company that makes social games). "You could say we're using metrics to make high-fructose slot machines," but what you end up with is "actual social value and an excuse to talk to [your family] more often." He adds that since he started playing social games, "I actually talk to my family more." Bererton ends with an anecdote that he could send his mom a "birthday cupcake" in a social game and that's "pretty special."

The panelists then open the discussion with ways to get people talking to their friends and working with other people they know in these social games. Hyatt insists that while games with strong social ties are effective in getting people to play and talk to one another, they are actually "more insidious" (he goes on to cite a World of Warcraft raid group pulling people into the game for a raid).

On the topic of user metrics, Hyatt says that "some companies use metrics and [it's essentially] pumping out sugar" but that only works for so long. He continues by saying "as evil as you can be with metrics, [...] they can also be used for good things too. [Metrics] also allow you to make better games." Bogost retorts "it doesn't matter how fun these kinds of games are to play, but is it the kind of fun we want" to have. Bererton responds to Bogost by saying that "in the broader strokes of things, these are all a waste of time," and in the broader scheme of things, he'd rather feel like it's a waste of time made better with friends. Bogost says the social game platform "is built on a foundation that I think is troublesome."

"Metrics are a measurement of people's behavior," Hyatt says and continues that metrics get "the game designer out of the ivory tower." Metrics on social games allow a direct feedback loop rather than a designer working on something for three-four years and espousing their opinions before they get any direct feedback from the audience.

Bogost says that the existing social game infrastructure "feels bad." Bererton responds to him that there is a lot of real social value on Facebook now. "I don't have a problem with making money [...] I just think the question is how we do so," Bogost says.

Margaret Robertson asks: "Should social games have ethics policies?" "I think a lot of people do have unwritten ethics policies," Bererton responds, "and sometimes they look at a feature and say 'I don't think we should be putting this out here.'" Hyatt brings out Jesper Juul's quote that "social games are the video games of video games" (in the sense that they are the games fighting for legitimacy in an industry fighting for legitimacy). Bogost says "we need not love every form of games, we need to be allowed to ask questions about the kind of games people want to make."

Bererton thinks the "hardcore industry" looks at social games as "not games." He goes on to say "You can't just say all modern art sucks just because you like impressionism." The discussion then goes to the point that if you can't ask questions and argue the validity of a certain type of game, then "what's the point?"

Hyatt is quick to defend the platform by saying that "there are reams of anecdotal evidence that social games are adding real value to people's lives."

"So... So long as social games are doing more good than evil, it's okay that they're doing evil?" Robertson quips in response before ending the discussion and opening the panel to questions.

The Full Spelunky on Spelunky XBLA

Posted in Summits, Indie Games · 01 March 2011

This talk was given by Derek Yu and Andy Hull. Derek started off the talk by describing his development history, contrasting his "small" games (Diabolika, I'm OK, Spelunky) and his "large" games (Eternal Daughter, Aquaria, Spelunky XBLA). He said that these smaller games were important to his ability to create the larger games, in ways he would go on to describe later in the talk.

Derek described his prototyping process, which he compared to doodling. He just makes things that interest him at the time, without worrying about whether others will like them, or whether they would be worth commercializing. The key, he says, is to "get [his] ideas out," to simply create things.

He then described Spelunky's influences, a combination between the replayability, improvisation and excitement of roguelikes, and the more visual, action-oriented, immediately gratifying platformer genre.

One of the most technically interesting points of the talk was Derek's algorithm for level generation in Spelunky. A larger area is broken up into a 4x4 grid of rooms, and a random "path" is chosen to progress room-to-room from the top of the area to the bottom. This establishes certain rooms as vertical or horizontal corridors, etc. Then, templates for each room type are used, with random embellishments added for obstacles, treasure, and monsters. According to Derek, working on a procedurally generated game is extremely gratifying, because not only does it allow small teams to create more content, but the game is always new and surprising to play, even for the developer.
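
A simplified sketch of that scheme, as I understood it, appears below; the names and exact probabilities are hypothetical, and Spelunky's real generator adds considerably more detail (entrances, exits, template variations).

```cpp
#include <array>
#include <cstdlib>

// Room categories on the 4x4 grid; rooms off the solution path can hold anything.
enum class RoomType { Side, Corridor, Drop, DropLanding };

using Layout = std::array<std::array<RoomType, 4>, 4>;

// Carve a guaranteed path from a random cell in the top row down to the bottom row:
// wander left/right within each row, then punch a hole through the floor.
Layout generateLayout() {
    Layout level;
    for (auto& row : level) row.fill(RoomType::Side);

    int col = std::rand() % 4;
    for (int row = 0; row < 4; ++row) {
        level[row][col] = (row == 0) ? RoomType::Corridor : RoomType::DropLanding;
        int steps = std::rand() % 4;                   // wander horizontally
        for (int s = 0; s < steps; ++s) {
            int dir = (std::rand() % 2) ? 1 : -1;
            if (col + dir < 0 || col + dir > 3) dir = -dir;
            col += dir;
            if (level[row][col] == RoomType::Side) level[row][col] = RoomType::Corridor;
        }
        if (row < 3) level[row][col] = RoomType::Drop; // exit through the floor
    }
    return level;
}
// Each cell's RoomType then selects one of several hand-authored templates, with
// random embellishments (treasure, obstacles, monsters) layered on top.
```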

Derek also pointed out the role small releases play elsewhere. He brought up Super Meat Boy, Google Labs, and Haruki Murakami's process of writing short stories in between larger works as "success" stories for the practice of making small, exploratory games in between larger, more draining works.

At this point Derek handed the mic to Andy Hull, the programmer for the Xbox Live Arcade version of Spelunky. Andy started off by describing the genesis of the XBLA version of Spelunky: Jonathan Blow contacted Derek and asked if he would be interested in making an XBLA version, and helped smooth the greenlight process with Microsoft. Blow even offered the source code for Braid's engine to the Spelunky team, although they eventually decided to create their own engine after running into some difficulties using Braid's.

Another thing Andy mentioned was that Spelunky XBLA is treated more as a sequel than a port of the original Spelunky, which was difficult to adjust to -- it felt awkward to change things from the original game, but the original will always exist for people to play. Of course, they did not want to break what worked in the existing version, but rather to improve on it.

Some of the things they improved from the original version were the controls (even Andy had difficulty with the original controls) and the artwork (upgraded to beautiful HD). Derek and Andy also announced that Spelunky for XBLA will support 4-player local multiplayer, and said that more information about the multiplayer game modes will be available in the coming weeks.

According to Derek, Spelunky for XBLA will likely take around two years by the time it is done. Derek and Andy primarily work over Google video chat, as Andy is located in Connecticut.

The Journey to Creating Shank

Posted in Sessions, Summits, Indie Games · 01 March 2011

Jamie Cheng, one of the founders of Klei Entertainment, started his presentation on the team's development of Shank by showing a graph of Klei's games getting "[m]ore and more violent over time," starting with Eets, then Sugar Rush, and culminating in their most recent game, Shank. Klei experimented with the game early on by creating a quick demo of Shank in Flash, done over a weekend; a lot of the moves were in, and the demo, on the whole, gave them the confidence to move forward. The actual Shank character design came a few weeks later. Within two months, Klei had a very capable demo of Shank with a representative character design and a good sample of moves, abilities, and combat flow.

With so much of the game represented in the early prototype, Cheng joked, "What did we do for the next year?" Over the next year, Klei worked on the engine, polish, and the work necessary to release the game on multiple platforms. Publishers were telling Klei not to show the game, because that would "hinder [their] ability to promote the game." Klei chose, largely, to ignore that and show the game (because it's "[theirs] to show") at PAX 2009, using a box from Home Depot, decorated pretty impressively, to demo the game to PAX 2009 attendees.

Cheng moved on to discuss the "[t]echnically hard things" for Klei. The first was that the PlayStation 3 port wasn't anywhere near ready for release, with Klei only having about four months until their release date. Another issue was the 2GB download size of Shank, of which 1.4GB was the game's cinematics. The last of the big technical issues during development was fragmentation on the Xbox 360, which led to great disparities in loading times and other problems during testing.

Klei built thirteen levels in three months after having only two playable, polished levels to show off to people. At one point during Shank's development, their office flooded thanks to a neighbor's dishwasher. Despite the flooding, though, their computers (which had their power supplies at the bottom of the case) survived three to four inches of water. Somehow.

Some of the post-release statistics on Shank: the average session duration for Shank players was 56 minutes. It had a 26% conversion rate on Xbox Live Arcade. And 30% of people who played the game (on normal mode) finished it. An interesting thing Klei learned after launch was the difference between aggressive and defensive players. Aggressive players learned and loved the control scheme thanks to the smooth combat transitions, but defensive players who wanted to run away from fights ran into issues with the control scheme, since it wasn't really intended for that play style and just felt "sticky."

The publisher relationship with EA was a mix of "Less Good" and "Good." In the "Less Good" column were "arbitrary deadline," "'olde style' PR" ("press releases are horrible!"), "consumer expectations," and "possibly reduced upside" (versus self-funding). In the "Good" column, however, were "creative freedom," "'true' support," "multiplatform," "marketing," "reduced risk," and "less platform requirements." Overall, Cheng feels that choosing EA as a publisher was largely a positive thing for the game; otherwise, it likely would not have shipped.

"Well, how'd we do?" Shank "sold more in the first 24 hours than Eets: Showdown did [in its] lifetime." XBLA reported 41,000 units in the first week but "multiplatform release was key for profitability." The game has had a particularly "long tail" on Steam, however, and Cheng waxed positively about Steam's role.

Overall, Shank had three months of prototyping, nine months of tools, and six months of miscellaneous other work. The crunch for Shank was largely on the designers and artists as the game was very content-heavy in a short timespan, but the engineers only had to work a couple weekends over the course of the game's entire development. It was a pretty tough time for the team (at times), Cheng said, but overall it ended up working out.

Cheng ended the session with his "thoughts on the future..." He cited the potential of the downloadable space but noted that, right now, the ceiling is relatively low. "The best breakout hits in the console downloadable space are making < $10M in profit, and there are a comparably low number of games." He added that "the platforms really need to push the numbers upward and keep adding great features if they want this space supported and working in the long run." Cheng cited Microsoft's XBLA "Summer of Arcade," which makes kings out of certain games, while games released in the periods before and after get (comparatively) lost.

The Humble Indie Bundle

Posted in Summits, Indie Games · 28 February 2011

The first session in the Indie Games Summit this year covered the Humble Indie Bundle, and was given by Jeffery Rosen and John Graham of Wolfire Games. They gave a sort of postmortem for both the first and second Humble Indie Bundles, covering a lot of details about the design, preparation, launch, and results of the bundles. For those of you not familiar with the Humble Bundles, the Bundles were limited-time, pay-what-you-want, DRM-free, cross-platform game packs that ended up becoming runaway successes.

The idea for the Humble Bundles was inspired by the fact that every Steam bundle sale seemed to automatically become the number one story on Reddit. Wolfire itself dipped its toes into the water by bundling Overgrowth with Natural Selection 2, which received a decent amount of press coverage and attention on Reddit, garnering 1,600 sales. Another source of inspiration was 2D Boy's successful pay-what-you-want promotion for World of Goo. This led Jeff and John to begin considering how to top a pay-what-you-want sale.

The ideas Wolfire came up with to make an even more compelling sale were to add more games, add support for Mac and Linux platforms, make a better site, add charity donations, and release the source code to the bundled games.

Initially, it was difficult to get developers on board with the idea, especially because they had to limit themselves to games already supporting both Mac and Linux. Getting charities onboard was somewhat easier -- 10 minutes into Jeff's complicated pitch, the EFF representative stopped him and asked, "So, you're asking me if you can give us money?"

Wolfire also covered some of the challenges of designing the Humble Bundle site: making sure the site was scalable and easy to use, while also providing good customer service.

For scalability, as Wolfire has documented in their own blog, they utilized Google App Engine to host their site. At the highest point, they had 70 instances of their app running simultaneously across Google's servers, and had nearly perfect uptime over both bundles. Amazingly, Google only charged them $10.

They also utilized two CDNs for the actual file downloads. For the first bundle, they used Akamai, and for the second they used MaxCDN. They recommend Akamai, which they say is the most expensive service, but also the best.

As for the ease-of-use goal, they designed the site so that you did not have to register an account, fill a shopping cart, wait for a verification email, or download a special client program. There was only one pre-purchase page, and only one unique post-purchase page per user.

Customer-service-wise, they used the Tender service to track support request emails as tickets, as well as the Olark service to do customer service via live web chat. They had only 18 live chat operators in total, and most of the time only a couple were active, but they managed to handle many, many support requests in parallel once they got into the proper mental flow.

The gentlemen from Wolfire then described the specific pre- and post-launch experiences for both bundles. For the first bundle, there was initially very little press interest (with the notable exception of Ars Technica), and only once they began approaching the $1 million sales mark -- the threshold for releasing the source code to the games -- did other press coverage begin taking off. The first bundle achieved $1.27 million in sales over 138,813 contributions.

For the second bundle, they were concerned about whether the first bundle was a fluke or a repeatable phenomenon. They planned new features to push the second bundle beyond the bar set by the first. They added new games, including Braid, which was ported to Linux specifically for the bundle, and Revenge of the Titans, which actually launched for the first time inside the bundle. They also made arrangements with Steam, OnLive, and Desura to provide download keys for the games in the bundle, allowing users to unlock the games on those services instantly, something that was only offered after the fact for the first bundle. Humble Bundle 2 ended up an even larger success than the first, with ~$1 million going to developers alone, another ~$500,000 going to charities, and $133,000 as a "humble tip" to support the Humble Bundle as a business.

There were some unfortunate events that occurred post-launch. For example, some customers bought a thousand copies of the bundle at 1 cent per copy, apparently planning to resell the games elsewhere. Also, an estimated 25% of downloads were pirated directly from the Humble Bundle site itself (via shared CDN links), not even counting BitTorrent or other channels.

There were some other issues. After Lugaru was open sourced, a counterfeit version built from the released code appeared on the Mac App Store for 99 cents, significantly undercutting Wolfire's own Lugaru HD in the same store, and even coming in ahead of it in search because of its shorter name. Another problem (which I do not recall being publicized as much for the Humble Bundle as when it occurred with, e.g., Minecraft) was the dreaded "account freeze": PayPal placed Wolfire's balance under hold for an undefined period of time. Wolfire's opinion is that they were very satisfied with Amazon's payment system, and that Google Checkout is at least good for merchants, but that despite PayPal's unpredictability it is virtually required to support because of how ubiquitous it is as a payment method.

In total, the two bundles made $3 million, of which $1 million went to charities such as the Electronic Frontier Foundation and Child's Play.