stimarco

Members
  • Content count: 2483
  • Joined
  • Last visited

Community Reputation: 1071 Excellent

About stimarco
  • Rank: Contributor
  1. Corporate Philosophy Comparisons

    Quote:Original post by zyrolasting
Quote:Apple were right to deny that app. They already had a 12" stack of Human Interface Guidelines in the 1980s—long before good design was even fashionable—and they haven't gotten any less anal about interface design since then. Good user interface design is required. It is NOT optional. It never was.
You are still running on an entirely juvenile "I love Apple" rave, and are assuming they are the keyholders to design in general. It takes a closed mind to assume that design stops at Apple. We all want a good design... Some of us just want more things visible.

No. Apple have their own design rules. Their own "house style", which underpins all their decisions. Some of these rules are, of necessity, arbitrary. Colours, button designs and look-and-feel all need to be consistent, and someone has to lay down the basic thematic stuff—the "Thy Scroll Bars Shall Be Blue And Sort Of Aquatic-ish" foundations on which the rest of the GUI guidelines are built.

Quote:I have. I've been praised for clean designs before. You still think I am 100% against design. I'm not. No one here is.

Could have fooled me.

Quote:I do not agree with Apple's approach to implementing their philosophy.

No. You still don't get it: if Apple's guidelines say, "Thy 'OK' Buttons Must Be Placed X% From The Right Margin", that's what you're REQUIRED to do. If you aren't willing to comply with their rules, you don't get to play in their sandbox. It really IS that simple.

YOU might have other ideas about how UI design should be done. You may disagree with Apple entirely, or only slightly. But your opinion matters not one whit. It's their damned baby. Their house. Their rules. Not yours.

This is the price you pay to develop for any platform. Not just Apple's. Try walking into a shop sometime and see how far you get telling the owner how to do his job. Trust me, they're unlikely to take it well. Why do you expect Apple—or, indeed, any other corporation—to be any different?

There are plenty of alternative platforms for you to try your UI design experiments on. Use one of those. Prove Apple wrong by beating them at their own game. Or just carry on whining about how "unfair" the universe is on these forums. Your call.
  2. Corporate Philosophy Comparisons

    Quote:Original post by way2lazy2care Quote:Original post by zyrolasting Quote:Apple isn't user centric, it's Apple centric. Anything that threatens their lock-in must simply be erased from their iWorld. I'm all for quality standards, and I'd happily cheer Apple on if that's what they were trying to bring to their platform, but it isn't. Even I would say that's a demented "I hate Apple!" rant. Elaborate? Flash ...sucks. On OS X, it's buggy, slow and shit. Always has been. And it's not exactly rock-solid on certain other platforms I could mention either. In fact, it has a terrible reputation among Mac owners. Adobe also haven't been brilliant at supporting the Apple platform in the past—it took them forever just to catch up with the move to Intel CPUs, let alone 64-bit support. (And let's not mention their video editing apps, which they pulled entirely for some years.) They're a fair-weather developer at best. Why should Apple assume they can commit to their platform and write an iPhone plug-in worth a damn when they've yet to demonstrate that ability on a platform Apple have had since 1984? Nor is this a simple "Apple vs. Adobe" war: It's just a pragmatic business decision. OS X uses PDF in its display sub-system. It's why PDF creation is built-in on Macs. If Adobe and Apple were really that petty, this wouldn't be the case any longer. Apple could also have done what they did for Java and written their own SWF-interpreting plugin, but they chose not to. The problem is the SWF format itself: It's not owned by Apple, so they don't have any control over its evolution and future direction. They've been stung by this in the past with the PowerPC processor series, forcing them to make the move to Intel. I can well understand why they don't want to go through that again. Nor do they want "lowest-common-denominator" applications appearing on their flagship devices which simply port UI paradigms from other platforms. 
Design is the primary differentiator—the USP—for Apple's hardware. Flash apps would dilute that.

Finally, Flash isn't an "application". It's a plugin. There's no App Store or similar mechanism for adding these to the iDevices, so Apple would have to build some infrastructure for it. Every time Flash was updated, Apple would have to roll it into a version update for their iDevices—and any apps written for the newer version of Flash would not run on iDevices running older versions, requiring additional customer support expenses. Why go to all that effort when easier, cheaper alternatives are available?

Flash was conceived in an age before broadband and streamed media. It was originally a vector animation plugin; video and audio streaming were added much later. But HTML 5, with CSS 3, etc. can do vector animation. And video streaming. And a hell of a lot more. No plugins required. For all the occasionally misleading hype from Jobs and his marketing team, HTML 5 and its associated technologies are simply the better, simpler solution. Apple have done technology switches like this before, pushing USB and ditching the floppy disk drive long before any of their competitors. (Take a look at the original iMac if you don't believe me. Note, too, the criticisms it received from reviewers of the day. This HTML 5 storm is no different.)
  3. Programming and...math.

    Quote:Original post by Armendegga
Quote:Original post by Fl4sh
Quote:Original post by stimarco
Nope. And those aren't "word" problems. They're "limited-data math" problems. The words are merely the user interface in which they're presented.
Unless it's for grades, start with the problem and go from there. Oh, and check this out: http://projecteuler.net/ Anyone working in prototyping or architecture will definitely appreciate what this site offers. Also, message me with any good problems, i love 'em (; Regards, A.

I'm guessing reading comprehension isn't one of your talents. If it were, you wouldn't have quoted my post.

[Edited by - stimarco on June 22, 2010 10:14:53 AM]
  4. Corporate Philosophy Comparisons

    Quote:Original post by Valderman
Quote:Original post by stimarco <massive wall of text about how Apple brings users salvation from oppressive, evil, incompetent developers>
I don't quite see how arbitrary restrictions on what you can and can't do with your own devices or censoring content is for the benefit of the user.

Really? Guess I need to have some strong words with Indesit and Dyson then. Last time I looked, they didn't let just anyone build and install apps on their devices either. And good luck building native code apps for the Nokia 2630.

Oh, wait: you think that, because the iPhone and iPad have CPUs and screens, you have an inalienable right to stomp all over Apple's own corporate philosophy. You sincerely believe your approach to software design and development is the One True Way. Good luck with that.

People buy Apple's stuff precisely because it has consistent user interfaces. I happen to be a fan of good design. I couldn't give a shit who produces it, but I've yet to see anything from Apple's rivals of late which seriously competes with them at this level. Only Nokia seem to come close with their low-end phones, though their smart-phones have been miserable.

Quote:I also don't see how forcing developers to use outdated tools and placing other arbitrary restrictions on their development process, thereby making development slower, more expensive and more error prone, benefits the user.

Outdated? Objective-C 2.0 is "outdated" how, exactly? And the rest of Apple's tools are pretty good too. It's all wrapped around the GCC suite, so you can use whatever IDE you damned well please if you don't mind losing the advantages of Xcode, the debugging and instrumentation tools, and Interface Builder. Personally, I find Objective-C a breeze after having to fight the chimera that is C++ for so many years. Obj-C has a refreshing simplicity in its design.
Quote:(from Zyrolasting): We heard a story in class of a person getting their app denied because of button placement. You still don't get it, do you? Apple were right to deny that app. They already had a 12" stack of Human Interface Guidelines in the 1980s—long before good design was even fashionable—and they haven't gotten any less anal about interface design since then. Good user interface design is required. It is NOT optional. I, a developer who has been programming, designing, documenting and developing software since the days of the Sinclair ZX81, agree with Apple on this. Because the code is NOT all there is to an application. (And, before you ask, yes: I did once think as you do. But not for nearly 20 years.) Once upon a time, the code was all you needed. It ran on operating systems with CLIs. The code usually took input, crunched it, and spat out a result, with minimal interaction between the code and the user. These programs would be written and operated by people expert with the computers of the day. But now it's 2010. In case you hadn't noticed, we've moved on a bit since those days of CLIs and teletypes. Anyone designing applications with poor, or inconsistent, user interfaces today is doing it wrong. End of. No excuses. User Experience and Interaction Design is a known science. There are textbooks on it and everything. Read some. Please.
  5. Corporate Philosophy Comparisons

    Quote:Original post by Antheus
Quote:Original post by stimarco Many developers find this a very uncomfortable position to be in. They're used to being top of the heap. They could tell designers and end users to just do as the High Priests of the Temple Of Code tell them. This is no longer true. And it hurts. It burns many of the old-school programmers and hackers. They don't want to adapt. They don't like being a mere subject, when they used to be the very gods themselves.
Here's a dirty secret. Why did the software industry fall in love with this type of developer? It's cheaper to pay in ego than in cash. And it's still done today. As for old-school programmers. Adapt? To what? Nothing has changed. The folks in business are the ones who need to adjust a few process graphs.

But the people in charge of development teams are often ex-programmers themselves. (CTOs don't spring, fully formed, from the foreheads of CEOs.) Being ex-programmers, they're also likely to have been old-school ex-programmers, only these guys no longer program computers actively any more, and have moved on to programming people instead. Programming them with old, outdated notions and ideas from an age when the Systems Analyst stalked the Earth.

Quote:If anything, it's sales people that get completely left out. In small businesses, AppStore gives developers a direct sales channel, cutting out most of sales and some marketing. Developers *love* these new app stores since they take care of the stuff they suck at, or could not afford.

Some developers, yes. But many genuinely do fail to "get" Apple's design philosophy. Apple's App Store approvals process gets a lot of stick, yet most rejections are due to a "good enough", rather than "best possible", approach to app development. Apple want quality applications for their hardware that their customers can actually wrap their heads around.

Take a look at the "cream" of the apps available for Android handsets and contrast with the cream of the iDevice crop. Developers who believe Blender and WinAMP are good examples of interface design will try their hand at iPhone development and fail miserably, before proceeding to bad-mouth Apple and their "draconian" or "evil" approvals process and "restrictive" App Store policies. (Though why keeping badly designed and buggy crud out of the App Store is such a bad thing for consumers escapes me. It may suck if you're developing that kind of thing, but that's entirely your problem.)

Quote:Seriously, this view of programmers is just too naive.

Actually, my full opinion of the vast majority of programmers is unsuitable for family viewing. The wider development community has probably done more harm to the IT industry than all of Google or Microsoft's less ethically sound business practices combined. But this isn't really the forum for another of my rants on the subject.
  6. Tips on Writing and Development

    (ridiculously long post deleted) It's no good. I really need to write this all down in a book, or I'll spend the thread trying to condense it all into about 20-30 minutes' unedited, unproof-read typing. Suffice to say that I disagree. With pretty much everything you posted. [Edited by - stimarco on June 21, 2010 6:18:58 PM]
  7. Corporate Philosophy Comparisons

    Quote:Original post by zyrolasting
I just... God, I can't think of a nice way to put it. Bottom line is, I see development in any form as more spontaneous and free than how I see Apple treat it.

Apple treat developers like they treat any other supplier. That was my point.

In the beginning, IT was, of necessity, dominated by the programmer. The hacker (in its original sense). The High Priests of the Temple Of Code. The companies and corporations that grew up back then had the same fundamental attitude. They were almost always started by such programmers and hackers.

Microsoft are a developer tools and technologies company. Windows is a handy wrapper for all their developer technologies—.NET, DirectX, etc. Even Microsoft Office is, at heart, aimed at developers, not just end users. It's a platform in its own right. (OpenOffice.org, please take note.)

The FSF / GNU movement—and that's all it is; they didn't actually invent open source, nor do they own copyright or patents on it—are hacker-centric. They're all about the source code, which is something 99.999% of end users really don't give a flying f*ck about. UNIX is one mammoth hacker heaven of an OS. Unfortunately, this is also why it has singularly failed to make a dent in most of the consumer markets, except when used as an embedded OS.

The problem is that most of today's developers have grown up on PCs, using MS-DOS, various flavours of Windows and / or GNU / Linux. They've been spoilt rotten because these environments place the development process front and centre.

But Apple are the exception. They were always about the design. The end user. The programmer served the user, not the other way around. For many years, they didn't even have their own in-house development tools, preferring to rely on Metrowerks' CodeWarrior suite. Apple are not development-focused, but product-design-focused. Their priority is that end user.

Sure, they've lost their way occasionally, but this attitude has always been the key to Apple's successes. Over the past decade or so, Apple's approach has gained increasing traction. No longer must users put up with "good enough". Good design has become mainstream, instead of merely an expensive optional extra. Customers are beginning to feel entitled to it as a default feature.

Many developers find this a very uncomfortable position to be in. They're used to being top of the heap. They could tell designers and end users to just do as the High Priests of the Temple Of Code tell them. This is no longer true. And it hurts. It burns many of the old-school programmers and hackers. They don't want to adapt. They don't like being a mere subject, when they used to be the very gods themselves.

OS X programmers are already used to this, however. And that's why I was so surprised by your opening outburst. I mean, seriously, none of this should be a surprise. Apple have never kept this sort of thing a secret. And the PR stuff is natural; check out MS' own certifications sometime. It's marketing, not formal education.
  8. C++ or C# as first C language

    Just a heads-up: C# is not quite as tied to Windows as you seem to think. Unity supports it as one of its languages, and that platform can target Xbox 360, Wii, OS X, iPhone / iPod / iPad and more.

C# and C++ are general-purpose programming languages. They're not games-specific. In fact, an increasing amount of game code is being written in higher-level languages like Lua, Python and even JavaScript.

C# is definitely easier to learn than C++, though neither is easy to master. C# is also a lot more forgiving and has better 'native' support for technologies like .NET, for which it was created. (Using C++ and .NET together is possible, but recommended only for masochists.)
  9. Securing the Rights to Freelance Assets

    Quote:Original post by Vincent_M I'm a new game developer, and I was wondering what kind of legal obligations would I have to meet to securely own the rights to any assets such as 3D models and textures? You need to talk to a lawyer. Preferably one local to you and with a background in IP and contract law. Laws vary from nation to nation, jurisdiction to jurisdiction, so be wary of taking advice blindly from random people on the internet. At a general level, you'll need to draw up a solid contract and—in most territories—specify the job as a "work for hire" contract. Even so, some jurisdictions can overrule such contracts under certain conditions, so—and I cannot stress this enough—talk to a qualified lawyer. If you want quality information, there's no substitute. And no shortcuts.
  10. Quote:Original post by maspeir On the Mac, windows and menus are defined by procedure. The OS has several different types of window and menu procs that the developer can choose from, but there is more than adequate documentation on how to write your own. Now, Windows DOES have the ability to make custom windows, but there are many caveats. However, my issue is with the menus. On the Mac, the menu proc takes messages from the OS and processes them... Hi, I'm Clippy, the friendly paperclip! It looks like you're not using the right tool for the job! You want to build a cross-platform tile game. May I suggest Blitz Basic or some such? It's cross-platform and a piece of cake to work with. Instead of fighting an API which was never designed to be used the way you want to use it, you can use a tool that simply lets you get on with making your game. (And, yes, before all you purists jump in, Blitz Basic is plenty fast and powerful enough. There are professional casual games published with it.) Another option is Unity, but that's heavily biased in favour of 3D games. It's probably best to wait until the v3 release, which is expected to include some improvements to make 2D games easier to build. If you're hell-bent on using Windows' own APIs—and I've no idea why you would: they are, by definition, not cross-platform—then I suggest looking at DirectX or even XNA, which are actually intended for games.
  11. New Game Design Company startup

    Quote:Original post by ChurchSkiz I'm thinking about starting my own fashion company. I don't know anything about fashion. I have a lot of ideas about clothes and I'm thinking the company could be run by attractive models. Good idea? Well, a similar approach seems to have worked for Silvio Berlusconi.
  12. Corporate Philosophy Comparisons

    Quote:Original post by zyrolasting—from your own blog:
As of the time of this writing, I have one week left in the program. We are taking an iPhone programming course. My instructor told us that Apple requires us to register before we can even start developing on the iPhone. We can only work on Macs, and can only deploy on Apple devices that are registered in a provisioning profile. We must also submit security certificates for each Mac that works for them. You must also shell out $100 and everything you develop from that point on must be approved by Apple for resale. What the hell?

Seriously? You only just found out how Apple's development kits work? Is Google something that only happened to other people? Did you genuinely fail to do ANY RESEARCH WHATSOEVER into Apple and iPhone development? This is NOT news. Why the hell are you even raising an eyebrow over this, let alone whining about it? Your blog post is yet another ill-informed "I hate Apple!" rant written from the perspective of an insufferable ignoramus who thinks he knows everything.

Newsflash: Apple are a design boutique company who sell hardware. Nothing more. They have never made a secret of this. Nor is there any secret to their general success over the past decade or so. That's "hardware". Not "software". To Apple, software is just another component. It's not special. It's just one of many pieces of the jigsaw. Apple don't make much (if any) profit from App sales, music sales, video rentals, etc. iTunes and OS X are just a means to an end: selling more Apple-branded hardware.

Apple are not developer-centric, like Microsoft. They're not hacker-centric like Linux and the FSF movement. They're user-centric. And doesn't it just burn to find yourself brutally kicked off the top of the entitlement heap most developers have been taking for granted!

You don't like Apple's way. So? Develop for some other platform then. Don't let the door hit you on the way out. Apple won't shed a tear. They have plenty of developers ready and willing to code for their hardware already, and I don't see them complaining too hard about the money they're making.

You have a choice here: develop for Apple. Develop for Android. Develop for Symbian. Develop for (snigger) MeeGo. Develop for anything you damned well please. There's nothing "evil" going on. Apple just don't happen to agree with your philosophy. Sucks to be you, I guess. Either grow the hell up, or get a job as a tabloid journalist.
  13. Game Development Questions

    Quote:Original post by Ampd533 Hi, I just had a couple of questions I wanted to ask. I am currently taking Computer Science at a 2 year school and then I am going to transfer to a 4 year and continue in Computer Science. Questions 1. Is game development a realistic path to go for as a career? I know you can make good money if you get a good job. But, how realistic is it to get a job after college? It's no more or less realistic than any other career path. The only person who can tell you if you'll succeed is you: we have insufficient data to give you a useful prognosis. Yes, it can pay well. It can also pay peanuts. It depends on how good you are at your chosen profession, as well as how good you are at spotting opportunities and taking advantage of them. All I can offer is some advice: I'd suggest looking at doing some small-scale games and building up a portfolio / demo reel which you can use to get yourself a job later on. As you learn more, you'll probably cull some older projects and add new ones. The key is to not only build demos, but finish what you start. Commitment is an important selling point and the best way to prove you can commit to a project over the long haul is to show some projects you've already completed. Oh yes: there are some parts of the job you'll find boring, but necessary. The difference between the pro and the wannabe is that the pro will push through the boring stuff and finish the job. Quote: 2. What classes should I focus on? (Programming? C++/C#)? C# is probably the easiest to learn for a beginner. C++ is still quite popular in the industry, but tends to be more common in lower-level engine code rather than in the game logic layer. The latter is increasingly built using a programming language like Lua or Python. Don't let anyone trick you into believing any one language is a "standard", because there's no such thing in the games industry. 
It's also utterly irrelevant: 10 years ago, there were posters on these very forums extolling the virtues of assembly language as the "standard" in the games industry. It is the nature of any IT industry that technologies will have their day in the sun, then gradually fade into the background as another new toy comes to the fore. By the time you leave school, any so-called "standard" programming language in use today will probably be either obsolete, or pretty damned close to it. So the important thing to do is: REPEAT... Pick any damned language. Learn it. ...UNTIL dead OR retired. Learn programming. You can do that in any programming language. As you learn more of those, new ones will become easier to pick up. Don't try to predict what the IT industry will look like by the time you've earned your degree. Nobody's that good at predicting the future. Quote: 3. What types of jobs are there? I know there are artists, programmers, designers, and more. I want to know which would be the funnest or which gets more hands on approach. This is a question we cannot answer for you. What do you enjoy doing? Are you an engineering type? Do you enjoy problem-solving? In that case, you'll probably quite like programming. Or project management, which is essentially the art of programming people. Do you love gazing at wildflowers and studying the way the light reflects off them? Do you make flick-book animations in your exercise books? If so, art is probably more your thing. There's a lot of overlap between the disciplines, so it's worthwhile diving right in and trying out each one. You've a lot of learning ahead of you, so don't worry if you find something else proves more interesting than programming later on. You're also unlikely to have much success as a game designer unless you understand how games work, so studying the basics is a good thing in any case. 
And, of course, building that demo reel will help with this career too; you can always rope in a programmer and / or artist friend to help out. Quote: 4. Lastly, do I have to go to a special gaming school or can I attend a good college and get my bachelor's in computer science and then get some experience out of college? The second option seems the most popular with the companies I've worked for of late, but it really depends on your career path. If you're hell-bent on being a programmer first and foremost, be the best programmer you can be. A university course in the subject will generally be much broader than a games-specific course, which won't be teaching you stuff that programmers in other fields are using. Cross-pollination of ideas is a good thing. That said, employers will be far more impressed by a great portfolio of your work than by any number of letters after your name. Quote: If anyone can help me out that would be great. I am thinking of doing Networking / Software Development or Game Design when I am out of college. I really like games and think I would enjoy designing them. "Liking games" isn't the same thing as "designing games". Anyone can watch a movie, but most would suck at directing one. Quote:I am not very good at drawing, —which probably rules out graphics work, but it's also an issue if you want to be a game designer in some genres. Practice can help though. Quote:but I am good with computers and I am good at making up stories and coming up with background stories for characters. (Which is another job isn't it? Game Writers? But, I think I would have to major in English for that). If you want to be a writer, you need to write. If you want to be a game writer, you also need to understand how games work and what games are. However, writing for games is not the same thing as writing a story. A game is a tool which lets its users tell stories within the game's confines. 
The Writer and The Author are no longer the same, but separate individuals, often with different ideas about where the story should go next. This is hard, and there are, as yet, no courses I'm aware of which teach the craft. So an English degree is probably as good as any, though you could always take English as a minor and go for a Comp. Sci. (or related) degree instead. Frankly, your post reads like someone trying to decide what career he's going to have before he's even checked out all the options. You have plenty of time to decide this for yourself, but the only way you'll know if you're doing the right thing is to try the options out! Don't be afraid of discovering you suck at something. I'm a lousy swimmer and can't speak Swahili. But that doesn't matter, because neither is relevant to what I do, which is write. But there is one thing you should strive for: find something you love doing. Once you know what that is, you'll know where your future lies.
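The "C++ in the engine, a higher-level language for game logic" split mentioned above is worth a concrete illustration. A purely hypothetical Python sketch follows; `Entity`, `apply_damage` and the damage rule are invented for this example and don't come from any real engine, but they show why teams keep gameplay rules in a script layer: designers can tweak the numbers without recompiling the engine.

```python
from dataclasses import dataclass

@dataclass
class Entity:
    # Stand-in for an object a C++ engine would expose to the script layer.
    name: str
    hp: int
    armour: int

def apply_damage(target, raw):
    """Gameplay rule kept in script: armour absorbs up to half the raw hit.

    Because this lives in the scripting layer, balancing tweaks are a
    one-line edit rather than an engine rebuild.
    """
    absorbed = min(target.armour, raw // 2)
    target.hp = max(0, target.hp - (raw - absorbed))
    return target.hp

# Usage: a 12-point hit against 4 points of armour removes 8 hp.
orc = Entity("orc", hp=30, armour=4)
apply_damage(orc, 12)
```

The same arrangement is what engines get from embedding Lua or Python: the hot loops stay in native code, while rules like this stay editable.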
  14. Programming and...math.

    Quote:Original post by Fl4sh Are most of you guys good at word problems and extracting info from those word problems to write equations and etc...?

Nope. And those aren't "word" problems. They're "limited-data math" problems. The words are merely the user interface in which they're presented. Thanks to a cycling accident many, many years ago, I have a truly useless short-term memory, but I have published games to my name.

Programming is a translation job. A common feature of most programs is some form of process defined in mathematical terms, so the programming will therefore involve translation of that maths. Computers actually suck at maths out of the box. They have to be taught how to do the vast majority of it by being fed programs with the necessary information. Thankfully, hordes of programmers have already done most of that work, so there's no need for me to do it all over again. (I did reinvent A* once—back in the days before the mighty internet—but I prefer not to do that sort of thing if at all possible.)

In other words: I prefer to make my computers do my maths for me, because I pretty much suck at it myself. (That accident I mentioned earlier gave me dyscalculia. I've had my current mobile phone for three months and I still can't remember my own phone number.) Even basic algebra is difficult for me to do. But... I know where to look it up! I also have an engineering mind-set and can understand the "why" and the "what's it for?" of things like vectors and matrices. I've even written the user guide for a 3D graphics engine which was once very popular in the commercial games industry. (GTA3 used it, for example.)

While I won't claim maths is useless to a programmer, I don't believe being good at maths is all that necessary any more. The days when you had to hand-code multiplication and division functions in assembly language for the Zilog Z80A processor are, thankfully, long gone.
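The post mentions having reinvented A* in the pre-internet days; nowadays the textbook algorithm is a quick lookup. For the curious, here is a minimal, illustrative Python sketch of it. The grid, walls and helper names are invented for this example; the algorithm itself is standard A* with an admissible heuristic.

```python
import heapq

def astar(start, goal, neighbours, heuristic):
    """Textbook A* over an arbitrary graph.

    neighbours(node) -> iterable of (next_node, step_cost)
    heuristic(node)  -> admissible estimate of remaining cost to goal
    Returns the path as a list of nodes, or None if goal is unreachable.
    """
    open_heap = [(heuristic(start), 0, start)]  # (f, g, node)
    came_from = {}
    best_g = {start: 0}
    while open_heap:
        _, g, node = heapq.heappop(open_heap)
        if node == goal:
            path = [node]                # walk the parent links back to start
            while node in came_from:
                node = came_from[node]
                path.append(node)
            return path[::-1]
        if g > best_g[node]:
            continue                     # stale queue entry, skip it
        for nxt, cost in neighbours(node):
            ng = g + cost
            if ng < best_g.get(nxt, float("inf")):
                best_g[nxt] = ng
                came_from[nxt] = node
                heapq.heappush(open_heap, (ng + heuristic(nxt), ng, nxt))
    return None

# Usage: a 3x3 grid with a wall, Manhattan distance as the heuristic.
WALLS = {(1, 0), (1, 1)}

def grid_neighbours(p):
    x, y = p
    for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
        if 0 <= nx < 3 and 0 <= ny < 3 and (nx, ny) not in WALLS:
            yield (nx, ny), 1

goal = (2, 0)
path = astar((0, 0), goal, grid_neighbours,
             lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1]))
```

Which is, of course, exactly the point of the post: the hard maths has already been done and written down, so the job is translating it, not rediscovering it.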
  15. Tips on Writing and Development

    Oh look. My first post here in years, and it's a long one. Again. (Sigh.) (Deep breath...)

Quote:Original post by pothb — replying to a post by Tom Sloper
We're talking about the assumption of who he's writing to, and whether he should clarify. This article is basically in a place for people interested in game writing, so it is fine if he made the assumption that people who want to write for games are the ones reading. Like having his article in a book labeled Game Writing or Writing for Games.

Writing for games is most emphatically not the same thing as writing a novel or, indeed, for any linear medium, but the OP appears to believe this is what writing for games is about. A writer in the games industry does not write stories. They create storytelling assets. Except in a very few, very specific, genres—e.g. the point-and-click adventure games pioneered by Sierra and LucasArts—the two are not the same. If I may explain...

* Play = Story + Interactivity.

In fact, both are on the same continuum—the same line—with "100% interactive" at one end, which is basically "real life", and "0% interactive" at the other, which is death. (Novels and movies retain some interactivity, with novels a little more interactive than movies, as the latter do the job of conjuring up the imagery for you.)

Stories are a human invention. A tool. They were created as a way to reduce the knowledge children had to gather through play. (All mammals play, incidentally. It's not unique to humans. It evolved as a learning tool—something our Victorian ancestors never realised, to our present education systems' detriment.) The purpose of Story is twofold: firstly, it is the means by which we build and retain many of our mental models of the world around us. The more accurate the story, the more accurate the model. Those models which involve abstract elements—and particularly those which involve time—are heavily reliant on storytelling techniques.
    (See On Intelligence for more on the basis for this theory. It also underlies many of the rules behind good user interface design.)

    Secondly, humans created increasingly complex and abstract stories, and techniques for telling them, as a means of communicating these mental models to each other directly, bypassing the need to rediscover them through fun, but long-winded, play sessions. You could, for example, learn about fire through playing with it, but the infant mortality rate alone is likely to have been enough to spur the evolution of storytelling as a means to avoid the whole "deep-fried toddler" problem. Farming, and all the trappings of civilisation that went with it, are increasingly abstract. Stories can use metaphor and similar tools. Play's box of abstractions is more limited.

    Play and Story are essentially two sides of the same coin. They're the same thing, just viewed from different angles.

    * A game is a tool with which the user can tell his own story (or stories).

    There's nothing magical about it, but that distinction is an important one: you should not be writing a linear story in the traditional sense of the term. Games can have a beginning, a middle and an end, but they don't necessarily have to happen in that order, nor does any of them have to be the same every single time. Most games do tend to have a single beginning, however. Many also prefer a single ending, while the middle part tends to be open to a greater or lesser degree. The game design defines this "plot shape".

    A professional writer working on a game will not simply write down an overarching plot and build a single story. That's not the writer's job. The writer's job is to:

    1. Create (or refine) a believable, consistent game world, with consistent, believable rules.

    The real world can be illogical. Shit just happens.
    For every successful hero, there's someone who set out to avenge their brutally murdered daughter, only to end up raging at the heavens after finding his home—with all his carefully organised clues—reduced to smouldering rubble thanks to a gas explosion caused by a badly installed pipe in the apartment downstairs.

    Games (and fiction in general, except for some very specific genres) cannot get away with being illogical—witness the whining and screams of protest when a player is killed in a multiplayer FPS due to an internet connection issue such as lag—so defining hard-and-fast, concrete rules is a key part of game design.

    2. Create believable, consistent characters the player can identify with.

    The latter part is hard. House MD works well on TV because he acts as a catalyst for the characters around him, but such characters rarely work well in games. Players, as a rule, don't identify with a barely human, limping, mentally unhinged arsehole of a diagnostician who's only in it for the glorified crossword puzzles. Fun to watch is not the same as fun to be.

    Furthermore, your characters have to be defined not in linear, storytelling form, but provided with full, logical backgrounds which explain their actions and reactions in the game. They also have to be provided with things to say, for myriad potential situations. As there's only a finite number of lines an actor can record, this is a challenge in itself if you want to avoid hearing the same lines repeated over and over. (Hence the popularity of war settings, where the limited scope for long, meandering discussions about the philosophies of ancient Greece plays in the game designer's favour.)

    3. Create story elements which the player can use to tell his own stories.

    Is your FPS game's lead character a hard-bitten soldier who signed up because he wanted to? Perhaps he's an unwilling conscript. A freedom fighter. A terrorist. A criminal trying to escape his past, or find redemption.
    (Two of those items are the same thing, viewed from a different perspective.)

    These sound like great jumping-off points, but you can't dump the character's background on the player in a single cut-scene. Nor can you assume the player will read the manual—if any—that came with the game. You have to drip-feed the background, and you have to do so without breaking the player's suspension of disbelief. This is tricky.

    In a novel, you can get away with long, rambling, introspective monologues. But where do you do that in your FPS? You can't have the character narrate his thoughts in-game: if the player is shot, the narration has to stop too. And how many times will the player want to hear the same lines while he plays the same level over and over in order to get past that tricky bit?

    This is why games have cut-scenes: they give the player time to get his breath back, but also act as points where some serious storytelling can be done. It is also why even the single-player campaigns in most games tend to be fundamentally linear: cut-scenes are fixed points in the narrative and, as a rule, fixed animations. It's bastard hard to make a cut-scene modify itself to fit any number of game state variables, because you then have to record one or more orders of magnitude more dialogue, and probably bump up the asset count too. (This is a very good argument for more research into procedurally generated content, incidentally.)

    Many of the rules of good user interface design also apply to this field. If you're doing your job right, players will create a mental model around their chosen avatar, not just the game itself. It is up to you to ensure that mental model is as accurate as possible. The player should become their avatar in their own mind. Identify with it. So you need that avatar's characterisation to have just enough wiggle-room to let the player bring their own personality aboard.

    4. Mesh with the game designer's vision.

    The play experience is paramount.
    Writing plays second fiddle to the game design. Always. Even if the game designer is you.

    NPCs can guide the player. They can bicker with him. They can argue the toss, or chat up members of the opposite sex(es). But they must always be secondary characters: they cannot win the quest or complete the level on the player's behalf (except in cases where that's the point of the level, for example; and even then, the player has to be the one who works this out!).

    This limits your characters. You can have a mentor who teaches your character all the basic moves, but there's a damned good reason why Obi-Wan Kenobi got killed off halfway through the original "Star Wars": if he'd stayed alive, Luke Skywalker would have had very little to do. The same rules apply to games too.

    5. Polish, refine and hone your work. Be utterly ruthless.

    This is expected of you in any medium, but it also meshes closely with (4) above: every cut-scene you write, every line of dialogue you script, every pivotal event or turning point has to be implemented in code, art and audio assets. That shit's expensive, folks. If it's not 100% necessary to the play experience, it must go.

    It matters not how well-written your cut-scene is: if it doesn't add anything to the game, kill it. It matters not how cool and bad-ass that mentor character is: if she's getting in the way of the gameplay, she's gone. This is often one of the hardest parts of writing: coming up with a seriously cool dialogue sequence or an awesome sidekick, only to have to rip it out because it simply doesn't work in the context of the game. To be fair, it's no easier to cut a favourite character when writing in other media either.
    * Finally, if your goal is to create yet another "find the hidden junk in the ridiculously cluttered house owned by an obsessive-compulsive collector of eclectic junk"-type game, with a thin layer of shoddy, amateurish (and often illiterate) "mystery story" nailed on to disguise the game's blatant contrivances, feel free to ignore all of the above. You won't be the first. Nor will you necessarily fail to sell any copies; the casual games industry has created a relatively ignorant audience, unfamiliar with the potential and possibilities of the medium.

    That audience's ignorance is not their fault: everyone is ignorant. You. Me. Tom. All of us. It is simply not possible to know everything about everything. But in this case, the casual games audience is ignorant primarily because we have failed to educate them, and we are thus creating yet another market which will dismiss the writing in games as worthless by default.

    That's not to say there isn't an audience whose primary interest is simply blowing sh*t up, but writers working in the games industry need to aim much higher, and do more. We need to show our audience that there's more to game writing than "Aliens! Thousands of 'em!", or "Profesor Prune thinks he seen a clue in his laborotory. Go there and see if you can find a tuba, a bald eagle and a wheelbarow." (The typos in that last example are indicative of the insults to basic literacy such games often contain.) If it wouldn't be acceptable in a novel or a movie, it shouldn't be acceptable in a game.