
The Bag of Holding

The right place for objects

Posted 19 April 2006 · 265 views

During my usual bouts of procrastination, I dug back up Paul Graham's essays, thinking that I might find some useful stuff in there since my Lisp conversion. Specifically, I started reading Why Arc Isn't Especially Object Oriented, looking to compare Paul's work on the Arc language (a Lisp dialect, for those not familiar with it) with my own dabblings in the fledgling Epoch project.

I've read a lot of "OO considered harmful" type stuff in the past. I guess, like Lisp, it was one of those things that I shrugged off as crazy talk - because I didn't understand it, and because the writer didn't seem to explain it. Actually, I need to qualify that a bit, with some history of my own opinions on the matter of objects.

When I first started using objects extensively (around the time I got ahold of VB5 with its "new" Classes support) I thought objects were kind of handy for certain kinds of data modelling, but really didn't see what all the fuss was about writing entire programs based on objects interacting. As far as I was concerned, the Logical Thing To Do was to use objects to model, well, object-like stuff, and use good-ol' procedural style code to model logic and high-level operations involving objects.

However, I was very much a novice and unaware programmer at the time, so I figured there was a good chance I didn't know what I was talking about. While the attitude was good, I made a terrible mistake, which I now (in retrospect) kind of regret - I moved off into Java land. A lot of people were saying that Java was gonna be Really Big, and you'd better learn it if you want to have relevant skills in the real world.

I tried - really, really tried - for about a month to like Java. It had some of the niceness of VB's "get it done and screw the details" philosophy, and at least by comparison to C was kind of handy in that regard. But one thing just pissed me off about it: Java seemed to have some kind of weird, almost erotic fixation on objects. I ditched Java permanently, which (frankly) I think was a good move on my part. I've hated it ever since.

At that point, I'd been doing parallel work in VB and C for several years. I loved VB for doing "GUI stuff" and maybe the occasional rapid throwaway utility or whatever. For anything heavy-duty, I used C. (Actually, what I used was C++'s flavor of C-style programming - it was trivially isomorphic to C code, but exploited little gimmicks like omitting struct everywhere. It wasn't legal C in the sense that you could compile it straight in a C compiler, but it was for all intents and purposes C code.) It worked great - I had a potent tag-team of languages that could solve most of my problems. I could even write MS-DOS 6 compatible batch files if I needed to.

However, I felt like I had some kind of gap in my knowledge still, because I didn't quite get this whole "objects" thing. I was still a student looking for a teacher, and sadly, at that point in time, the easiest teacher to find was the OO fanatic camp. I wish I could have found a different teacher instead (like, say, Lisp). But in any case, what happened happened - and I decided to try to learn "real C++" in order to get a handle on this objects stuff.

In the course of learning C++'s flavor of OO, I realized that OO was a vacuous (or, at best, nebulous) term. Every single language had a different notion of how OO should work, and nobody agreed that anyone else's was better (except maybe Smalltalk's). Nobody seemed to agree what OO really was - it just seemed to involve a lot of objects.

Even more unfortunate for me, I chose MFC as my entry point to the C++ land of OO. It permanently warped my opinion of the entire objects notion, although eventually I think the experience will prove beneficial, all things considered - if for nothing else than the fact that it has deeply broadened my experience. Call it an eye-opener, I guess.

I dabbled in this land for a while, doing a few little projects of my own, but never really liking it. I decided that my real problem was that I didn't have a good, hard problem to solve; I was just tooling around, and didn't have room to get a real solid feel for how Objects are supposed to be.

After a couple years of dabbling, I started the Day Job From Hell. (As a matter of fact, I'd "dabbled" a bit by writing a prototype version of what eventually became the product I worked on during that job. But that's another tale.) I figured it was my lucky break: the main thing I was to work on was - you guessed it - a C++, OO-heavy, MFC-encumbered mess. Of course, at that point, it was like some kind of vision from heaven; finally, I could get some clarity on all this OO stuff!

I wrestled, hard, with that project at first. I tried - really tried - to do everything in OO style, like Java. Except at every turn, I had this deep revulsion; I felt like I was going back into Java-land. Java was supposedly this great, highly OO language, but I hated it. Something wasn't adding up in my head.

Around that time I picked up a copy of The Pragmatic Programmer. Finally, things started to click, and it was the beginning of the fastest period of acceleration in my personal understanding of programming that has happened to date (in fact, I think I'm still on that upward trend, and possibly still accelerating a bit). Instead of trying to pursue this ghost, this OO, this object-worshiping nothingness that never seemed to materialize, I started using Pragmatic principles instead.

I rewrote the program, almost 100% from scratch, against very heavy protests from the management. I firmly believe that I made the right decision. Even now, I think the management is starting to grudgingly realize that it was the right thing to do; they've had a far more stable and reliable product ever since I quit than they did when it was being actively maintained by the old developer.

After I got done rewriting the entire program, I noticed something funny: it deeply - and I mean very very deeply - resembled what I used to do in VB, all those years ago, when I first got my hands on "classes." Maybe a third of the "stuff" in the program was thought of as an object. The rest was basically procedural code, except with some of the nice trappings of C++ (STL, RAII, and such) to help smooth over the ugliness of doing applications in raw C.

For a while, I felt vaguely guilty about this. I felt like I'd betrayed my quest to Learn Objects, like I'd missed the mark somehow. I thought I had failed to understand the Grand Truth of OO, and that I was committing a deep sin by writing in the same style that I used to write VB code. I mean, hell, everyone knows VB is the worst language ever, right?

So I thought long and hard about this, in the back of my head, while other thoughts filled up my conscious effort. I realize now just how long this has been rolling around in my brain; it has only now attained clarity. The more I thought about it, the more I realized that I couldn't identify a "crime" in that code. Yeah, it wasn't Java-style OO, but it was exceptionally good code compared to my previous stuff. Maybe it missed the mark of "the grand truth of OO," but it definitely lined up with what The Pragmatic Programmer had to say. It wasn't object-laden, but I was still proud of that architecture.

I've been doing a lot of reading to back up my efforts on this Epoch thing. As a result, I've been finding out a lot about the real situation of OO. It seems that a lot of people have arrived at this conclusion long before me; and a lot of people seem to despise OO, at least in the sense of Java's "use objects or die" approach.

Now, though, I see a new perspective. I don't think the problem here is really objects per se - I think it's object orientation. And the more I read, the more I think that this is what all the anti-OO people have really been driving at all along; I just wasn't smart enough to understand it yet.

My problem was premature rejection. I read this stuff that, to me, seemed to be saying "objects are stupid! Use Lisp instead." And this bothered me. I had all kinds of cases where objects were exceptionally good representations of certain classes of problems. Today, I'd say that objects are probably the best method (for now at least) of modelling systems that are dominated by automata. A simulation game like X3 would be a damned nightmare to write without objects.

So, I would read these anti-OO discussions, and figure everyone out there was insane. I could see the benefits and power of objects plain as day, and these people seemed to be telling me that objects were, in fact, not useful. So I largely ignored the whole thing, assuming that once I found the Great Truth of OO, I'd be able to counter their arguments.

I did find the Great Truth of OO, but the truth isn't that the Emperor is also the Messiah. The truth is that the Emperor has had a catastrophic wardrobe malfunction.

Objects are awesome tools of abstraction. I think they should stick around, and here's what I think the term should mean:
  • An object has some state, also called attributes or properties.

  • An object knows how to do interesting things with its own state.

That's all. No more, no less. Encapsulation, implementation hiding, data hiding, all these things are good - but they are not inherent properties of objects, nor do objects hold a monopoly on those notions. In this sense, I think, objects are still very good tools.
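To make that two-point definition concrete, here's a minimal sketch in C++ (the names are mine, purely illustrative): the object owns some state and knows how to do interesting things with it, while the higher-level logic stays as plain procedural code instead of getting crammed into a manager-noun.

```cpp
#include <cassert>

// An object per the two-point definition: it has some state, and it
// knows how to do interesting things with its own state.
class Timer
{
public:
    explicit Timer(int milliseconds) : Duration(milliseconds) { }

    // Operations on the object's own state...
    void Extend(int milliseconds)   { Duration += milliseconds; }
    int GetDuration() const        { return Duration; }

private:
    int Duration;
};

// ...while higher-level logic stays procedural, rather than being
// forced into a TimerManager "noun" just for the sake of objects.
int TotalDuration(const Timer& a, const Timer& b)
{
    return a.GetDuration() + b.GetDuration();
}
```

Note that nothing here requires inheritance, interfaces, or a framework; state plus behavior is the whole contract.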

Where we get into evil trouble is when we try to write all of our logic as objects. The presence of "manager" or "handler" objects is a tell-tale sign of this. Steve Yegge posted a humorous caricature of this kind of programming in Execution in the Kingdom of Nouns. I think, to a large degree, this kind of stuff is what "object-orientation" is all about: if you have code to write, find a way to cram it into an "object," a noun.

I think that we now have enough history (thanks to Java) to prove that this is a Bad Way To Do Things.

Inevitably, the question of objects is going to come up in the Epoch project. Up until now, I've been basically planning on declaring Epoch to support object-orientation. Now, though, I've changed my mind. Epoch will revile object-orientation. Epoch will spit upon it, deface it, shame it, and tell it to go back to Java land where it belongs, so that we can get work done in peace.

Epoch will have a new perspective. Well, I don't think the perspective itself is really all that new; I just don't think anyone has codified it before. I think a lot of people have become disillusioned with OO (if they were ever "illusioned" to begin with), and already have this perspective. But I've never seen a name for it, or even a really concise description of what it entails - just rhetoric about why OO is bad.

I've started thinking about it as "object awareness." Objects exist, and they're very useful tools in certain areas. As such, I think it's important to allow for them. Epoch will definitely "believe in" objects in that it won't be object-agnostic, the way, say, BASIC or C is. Objects themselves will be welcome citizens. Heck, at this point, I'm moving in the direction of modelling Epoch's language within itself, using self-referential, recursive objects.

Where the line is drawn, though, is in idolizing objects. Some things just shouldn't be objects. I think the term "object oriented" carries a sort of connotation of bending everything in the direction of objects. Objects aren't merely first-class citizens; they're aristocrats, maybe even dictators. That, I think, is the essence of all that is wrong with OO.

Here's to Object-Aware Programming. May our code be more concise, our abstractions more clean, and our modules less cluttered with FooManagers.

Oooh bandwagon!

Posted 19 April 2006 · 157 views

I just joined LinkedIn. I don't know why, since I usually avoid those kinds of things as a rule, but for some reason it sounded good. Or maybe I was just bored. Maybe I thought I'd get on this bandwagon "early" while there's still leg room. I dunno.

In any case, if you add me, make sure you get the Mike Lewis that works for Egosoft - because there's a shedload of us out there, and most of them aren't me. (I did find another Mike Lewis in the Atlanta area who works in IT, which is kind of weird. I wonder how he'd respond if I came to work and started yelling at him about identity theft.)

Righty-o then... back to work for moi.

Some useless pondering

Posted 18 April 2006 · 210 views

Bill Amend is the smartest person in the universe.

Hang on... maybe I need to back up a bit and explain this properly.

So I was sitting around doing whatever, and suddenly, unbidden, something came to mind: an old Foxtrot comic strip where Jason writes a multimedia book report for school. At one point he's shown with a pile of thick, technical-looking tomes heaped on his desk. One of the titles there was "Binary Search Trees in C."

For some reason, this bugged me. My first reaction was that binary search trees aren't that complicated, and shouldn't deserve an entire thick book - maybe a 20 or 30 page chapter in a book on general algorithms and data structures.

My next thought was a sort of rebuttal to this (my mind likes to debate itself). If you need to describe the notion of a binary search tree to an absolute initiate, it could well take an entire book: examination of representations for data, comparisons of alternate search methods, some guidelines for identifying areas where a binary search tree is a good solution, explanation of tree structures in general, examinations of various traversal methods and their inherent tradeoffs... maybe, just maybe, you could get an entire book out of it.

But, I countered to myself, does that even make sense? Would such a book really be titled Binary Search Trees in C? Don't you think it would have a better title, like, say, "Searching with Tree-Based Structures" or something? The phrase "in C" really seems to indicate that this is focused specifically on implementing binary tree related stuff in the C language. So to me, at least, that sort of implies that the reader is expected to have at least passing familiarity with the concept. And in that case, describing how to implement various tree-related things in C shouldn't take an entire book - again, maybe 20 or 30 pages, tops.
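For what it's worth, the core of a binary search tree really is that small. Here's a quick sketch, in roughly the C-ish style such a book would presumably cover; insertion and search together fit comfortably on a single page:

```cpp
#include <cassert>

// A bare-bones binary search tree node.
struct Node
{
    int Value;
    Node* Left;
    Node* Right;
};

// Insert a value, returning the (possibly new) subtree root.
// Duplicates are simply ignored.
Node* Insert(Node* root, int value)
{
    if (!root)
    {
        Node* node = new Node;
        node->Value = value;
        node->Left = node->Right = 0;
        return node;
    }

    if (value < root->Value)
        root->Left = Insert(root->Left, value);
    else if (value > root->Value)
        root->Right = Insert(root->Right, value);

    return root;
}

// Search by walking down the tree, going left or right at each node.
bool Contains(const Node* root, int value)
{
    while (root)
    {
        if (value == root->Value)
            return true;
        root = (value < root->Value) ? root->Left : root->Right;
    }
    return false;
}
```

Balancing, deletion, and traversals add more pages, but not hundreds of them.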

So, to me, this title is a miniature paradox. It doesn't make sense. It is inconsistent with the book's apparent mass. So what gives?

Then it occurred to me that maybe the secret here is simple: maybe it's just that the title sounds good. I mean, really - the vast bulk of Foxtrot's audience probably isn't composed of veteran C hackers. So, by extension, it's quite likely that they don't know anything about binary search trees. In fact, for most people, seeing a book titled "Searching with Tree-Based Structures" will probably elicit a definite "WTF Mate?" reaction.

So, the phrase "binary tree" is important - because it has the word "binary" in it, and "binary" sounds, y'know, technical and stuff. Obviously then the phrase "binary tree" is technical too. So even if people do get weird mental images about oak trees made out of green glowing 1's and 0's, they'll be thinking technical. In fact, such an absurd mental image probably heightens the sense that this is a dense, specialized, genius-kids-only tome. Clearly, the notion of digitized flora is pretty strange, so a reader who gets such an impression on hearing the phrase "binary tree" is probably going to be more awed by Jason's mystical technological powers.

What of the "in C" clause, though? Again, to a totally non-technical reader, this gibberish is probably going to serve nicely to further confuse them. This isn't a bad confusion - in fact, it's a very good confusion, because it yet again reinforces the reader's impression that Jason is some kind of genius whiz kid computer magician. It also serves another, more insidious purpose: to the initiates, who will recognize the C language being referred to in the title, it's a sign of authenticity. You know who I mean: the CS dropouts who switched over to Creative Writing or something, heard about the C language once (and maybe have even seen a few lines of code). Those people now read this strip, and think to themselves, "Aha! I know about C. Clearly, this Bill Amend guy has done his homework. Clearly, Jason has teh überskillz. I am deeply impressed by his wizardry."

This title, then, is not just a paradox - it's a masterful work of mental manipulation. It's nothing less than art. Because even veteran programmers are likely to see the title, and think to themselves, "Hmmm... binary search trees. Useful things, them binary trees. And C, too - Jason's no weenie. He uses a Real Programming Language. This kid must be some pretty sharp stuff."

So really, this title is brilliant on Bill Amend's part - he's masterfully impressed the vast bulk of his readership, and drawn them deeper into the immersion of his comic-strip world. Pretty much everyone comes away from seeing that book title with a sort of sense that Jason's character is just sickeningly smart. And that's a very solid win for Amend.

He only made one miscalculation, though: the few, the proud, the skeptical and bored - well, me, anyways. He didn't count on someone analyzing the brilliance of his fictional book title. He left a little hole by which a critically-thinking programmer could rip a huge chunk out of the fabricated world of Foxtrot.

Uh oh, wait a second - this book is fictional, right? I suddenly realize that maybe my argument is baloney, because maybe it's a real book! However, a quick search of Amazon indicates that this is just a false alarm. The book is indeed an invention.

Except now another thought hits me, slowly and painfully: I've just spent a good twenty minutes dismantling the implications of a fake book title in a fricken comic strip.

What's worse, I didn't come away concluding that the author is a faker trying to twist our minds. I came away concluding that the title, while bogus, is actually brilliant in its effectiveness. To rub salt in the gaping wounds, I actually wrote a journal entry about it. Eek.

Bill Amend is truly smarter than me. There's no way that anyone can analyze his book title and find a problem with it (as far as I know), so chances are he's also smarter than everyone else in the world.

All hail the new Overlord of the Human Race.

Back in the swing o' things

Posted 17 April 2006 · 168 views

Well, it seems I've finally found my productivity gland again (after a three hour nap this afternoon, of course).

Stuff is flowing nicely and I should be able to get the plans for this cutscene system totally done here soon. I ran into a fun challenge today with controlling fine-grained timings for different events in a scene. My previous model was barfing terribly for a particular common case:

- Event A happens exactly 5 seconds into playback (hardcoded 5-second offset)
- A 2-second audio clip needs to finish playing just before the start of event A
- BUT the clip may be 3 seconds long in another language
- The clip duration has to be computed automatically!
- Solution: start the audio clip at offset (event A offset) - (clip duration)

The problem was, in the first incarnation of my timing model, the solution couldn't be expressed - at all. So I redid the timing system and came up with a far more potent solution.

Now I have a very nice framework that lets us build sophisticated and powerful timing interdependencies, with simple and easy-to-implement XML and code.

The secret? Recursion.

I have two recursive data elements: timing attributes (attributes of XML nodes), and manual timing nodes. A timing attribute is a simple XML attribute field that is allowed to take on multiple values depending on what it needs to represent:
  • An integer evaluated as an absolute quantity of milliseconds

  • A string that refers to a manual timing node's ID tag; the manual node's value is copied (lazily) and used as the value

  • A string of the form start:foo that yields the starting time offset of foo (usually a shot displayed in the cutscene, or some other similar event)

  • A string of the form end:foo that, similarly, yields the ending time offset of foo

  • A string of the form duration:foo that yields the duration of foo

Any place where a node needs to talk about timing, it uses a timing attribute.
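A sketch of how those five attribute forms might be classified when the XML is read in; the enum and function names here are my own invention, not the actual code:

```cpp
#include <string>

// The five forms a timing attribute can take, per the list above.
// These names are illustrative, not the real data format's.
enum TimingAttributeType
{
    ABSOLUTE_MS,    // plain integer: an absolute quantity of milliseconds
    MANUAL_NODE,    // ID tag of a manual timing node
    START_OF,       // start:foo
    END_OF,         // end:foo
    DURATION_OF     // duration:foo
};

TimingAttributeType ClassifyTimingAttribute(const std::string& value)
{
    // Prefixed forms first
    if (value.compare(0, 6, "start:") == 0)     return START_OF;
    if (value.compare(0, 4, "end:") == 0)       return END_OF;
    if (value.compare(0, 9, "duration:") == 0)  return DURATION_OF;

    // All digits -> absolute millisecond count
    if (!value.empty() && value.find_first_not_of("0123456789") == std::string::npos)
        return ABSOLUTE_MS;

    // Anything else is taken as a manual timing node's ID tag
    return MANUAL_NODE;
}
```

The nice property is that one classifier handles every place a timing attribute can appear, since they all share the same grammar.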

This leaves us to define what a manual timing node is. That's simple:
  • A string giving the node's ID tag

  • A base value, which is a timing attribute

  • An optional offset attribute which adds or subtracts from the base value. Both addition and subtraction are permitted in the same timing node (trust me, this will be useful in a few special cases - but I don't have time to explain right now). The offset attributes are themselves timing attributes.

So if I want to cue a sound effect at some specific point in time, I just do thusly:

	<bazaudio offset="specialtimingnode" />
	<timing ref="specialtimingnode" base="start:myshot" />

I can now express any timing relationship I want with the correct combination of manual timing nodes and automatically computed timing attributes. Cool stuff.
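To see how the resolution might actually work, here's a hedged sketch of a resolver for the motivating case above (the 5-second event and the variable-length clip). The structures and names are my own stand-ins, not the real cutscene code:

```cpp
#include <cstdlib>
#include <map>
#include <string>

// Each named event has a start offset and a duration, in milliseconds.
struct Event { int Start; int Duration; };

typedef std::map<std::string, Event> EventMap;

// Resolve one timing attribute against the known events.
int ResolveTimingAttribute(const std::string& value, const EventMap& events)
{
    if (value.compare(0, 6, "start:") == 0)
        return events.find(value.substr(6))->second.Start;

    if (value.compare(0, 4, "end:") == 0)
    {
        const Event& e = events.find(value.substr(4))->second;
        return e.Start + e.Duration;
    }

    if (value.compare(0, 9, "duration:") == 0)
        return events.find(value.substr(9))->second.Duration;

    return std::atoi(value.c_str());    // plain millisecond count
}

// A manual timing node: a base value plus a signed offset, both of
// which are themselves timing attributes - this is the recursion.
int ResolveManualNode(const std::string& base, int sign,
                      const std::string& offset, const EventMap& events)
{
    return ResolveTimingAttribute(base, events)
         + sign * ResolveTimingAttribute(offset, events);
}
```

With event A at 5000ms and a 2000ms clip, resolving a node whose base is start:eventa and whose offset subtracts duration:clip yields 3000ms - exactly the "start the clip so it ends as event A begins" behavior, with no hardcoded clip length.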

The other interesting problem that I've come up against is a little more abstract. Basically, the system I've designed is a kitchen-sink cutscene generation and rendering engine. It handles camera movement, flying all the little ships around and blowing them up, showing people talk, overlaying text and graphics onto the screen, etc. etc.

This makes the system ideal for displaying mission briefing sequences, where some talking head gives you some instructions, shows you some cool file footage of stuff exploding, etc.

Both systems are really designed, ultimately, for easy implementation. Everything is done with such recursive structures. The idea is that I can define a few simple pieces of code that handle those recursive elements, instead of writing a huge amount of special-case code to handle all the little variations and gimmicks that the content development team will need to be able to use.

Unfortunately, recursive data construction seems to be something that I understand just fine (as do most of the other programmers), but it just isn't something you use in the art/content realm too often. So the artists are at this point kind of worried that A) my system won't do what they need and B) even if it does, they'll never figure it out.

Personally, I think that as soon as I get a couple of concrete examples done for everyone to look at, it'll get a lot more clear. To be fair, I've been thinking about these problems for a couple of months virtually non-stop, whereas everyone else hasn't. So it's only natural that the solution is obviously fine in my mind, but looks a bit iffy to everyone else.

For instance, most mission briefings will look pretty much alike. So from the artists' point of view, the most logical solution is to have a special-case XML and code system that handles those particular variations that they need. To them, it's better to add a special case when they need one. To me, that means code bloat, nasty bugs, and some terribly poor code longevity (as soon as the next project starts and we need different layouts, the code is useless). So I've been doing this other thing instead.

For a while, I was just planning on saying "here's an example, let me know if you need some pointers in getting any particular effects/results you need, I promise it all works and this is the best deal for everyone involved." I think that would have been a perfectly reasonable way to handle it.

Then I had an idea.

I'm constantly raving about building layers of abstraction in code. For some stupid reason, it took me until now to realize that I should also be building layers of abstraction in data.

The solution here is to build a veneer layer that lets the content creation team express notions abstractly and in simple data. Some kind of transformation layer then expands that into the lower-level data, which is pumped into the rest of the code. Since the code understands all this recursive, emergent gibberish, and the artists don't, all I need is a translation bridge between them.

It's really pretty similar to doing something like using XSLT to generate renderable markup from XML data, for instance. Except I'm moving from one abstraction of XML to another, and the actual translation mechanism will probably be a bit of manual code wizardry.

I still need to decide when this will be done. Doing it at design time has the advantage of needing less code and data to be shipped into production, but also requires the content team to "export" from abstract to concrete data formats every time they want to see their work in action. That sucks. So at the moment I'm leaning towards having the transformation be done "on the fly" when the game first loads the data.

This way, we can do things like abstract off the common briefing layouts/formats into one layer. The content developers generate content that targets that layer. Then, the translation code magically turns that into the raw cutscene data, which is much more stable and powerful. Everyone wins.
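A rough sketch of what that load-time veneer might look like, with purely hypothetical names: the content team fills in one small high-level record, and a transformation expands it into the low-level elements the cutscene engine actually consumes.

```cpp
#include <string>
#include <vector>

// High-level data the content team writes: just the interesting bits
// of a briefing. (Illustrative names, not the actual data format.)
struct Briefing
{
    std::string Speaker;
    std::string AudioClip;
};

// Low-level data the cutscene engine consumes.
struct CutsceneElement
{
    std::string Type;       // "camera", "talkinghead", "audio", ...
    std::string Target;
};

// The transformation layer: the common briefing layout is baked in
// once, here, instead of being special-cased throughout the engine.
std::vector<CutsceneElement> ExpandBriefing(const Briefing& briefing)
{
    std::vector<CutsceneElement> elements;

    CutsceneElement camera = { "camera", briefing.Speaker };
    CutsceneElement head   = { "talkinghead", briefing.Speaker };
    CutsceneElement audio  = { "audio", briefing.AudioClip };

    elements.push_back(camera);
    elements.push_back(head);
    elements.push_back(audio);

    return elements;
}
```

When the next project needs a different layout, only the expansion function changes; the engine and the content format on either side of it stay put.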

My brain is now dangerously close to self-combustion, so it's time to go do something Else for a while. With luck I'll have a bunch of this stuff documented (and maybe even implemented) very soon now.

It feels good to get back into things.

God Wills It!

Posted 15 April 2006 · 189 views

Well, if there's one thing every new Spiritual Convert needs, it's a good Crusade to fight for.

As a Newly Reformed Smug Lisp Weenie, I feel it is my deep responsibility to the Faith to join a Crusade. The thing is, I'm too lazy to see what Crusades are currently being fought, and I justify my laziness by telling myself I probably wouldn't want to fight for those Crusades anyways. So it's easier just to start my own. This also happens to dovetail nicely with my Ulterior Motives, i.e. the Epoch language project.

I've been reading up some more on what people are thinking The Next Big Language will be. It seems there's a generally widespread opinion that The Next Big Language is going to have to come from a major corporation like Microsoft or Sun. Maybe there's some dissent, but the only really well-argued stuff I've seen (so far) basically agrees that grassroots language revolutions will just take too long to gather critical mass. They might have great languages, but they won't be The Next Big One because they lack massive support and hype.

Hype and push from a Very Wealthy Corporation can take even a miserably poor language (read: Java) and turn it into a phenomenon. Conversely, really good languages (like the Lisp family, the ML family, and so on) are not going anywhere, even though they blow away other languages by a huge margin.

It seems that there's this sort of pervasive argument going on here. The argument basically observes that Lisp hasn't taken over the world, and Java (the language, not the platform) sucks and has more or less taken over the world because of marketing from Sun (and, to a lesser degree, Microsoft). The argument then concludes that, in light of these observations, The Next Big Language will have to come from A Really Big Company. Or maybe it will be Ruby, but probably it'll come from A Really Big Company.

I think this is bogus.

In fact, I think the Next Big Language will come from somewhere else. I think it will appear in a very grassroots way, and blindside the traditional Language Creation/Adoption Pipeline. I think that a highly pragmatic approach just might change the way we think of how programming languages grow.

I kind of waxed rhetorical at the end, but you can read my schemes (hehe, a Lisp pun! I'm such a clever little disciple) over in the Epoch thread, specifically at this post.

*Cue angelic beam of light*

Posted 14 April 2006 · 207 views

A funny thing happened today.

Actually, it happened a couple of days ago, but the effects have taken a bit of time to fully hit home. I was reading around in Steve Yegge's blog archives, Lambda the Ultimate, and a few other places that I forget at the moment. The other important bit of background was that I was fooling around with some concepts for the syntax of the emerging Epoch language. Somewhere in all this, something deep in my brain snapped.

I want to use the word "sudden" to describe what happened, but that isn't really quite accurate. It was a slow thing, building over a couple of hours (and in fact is still building a bit even now). There was no one trigger, not even in the sense of a "straw that broke the camel's back." Somewhere, though, a pile of thoughts, peeves, and old questions sort of congealed, and the half-formed answers in the back of my mind passed some ineffable threshold.

Once that marker was passed, my entire view on programming - and, more importantly, programming languages - changed profoundly. As cheesy as it may sound, it honestly has given me an almost giddy sense of euphoria. There is really something verging on the spiritual about reaching such a deep mental clarity. One could almost say that I've found religion.

I've found Lisp.

You have no idea how weird it is to type those words seriously. Heck, it wasn't but a few months ago that I was casually dismissing the world of "smug Lisp weenies" (brownie points to whoever I stole that phrase from) and probing the land of imperative languages for the answer to All My Programming Problems. I've been pondering for years the need for better abstraction facilities in programming languages - although admittedly for much of that time I only knew that I was annoyed at the deficiencies of languages like Java, without being able to really clearly express why I felt them to be so crippled. In fact, a fair bit of my journey along that road is recorded here in this very journal.

Over time, things added up. Then came Tim Sweeney's presentation on programming language needs for next-generation games. That whole thing spurred off a thought in my mind that had been brewing for some time: existing languages suck. So let's build a new one! The results of that particular epiphany are still doing interesting things off in this thread.

Yet even during that discussion, I've mostly (until now) been kind of stand-offish with respect to Lisp. When the language was still called Foo, I wanted to steal some of the "nifty" Lisp features without falling off the deep end into all that crazy talk about sexprs and macros. As things evolved into Epoch, I wanted to steal Lisp's power without looking like Lisp.

I've known for quite some time now of the need for a richer language than what I'm normally stuck with using. In fact, I've often looked at functional languages like Lisp, ML, and Haskell with a sort of wistful longing - I wanted to get into all that awesome potency, but without the scary shapes and incantations that didn't really make sense to my C-addled brain. I could appreciate the need for expressivity, extensible languages, and clean methods for building abstractions. I just hated the syntax.

I think I might have stayed in precisely that position for a very long time - maybe forever - if the Epoch discussion hadn't happened. The Epoch thing got me thinking really hard about syntax, and about clean ways of expressing a very rich and self-extending set of notions. I had started slowly down the path toward a Really Convoluted Syntax when something I read (I wish I could recall what) brought me back to reality: a simple, elegant syntax will serve much better.

Actually, I think one other thing contributed powerfully to my Conversion - my work at Egosoft. I've been working on defining a lot of new data formats for some of the features I'm building. In the process, I've had a bit of a struggle - the artists and content creation folks basically want the data to be a huge blob of special-case stuff, with lots of truly disgusting edge-cases, magic symbols, and so on. I've been fighting to take a totally different approach: define all of the data recursively, building complex content in a sort of emergent way from a lot of tiny and similar building blocks. I've played with a huge number of alternatives, and this recursive approach just makes this incredibly better all around. It's more powerful, more robust, and more reliable - it doesn't need a lot of special-case code to back it up, meaning that it decreases (in the long run) the chance of stupid bugs creeping in at edge cases.

So anyways, while I was reading whatever it was that I was reading, the Big Light Bulb Over My Head clicked on. Obviously, this recursive-definition thing isn't just a good idea for game data. It's a good idea for any data. Now I can see clearly why I'm so drawn to the idea of XML but find it to be so annoying and icky at the core - it's almost this recursive thing, but not quite.

Now I can see why Lisp is such an effective language family. Now I can appreciate the syntax and the philosophy behind it. Before, it struck me as just this weird, otherworldly, almost arbitrary thing. Some lunatic had a parenthesis-fetish, and built a language around it. Thanks but no thanks, I prefer the curly braces and four-billion-page specs that can only begin to cover the silly little nuances of syntax.

Unlike any other time before, I can really appreciate the effectiveness of Lisp's syntactical approach. It's not just a random thing - it's a truly beautiful and elegant solution to some very hard problems in language design. I'm not nearly smart enough to solve those problems. Every time I try to hack out a little start at a solution, I look over at Lisp and find out that it's already done a better job, and has been for decades.

At this point, I'm in a weird position. I want a language that has the Lisp-nature, but is not Lisp. It seems like some sort of bizarre Zen mind-rape. Maybe this is the last vestiges of my long history in the C-syntax family still clinging on to my soul for dear life. Maybe I've loaded my bloodstream so thick with curly braces that I need time to be purged.

Or maybe I have this sort of niggling feeling like Lisp just isn't quite it yet. (Eww... I almost feel like a heretic for even suggesting that, when even a few days ago I would have been proud to announce it.) Frankly, I think there's some truth there. Lisp is a brilliant concept. It does a lot of things in very nice ways. And yet I can't help but wonder why it has never taken off, why stupid messy compost heap languages like C++, Perl, and Java still exist.

I've heard a lot of theories as to why Lisp has failed to take over the world. Most of them strike me as total garbage. I think the reality is that Lisp, while Well and Truly Awesome, is still a step short of optimal.

It's funny, really. Two days ago I would have had no reservations about implying that Lisp isn't The Best Thing Forever and Ever. Yet now, as a newly reborn Smug Lisp Weenie, I feel kind of dirty for even bringing it up. I haven't even written any nontrivial software in Lisp yet. Now I can truly sympathize with those who found the Truth before me; you really honestly can't explain it succinctly.

To reach enlightenment requires one to travel down a long, hard road. It takes experience and acute awareness of the deficiencies of other tools. It takes a lot of hard thinking about deep principles of programming. You really can't be led to Lisp Enlightenment by means of a clever article or book. Like nirvana, Lisp Enlightenment must be attained by each individual, walking on his own path.

I doubt I'm smart enough to invent something that is closer to optimal than Lisp, at least not by myself. But maybe there's hope in learning from Lisp. Maybe we can take the pragmatic road, look long and hard at the realities of Lisp's station in life, and see if there just might be some room for improvement. Maybe we can stand upon the shoulders of giants, and just maybe, in so doing, we can see farther.

So now I am a Smug Lisp Weenie in Training. And damned if there isn't a heck of a lot to be smug about.

Procrastination and such

Posted by , 14 April 2006 - - - - - - · 167 views

Well, I ended up falling asleep last night instead of working. Story of my life. I'm going to make a quick effort at making up for it today, but I have four challenges standing in the way:

  1. My parents left Florida today and will be stopping by in the area on their way to Indiana. So my weekend will pretty much be consumed by bumming free meals off my sister (with whom they are staying).

  2. It's Friday, and Fridays always suck for getting work done. Worse, I'd be working on a Friday night, which is just shameful, and I wouldn't at all be surprised if I get "coerced" into going out someplace tonight instead of working.

  3. My nephew has a soccer game tomorrow morning at 11 AM. So I have to be conscious by then, since I don't want to miss it, and since there's food afterwards.

  4. I have an aunt who is notorious for giving incredibly awesome gifts. This year for my birthday she gave me a little DIY robotics kit, where you assemble a little robot mouse that can follow a path using optical sensors. Unfortunately the manufacturer's site is a total mess, so I have no links, but I'll be posting a documentary of my assembly efforts.

So... my high-minded, noble attempts at productivity just got their collective butt kicked, and hard.

But robotic thingies are fun, so I really don't care [grin]


Posted by , 13 April 2006 - - - - - - · 175 views

Ever since I got back from Florida I've been dragging my feet. (Heck, I've been dragging my feet for quite a while now.) I finally got fed up with it and decided to sit down and try to figure out what the heck is going on. I've tried before to attack the problem analytically, with no real results; so this time I figured I'd just sit down and write things as they came to mind, hoping for some subconscious process to spill the beans and reveal some patterns in my thought, and thereby find some solution to the problem.

Here's what I blurbled - with only minor edits so that things make sense and are suitable for general audiences:

I don't feel like working.
I don't want to reply to my email.
I want to relax and/or goof around all the time.
I don't want to feel like I'm under pressure.
But I do want some pressure, so I feel useful.
I want to get work done, but I don't want to do work.
My attention span is very low for certain things and I've been having an unusually hard time focusing.
I feel like there are many things I need to "catch up on" on a daily basis (GDNet, games, etc.)
These things are really nothing more than ritualized addictions
After doing the rituals I usually feel "lost" - I want more rituals to do, and don't feel like starting work
Of course the solution is to do work first and foremost at the beginning of the day, and have no "morning ritual" - the ritual activities should be done only after work has been done
In some cases it is permissible to use ritual activities as a sort of zen-slap to refocus the mind; but of course mental activity has to be started first. Can't get slapped into starting.
Writing out thoughts helps release mental buildup and aids in achieving clarity
I am badly disciplined in allocating the time I have - procrastination builds on itself
I continue to think of things as "Get All This Work Done" instead of "Make a small start" which leads to feeling overwhelmed, and feeds paralysis
Due to my circumstances, I don't feel any direct pressure on a daily basis to get my job done
Sometimes I get a feeling of "I'm ready to do work now" but often this comes in the middle of other activities. I usually put off the feeling until I finish the other activity, and as a result lose the feeling - then continue the activity, and so on. Instead I should be willing to let this feeling interrupt other activities, drop games/blog reading/etc., and immediately go to work.
Self-evaluation is helpful.
Removing excess thoughts from the mind - putting them onto paper - helps analyze them cleanly, which in turn helps attain mental clarity

So there's my mind dump. I managed to get some stuff done today - more than the rest of the week so far - which is good. I'm looking at working through the night since I'm wired and full of sugar and caffeine, so we'll see if this helps keep me productive.

Anyways, on the whole, I strongly recommend trying this. Even if you don't really find any solutions, it's tremendously relaxing to at least spell out the problem. The best part is, since it's now written on paper, you can let it out of your mind. You don't have to worry about forgetting it, because you can just read your notes again if you do forget. Seems kind of silly and circular, but so far it looks like it's successfully gotten me around a nasty mental block.


Posted by , 12 April 2006 - - - - - - · 154 views

Being in a slap-happy, hyperactive, party-like-a-hamster-on-crack mood at 2:40 AM on a Thursday really sucks.

That is all.

Mmmm... games. With sugar.

Posted by , 11 April 2006 - - - - - - · 179 views

Today we got the first really official roadmap for the Next Project. I can't really give much detail due to NDA restrictions, unfortunately. The bottom line though is that we basically have about twice as much time as I was expecting to get certain major features to an alpha-complete phase.

I already more or less have my own private roadmap spelled out for those areas, and I'm gearing up for a solid sprint of coding effort in the next couple of weeks that will put my areas of responsibility solidly ahead of schedule with respect to the official roadmap.

I'm honestly sorry to have to be so vague, but the reality is, giving too much information here would hurt the project overall. Since a lot of scheduling details are still up in the air, it's dangerous to even hint at actual timing information at this point. That sort of hinting has a way of getting spread into the fan community and blown out of proportion - often wildly.

Usually, what will happen is someone will make an (often totally misguided) attempt at extrapolating things like release dates, possible feature sets, and so on from such hints. Paradoxically, the less detail is given in such hints, the wilder the speculation becomes. Then a giant version of the Telephone Game is played across the Internet. Eventually, you find out in a few weeks that your team has "promised" massive sets of features and changes in a mere fraction of the actual project development schedule.

Naturally, reality and these speculations don't really get along so well. When real life fails to produce the fruits of your fans' wildest imaginings, they get annoyed. It is terrifically hard to convince an excited fan community that you never actually promised all that stuff, and that all of it was made up. The net result is a painful (and often expensive) PR snafu.

So, as much as I'd like to excitedly detail my schedule plans and all the features that are being lined up, I'll have to keep it very fuzzy for now.

I will say that we're putting a big focus on producing high-quality gameplay content. If we hit this roadmap as planned, we're in for a very exciting iteration in the X series.

Went out and saw Ice Age 2 on the spur of the moment tonight. My flatmate and I walked into the theatre literally seconds into the previews, without having any idea if there was even a movie playing at the time. Huzzah for spontaneity [smile]

All I really have to say about the movie is this: they could continue making this series until the day I die, and, provided that the little squirrel character continues to make regular appearances, I will pay to watch every last one of them. I have no idea who is behind that squirrel-thing, but whoever you are, you're a genius.

I'm going to spend the rest of the week making caricatured gesticulations and cute squeaky grunts and sniffles.
