The Bag of Holding

Sleeping (or not)

Posted by , 13 June 2006 · 67 views

There seems to have been some confusion about my sleep schedule in my last post.

So let me briefly clarify how my sleep schedule works:

- When tired, go to sleep. This often means snoozing in my chair, or, sometimes, stumbling over to the couch. Maybe once a week I'll actually make it into my bed.

- When no longer tired, get up and do stuff.

I literally cannot remember my sleep patterns from the last several days, partly because they're totally arbitrary, and partly because I just stopped caring. It's a little weird being totally out of sync with the rest of reality, but I haven't been this consistently rested and alert in literally years, and I generally tend to manage quite long runs of work before slouching back off to zzz-land.

It sounds insane and utterly unhealthy, but so far it's been going great. Working from home has its benefits.

The only currently recorded side effect is pondering hot elven women at 4:30 AM. I'm still not entirely sure if that's a bad thing, though.

Epiphanies feel like electrified chocolate

Posted by , 12 June 2006 · 81 views

Proactive de-funk step one: chug about 75% of a can of Monster. Guaranteed to at least make your skin tingle, if not actually correct motivational deficiencies.

I know this is annoyingly soon after my last post, but I just have to brain-dump it, and it's too important for an edit. Plus, an edit would totally ruin the dramatic tension I left at the end of the last post. This way is better for all of us - many rainforests will be spared. Or something. (The caffeine and sugar are really starting to kick in.)

So I was thinking about how cool and awesome my new laptop is, and how I always get motivated to do lots of work whenever I get expensive new toys. The only problem is, compared to my regular workstation with a regular comfy keyboard/mouse combo, the laptop just isn't comfortable. Sure, in a hotel room in Europe it's great; but back in my Lair of Power it's merely good, and outstripped by the Better that is my desktop machine.

One of the things I really enjoy about desktop computing is dual monitor configurations. I swear, after using dual-head systems for a while, going back to single-screen is like being strangled to death and drowned in damp sand at the same time - and without hot elven women there to help assuage the pain. (See what I mean about the sugar and caffeine? I may also be slightly feverish, and the associated brain-cell-barbecue is likely not helping. I sure hope to God this is as funny tomorrow morning as it is in my head right now. Although I'll settle for it just not being bizarrely pathetic.)

So right... I was talking about something meaningful, supposedly. Laptops and such. The thing I like about this laptop is the übersexy 1440x900 WXGA+ resolution, which is just so much more awesome than 1024x768. The whole 17" thing is pretty hot, too. Unfortunately, it still gets trumped by dual 17" 1280x1024 LCDs, which simply provide far more real-estate-power. And Billybob knows it's all about the real estate.

When I code, I fill up a metric buttload of space. I often run two instances of VS side-by-side, one on each monitor. Even in less severe scenarios, I almost always have a stack of documentation, note-taking, and assorted web-browser windows open on my secondary monitor. Real estate is amazingly valuable.

Naturally, then, one thing that just made me hot and bothered at the Egosoft office is the artists' display setups: a 21" widescreen LCD with a second 19" on the side for spare stuff. They run some ungodly resolution with a large number of digits in it. It's very little short of orgasmic for someone who, like me, can never acquire enough screenspace.

So anyways... back to the whole buying-toys-to-get-productive thing. Obviously the new-laptop toy is great, but it has limited usefulness here at home. I also want a new development workstation, because after a dual-core system, going back to an Athlon XP 2400 is murderously slow. The little glitches while I try to web-surf and do large compiles never really bothered me before, but the laptop's Core Duo power gave me a brief taste of life without those hiccups, and that taste was better than cocaine.

That got me to fantasizing about tech hardware, which is never healthy. I figured, heck, if I'm dreaming about dropping $2500 on a blazing-hot new development rig, why not do it right and buy a screen upgrade, too, seeing as I'm always so short on screen real-estate and all? This of course brought back to mind the awesome potential of the widescreen/dual setup.

Unfortunately, I realized that that just won't cut it. Because what I need isn't more pixels. What I need is smarter software.

VS2005 is nice. It really is. There are some hiccups with it still, like the fact that 9 times out of 10 if I double-click a workspace in Explorer VS will just deadlock and presumably go do something unspeakable with itself in a dark corner somewhere. I really don't know what happens. Oh, and the Intellisense background update is still horridly broken.

But it still does one thing badly, and that's dual-head support. I'd really love to be able to tear off windows and slap them on another display for a while. I can do this to a limited degree with things like toolbar windows, but there are some bad Z-order issues that arise when working with other apps on the second monitor. In general, it just doesn't work.

I'd settle for one of two things, honestly. The first solution would be the easy, cheap hack, and make me happy for at least a while: hide a little toggle-switch in the IDE somewhere that says "when I click Maximize, I really want you to inflate over all available monitors instead of just the one you're on right now." I can do this if I un-max the window and resize it by hand, but that leads to ickiness of its own. It also makes it a pain in the ass to write code, because code windows fill up two monitors when what I really want is to have a code space on each monitor.

The second option would be far more complicated to build (I suspect) but infinitely cool: open two IDE "parent windows" that are actually the same instance of VS and act on the same open workspace. As it stands, if I try to open the same workspace in two separate instances of VS, one will barf because it can't access read-only stuff like Intellisense databases, which the other instance has locked. This makes it more or less worthless; I may as well just open Notepad on the other monitor for all the benefits that VS retains in that mode.

So if I could just have VS intelligently handle dual-monitor setups, that would be so damn cool. I know art packages have done that for years; Photoshop does it nicely, 3DSMax does a reasonable job, and Final Cut Pro on the Mac is undoubtedly one of the best dual-head-aware pieces of software I've ever seen. When VS can do it, too, I will be one happy geek.

... yeah, all that was pretty much pointless. Except to say that VS has crappy support for dual-monitor setups.

(And someone please, please, please tell me that this magic switch already exists and I just suck because I haven't found it yet.)

Cross-platform ports FTW

Posted by , 12 June 2006 · 103 views

X2 has just been released for Linux. Woohoo, etc.

On a less corporate-shill note, this week sucks. I started out this weekend fighting off a nasty runny nose. Mysteriously, it dried up overnight and hasn't been back since, but it left behind it a mild fever and the general achy malaise that tends to go with being sick. I also have some wicked bad sinus headaches that keep flaring up every few hours, making it basically impossible to get work done.

So instead I've been reading through my Foxtrot comic book collection, which is sadly sparse. I'm down to one last book, and then I may have to fall back to my Calvin and Hobbes stash. As much as I enjoy C&H, starting a marathon would be decidedly unwise - seeing as I own virtually every strip ever published in some format or another, it usually takes me a solid week to get through my whole collection, and that means a week of not getting work done. As much as I want to procrastinate a whole bunch right now, that's kind of not really a good idea, seeing as this 3D engine rewrite thing is sort of critical to finishing the game we're working on.

Lé sigh.

I feel a little swamped at the moment. There's this engine stuff for work, plus the Epoch project which I really want to get moving on, the rewrite of the TKC software (which, seeing as I'm still getting a steady stream of faithful donations, I feel like I really should work on again someday), a stack of domestic chores to get done... and that's just "work" stuff. Reading comic books for a day and a half has really, powerfully tempted me to get back into all the other reading I want to do, which I frankly just can't afford the time for right now.

(So why I'm burning time posting this crap is beyond me...)

All in all, I'm in a dangerous funk at the moment, and I really, really, really need to break out of it. Normally at times like this, I'd inspire myself by gaming for a few hours. Unfortunately, the northbridge fan on my gaming rig's motherboard died just before I went to Germany, which leaves that machine unusable. Replacing the fan is fairly easy, but it's one of those "urgh I really can't be arsed" sort of mundane tasks that just keeps getting put off. So, bottom line, I either game on my workstation (urgh for crap hardware specs), my new laptop (urgh for bad ergonomics... hey, that's poetic!), or in my imagination.

I know that if I don't get up off my butt soon and start getting work done, this is going to turn into a long spiral. Feeling sick tends to make me want to bum around a lot and read, which in turn gets me sentimental, which makes me dig up old games, which makes me burn a lot of time. Generally, this also means I don't rest very much, which means I stay sick for a while. And I'm sure you can see where all this leads.

The problem with being in a funk is that you don't want to get out, even if you know you should. Blargh.

"Of course, maybe if I read Masters of Doom again for the bajillionth time, that will motivate me to get going!" Yeah, except it won't, and all it will really do is waste time. The problem is, every time I manage to Alt+Tab over to Visual Studio, my eyes just kind of glaze over and I end up back on GDNet or reading webcomics or whatever.

I need a shot of testosterone or something... anything to get me psyched up and in a "RAHHR FORTY-EIGHT-HOUR CODING MARATHON TO DESTROY ALL OTHER GAMES FOREVER GRAHHHH!!!" sort of mood.


Posted by , 11 June 2006 · 78 views

I'm a fairly big addict of Digitally Imported's Internet radio service. The bulk of my time is spent in the Chillout channel, as I find the selection there (for the most part) to be an incomparable backdrop for late-night hacking sessions.

The mixed blessing and curse of a service like DI is that it's crammed with extremely rare tracks. I know of one or two cases where the track played by DI is the only existing copy of the song in the world. On the one hand, the nominal (and voluntary) subscription fee is a bargain considering the ease of access one has to such works. On the other hand, it can be incredibly hard to listen to a particularly masterful track and know that you'll never be able to go down to the store and buy the CD. Some of this stuff is literally so rare that you can't even find it via certain... shall we say, less than reputable Internet sources.

I keep a list of tracks that I've heard that I want to try and find someday. Several I've been able to buy, and some can only be had by resorting to more extreme methods. In every case I'd more than gladly give a tidy portion of my income to the original author for the privilege of listening to their music on demand.

I could turn this into a lengthy rant of ire about how modern music distribution encourages crap to be churned out and jostle for position on some self-important "Top X" listing-du-jour. I could vent my considerable frustration about how I'm expected to pay ridiculous sums of money and submit to idiotic terms-of-use just to listen to garbage, while in the meantime the labels refuse to sign genuine talent and brilliant artists like Marco Torrance. I could say a lot of very unhappy things about trying to enjoy music in this day and age - things which shouldn't have to be said, given the unprecedented and truly remarkable ability our modern civilization has to distribute media like this.

I'm really not in the mood to rant, though. I'm mostly just sad, and the usual rip of nostalgia is making its routine surge. I've said it before, and likely will have occasion to do so again several times, but I'm far too sentimental for my own sanity [smile]

I sort of feel vaguely dirty and foolish bringing it up, but it captures my mood so well, and has come to mind repeatedly this evening. I'd love to be able to say that I drew this little slice of my soul from some great writer like Shakespeare... or... some other great writer. But, frankly, I don't really know that many great writers, and I haven't read that much of the "great literature." So with a mixture of shame and geekish pride, I have to reveal that one of the most powerful pieces I ever read on this subject was a Star Wars short story.

I don't recall many details, and those I do remember are likely butchered. The vast majority of the other short stories in the book were total crap, as is par for the Star Wars course (Timothy Zahn excepted). This one stuck with me not at all because of the franchise branding, but because of the subject material itself.

As I remember it (which again is not particularly reliable), there was some legendary mass-murdering psychopath alien who had committed horrendous acts of genocide during his military service. There was some explanation of this, but I don't recall what it was. The interesting thing was that he had escaped justice for decades, and had hidden himself away on some obscure planet or whatever, living in an underground hovel.

The pride of his life was a literally priceless collection of music. In his subterranean lair, he had collected thousands of recordings - literally the last music surviving in most of the universe. (Apparently the Emperor was not fond of tunes, or whatever. I don't remember that, either.) In the end, something dramatic happens, and he dies listening to the only existing copy of some famous song in the entire galaxy.

Or maybe he didn't die, or something. Heck, I don't remember at all. Just that there was a lot of rare music involved, and he was the only one with access to it. I suppose it loses all of its dramatic punch when recounted without any of the actual details of the story.

But believe me, it was deeply moving. Especially when one gets done hearing Lluvia del Verano and knowing that, chances are, it'll never get signed onto a label, despite the fact that Torrance is a borderline-genius artist.

DI offers an amazing chance to enjoy some of that artwork; it's a chance that, even 15 years ago, would have been totally impossible. The hardest part is knowing that that short glimpse is all I may ever get to have.

But really, isn't that just part of what makes it so special?

Where mah garbageman?

Posted by , 08 June 2006 · 86 views

Let me just register my sentiment, for the record, that manual memory management is crap. Therefore, by extension, C++ is crap.

I honestly wonder how much cleaner this stupid scenegraph would be if I could just use a proper garbage collected language. Unfortunately, I don't have that choice right now. Even a bolt-on GC for C++ is really more than we can afford to try and integrate right now.

I think if it weren't for boost::shared_ptr I'd have jumped out of my window by now.

Of course, that wouldn't have done a whole lot, seeing as I live on the ground floor and all. But the symbolic act of hopelessness and despair would have been awe inspiring.

So anyways, at the moment I'm working out all the gritty details of who exactly "owns" scenegraph nodes, who creates/destroys them, and all that such stuff. Once I get it all figured out, I'll do my RWGD post and talk about the results, as well as the cameras-aren't-scenegraph-nodes thing.
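
The kind of ownership policy I'm wrestling with can be sketched in a few lines. This uses std::shared_ptr (the modern descendant of the boost::shared_ptr I mentioned in the last post), and the class and member names are purely illustrative, not what the engine actually uses - just one possible answer to the "who owns nodes" question, where each node owns its children and a subtree dies when the last reference to its root goes away:

```cpp
#include <memory>
#include <vector>

// Illustrative sketch: parent nodes hold shared ownership of children.
// When the last shared_ptr to a node is released, its whole subtree
// is destroyed automatically - no manual delete bookkeeping.
struct SceneNode
{
    std::vector<std::shared_ptr<SceneNode>> children;

    void AddChild(std::shared_ptr<SceneNode> child)
    {
        children.push_back(std::move(child));
    }
};
```

The tradeoff, of course, is that shared ownership makes it easy to accidentally keep subtrees alive longer than intended, which is exactly the kind of gritty detail that needs working out.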

Assuming, of course, that anyone in the world actually gives a flying lemur kidney about any of it [grin]


Posted by , 07 June 2006 · 135 views

I think that puddle of ooze that trickled out of my ear was the remainder of my brain.

So far so good - the basic framework for the scenegraph has been put into place, and the high-level render algorithm is more or less fully stubbed in. I'm about 80% of the way through reimplementing camera abstractions; part of the nasty work is moving away from treating cameras as scenegraph nodes and making them separate entities. (Sometime when I rally some energy and English writing ability I'll do another Real World Gamedev entry on why I made that decision.)

One of the big blocks of functionality left to reimplement is our internal scene buffer/soft-instancing mechanism. It's currently woven quite deeply into the data structures and code for the scenegraph, although ideally it should be a separate module that the scenegraph nodes themselves are totally unaware of. Decoupling and reimplementing that system will be a big step towards getting the camera logic finished.

Once that's done, I really just have to stub in render delegation calls to each scenegraph node; I'll come back through when I implement the rest of the scenegraph classes and have each node class do the correct "stuff" when it is Render()ed. Then the fun bit will be making sure the camera transformation pipeline is still sane, and the camera logic will basically be done.

Wheeze! This is a heck of a project, but at least progress is steady and still on target. Things are already looking a lot cleaner (and a fair bit more efficient) so morale is still good.

I also got ahold of my Frisco Melt (if you know why I mention that, you spend too much time on GDNet) and so I'm very happy.

Off to relax a bit, and then dive back into the code.


Posted by , 06 June 2006 · 97 views

So X-Men 3 was pretty good. I never touched any of the comic books (the medium in general tends to be a little too corny for my taste, with only a few exceptions) so I really don't care about the "canonicity" of any of it. As such it was a good movie, and quite impressive in that it's the third movie of a trilogy and didn't suck. That's no mean feat in the world of trilogies.

Unfortunately, I am obligated to hate the movie simply because it is commonly referred to as "X3." Bastards.


Posted by , 06 June 2006 · 88 views

I'm back in the USA now [crying]

Time to copy all my work off the laptop and start buckling down.

Real World Development: Volume 1

Posted by , 04 June 2006 · 82 views

Real World Game Development
How Decisions are Made in a Commercial Game Project

I always enjoy reading this kind of stuff from other teams; it's fascinating to see the parallels between our projects and others'. Sometimes we make similar decisions, other times our needs, preferences, and/or whims simply take us in other directions.

With the new project I have on my plate of massively overhauling a 3D game engine, I thought it would be interesting to catalog some of my own processes and decisions. This is partly useful to me as it provides a way to force myself to think through and rationalize each decision, for an audience likely to be far more objective about certain things than anyone on the team. It's also a great documentation dumping ground - I can come back here and refer to my decision-making process when the time comes to write it up formally for internal docs.

And who knows... maybe someone will find something interesting to read in all this lot of bilge [smile]

Determining what data format to use
One of the most significant challenges of building an engine into an existing game codebase is achieving interoperability. Decisions that would be utterly trivial to make in a new system can have really massive consequences when you're trying to work inside a large body of code - most of which you do not want to have to rewrite. In many cases, decisions that are hard enough by themselves become horrendously complicated by the fact that they can cause the rest of the project to spiral out of control. Much of the challenge is actually a form of damage control, where we have to minimize the ripple effects of our decisions and changes.

Getting Past Fixed Point Data
We have a tremendous amount of legacy code using fixed-point arithmetic. Back when that code was first developed, it was a good idea; certainly it was faster than floating-point. But now, times have changed - a lot - and fixed point operations like multiplication are, by necessity, going to be over twice as slow as floating-point.

Obviously, in a totally new system, that's a no-brainer: just use floating point! However, that option isn't so simple for our case. We have major systems which, by virtue of their design, just simply won't be able to use floating point values any time soon. So while we will be eliminating a lot of fixed-point code, much of it will have to remain. Thankfully, the system is factored well enough that we can define clear interfaces across which we have to convert from floating-point to fixed-point.
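
For the curious, here's roughly what that legacy arithmetic looks like. This is a minimal 16.16 fixed-point sketch of my own - the engine's actual format and helpers differ - but it shows why multiplication pays a tax: you need a 64-bit intermediate and a corrective shift, which is part of why it now loses to a plain hardware floating-point multiply.

```cpp
#include <cstdint>

// Illustrative 16.16 fixed-point type: 16 integer bits, 16 fractional
// bits, packed into a 32-bit integer. Not the engine's actual code.
typedef int32_t fixed16;
const int FIXED_SHIFT = 16;

inline fixed16 FixedFromInt(int v)     { return static_cast<fixed16>(v) << FIXED_SHIFT; }
inline int     IntFromFixed(fixed16 v) { return v >> FIXED_SHIFT; }

// Multiplying two 16.16 values yields a 32.32 result, so we widen to
// 64 bits first and shift back down to keep the fractional point fixed.
inline fixed16 FixedMul(fixed16 a, fixed16 b)
{
    return static_cast<fixed16>((static_cast<int64_t>(a) * b) >> FIXED_SHIFT);
}
```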

This issue took a couple of hours to talk over and figure out, but overall it wasn't too bad. The really nasty ones would come later.

Choosing Data Precision
OK, so we more or less settled on floating-point data. Unfortunately, it's not that easy. In the C++ environment we're developing in, we have two choices: float or double. floats are 32 bits, have no additional memory footprint above the existing fixed-point system, and are fairly effective for many purposes. doubles are 64 bits, which means extra storage space, and might possibly cause alignment problems. However, they do provide vastly better resolution and accuracy than floats, so they have their uses.

In fact, we know from (harsh) experience that we have to use double-precision values for many things. World-space coordinates, for instance, are huge in our game, and double is literally the only accessible data format that can cope with having vast distances and minute details in the same coordinate space. So many areas have to be double anyways.
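
The failure mode is easy to demonstrate. At ten million units from the origin, float's spacing between representable values is a full unit, so a tiny positional nudge simply vanishes; double still resolves it with room to spare. (The numbers here are illustrative, not our actual world scale.)

```cpp
// float has a 24-bit significand: near 10,000,000 its precision step
// is 1.0, so adding 0.01 rounds back to the same value.
bool FloatLosesSmallStep()
{
    float pos   = 10000000.0f;   // ten million world units out
    float moved = pos + 0.01f;   // try to nudge by a small detail
    return moved == pos;         // true: the nudge was rounded away
}

// double has a 53-bit significand: the same nudge survives easily.
bool DoubleKeepsSmallStep()
{
    double pos   = 10000000.0;
    double moved = pos + 0.01;
    return moved != pos;         // true: double still resolves it
}
```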

Worldspace position coordinates, transformation matrices, even direction vectors are all fairly lightweight, so representing them in double is no problem at all. Where it gets tricky is the actual polygon mesh data which is passed to the hardware.

Our polygon meshes are huge, and doubling the storage requirements (by going from a 32-bit representation to double's 64 bits) would have very bad consequences for the game's memory demands. Obviously, it is in our best interests to minimize this storage size. There's one more chainsaw to juggle: our vertex buffer format only understands 32-bit floats; and we can't change that without totally screwing our minimum hardware requirements.

The solution, of course, lies in the fact that position and transformation data (worldspace coordinates, rotation values, etc.) is independent of the pre-transformed mesh. The hardware does the final transformations anyways, and we can pass it doubles without problems. In the worst-case scenario, we can convert back to float just before we send stuff to the hardware, and that will be rare enough that it won't totally kill performance.
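
That worst-case narrowing step might look something like this sketch. The struct and function names are hypothetical, and the origin-relative subtraction is a common trick rather than necessarily what our engine does: by doing the subtraction in double first, the value handed to float is small, so the narrowing loses nothing that matters.

```cpp
// Hypothetical conversion from double-precision world coordinates to
// the float-only vertex format the hardware understands. Subtracting
// a nearby origin (e.g. the camera) in double before narrowing keeps
// the result small enough for float to represent cleanly.
struct Float3 { float x, y, z; };

Float3 NarrowForHardware(double wx, double wy, double wz,
                         double ox, double oy, double oz)
{
    Float3 out;
    out.x = static_cast<float>(wx - ox);
    out.y = static_cast<float>(wy - oy);
    out.z = static_cast<float>(wz - oz);
    return out;
}
```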

We're not out of the woods yet, though. There are large subsystems - written to use fixed-point math, naturally - which need access to vertex data in order to do things like collision detection. Rewriting those modules is out of the question for now (unless we manage to discover another half-dozen programmers) and so we'll have to provide a way for them to access geometry data natively.

There are three options: store one copy of the mesh data in floating-point, and convert to fixed-point when needed by collision checks etc.; store one copy of the mesh data in fixed-point and convert it to floating point each frame before it is sent to the vertex buffer; or store parallel copies of the mesh, one in each format. The first two have severe performance implications, and the latter has severe memory implications.

However, the third option really isn't so bad. We only need to use a reduced LOD version of the mesh for those systems that read the mesh data, meaning it will be significantly smaller than the full renderable mesh data. We can also strip out some things like normals, texture coordinates (occasionally), and such. Even better, not everything that is rendered needs to have a readable mesh, so there will be some cases where we don't have to store the second copy of data at all. So from a storage standpoint, the impact isn't too bad - and the fact that it avoids major performance problems and major code rewrites makes it the clear winner.
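
The winning layout might be sketched like this - all names hypothetical, but it captures the shape of the decision: the full renderable mesh stays in the hardware's 32-bit float format, while the legacy fixed-point systems get an optional, reduced-LOD copy stripped down to positions only.

```cpp
#include <cstdint>
#include <vector>

// Full-detail vertex in the GPU's format: position, normal, UVs.
struct RenderVertex    { float x, y, z, nx, ny, nz, u, v; };

// Collision-only vertex: fixed-point position, no normals or UVs,
// so the parallel copy is far smaller per vertex.
struct CollisionVertex { int32_t x, y, z; };

struct Mesh
{
    std::vector<RenderVertex>    render;     // full LOD, sent to the hardware
    std::vector<CollisionVertex> collision;  // reduced LOD; empty when the
                                             // mesh is never read back
};
```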

Stay tuned...
That's already quite enough reading for one day I think [smile] Besides, the next issues I want to cover relate to designing the scenegraph itself, and there are still some things I haven't had a chance to think out fully yet.

So... more later. How much later I do not yet know, but I hope that between now and then there will be lots of sleeping.

Bier ist gut

Posted by , 03 June 2006 · 109 views

I finally got around to sampling a new local beer today. The brew of choice was Prinzregent Luitpold, another hefeweizen. The color is medium-dark and very cloudy, which is more consistent with the weissbiers I've had in the past. It has a sharper flavor than the Erdinger, and a very rich fruity aftertaste. Altogether it would be an excellent steak-and-potatoes beer, although it still went quite nicely with the schnitzel cordon bleu and potatoes that I had.

Somehow, in my butchering of trying to speak (and understand) German, I ended up with two beers. Thusly I had one as a meal drink, and the second by itself. Even as a standalone beer, the Luitpold fares well, even though I tend to prefer having a meal along with my brew.

If I get a chance tomorrow, I'm going to try and get ahold of a darker beer, just to see what the local offering is like.
