Most of us graphics people spend our time living on the bleeding edge of technology. New papers, new techniques, new hardware, and new effects are our obsession. Personally, I've always found the past fascinating. What was the bleeding edge 15 years ago? Or, a slightly different question: What did people think the bleeding edge was 15 years ago? What did they predict would be the state of the industry 5 years ago?
I was skimming some presentations today, and there's some really rather interesting stuff from way back. The history relating to Microsoft and DirectX after the Windows era began is particularly colorful. It's no secret that I'm a huge fan of Direct3D and the Windows platform today, as well as the efforts MS is undertaking now and going forward. Looking back, though, there were some painfully hilarious missteps. Today's flashback is all about Microsoft Talisman.
Reading the wiki page is enough to get all the facts on the matter (all the ones I know about, anyway). Still, the magnitude of how misguided the whole thing was might not sink in immediately. Talisman was built around one basic assumption: that memory bandwidth would quickly become the ultimate bottleneck in consumer computer graphics. The goal was to drastically reduce the required memory bandwidth, apparently by using some kind of psychotic recomposition model. Basically, if something hasn't changed since the last time you rendered it, don't render it again; simply blit it over to where it needs to be. Parallax was handled via affine transforms, until the accumulated error was great enough that a 32x32 block of pixels needed to be re-rendered. (Note that these blocks exist in 3D, for various objects, so occlusion is going on.) There was also some extra cleverness for patching the blocks together without seam problems. No framebuffer, either; all of this was generated as the monitor drew. Naturally, the application had to be structured in a very specific way to accommodate this sort of rendering pipeline.
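The warp-until-the-error-gets-too-big idea can be sketched roughly like this. This is a toy model, not Talisman's actual math: the error metric, the point sets, and the one-pixel threshold are all my assumptions for illustration.

```python
# Toy sketch of Talisman-style block reuse (hypothetical names/threshold).
# A cached 32x32 block is warped with a 2D affine transform each frame;
# once the affine approximation drifts too far from the true projected
# positions, the block is re-rendered instead of re-warped.

import math

ERROR_THRESHOLD = 1.0  # max tolerable screen-space error, in pixels (assumed)

def affine_apply(m, p):
    """Apply a 2x3 affine matrix m to 2D point p."""
    x, y = p
    return (m[0][0] * x + m[0][1] * y + m[0][2],
            m[1][0] * x + m[1][1] * y + m[1][2])

def block_error(affine, cached_pts, true_pts):
    """Max distance between affinely-warped cached points and where the
    renderer says those points actually project this frame."""
    worst = 0.0
    for c, t in zip(cached_pts, true_pts):
        wx, wy = affine_apply(affine, c)
        worst = max(worst, math.hypot(wx - t[0], wy - t[1]))
    return worst

def update_block(affine, cached_pts, true_pts):
    """Decide whether to cheaply warp the cached block or re-render it."""
    if block_error(affine, cached_pts, true_pts) <= ERROR_THRESHOLD:
        return "warp"      # blit the old pixels through the affine transform
    return "rerender"      # drifted too far: draw the block from scratch
```

If the block just slid sideways, the affine warp reproduces it exactly and you keep blitting; once perspective shear shows up that an affine transform can't express, the error climbs past the threshold and you pay for a real re-render.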
I don't know how this sounded in 1996, but right now it sounds totally insane. I think it's not a bad guess that it sounded insane back then too, because by the time Talisman was getting ready to go, modern 3D accelerators appeared (the Riva 128 presumably), solved the bandwidth problem by simply throwing large amounts of on-card and on-chip memory at it, and tore Talisman to shreds on performance.
So, total failure?
It's funny how useful things can come out of even the most misguided project. In this case, we owe a couple things to Talisman:
* Texture compression, and more recently, Z buffer compression. These are both fairly vital for modern graphics applications. Disabling Z compression by accident really sucks, because your app will suddenly get much slower for no apparent reason, especially in heavy multipass situations. And texture compression is a no-brainer.
* Tiled rendering. The Xbox 360 uses predicated tiled rendering, because the framebuffer does not generally fit in the 10 MB of embedded DRAM (eDRAM) set aside for it. So the screen needs to be split into pieces and rendered one piece at a time.
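The arithmetic behind that is simple enough to sketch. The per-sample byte counts below are my assumptions (32-bit color plus 32-bit depth/stencil), not exact 360 figures:

```python
# Rough sketch of why a 720p framebuffer with MSAA overflows a 10 MB
# memory budget and forces tiled rendering.

EDRAM_BYTES = 10 * 1024 * 1024   # the 10 MB set aside for the framebuffer
BYTES_PER_SAMPLE = 4 + 4         # assumed: 32-bit color + 32-bit depth/stencil

def tiles_needed(width, height, msaa_samples):
    """Minimum number of screen tiles so each tile's buffers fit in eDRAM."""
    total = width * height * msaa_samples * BYTES_PER_SAMPLE
    # Ceiling division: one tile if everything fits, more otherwise.
    return -(-total // EDRAM_BYTES)
```

Under these assumptions, 1280x720 with no MSAA is about 7 MB and fits in one tile, while 4x MSAA balloons to roughly 28 MB and needs three tiles, each rendered in its own pass.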
* Anisotropic texture filtering was brought to the forefront by Talisman.
* Not useful per se, but BeginScene and EndScene were added to DirectX here.
The next entry in this series is yet another Microsoft failure, and SGI is in the mix too. Some of you already know what I'm talking about.