Yaaaaaay, antialiasing is now working nicely. It's a little ugly to get it into the engine, since you need to know the D3D backbuffer and depth-stencil formats before you can find out how many multisample quality levels the device supports.
So, where I used to have:

    app.Init(1024, 768, bFullscreen);

I now have:

    app.Init(1024, 768, bFullscreen);
    u32 nAntialiasQuality = gfx.GetMaxAAQuality();
Which still isn't too bad I suppose, but I'd prefer not to have to init in two steps like that.
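For anyone wondering why the formats matter, here's a rough sketch of the sort of query involved in D3D9 (just my guess at the shape of it, assuming the standard IDirect3D9 interface; this isn't the engine's actual code, and it needs d3d9.h from the DirectX 9 SDK to build):

    // Sketch: the quality-level count depends on the surface format,
    // which is why the backbuffer/depth-stencil formats have to be
    // decided before you can pick an AA quality.
    #include <d3d9.h>

    DWORD QueryMaxAAQuality(IDirect3D9* pD3D, D3DFORMAT backbufferFmt,
                            D3DFORMAT depthStencilFmt, BOOL windowed)
    {
        DWORD colorLevels = 0, depthLevels = 0;
        // Both the colour and depth-stencil surfaces must support the
        // multisample type, so take the minimum of the two counts.
        if (FAILED(pD3D->CheckDeviceMultiSampleType(D3DADAPTER_DEFAULT,
                D3DDEVTYPE_HAL, backbufferFmt, windowed,
                D3DMULTISAMPLE_4_SAMPLES, &colorLevels)))
            return 0;
        if (FAILED(pD3D->CheckDeviceMultiSampleType(D3DADAPTER_DEFAULT,
                D3DDEVTYPE_HAL, depthStencilFmt, windowed,
                D3DMULTISAMPLE_4_SAMPLES, &depthLevels)))
            return 0;
        return colorLevels < depthLevels ? colorLevels : depthLevels;
    }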
Things on my [graphics] TODO list now:
Instancing of some sort for rendering multiple models. Might not bother with this, as I doubt it'll be that useful.
Keyframe blending in the vertex shader instead of on the CPU. Means a larger vertex buffer, but it shouldn't be that bad, and will mean I can slerp and do other cool things if I want to.
Support render to texture. Not quite sure how this will work, probably have a CreateRenderTarget() function, and allow the game code to explicitly work with render targets. The engine code will handle round-robins to keep performance up (Random fact: Guild Wars uses a double-buffered render to texture system for this for performance reasons).
Support HDR lighting. Not sure how much work this will be, I really don't know much about it. All I know is that it needs a floating point render target, and I believe it involves two passes: one render to the floating point render target, and then another to convert to a 32-bit ARGB render target. No idea if it needs a pixel shader.
Pixel shaders. Not quite sure how to handle this, since I don't want game code to worry about requiring a certain pixel shader version. So ideally, I'd like to support a "fallback" rendering method.
Get rid of my graphics-related singletons completely. Yes, I have several (Texture manager, sprite manager, graphics manager), please don't beat me...
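On the keyframe blending item: the "slerp" is spherical interpolation between rotation keyframes. A minimal sketch of what that looks like on the CPU side (my own generic version, not engine code; `Quat` is a made-up struct here):

```cpp
#include <cassert>
#include <cmath>

// Made-up quaternion type for illustration.
struct Quat { float w, x, y, z; };

// Spherical linear interpolation between two unit quaternions.
Quat Slerp(Quat a, Quat b, float t)
{
    float dot = a.w*b.w + a.x*b.x + a.y*b.y + a.z*b.z;
    if (dot < 0.0f) {               // take the shorter arc
        dot = -dot;
        b.w = -b.w; b.x = -b.x; b.y = -b.y; b.z = -b.z;
    }
    if (dot > 0.9995f) {            // nearly parallel: lerp and renormalise
        Quat r = { a.w + t*(b.w - a.w), a.x + t*(b.x - a.x),
                   a.y + t*(b.y - a.y), a.z + t*(b.z - a.z) };
        float len = std::sqrt(r.w*r.w + r.x*r.x + r.y*r.y + r.z*r.z);
        r.w /= len; r.x /= len; r.y /= len; r.z /= len;
        return r;
    }
    float theta = std::acos(dot);   // angle between the two rotations
    float s0 = std::sin((1.0f - t) * theta) / std::sin(theta);
    float s1 = std::sin(t * theta) / std::sin(theta);
    Quat r = { s0*a.w + s1*b.w, s0*a.x + s1*b.x,
               s0*a.y + s1*b.y, s0*a.z + s1*b.z };
    return r;
}
```

Doing this per bone on the CPU is cheap; the win from moving blending to the vertex shader is mostly for per-vertex position/normal lerps rather than the slerp itself.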
That's about it really. I also finally played with PIX, NVPerfHUD, and CodeAnalyst. They didn't really tell me that much; CodeAnalyst was the most useful, telling me that I was only using one CPU core (49% of app time in the processor driver) because I forgot my test app is only single-threaded. So it now spawns another thread to do some texture updating. It also told me that my 128-bit (4xfloat) colour to 32-bit (ARGB unsigned int) function is really expensive: 5% of my app time spent there. I might have to just use 32-bit colour everywhere, and then have a "high resolution" colour class I can use where needed.
Also, my laptop is back (Well, in Edinburgh). I sent it off a couple of weeks ago to get the touch pad repaired since it was completely knackered. So I'll see if it works now, and I'll also have a low-spec machine to test on. Also also, my sister has donated her old PC to me - another (low spec) one to play with. Hooray!
EDIT: In other news, I was looking at the changelist for one of the files I maintain:
< // Maintained by: Captain Steve Macpherson -> // Maintained by: Private Steve Macpherson
Looks like I got demoted during my week off [sad]