I've had a pretty crap time with programming as of late - I have no idea why, but every project I've been working on has been in a state of general brokenness for a week or so now. Anything and everything I tried seemed only to make the situation worse [headshake]
So, tonight I actually managed to fix something.
For my university dissertation I was working on a simple bloom effect for the HDRI post-processing. I've designed it to be more flexible than my DirectX SDK sample, but for now it's essentially the same.
Which is why it not only confused the hell out of me, but annoyed me - I had an almost identical version working just fine. Why was my new one fubar'd?
In the above image I've added the two red lines; in the top-left corner the results are as desired, but everywhere else the quality is shocking. Notice the really obvious "banding" that's going horizontally across the image - it looks worse when you actually animate the scene [oh]
Turns out the only difference between my new code and my old code was that I was trying to be too clever with the simplest part: the post-processing vertex shader. My SDK sample just renders a screen-aligned quad using fixed-function processing and vertices stored in a sysmem array. I changed this to use a custom vertex shader that took projection-space inputs, making it resolution independent.
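For anyone curious, the resolution-independent idea boils down to a quad defined directly in projection (clip) space, so the same four vertices cover the viewport at any resolution. Here's a rough Python sketch of the vertex data - purely illustrative, not my actual D3D code:

```python
def fullscreen_quad():
    """Screen-aligned quad in projection (clip) space.

    x, y are clip-space positions in [-1, 1], so no transform is
    needed and the quad covers the viewport at any resolution;
    u, v are the matching texture coordinates in [0, 1].
    """
    return [
        (-1.0,  1.0, 0.0, 0.0),  # top-left
        ( 1.0,  1.0, 1.0, 0.0),  # top-right
        (-1.0, -1.0, 0.0, 1.0),  # bottom-left
        ( 1.0, -1.0, 1.0, 1.0),  # bottom-right
    ]
```

In the shader the position just passes straight through to the output, which is what makes it so tempting - and, as it turned out, so easy to get subtly wrong.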
I still don't quite know why it didn't work, but I think it was something to do with the dreaded mapping-texels-to-pixels thing. Just hidden in a slightly obscure way...
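For anyone who hasn't hit it before: in Direct3D 9 texel centres sit at half-integer coordinates, so a naive clip-space quad ends up sampling texel *corners* - bilinear filtering then blends neighbouring texels and smears the image. The usual fix is to nudge the texture coordinates by half a texel (or the quad by half a pixel). A minimal Python sketch of the correction (the function name is mine, purely illustrative):

```python
def half_pixel_offset_uv(u, v, width, height):
    # Direct3D 9 convention: texel centres sit at half-integer
    # coordinates, so shift the texture coordinate by half a texel
    # to sample texel centres instead of texel corners.
    return (u + 0.5 / width, v + 0.5 / height)

# Without the offset, UV (0, 0) on a 4x4 texture addresses the
# corner of texel (0, 0); with it, that texel's centre:
half_pixel_offset_uv(0.0, 0.0, 4, 4)  # (0.125, 0.125)
```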
The above is the correct final result. Notice that it's quite blocky/pixellated. The bloom effect is done on a 2x downsampled image for efficiency reasons...
I'll have to try it in the labs next week, but the GeForce 6600 I'm using should bilinearly filter the above image, and give me something like this:
The above is from my ATI codepath, which implements floating-point filtering in a pixel shader.
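Filtering in the shader is just the textbook bilinear weighting done by hand: four point (nearest) fetches blended by the fractional part of the coordinate. Sketched in Python below - illustrative only, the real thing is a pixel shader working on the floating-point render target:

```python
def bilinear_sample(texture, u, v):
    """Emulate bilinear filtering with four point fetches.

    This is the same trick a pixel shader uses when the hardware
    can't filter floating-point textures.  'texture' is a 2D list
    of floats; (u, v) are texel-space coordinates.
    """
    h, w = len(texture), len(texture[0])
    x0, y0 = int(u), int(v)
    fx, fy = u - x0, v - y0          # fractional weights
    x1 = min(x0 + 1, w - 1)          # clamp at the texture edge
    y1 = min(y0 + 1, h - 1)
    top = texture[y0][x0] * (1 - fx) + texture[y0][x1] * fx
    bot = texture[y1][x0] * (1 - fx) + texture[y1][x1] * fx
    return top * (1 - fy) + bot * fy

tex = [[0.0, 1.0],
       [0.0, 1.0]]
bilinear_sample(tex, 0.5, 0.0)  # 0.5 - halfway between the two texels
```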
So that's all I have to say/show for now.
I'm working on a Direct3D 10 mini-article at the moment. More on that next time.