Hardware antialiasing in floating point

13 comments, last by Marco Salvi 15 years, 9 months ago
Just a quick question: I gather that, because hardware FSAA can't be applied when rendering directly to a texture, the standard practice is first to render to some buffer that does permit AA (such as the back buffer), and then blit this to a texture. But I also understand that AA with floating point formats is unsupported on any current hardware (or at least on DX9-era hardware). So how is AA performed with HDR, or any other effect that requires FP textures? I assume there's a way other than just using software AA.
MSAA on FP16 render targets has been supported since the Radeon X1800 series, and the Geforce 8 series IIRC. Without hardware support for MSAA, you can always try supersampling (render at a higher resolution and downsample).
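As a rough illustrative sketch (mine, not from the posters above), checking whether the hardware can actually do MSAA on an FP16 surface before choosing between the two paths might look like this in D3D9; the IDirect3D9 pointer d3d is assumed to exist already:

// Capability check: can this adapter do 4x MSAA on an FP16
// (D3DFMT_A16B16G16R16F) render target? If not, fall back to supersampling.
DWORD qualityLevels = 0;
HRESULT hr = d3d->CheckDeviceMultiSampleType(
    D3DADAPTER_DEFAULT,
    D3DDEVTYPE_HAL,
    D3DFMT_A16B16G16R16F,      // FP16 render target format
    TRUE,                      // windowed mode
    D3DMULTISAMPLE_4_SAMPLES,
    &qualityLevels);

bool fp16MsaaSupported = SUCCEEDED(hr);
// If this fails, render the scene to an oversized FP16 target
// (e.g. twice the width and height) and downsample it in a shader instead.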

But yes, you'll need to create an MSAA render target first using IDirect3DDevice9::CreateRenderTarget(). Since textures don't support MSAA and shaders can't sample from surfaces, you need to render to the MSAA render target and then StretchRect() the result into your texture so it can be sampled by your HDR shaders.
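A minimal sketch of that sequence, with placeholder names (device, width, height) and no error handling:

IDirect3DSurface9* msaaRT  = NULL;
IDirect3DTexture9* hdrTex  = NULL;
IDirect3DSurface9* hdrSurf = NULL;

// Multisampled FP16 render target: surfaces can be multisampled, textures can't.
device->CreateRenderTarget(width, height, D3DFMT_A16B16G16R16F,
                           D3DMULTISAMPLE_4_SAMPLES, 0, FALSE,
                           &msaaRT, NULL);

// Plain (non-MSAA) texture that the HDR shaders will actually sample from.
device->CreateTexture(width, height, 1, D3DUSAGE_RENDERTARGET,
                      D3DFMT_A16B16G16R16F, D3DPOOL_DEFAULT,
                      &hdrTex, NULL);
hdrTex->GetSurfaceLevel(0, &hdrSurf);

// Each frame: render the scene into the MSAA surface...
device->SetRenderTarget(0, msaaRT);
// ... draw calls ...

// ...then resolve it into the texture so it can be bound for tone mapping.
device->StretchRect(msaaRT, NULL, hdrSurf, NULL, D3DTEXF_NONE);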
NextWar: The Quest for Earth available now for Windows Phone 7.
Hmm, okay. Just so I've got this straight: in any HDR application prior to DX10 hardware, no hardware FSAA was performed - instead, all antialiasing was done via the supersampling method you describe?
The X1800 wasn't a DX10 class part. I remember several years ago, it was one of the features which prompted me to purchase an X1800 over a Geforce 7 series part at the time. Oblivion with HDR + AA was nice. [grin]

Back then, there weren't very many games that actually used FP16 HDR. Oblivion was one, FarCry was another. There were a few others, but they were nowhere near as prevalent as they are now. These games just disabled AA when your video card couldn't do it. SSAA is simply far too expensive to be of any use, especially since HDR already hits your fillrate particularly hard.

The exception, of course, was Valve's HDR method (first used in their Lost Coast tech demo), which only required an SM2-class card and could still perform AA (presumably because they didn't use floating-point buffers). I don't remember the specifics, but I believe they performed some sort of tonemapping on a per-object basis rather than as a postprocessing effect.
NextWar: The Quest for Earth available now for Windows Phone 7.

Coincidentally, I came across this post about it just yesterday. Pretty much explains the basic Valve technique and should contain some interesting links.
Rim van Wersch [ MDXInfo ] [ XNAInfo ] [ YouTube ] - Do yourself a favor and bookmark this excellent free online D3D/shader book!
Quote:Original post by myers
Just a quick question: I gather that, because hardware FSAA can't be applied when rendering directly to a texture, the standard practice is first to render to some buffer that does permit AA (such as the back buffer), and then blit this to a texture. But I also understand that AA with floating point formats is unsupported on any current hardware (or at least on DX9-era hardware). So how is AA performed with HDR, or any other effect that requires FP textures? I assume there's a way other than just using software AA.


Another way to incorporate MSAA and HDR on DX9-class hardware is to store HDR colours in an alternative colour space that doesn't require more than 4 bytes per pixel. On the other hand, I'm not sure it's possible with DX9 to read such a render target back and convert it to RGB (it needs to be tone mapped, after all..), but the PS3 and 360 can easily handle this technique.
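For illustration only, here is a rough CPU-side sketch of the general idea; it is not Marco Salvi's exact format, just one hypothetical way to fit an HDR colour into four 8-bit channels by storing normalised chromaticity plus a 16-bit log-encoded luminance:

#include <algorithm>
#include <cmath>

struct RGBA8 { unsigned char r, g, b, a; };

RGBA8 EncodeHdrColour(float r, float g, float b)
{
    // Normalised chromaticity, already in [0,1].
    float sum = std::max(r + g + b, 1e-6f);
    float cr  = r / sum;
    float cg  = g / sum;

    // Rec.709 luminance, log-encoded so a huge dynamic range fits in [0,1],
    // then quantised to 16 bits split across two channels to limit banding.
    float Y    = 0.2126f * r + 0.7152f * g + 0.0722f * b;
    float logY = (std::log(std::max(Y, 1e-6f)) / std::log(2.0f) + 16.0f) / 32.0f;
    logY       = std::min(std::max(logY, 0.0f), 1.0f);
    unsigned short q = (unsigned short)(logY * 65535.0f + 0.5f);

    RGBA8 out;
    out.r = (unsigned char)(cr * 255.0f + 0.5f);
    out.g = (unsigned char)(cg * 255.0f + 0.5f);
    out.b = (unsigned char)(q >> 8);     // luminance, high byte
    out.a = (unsigned char)(q & 0xFF);   // luminance, low byte
    return out;
}

The tone-mapping pass would reverse this (re-exponentiate the luminance and rescale the chromaticity back to RGB). The usual caveat is that the stored values are non-linear, so a hardware MSAA resolve or bilinear filter averages encoded values rather than radiance, which is presumably the sort of filtering/MSAA pitfall mentioned further down.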
Quote:Original post by Marco Salvi
Another way to incorporate MSAA and HDR on DX9-class hardware is to store HDR colours in an alternative colour space that doesn't require more than 4 bytes per pixel. On the other hand, I'm not sure it's possible with DX9 to read such a render target back and convert it to RGB (it needs to be tone mapped, after all..), but the PS3 and 360 can easily handle this technique.


Actually, I think we finally might have gotten a DX9/HLSL implementation of LogLuv to work for this purpose (over in this thread). Anything you'd care to add would be very much appreciated, especially if you have anything to share on filtering and/or MSAA pitfalls you warned about [smile]
Rim van Wersch [ MDXInfo ] [ XNAInfo ] [ YouTube ] - Do yourself a favor and bookmark this excellent free online D3D/shader book!
Ok, thanks folks. One other thing: does MSAA work with FBOs, or am I going to have to use (groan) pbuffers?
Uh, that's OpenGL specific, obviously.

Sorry, OpenGL isn't my cup of tea, so I wouldn't know. The original LogLuv stuff to encode floating point in ARGB8 textures was written in Cg though, so that may be a viable alternative for OpenGL as well.
Rim van Wersch [ MDXInfo ] [ XNAInfo ] [ YouTube ] - Do yourself a favor and bookmark this excellent free online D3D/shader book!

