myers

Hardware antialiasing in floating point



Just a quick question: I gather that, because hardware FSAA can't be applied when rendering directly to a texture, the standard practice is first to render to some buffer that does permit AA (such as the back buffer), and then blit this to a texture. But I also understand that AA with floating point formats is unsupported on any current hardware (or at least on DX9-era hardware). So how is AA performed with HDR, or any other effect that requires FP textures? I assume there's a way other than just using software AA.

MSAA on FP16 render targets has been supported since the Radeon X1800 series and the GeForce 8 series, IIRC. Without hardware support for MSAA, you can always fall back to supersampling (render at a higher resolution and downsample).

But yes, you'll need to create an MSAA render target first using IDirect3DDevice9::CreateRenderTarget(). Since textures can't be multisampled in D3D9, and shaders can't sample from surfaces, you render the scene into the MSAA render target and then StretchRect() it to your texture so it can be sampled in your HDR shaders.
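The setup just described can be sketched like this (error handling omitted; `d3d`, `device`, `width` and `height` are assumed to exist already, and 4x is just an example sample count):

```cpp
IDirect3DSurface9* msaaRT  = NULL;
IDirect3DTexture9* hdrTex  = NULL;
IDirect3DSurface9* hdrSurf = NULL;

// 1. Check the card actually supports MSAA on an FP16 format first.
DWORD quality = 0;
d3d->CheckDeviceMultiSampleType(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
    D3DFMT_A16B16G16R16F, TRUE, D3DMULTISAMPLE_4_SAMPLES, &quality);

// 2. Multisampled render target surface (cannot be sampled as a texture).
device->CreateRenderTarget(width, height, D3DFMT_A16B16G16R16F,
    D3DMULTISAMPLE_4_SAMPLES, 0, FALSE, &msaaRT, NULL);

// 3. Plain (non-MSAA) FP16 texture to resolve into for the HDR shaders.
device->CreateTexture(width, height, 1, D3DUSAGE_RENDERTARGET,
    D3DFMT_A16B16G16R16F, D3DPOOL_DEFAULT, &hdrTex, NULL);
hdrTex->GetSurfaceLevel(0, &hdrSurf);

// Per frame: draw the scene into the MSAA surface...
device->SetRenderTarget(0, msaaRT);
// ...render scene here...
// ...then resolve the samples into the texture. For a multisample
// resolve the rects are NULL and the filter is D3DTEXF_NONE.
device->StretchRect(msaaRT, NULL, hdrSurf, NULL, D3DTEXF_NONE);
// hdrTex can now be bound with SetTexture() for tone mapping.
```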

Hmm, okay. Just so I've got this straight: in any HDR application prior to DX10 hardware, no hardware FSAA was performed - instead, all antialiasing was done via the supersampling method you describe?

The X1800 wasn't a DX10 class part. I remember that, several years ago, it was one of the features that prompted me to buy an X1800 over a GeForce 7 series part. Oblivion with HDR + AA was nice. [grin]

Back then, there weren't many games that actually used FP16 HDR. Oblivion was one, FarCry was another. There were a few others, but FP16 HDR was nowhere near as prevalent as it is now. Those games simply disabled AA when your video card couldn't do both. SSAA is far too expensive to be of any use, especially since HDR already hits your fillrate particularly hard.

The exception, of course, was Valve's HDR method (first used in their Lost Coast tech demo), which only required an SM2-class card and was still able to perform AA (presumably because they didn't use floating-point buffers). I don't remember the specifics, but I believe they performed some sort of tonemapping on a per-object basis rather than as a postprocessing effect.

Quote:
Original post by myers
Just a quick question: I gather that, because hardware FSAA can't be applied when rendering directly to a texture, the standard practice is first to render to some buffer that does permit AA (such as the back buffer), and then blit this to a texture. But I also understand that AA with floating point formats is unsupported on any current hardware (or at least on DX9-era hardware). So how is AA performed with HDR, or any other effect that requires FP textures? I assume there's a way other than just using software AA.


Another way to incorporate MSAA and HDR on DX9-class hardware is to store HDR colours in an alternative colour space that requires no more than 4 bytes per pixel. On the other hand, I'm not sure it's possible with DX9 to read such a render target back and convert it to RGB (it needs to be tone mapped, after all), but the PS3 and 360 can easily handle this technique.

Quote:
Original post by Marco Salvi
Another way to incorporate MSAA and HDR on DX9-class hardware is to store HDR colours in an alternative colour space that requires no more than 4 bytes per pixel. On the other hand, I'm not sure it's possible with DX9 to read such a render target back and convert it to RGB (it needs to be tone mapped, after all), but the PS3 and 360 can easily handle this technique.


Actually, I think we finally might have gotten a DX9/HLSL implementation of LogLuv to work for this purpose (over in this thread). Anything you'd care to add would be very much appreciated, especially if you have anything to share on filtering and/or MSAA pitfalls you warned about [smile]
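For reference, the packing idea being discussed can be sketched on the CPU in plain C++ (the shader versions follow the same math). This is a generic LogLuv-style encoding, not the exact code from that thread: CIE (u', v') chromaticity goes into two 8-bit channels and a 16-bit log2 luminance into the other two, so an HDR colour fits in an ordinary ARGB8 target. The [-20, 20] log2 range and sRGB/D65 matrix constants are illustrative choices.

```cpp
#include <cmath>
#include <cstdint>

struct Rgba8 { uint8_t r, g, b, a; };

static uint8_t toByte(float v) {
    v = v < 0.0f ? 0.0f : (v > 1.0f ? 1.0f : v);
    return (uint8_t)(v * 255.0f + 0.5f);
}

// Pack HDR RGB into 4 bytes: (u', v') chromaticity in r/g,
// 16-bit log2 luminance split across b (high) and a (low).
Rgba8 logLuvEncode(float R, float G, float B) {
    // Linear sRGB -> CIE XYZ (Rec.709 primaries, D65 white point).
    float X = 0.4124f*R + 0.3576f*G + 0.1805f*B;
    float Y = 0.2126f*R + 0.7152f*G + 0.0722f*B;
    float Z = 0.0193f*R + 0.1192f*G + 0.9505f*B;
    float d = X + 15.0f*Y + 3.0f*Z;
    if (d < 1e-10f) d = 1e-10f;
    float u = 4.0f*X / d;              // CIE 1976 u' (< 1 for real colours)
    float v = 9.0f*Y / d;              // CIE 1976 v'
    // Map log2(Y) over [-20, 20] onto a 16-bit integer.
    float le = (std::log2(Y < 1e-6f ? 1e-6f : Y) + 20.0f) / 40.0f;
    le = le < 0.0f ? 0.0f : (le > 1.0f ? 1.0f : le);
    uint32_t le16 = (uint32_t)(le * 65535.0f + 0.5f);
    Rgba8 p = { toByte(u), toByte(v), (uint8_t)(le16 >> 8), (uint8_t)(le16 & 0xFF) };
    return p;
}

void logLuvDecode(const Rgba8& p, float& R, float& G, float& B) {
    float u = p.r / 255.0f, v = p.g / 255.0f;
    uint32_t le16 = ((uint32_t)p.b << 8) | p.a;
    float Y = std::exp2((le16 / 65535.0f) * 40.0f - 20.0f);
    // Invert the (u', v') projection; v' > 0 for any physical colour.
    float X = Y * 9.0f*u / (4.0f*v);
    float Z = Y * (12.0f - 3.0f*u - 20.0f*v) / (4.0f*v);
    // XYZ -> linear sRGB.
    R =  3.2406f*X - 1.5372f*Y - 0.4986f*Z;
    G = -0.9689f*X + 1.8758f*Y + 0.0415f*Z;
    B =  0.0557f*X - 0.2040f*Y + 1.0570f*Z;
}
```

Luminance survives nearly losslessly thanks to the 16-bit log encoding; the 8-bit chromaticity quantization introduces a small colour error, which is why filtering/blending such a target needs care.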

Ok, thanks folks. One other thing: does MSAA work with FBOs, or am I going to have to use (groan) pbuffers?


Sorry, OpenGL isn't my cup of tea, so I wouldn't know. The original LogLuv stuff to encode floating point in ARGB8 textures was written in Cg though, so that may be a viable alternative for OpenGL as well.
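For what it's worth, the same render-then-resolve pattern exists on the OpenGL side without pbuffers, via the EXT framebuffer extensions of the era. A sketch, assuming a context where EXT_framebuffer_object, EXT_framebuffer_blit, EXT_framebuffer_multisample and ARB_texture_float are all available (`width`/`height` assumed defined):

```cpp
GLuint msaaFbo, msaaRb, resolveFbo, hdrTex;

// Multisampled FP16 renderbuffer attached to one FBO (can't be sampled)...
glGenFramebuffersEXT(1, &msaaFbo);
glGenRenderbuffersEXT(1, &msaaRb);
glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, msaaRb);
glRenderbufferStorageMultisampleEXT(GL_RENDERBUFFER_EXT, 4,
    GL_RGBA16F_ARB, width, height);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, msaaFbo);
glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
    GL_RENDERBUFFER_EXT, msaaRb);

// ...and a plain FP16 texture on a second FBO to resolve into.
glGenTextures(1, &hdrTex);
glBindTexture(GL_TEXTURE_2D, hdrTex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA16F_ARB, width, height, 0,
    GL_RGBA, GL_FLOAT, NULL);
glGenFramebuffersEXT(1, &resolveFbo);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, resolveFbo);
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
    GL_TEXTURE_2D, hdrTex, 0);

// Per frame: render into msaaFbo, then resolve the samples with a blit
// (the GL equivalent of D3D9's StretchRect resolve).
glBindFramebufferEXT(GL_READ_FRAMEBUFFER_EXT, msaaFbo);
glBindFramebufferEXT(GL_DRAW_FRAMEBUFFER_EXT, resolveFbo);
glBlitFramebufferEXT(0, 0, width, height, 0, 0, width, height,
    GL_COLOR_BUFFER_BIT, GL_NEAREST);
```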
