HDR w/ two RGBA8 buffers?


I was just thinking about this and wondering: would it be doable to do HDR with two 32-bit buffers, one LDR and one HDR? I'm mainly wondering how to do tonemapping, which requires downsampling to get the average luminance, without using an FP buffer.

Yes, it is. I tried a couple of combinations. You can even use an MRT consisting of two 8:8:8:8 render targets and alpha blend objects into them ... if you can live with a few limitations.
The idea is to render bits 1-8 into render target one and bits 4-12 into render target two. You only get 12 bits, but you have alpha blending :-)
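The overlapping-range trick above can be sketched with plain integer math. This is just a CPU illustration of the idea, not shader code, and the exact bit assignment is an assumption (the post only says bits 1-8 and 4-12):

```python
def encode_12bit(value):
    """Split a 12-bit intensity into two 8-bit parts with a 4-bit overlap:
    low = bits 0-7 (render target one), high = bits 4-11 (render target two)."""
    assert 0 <= value < 4096
    low = value & 0xFF           # bits 0-7
    high = (value >> 4) & 0xFF   # bits 4-11
    return low, high

def decode_12bit(low, high):
    """Reconstruct from the high part plus the bottom 4 bits of the low part;
    the overlapping middle bits agree as long as nothing was blended."""
    return (high << 4) | (low & 0x0F)

# Round-trip check over a few values
for v in (0, 17, 255, 2049, 4095):
    lo, hi = encode_12bit(v)
    assert decode_12bit(lo, hi) == v
```

The overlap is what makes blending tolerable: after alpha blending, the two halves no longer agree exactly, but the shared 4 bits limit how far the reconstruction can drift.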

Quote:
Original post by Pragma
It's possible to store HDR data in 8:8:8:8 using the RGBE format: http://www.graphics.cornell.edu/online/formats/rgbe

But unlike wolf's suggestion, you won't be able to blend it or have the gpu downsample it for you.


Well, it won't be mathematically correct if you blend or filter it, but then again, what do we do in realtime graphics that isn't fudged? [smile]
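For reference, the RGBE encoding mentioned above can be sketched like this: the three color channels share one exponent byte taken from the largest component. This follows Ward's description linked in the quote; it's a CPU sketch, not a definitive implementation:

```python
import math

def float_to_rgbe(r, g, b):
    """Pack three HDR floats into four bytes: 8-bit mantissas plus a
    shared exponent biased by 128 (per the Radiance RGBE format)."""
    v = max(r, g, b)
    if v < 1e-32:
        return (0, 0, 0, 0)
    m, e = math.frexp(v)             # v = m * 2**e, with 0.5 <= m < 1
    scale = m * 256.0 / v
    return (int(r * scale), int(g * scale), int(b * scale), e + 128)

def rgbe_to_float(r, g, b, e):
    """Undo the packing; zero exponent means pure black."""
    if e == 0:
        return (0.0, 0.0, 0.0)
    f = math.ldexp(1.0, e - 128 - 8)  # 2**(e - 136)
    return (r * f, g * f, b * f)
```

As the post says, blending or filtering these bytes directly is wrong, because interpolating an exponent byte is not the same as interpolating the value it encodes.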

LogLuv is another nice choice if you want to get HDR into a 32bpp buffer. Heavenly Sword used it with hardware multisampling, and while it wasn't "correct", I didn't spot any noticeable artifacts.
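A LogLuv32 pack can be sketched similarly: 16 bits of log-encoded luminance plus 8 bits each for the CIE (u', v') chromaticity coordinates. The constants below follow Ward's LogLuv description; this is a CPU approximation for illustration, not the shader math Heavenly Sword shipped:

```python
import math

def encode_logluv(Y, u, v):
    """Pack luminance Y (must be > 0) and (u', v') chromaticity into
    four bytes: two for log2 luminance, one each for u and v."""
    Le = int(256.0 * (math.log2(Y) + 64.0))   # 16-bit log luminance
    Le = max(0, min(65535, Le))
    ue = max(0, min(255, int(410.0 * u)))
    ve = max(0, min(255, int(410.0 * v)))
    return (Le >> 8, Le & 0xFF, ue, ve)

def decode_logluv(hi, lo, ue, ve):
    """Recover an approximate (Y, u', v') triple from the four bytes."""
    Le = (hi << 8) | lo
    Y = 2.0 ** ((Le + 0.5) / 256.0 - 64.0)    # +0.5 centers the quantization
    return (Y, ue / 410.0, ve / 410.0)
```

Because luminance is stored logarithmically, hardware filtering of the packed bytes blends log values rather than linear ones, which is exactly the "not correct but not noticeable" behavior described above.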

Halo 3 uses the technique of rendering both an LDR and an HDR buffer.

You can read a paper describing their implementation details here.
http://www.microsoft.com/downloads/details.aspx?FamilyId=995B221D-6BBD-4731-AC82-D9524237D486&displaylang=en

Also, I believe the Source engine renders directly to an RGBA8 buffer. They do this by keeping a running histogram of luminance values through occlusion queries (I'm not sure exactly how this works; it seems cool though), and they do the tonemapping step in the pixel shader. Thus, there is no need to render to an FP buffer and then do a post-process tonemapping step. There are a few downsides to this approach, though, one of which is that your bloom has to occur after tonemapping...
Their Siggraph presentation is here:
http://www.valvesoftware.com/publications/2006/SIGGRAPH06_Course_HDRInValvesSourceEngine_Slides.pdf
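The occlusion-query histogram can be sketched as follows. In the slides' description, each frame measures one histogram bucket: a full-screen pass discards every pixel whose luminance falls outside that bucket's range, and the occlusion query reports how many pixels survived. The `occlusion_query` function below is a stand-in simulated on a plain Python list, not a real GPU API, and collapsing all buckets into one loop (instead of one bucket per frame) is a simplification:

```python
def occlusion_query(luminances, lo, hi):
    """Stand-in for the GPU occlusion query: count "pixels" whose
    luminance L satisfies lo <= L < hi."""
    return sum(1 for L in luminances if lo <= L < hi)

def build_histogram(luminances, num_buckets=16, max_lum=2.0):
    """One query per bucket; the real engine spreads these across
    frames so each frame only pays for a single full-screen pass."""
    step = max_lum / num_buckets
    return [occlusion_query(luminances, i * step, (i + 1) * step)
            for i in range(num_buckets)]
```

The resulting histogram is then used to pick an exposure for the in-shader tonemap, so no luminance data ever needs to live in an FP render target.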

Quote:
Original post by wolf
Yes, it is. I tried a couple of combinations. You can even use an MRT consisting of two 8:8:8:8 render targets and alpha blend objects into them ... if you can live with a few limitations.
The idea is to render bits 1-8 into render target one and bits 4-12 into render target two. You only get 12 bits, but you have alpha blending :-)


Woot, this is exactly what I was thinking of doing.

As for the other techniques: RGBE doesn't alpha blend, and the Halo guys use an FP buffer format that XNA doesn't have access to on the 360.

Quote:

Also, I believe the Source engine renders directly to an RGBA8 buffer. They do this by keeping a running histogram of luminance values through occlusion queries (I'm not sure exactly how this works; it seems cool though), and they do the tonemapping step in the pixel shader. Thus, there is no need to render to an FP buffer and then do a post-process tonemapping step. There are a few downsides to this approach, though, one of which is that your bloom has to occur after tonemapping...


Regarding Source's method, I was wondering if anyone knows how they might actually be doing bloom?

They tonemap their color and then write it into the RGBA8 texture, which clamps it between 0 and 1, even if the value after tonemapping happened to be, say, 5. So how do they even know to bloom in a certain spot?

The only way I can think of is that they use the alpha channel to flag that a pixel needs to be bloomed...

Quote:
Original post by coderchris

Regarding Source's method, I was wondering if anyone knows how they might actually be doing bloom?

They tonemap their color and then write it into the RGBA8 texture, which clamps it between 0 and 1, even if the value after tonemapping happened to be, say, 5. So how do they even know to bloom in a certain spot?

The only way I can think of is that they use the alpha channel to flag that a pixel needs to be bloomed...


Well, they could just use an LDR value as a threshold and bloom everything above that... I haven't played it in a while, so I don't remember whether any areas suggest they're doing otherwise.

For the Source engine, I think they probably just do a bloom extraction pass on the tonemapped image. That is, they probably recalculate the luminance of the post-HDR (tonemapped) pixels and then extract the pixels that exceed a certain threshold. This is certainly not completely accurate, but it's probably a good-enough fake. It would give passable results because bright areas get tonemapped to areas with high relative post-tonemap luminance compared to dark areas. You'd just have to set your threshold appropriately.
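A minimal sketch of that extraction pass, assuming Rec. 601 luma weights (the actual weights Source uses aren't stated here) and a soft bright-pass that keeps only the amount by which a pixel's luminance exceeds the threshold:

```python
def luminance(r, g, b):
    """Scalar luminance from an RGB triple (Rec. 601 weights)."""
    return 0.299 * r + 0.587 * g + 0.114 * b

def extract_bright(pixels, threshold=0.8):
    """Bright pass over tonemapped [0,1] pixels: scale each pixel by how
    far its luminance exceeds the threshold, zeroing everything else."""
    out = []
    for (r, g, b) in pixels:
        excess = max(0.0, luminance(r, g, b) - threshold)
        out.append((r * excess, g * excess, b * excess))
    return out
```

The output then gets blurred and added back over the frame, which is why the threshold setting matters: too low and midtones glow, too high and only near-white pixels bloom.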

Quote:
Original post by coderchris
Regarding Source's method, I was wondering if anyone knows how they might actually be doing bloom?

They tonemap their color and then write it into the RGBA8 texture, which clamps it between 0 and 1, even if the value after tonemapping happened to be, say, 5. So how do they even know to bloom in a certain spot?


Bloom has little to do with HDR, and it was in use well before we had usable HDR formats. You can take any LDR image (from a photograph, for example), go into Photoshop, and simulate an overbright light source, bloom, soft focus, and lens flare. Those are things best avoided in real-life photography because they scream cheap lens or bad photography (well, except for a few deliberate artistic effects). But these effects are used in the game industry to hide the fact that games render with a limited range of expression, and to push them beyond what low-dynamic-range displays can show. More and more people equating bloom with HDR rendering have been blurring the line (pun intended!).

But anyway.

So about Source: yes, it doesn't really matter; it's somewhere between a bloom and a soft focus effect. You take your LDR render target as a texture, apply some non-linear curve to boost the highlights, then blur the highlights and blend them on top of the normal render target. That's what they were doing with Lost Coast; I haven't checked whether they do things differently these days.
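The blur-and-blend half of that recipe can be sketched in one dimension. The box blur below is a deliberately tiny stand-in for the separable Gaussian passes a real engine would run, and the additive composite clamps back into displayable range:

```python
def box_blur(row, radius=1):
    """1-D box blur: each sample becomes the average of its neighborhood,
    with the window shrinking at the edges."""
    n = len(row)
    out = []
    for i in range(n):
        window = row[max(0, i - radius):min(n, i + radius + 1)]
        out.append(sum(window) / len(window))
    return out

def composite(base, bloom, strength=0.5):
    """Blend blurred highlights additively over the LDR image and clamp."""
    return [min(1.0, b + strength * h) for b, h in zip(base, bloom)]
```

Running the blur in two separable passes (horizontal then vertical) at reduced resolution is the usual way to make this cheap; the structure is the same either way.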

Some time ago there was Splinter Cell on Xbox (then PC), and they simply rendered the bright light sources into a separate texture and then blurred that on top of the screen. That was cool.



LeGreg

It is possible to encode bloom into the alpha channel if you're willing to write some tricky blend mode conversions and don't mind making some sacrifices. You won't be able to bloom any objects that need destination alpha.

