HDR Without Floating Point Textures!
It took me 2 weeks to study GLSL and get all the render-to-texture stuff working, and all that just to find out that my poor video card does not support floating point textures!
Well, I didn't have the money to buy a new one, but I didn't want to give up either, so this is what I came up with!
Why not use the alpha channel of ordinary textures and keep a special value there, which will indicate how much brighter the pixel is than 255?
Then, in the fragment shader, we could multiply the fragment's color by the value stored in the alpha channel and check whether any of the resulting color components is bigger than 255. If so, we put that fragment in the HDR partition (which will later be blurred to create the blooms); otherwise, we put it in the LDR partition.
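Here's the idea as a CPU-side sketch (the fragment shader would do the same multiply-and-compare per fragment). The alpha-to-multiplier mapping and the `MAX_OVERBRIGHT` range are assumptions; any monotone encoding would work:

```c
#include <stdio.h>

#define MAX_OVERBRIGHT 8.0f  /* assumed largest multiplier encodable in alpha */

/* Decode the 8-bit alpha value into a brightness multiplier in
   [1, MAX_OVERBRIGHT]. Alpha = 0 means "not overbright" (x1). */
static float alpha_to_multiplier(unsigned char a)
{
    return 1.0f + (MAX_OVERBRIGHT - 1.0f) * (a / 255.0f);
}

/* Scale the texel's RGB by the decoded multiplier and classify it.
   Returns 1 if any scaled component exceeds 255 (HDR partition),
   0 otherwise (LDR partition). */
static int classify_texel(unsigned char r, unsigned char g, unsigned char b,
                          unsigned char a, float out[3])
{
    float m = alpha_to_multiplier(a);
    out[0] = r * m;
    out[1] = g * m;
    out[2] = b * m;
    return out[0] > 255.0f || out[1] > 255.0f || out[2] > 255.0f;
}
```

A texel of (200, 100, 50) with full alpha scales to (1600, 800, 400) and lands in the HDR partition; the same texel with alpha 0 stays unchanged in the LDR one.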
What do you think about this idea?
Sounds good, if you don't mind trading off alpha values in your textures for HDR values.
Another way might be to use an HDR map, where the overbrightness values are stored as a single channel in an independent texture, just like per-pixel normals are stored in a normal map. You'd have to bind another texture, so there's more overhead, but it's more flexible.
mrbig's proposal is something like RGBE, where E stands for exponent. See http://www.graphics.cornell.edu/~bjw/rgbe.html and Google for more details.
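For reference, here's a sketch of RGBE encoding in the style of the Cornell page linked above: the largest RGB component determines a shared power-of-two exponent stored in the fourth byte. Function names here are illustrative:

```c
#include <math.h>

/* Encode an HDR RGB triple into four 8-bit values (RGBE).
   The E byte holds the shared exponent, biased by 128. */
static void rgb_to_rgbe(float r, float g, float b, unsigned char rgbe[4])
{
    float v = r;
    if (g > v) v = g;
    if (b > v) v = b;
    if (v < 1e-32f) {
        rgbe[0] = rgbe[1] = rgbe[2] = rgbe[3] = 0;
    } else {
        int e;
        /* frexpf returns the mantissa in [0.5, 1) and the exponent e */
        float scale = frexpf(v, &e) * 256.0f / v;
        rgbe[0] = (unsigned char)(r * scale);
        rgbe[1] = (unsigned char)(g * scale);
        rgbe[2] = (unsigned char)(b * scale);
        rgbe[3] = (unsigned char)(e + 128);
    }
}

/* Decode RGBE back to floating point RGB. */
static void rgbe_to_rgb(const unsigned char rgbe[4],
                        float *r, float *g, float *b)
{
    if (rgbe[3] == 0) {
        *r = *g = *b = 0.0f;
        return;
    }
    float f = ldexpf(1.0f, (int)rgbe[3] - (128 + 8));
    *r = rgbe[0] * f;
    *g = rgbe[1] * f;
    *b = rgbe[2] * f;
}
```

A round trip through this encoding keeps each component within one quantization step of the mantissa, which is plenty for overbrightness data.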
If my idea isn't new, why isn't it used anywhere?
It shouldn't be even a bit slower than floating point textures...
Um, it is being used; in fact, it has been in use since at least October 2004 by Microsoft DirectX. Encoding data in textures is nothing new.
I've searched all over the net and found nothing...
Oh well. O_O
It doesn't really matter as long as it works. :)
Personally, I'm waiting until I have a card that supports floating point textures before I do HDR, but you can do it without one (it's just messier).
There's an NV3x example on the NVIDIA developer website.
Quote:Original post by Jack Sotac
Tron 2.0 from Monolith uses almost this exact method. Real-Time Glow
Er... no. RGBE is HDR; what they did is LDR glow, with the glow factor stored in alpha. HDR is one thing, bloom is another, and TRON 2.0's glow is something else entirely. HDR can be done without bloom, and mrbig wants to do HDR. 'Nuff said.