Bit packing for HDR textures

Posted by Chris_F


I'm currently doing some work in UDK and I want to be able to use HDR textures. The problem is that UDK doesn't support anything beyond RGBA8888, so 16-bit or float textures are out of the question. I've tried formats like RGBE, RGBM and LogLuv. LogLuv has probably given me the best results, but they all suffer from the same problem: they can't be linearly sampled without producing banding artifacts. It's easy to see why. In the case of LogLuv, the channel that holds the LSB portion of the luminance is full of harsh transitions.
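To make the failure mode concrete, here is a minimal sketch in plain C (not UDK shader code, and the real LogLuv math differs, but the luminance split works the same way): a 16-bit luminance code is divided into a high byte and a low byte, and the low byte wraps from 255 back to 0 every time the high byte ticks up, producing the sawtooth pattern shown in the example image below.

    /* Sketch only: split a 16-bit luminance code across two 8-bit channels. */
    static void split_luma16(float luma01,        /* luminance in [0, 1]        */
                             unsigned char *msb,  /* high byte: varies smoothly */
                             unsigned char *lsb)  /* low byte: sawtooth pattern */
    {
        unsigned int code = (unsigned int)(luma01 * 65535.0f + 0.5f);
        *msb = (unsigned char)(code >> 8);
        *lsb = (unsigned char)(code & 0xFFu);     /* wraps 0..255 over and over */
    }

    static float merge_luma16(float msb, float lsb)  /* channel values as sampled, 0..255 */
    {
        /* The hardware filters msb and lsb independently. Near a wrap the lsb
         * channel jumps by ~255 while msb only moves by 1, so the recombined
         * value can be off by almost 1/256 of full range -- visible as bands. */
        return (msb * 256.0f + lsb) / 65535.0f;
    }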

 

Example (the LSB channel of the encoded luminance): http://s20.postimage.org/5789pshwt/lsb.png

My question is whether there is some way of encoding a luminance value with more than 8 bits of precision into two 8-bit channels such that neither channel contains these kinds of harsh transitions, so the texture can be linearly sampled without banding.


The common solution is to use two DXT textures, at least if you're going for compression. You should have a look at this, or this.
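For illustration, here is one way a two-texture split can behave well under filtering. This is only a sketch with assumed names and an assumed HDR_RANGE constant, not necessarily the scheme from the linked articles: one texture holds the color divided by a per-pixel multiplier, the other holds the multiplier itself, so both signals vary smoothly and survive block compression and bilinear sampling.

    #include <math.h>

    #define HDR_RANGE 64.0f   /* assumed maximum scene luminance */

    /* Texture A gets rgb / (M * HDR_RANGE); texture B gets the multiplier M.
     * Both are smooth functions of the input, unlike an MSB/LSB split. */
    void encode_two_tex(const float rgb[3], unsigned char texA[3], unsigned char *texB)
    {
        float maxc = fmaxf(rgb[0], fmaxf(rgb[1], rgb[2]));
        float m = fminf(maxc / HDR_RANGE, 1.0f);
        m = ceilf(m * 255.0f) / 255.0f;      /* quantize M up so rgb / M stays <= 1 */
        if (m <= 0.0f) m = 1.0f / 255.0f;    /* avoid divide-by-zero for black      */
        for (int i = 0; i < 3; ++i)
            texA[i] = (unsigned char)(fminf(rgb[i] / (m * HDR_RANGE), 1.0f) * 255.0f + 0.5f);
        *texB = (unsigned char)(m * 255.0f + 0.5f);
    }

    void decode_two_tex(const unsigned char texA[3], unsigned char texB, float rgb[3])
    {
        float m = texB / 255.0f;
        for (int i = 0; i < 3; ++i)
            rgb[i] = (texA[i] / 255.0f) * m * HDR_RANGE;
    }

Filtering the two textures separately and multiplying afterwards is not identical to filtering the original HDR values, but since both signals are continuous the error shows up as slight softening rather than the hard bands of an MSB/LSB split.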

Edited by MJP

Unfortunately, UDK does not allow you to import DDS textures, which means it's impossible to use a custom DXT5 compressor the way they do. However, I could probably store it all in a single RGBA8 texture if I leave out W and reconstruct it in the shader. Hopefully that would yield good results.

Edited by Chris_F