I'm currently doing some work in UDK and I want to be able to use HDR textures. The problem is that UDK doesn't support anything beyond RGBA8888, so 16-bit or float textures are out of the question. I've tried formats like RGBE, RGBM and LogLuv. LogLuv has probably given me the best results, but they all suffer from the same problem: they can't be linearly sampled without producing banding artifacts. It's easy to see why: in the case of LogLuv, the channel that holds the LSB portion of the luminance is full of harsh transitions.
Example:
http://s20.postimage.org/5789pshwt/lsb.png
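To make the problem concrete, here's a quick standalone sketch (plain Python rather than UDK/HLSL, with made-up illustrative values) showing both the sawtooth in the LSB channel and the kind of error you get when channels that decode multiplicatively, as in RGBM, are filtered independently:

```python
# Part 1: the LSB sawtooth. Split a smooth 16-bit luminance ramp into
# (MSB, LSB) bytes and look at the LSB channel near an MSB boundary.
def encode(l16):
    """Split a 16-bit value into (MSB, LSB) bytes."""
    return l16 >> 8, l16 & 0xFF

ramp = range(0x01FC, 0x0204)             # smooth ramp crossing 0x0200
print([encode(v)[1] for v in ramp])      # LSB: [252, 253, 254, 255, 0, 1, 2, 3]
# The LSB channel jumps 255 -> 0 even though the luminance barely changed,
# so any per-channel filtering or compression at that texel boundary
# is working with wildly different neighbouring values.

# Part 2: why per-channel filtering breaks a multiplicative (RGBM-style)
# decode: filtering each channel and then decoding is not the same as
# decoding and then filtering.
def lerp(a, b, t):
    return a + (b - a) * t

A = (0.1, 1.0)   # (color, multiplier) -> decodes to 0.1
B = (1.0, 0.1)   # different encoding, also decodes to 0.1
t = 0.5
filtered = lerp(A[0], B[0], t) * lerp(A[1], B[1], t)   # what the GPU gives you
true = lerp(A[0] * A[1], B[0] * B[1], t)               # what you actually want
print(filtered, true)    # ~0.3025 vs 0.1 -- a 3x overshoot between two
                         # texels that represent the *same* luminance
```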
My question is whether there is some way of encoding greater-than-8-bit luminance precision into two 8-bit channels such that neither channel contains these kinds of harsh transitions, so the texture can be linearly sampled without banding.