Packing a heightmap into a texture

8 comments, last by jamesw 18 years ago
My heightmap is 16 bit, and I want to be able to use it as a texture (for various applications that need a per-pixel height) without having to use a special format (such as D3DFMT_L16) that some hardware may not support. My question is, how would I go about packing this into a common D3DFMT_R8G8B8 format and then how would I get the 16 bit values back in a shader?
Any recent Radeon (9500+) or GeForce (5200+) supports D3DFMT_L16 (I had to look in the Caps Matrix of Hell [bawling]). Even the good 'old onboard Intel 945g supports it. So if that is within your range of hardware support, you may want to consider just using it.

Beyond that, I'm not quite sure, but I will think about it.
Dustin Franklin ( circlesoft :: KBase :: Mystic GD :: ApolloNL )
Looking at the format specification:
D3DFMT_R8G8B8 24-bit RGB pixel format with 8 bits per channel.

Each pixel has more than enough space to store the 16-bit height. Just encode it in two of the three available channels: store the low 8 bits in B and the high 8 bits in G.

In the shader, just read the texel's color and recombine the two channels to get the original height back (height = b + 256 * g).

The only "issue" here is that you would be wasting 8 bits, but that isn't really that much.
Quote:I had to look in the Caps Matrix of Hell


Yeah, that thing is a beast, but it has been extremely useful to me since I don't have a bunch of old systems lying around to test things on.

Quote:( height = b + 256 * g )


Ahh so simple. Thanks guys.
Quote:Original post by jamesw
Quote:I had to look in the Caps Matrix of Hell

Yeah, that thing is a beast, but it has been extremely useful to me since I don't have a bunch of old systems lying around to test things on.

Normally I use the D3D Caps Database. However, it seems to be broken at the moment.
Dustin Franklin ( circlesoft :: KBase :: Mystic GD :: ApolloNL )
Since a heightmap is something you interpret in software to build vertices, why would it need to be compatible with the device? Why would the device even need to know about it? Just load it into the scratch pool and don't worry about it.
SlimDX | Ventspace Blog | Twitter | Diverse teams make better games. I am currently hiring capable C++ engine developers in Baltimore, MD.
Quote:Original post by Promit
Since a heightmap would be something you interpret in software into vertices, why would it need to be compatible with the device? Why would the device even need to know? Just load it into the scratch pool and don't worry about it.

For parallax mapping, you need a heightmap which is sampled in the pixel shader. Perhaps that is the application here.
Dustin Franklin ( circlesoft :: KBase :: Mystic GD :: ApolloNL )
I'm trying to fade out the distortion in my water near the shorelines, to get rid of the glitch where the distorted reflection shows pixels from under the terrain. This glitch shows up a lot in Far Cry and plenty of other games (not HL2, though). Previous discussion is in this thread. The water shader needs a per-pixel depth input to know where to blend out the distortion, and that's what the heightmap is for. A more accurate way would be to render the scene to a depth buffer and then find the distance between the water plane and the corresponding depth, but who wants to spend an extra pass on that? It's actually second on my list of things to try, so I won't get around to it right away.
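The blend-out step could look something like this sketch (names like waterLevel, terrainHeight, and fadeRange are illustrative, not from the thread; it just shows the depth-based scale factor you would multiply the distortion offset by):

```cpp
#include <algorithm>

// Hypothetical sketch: scale the reflection distortion by water depth so it
// fades to zero at the shoreline. waterLevel and the heightmap-derived
// terrainHeight are in the same units; fadeRange is the depth over which
// the distortion ramps back in.
float distortionScale(float terrainHeight, float waterLevel, float fadeRange) {
    float depth = waterLevel - terrainHeight;  // water depth at this pixel
    float t = depth / fadeRange;               // 0 at shoreline, 1 at fadeRange
    return std::clamp(t, 0.0f, 1.0f);          // saturate, like HLSL's saturate()
}
```

In the pixel shader this would be the same math on the sampled height, with the result multiplying the reflection-texture offset.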
I think you might get texture filtering issues. If the filter is set to anything other than Point, the values will get interpolated between the pixels in the texture.
Normally this wouldn't be a problem (and if I remember correctly, filtering might not even be supported on floating-point surface types), but it becomes an issue if you split the height value across two channels. Each channel gets filtered separately, which might give you some artifacts.

Were you planning on using a linear filter to interpolate between each of your height values?
Sirob Yes.» - status: Work-O-Rama.
Actually I hadn't even thought about that, thanks. I definitely need to interpolate between height values. So I should either use D3DFMT_L16 with linear filtering, or D3DFMT_R8G8B8 with point filtering and do the filtering in the shader? D3DFMT_L16 is looking better and better. It's a PS 2.0 shader anyway, so as circlesoft pointed out, all DX9 cards should be able to hack it.
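For reference, the "point filtering plus filter in the shader" option amounts to reconstructing the full 16-bit height from each of the four nearest texels first and only then interpolating, which sidesteps the per-channel problem. A CPU-side sketch (a PS 2.0 shader would do the same with four point-filtered tex2D samples and two lerps):

```cpp
#include <cstdint>

// Rebuild the full height from unfiltered (point-sampled) channel bytes.
float reconstruct(uint8_t g, uint8_t b) {
    return static_cast<float>(b + 256 * g);
}

// Bilinearly interpolate four reconstructed heights. Because whole heights
// are interpolated, adjacent values like 255 and 256 blend smoothly.
float bilerpHeights(float h00, float h10, float h01, float h11,
                    float tx, float ty) {
    float top    = h00 + (h10 - h00) * tx;
    float bottom = h01 + (h11 - h01) * tx;
    return top + (bottom - top) * ty;
}
```

Halfway between heights 255 (g=0, b=255) and 256 (g=1, b=0) this gives 255.5, as expected, instead of the large error produced by filtering the two channels independently.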

