Direct3D : alpha blending in 16bpp

Started by Crousto
Hello, I'm working on a 3D engine to render a landscape. I'm using generic textures for the terrain, and on some tiles I'd like to draw a road texture over the normal grass (or snow, desert, ...) texture. So I've saved my road texture in a .DDS file (using the tool included with the DX7 SDK) in DXT1 format (1 alpha bit, 15 RGB bits). The file looks fine in the DX7 "compress" sample, but I can't get the road to render correctly in my engine; it seems like D3D ignores the alpha channel, even though ALPHABLENDENABLE is set to TRUE and SRCBLEND to SRCALPHA. The file is loaded with D3DXCreateTextureFromFile(), and the video mode's color depth is 16 bits (initialized by D3DXCreateContextEx). I've tried changing the numAlphaBits parameter of that function, but it doesn't help. What could I be doing wrong? Thanks in advance for any help!
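
For reference, here's roughly how I set up the render states (a sketch typed from memory; g_pDevice is my IDirect3DDevice7 pointer). I wasn't sure whether DESTBLEND and the texture-stage alpha states are strictly required, so I've included them too:

    // Enable blending: final = src * SRCALPHA + dest * INVSRCALPHA.
    g_pDevice->SetRenderState(D3DRENDERSTATE_ALPHABLENDENABLE, TRUE);
    g_pDevice->SetRenderState(D3DRENDERSTATE_SRCBLEND,  D3DBLEND_SRCALPHA);
    g_pDevice->SetRenderState(D3DRENDERSTATE_DESTBLEND, D3DBLEND_INVSRCALPHA);

    // Take alpha from the texture rather than from the vertex color.
    g_pDevice->SetTextureStageState(0, D3DTSS_ALPHAOP,   D3DTOP_SELECTARG1);
    g_pDevice->SetTextureStageState(0, D3DTSS_ALPHAARG1, D3DTA_TEXTURE);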
The DXT format means you are using texture compression. Unless your card supports it, I don't think it will work correctly. It could be that the compress sample decompresses textures when it detects that your system doesn't support them. Just don't select any of the compressed formats and the file will be saved as a regular uncompressed image, which is probably what you want for now.
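
You can find out for sure what the card supports by enumerating the texture formats. Something like this should work (a rough sketch, untested, with pDevice standing in for your IDirect3DDevice7):

    // Callback for EnumTextureFormats: flags the context BOOL if the
    // driver reports a FourCC format of 'DXT1'.
    static HRESULT CALLBACK CheckDXT1(LPDDPIXELFORMAT pddpf, LPVOID pContext)
    {
        if ((pddpf->dwFlags & DDPF_FOURCC) &&
            pddpf->dwFourCC == MAKEFOURCC('D', 'X', 'T', '1'))
        {
            *(BOOL*)pContext = TRUE;
            return D3DENUMRET_CANCEL;  // found it, stop enumerating
        }
        return D3DENUMRET_OK;          // keep looking
    }

    BOOL bHasDXT1 = FALSE;
    pDevice->EnumTextureFormats(CheckDXT1, &bHasDXT1);
    // If bHasDXT1 is still FALSE here, the card can't texture from DXT1.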

Oh, also: if you're only using one alpha bit, wouldn't it make more sense to just use color keying?
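
With one bit of alpha you only get on/off transparency anyway, and a DirectDraw color key gives you the same effect on any card. Roughly like this (a sketch; pTexSurf would be the IDirectDrawSurface7 behind your texture, and the key value has to be expressed in the surface's own pixel format):

    // Mark one color (here 0, i.e. black) as transparent on the texture.
    DDCOLORKEY ddck;
    ddck.dwColorSpaceLowValue  = 0;
    ddck.dwColorSpaceHighValue = 0;
    pTexSurf->SetColorKey(DDCKEY_SRCBLT, &ddck);

    // Tell D3D to honor the source color key when texturing.
    pDevice->SetRenderState(D3DRENDERSTATE_COLORKEYENABLE, TRUE);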

Edited by - Shinkage on 1/3/00 9:49:32 AM

