
Direct3D : alpha blending in 16bpp


Hello, I'm working on a 3D engine that renders a landscape. I'm using generic textures for the terrain, and for some tiles I'd like to draw a road texture over the normal grass (or snow, desert, ...) texture. So I've saved my road texture in a .DDS file (using the tool included with the DX7 SDK) in DXT1 format (1 alpha bit, 15 RGB bits). The file looks fine when I test it with the DX7 "compress" sample.

But I can't get the road to render correctly in my engine: it seems like D3D ignores the alpha channel, even though D3DRENDERSTATE_ALPHABLENDENABLE is set to TRUE and the source blend is set to D3DBLEND_SRCALPHA. The file is loaded with D3DXCreateTextureFromFile(), and the video mode color depth is 16 bits (initialized with D3DXCreateContextEx). I've tried changing the numAlphaBits parameter of that function, but it doesn't improve anything. What could I be doing wrong? Thanks in advance for any help!
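For reference, here is roughly what that blend setup looks like under DirectX 7. The helper function name and device pointer are placeholders, and the D3DBLEND_INVSRCALPHA destination factor is an assumption (the post only mentions the source blend):

    #include <d3d.h>

    // Sketch (DirectX 7) of the blend setup described above.
    // pDevice is assumed to be a valid IDirect3DDevice7*, and the road
    // texture is assumed to already be set on stage 0.
    void EnableRoadAlphaBlend(LPDIRECT3DDEVICE7 pDevice)
    {
        // Take the alpha value from the texture bound to stage 0.
        pDevice->SetTextureStageState(0, D3DTSS_ALPHAOP,   D3DTOP_SELECTARG1);
        pDevice->SetTextureStageState(0, D3DTSS_ALPHAARG1, D3DTA_TEXTURE);

        // Frame-buffer blending: src * srcAlpha + dest * (1 - srcAlpha).
        // The destination factor is not stated in the post; INVSRCALPHA is
        // the usual companion to SRCALPHA.
        pDevice->SetRenderState(D3DRENDERSTATE_ALPHABLENDENABLE, TRUE);
        pDevice->SetRenderState(D3DRENDERSTATE_SRCBLEND,  D3DBLEND_SRCALPHA);
        pDevice->SetRenderState(D3DRENDERSTATE_DESTBLEND, D3DBLEND_INVSRCALPHA);
    }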

The DXT format means you are using texture compression. Unless you have a card that supports it, I don't think it will work right. It could be that the compress sample uncompresses the texture when it sees that your system doesn't support it. Just don't select any of those compressed formats and it will be saved as a regular uncompressed image, which is probably what you want at the moment.
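A quick way to check that is to ask the device which texture formats it exposes. A rough sketch under DirectX 7 (the callback and helper names, and pDevice, are placeholders):

    #include <ddraw.h>
    #include <d3d.h>

    // Enumeration callback: set the flag if a DXT1 FOURCC format shows up.
    static HRESULT CALLBACK FindDXT1(LPDDPIXELFORMAT pddpf, LPVOID pContext)
    {
        if ((pddpf->dwFlags & DDPF_FOURCC) &&
            pddpf->dwFourCC == MAKEFOURCC('D', 'X', 'T', '1'))
        {
            *(BOOL*)pContext = TRUE;      // found it, stop enumerating
            return D3DENUMRET_CANCEL;
        }
        return D3DENUMRET_OK;             // keep looking
    }

    BOOL DeviceSupportsDXT1(LPDIRECT3DDEVICE7 pDevice)
    {
        BOOL bFound = FALSE;
        pDevice->EnumTextureFormats(FindDXT1, &bFound);
        return bFound;   // if FALSE, fall back to an uncompressed format
    }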

Oh, also, it seems to me that if you are only using one alpha bit, wouldn't it make more sense to just use color keying?
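For a 1-bit cutout, color keying under DirectX 7 would look roughly like this. The surface pointer, device pointer, and the choice of black as the transparent colour are all assumptions:

    // Sketch (DirectX 7): color keying instead of 1-bit alpha.
    // pTexSurface is assumed to be the road texture's IDirectDrawSurface7*,
    // and 0 (black) is assumed to be the transparent colour.
    DDCOLORKEY ddck;
    ddck.dwColorSpaceLowValue  = 0;
    ddck.dwColorSpaceHighValue = 0;
    pTexSurface->SetColorKey(DDCKEY_SRCBLT, &ddck);

    // Tell Direct3D to honour the source color key when texturing.
    pDevice->SetRenderState(D3DRENDERSTATE_COLORKEYENABLE, TRUE);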

Edited by - Shinkage on 1/3/00 9:49:32 AM
