Loading alpha-only texture from 8-bit grayscale

Hi, I'm trying to load an alpha-only texture (D3DFMT_A8, via D3DXCreateTextureFromFileEx) from 8-bit grayscale images (I've tried both TGA and PNG). But DirectX always loads the images as 32-bit ARGB textures with the alpha completely white. The video card is a GeForce 5700. In OpenGL I can always load the 8-bit grayscale data in whatever format I want. Here are the loading parameters:

D3DXCreateTextureFromFileEx(dx_dev, name,
    D3DX_DEFAULT, D3DX_DEFAULT, D3DX_DEFAULT,
    0, D3DFMT_A8, D3DPOOL_MANAGED,
    D3DX_DEFAULT, D3DX_DEFAULT,
    0, NULL, NULL, &dx_texture);

So the question is, what am I doing wrong? How can I load an alpha-only texture from a grayscale image? Thanks
D3DX's loading routines are prone to correcting your parameters if it detects a problem with them. Not all hardware supports D3DFMT_A8 textures - I forget whether it was ATI or Nvidia that lacked it, though. It could well be that D3DX is seeing that your hardware doesn't support the texture format and is falling back to something it does support.
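You can also query this yourself at runtime before loading. A minimal sketch, assuming a D3D9-style setup where d3d is your IDirect3D9* (the variable names are just for illustration):

// Check whether the adapter supports D3DFMT_A8 textures before asking
// D3DX for one; pick a fallback format if it doesn't.
D3DDISPLAYMODE mode;
d3d->GetAdapterDisplayMode(D3DADAPTER_DEFAULT, &mode);

HRESULT hr = d3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                    mode.Format,       // current adapter format
                                    0,                 // no special usage
                                    D3DRTYPE_TEXTURE,
                                    D3DFMT_A8);

D3DFORMAT texFormat = SUCCEEDED(hr) ? D3DFMT_A8 : D3DFMT_A8L8; // fall back if A8 is missing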

Enabling the debug runtimes should reveal why D3DX is changing things - it's usually quite helpful!

hth
Jack


This is actually one of the things I've always hated about D3DX texture loading... it treats (or used to treat) grayscale images as color-based, and if you ask for alpha you'll get nothing.

You can use shaders. In ps.1.1 you can use blue replicate to get L8 data into an alpha channel.
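For example, something roughly like this - a sketch only, assuming the image is loaded as D3DFMT_L8 and bound to stage 0, with the ps_1_1 shader assembled at runtime via D3DXAssembleShader (names are illustrative):

#include <d3dx9.h>
#include <string.h>

// ps_1_1 shader: sample the L8 texture (luminance lands in r, g, b),
// then use the blue-replicate source selector (.b) to copy it into alpha.
const char* kShaderSrc =
    "ps_1_1\n"
    "tex t0\n"             // sample stage 0 (the L8 texture)
    "mov r0.rgb, v0\n"     // color comes from the vertex diffuse
    "+mov r0.a, t0.b\n";   // blue replicate: luminance -> alpha

LPD3DXBUFFER shaderCode = NULL;
LPD3DXBUFFER errors = NULL;
IDirect3DPixelShader9* pixelShader = NULL;

if (SUCCEEDED(D3DXAssembleShader(kShaderSrc, (UINT)strlen(kShaderSrc),
                                 NULL, NULL, 0, &shaderCode, &errors)))
{
    dx_dev->CreatePixelShader((DWORD*)shaderCode->GetBufferPointer(),
                              &pixelShader);
    shaderCode->Release();
}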

You can load it as L8 or X8R8G8B8, lock the surface, and copy it into an A8 surface. You can also make a tool that copies into A8 textures and saves the result in a DDS file if you don't want to do this at runtime. Hopefully D3DX will promote it to A8L8 or A8R8G8B8 on cards that don't have A8.
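The runtime copy would look roughly like this - a sketch only, assuming the source loads as D3DFMT_L8 and the device supports D3DFMT_A8; the helper name is just for illustration and error handling is omitted:

#include <d3dx9.h>
#include <string.h>

// Load the grayscale image as L8, then copy the single channel by hand
// into an A8 texture of the same size.
IDirect3DTexture9* CreateA8FromGrayscale(IDirect3DDevice9* dev, const char* file)
{
    IDirect3DTexture9* src = NULL;
    D3DXCreateTextureFromFileEx(dev, file,
        D3DX_DEFAULT, D3DX_DEFAULT, 1, 0, D3DFMT_L8,
        D3DPOOL_SYSTEMMEM, D3DX_DEFAULT, D3DX_DEFAULT,
        0, NULL, NULL, &src);

    D3DSURFACE_DESC desc;
    src->GetLevelDesc(0, &desc);

    IDirect3DTexture9* dst = NULL;
    dev->CreateTexture(desc.Width, desc.Height, 1, 0,
                       D3DFMT_A8, D3DPOOL_MANAGED, &dst, NULL);

    D3DLOCKED_RECT srcRect, dstRect;
    src->LockRect(0, &srcRect, NULL, D3DLOCK_READONLY);
    dst->LockRect(0, &dstRect, NULL, 0);

    // Both formats are one byte per texel; copy row by row, respecting pitch.
    for (UINT y = 0; y < desc.Height; ++y)
    {
        memcpy((BYTE*)dstRect.pBits + y * dstRect.Pitch,
               (BYTE*)srcRect.pBits + y * srcRect.Pitch,
               desc.Width);
    }

    dst->UnlockRect(0);
    src->UnlockRect(0);
    src->Release();

    return dst;
}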

But that's the kicker... some cards don't have pure A8 support, so on those cards you end up with a larger format that eats more video RAM. The cards that lack support tend to be older cards, which are slower and have less video memory to begin with. You're forced to use extra resources on the cards with the most limited resources. Since you've got to plan on running on those cards anyway, is it worth the effort of using A8 on the newer, higher-end cards? Those cards will already run your app faster than the lower-end cards you have to support.

These choices suck.

