Licu

Loading an alpha-only texture from an 8-bit grayscale image


Hi, I'm trying to load an alpha-only texture (D3DFMT_A8, via D3DXCreateTextureFromFileEx) from 8-bit grayscale images (I've tried both TGA and PNG). But DirectX always loads the images as 32-bit ARGB textures with the alpha completely white. The video card is a GeForce 5700. In OpenGL I can always load the 8-bit grayscale data in whatever format I want. Here are the loading parameters:

D3DXCreateTextureFromFileEx( dx_dev, name, D3DX_DEFAULT, D3DX_DEFAULT, D3DX_DEFAULT, 0,
                             D3DFMT_A8, D3DPOOL_MANAGED, D3DX_DEFAULT, D3DX_DEFAULT,
                             0, NULL, NULL, &dx_texture );

So the question is, what am I doing wrong? How can I load an alpha-only texture from a grayscale image? Thanks

D3DX's loading routines are prone to correcting your parameters if they detect a problem with them. Not all hardware supports D3DFMT_A8 textures - I forget whether it was ATI or Nvidia that supported them, though. It could well be that D3DX sees your hardware doesn't support the texture format and is falling back to something it does support. You can check for yourself before loading, as in the sketch below.
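A minimal sketch of that check, assuming 'd3d' is your IDirect3D9 interface and your display mode is X8R8G8B8 (adjust to whatever your swap chain actually uses):

```cpp
// Ask the runtime whether the adapter exposes A8 textures at all.
// If this fails, D3DX will quietly substitute a format the card does support.
HRESULT hr = d3d->CheckDeviceFormat(D3DADAPTER_DEFAULT,
                                    D3DDEVTYPE_HAL,
                                    D3DFMT_X8R8G8B8,   // adapter/display format (assumed here)
                                    0,                  // no special usage flags
                                    D3DRTYPE_TEXTURE,
                                    D3DFMT_A8);
if (FAILED(hr))
{
    // No A8 support on this adapter - fall back to D3DFMT_A8L8
    // or D3DFMT_A8R8G8B8 when creating the texture.
}
```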

Enabling the debug runtimes should reveal why D3DX is changing things - it's usually quite helpful!

hth
Jack

This is actually one of the things I've always hated about D3DX texture loading... it treats (or used to treat) greyscale images as colour data, so if you ask for alpha you get nothing useful.

You can use shaders: in ps.1.1 you can use blue replicate to get L8 data into the alpha channel (see the sketch below).
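A minimal sketch of that idea, assuming the greyscale image is loaded as D3DFMT_L8 and bound to texture stage 0 (register usage is just an example):

```
ps.1.1
tex t0              // sample the L8 texture; the grey value lands in r, g and b
mov r0.rgb, v0      // take the colour from the vertex diffuse
+mov r0.a, t0.b     // blue replicate: reuse the grey value as alpha
```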

You can also load the image as L8 or X8R8G8B8, lock the surface, and copy the data into an A8 texture (see the sketch below). If you don't want to do this at runtime, you can write a tool that does the conversion and saves the result to a DDS file. Hopefully D3DX will then promote it to A8L8 or A8R8G8B8 on cards that don't have A8.
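A minimal sketch of the runtime version, assuming the device does support D3DFMT_A8 (see the CheckDeviceFormat note above); the function name is a placeholder and error checking is omitted:

```cpp
#include <cstring>
#include <d3d9.h>
#include <d3dx9.h>

// Loads an 8-bit greyscale file as L8 in system memory, then copies the
// bytes into a freshly created A8 texture. Error handling omitted for brevity.
IDirect3DTexture9* LoadAlphaOnlyTexture(IDirect3DDevice9* device, const char* filename)
{
    IDirect3DTexture9* l8Tex = NULL;
    IDirect3DTexture9* a8Tex = NULL;

    // Pull the image in as pure luminance, one mip level, system memory only.
    D3DXCreateTextureFromFileEx(device, filename,
        D3DX_DEFAULT, D3DX_DEFAULT, 1, 0,
        D3DFMT_L8, D3DPOOL_SYSTEMMEM,
        D3DX_DEFAULT, D3DX_DEFAULT, 0, NULL, NULL, &l8Tex);

    D3DSURFACE_DESC desc;
    l8Tex->GetLevelDesc(0, &desc);

    // The texture we actually wanted in the first place.
    device->CreateTexture(desc.Width, desc.Height, 1, 0,
        D3DFMT_A8, D3DPOOL_MANAGED, &a8Tex, NULL);

    D3DLOCKED_RECT src, dst;
    l8Tex->LockRect(0, &src, NULL, D3DLOCK_READONLY);
    a8Tex->LockRect(0, &dst, NULL, 0);

    // Both formats are one byte per texel, but the pitches may differ,
    // so copy one row at a time.
    for (UINT y = 0; y < desc.Height; ++y)
    {
        memcpy((BYTE*)dst.pBits + y * dst.Pitch,
               (BYTE*)src.pBits + y * src.Pitch,
               desc.Width);
    }

    a8Tex->UnlockRect(0);
    l8Tex->UnlockRect(0);
    l8Tex->Release();   // the system-memory copy is no longer needed

    return a8Tex;
}
```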

But that's the kicker... some cards don't have pure A8 support, and the fallback formats use more video RAM. The cards that lack support tend to be older, and therefore slower, with less video memory to begin with - so you're forced to spend extra resources on exactly the cards that have the least to spare. Since you've got to plan on running on those cards anyway, is it worth the effort of using A8 on the high-end/newer cards? They will already run your app faster than the lower-end cards you have to support.

These choices suck.
