Creating an alpha texture

8 comments, last by Strange Species 19 years ago
Using a valid DirectX device, I am able to create textures. However, I am unable to create an alpha-only texture. For example, the following works:

Device->CreateTexture(Width, Height, 0, 0, D3DFMT_A8R8G8B8, D3DPOOL_MANAGED, &AlphaTexture, 0);

But this does not:

Device->CreateTexture(Width, Height, 0, 0, D3DFMT_A8, D3DPOOL_MANAGED, &AlphaTexture, 0);

I have also tried switching D3DPOOL_MANAGED to D3DPOOL_DEFAULT, but that didn't do anything. I will gladly supply more info if anyone requires it. Any help would be appreciated :)
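For reference, this is roughly how I'm making the failing call and checking the result (simplified sketch; Device, Width and Height are set up elsewhere in my code):

IDirect3DTexture9* AlphaTexture = 0;
HRESULT hr = Device->CreateTexture(Width, Height,
                                   0,               // full mip chain
                                   0,               // no special usage
                                   D3DFMT_A8,       // this is the format that fails
                                   D3DPOOL_MANAGED, // also tried D3DPOOL_DEFAULT
                                   &AlphaTexture,
                                   0);
if (FAILED(hr))
{
    // hr comes back as a failure code here; the A8R8G8B8 version succeeds
}
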
What error returns when you run the code? What does DX Debug say?
Chris Byers - Microsoft DirectX MVP - 2005
I'm not sure, I only implemented a very basic error check. I will find out now.
It's just an invalid call. As for DX Debug, I don't think that's possible, as I'm running the app through a C++ DLL within a VB program.
Does your card support the D3DFMT_A8 format? Have a look at the DirectX Caps Viewer -> Your graphics card type -> HAL -> Adapter Formats -> (whatever format your backbuffer is) -> Texture Formats.
The Caps Viewer can be found in the Utilities folder of the DirectX SDK.
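You can also ask the runtime directly; something roughly like this (untested sketch, assuming d3d is your IDirect3D9* and your display format is X8R8G8B8):

HRESULT hr = d3d->CheckDeviceFormat(D3DADAPTER_DEFAULT,
                                    D3DDEVTYPE_HAL,
                                    D3DFMT_X8R8G8B8,   // your adapter/display format
                                    0,                 // plain texture usage
                                    D3DRTYPE_TEXTURE,
                                    D3DFMT_A8);
if (FAILED(hr))
{
    // D3DFMT_A8 textures aren't supported on this card/driver
}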
Hmmm... it would appear not, which is odd. I'd have thought that it'd be fairly widely supported, and it's a decent graphics card. It supports A8L8, L8, and many others, but not A8.

How do you recommend I perform texture blending without an A8 texture?

If it helps, this is the intended order of textures (a rough sketch of how I plan to combine them follows the list):
- Base texture (colour only, fixed)
- Alpha map 1 (alpha only, dynamic)
- Texture 1 (colour only, fixed)
- Alpha map 2 (alpha only, dynamic)
- Texture 2 (colour only, fixed)
- Alpha map 3 (alpha only, dynamic)
- Texture 3 (colour only, fixed)
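Roughly the kind of setup I was planning, as a multi-pass sketch (DrawTerrain and the texture variables are just placeholders, and it assumes the alpha maps are in a format that actually has an alpha channel):

// Pass 1: base texture, no blending
Device->SetRenderState(D3DRS_ALPHABLENDENABLE, FALSE);
Device->SetTexture(0, BaseTexture);
Device->SetTextureStageState(0, D3DTSS_COLOROP,   D3DTOP_SELECTARG1);
Device->SetTextureStageState(0, D3DTSS_COLORARG1, D3DTA_TEXTURE);
DrawTerrain();

// Passes 2-4: blend each detail texture in, using its alpha map for coverage
Device->SetRenderState(D3DRS_ALPHABLENDENABLE, TRUE);
Device->SetRenderState(D3DRS_SRCBLEND,  D3DBLEND_SRCALPHA);
Device->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_INVSRCALPHA);
for (int i = 0; i < 3; ++i)
{
    // Stage 0: colour from the detail texture
    Device->SetTexture(0, DetailTexture[i]);
    Device->SetTextureStageState(0, D3DTSS_COLOROP,   D3DTOP_SELECTARG1);
    Device->SetTextureStageState(0, D3DTSS_COLORARG1, D3DTA_TEXTURE);
    // Stage 1: alpha from the alpha map, colour passed through from stage 0
    // (assumes the vertices carry texture coordinates for both stages)
    Device->SetTexture(1, AlphaMap[i]);
    Device->SetTextureStageState(1, D3DTSS_COLOROP,   D3DTOP_SELECTARG2);
    Device->SetTextureStageState(1, D3DTSS_COLORARG2, D3DTA_CURRENT);
    Device->SetTextureStageState(1, D3DTSS_ALPHAOP,   D3DTOP_SELECTARG1);
    Device->SetTextureStageState(1, D3DTSS_ALPHAARG1, D3DTA_TEXTURE);
    DrawTerrain();
}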
Yeah, I just realised my GeForce 4 doesn't do A8 either. I also thought it was fairly widespread.
It shouldn't really matter what format you use, since you'll only be using the alpha channel. I'd probably give A8L8 a try. You might want to be able to fall back to A8R8G8B8 if that's not supported on the target machine (Although it'll be a hell of a waste of gfx memory)
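i.e. something like this to pick the first format the card accepts (untested, assumes d3d and adapterFormat are already set up):

const D3DFORMAT candidates[] = { D3DFMT_A8, D3DFMT_A8L8, D3DFMT_A8R8G8B8 };
D3DFORMAT alphaFormat = D3DFMT_UNKNOWN;
for (int i = 0; i < 3; ++i)
{
    if (SUCCEEDED(d3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                         adapterFormat, 0,
                                         D3DRTYPE_TEXTURE, candidates[i])))
    {
        alphaFormat = candidates[i];   // first supported format wins
        break;
    }
}
// then create the texture with alphaFormat and write your alpha values
// into whichever channel that format keeps them in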
I'll only be using 3 alpha textures for my terrain mapping, and they'll probably be 32x32 pixels, so any memory loss should be minimal I suppose. Thanks for your help, I should be able to continue with it now :)
Beware that on the ATI Radeon 9600/9700/9800, D3DFMT_A8 is supported whereas D3DFMT_A8L8 isn't supported.

D3DFMT_A8L8 is supported on the nVidia GeForce 3 and above, but they don't support D3DFMT_A8.


A more widely supported format on most modern ATI *and* nVidia chips is D3DFMT_L8, which is essentially greyscale, so if you're using pixel shaders you can just use the .r, .g or .b channel instead of the .a channel.

If you're not using pixel shaders, for most things you should still be able to rework your SetTextureStageState() setup. If you really need the value in the alpha channel, you could probably abuse D3DTOP_DOTPRODUCT3, since that replicates the result of the operation into all channels including alpha; a DOT3 against a TFACTOR of 0x00FF0000 would copy the red channel into A, R, G and B, for example.
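Something along these lines (off the top of my head, not tested; AlphaMapL8 is just a placeholder for your L8 texture):

Device->SetRenderState(D3DRS_TEXTUREFACTOR, 0x00FF0000);             // red-only factor
Device->SetTexture(0, AlphaMapL8);
Device->SetTextureStageState(0, D3DTSS_COLOROP,   D3DTOP_DOTPRODUCT3);
Device->SetTextureStageState(0, D3DTSS_COLORARG1, D3DTA_TEXTURE);
Device->SetTextureStageState(0, D3DTSS_COLORARG2, D3DTA_TFACTOR);
// DOTPRODUCT3 replicates its result into all channels, including alpha,
// so later stages (or alpha blending/testing) can pick the value up from .a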



[L8 textures are commonly used for lightmaps, which is probably why there's better support for them]

Simon O'Connor | Technical Director (Newcastle) Lockwood Publishing | LinkedIn | Personal site

Interesting... using the DX Texture tool, I created a small A8 texture. When I tried to load it using D3DXCreateTextureFromFile(), it succeeded.
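
I'm going to check whether D3DX quietly converted it to another format behind the scenes; something like this (sketch, the file name is just an example):

IDirect3DTexture9* tex = 0;
if (SUCCEEDED(D3DXCreateTextureFromFile(Device, "alpha.dds", &tex)))
{
    D3DSURFACE_DESC desc;
    tex->GetLevelDesc(0, &desc);
    // desc.Format shows what the texture actually ended up as --
    // it may not be D3DFMT_A8 even though the file is
}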

This topic is closed to new replies.
