

This topic is now archived and is closed to further replies.

Advanced Enhanced 2D Question about alpha-mapping


Recommended Posts

I need to add alpha-blending capability to a 16-bit texture without switching to 32-bit texturing (upgrading it to 8888) and without losing quality (downgrading it to 4444). Is there a way to attach an alpha-only texture (map) to a 16-bit texture and tell the API (D3D) to take the color from my texture and the alpha from the map? I've been investigating for quite a while, but couldn't find any answers, just some vague hints that this can be done. Edited by - Despotismo on 7/21/00 5:01:50 PM

You will need to use two separate textures to do this. I haven't even moved up to 3D programming yet, but I know how to do this. Load one texture as a normal 16-bit texture and another as an 8-bit alpha-only texture, then use texture stages to combine them.

// Load your textures here: a 16-bit color texture and an 8-bit alpha-only texture

// Combine the two
lpd3dDev->SetTexture (0, texture);
lpd3dDev->SetTexture (1, alpha);
// Stage 0: take the color from the first texture
lpd3dDev->SetTextureStageState (0, D3DTSS_COLOROP,   D3DTOP_SELECTARG1);
lpd3dDev->SetTextureStageState (0, D3DTSS_COLORARG1, D3DTA_TEXTURE);
// NOTE: at stage 0, D3DTA_CURRENT is the same as the diffuse color.
// I'm not sure you have to set the alpha operation in stage 0; I did here just in case.
lpd3dDev->SetTextureStageState (0, D3DTSS_ALPHAOP,   D3DTOP_SELECTARG2);
lpd3dDev->SetTextureStageState (0, D3DTSS_ALPHAARG2, D3DTA_CURRENT);
// Stage 1: pass the color through, take the alpha from the second texture
lpd3dDev->SetTextureStageState (1, D3DTSS_COLOROP,   D3DTOP_SELECTARG2);
lpd3dDev->SetTextureStageState (1, D3DTSS_COLORARG2, D3DTA_CURRENT);
lpd3dDev->SetTextureStageState (1, D3DTSS_ALPHAOP,   D3DTOP_SELECTARG1);
lpd3dDev->SetTextureStageState (1, D3DTSS_ALPHAARG1, D3DTA_TEXTURE);
// Disable the remaining stages
lpd3dDev->SetTextureStageState (2, D3DTSS_COLOROP, D3DTOP_DISABLE);
lpd3dDev->SetTextureStageState (2, D3DTSS_ALPHAOP, D3DTOP_DISABLE);

// Then render the polygon that uses this texture. We're done!
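For the combined alpha to actually show up on screen, frame-buffer alpha blending also has to be enabled as a render state. A minimal DX7-style sketch, using the same hypothetical `lpd3dDev` device as above:

```cpp
// Enable alpha blending so the per-pixel alpha from the stages takes effect
lpd3dDev->SetRenderState (D3DRENDERSTATE_ALPHABLENDENABLE, TRUE);
lpd3dDev->SetRenderState (D3DRENDERSTATE_SRCBLEND,  D3DBLEND_SRCALPHA);
lpd3dDev->SetRenderState (D3DRENDERSTATE_DESTBLEND, D3DBLEND_INVSRCALPHA);
```

This is the standard source-alpha/inverse-source-alpha blend; without it the stage setup computes an alpha value that is simply discarded.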

I hope this helped. Please tell me if it worked for you or not.


Edited by - Dave2001 on July 26, 2000 4:11:05 PM

I can't get it working.

I guess the code you sent only works on hardware with multiple-texture blending. How can I split this correctly into a multi-pass texture blend?
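Before relying on two simultaneous textures, the hardware can be queried up front and the stage setup validated. A DX7-style sketch (error handling elided; `lpd3dDev` is the same hypothetical device as in the earlier post):

```cpp
// Ask the device how many textures it can blend in one pass
D3DDEVICEDESC7 caps;
lpd3dDev->GetCaps (&caps);
bool canMultiTexture = caps.wMaxSimultaneousTextures >= 2;

// With the stage states already set, ask the driver whether it can
// actually render them, and in how many passes
DWORD numPasses = 0;
if (canMultiTexture && SUCCEEDED (lpd3dDev->ValidateDevice (&numPasses))) {
    // single-pass multitexture path
} else {
    // fall back (e.g. pre-combine on the CPU, or a multi-pass scheme)
}
```

`ValidateDevice` is the intended way to find out whether a given stage setup works on the current card, rather than assuming it from the caps bits alone.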

Another thing. The whole result depends only on the internal representation of the intermediate texture-stage output. If my texture is 565 and I combine it through the stages with an alpha-only texture, that gives an intermediate result that is then blended into the final polygon. If it happens that this intermediate representation is not alpha-capable, all the alpha information will be lost!

Do all cards support 32-bit textures? If I can't get this working, I can simply use the 8888 pixel format and achieve the effect that way. But I think some cards (Voodoo1 and 2, for example) don't support 32-bit textures, so I'd have to use 16-bit textures with 8-bit alpha maps. And if doing that requires multiple-texture blending, I'd be back to the same high-end-card problem, because not every card supports that either.

Thanks anyways,

Despotismo AKA Javier Otaegui
Sabarasa Entertainment

