Help with texture stage states


I'm trying to use the luminance of a texture as the source blend factor. I have a texture in the format D3DFMT_L8 (it's just an 8-bit grayscale). I want to render with this texture, using the texture luminance as the blend factor. But I want the color for the rendering to be taken only from the vertex colors (not the texture color). I can do this:
pD3DDevice->SetRenderState (D3DRS_SRCBLEND, D3DBLEND_SRCCOLOR);
pD3DDevice->SetRenderState (D3DRS_DESTBLEND, D3DBLEND_INVSRCCOLOR);
 
and it almost does what I want. It's using the texture luminance as the blending factor like I want, but it's also using the luminance as the texture color (modulated by the vertex colors). I want it to only use the vertices for color, and just use the texture luminance purely as the blending factor. Is there any way I can do this? Maybe with the texture stage states somehow? TIA
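
For reference, the modulation comes from the fixed-function defaults: with no texture stage states set, stage 0 modulates the texture with the diffuse color, equivalent to the sketch below (these are the defaults, not code that needs adding), and that modulated color is what the SRCCOLOR blend factor picks up.

// Fixed-function default for stage 0: color = texture * diffuse (CURRENT is the
// diffuse color at stage 0), so the L8 luminance gets multiplied into the color
// that the SRCCOLOR/INVSRCCOLOR blend then uses as its factor.
pD3DDevice->SetTextureStageState (0, D3DTSS_COLOROP,   D3DTOP_MODULATE);
pD3DDevice->SetTextureStageState (0, D3DTSS_COLORARG1, D3DTA_TEXTURE);
pD3DDevice->SetTextureStageState (0, D3DTSS_COLORARG2, D3DTA_CURRENT);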

In the texture stages there is no intended way to get color into the alpha channel. In pixel shaders you are allowed to move blue into alpha, which would work fine, but I'm guessing you don't want to use pixel shaders. I'm guessing you can't use A8 or P8, and don't want to use A8L8, A8P8, A8R3G3B2, etc.

So, that leaves us looking for side effects that will affect alpha... and we're in luck. Dot3 replicates to all channels, including alpha. If you want to assume TFactor and Dot3 are present on the card, you're all set. You can set TFactor to 255,0,0 and dot it with the texture lookup. Remember though that this will make the texture range 128-255 valid, and 0-127 invalid, so it's kinda like a 7-bit alpha now.
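
A minimal sketch of that setup, assuming the pD3DDevice pointer from the original post, a card with at least two texture stages, and alpha blending enabled. One adjustment from the numbers above: by the documented Dot3 formula (4 times the sum of the per-channel (value - 0.5) products), a TFactor of 255,128,128 keeps the green and blue terms neutral so that luminance 128-255 maps onto alpha, whereas 255,0,0 would invert that mapping.

// The L8 texture is assumed to already be set on stage 0 via SetTexture(0, ...).
// Stage 0: dot the texture with TFACTOR. DOTPRODUCT3 replicates its scalar result
// to all channels, including alpha, which is the side effect used here.
// TFACTOR = (255,128,128): red carries the luminance; green/blue sit at ~0.5 and
// contribute (almost) nothing, so L in 128-255 maps to alpha 0-255.
pD3DDevice->SetRenderState (D3DRS_TEXTUREFACTOR, D3DCOLOR_ARGB (255, 255, 128, 128));
pD3DDevice->SetTextureStageState (0, D3DTSS_COLOROP,   D3DTOP_DOTPRODUCT3);
pD3DDevice->SetTextureStageState (0, D3DTSS_COLORARG1, D3DTA_TEXTURE);
pD3DDevice->SetTextureStageState (0, D3DTSS_COLORARG2, D3DTA_TFACTOR);

// Stage 1: output the vertex (diffuse) color, carry the dot3 result through as alpha.
pD3DDevice->SetTextureStageState (1, D3DTSS_COLOROP,   D3DTOP_SELECTARG1);
pD3DDevice->SetTextureStageState (1, D3DTSS_COLORARG1, D3DTA_DIFFUSE);
pD3DDevice->SetTextureStageState (1, D3DTSS_ALPHAOP,   D3DTOP_SELECTARG1);
pD3DDevice->SetTextureStageState (1, D3DTSS_ALPHAARG1, D3DTA_CURRENT);

// Blend on the replicated alpha instead of the source color.
pD3DDevice->SetRenderState (D3DRS_ALPHABLENDENABLE, TRUE);
pD3DDevice->SetRenderState (D3DRS_SRCBLEND,  D3DBLEND_SRCALPHA);
pD3DDevice->SetRenderState (D3DRS_DESTBLEND, D3DBLEND_INVSRCALPHA);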

quote:
Original post by Namethatnobodyelsetook
In the texture stages there is no intended way to get color into the alpha channel. In pixel shaders you are allowed to move blue into alpha, which would work fine, but I'm guessing you don't want to use pixel shaders. I'm guessing you can't use A8 or P8, and don't want to use A8L8, A8P8, A8R3G3B2, etc.


I initially tried to create the texture as D3DFMT_A8, but the call to CreateTexture() failed with D3DERR_INVALIDCALL (I just now tried it with D3DFMT_P8, with the same failure). Ideally, the A8 format is what I would like, but it appears that D3D won't create a texture in this format (please tell me if it will, and I'm doing something wrong). I could certainly use something like A8L8 and just fill the color data with solid white, but I have a lot of these textures and I hate to waste all that texture memory on color data I don't need.

quote:

So, that leaves us looking for side effects that will affect alpha... and we're in luck. Dot3 replicates to all channels, including alpha. If you want to assume TFactor and Dot3 are present on the card, you're all set. You can set TFactor to 255,0,0 and dot it with the texture lookup. Remember though that this will make the texture range 128-255 valid, and 0-127 invalid, so it's kinda like a 7-bit alpha now.



Hmmm, that is an interesting approach. I could certainly convert to the "pseudo 7-bit alpha" by shifting the data one bit as I load from the image file.
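
For example, a load-time conversion along those lines (the buffer names here are made up for illustration) would halve each grayscale sample and bias it into the upper half of the range, since 128-255 is what the Dot3 trick treats as valid:

// Hypothetical load-time conversion: shift each 8-bit grayscale sample right by
// one bit and bias it into 128-255, the range the Dot3 setup maps onto alpha.
for (unsigned int i = 0; i < numTexels; ++i)
    pTexelData[i] = (unsigned char) (128 + (pImageData[i] >> 1));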

My only concern is: how many cards actually support Dot3 and TFactor? I really want to support GeForce1-level cards, and ideally as far back as TNT2-level cards. Also, even if the card supports it, is the performance going to be slower using something like Dot3?

What level card does it take to support pixel shaders?

Pixel shaders need a GeForce3 or better.

Some older cards don't support TFactor (such as the ATI RagePro Turbo, aka XPert98). I'm not sure when Dot3 was added to DirectX, so I can't really say which cards support it. I'm pretty sure it was in DX7, but I could be wrong, as I started with DX8.
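
Rather than going by card generation, one way to check is to ask the device for its caps at runtime; a DX9-style sketch (the DX8 caps structure has the same fields), assuming the pD3DDevice pointer from the earlier posts:

// Query the device capabilities once at startup.
D3DCAPS9 caps;
pD3DDevice->GetDeviceCaps (&caps);

// Does the fixed-function pipeline expose the Dot3 texture op?
BOOL hasDot3 = (caps.TextureOpCaps & D3DTEXOPCAPS_DOTPRODUCT3) != 0;

// Pixel shader support: GeForce3-class hardware reports version 1.1 or higher.
BOOL hasPS11 = (caps.PixelShaderVersion >= D3DPS_VERSION (1, 1));

// There's no single cap bit for TFactor as a stage argument; the usual check is to
// set up the stages you want and call ValidateDevice() to see if they're accepted.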

By the way, you can't hard code your formats... you need to design your code to be flexible, unless you're using common formats like A8R8G8B8 or R5G6B5. For example, using DX8, my GeForce3 supports P8, but not A8 or L8. Using DX9 on the same card, same drivers, it supports L8, but not P8 or A8. It's all very odd. You need to query which formats are supported, come up with some way to rank the formats for your needs, and use the best-fit format that's actually supported.
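
A sketch of that query, assuming a DX9-style IDirect3D9 pointer (called pD3D here) and the adapter format the device was created against (adapterFormat); the candidate list and its ordering are only an illustration of the ranking idea:

// Candidate formats, best fit first (hypothetical ranking for an alpha-only texture).
const D3DFORMAT candidates[] = { D3DFMT_A8, D3DFMT_A8L8, D3DFMT_A8R8G8B8 };

D3DFORMAT chosen = D3DFMT_UNKNOWN;
for (int i = 0; i < sizeof (candidates) / sizeof (candidates[0]); ++i)
{
    // Ask D3D whether this texture format works with the current adapter format.
    if (SUCCEEDED (pD3D->CheckDeviceFormat (D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                            adapterFormat, 0,
                                            D3DRTYPE_TEXTURE, candidates[i])))
    {
        chosen = candidates[i];
        break;
    }
}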

If you want to support older cards, you might be out of luck using L8, and you'll need to find a format with alpha and waste some color bits.

Okay, looks like I'll just need to use a common format and waste the RGB space with solid white. Thanks for your help and feedback.

It just seems strange to me that support for some type of alpha-only format wouldn't be common. There are so many typical cases where you'd want to render with a solid color but blend or mask it with something like an alpha channel (colored sprites/particles, bitmapped font rendering, etc.).
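
For what it's worth, filling a common format that way might look like the following rough sketch for A8L8, with the grayscale data going into alpha and the luminance byte forced to white (the texture and source-buffer names are hypothetical):

// A8L8 is 16 bits per texel: alpha in the high byte, luminance in the low byte.
// Put the grayscale value into alpha and hard-wire the luminance to 255 (white),
// so the color contribution comes entirely from the vertex colors.
D3DLOCKED_RECT lr;
if (SUCCEEDED (pTexture->LockRect (0, &lr, NULL, 0)))
{
    for (unsigned int y = 0; y < height; ++y)
    {
        WORD* pRow = (WORD*) ((BYTE*) lr.pBits + y * lr.Pitch);
        for (unsigned int x = 0; x < width; ++x)
            pRow[x] = (WORD) ((pGrayscale[y * width + x] << 8) | 0xFF);
    }
    pTexture->UnlockRect (0);
}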

I know, it's absurd, especially considering nVidia exposes A8 and AL8 on Xbox (AL8 is 8 bits where A=R=G=B)... and considering you could use P8 as AL8 on the PC until they discontinued support in DX9, while the hardware does it perfectly well in DX8.

You might actually be better off going back to DX8.1 if you're supporting old hardware. It's a shame you can't support both in DX9.
