Texture blending using vertex alpha

Started by Loncs; 3 comments, last by Namethatnobodyelsetook 15 years, 7 months ago
I want to blend two (or more if possible) textures together, in a single pass using multitexturing and vertex alpha. So far I've tried it with 2 textures, but it doesn't give the desired results. My states are set as follows:

m_pDevice->SetTextureStageState(0, D3DTSS_COLOROP,   D3DTOP_SELECTARG1);
m_pDevice->SetTextureStageState(0, D3DTSS_COLORARG1, D3DTA_TEXTURE);
m_pDevice->SetTextureStageState(1, D3DTSS_COLOROP,   D3DTOP_BLENDDIFFUSEALPHA);
m_pDevice->SetTextureStageState(1, D3DTSS_COLORARG1, D3DTA_CURRENT);
m_pDevice->SetTextureStageState(1, D3DTSS_COLORARG2, D3DTA_TEXTURE);

The FVF looks like this: (D3DFVF_XYZ | D3DFVF_TEX2 | D3DFVF_DIFFUSE)

What am I doing wrong? And what states do I have to set for blending more textures together? Thanks for any help!

Loncs
bump :)
That should be fine, as long as that's not your complete code and you really do have two sets of UV coordinates in your vertex.

Any stage with a colorop requires an alphaop, so set stage 1's alphaop (and args) to something other than disabled. If you really have 2 UVs, ensure that D3DTSS_TEXCOORDINDEX is set appropriately on the second stage.
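For illustration, here's roughly what the full two-stage setup looks like with those two fixes added (untested sketch; m_pDevice is your device pointer, the color ops are as in your post, the alpha ops and D3DTSS_TEXCOORDINDEX settings are the additions):

// Stage 0: take texture 0's color straight through, pass vertex alpha along.
m_pDevice->SetTextureStageState(0, D3DTSS_COLOROP,   D3DTOP_SELECTARG1);
m_pDevice->SetTextureStageState(0, D3DTSS_COLORARG1, D3DTA_TEXTURE);
m_pDevice->SetTextureStageState(0, D3DTSS_ALPHAOP,   D3DTOP_SELECTARG1);
m_pDevice->SetTextureStageState(0, D3DTSS_ALPHAARG1, D3DTA_DIFFUSE);
m_pDevice->SetTextureStageState(0, D3DTSS_TEXCOORDINDEX, 0);

// Stage 1: blend the previous result with texture 1 using the vertex (diffuse) alpha.
m_pDevice->SetTextureStageState(1, D3DTSS_COLOROP,   D3DTOP_BLENDDIFFUSEALPHA);
m_pDevice->SetTextureStageState(1, D3DTSS_COLORARG1, D3DTA_CURRENT);
m_pDevice->SetTextureStageState(1, D3DTSS_COLORARG2, D3DTA_TEXTURE);
m_pDevice->SetTextureStageState(1, D3DTSS_ALPHAOP,   D3DTOP_SELECTARG1);  // anything but D3DTOP_DISABLE
m_pDevice->SetTextureStageState(1, D3DTSS_ALPHAARG1, D3DTA_CURRENT);
m_pDevice->SetTextureStageState(1, D3DTSS_TEXCOORDINDEX, 1);

// Stage 2: stop the cascade.
m_pDevice->SetTextureStageState(2, D3DTSS_COLOROP, D3DTOP_DISABLE);
m_pDevice->SetTextureStageState(2, D3DTSS_ALPHAOP, D3DTOP_DISABLE);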

Diffuse in this case means one of two things: if lighting is off, it's the vertex diffuse color; if lighting is on, it's the output of the lighting calculation. Does your lighting use the material color or the vertex color? If it uses the material color, you can't use vertex alpha (without porting to shaders and handling it yourself). The lighting system mucks with your alpha (lights and materials all have alpha values that do *something*), and I can't remember which settings make it behave, as we switched to shaders a very long time ago, even for non-shader hardware.

You can emulate vertex shaders with software vertex processing and then feed the fixed pipe for pixel processing. You lose no potential market share, but gain a lot of extra flexibility.
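If you do stay on the fixed lighting path and still want the vertex alpha to drive the blend, the render state that's usually involved is the diffuse material source (a sketch from memory, not verified against this exact setup), and the software-vertex-processing route mentioned above is just a device-creation flag:

// Pull diffuse (including its alpha) from the per-vertex color rather than the material.
// D3DMCS_COLOR1 = the vertex diffuse color.
m_pDevice->SetRenderState(D3DRS_DIFFUSEMATERIALSOURCE, D3DMCS_COLOR1);

// Software vertex processing lets you run vertex shaders in software while still
// using the fixed pixel path on old hardware; this goes in CreateDevice, e.g.:
// pD3D9->CreateDevice(..., D3DCREATE_SOFTWARE_VERTEXPROCESSING, &presentParams, &m_pDevice);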
Thanks a lot. I got it working now.
The strange thing is that if I turn on alpha blending with ALPHABLENDENABLE, the SRCBLEND, DESTBLEND, and BLENDOP render states affect the multitexturing result. I thought these render states only had an effect when blending into the framebuffer, i.e. when doing multipass blending. Is this normal behavior?
And of course I want to move on to shaders soon, but I thought it would be good to learn how the fixed-function pipeline works first. Or is that just a waste of time because it isn't used anymore?
Alpha blending doesn't affect blending between stages, just the output from the final stage. You may have been outputting vertex alpha (CURRENT on stage 0 means the same as DIFFUSE), cutting holes wherever you were attempting to show one of your textures. Two entirely separate blends using the same alpha may have made it look like the first blend wasn't working correctly.
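To illustrate, assuming you want the triangles fully opaque against the frame buffer, either turn alpha blending back off, or have the last active stage output a constant alpha instead of the vertex alpha (a sketch, not your exact code):

// Option 1: no frame-buffer blend at all.
m_pDevice->SetRenderState(D3DRS_ALPHABLENDENABLE, FALSE);

// Option 2: keep alpha blending on, but emit a constant alpha from the texture
// factor on the final stage, so the vertex alpha only drives the stage-1 blend.
m_pDevice->SetRenderState(D3DRS_TEXTUREFACTOR, 0xFFFFFFFF);   // fully opaque
m_pDevice->SetTextureStageState(1, D3DTSS_ALPHAOP,   D3DTOP_SELECTARG1);
m_pDevice->SetTextureStageState(1, D3DTSS_ALPHAARG1, D3DTA_TFACTOR);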

Learning the fixed pipeline is still useful if you're going to release something publicly over the next couple of years to a non-hardcore gaming crowd. Many older systems have older GPUs or non-shader motherboard graphics.

The fixed pipeline is going away, though: it's not there on the 360, and it's not in D3D10. It's not uncommon to see a shader 2.0 minimum spec on games, but fixed-pipe support is definitely still needed for the casual market.

