partial transparency in D3D8 with 32bit tga files

3 comments, last by karasuman 21 years, 3 months ago
Does anyone know which render states need to be set to get partial transparency from a 32-bit Targa file? I have the following set, but I only seem to be getting on/off transparency in the program. I'm not sure whether the problem is in the file or in the program.

g_lpD3DDevice->SetRenderState(D3DRS_ALPHABLENDENABLE, TRUE);
g_lpD3DDevice->SetRenderState(D3DRS_SRCBLEND,  D3DBLEND_SRCALPHA);
g_lpD3DDevice->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_INVSRCALPHA);
 
Thanks in advance
1) Which texture format are you loading into? If it's something like D3DFMT_A1R5G5B5, then you're only getting 1 bit of alpha information (thus on/off) - try D3DFMT_A8R8G8B8.

2) Ensure the D3DRS_ALPHATESTENABLE render state is set to FALSE, otherwise pixels whose alpha doesn't pass whatever alpha REF value is set will not be rendered at all.

3) Check your SetTextureStageState calls to make sure that the alpha from the texture, rather than, say, the vertex, is actually making it through the whole pipeline to the frame buffer blending stage.

4) Make sure you set the D3DRS_ZWRITEENABLE render state to FALSE and render all transparent stuff AFTER the non-transparent stuff (otherwise the alpha'd polygons will be writing Z and won't look transparent once opaque stuff is rendered behind them). A quick sketch pulling 1-4 together is below.
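
For what it's worth, here is a minimal sketch of points 1-4 put together. It assumes the g_lpD3DDevice pointer from the original post; the D3DX loading call, file name and sizes are just illustrative, not the poster's actual code.

// Assumes g_lpD3DDevice is a valid LPDIRECT3DDEVICE8; file name is illustrative.
LPDIRECT3DTEXTURE8 pTexture = NULL;

// 1) Force a format with 8 bits of alpha when loading the 32-bit TGA.
D3DXCreateTextureFromFileEx(g_lpD3DDevice, "sprite.tga",
    D3DX_DEFAULT, D3DX_DEFAULT, D3DX_DEFAULT, 0,
    D3DFMT_A8R8G8B8, D3DPOOL_MANAGED,
    D3DX_FILTER_NONE, D3DX_FILTER_NONE, 0, NULL, NULL, &pTexture);

// 2) Alpha test off, frame buffer blending on.
g_lpD3DDevice->SetRenderState(D3DRS_ALPHATESTENABLE, FALSE);
g_lpD3DDevice->SetRenderState(D3DRS_ALPHABLENDENABLE, TRUE);
g_lpD3DDevice->SetRenderState(D3DRS_SRCBLEND,  D3DBLEND_SRCALPHA);
g_lpD3DDevice->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_INVSRCALPHA);

// 3) Take the alpha straight from the texture in stage 0.
g_lpD3DDevice->SetTexture(0, pTexture);
g_lpD3DDevice->SetTextureStageState(0, D3DTSS_ALPHAOP,   D3DTOP_SELECTARG1);
g_lpD3DDevice->SetTextureStageState(0, D3DTSS_ALPHAARG1, D3DTA_TEXTURE);

// 4) Don't write Z during the transparent pass; draw it after the opaque pass.
g_lpD3DDevice->SetRenderState(D3DRS_ZWRITEENABLE, FALSE);
// ... draw transparent geometry here ...
g_lpD3DDevice->SetRenderState(D3DRS_ZWRITEENABLE, TRUE);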

--
Simon O''Connor
Creative Asylum Ltd
www.creative-asylum.com


Thanks for the response! I think the problem is #3:

3) Check your SetTextureStageState calls to make sure that the alpha from the texture, rather than, say, the vertex, is actually making it through the whole pipeline to the frame buffer blending stage.

I have them set to...

g_lpD3DDevice->SetTextureStageState(0, D3DTSS_ALPHAOP,   D3DTOP_MODULATE);
g_lpD3DDevice->SetTextureStageState(0, D3DTSS_ALPHAARG1, D3DTA_TEXTURE);
g_lpD3DDevice->SetTextureStageState(0, D3DTSS_ALPHAARG2, D3DTA_DIFFUSE);


And I tried...

g_lpD3DDevice->SetTextureStageState(0, D3DTSS_ALPHAOP,   D3DTOP_BLENDTEXTUREALPHA);
g_lpD3DDevice->SetTextureStageState(0, D3DTSS_ALPHAARG1, D3DTA_TEXTURE);
g_lpD3DDevice->SetTextureStageState(0, D3DTSS_ALPHAARG2, D3DTA_TEXTURE);


among others. Anyone know what would be a good set of parameters? Thanks in advance.
My guess is that it's your pixel format, because your alpha operations look fine to me. By the way, you don't need to put D3DTA_TEXTURE in both arguments; you can just use the D3DTOP_SELECTARG1 operation to ignore the second argument. If you do want to keep using D3DTA_DIFFUSE as one of the arguments (typically modulated with D3DTA_TEXTURE), then ensure your per-vertex alpha is set correctly (i.e. 1.0 for full opacity).
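
For example, a rough sketch of the two setups described above (the 0xFFFFFFFF diffuse value is only an illustration, assuming an FVF that includes D3DFVF_DIFFUSE):

// Option A: take the alpha straight from the texture, ignore the vertex alpha.
g_lpD3DDevice->SetTextureStageState(0, D3DTSS_ALPHAOP,   D3DTOP_SELECTARG1);
g_lpD3DDevice->SetTextureStageState(0, D3DTSS_ALPHAARG1, D3DTA_TEXTURE);

// Option B: modulate texture alpha with per-vertex alpha - the vertex diffuse
// colour must then carry full alpha (0xFF), or everything is faded as well.
g_lpD3DDevice->SetTextureStageState(0, D3DTSS_ALPHAOP,   D3DTOP_MODULATE);
g_lpD3DDevice->SetTextureStageState(0, D3DTSS_ALPHAARG1, D3DTA_TEXTURE);
g_lpD3DDevice->SetTextureStageState(0, D3DTSS_ALPHAARG2, D3DTA_DIFFUSE);
// e.g. vertex.color = 0xFFFFFFFF;   // alpha component = 0xFF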

[ MSVC Fixes | STL | SDL | Game AI | Sockets | C++ Faq Lite | Boost | Asking Questions | Organising code files | My stuff ]
Thank you for the response!

I found the problem - as usual, it was something silly! When I changed the code from A1R5G5B5 to A8R8G8B8, I had only changed the part where D3D is initialized; I didn't change the part where I load the file to a surface. Once I switched that as well, it worked fine.
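
In case anyone hits the same thing later: the 8-bit alpha format has to be requested at the surface-loading step too, not just at device creation. A rough sketch of that side, assuming D3DX and purely illustrative sizes/file name:

// The format here must also carry 8 bits of alpha; width/height/file name
// are illustrative, not from the original code.
LPDIRECT3DSURFACE8 pSurface = NULL;
g_lpD3DDevice->CreateImageSurface(256, 256, D3DFMT_A8R8G8B8, &pSurface);

// Load the 32-bit TGA into the surface, preserving its alpha channel.
D3DXLoadSurfaceFromFile(pSurface, NULL, NULL, "sprite.tga",
                        NULL, D3DX_FILTER_NONE, 0, NULL);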

Thanks again for your time.

