
GF4MX/DX8.1 dest alpha read problem


Hello. I had some DX7 code that used D3DBLEND_DESTALPHA blend states in some important areas. I was using an ATI XPert 128 at the time, and the code ran correctly. I then bought a Gainward GeForce4 MX card ("PowerPack", model Pro/450TV, 64MB DDR), and it runs the code correctly except for the parts where I read the destination alpha. In those sections, the result is as if the alpha had a uniform value of 1.0. I've tried various experiments, such as clearing the alpha to 0 in the buffer clear, but the results still behave as though I were reading 1.0 from the destination alpha all the time.

I then rewrote the code using the DirectX 8.1 interface. Again, everything except the alpha read worked properly, and again the results seemed to imply that the destination alpha was filled with the value 1.0. I'm using the newest NVIDIA drivers. I don't think there is a mistake in my code: it worked on the ATI card, and I've checked the initialization code again to make sure I'm using the A8R8G8B8 format for the render surface. I need to know if anyone else has had this problem, or if there might be something I've overlooked code-wise that the ATI driver didn't catch. Sorry for the long post, and thanks.
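(A quick sanity check, sketched here rather than taken from the post: the runtime can be asked up front whether the adapter can present an A8R8G8B8 back buffer at all. This assumes d3d is a valid IDirect3D8 pointer and a 32-bit desktop.)

// hypothetical check: can the HAL present an A8R8G8B8 back buffer
// on an X8R8G8B8 desktop in windowed mode?
HRESULT hr = d3d->CheckDeviceType( D3DADAPTER_DEFAULT,
                                   D3DDEVTYPE_HAL,
                                   D3DFMT_X8R8G8B8,  // current display format
                                   D3DFMT_A8R8G8B8,  // requested back buffer format
                                   TRUE );           // windowed
if( FAILED(hr) )
    OutputDebugString( "A8R8G8B8 back buffer not supported in this mode\n" );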

Why not post the SetRenderState() calls in which you use D3DBLEND_DESTALPHA?

It's likely you're doing something wrong that the ATI drivers ignore but the NVIDIA drivers do not.

I'm pretty sure no video card supports an ARGB color format for the back buffer (which means no destination alpha when dealing with it), though render-target surfaces (i.e. drawing to a texture) do support an alpha channel. How are you handling this? Drawing to a texture, I hope.
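(A minimal sketch of the render-to-texture route, not from the original reply; dev, rt_tex and rt_surf are placeholder names.)

// create a 256x256 A8R8G8B8 render-target texture and make it the
// current render target (no depth buffer in this sketch)
IDirect3DTexture8* rt_tex = 0;
IDirect3DSurface8* rt_surf = 0;
HRESULT hr = dev->CreateTexture( 256, 256, 1, D3DUSAGE_RENDERTARGET,
                                 D3DFMT_A8R8G8B8, D3DPOOL_DEFAULT, &rt_tex );
if( SUCCEEDED(hr) && SUCCEEDED(rt_tex->GetSurfaceLevel( 0, &rt_surf )) )
{
    dev->SetRenderTarget( rt_surf, NULL );
    // ... passes that read D3DBLEND_DESTALPHA go here ...
    rt_surf->Release();
}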

Try it with the DX8 reference rasterizer (a software device designed for testing effects, so you can determine whether it's a driver problem rather than a code problem).
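(Sketch of that: switching to the reference rasterizer is a one-line change at device creation, assuming the same d3dpp and hWnd as a normal setup; device is a placeholder.)

// create the device with D3DDEVTYPE_REF instead of D3DDEVTYPE_HAL;
// the reference rasterizer requires software vertex processing
hr = d3d->CreateDevice( D3DADAPTER_DEFAULT,
                        D3DDEVTYPE_REF,
                        hWnd,
                        D3DCREATE_SOFTWARE_VERTEXPROCESSING,
                        &d3dpp,
                        &device );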

Destination alpha has been supported since the GeForce2.

And the ATI card probably had it too, so ATI could do proper 2D stuff, but I'm not sure.

Thanks for the reference rasterizer idea! The problem is reproduced by the reference rasterizer, so I guess I can rule out a hardware problem with the new card.

Here is one of the state setups. In this example, I've set the first pass to all black and the second to all white, except that the second pass takes the dest alpha and modulates it with the current pass's color component. The result should be black, since 1.0 * 0.0 = 0.0, but it comes out white. Similar results occur everywhere else dest alpha is used.


          
D3DCOLOR dir_color = leMath::colorToARGB( 0.f, 0.f, 0.f, 0.f );
hr = dx_stuff.d3dd->SetRenderState( D3DRS_TEXTUREFACTOR, dir_color );
if(FAILED(hr)) return false;

// set falloff texture to modulate color against

hr = dx_stuff.d3dd->SetTexture( 0, shader.getTexture( leSHADERTEXTYPE_FALLOFF ).tex );
if(FAILED(hr)) return false;

// set transform for falloff texture and dir light

hr = dx_stuff.d3dd->SetTransform( D3DTS_TEXTURE0, (D3DMATRIX*) &scene.light.matr );
if(FAILED(hr)) return false;

// clamp addressing for the falloff lookup

hr = dx_stuff.d3dd->SetTextureStageState( 0, D3DTSS_ADDRESSU, D3DTADDRESS_CLAMP );
if(FAILED(hr)) return false;
hr = dx_stuff.d3dd->SetTextureStageState( 0, D3DTSS_ADDRESSV, D3DTADDRESS_CLAMP );
if(FAILED(hr)) return false;
hr = dx_stuff.d3dd->SetTextureStageState( 0, D3DTSS_TEXTURETRANSFORMFLAGS, D3DTTFF_COUNT2 );
if(FAILED(hr)) return false;

// generate texture coordinates from camera-space normals

hr = dx_stuff.d3dd->SetTextureStageState( 0, D3DTSS_TEXCOORDINDEX, 1 | D3DTSS_TCI_CAMERASPACENORMAL );
if(FAILED(hr)) return false;

// set first stage to modulate falloff with tfactor

hr = dx_stuff.d3dd->SetTextureStageState( 0, D3DTSS_COLORARG1, D3DTA_TEXTURE );
if(FAILED(hr)) return false;
hr = dx_stuff.d3dd->SetTextureStageState( 0, D3DTSS_COLORARG2, D3DTA_TFACTOR );
if(FAILED(hr)) return false;
hr = dx_stuff.d3dd->SetTextureStageState( 0, D3DTSS_COLOROP, D3DTOP_MODULATE );
if(FAILED(hr)) return false;
hr = dx_stuff.d3dd->SetTextureStageState( 0, D3DTSS_ALPHAARG1, D3DTA_TEXTURE );
if(FAILED(hr)) return false;
hr = dx_stuff.d3dd->SetTextureStageState( 0, D3DTSS_ALPHAARG2, D3DTA_TFACTOR );
if(FAILED(hr)) return false;
hr = dx_stuff.d3dd->SetTextureStageState( 0, D3DTSS_ALPHAOP, D3DTOP_MODULATE );
if(FAILED(hr)) return false;

// disable second stage so it doesn't mess up anything.

hr = dx_stuff.d3dd->SetTextureStageState( 1, D3DTSS_COLOROP, D3DTOP_DISABLE );
if(FAILED(hr)) return false;

// make sure blending is off

hr = dx_stuff.d3dd->SetRenderState( D3DRS_ALPHABLENDENABLE, FALSE );
if(FAILED(hr)) return false;

// render the reqs...

if(!renderVerts( leENTTYPE_STATICOBJ, batch, vb, dx_stuff )) return false;



// set the second pass color (white) as tfactor

D3DCOLOR amb_color = leMath::colorToARGB( 1.f, 1.f, 1.f, 1.f );
hr = dx_stuff.d3dd->SetRenderState( D3DRS_TEXTUREFACTOR, amb_color );
if(FAILED(hr)) return false;

// Identity texture

hr = dx_stuff.d3dd->SetTexture( 0, scene.i_tex.tex );
if(FAILED(hr)) return false;

// set blending so the source color is modulated by the dest alpha
// (final = src_color * dest_alpha + dest_color)

hr = dx_stuff.d3dd->SetRenderState( D3DRS_ALPHABLENDENABLE, TRUE );
if(FAILED(hr)) return false;
hr = dx_stuff.d3dd->SetRenderState( D3DRS_DESTBLEND, D3DBLEND_ONE );
if(FAILED(hr)) return false;
hr = dx_stuff.d3dd->SetRenderState( D3DRS_SRCBLEND, D3DBLEND_DESTALPHA );
if(FAILED(hr)) return false;

// render reqs

if(!renderVerts( leENTTYPE_STATICOBJ, batch, scene, vb, dx_stuff )) return false;
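(Working through the intended math from the code above: pass 1 has blending disabled, so it writes color = falloff * tfactor = 0 and dest alpha = src alpha = 0; pass 2 then computes final = src_color * dest_alpha + dest_color * 1 = 1.0 * 0 + 0 = 0. The expected output is black, so a uniform white result means dest alpha is being read as 1.0.)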



quote: I'm pretty sure no video card supports an ARGB color format for the back buffer (which means no destination alpha when dealing with it), though render-target surfaces (i.e. drawing to a texture) do support an alpha channel. How are you handling this? Drawing to a texture, I hope.

I'm not rendering to a texture, but to the back buffer. Here is my setup code, if that helps:
  



// setup for windowed or fullscreen mode

if(windowed)
{
    // make sure the desktop is 32 bits-- needed for display!!

    if( (d3ddm.Format != D3DFMT_A8R8G8B8) && (d3ddm.Format != D3DFMT_X8R8G8B8) )
        return leERR( "Desktop needs to run in 32bit mode to use windowed mode" );

    d3dpp.Windowed = TRUE;

    // Is this okay, or do we have to use Format (I need the alpha!!!!)

    d3dpp.BackBufferFormat = D3DFMT_A8R8G8B8;//d3ddm.Format;
}
else
{
    d3dpp.Windowed = FALSE;
    d3dpp.BackBufferFormat = D3DFMT_A8R8G8B8;
}

// set dest buffer size

d3dpp.BackBufferWidth = xres;
d3dpp.BackBufferHeight = yres;

// set the back buffer count

d3dpp.BackBufferCount = 1;

// set the swap effect to discard, whatever that is

d3dpp.SwapEffect = D3DSWAPEFFECT_DISCARD;

// set the back buffer format
// (note: this overwrites the D3DFMT_A8R8G8B8 chosen above with the
// desktop format, which on a normal desktop is X8R8G8B8 -- no alpha)

d3dpp.BackBufferFormat = d3ddm.Format;

// setup for using the best stencils available.

d3dpp.EnableAutoDepthStencil = TRUE;
d3dpp.AutoDepthStencilFormat = D3DFMT_D24S8;

// try to create the device

hr = dx_stuff.d3d->CreateDevice( D3DADAPTER_DEFAULT,
                                 D3DDEVTYPE_HAL,
                                 hWnd,
                                 D3DCREATE_HARDWARE_VERTEXPROCESSING,
                                 &d3dpp,
                                 &dx_stuff.d3dd );
if(FAILED(hr)) return false;



This code does not return any errors on the NVIDIA card. The documentation is somewhat fuzzy, but I assume CreateDevice fails if the requested format is not supported.
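(One way to remove the guesswork, sketched here rather than taken from the post: ask the created device what back buffer format it actually gave you.)

// verify the format the runtime actually created;
// dx_stuff.d3dd is the device pointer from the setup code above
IDirect3DSurface8* bb = 0;
D3DSURFACE_DESC desc;
hr = dx_stuff.d3dd->GetBackBuffer( 0, D3DBACKBUFFER_TYPE_MONO, &bb );
if( SUCCEEDED(hr) )
{
    bb->GetDesc( &desc );
    if( desc.Format != D3DFMT_A8R8G8B8 )
        OutputDebugString( "back buffer has no alpha channel!\n" );
    bb->Release();
}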


quote: Destination alpha has been supported since the GeForce2.

The GeForce256 had destination alpha reads of some sort too, I think. At any rate, an inspection of the HAL caps shows that the GF4MX supports D3DBLEND_DESTALPHA for both the source and destination blend factors. Am I getting the wrong meaning out of this?
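(For reference, a sketch of checking those blend caps programmatically, reusing the dx_stuff names from the setup code above.)

// check whether dest alpha is usable as a source or destination
// blend factor on this device
D3DCAPS8 caps;
hr = dx_stuff.d3dd->GetDeviceCaps( &caps );
if( SUCCEEDED(hr) )
{
    if( !(caps.SrcBlendCaps  & D3DPBLENDCAPS_DESTALPHA) ||
        !(caps.DestBlendCaps & D3DPBLENDCAPS_DESTALPHA) )
        OutputDebugString( "D3DBLEND_DESTALPHA not supported\n" );
}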

Thanks for the replies.

[edited by - SH on May 31, 2002 8:54:07 AM]

