QuadMV

ATI issue - Help with .x loading and texturing


QuadMV    168
Hi all,

It seems to be an ATI problem. I'm experiencing a frustrating issue: I'm loading and rendering a .x model successfully, but no texture appears. After a lot of time and numerous changes, I finally had a breakthrough yesterday. I am now testing on 5 different machines:

Machine 1: ATI Radeon 9800 Pro (my original development machine)
Machine 2: ATI Mobility Radeon 9600 (a laptop)
Machine 3: NVIDIA GeForce 2
Machine 4: NVIDIA 7800 GTX (the latest and greatest NVIDIA card; this card rocks)
Machine 5: Intel 82915 Express (onboard/integrated chipset)

What I've recently discovered is that my texturing works perfectly on all but the two ATI machines, so I must be doing something that isn't compatible. I've loaded the latest drivers, and still no luck. Has anyone else experienced issues with ATI? Any suggestions on what I might need to change?

Thanks,
Quad

The only ATI difference that I know of is that ATI hardware actually pays attention to the "MinIndex" and "NumVertices" arguments of DrawIndexedPrimitive, as do REF and software vertex processing; nVidia hardware ignores these values. Incorrect values can cause missing triangles on ATI, but not a lack of textures.

nVidia hardware has an upper texture size of 4Kx4K, while ATI still has an upper limit of 2Kx2K, so large textures may fail to work on ATI.

That's about all I can think of off hand. Have you tried the debug runtimes (control panel/directx/direct3d tab, select debug runtimes radio button)? Have you tried the REF device? Both are likely to complain if your code is doing something wrong.
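To make the MinIndex/NumVertices point concrete, here is a minimal sketch (the helper name and types are my own, not from any SDK) of how to compute correct values from the slice of the index buffer a draw call will actually read:

```cpp
#include <algorithm>
#include <cassert>
#include <cstdint>
#include <vector>

// The MinIndex and NumVertices arguments that ATI, REF, and software
// vertex processing honor: the lowest vertex index referenced, and the
// span of vertices the call may touch. (Hypothetical helper.)
struct DrawRange {
    uint32_t minIndex;
    uint32_t numVertices;
};

DrawRange ComputeDrawRange(const std::vector<uint16_t>& indices) {
    assert(!indices.empty());
    auto mm = std::minmax_element(indices.begin(), indices.end());
    return { *mm.first,
             static_cast<uint32_t>(*mm.second - *mm.first + 1) };
}
```

Passing a too-small NumVertices happens to work on nVidia, which ignores the argument, but the other paths are free to skip vertices outside the declared range.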

Moe    1256
Could it possibly be that the ATi cards don't support the texture format that your texture is loaded into? I am not sure if that is even possible, but I suppose you never know. Is your texture a power of 2?
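For what it's worth, the power-of-two check is a one-liner; a standalone sketch, not tied to any D3D API:

```cpp
#include <cstdint>

// True when dim is a non-zero power of two (e.g. 256, 512, 1024):
// such values have exactly one bit set, so dim & (dim - 1) clears it.
bool IsPowerOfTwo(uint32_t dim) {
    return dim != 0 && (dim & (dim - 1)) == 0;
}
```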

It *is* possible to load a texture in an invalid format. If you specify a format explicitly, it may be used even if the card can't handle it. Also, if you make a DDS that contains an unsupported format, like D3DFMT_R8G8B8, things get unhappy. I can't remember if it was on PC or Xbox, but using the R8G8B8 format in a DDS caused a crash for us.

ATI and nVidia may also differ in what occurs when sampling with no texture present (ie: SetTexture(stage, 0)). Some cards abort processing at that point. Some cards fetch white. Other cards may do something else.

QuadMV    168
I've not tried some of the debug suggestions you mention, but I know there is no incompatibility with the image. I'm using the demo that comes with the DirectX SDK: it's tiny.x, the girl, and her texture is tiny_skin.bmp. It displays fine on ATI in the .X viewer, so I'm sure it's something I'm not setting correctly.

Maybe this will help to identify the problem. I’ve been playing around with these settings, maybe something is set wrong, maybe something needs to be turned off, or maybe I’m missing something:


SetLights ( pd3dDevice, true );
pd3dDevice->SetPixelShader(NULL);
pd3dDevice->SetRenderState(D3DRS_ALPHABLENDENABLE, FALSE);
pd3dDevice->SetTexture( 1, NULL );
pd3dDevice->SetTextureStageState( 0, D3DTSS_COLOROP, D3DTOP_MODULATE );
pd3dDevice->SetTextureStageState( 0, D3DTSS_COLORARG1, D3DTA_TEXTURE );
pd3dDevice->SetTextureStageState( 0, D3DTSS_COLORARG2, D3DTA_DIFFUSE );
pd3dDevice->SetTextureStageState(0, D3DTSS_ALPHAOP, NULL);
pd3dDevice->SetTextureStageState(0, D3DTSS_ALPHAARG1, NULL);
pd3dDevice->SetTextureStageState(1, D3DTSS_COLOROP, NULL);
pd3dDevice->SetTextureStageState(1, D3DTSS_COLORARG1, NULL);
pd3dDevice->SetTextureStageState(1, D3DTSS_ALPHAOP, NULL);
pd3dDevice->SetTextureStageState(1, D3DTSS_ALPHAARG1, NULL);
pd3dDevice->SetRenderState( D3DRS_CULLMODE, D3DCULL_CCW );
pd3dDevice->SetRenderState( D3DRS_AMBIENT, 0x33333333 );
pd3dDevice->SetRenderState( D3DRS_NORMALIZENORMALS, TRUE );
pd3dDevice->SetSamplerState(0, D3DSAMP_ADDRESSU, D3DTADDRESS_WRAP);
pd3dDevice->SetSamplerState(0, D3DSAMP_ADDRESSV, D3DTADDRESS_WRAP);
pd3dDevice->SetSamplerState( 0, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR );
pd3dDevice->SetSamplerState( 0, D3DSAMP_MINFILTER, D3DTEXF_LINEAR );
pd3dDevice->SetRenderState(D3DRS_VERTEXBLEND, D3DVBF_DISABLE);
Player.Render(pd3dDevice, Player.GetObjectMatrixPtr(), mViewClipping);



If anyone thinks of anything, I'd appreciate the input, no matter how far-fetched you might think it is.

Thanks

Quad

[quote]Original post by QuadMV

pd3dDevice->SetTextureStageState(0, D3DTSS_ALPHAOP, NULL);
pd3dDevice->SetTextureStageState(0, D3DTSS_ALPHAARG1, NULL);
pd3dDevice->SetTextureStageState(1, D3DTSS_COLOROP, NULL);
pd3dDevice->SetTextureStageState(1, D3DTSS_COLORARG1, NULL);
pd3dDevice->SetTextureStageState(1, D3DTSS_ALPHAOP, NULL);
pd3dDevice->SetTextureStageState(1, D3DTSS_ALPHAARG1, NULL);


[/quote]
NULL isn't a valid option for these. An *OP of D3DTOP_DISABLE is kind of a NULL equivalent.
Since stage 0 has a COLOROP, it MUST have an ALPHAOP, even if you don't care what happens with alpha.

D3DTSS_ALPHAOP, D3DTOP_SELECTARG1
D3DTSS_ALPHAARG1, D3DTA_DIFFUSE

This is a good "I don't really care" setting you can always use.

As for stage 1, set both the color and alpha ops to D3DTOP_DISABLE. You don't need to set the arg1 for either color or alpha as they are disabled. If you do set them, set them to any valid D3DTA_* value.

COLOROP and ALPHAOP = D3DTOP_* (D3DTextureOPeration)
COLORARG1, COLORARG2, COLORARG0, ALPHAARG1, ALPHAARG2, ALPHAARG0, RESULTARG = D3DTA_* (D3DTextureArgument)

So, *OP gets D3DTOP_*, and *ARGn, get D3DTA_*.
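To spell out why NULL misbehaves here: the enum values below are mirrored from the DirectX 9 SDK's d3d9types.h (worth double-checking against your own headers), and D3DTOP_DISABLE is 1, not 0, so NULL (0) lands outside the valid *OP range entirely. The validator function is my own sketch:

```cpp
#include <cassert>

// Texture-op values mirrored from d3d9types.h (D3DTEXTUREOP); verify
// against your SDK headers. Note that "disabled" is 1, not 0.
enum TextureOp {
    D3DTOP_DISABLE    = 1,  // stage off; NOT zero, so NULL != disabled
    D3DTOP_SELECTARG1 = 2,
    D3DTOP_SELECTARG2 = 3,
    D3DTOP_MODULATE   = 4,
    // ... further ops omitted for brevity
};

// NULL (0) falls below the start of the enum, so it is no texture op at
// all; some drivers tolerate it, but others (ATI here) reject the stage.
bool IsValidTextureOp(int op) {
    return op >= D3DTOP_DISABLE;  // lower bound only, for illustration
}
```

Setting a stage's COLOROP to D3DTOP_DISABLE also disables all higher stages, which is why disabling stage 1 is enough here.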

QuadMV    168
YOU THE MAAAAAN !!!!

I can't believe that was it. Either changing the NULL setting or setting the alpha op did the trick; I don't know which, and I'll play with it more later, but that did it. I knew it had to be something dumb, and I was right. I don't usually use NULL for that; it must have been one of those middle-of-the-night things, and it never occurred to me that it might be a problem. I guess I assumed D3DTOP_DISABLE was a zero value, and that NULL would therefore be equivalent, but apparently not.

Thanks to all for the suggestions.

Quad

