grosvenor

DrawIndexedPrimitive & ATI driver


I have a strange problem with my DirectX9 engine and ATI drivers - and no idea how to track this further:

After I installed the latest ATI driver on my computer, DirectX DrawIndexedPrimitive calls start to omit some triangles while rendering. This behaviour starts exactly after 512 calls to DrawIndexedPrimitive. I have run under maximum debug settings with the debug DirectX runtime; nothing conspicuous is reported. This only happens with the latest ATI drivers.

I'm using my own 3D game engine, which has already been used in a few commercial releases: DirectX 9, shader model 3.0. The system is Windows 7 64-bit, an HP 8510p notebook with an ATI HD 2600 Mobility GPU.

The code works fine with the Windows 7 stock drivers and the original HP drivers from 2009, but not with the current ATI drivers. So far I said, well, forget the ATI drivers. However, yesterday I installed the indie game "Starfarer", which does not even get the device up at all with the Windows 7 or HP driver, but does with the ATI one.

It would really suck if I had to tell my eventual customers: "you might not be able to play your other games, but my game will work fine if you roll back your GPU driver."

So, has anybody had similar experiences? Are there known issues with DrawIndexedPrimitive and ATI drivers? Any ideas what to try?

- Matt

Are you rendering on a second thread?
Are you doing redundant-state checking on your end?
Why not some screenshots of this omitting of triangles?

What does PIX say you are passing to the call? Is it not the correct number of triangles for the primitive type (triangle list/triangle strip)?
Are the correct vertex buffers bound?
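As a sanity check on the primitive-count question: the count handed to DrawIndexedPrimitive has to match the index count for the chosen primitive type. A minimal helper sketch (the function is illustrative, not from the engine in question):

```cpp
#include <cassert>

// Expected primitive count for a given index count, per the D3D9 rules:
// a triangle list consumes 3 indices per triangle; a triangle strip
// consumes 2 setup indices plus 1 per additional triangle.
enum PrimitiveType { TriangleList, TriangleStrip };

int ExpectedPrimitiveCount(PrimitiveType type, int indexCount)
{
    switch (type)
    {
    case TriangleList:
        assert(indexCount % 3 == 0 && "index count must be a multiple of 3");
        return indexCount / 3;
    case TriangleStrip:
        assert(indexCount >= 3 && "a strip needs at least 3 indices");
        return indexCount - 2;
    }
    return 0;
}
```

If the PrimCount argument is smaller than this, the tail of the mesh silently goes missing, which can look exactly like dropped triangles.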

It is unlikely an ATI driver bug. More likely they have made fixes which have exposed some error on your side, and you should be glad for this chance to find and fix it.


L. Spiro

Hi L. Spiro,

[quote="L. Spiro"]Are you rendering on second thread?[/quote]
It's the main thread.

[quote="L. Spiro"]Are you doing redundant-state checking?[/quote]
Uhh... sounds pretty general. Anything specific I should be looking for?

[quote="L. Spiro"]Why not some screenshots of this omitting of triangles?[/quote]
Right, why not...
[attachment=9899:missing_triangles.jpg]

So, this is where the buffers are bound:

VERIFY(Get3DDevice()->SetFVF( m_VertexFormat.GetVertexFormatID() ));
VERIFY(Get3DDevice()->SetStreamSource(0, m_VertexBuffer.m_D3D9VertexBuffer, 0, m_VertexFormat.GetVertexSize()));
VERIFY(Get3DDevice()->SetIndices(m_IndexBuffer.m_D3D9IndexBuffer));


... and this is the render call

//actual rendering:
assert(pEffect);
if (pEffect)
{
    unsigned int numPasses = 0;
    VERIFY(pEffect->Begin(&numPasses, 0));
    for (unsigned int pass = 0; pass < numPasses; ++pass)
    {
        VERIFY(pEffect->BeginPass(pass));
        HRESULT hr = Get3DDevice()->DrawIndexedPrimitive(D3DPT_TRIANGLELIST, 0, 0, m_VertexBuffer.m_NumVertices, index0, numIndices/3);
        if (hr != D3D_OK)
        {
            sys::Output(DXGetErrorString(hr));
            sys::Output(DXGetErrorDescription(hr));
            assert(false);
        }
        VERIFY(pEffect->EndPass());
    }
    VERIFY(pEffect->End());
}


Btw, it does not matter which model I use; the same thing happens with a tessellated sphere. It happens in-game and it happens in the model viewer.

[quote="L. Spiro"]What does PIX say you are passing to the call? Is it not the correct number of triangles for the primitive type (triangle list/triangle strip)?[/quote]
148 <0x064A5970> IDirect3DDevice9::DrawIndexedPrimitive(D3DPT_TRIANGLELIST, 0, 0, 4688, 0, 2612) 66862329856

I can even hardcode these values as the parameters of the call; the missing-triangles bug still does not appear before the 512th frame.
To be honest, this is the first time I have used PIX, now that you mention it.

[quote="L. Spiro"]Are the correct vertex buffers bound?[/quote]
Probably. When I launch the model viewer, there is only one vertex buffer and one index buffer that get set at all, so there is not much chance of mixing them up, I guess. From the pictures I would rather suspect the index buffer to be the problem, but that's no definite conclusion, because the polygons don't share vertices.
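One way to rule the index buffer in or out would be a quick range check over its contents (a sketch; the helper name and the 16-bit index assumption are mine, and with a real D3D9 index buffer you would Lock() it read-only first and pass the mapped pointer):

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>

// Scan an index array for indices that fall outside the vertex range.
// An out-of-range index makes the GPU fetch garbage vertices, which can
// plausibly show up as missing or degenerate triangles.
bool IndicesInRange(const uint16_t* indices, size_t indexCount,
                    size_t vertexCount)
{
    for (size_t i = 0; i < indexCount; ++i)
    {
        if (indices[i] >= vertexCount)
            return false; // this index points past the vertex buffer
    }
    return true;
}
```

If this passes on the buffer contents, the index data itself is at least well-formed, and the suspicion moves back to the draw-call parameters or the driver.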

[quote="L. Spiro"]It is unlikely an ATI driver bug. More likely they have made fixes which have exposed some error on your side, and you should be glad for this chance to find and fix it.[/quote]
That's right. But I am really curious. This one would really have been lurking for a long time.

Thanks for helping,

Matt

Edit: the VERIFY doesn't do anything. It's only an empty macro in this configuration.

I'd suggest trying the debug runtimes, available from the DirectX control panel in your Start menu. You may have some bad parameters going in which affect things with one driver but which the other is more forgiving of (software vs. hardware vertex processing can sometimes trigger this too), and the debug runtimes will identify this pretty quickly for you.


[quote="grosvenor"]Edit: the VERIFY doesn't do anything. It's only an empty macro in this configuration.[/quote]

When you say 'empty macro', what do you mean precisely? As in 'show us'...

Empty macro, as in

#define VERIFY(x) (x)


(but you're right to ask for clarification; it could also have been something like

#define VERIFY(x) ()

)
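For completeness, a VERIFY that checks the HRESULT in debug builds but keeps evaluating the expression in release could look like this (a sketch; the HRESULT/SUCCEEDED stand-ins replace the real <windows.h> definitions only to keep the snippet self-contained):

```cpp
#include <cassert>

// Stand-ins for the Windows definitions, so the sketch is self-contained.
typedef long HRESULT;
#define SUCCEEDED(hr) ((HRESULT)(hr) >= 0)

// Debug builds assert on a failed HRESULT; release builds still evaluate
// the expression (unlike `#define VERIFY(x) ()`, which would swallow the
// call entirely and silently skip every wrapped API call in release).
#ifdef _DEBUG
#define VERIFY(x) assert(SUCCEEDED(x))
#else
#define VERIFY(x) (x)
#endif

// Example: the wrapped call's side effects must survive in both builds.
HRESULT SetFlagAndSucceed(bool& flag)
{
    flag = true;
    return 0; // S_OK
}
```

The point is that the macro may discard the result, but must never discard the call itself.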
@mhagain:
Using the debug runtime, with all validation cranked up to maximum: the only things reported are warnings about redundant SetRenderState and SetTextureStageState calls.
So this is probably what L. Spiro meant by "Are you doing redundant-state checking on your end?".
The answer is no; I just set all required states every frame.

- Matt

Wow, a -1. That is what I get for helping.


[quote="grosvenor"]Uhh... sounds pretty general. Anything specific I should be looking for?[/quote]

If depth testing is disabled, it does not make sense to disable depth testing again.
Frankly, while this is just an example, the general idea of not setting the same state twice is fairly important for the performance of any game. This applies to DirectX as much as to OpenGL. If the last texture-wrap mode was CLAMP, don't call a DirectX or OpenGL function to set the same state to the same value.

While this optimization is worthwhile, I mention it because it can also cause problems if you are using multiple threads and do not have proper synchronization in place. If you are using iOS or Android, remember that OpenGL ES 2 allows you to share resources between contexts, but not state. I am aware that you are not using OpenGL or OpenGL ES 2, but the same idea applies when managing DirectX resources. And threading issues are the only issues I have had that cause the problems you describe.
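The redundant-state idea can be sketched as a thin caching wrapper around the device (the class and the call counter below are illustrative, not real D3D API):

```cpp
#include <cassert>
#include <unordered_map>

// Caches the last value passed for each render state and forwards to the
// device only when the value actually changes. The same pattern applies
// to SetTextureStageState, SetSamplerState, and so on.
class RenderStateCache
{
public:
    // In a real engine this would call IDirect3DDevice9::SetRenderState;
    // here we just count forwarded calls to make the behaviour visible.
    void SetRenderState(unsigned state, unsigned value)
    {
        auto it = m_last.find(state);
        if (it != m_last.end() && it->second == value)
            return; // redundant: same state, same value -- skip the call
        m_last[state] = value;
        ++m_deviceCalls;
    }

    unsigned DeviceCalls() const { return m_deviceCalls; }

private:
    std::unordered_map<unsigned, unsigned> m_last;
    unsigned m_deviceCalls = 0;
};
```

Setting the same state to the same value twice in a row then costs only a hash lookup instead of a driver call.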


Basically, as far as I am concerned, I have the buggiest graphics drivers in the world. My dual ATI Radeon HD 5870's in CrossFire result in about 10 or 11 reboots daily, yet I am still able to develop games in DirectX 11. Recently I had faulty shader values that caused my graphics cards to crash, but that was my fault. Now that I have fixed my own problems, I have very complex cbuffers working fine.

Basically, GeForce cards will not crash when you create broken shaders, but ATI cards will. However, a valid shader of any complexity is valid on any ATI card and will not crash. You are doing something wrong on your end, and if your answers to my questions are all reasonable, then you need to post your shader.


L. Spiro


[quote="L. Spiro"]Wow, a -1. That is what I get for helping.[/quote]


For the record the -1 didn't come from someone posting in this thread; I voted it up to cancel it out.

I am aware that there is a performance impact from redundant SetRenderState calls and so on.
But nothing beats "good enough", so as long as it doesn't hurt the driver, I won't change it. If I ever run into performance issues I might look into this aspect again, but not now.

So here is a simplified shader, that shows the same behavior (tris disappearing after 512 frames):

float4 PS_ConstantColor(VS_OUTPUT pixel) : COLOR
{
    float4 col;
    col.rgba = Color;
    return col;
}

VS_OUTPUT VS_NoLight( float4 position : POSITION0,
                      float3 normal   : NORMAL0,
                      float3 texCoord : TEXCOORD0
                    )
{
    // calculate the pos/normal using the "normal" weights
    // and accumulate the weights to calculate the last weight
    float3 skinPosition = float3(0.0f, 0.0f, 0.0f);
    float3 skinNormal   = float3(0.0f, 0.0f, 0.0f);

    skinPosition = mul(float4(position.xyz, 1.0), World).xyz;
    skinNormal   = mul(float4(normal.xyz, 0.0), World).xyz;
    skinNormal   = normalize(skinNormal); // normalize normal
    float3 diffuse = Ambient;

    // transform position from world space into view and then projection space
    VS_OUTPUT pixel;
    pixel.Position = mul(float4(skinPosition.xyz, 1.0f), ViewProjection);
    pixel.Diffuse  = float4(diffuse, 1);
    pixel.TexCoord = texCoord;
    pixel.CamView  = normalize(skinPosition - CameraPosition);
    pixel.Normal   = skinNormal;
    return pixel;
}


technique cc_shader
{
    pass p0
    {
        VertexShader = compile vs_3_0 VS_NoLight();
        PixelShader  = compile ps_3_0 PS_ConstantColor();
    }
}


Another bit of information: the bug does not appear with the reference rasterizer.

1) I did another test: I ran Dragon Age: Origins. Result: the same artifacts occur as in my engine. Not so with the old HP driver.

2) AMD/ATI says you should use a tool provided by them to determine whether your specific notebook is compatible with the Catalyst Mobility driver. Unfortunately, that site is currently unreachable. Internet rumour says to stick to the notebook manufacturer's GPU drivers. These are from 2009, but both my game and DA:O run without problems with them.
Also, using the ATI driver I experienced a minor but unprecedented issue with power management, which might be another clue pointing at compatibility problems.

So, for now I am going to conclude that it is the driver after all. L. Spiro's advice to take the opportunity to find flaws in the engine is a good one. But all considered, I would say the chances are minimal that there is something insightful to find here, unless somebody else has encountered the same problem and found a solution to it.

So, thanks for your help. Cu,

- Matt
