32 Bit Index Buffer doesn't work

Started by W
8 comments, last by W 22 years, 1 month ago
Hello,

I encountered major problems using 32-bit index buffers.

The problem is:
If I use 32-bit indices (with a triangle list), only those triangles are rendered that have (at least) one vertex out of the visible area.

My investigation:
I've just taken the tutorial example from my DirectX 8.0 SDK (the one that renders a rotating triangle) and indexed it - everything worked fine with 16-bit indices, but when I rebuilt it with 32-bit indices you simply see nothing. When I made the triangle larger in size, I could suddenly see the triangle, but only if one vertex was outside the clipping borders.

I've already checked the maximum allowed index size in the CAPS structure and it IS 2^32!!!

So please, could you help me?

Regards and thanks,
WP
Currently existing TnL cards only support 16-bit indices. This is because of the internal vertex cache, and because the best size for a vertex buffer is 64 KB (source: NVIDIA doc), and a 64 KB buffer cannot hold more than 16 bits' worth of vertices, so you have to split your vertex and index buffers down to 16-bit indices.
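
For illustration, a minimal sketch of what drawing in 16-bit-indexed batches could look like under DX8 (the batches array, its fields and the CUSTOMVERTEX type are placeholders, not code from this thread):

  // Hypothetical batching sketch: each batch references at most 0x10000
  // vertices, so a 16-bit index buffer is always sufficient.
  for (DWORD b = 0; b < dwNumBatches; ++b)
  {
      pDevice->SetStreamSource(0, batches[b].pVB, sizeof(CUSTOMVERTEX));
      pDevice->SetIndices(batches[b].pIB16, 0);          // D3DFMT_INDEX16 buffer
      pDevice->DrawIndexedPrimitive(D3DPT_TRIANGLELIST,
                                    0, batches[b].dwNumVertices,
                                    0, batches[b].dwNumTriangles);
  }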

rapso->greets();
And you don't create a 32-bit index buffer and accidentally fill it with 16-bit indices?

quote:Original post by W
Hello,

I encountered major problems using 32-bit index buffers.

The problem is:
If I use 32-bit indices (with a triangle list), only those triangles are rendered that have (at least) one vertex out of the visible area.

My investigation:
I've just taken the tutorial example from my DirectX 8.0 SDK (the one that renders a rotating triangle) and indexed it - everything worked fine with 16-bit indices, but when I rebuilt it with 32-bit indices you simply see nothing. When I made the triangle larger in size, I could suddenly see the triangle, but only if one vertex was outside the clipping borders.

I've already checked the maximum allowed index size in the CAPS structure and it IS 2^32!!!

So please, could you help me?

Regards and thanks,
WP


Well, I've checked twice and I DO fill the IB with 32-bit indices!

Otherwise I'd definitely get an error message, because the first 32-bit index would be much too high.

For testing purposes I have reduced the number of indices to 3 (one single triangle) and the number of vertices to 3:

  #define D3DFVF_CUSTOMVERTEX (D3DFVF_XYZ | D3DFVF_DIFFUSE)

  typedef struct {...} CUSTOMVERTEX;

  CUSTOMVERTEX lpVertices [] =
  {
    //    x      y     z         clr
    { -1.0f, -1.0f, 0.0f, 0xFFFF0000, },
    {  1.0f, -1.0f, 0.0f, 0xFF0000FF, },
    {  0.0f,  1.0f, 0.0f, 0xFFFFFFFF, },
  };
  DWORD dwNumVertices = sizeof(lpVertices) / sizeof(CUSTOMVERTEX);

  #define D3DFIF_CUSTOMINDEX (D3DFMT_INDEX32)

  typedef DWORD CUSTOMINDEX;

  CUSTOMINDEX lpIndices [] =
  {
    0, 1, 2,
  };
  DWORD dwNumIndices = sizeof(lpIndices) / sizeof(CUSTOMINDEX);


(I've just named it CUSTOMINDEX in analogy to CUSTOMVERTEX.)

... and it still doesn't work,
but if I set (for example) the y of vertex 3 to, let's say, 10 (which depends on the view matrix I use), the triangle becomes visible.
Furthermore, there is no debug output and there are no error messages.

... and it works perfectly with 16-bit indices.

(I use an ATI Xpert 2000 32MB, latest driver available, DirectX 8.1)
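
For context, a trimmed sketch of the kind of creation and fill code involved (pDevice is assumed to be the usual IDirect3DDevice8*; error checking omitted):

  // Sketch: create a 32-bit index buffer, copy lpIndices into it, then draw.
  IDirect3DIndexBuffer8* pIB = NULL;
  pDevice->CreateIndexBuffer(dwNumIndices * sizeof(CUSTOMINDEX),
                             D3DUSAGE_WRITEONLY, D3DFMT_INDEX32,
                             D3DPOOL_DEFAULT, &pIB);

  BYTE* pData = NULL;
  pIB->Lock(0, dwNumIndices * sizeof(CUSTOMINDEX), &pData, 0);
  memcpy(pData, lpIndices, dwNumIndices * sizeof(CUSTOMINDEX));
  pIB->Unlock();

  // at render time (stream source and vertex shader set up elsewhere):
  pDevice->SetIndices(pIB, 0);                        // base vertex index 0
  pDevice->DrawIndexedPrimitive(D3DPT_TRIANGLELIST,
                                0, dwNumVertices,     // min index, vertex count
                                0, dwNumIndices / 3); // start index, triangle count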


I'll rephrase what rapso wrote:

Make sure the hardware/driver combination you're trying to run on actually supports 32-bit index buffers. Many don't.

You should check the MaxVertexIndex member of D3DCAPS8 before you try to use 32-bit indices. From the DX8.1 docs:
quote:
MaxVertexIndex
Maximum size of indices supported for hardware vertex processing. It is possible to create 32-bit index buffers by specifying D3DFMT_INDEX32; however, you will not be able to render with the index buffer unless this value is greater than 0x0000FFFF.
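
In code, that check might look roughly like this (a minimal sketch, assuming pD3D is the IDirect3D8* returned by Direct3DCreate8):

  // Query the HAL caps and only use D3DFMT_INDEX32 when it is renderable.
  D3DCAPS8 caps;
  if (SUCCEEDED(pD3D->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
  {
      D3DFORMAT ibFormat = D3DFMT_INDEX16;        // safe default
      if (caps.MaxVertexIndex > 0x0000FFFF)
          ibFormat = D3DFMT_INDEX32;              // 32-bit indices can be rendered
      // ... create the index buffer with ibFormat ...
  }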




quote:
I've already checked the maximum allowed index size in the CAPS structure and it IS 2^32 !!!


(correctly: 2^32 - 1 = 0xFFFFFFFF)

It would be much too easy if that were the problem.

If anyone has a known-working 32-bit indexed sample or something like that, I could try whether it works on my card or not, but I'm starting to believe that it's not the card's fault.
I've passed my program around and it doesn't work anywhere.

I have no idea what could be causing this.

Doh, sorry! I didn't read your original post carefully enough!


Looking more deeply this time ;-) - the ATI Xpert 2000 is based on the ATI Rage128 chip.

That's not a T&L chip, so any vertex processing caps (and the index caps) are going to reflect what is possible using the software vertex processing pipe.

That chip does have hardware triangle setup (i.e. it takes indices, references vertices from them to produce triangles).

Since it isn't a T&L chip or newer, it's **very** unlikely the chip itself actually supports 32-bit indices, so unless the driver does some translation before sending them to the card you'll have problems. (As an example, the GeForce2 Ultra doesn't support 32-bit indices.)

A couple of questions:

1. Are you checking the caps using IDirect3D8::GetDeviceCaps or IDirect3DDevice8::GetDeviceCaps? There may be a difference - the first gives you a truer picture of the hardware, since the second will give you the software caps if software vertex processing is selected!

2. What is the MaxStreams cap set to? If it's 0, then the driver doesn't understand DX8 calls or caps, and so values you read, such as MaxVertexIndex, can actually be invalid! (A quick sketch of both checks follows below.)
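
For reference, a minimal sketch of both checks (pD3D and pDevice are assumed to be the usual IDirect3D8* and IDirect3DDevice8* pointers):

  // Compare the raw hardware caps with the caps of the created device.
  D3DCAPS8 halCaps, devCaps;
  pD3D->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &halCaps);
  pDevice->GetDeviceCaps(&devCaps);

  // If software vertex processing is selected, devCaps reflects the software
  // pipe; halCaps is closer to what the chip/driver really supports.
  DWORD hwMaxIndex = halCaps.MaxVertexIndex;
  DWORD hwStreams  = halCaps.MaxStreams;    // 0 would indicate a pre-DX8 driver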


--
Simon O'Connor
Creative Asylum Ltd
www.creative-asylum.com


Thanks for your advice!

Here's what I found out:

quote:2. What is the MaxStreams cap set to? If it's 0, then the driver doesn't understand DX8 calls or caps, and so values you read, such as MaxVertexIndex, can actually be invalid!

using IDirect3D8::GetDeviceCaps I get
MaxStreams = 0x1 at HAL
and
MaxStreams = 0x10 at REF
IDirect3DDevice8::GetDeviceCaps produces
MaxStreams = 0x10
quote:1. Are you checking the caps using IDirect3D8::GetDeviceCaps or IDirect3DDevice8::GetDeviceCaps? There may be a difference - the first gives you a truer picture of the hardware, since the second will give you the software caps if software vertex processing is selected!

using IDirect3D8::GetDeviceCaps I get
MaxVertexIndex = 0xFFFFFFFF at HAL
and
MaxVertexIndex = 0x00FFFFFF at REF
IDirect3DDevice8::GetDeviceCaps produces
MaxVertexIndex = 0xFFFFFFFF

So 32-bit indices should be possible, although (in the worst case) I can only use indices up to 0x00FFFFFF.
But I'm only using three triangles and it still doesn't work.

It seems to me that DirectX (or my graphics driver) is producing some kind of mismatch between hardware and software (because "clipped" vertices are rendered properly whereas "inner" ones are not) - I originally encountered the problem when I tried to render a more complex structure, with the result that only the surrounding (clipped) triangles were drawn and the middle was left blank.

Could this be the cause?
My guess would be the driver - it's probably the same driver core used for the newer ATI chips, which (AFAIK) do support 32-bit indices - and that driver core probably isn't throwing up an error where it should (unless the indexed triangle list is being turned into a non-indexed list inside the driver). D3D tends to trust the driver for things like the caps, although it does reinterpret some of them.

Your best bet in situations like this (where the caps say it's possible, but it doesn't work on the hardware) is to talk to the developer relations people at ATI to see whether it's their fault, Microsoft's fault, or something you've done.

--
Simon O'Connor
Creative Asylum Ltd
www.creative-asylum.com


Hmmm

It really seems that the caps are somehow invalid or simply wrong, because I found out that using REF instead of HAL solves this problem - at least in principle (software rendering is not really an alternative).
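
For anyone testing the same thing, a minimal fallback sketch (assuming the usual pD3D, hWnd and D3DPRESENT_PARAMETERS d3dpp from device setup):

  // Try a HAL device first; fall back to the reference rasterizer (very slow,
  // debugging only) if HAL creation fails or lacks the needed caps.
  IDirect3DDevice8* pDevice = NULL;
  if (FAILED(pD3D->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd,
                                D3DCREATE_SOFTWARE_VERTEXPROCESSING,
                                &d3dpp, &pDevice)))
  {
      pD3D->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_REF, hWnd,
                         D3DCREATE_SOFTWARE_VERTEXPROCESSING,
                         &d3dpp, &pDevice);
  }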

Otherwise this appears to be a more general DirectX problem, because my program didn't work properly on other PCs with different graphics cards either.

This seems to be a topic for a bug report.

This topic is closed to new replies.
