32-bit index buffer bug ????

Has anyone already managed to use 32-bit index buffers? I created an index buffer with CreateIndexBuffer, passing D3DFMT_INDEX32 as the D3DFORMAT parameter, and I get strange results rendering a triangle list with DrawIndexedPrimitive. Some triangles are simply not drawn, and the others all share the same vertex, so I get a kind of triangle fan out of a triangle list primitive. It looks as if the last vertex of each group of 3 were always the same, but I checked by locking the indices again, and the 3rd index is not actually always the same.

After that little episode (say, 3 hours...), I tried with a rectangle made of 4 vertices and 6 indices, rendered as a triangle list, and I still could not get the right shape with a 32-bit index buffer. Just for fun, I replaced the 32-bit index buffer with a 16-bit one, converting my indices to 16 bits this way:
- building my 32-bit index array in a DWORD *pSrc32;
- locking the index buffer to a WORD *pSrc16;
- doing *(pSrc16++) = *(pSrc32++); over the 6 indices.

And it works well. I then replaced the rectangle with my original mesh of 500 triangles, and it works fine too! So there was no problem with my original code.

As a conclusion, there are two alternatives:
1) My GeForce2 does not support 32-bit indices, and the CreateIndexBuffer method does not fail.
2) My GeForce2 supports 32-bit indices, and DX8 32-bit index buffers have never worked!

In the samples, MS never uses 32-bit index buffers, only 16-bit ones. So is there something special to do? Is there a way to test whether 32-bit index buffers are supported?
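[Editor's sketch of the 16-bit fallback described above, assuming an initialized IDirect3DDevice8* named g_pDevice; the index data (a 4-vertex quad as two triangles) is an illustrative placeholder, not taken from the original code.]

    // Minimal sketch of the 32-to-16-bit index fallback described above.
    // Assumes an initialized IDirect3DDevice8* g_pDevice; the index data
    // below is illustrative only.
    const UINT numIndices = 6;
    DWORD src32[6] = { 0, 1, 2, 2, 1, 3 };   // 32-bit source indices

    IDirect3DIndexBuffer8* pIB16 = NULL;
    if (SUCCEEDED(g_pDevice->CreateIndexBuffer(
            numIndices * sizeof(WORD),        // buffer size in bytes
            D3DUSAGE_WRITEONLY,
            D3DFMT_INDEX16,                   // 16-bit indices
            D3DPOOL_MANAGED,
            &pIB16)))
    {
        BYTE* pData = NULL;
        if (SUCCEEDED(pIB16->Lock(0, 0, &pData, 0)))
        {
            WORD*  pDst16 = (WORD*)pData;
            DWORD* pSrc32 = src32;
            for (UINT i = 0; i < numIndices; ++i)
                *(pDst16++) = (WORD)*(pSrc32++);  // narrow each index to 16 bits
            pIB16->Unlock();
        }
    }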
Let's quote the DirectX 8 docs:

" D3DCAPS8

MaxVertexIndex
Maximum size of indices supported for hardware vertex processing. It is possible to create 32-bit index buffers by specifying D3DFMT_INDEX32; however, you will not be able to render with the index buffer unless this value is greater than 0x0000FFFF. "

Yes, it means that if your card's MaxVertexIndex is only 0xFFFF, you're stuck with 16-bit indices. You should be able to check the caps with the utilities in the SDK (no need to write code for that), or query them at runtime, as sketched below.

I've had the same problem with a GeForce 1: D3D would let me create everything, with no errors, but wouldn't render the buffer correctly.
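[Editor's sketch of the runtime caps check mentioned above, again assuming an initialized IDirect3DDevice8* named g_pDevice.]

    // Minimal sketch: only use D3DFMT_INDEX32 when the hardware can
    // actually address vertices beyond 0xFFFF.
    D3DCAPS8 caps;
    if (SUCCEEDED(g_pDevice->GetDeviceCaps(&caps)))
    {
        if (caps.MaxVertexIndex > 0x0000FFFF)
        {
            // 32-bit index buffers can be rendered on this device.
        }
        else
        {
            // CreateIndexBuffer with D3DFMT_INDEX32 may still succeed,
            // but rendering will be wrong: fall back to D3DFMT_INDEX16.
        }
    }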
Thanks!

I should have checked the D3DCAPS8!
Actually, I should not have relied on the CreateIndexBuffer return value, nor on the nice method description in the DX8 SDK, which does not mention that restriction.

I've already changed my code, but at least I've understood why my rendering broke. Thanks!

Cédric

