C++/D3D: DrawIndexedPrimitiveUP()
Hello list,
We keep having problems with DrawIndexedPrimitiveUP(). With a 16-bit index buffer everything works fine, but with 32-bit indices our machine (AMD Athlon XP 1800+ / GeForce2 MX) renders only the first triangle of a triangle fan (or strip; both are wrong). We're using the DirectX 8.1 SDK.
Any tips?
Lyve
[edited by - Lyve on December 6, 2002 6:27:25 AM]
Your driver DOES support 32-bit indices? Have you checked?
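Thomas's question can be checked programmatically: under Direct3D 8 you fill a D3DCAPS8 with IDirect3DDevice8::GetDeviceCaps() and look at its MaxVertexIndex member; if it is 0xFFFF or less, the device can't address vertices beyond the 16-bit range and D3DFMT_INDEX32 buffers won't work as expected. A minimal sketch of the test logic, using a stand-in struct (`CapsSketch` is an assumption for illustration, not the real SDK type) so it compiles without the SDK:

```cpp
#include <cstdint>

// Stand-in for the one D3DCAPS8 field we care about; in real code, fill a
// D3DCAPS8 via IDirect3DDevice8::GetDeviceCaps() and read MaxVertexIndex.
struct CapsSketch {
    uint32_t MaxVertexIndex;
};

// 32-bit index buffers (D3DFMT_INDEX32) are only usable when the device can
// address vertex indices beyond the 16-bit range.
bool Supports32BitIndices(const CapsSketch& caps) {
    return caps.MaxVertexIndex > 0xFFFFu;
}
```

If this check fails on the GeForce2 MX, that would explain the broken fans and strips without any bug in the calling code.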
BTW - you do know that the UP functions are BAD :-)
Regards
Thomas Tomiczek
THONA Consulting Ltd.
(Microsoft MVP C#/.NET)
We are currently porting our OpenGL app to Direct3D, and under OpenGL we had no problems with 32-bit indices. Converting everything to 16-bit indices would kill us, because we use 32-bit indices everywhere. Yes, we know the ...UP() functions are bad, but they are the easiest way for us to port from OpenGL to D3D. We have limited time for the job, and our CPU-side math costs far more performance than the rendering does (we simulate turning, milling, and bending for industry).
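If the hardware turns out not to support 32-bit indices, a full rewrite isn't the only fallback: each draw batch can be remapped to a compact 16-bit index buffer at load time while the rest of the pipeline keeps its 32-bit indices. A sketch of such a remap (the `Batch16`/`RemapTo16Bit` names are hypothetical, not from the thread), which works as long as one batch references at most 65536 distinct vertices; larger batches would have to be split first:

```cpp
#include <cstdint>
#include <unordered_map>
#include <vector>

// Result of remapping one batch: a 16-bit index list plus the list of
// original vertex indices it refers to (for building a per-batch vertex array).
struct Batch16 {
    std::vector<uint32_t> vertexRemap; // slot i holds the original vertex index
    std::vector<uint16_t> indices;     // 16-bit indices into vertexRemap
};

Batch16 RemapTo16Bit(const std::vector<uint32_t>& indices32) {
    Batch16 out;
    std::unordered_map<uint32_t, uint16_t> seen; // original index -> new index
    for (uint32_t idx : indices32) {
        auto it = seen.find(idx);
        if (it == seen.end()) {
            // First time this vertex is referenced: assign the next 16-bit slot.
            uint16_t fresh = static_cast<uint16_t>(out.vertexRemap.size());
            seen.emplace(idx, fresh);
            out.vertexRemap.push_back(idx);
            out.indices.push_back(fresh);
        } else {
            out.indices.push_back(it->second);
        }
    }
    return out;
}
```

The remapped indices can then be handed to DrawIndexedPrimitiveUP() as D3DFMT_INDEX16, which is the path that already works on this card.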
(later) Forgot to say: we use the latest Detonator 40.xx drivers from NVIDIA; I doubt the driver doesn't support 32-bit indices.
Lyve
[edited by - Lyve on December 6, 2002 8:28:49 AM]