C++/D3D: DrawIndexedPrimitiveUP()

Started by Lyve
2 comments, last by Lyve
Hello list,

We keep having problems with DrawIndexedPrimitiveUP(). With 16-bit indices everything works fine, but with 32-bit indices our machine (AMD XP1800/GF2 MX) renders only the first triangle of a triangle fan (strips are wrong too). We're using DirectX SDK 8.1. Any tips?

Lyve

[edited by - Lyve on December 6, 2002 6:27:25 AM]
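(For context, a call of the kind described would look roughly like the sketch below; the device and buffer names are illustrative placeholders, not the actual application code.)

#include <d3d8.h>

// Illustrative sketch only: drawing a triangle fan from user-pointer data
// with 32-bit indices. 'device', 'vertices' and 'indices32' are placeholders.
void DrawFan32(IDirect3DDevice8* device,
               const void* vertices, UINT vertexCount, UINT stride,
               const DWORD* indices32, UINT indexCount)
{
    // With D3DFMT_INDEX32 this only renders correctly if the device
    // actually supports 32-bit indices (see the caps check further down).
    device->DrawIndexedPrimitiveUP(
        D3DPT_TRIANGLEFAN,
        0,                 // MinVertexIndex
        vertexCount,       // number of vertices referenced
        indexCount - 2,    // primitive count for a fan
        indices32,
        D3DFMT_INDEX32,
        vertices,
        stride);
}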
_____________________________________
http://www.winmaze.de - a 3D shoot-em-up in OpenGL: nice graphics, multiplayer, chat rooms, a nice community, worth visiting! ;)
http://www.spheretris.tk - an upcoming Tetrisphere clone for Windows: a 3D Tetris game on a sphere with powerful graphics for GeForce FX and similar graphics cards.
Your driver DOES support 32-bit indices? Checked?

BTW - you do know that the UP functions are BAD :-)
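For example, a minimal way to check, assuming an IDirect3DDevice8 pointer named device (a placeholder):

#include <d3d8.h>

// Rough sketch: query the device caps to see whether 32-bit indices are usable.
// 'device' is a placeholder for your IDirect3DDevice8 pointer.
bool Supports32BitIndices(IDirect3DDevice8* device)
{
    D3DCAPS8 caps;
    if (FAILED(device->GetDeviceCaps(&caps)))
        return false;

    // If MaxVertexIndex is no larger than 0xFFFF, the device cannot address
    // more than a 16-bit index range, so D3DFMT_INDEX32 will not work.
    return caps.MaxVertexIndex > 0x0000FFFF;
}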


Regards

Thomas Tomiczek
THONA Consulting Ltd.
(Microsoft MVP C#/.NET)
We are currently porting our OpenGL app to Direct3D, and under OpenGL we never had problems with 32-bit indices. Converting everything to 16-bit indices would kill us, because we use 32-bit indices everywhere. Yes, we know the ...UP() functions are bad, but they are the easiest way for us to port from OpenGL to Direct3D. We have limited time for the job, and our CPU-side math costs far more than the rendering does (we simulate turning, milling and bending for industrial applications).

(later) Forgot to say: we use the latest Detonator 40.xx drivers from NVIDIA, so I doubt the driver doesn't support 32-bit indices.
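(If the index values themselves all fit below 65536, one possible per-call workaround is to narrow them just before the draw call instead of changing the storage format everywhere. A rough sketch, with placeholder names:)

#include <d3d8.h>
#include <vector>

// Sketch of a fallback: narrow 32-bit indices to 16 bits right before the
// draw call. Only valid if every index value is below 0x10000.
// 'device', 'vertices' and 'indices32' are placeholders.
void DrawFanWith16BitFallback(IDirect3DDevice8* device,
                              const void* vertices, UINT vertexCount,
                              UINT stride,
                              const DWORD* indices32, UINT indexCount)
{
    std::vector<WORD> indices16(indexCount);
    for (UINT i = 0; i < indexCount; ++i)
        indices16[i] = static_cast<WORD>(indices32[i]); // assumes index < 0x10000

    device->DrawIndexedPrimitiveUP(
        D3DPT_TRIANGLEFAN,
        0,
        vertexCount,
        indexCount - 2,
        &indices16[0],
        D3DFMT_INDEX16,
        vertices,
        stride);
}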

Lyve


[edited by - Lyve on December 6, 2002 8:28:49 AM]
OK, I've checked it again and you were right: OpenGL apparently converts the indices to 16 bit before rendering while Direct3D doesn't, and that's why it didn't work. Thanks for the hint!

Lyve

