Really weird mesh drawing problem...

Recommended Posts

Hey people. I've got a D3D application that draws my own mesh file format, exported from Milkshape3D, fairly well. It draws fine on my GeForce4 and my GeForce2 GTS. However, the mesh gets completely screwed up when it is drawn on a Riva TNT or a GeForce2 Go. I have no idea why this happens. The mesh is drawn in my app using vertex and index buffers and a call to DrawPrimitive. When it is rendered on the GeForce2 Go or the TNT, all I get is a jumbled-up pile of triangles (lighting and texturing still work, though). I don't have the first idea where to begin looking to fix this problem. Any ideas? Thanx for any help.
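Roughly what the buffer setup and draw looks like, in case it helps (a simplified sketch with placeholder names; the indexed draw call is shown since the mesh uses an index buffer, and the real code pulls the buffers and counts from my mesh loader):

#include <d3d8.h>

// Placeholder vertex layout for illustration only (position + normal + one UV set).
struct MeshVertex { float x, y, z, nx, ny, nz, u, v; };
const DWORD MESH_FVF = D3DFVF_XYZ | D3DFVF_NORMAL | D3DFVF_TEX1;

// dev, vb, ib, vertexCount and triCount all come from the mesh loader.
void DrawMesh(IDirect3DDevice8* dev, IDirect3DVertexBuffer8* vb,
              IDirect3DIndexBuffer8* ib, UINT vertexCount, UINT triCount)
{
    dev->SetStreamSource(0, vb, sizeof(MeshVertex));
    dev->SetIndices(ib, 0);                    // base vertex index 0 (DX8 takes it here)
    dev->SetVertexShader(MESH_FVF);            // FVF code, not an actual shader program
    dev->DrawIndexedPrimitive(D3DPT_TRIANGLELIST,
                              0, vertexCount,  // lowest index used, number of vertices
                              0, triCount);    // start index, triangle count
}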

You probably set hardware vertex processing when creating the device, or set the buffers up with hardware vertex processing.
Go through the code and set everything to software.
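Something like this when creating the device (just a sketch, DX8-style; pD3D, hWnd and d3dpp are assumed to be set up already):

#include <d3d8.h>

// Force software vertex processing instead of hardware or mixed mode.
IDirect3DDevice8* device = NULL;
HRESULT hr = pD3D->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd,
                                D3DCREATE_SOFTWARE_VERTEXPROCESSING,
                                &d3dpp, &device);
if (FAILED(hr))
{
    // fall back or report the error
}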

T

--
MFC is sorta like the Swedish police... It's full of crap, and nothing can communicate with anything else.

I tried that. My engine detects whether the user's card supports hardware vertex processing and creates the device and vertex buffers accordingly. I even tried forcing certain settings, but it still doesn't work.

Any other ideas?

Couple thoughts -

Are you using 32-bit or 16-bit indices? A lot of cards don't support 32-bit indices, so if that's the case, try dropping down to 16-bit indices (there's a sketch at the end of this post).

How many polygons per mesh? Try to keep the vertex count under the 16-bit limit (see above).

Check debug output - there might be a problem with vertex buffer locking, filling, etc. In my last book, I had a problem with locking/unlocking the vertex buffer that caused flashing on some video hardware.
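Here's the kind of thing I mean for the index format (a rough sketch with made-up names; 'device', 'numIndices' and 'indices32' stand in for whatever your loader uses):

#include <d3d8.h>

// Check how far the card can index, and use 16-bit indices when that's enough.
D3DCAPS8 caps;
device->GetDeviceCaps(&caps);
if (caps.MaxVertexIndex <= 0xFFFF)
{
    // The card can't address more than 64K vertices per draw, so use D3DFMT_INDEX16.
    IDirect3DIndexBuffer8* ib = NULL;
    device->CreateIndexBuffer(numIndices * sizeof(WORD), D3DUSAGE_WRITEONLY,
                              D3DFMT_INDEX16, D3DPOOL_MANAGED, &ib);

    BYTE* data = NULL;
    ib->Lock(0, 0, &data, 0);                  // size 0 = lock the whole buffer
    WORD* dst = (WORD*)data;
    for (DWORD i = 0; i < numIndices; ++i)
        dst[i] = (WORD)indices32[i];           // only safe if every index fits in 16 bits
    ib->Unlock();
}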



Jim Adams
home.att.net/~rpgbook
Author, Programming Role-Playing Games with DirectX

Hey Jim... thanx a million, man! The problem WAS with the indices. I was using 32-bit indices and, like you said, some cards don't support those. But what am I supposed to do if I really need 32-bit indices? What do D3D games out there use if they have really high polygon counts?
