Banshee crash

Started by padros123
5 comments, last by padros123 21 years, 9 months ago
Hello, my DirectX 8.1 based terrain engine is working great on my machine (Windows XP, Athlon 600 with GeForce 256) and my friend's machine (Windows XP, Athlon 1000 and GeForce3 card), but I just tried running it on my girlfriend's machine and I have problems... Her machine is old (Windows 98, Pentium 233, Banshee). The engine starts and runs for a second before hanging the machine.

The thing is, the terrain looks totally distorted and wrong - there are triangles pointing up into the sky and across the screen. I am pretty sure that I am setting up the display mode correctly (the Banshee only does 16 bit), and I check the return code of *every* function - nothing returns an error. From what I can see, the error seems to be in the geometry. The textures and so on seem to be correct.

Does anyone have any general advice about where to start looking for the problem? (I realise that it would be impossible/rude to ask for a specific solution!)

Any help is much appreciated, thanks

Padros
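
PS - for reference, the kind of 16-bit mode check I mean (a minimal sketch with a made-up function name, not my exact init code):

    #include <windows.h>
    #include <d3d8.h>

    // Ask the HAL whether it can do a fullscreen 16-bit (R5G6B5) mode
    // before trying to create the device with it.
    bool Supports16BitFullscreen(IDirect3D8* pD3D)
    {
        HRESULT hr = pD3D->CheckDeviceType(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                           D3DFMT_R5G6B5, D3DFMT_R5G6B5, FALSE);
        return SUCCEEDED(hr);
    }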
The first thing I'd do (if possible) would be to install the debug runtime on the target and run your program under a debugger, or use DebugView from www.sysinternals.com or a similar utility to capture the debug output.
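
For example, something along these lines routes every failed HRESULT to the debug output, where DebugView will pick it up (CHECK_HR is just a sketch; name it whatever you like):

    #include <windows.h>
    #include <stdio.h>

    // Log any failed HRESULT, with the expression and source location,
    // via OutputDebugString so DebugView can capture it.
    #define CHECK_HR(expr)                                              \
        do {                                                            \
            HRESULT hr_ = (expr);                                       \
            if (FAILED(hr_)) {                                          \
                char buf_[256];                                         \
                sprintf(buf_, "%s failed: 0x%08lX (%s:%d)\n",           \
                        #expr, (unsigned long)hr_, __FILE__, __LINE__); \
                OutputDebugStringA(buf_);                               \
            }                                                           \
        } while (0)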

---
Come to #directxdev IRC channel on AfterNET
I'm not sure if this is still a problem with the Banshee, but I believe it was a problem on Voodoo 1 and 2 cards.

When programming with Glide, if you didn't clamp the floating point values to a certain precision, the hardware could actually lock up.
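
The usual fix was the large-constant snap: floats in the range [2^19, 2^20) are spaced exactly 1/16 apart, so adding and subtracting 786432.0f (3 * 2^18) rounds a screen coordinate to the 1/16-pixel grid the hardware expected. A minimal sketch (assuming coordinates stay in normal screen range):

    // Snap a screen-space coordinate to 1/16-pixel precision.
    // volatile forces the intermediate into a 32-bit float so the
    // compiler can't fold the add and subtract away.
    inline float SnapCoord(float v)
    {
        volatile float tmp = v + 786432.0f;
        return tmp - 786432.0f;
    }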

Also, a later version of Glide that only seemed to work with the Voodoo2 used a totally different way of representing the vertices. Running the same program on a Voodoo1, the polygons were totally wrong and didn't even connect with each other.

Isn't the Banshee basically a standalone Voodoo2 card?

Is it possible that DirectX 8.1 doesn't support the Banshee as well as it should?
It's not what you're taught, it's what you learn.
Hi,

thanks for your responses.

I have installed the debug version of DirectX and downloaded DebugView as suggested. These have been very helpful. I now know that the errors are happening because DrawIndexedPrimitive is failing with the following error:

Invalid index in the index stream (136)

This is really weird because:

a) All of my vertex buffers are of size 289 (see the sanity check sketched below)
b) The code works fine on the other (non-Banshee) machines
c) In this specific call I am only trying to draw a triangle strip with two triangles (so it's not the max primitive count)
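
A brute-force way to double-check (a) against what is actually in the buffer is to lock the index buffer and walk it by hand. A throwaway helper along these lines (not my engine code, just the idea; assumes 16-bit indices and that the buffer wasn't created with D3DUSAGE_WRITEONLY, otherwise the read lock fails and you'd validate a system-memory copy instead):

    #include <windows.h>
    #include <stdio.h>
    #include <d3d8.h>

    // Walk a 16-bit index buffer and report any index >= vertexCount
    // via OutputDebugString, so it shows up in DebugView.
    bool ValidateIndices(IDirect3DIndexBuffer8* ib, UINT vertexCount)
    {
        D3DINDEXBUFFER_DESC desc;
        if (FAILED(ib->GetDesc(&desc)) || desc.Format != D3DFMT_INDEX16)
            return false;

        BYTE* data = 0;
        if (FAILED(ib->Lock(0, 0, &data, D3DLOCK_READONLY)))
            return false;

        const WORD* indices = (const WORD*)data;
        const UINT count = desc.Size / sizeof(WORD);
        bool ok = true;
        for (UINT i = 0; i < count; ++i)
        {
            if (indices[i] >= vertexCount)
            {
                char msg[128];
                sprintf(msg, "Bad index %u at position %u\n", (UINT)indices[i], i);
                OutputDebugStringA(msg);
                ok = false;
            }
        }
        ib->Unlock();
        return ok;
    }

As I understand it, the debug runtime also flags an index as invalid if it falls outside the MinIndex/NumVertices range passed to DrawIndexedPrimitive (offset by the BaseVertexIndex given to SetIndices), so those arguments can trigger error 136 even when the buffer contents are fine.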

Waverider: your suggestion that the Banshee is not properly supported in DirectX 8.1 sounds quite possible, but I have a 'brute force' version of my terrain renderer which works on this machine. The vertex and index buffers in the brute-force version are substantially larger than in this version, which is what makes this problem all the more strange.

Does anyone have any ideas?

thanks for your time

Padros


[edited by - padros123 on July 16, 2002 5:27:30 PM]
"the Banshee only does 16 bit"

Are you using the manufacturer's drivers or the reference drivers?

My Banshee worked in 32-bit just fine (if quite a bit slow).

-Hyatus
"da da da"
This is misinformation; Voodoo cards, up to and including the Voodoo3, could only render 3D in 16-bit color. Desktop resolutions of 24- and 32-bit color were, however, possible (on the Banshee, Rush, and Voodoo3 cards).

(This in fact was the beginning of the end of 3dfx, as their competitors implemented 32-bit rendering to take the lead in the 3D graphics card race.)



MatrixCubed
http://MatrixCubed.cjb.net

Hi,

"Are you using the manufacturer''s drivers or the reference
drivers?"

I''m using the manufacturer''s drivers (it''s a Guillemot Maxi-Gamer Phoenix). I have had a quick look around and it seems that these drivers (and the latest reference drivers) for the Banshee only support DirectX 7.

Now, surely this means that my engine shouldn't run at all because it's coded using DirectX 8.1?

The only thing is, the brute force rendering version of my engine works on this card...
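
For reference, this is the kind of caps check I can run on her machine to see what the driver actually reports; with software vertex processing the limits could be much lower than on the GeForce machines (a minimal sketch; pD3D is the already-created IDirect3D8 object):

    #include <windows.h>
    #include <stdio.h>
    #include <d3d8.h>

    // Print the device limits most relevant to indexed drawing.
    void DumpDrawCaps(IDirect3D8* pD3D)
    {
        D3DCAPS8 caps;
        if (SUCCEEDED(pD3D->GetDeviceCaps(D3DADAPTER_DEFAULT,
                                          D3DDEVTYPE_HAL, &caps)))
        {
            char msg[128];
            sprintf(msg, "MaxPrimitiveCount=%lu MaxVertexIndex=%lu\n",
                    caps.MaxPrimitiveCount, caps.MaxVertexIndex);
            OutputDebugStringA(msg);
        }
    }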

thanks

Padros

