far z clipping with ati cards vs nvidia

Started by
10 comments, last by JoeyBlow2 20 years, 8 months ago
Hi, I'm working on an open source project, and all the other developers use NVIDIA cards. I have an ATI Radeon 9700. When running the app on my card I see no sky, but somehow it shows up for the folks with NVIDIA. The viewport's far z is set to 100,000, but the sky (a sphere) is drawn at a radius of 1,000,000. That puts the sky beyond the far plane, so it comes up black for me. If I reduce the sky radius to 100,000 it works. The question I have: do the NVIDIA OpenGL drivers ignore the far clip range, allowing it to work for them? Is there a way to turn off the far clip range with ATI and always draw a polygon, or is that impossible in theory? I always thought you needed a six-sided frustum, from my DirectX experience. I'd appreciate it if anybody could shed some light on this. [edited by - Joeyblow2 on August 7, 2003 6:09:17 PM]
Assuming it's just a simple skydome, you shouldn't be writing the sky to the depth buffer anyway. Turn off depth testing (and writing) to draw the sky, and draw it before everything else. The sphere doesn't need to be big, as long as it's bigger than your near clip plane distance. Everything else will get drawn over it, since you didn't write to the depth buffer.
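That draw order can be sketched as follows (pseudocode; `draw_sky_dome` and `draw_scene` are placeholders for whatever the engine actually calls):

```
disable depth test
disable depth writes        // glDepthMask(GL_FALSE) in GL terms
draw_sky_dome()             // any radius beyond the near plane works
enable depth test
enable depth writes
draw_scene()                // the scene overdraws the sky everywhere it covers it
```

Because the sky never touches the depth buffer, its actual distance no longer matters and the far-plane question goes away entirely.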

____________________________________________________________
www.elf-stone.com


Hmm, personally I would render it at the end with depth test on (but not depth write), so you save some fillrate.

Just my 2 cents.
_______________
Jester, student programmer | The Jester Home in French
But that would be a moot point, considering it would still have to perform depth calculations... I like the first idea better. And I have no clue why it would work on NVIDIA cards, sorry.
Why don't alcoholics make good calculus teachers? Because they don't know their limits! Oh come on, Newton wasn't THAT smart...
I think it's caused by the insanely large 100,000 value of the far clip plane. There's very little depth resolution with such a high value, which may cause the entire sky to fail the z-test and thus not get rendered. The depth resolution is probably calculated slightly differently on NVIDIA and ATI cards. Normally that's no problem, but because there is so little depth resolution with a far plane at 100,000, it shows up here.
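The depth-resolution point can be quantified. A rough sketch, assuming a 24-bit depth buffer and a hypothetical near plane of 1.0 (the project's real near value isn't given in the thread):

```python
# Size of one 24-bit depth-buffer step in world units, for the standard
# perspective depth mapping d(z) = FAR * (z - NEAR) / (z * (FAR - NEAR)).
# NEAR is an assumed value; the real app may use something else.
NEAR, FAR = 1.0, 100_000.0
STEPS = 2**24 - 1  # distinct values in a 24-bit depth buffer

def depth_step_world_units(z):
    """Approximate world-space distance covered by one depth step at eye depth z."""
    # dd/dz = (FAR * NEAR) / (z^2 * (FAR - NEAR)); one step is 1/STEPS in window depth.
    dd_dz = (FAR * NEAR) / (z * z * (FAR - NEAR))
    return (1.0 / STEPS) / dd_dz

print(depth_step_world_units(10.0))      # tiny steps up close
print(depth_step_world_units(99_000.0))  # hundreds of world units per step near far
```

With these assumed values, a single depth step near the far plane spans hundreds of world units, so even small driver-to-driver differences in depth math could plausibly flip geometry sitting at the boundary from visible to rejected.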

Sander Maréchal
[Lone Wolves Game Development][RoboBlast][Articles][GD Emporium][Webdesign][E-mail]


GSACP: GameDev Society Against Crap Posting
To join: Put these lines in your signature and don't post crap!


Perhaps your coders have made use of the NVIDIA extension that clamps anything that goes past the far clip plane so it sits on the plane instead of being clipped.

Can't remember the extension name, but it's useful for depth-fail shadow volumes.

-----------------------
"When I have a problem on an Nvidia, I assume that it is my fault. With anyone else''s drivers, I assume it is their fault" - John Carmack
-----------------------"When I have a problem on an Nvidia, I assume that it is my fault. With anyone else's drivers, I assume it is their fault" - John Carmack
Thanks guys. I'll see if there is an NVIDIA extension being used.
Fast Z clear is much faster than rendering something to the Z buffer.

Also, I've seen some bugs where a Radeon will keep some states at times where it shouldn't, and nVIDIA would do the right thing, causing this kind of problem (but, for me, in reverse -- I develop on Radeon, and there it worked, but on nVIDIA, it correctly didn't work).
I doubt that your buddies are using an NVIDIA extension. I develop on an NVIDIA GeForce4 Ti 4600 and my app runs on an ATI Radeon 9700. Yeah, I know, what a way to develop, but that's how we operate around here. Anyway, the ATI cards are crap IMHO. Everything can work and look great on an NVIDIA card, but then there are always a ton of "bugs" (for lack of a better term) to "work around" to achieve an identical look on the 9700. I don't use any card-specific extensions at all in my app, in case anyone was going to flame me; just straight OpenGL.

I think it boils down to the fact that the NVIDIA OpenGL drivers support OpenGL 1.4 while the newest ATI ones are at 1.3-something. Anyway, it seems to me that ATI's base OpenGL support is not as good as NVIDIA's.

J
Eh... If the far plane is set to z = 100k and you draw something at z = 1M, it's going to be clipped by the far plane and not rendered. That's, like, the definition of what the far plane does. If a card *doesn't* clip the geometry by the far plane then there's something very wrong with it.
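That clipping rule is easy to check numerically: in clip space a vertex survives the far plane only if z_clip <= w_clip. A sketch using the z-row terms of a standard gluPerspective-style matrix, with the near value assumed (only the 100,000 far plane comes from the thread):

```python
# Far-plane clip test for a point straight ahead of the camera, using the
# standard OpenGL perspective projection terms. NEAR is an assumed value.
NEAR, FAR = 1.0, 100_000.0

def clipped_by_far_plane(dist):
    """True if a point `dist` units in front of the eye fails z_clip <= w_clip."""
    z_eye = -dist  # the camera looks down -z in eye space
    a = (FAR + NEAR) / (FAR - NEAR)
    b = 2.0 * FAR * NEAR / (FAR - NEAR)
    z_clip = -a * z_eye - b
    w_clip = -z_eye
    return z_clip > w_clip

print(clipped_by_far_plane(1_000_000.0))  # sky sphere at radius 1M: clipped
print(clipped_by_far_plane(90_000.0))     # inside the frustum: kept
```

So with a conforming driver the original 1,000,000-radius sky must be clipped; the NVIDIA behavior in the thread is the anomaly to explain, not the ATI one.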

This topic is closed to new replies.
