What card for GL_ARB_point_sprite

I know GF3s support your extension, and I think that even the first Radeons did as well. Not sure about that, though.

As for the huge debate, it goes on everywhere. The thing is, I am a gamer AND a GL coder, and I have experience with both cards on both levels. It all boils down to what you need to do. If you need *FULL* OpenGL support, higher framerates in GL-only games (SoF2, for instance), and fewer problems, go with nVidia. If you prefer D3D or only need a decent amount of GL support, ATI is probably your best bet.

Now before I get flamed for inexperience or anything, I can assure you that I have used both cards. I have always been an nVidia fan, but I bought a Radeon when they came out for the SOLE purpose of giving the under-dog a chance. In D3D the Radeon pushed more frames, but when I switched to GL I lost the ability to control the gamma (brightness) in a few of my games (Unreal Tournament, for instance), and the nVidia smoked it. I later returned the card because I use GL for its insanely higher frame count over D3D, and if a Radeon won't provide me with equal or better results, I'll stick with my nVidia.

My friend in Texas also bought a Radeon. If I remember correctly it was a 9800 (9600?), one of the new ones. In UT2K3 and the original UT, his card rendered every player model ALL the time, no matter where they were. Yes, it gave him a HUGE advantage, but it looked crappy. In the original UT he couldn't brighten his game either. Buying a GeForce 4 not only fixed his problems, but it pumped out higher frames on his older system.

Finally, if ATI cards are so "awesome", then why is it that developers making GL games (SoF2, RTCW, etc.) have to go in and detect these cards to run a special set of extra instructions to make them operate properly? I can take in-game screenshots of the options if you do not know what I am talking about. ATI cards are *NOT* 100% GL-compliant. By all means though, they are NOT suckage. They just have a few bugs that have been in them for the past ten years that ATI is too lazy to fix, or doesn't know how to fix. However, if I build a system not for gaming but for some other purpose, you can bet I will slap an ATI in it. Yet on this system (P4/3.56), I'll stick with my GeForce FX5900 AGP 8X 256MB card and continue to smoke anybody who wants a true framerate test: in-game.

-The Great Sephiroth
Hmmm... I go to sleep for a few measly hours and look what happens! Okay, here we go...

quote:Original post by Jeeky
As I recall, the one thing that this extension didn't implement on the GF3 was size attenuation. You still control the size of the sprite (max = 64) using glPointSize(). This attenuation feature has been implemented in the GF4s (not the MX).

Strange, since point size attenuation is supported on my TNT2. I already implemented it and it looks great.
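For anyone who hasn't used it, here is a rough sketch (not my exact code): the attenuation itself comes from GL_ARB_point_parameters, the sprites from GL_ARB_point_sprite. It assumes the ARB entry points were already fetched via wglGetProcAddress or an extension loader, glext.h provides the tokens, and the attenuation coefficients are just illustrative values:

/* Distance attenuation via GL_ARB_point_parameters. Assumes the ARB
   entry points were loaded beforehand (wglGetProcAddress or similar). */
GLfloat atten[3] = { 1.0f, 0.0f, 0.01f };    /* constant, linear, quadratic */

glPointParameterfvARB(GL_POINT_DISTANCE_ATTENUATION_ARB, atten);
glPointParameterfARB(GL_POINT_SIZE_MIN_ARB, 1.0f);
glPointParameterfARB(GL_POINT_SIZE_MAX_ARB, 64.0f);
glPointSize(32.0f);                          /* base size before attenuation */

/* If the card also exposes GL_ARB_point_sprite, texture the points: */
glEnable(GL_POINT_SPRITE_ARB);
glTexEnvi(GL_POINT_SPRITE_ARB, GL_COORD_REPLACE_ARB, GL_TRUE);

glBegin(GL_POINTS);
    glVertex3f(0.0f, 0.0f, -10.0f);
glEnd();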
quote:Original post by Jeeky
What threads have I hijacked? Please enumerate them. We had an unfortunate flame war in one, yes, but when have I "hijacked" a thread?

You hijacked this thread after PipoDeClown posted an off-topic remark. However, since you're not the only one here and since everyone kept it at a civil level (no shouting and name-calling. Hurray!) I won't mind. My problem was solved before it started.
quote:Original post by _the_phantom_
Back to the point of the thread and \o/ for the GL_ARB_point_sprite extension. I was asking about point sprite support on ATI cards a while back; if we get this in the 1.5 driver from ATI then fantastic

Somewhat bad news here. GL_ARB_point_sprite will be in OpenGL 1.5, but it won't be in the core. It'll remain an extension.
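In practice that just means you check the extension string at runtime before relying on it. A rough sketch using the pre-2.0 single-string query (the HasExtension helper is only illustrative; a plain strstr can match a prefix of a longer extension name, so the sketch checks token boundaries):

#include <string.h>
#include <GL/gl.h>

/* Sketch: runtime extension check, pre-2.0 style.
   glGetString(GL_EXTENSIONS) returns one space-separated list. */
int HasExtension(const char *name)
{
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    const char *pos;
    size_t len = strlen(name);

    if (ext == NULL)
        return 0;

    for (pos = strstr(ext, name); pos != NULL; pos = strstr(pos + 1, name))
    {
        /* match must start at the list start or after a space,
           and end at a space or the end of the string */
        if ((pos == ext || pos[-1] == ' ') &&
            (pos[len] == ' ' || pos[len] == '\0'))
            return 1;
    }
    return 0;
}

/* usage: */
if (HasExtension("GL_ARB_point_sprite"))
{
    /* safe to enable GL_POINT_SPRITE_ARB, etc. */
}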
quote:Original post by The Great Sephiroth
In UT2K3 and the original UT, his card rendered every player model ALL the time, no matter where they were. Yes, it gave him a HUGE advantage, but it looked crappy.

That's not crappy coding on ATI's side, but crappy coding by the developers. They probably used a few vendor-specific tricks to speed things up and they failed to do a good job with OpenGL on ATI cards. It's a problem with Unreal's visibility routines drawing models in the wrong order with the wrong settings, not with ATI's OpenGL implementation.

Now please stop the nVidia-ATI wars before I need to move my own thread to the lounge. Thank you kindly!

Sander Maréchal
[Lone Wolves Game Development][RoboBlast][Articles][GD Emporium][Webdesign][E-mail]


[edited by - sander on September 2, 2003 3:03:32 AM]


quote:Original post by jeeky
1. My view is that I have had trouble with ATI cards so I would prefer to use nVidia cards, which I have never had trouble with. What is misinformed about my view?

The bit where you jump to conclusions based on your very limited experience with ATi hardware, and present those conclusions as fact.

quote:2. What threads have I hijacked? Please enumerate them. We had an unfortunate flame war in one, yes, but when have I "hijacked" a thread?

This is the second AFAIK. The other one I linked to above.

quote:3. Does saying "works for me" fix a bug or solve a problem?

Did I say it did? My intention was not to fix your problem, but to let you know that the bug only seemed to affect your particular hardware combination.

quote:4. Does saying "works for me" even apply at all if you have different hardware than I do (Mobility FireGL vs. Radeon)?
See above.

If you need any more issues resolved, you should email me. We better let Sander have his thread back before he puts us on probation.

[edited by - benjamin bunny on September 2, 2003 9:41:19 AM]

____________________________________________________________
www.elf-stone.com | Automated GL Extension Loading: GLee 5.00 for Win32 and Linux

quote:Original post by Sander
quote:Original post by _the_phantom_
Back to the point of the thread and \o/ for the GL_ARB_point_sprite extension. I was asking about point sprite support on ATI cards a while back; if we get this in the 1.5 driver from ATI then fantastic

Somewhat bad news here. GL_ARB_point_sprite will be in OpenGL 1.5, but it won't be in the core. It'll remain an extension.


Well, core or non-core, the addition of a point_sprite extension will be a welcome one. When I last checked, my 9700pro with recent drivers didn't support any kind of point sprite extension at all, so it's a nice step forward and will hopefully lead to it being core in 1.6

quote:Original post by _the_phantom_
...so it's a nice step forward and will hopefully lead to it being core in 1.6

Well, I hope not. I hope that they skip 1.6 - 1.9 and head straight for 2.0!
quote:Original post by benjamin bunny
If you need any more issues resolved, you should email me. We better let Sander have his thread back before he puts us on probation.

Nah, I'm not going to do that. My questions are already answered. If you want the thread you can take it and I'll move it to the lounge. I could also change the thread title to "ATI - nVidia wars, bring your WMDs" so others can join the fray as well.

Sander Maréchal
[Lone Wolves Game Development][RoboBlast][Articles][GD Emporium][Webdesign][E-mail]


[edited by - sander on September 2, 2003 11:29:37 AM]


quote:Original post by Sander
Well, I hope not. I hope that they skip 1.6 - 1.9 and head straight for 2.0!

1.5 is already very near to 2.0, if you count all the recent GLSL extensions. A few parts are still missing, but the 1.5 to 2.0 jump would definitely be possible.

About the NV vs. ATI war: I have been developing for a long time on chipsets from both manufacturers. From my experience, NV tends to have the better drivers and a lot more cutting-edge extensions. ATI, on the other hand, has always been more standards compliant, their hardware chip design is better thought out (prime example: clip planes), and their devrel is much friendlier (i.e. NV wants you to sign an NDA for every little thing, while ATI will help registered developers with pretty much any issue).

IP issues are another point to consider. nVidia inherited a lot of patents from its involuntary pseudo-mother SGI during the "great SGI employee migration" period. ATI has always had a hard time fighting against those, and often needed to find alternative ways to implement the same functionality by other means. That's not always easy. But sides could easily flip in the future: don't forget that ATI was the first manufacturer with a DX9-capable card on the market.

All in all, I generally develop on nVidia, primarily because of the driver/extension situation (especially their very good Linux drivers, which ATI lacks). But I prefer ATI when it comes down to their hardware, business practices and developer relations.


[edited by - Yann L on September 2, 2003 1:09:46 PM]
Just feel like pointing out that ATI is working on Linux drivers at the moment, so we could see some in a month or two's time as they are beta testing right now. Although it will probably take a few revisions to get right.

As for OGL versions, time should be spent getting the OGL 2.0 drivers right before making the jump. I'd rather live with 2 more revisions of OGL 1.x and then move to OGL 2.0 than make the version number switch now and have them not be complete.
Either way, point sprites on ATI hardware will be handy
Ah, my bad, I forgot to bring up that ATI works just as well on Linux as nVidia. Thanks for reminding me, and for pointing that out.

Sander, that's not a problem with Unreal, it's a problem with ATI, because even on crappy Matrox or Diamond cards it looks right. This problem is common knowledge in the UT community and it happens ONLY on the Radeons. If it was happening on the crap-tastic GF4 MX series and other such cards, I'd fully agree with you, but since they've tested with numerous brands and makes of cards, only ATI Radeons have the problem. It *MAY* be a driver bug for the Radeons, but who knows? I just know that for max GL performance, you need an nVidia for gaming. But like I said, if I needed a good card for a box that wasn't used for gaming, I'd go ATI easily. I mean, why buy an nVidia when an ATI is about $50 cheaper and does D3D faster (in most cases) than any nVidia?

-The Great Sephiroth
quote:Original post by The Great Sephiroth
Sander, that's not a problem with Unreal, it's a problem with ATI, because even on crappy Matrox or Diamond cards it looks right. This problem is common knowledge in the UT community and it happens ONLY on the Radeons.


Hmm... and the Unreal programmers have not been able to work around it? I played Unreal a loooong time ago but I don't remember ever patching it or anything. That's just a problem in the GD business. Driver coders make a booboo, and the game programmer needs to fix it by working around it. It's usually pointless waiting for a rapid fix. And that applies to both ATI and nVidia drivers.

The thing that bugs me most about the new drivers is that they tend to choke on certain glHint() calls. It happens with ATI as well as nVidia. It makes a lot of older demos and tutorial binaries unusable.
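If you want to try to reproduce it, these are the kind of hint calls those older demos and tutorials make (the exact hints and modes below are just typical examples, all core since GL 1.1; a driver is allowed to ignore a hint, but it shouldn't choke on one):

/* Typical hints from older demos/tutorials; all core since OpenGL 1.1. */
glHint(GL_PERSPECTIVE_CORRECTION_HINT, GL_NICEST);
glHint(GL_POINT_SMOOTH_HINT, GL_NICEST);
glHint(GL_LINE_SMOOTH_HINT, GL_NICEST);
glHint(GL_POLYGON_SMOOTH_HINT, GL_NICEST);
glHint(GL_FOG_HINT, GL_FASTEST);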



Sander Maréchal
[Lone Wolves Game Development][RoboBlast][Articles][GD Emporium][Webdesign][E-mail]


Are you sure about the choking thing? Not only have I never heard of it, but there's nothing I can't play on this box aside from DOS games. Well, Mechwarrior II won't run, but that's an XP kernel problem that even compatibility modes won't fix. Actually, compatibility modes have NEVER fixed anything, and are there to comfort people into thinking all their stuff will run on XP. It is a joke, and so far, anything that won't run in XP will NOT run in compatibility mode either. What a damn joke.

Anyway, sorry for the OT line about the other kernels, but it pisses me off that they bloat my installation of Windows with them and then they're utterly useless. Let me know where you see this "choke" because I want to test it on my FX5900 AGP 8X. I have never seen a slow-down before, but maybe I've not tried the right program yet.


-The Great Sephiroth

