Archived

This topic is now archived and is closed to further replies.

Sander

What card for GL_ARB_point_sprite


Sander    1332
I'm working on my particle system lib. I wanted to use the GL_ARB_point_sprite extension, but I noticed that it isn't supported on my old TNT2 card. Does anyone know which cards do support it? GeForce2? Or do I need at least a GeForce3 for it to work?

Sander Maréchal
[Lone Wolves Game Development][RoboBlast][Articles][GD Emporium][Webdesign][E-mail]

Sander    1332
That's what I thought as well, so I updated my drivers yesterday. I have 37 extensions available, but the point sprite extension still doesn't show. I need to know which card is the oldest one that supports this extension, so I know the minimum system specs my lib would require (I don't want them to be too high).

Sander Maréchal
[Lone Wolves Game Development][RoboBlast][Articles][GD Emporium][Webdesign][E-mail]
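For anyone checking extension support at startup, here is a minimal sketch of how a lib could probe the extension list robustly. The helper name `has_gl_extension` is hypothetical; the token-by-token comparison matters because a naive `strstr()` can match one extension name inside another:

```c
#include <string.h>

/* Check a space-separated OpenGL extension list for an exact name.
 * A naive strstr() can give false positives when one extension name
 * is a prefix of another, so compare whole tokens instead.
 * In a real program the list would come from glGetString(GL_EXTENSIONS);
 * it is passed in here so the helper stays testable without a GL context. */
static int has_gl_extension(const char *extensions, const char *name)
{
    size_t len = strlen(name);
    const char *p = extensions;

    while ((p = strstr(p, name)) != NULL) {
        /* Match only if bounded by start-of-string or a space on the left,
         * and a space or end-of-string on the right. */
        int starts_ok = (p == extensions) || (p[-1] == ' ');
        int ends_ok   = (p[len] == ' ') || (p[len] == '\0');
        if (starts_ok && ends_ok)
            return 1;
        p += len;
    }
    return 0;
}
```

With that in place, the startup check is just `has_gl_extension((const char *)glGetString(GL_EXTENSIONS), "GL_ARB_point_sprite")`.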

jesterlecodeur    122
Even the 45.23 drivers don't support it:
http://www.delphi3d.net/hardware/viewreport.php?report=807

I haven't read the ARB_point_sprite spec, but if it's similar to the NV one, you can take a look at http://www.delphi3d.net/hardware/extsupport.php?extension=GL_NV_point_sprite

_______________

Jester, student programmer
The Jester Home in French

Yann L    1802
ARB_point_sprite is a very recent extension. I'd expect the first OpenGL 1.5 drivers to support it in the near future.

In the meantime, I'd suggest using NV_point_sprite instead. Besides POINT_SPRITE_R_MODE_NV, it seems to be compatible with ARB_point_sprite (even the tokens have the same values).
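The point about shared token values can be checked directly against glext.h, which means one code path can serve both extensions. A sketch (token values copied from glext.h; the enable sequence shown in the comment is the usual one under either extension, not code specific to any one vendor):

```c
/* Token values from glext.h: the NV and ARB point sprite
 * extensions deliberately share them, so a renderer can use one
 * set of constants regardless of which extension string matched. */
#define GL_POINT_SPRITE_NV      0x8861
#define GL_COORD_REPLACE_NV     0x8862
#define GL_POINT_SPRITE_ARB     0x8861
#define GL_COORD_REPLACE_ARB    0x8862

/* Typical enable sequence (identical under either extension):
 *
 *   glEnable(GL_POINT_SPRITE_ARB);
 *   glTexEnvi(GL_POINT_SPRITE_ARB, GL_COORD_REPLACE_ARB, GL_TRUE);
 *   glPointSize(32.0f);
 *   glBegin(GL_POINTS);
 *   ...one vertex per particle...
 *   glEnd();
 */
```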

Sander    1332
Well, GL_NV_point_sprite is not supported on my TNT2. Nor is GL_EXT_point_sprite (from OpenGL 1.3 IIRC). I reckon that the cards that support the NV and EXT versions will also support the ARB version, but what cards could do that? Obviously not TNT2.

Sander Maréchal
[Lone Wolves Game Development][RoboBlast][Articles][GD Emporium][Webdesign][E-mail]

Yann L    1802
ARB_point_sprite will probably not be exposed on anything below GeForce3/4, because chipsets prior to that did not have hardware point sprite support. A software emulation could be done, but that would go a little against the general OpenGL rule "if an extension is available, then it is hardware accelerated". This rule has already been broken a couple of times (vertex programs on GF2/GF4MX), so it's basically up to nvidia and ATI whether they implement a software codepath or not (and up to the ARB to certify it).
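Given that exposure depends on the chipset, a particle lib can pick its rendering path at startup rather than hard-requiring one extension. A hedged sketch of such a chooser (the enum and function names are hypothetical; camera-facing textured quads are the universal fallback that works even on a TNT2):

```c
#include <string.h>

/* Hypothetical path selection for a particle lib: prefer the ARB
 * extension, fall back to the NV variant, and finally to plain
 * camera-facing textured quads, which any card can draw. */
typedef enum {
    SPRITES_ARB,     /* GL_ARB_point_sprite          */
    SPRITES_NV,      /* GL_NV_point_sprite           */
    SPRITES_QUADS    /* camera-facing textured quads */
} sprite_path;

/* 'extensions' is the string from glGetString(GL_EXTENSIONS),
 * passed in as a parameter so the chooser is testable offline.
 * strstr() is good enough here because neither name we test for
 * is a prefix of the other. */
static sprite_path choose_sprite_path(const char *extensions)
{
    if (strstr(extensions, "GL_ARB_point_sprite"))
        return SPRITES_ARB;
    if (strstr(extensions, "GL_NV_point_sprite"))
        return SPRITES_NV;
    return SPRITES_QUADS;
}
```

The quad fallback costs four vertices per particle instead of one, but it keeps the minimum spec low, which was the original goal.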

jeeky    126
My GeForce3 Ti-500 handles GL_NV_point_sprite just fine. I haven't updated my drivers in a little bit, but I would assume that if the latest and greatest drivers enumerate ARB_point_sprite, the behavior would be the same as GL_NV_point_sprite.

As I recall, the one thing that this extension didn't implement on the GF3 was size attenuation. You still control the size of the sprite (max = 64) using glPointSize(). This attenuation feature has been implemented in the GF4s (not the MX).

jeeky    126
quote:
Original post by Pipo DeClown
[off] ATi is the future? Rocking n Vidia from the throne! [/off]


WARNING: The following content may be offensive to some fanatics. If this is you, please turn off your browser right now.

[off]Right! That must be why ATI has all those buggy drivers, and bad developer support, and a much smaller developer site than nVidia, and . . .

Despite my strong feelings about ATI vs. nVidia, a cutting edge card of either brand is very cool![/off]


benjamin bunny    838
quote:
Original post by jeeky
quote:
Original post by Pipo DeClown
[off] ATi is the future? Rocking n Vidia from the throne! [/off]


WARNING: The following content may be offensive to some fanatics. If this is you, please turn off your browser right now.

[off]Right! That must be why ATI has all those buggy drivers, and bad developer support, and a much smaller developer site than nVidia, and . . .

Despite my strong feelings about ATI vs. nVidia, a cutting edge card of either brand is very cool![/off]





We've been over this before. You're obviously an nvidia fanboy, and I'm sure you want to tell us how great nvidia is compared to ATi, but until you have something of substance (ideally based in reality*) to tell us, please quit posting random BS.

*yes, that means you have to have actually used an ATi card, or experienced ATi's developer support - which is in my experience very good.

____________________________________________________________
www.elf-stone.com | Automated GL Extension Loading: GLee

[edited by - benjamin bunny on September 1, 2003 12:39:06 PM]

jeeky    126
quote:

We've been over this before. You're obviously a nvidia fanboy, and I'm sure you want to tell us how great nvidia is compared to ATi, but until you have something of substance (ideally based in reality*) to tell us, please quit posting random BS.

*yes, that means you have to have actually used an ATi card, or experienced ATi's developer support - which is in my experience very good.



Sure, here's an example of someone having trouble with ATI cards.

If you feel that Pipo Declown's statement is based on fact, I would like that backed up as well. I don't see you quoting him.

I have had plenty of development on both brands of cards. If I were only an end user, I would take either. As a programmer, however, I have had much more trouble with ATI cards. That is a fact. I have given examples before. You still haven't responded to those issues.

Man, what is your problem? Can't you just accept the fact that people have opinions? Don't you realize that when people have problems with a particular product they don't want to use it anymore?

If you had driven and worked on two different brands of cars and one had more problems than the other, would you keep buying the brand that had issues? I wouldn't until I saw a real difference in quality. It would take more than just a fan of the inferior car company telling me that everything was fixed.

I also don't think you read my entire post. As I stated, if someone said "Here's the latest and greatest ATI card. Go replace your GeForce3", I would do it in a heartbeat. I am not saying I hate ATI or their cards. I am saying that I have had less issues with nVidia cards than ATI cards. What is wrong with that?

Stop freaking out about everything.

[edited by - jeeky on September 1, 2003 9:25:20 PM]

benjamin bunny    838
quote:
Original post by jeeky
Sure, here's an example of someone having trouble with ATI cards.


So I pointed out that ATi doesn't support a certain non-standard feature. Your point is?

quote:

If you feel that Pipo Declown's statement is based on fact, I would like that backed up as well. I don't see you quoting him.


I haven't noticed Pipo declown repeatedly spamming the boards with the same misinformed viewpoint.

quote:
Man, what is your problem? Can't you just accept the fact that people have opinions? Don't you realize that when people have problems with a particular product they don't want to use it anymore?

My main objection to your posts is the fact you constantly feel the need to hijack otherwise worthwhile threads to make the same ill-informed point.

quote:
If you had driven and worked on two different brands of cars and one had more problems than the other, would you keep buying the brand that had issues? I wouldn't until I saw a real difference in quality. It would take more than just a fan of the inferior car company telling me that everything was fixed.

I'm not an ATi fan. I bought an ATi card because they happened to be a better deal at the time. My previous cards include a Geforce 2 Pro, an original Radeon and a TNT, and I was very happy with each of these. The reason I take exception to your anti-ATi rants is because they're obviously based on your very limited experience of using the cards.

quote:
I have had plenty of development on both brands of cards. If I were only an end user, I would take either. As a programmer, however, I have had much more trouble with ATI cards. That is a fact. I have given examples before. You still haven't responded to those issues.


Like this issue which you mentioned here:
quote:
Oh, and benjamin, since the ATI drivers now rock, please tell me where I can get the driver which enables hardware accelerated OpenGL with a dual monitor setup. Thanks in advance.

...which I replied to. Note that neither I nor anyone else posting in that thread had experienced that particular issue when using ATi cards with dual monitors.

[edited by - benjamin bunny on September 1, 2003 9:56:07 PM]

jeeky    126
quote:
Original post by benjamin bunny
quote:
Original post by jeeky
Sure, here's an example of someone having trouble with ATI cards.


So I pointed out that ATi doesn't support a certain non-standard feature. Your point is?

quote:

If you feel that Pipo Declown's statement is based on fact, I would like that backed up as well. I don't see you quoting him.


I haven't noticed Pipo declown repeatedly spamming the boards with the same misinformed viewpoint.

quote:
Man, what is your problem? Can't you just accept the fact that people have opinions? Don't you realize that when people have problems with a particular product they don't want to use it anymore?

My main objection to your posts is the fact you constantly feel the need to hijack otherwise worthwhile threads to make the same ill-informed point.

quote:
If you had driven and worked on two different brands of cars and one had more problems than the other, would you keep buying the brand that had issues? I wouldn't until I saw a real difference in quality. It would take more than just a fan of the inferior car company telling me that everything was fixed.

I'm not an ATi fan. I bought an ATi card because they happened to be a better deal at the time. My previous cards include a Geforce 2 Pro, an original Radeon and a TNT, and I was very happy with each of these. The reason I take exception to your anti-ATi rants is because they're obviously based on your very limited experience of using the cards.

quote:
I have had plenty of development on both brands of cards. If I were only an end user, I would take either. As a programmer, however, I have had much more trouble with ATI cards. That is a fact. I have given examples before. You still haven't responded to those issues.


Like this issue which you mentioned here:
quote:
Oh, and benjamin, since the ATI drivers now rock, please tell me where I can get the driver which enables hardware accelerated OpenGL with a dual monitor setup. Thanks in advance.

...which I replied to. Note that neither I nor anyone else posting in that thread had experienced that particular issue when using ATi cards with dual monitors.

[edited by - benjamin bunny on September 1, 2003 9:56:07 PM]


Holy crap, will this pissing contest never end?

Ok, please answer the following so we can achieve resolution:
1. My view is that I have had trouble with ATI cards so I would prefer to use nVidia cards, which I have never had trouble with. What is misinformed about my view?

2. What threads have I hijacked? Please enumerate them. We had an unfortunate flame war in one, yes, but when have I "hijacked" a thread?

3. Does saying "works for me" fix a bug or solve a problem?

4. Does saying "works for me" even apply at all if you have different hardware than I do (Mobility FireGL vs. Radeon)?

[edited by - jeeky on September 1, 2003 10:42:19 PM]

_the_phantom_    11250
At the risk of derailing this thread even more, yes ATI have some problems with their drivers, but if you've spent any time on the rage3d boards you'd notice there are a couple of driver dev ppl hanging out there and they know about the bugs and are working to fix them (in fact, the only major one in the current driver set that I know of is the shadow issue the current cards seem to have).

In my experience ATI's drivers tend to follow the specs, whereas Nvidia's seem to follow what coders do (see the clamp_to_edge problem, which on ATI cards with spec-following drivers leads to black lines, and on Nvidia cards with broken drivers leads to the correct image, but only because coders have coded it wrong).

I was a great critic of ATI's drivers for ages because, and most ppl will agree with this, until the Cat drivers the drivers were horrible. However I've since swapped across from Nvidia to ATI simply coz the cards are better (anyone who has seen the test done between the top-of-the-range cards for both brands on TR:AOD and saw the Nvidia cards getting spanked will know this - http://www.beyond3d.com/misc/traod_dx9perf/ if you haven't seen it). I've had NO driver trouble with them since installing them (after learning the hard way that you have to uninstall the old before installing the new, which was my bad, not theirs), and I find them to be standards compliant and ATI to be open about the problems they have.

I can't say I've heard the same about Nvidia... it also amuses me how everyone claims that ATI put out buggy drivers, yet fails to comment on Nvidia's 'throw out a driver every other week' (maybe not that often, but you get the idea) system of getting the public to beta test... odd, isn't it?

Back to the point of the thread, and \o/ for the GL_ARB_point_sprite extension. I was asking about point sprite support on ATI cards a while back; if we get this in the 1.5 driver from ATI then fantastic.

jeeky    126
quote:
Original post by _the_phantom_
At the risk of derailing this thread even more, yes ATI have some problems with their drivers, but if you've spent any time on the rage3d boards you'd notice there are a couple of driver dev ppl hanging out there and they know about the bugs and are working to fix them (in fact, the only major one in the current driver set that I know of is the shadow issue the current cards seem to have).


Hey phantom, thanks for pointing me to these boards. Is there anywhere else you know of that ATI driver developers hang out? I would love to ask them about the problems I am having.

Thanks again!

I know GF3's support your extension, and I think that even the first Radeons did as well. Not sure about that though.

As for the huge debate, it goes on everywhere. Thing is, I am a gamer AND a GL coder. I have experienced both on both levels. It all boils down to what you need to do. If you need *FULL* OpenGL support, higher frames in GL-only games (SoF2 for instance), and fewer problems, go with nVidia. If you prefer D3D or only need a decent amount of GL support, ATI is probably your best bet.

Now before I get flamed for inexperience or anything, I can assure you that I have used both cards. I have always been an nVidia fan, but I bought a Radeon when they came out for the SOLE purpose of giving the under-dog a chance. In D3D the Radeon pushed more frames, but when I switched to GL I lost the ability to control the gamma (brightness) in a few of my games (Unreal Tournament for instance), and the nVidia smoked it. I later returned the card because I use GL for its insanely higher frame count over D3D, and if a Radeon won't provide me with equal or better results, I'll stick with my nVidia.

My friend in Texas also bought a Radeon. If I remember, it was a 9800 (9600?), one of the new ones. In UT2K3 and the original UT, his card rendered every player model ALL the time, no matter where they were. Yes, it gave him a HUGE advantage, but it looked crappy. In the original UT he couldn't brighten his game either. Buying a GeForce IV not only fixed his problems, but it pumped higher frames on his older system.

Finally, if ATI cards are so "awesome", then why is it that game developers making GL games (SoF2, RTCW, etc) have to go in and detect these cards to run a special set of extra instructions to make them operate properly? I can take screenshots in-game of the options if you do not know what I am talking about. ATI cards are *NOT* 100% GL-compliant. By all means though, they are NOT suckage. They just have a few bugs that have been in them for the past ten years that ATI is too lazy to fix, or doesn't know how to fix. However, if I build a system not for gaming but for some other purpose, you can bet I will slap an ATI in it. Yet on this system (P4/3.56), I'll stick with my GeForce FX5900 AGP 8X 256mb card and continue to smoke anybody who wants a true framerate test: in-game.

-The Great Sephiroth

Sander    1332
Hmmm.... I go to sleep for a few measly hours and look what happens! Okay, here we go...

quote:
Original post by Jeeky
As I recall, the one thing that this extension didn't implement on the GF3 was size attenuation. You still control the size of the sprite (max = 64) using glPointSize(). This attenuation feature has been implemented in the GF4s (not the MX).


Strange, since point size attenuation is supported on my TNT2. I already implemented it and it looks great.
quote:
Original post by Jeeky
What threads have I hijacked? Please enumerate them. We had an unfortunate flame war in one, yes, but when have I "hijacked" a thread?

You hijacked this thread after PipoDeClown posted an off-topic remark. However, since you're not the only one here, and since everyone did keep it at a civil level (no shouting and name-calling. Hurray!), I won't mind. My problem was solved before it started.
quote:
Original post by _the_phantom_
Back to the point of the thread and \o/ for GL_ARB_point_sprite extension, i was asking about point sprite support on ATI cards a while back, if we get this in the 1.5 driver from ATI then fantasic

Somewhat bad news here. GL_ARB_point_sprite will be in OpenGL 1.5, but it won't be in the core. It'll remain an extension.
quote:
Original post by The Great Sephiroth
In UT2K3 and the original UT, his card rendered every player model ALL the time, no matter where they were. Yes, it gave him a HUGE advantage, but it looked crappy.

That's not crappy coding on ATI's side, but crappy coding by the developers. They probably used a few vendor-specific tricks to speed things up and failed to do a good job with OpenGL on ATI cards. It's a problem with Unreal's visibility routines drawing models in the wrong order with the wrong settings, not with ATI's OpenGL implementation.

Now please stop the nVidia-ATI wars before I need to move my own thread to the lounge. Thank you kindly!

Sander Maréchal
[Lone Wolves Game Development][RoboBlast][Articles][GD Emporium][Webdesign][E-mail]


[edited by - sander on September 2, 2003 3:03:32 AM]

benjamin bunny    838
quote:
Original post by jeeky
1. My view is that I have had trouble with ATI cards so I would prefer to use nVidia cards, which I have never had trouble with. What is misinformed about my view?

The bit where you jump to conclusions based on your very limited experience of using ATi hardware, and present those as fact.

quote:
2. What threads have I hijacked? Please enumerate them. We had an unfortunate flame war in one, yes, but when have I "hijacked" a thread?

This is the second AFAIK. The other one I linked to above.

quote:
3. Does saying "works for me" fix a bug or solve a problem?
Did I say it did? My intention was not to fix your problem, but to let you know that the bug only seemed to affect to your particular hardware combination.

quote:
4. Does saying "works for me" even apply at all if you have different hardware than I do (Mobility FireGL vs. Radeon)?
See above.

If you need any more issues resolved, you should email me. We better let Sander have his thread back before he puts us on probation.

[edited by - benjamin bunny on September 2, 2003 9:41:19 AM]

_the_phantom_    11250
quote:
Original post by Sander
quote:
Original post by _the_phantom_
Back to the point of the thread and \o/ for GL_ARB_point_sprite extension, i was asking about point sprite support on ATI cards a while back, if we get this in the 1.5 driver from ATI then fantasic

Somewhat bad news here. GL_ARB_point_sprite will be in OpenGL 1.5, but it won't be in the core. It'll remain an extension.



Well, core or non-core, the addition of a point_sprite extension will be a welcome one as, when I last checked, my 9700pro with recent drivers didn't support any kind of point sprite extension at all. So it's a nice step forward and will hopefully lead to it being core in 1.6.

Sander    1332
quote:
Original post by _the_phantom_
...so its a nice step forward and will hopefully lead to it being core in 1.6

Well, I hope not. I hope that they skip 1.6 - 1.9 and head straight for 2.0!
quote:
Original post by benjamin bunny
If you need any more issues resolved, you should email me. We better let Sander have his thread back before he puts us on probation.


Nah, I'm not going to do that. My questions are already answered. If you want the thread you can take it and I'll move it to the lounge. I could also change the thread title to "ATI - nVidia wars, bring your WMDs" so others can join the fray as well.

Sander Maréchal
[Lone Wolves Game Development][RoboBlast][Articles][GD Emporium][Webdesign][E-mail]


[edited by - sander on September 2, 2003 11:29:37 AM]
