Tsumuji

OpenGL Question about OGL2 and extensions.



Since OpenGL 2.0 incorporated shaders into the core, why do I still need to load extensions to use them? I'm using GLee here to load everything automatically, but if possible I want to drop GLee from my project.

I am afraid you are going to have to continue using GLee for a long time yet (unless you are on a Mac).
The problem here is that even though OpenGL 2.0 incorporated most of the common extensions into the core, Microsoft's OpenGL lib is still version 1.1, so you have to load the newer features as function pointers, the same way you load extensions.
AFAIK, the only platform that currently ships a complete OpenGL 2.0 implementation (i.e. the full spec in the core lib) is Apple's OS X (10.4+).
And it seems that Microsoft is not going to upgrade all the way to 2.0 even with Vista.

Quote:
Original post by Tsumuji
but I use linux. Forgot to mention...

Well, there it depends on your distro, but I can't think of any with 2.0 in core offhand.

Quote:
Original post by Tsumuji
but I use linux. Forgot to mention...

1.3 is the last version in the core libs in Nvidia's release 80 drivers, IIRC. Don't ask me why, for the simple reason that I don't know [grin]. It could be because they don't have a defined behaviour when the function is not available (say on a lesser card) but I admit that's really lame.

Quote:
Original post by deavik
It could be because they don't have a defined behaviour when the function is not available (say on a lesser card) but I admit that's really lame.

Even with GL 2.0 in core, you still have to check the extensions string for many things; it is just that the function pointers are already linked in if the feature is available. I have a feeling that calling a 2.0 function for an extension that is not available on the current hardware is 'undefined'.

Quote:
Original post by swiftcoder
Quote:
Original post by deavik
It could be because they don't have a defined behaviour when the function is not available (say on a lesser card) but I admit that's really lame.

Even with GL 2.0 in core, you still have to check the extensions string for many things; it is just that the function pointers are already linked in if the feature is available. I have a feeling that calling a 2.0 function for an extension that is not available on the current hardware is 'undefined'.

I was thinking about what I wrote after I posted that message, and actually it's not unprecedented to have a function pointer on hardware that isn't capable. Any respectable Nvidia card can use their latest drivers (and thus acquire all the function pointers) without having the hardware capability. Calling such a function usually sets an INVALID_OPERATION error and the function call is ignored.

Another example: because the 2.0 function pointers are available on a GeForce2, you can call glCompileShader for a fragment shader, and glLinkProgram. IIRC it doesn't link and LINK_STATUS is set to FALSE.

So you see, they already have ways to circumvent this "undefined behavior". Having said that, there must be a very good reason the function pointers for newer functions aren't set by default, because many people are bound to have asked before this. I, for one, would like very much to know ...

Quote:
Another example: because the 2.0 function pointers are available on a GeForce2, you can call glCompileShader for a fragment shader, and glLinkProgram. IIRC it doesn't link and LINK_STATUS is set to FALSE.


I thought 2.0 wasn't available on Gf2
Which drivers are you talking about?

Quote:
Original post by V-man
Quote:
Another example: because the 2.0 function pointers are available on a GeForce2, you can call glCompileShader for a fragment shader, and glLinkProgram. IIRC it doesn't link and LINK_STATUS is set to FALSE.


I thought 2.0 wasn't available on Gf2
Which drivers are you talking about?

2.0 isn't reported in the drivers because of hardware constraints (my Geforce4MX reports 1.5.4 with Forceware 84.21), but the function pointers for all 2.0 functions are available (i.e. GetProcAddress returns a non-null address). However, as I said, the functions don't work if the hardware isn't capable, often setting an INVALID_OPERATION error.
