RPTD

OpenGL BindProgram deprecates Enable/Disable(GL_TEXTURE_xxx)?


Trying to find some concise information about GL_MAX_TEXTURE_UNITS, GL_MAX_TEXTURE_COORDS and GL_MAX_TEXTURE_IMAGE_UNITS, I stumbled across this FAQ on nVidia: http://developer.nvidia.com/object/General_FAQ.html#t6 . There they mention the following:
Quote:
Incidentally, the classification above also tells you that all the calls corresponding to GL_MAX_TEXTURE_UNITS are useless when using fragment programs and actually ignored by the driver (so, making those calls has no adverse performance impact). For example, you don't need to call glEnable/glDisable(GL_TEXTURE_xxx) since the texture target is passed as a parameter to the TEX/TXP/etc instructions called inside the fragment program. Making those calls will actually generate errors if the active texture index is above GL_MAX_TEXTURE_UNITS. glBindProgramARB is the one doing all the work of state configuration and texture enabling/disabling. One glBindProgramARB basically replaces a dozen or more of glActiveTexture, glEnable/glDisable, and glTexEnv calls.
So if I understand this correctly, you just bind textures to units using BindTexture, without requiring Enable( TEXTURE_2D ) anymore? Is this just some nVidia idea, or is it mandated by the OpenGL specs? I can't remember ATI liking this behavior. Some clarification would be nice.

Yes, you are correct. When you bind a texture, you bind it to the currently active texture unit. The texture unit number is set as a sampler uniform in the GLSL shader. There is no need to enable/disable GL_TEXTURE_xD; that is for the fixed-function pipeline only. I had no problems with this "feature" on ATI cards.

It is similar to how you don't need glEnable/Disable(GL_LIGHTING) if you are using GLSL shaders. You either perform lighting calculations in the shader or you don't; the fixed-function boolean switch GL_LIGHTING doesn't influence it.
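A minimal sketch of the pattern described above (the program handle, texture name and the sampler uniform "uDiffuseMap" are illustrative, not from the original post):

glUseProgram( program );                   // GLSL program containing the sampler
glActiveTexture( GL_TEXTURE1 );            // select texture unit 1
glBindTexture( GL_TEXTURE_2D, diffuseTex ); // bind texture object to that unit
// pass the unit index (1), NOT the texture object id:
glUniform1i( glGetUniformLocation( program, "uDiffuseMap" ), 1 );

Note that glUniform1i takes the texture unit index, not the texture object name; mixing these two up is a common source of black textures.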

Just asking since my ATI card did some really strange things if I did not properly glEnable the right texture target ( texture 2D or some cube map face ).

Quote:
Original post by RPTD
Just asking since my ATI card did some really strange things if I did not properly glEnable the right texture target ( texture 2D or some cube map face ).
Beware that if you mix shaders with fixed functionality in the same application (for instance, with a fixed-function GUI library), then you still need to set all the texture state correctly for the fixed-function section of the program.

Usually that would result in code like this before rendering your GUI:
glUseProgram( 0 );                    // back to fixed function (for ARB programs, glDisable(GL_FRAGMENT_PROGRAM_ARB))
glActiveTexture( GL_TEXTURE0 );
glClientActiveTexture( GL_TEXTURE0 );
glDisable( GL_TEXTURE_3D );
glEnable( GL_TEXTURE_2D );

Quote:
Original post by RPTD
Just asking since my ATI card did some really strange things if I did not properly glEnable the right texture target ( texture 2D or some cube map face ).


I haven't had that problem. I run the same code on nvidia and ati, and everything is done with shaders. I don't use glLight, glEnable(GL_TEXTURE_2D), glMatrixMode, glFrustum, glOrtho, glRotate, glTexEnv or many other classical things.

Various problems occurred, among others these two:

1) Attach a 2D texture, then use a shader, then detach the 2D texture and attach a cube texture, then run another shader => GL_INVALID_OPERATION (with glEnable and company it works).

2) Attach a depth texture without a 2D texture set => garbled shadow maps (with glEnable and company it works).
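One defensive pattern for case 1 (an assumption about the driver behavior described above, not something the GL spec requires; the texture and shader names are illustrative) is to explicitly unbind the old target before binding the new one on the same unit:

glActiveTexture( GL_TEXTURE0 );
glBindTexture( GL_TEXTURE_2D, 0 );             // detach the 2D texture first
glBindTexture( GL_TEXTURE_CUBE_MAP, cubeTex ); // then attach the cube map
glUseProgram( cubeShader );                    // run the second shader

This avoids leaving two targets bound on one unit at the same time, which some drivers reportedly mishandled.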
