BindProgram deprecates Enable/Disable(GL_TEXTURE_xxx)?

Started by
5 comments, last by RPTD 14 years, 10 months ago
Trying to find some concise information about GL_MAX_TEXTURE_UNITS, GL_MAX_TEXTURE_COORDS and GL_MAX_TEXTURE_IMAGE_UNITS, I stumbled across this FAQ on nVidia: http://developer.nvidia.com/object/General_FAQ.html#t6 . There they mention the following:
Quote:Incidentally, the classification above also tells you that all the calls corresponding to GL_MAX_TEXTURE_UNITS are useless when using fragment programs and actually ignored by the driver (so, making those calls has no adverse performance impact). For example, you don't need to call glEnable/glDisable(GL_TEXTURE_xxx) since the texture target is passed as a parameter to the TEX/TXP/etc instructions called inside the fragment program. Making those calls will actually generate errors if the active texture index is above GL_MAX_TEXTURE_UNITS. glBindProgramARB is the one doing all the work of state configuration and texture enabling/disabling. One glBindProgramARB basically replaces a dozen or more of glActiveTexture, glEnable/glDisable, and glTexEnv calls.
So if I get this correctly, you just bind textures to units using BindTexture and no longer need to call Enable( TEXTURE_2D )? Is this just some nVidia idea, or is it actually given by the OpenGL specs? I cannot remember ATI liking this behavior. Some clarification would be nice.

Life's like a Hydra... cut off one problem just to have two more popping out.
Leader and Coder: Project Epsylon | Drag[en]gine Game Engine

Yes, you are correct. When you bind a texture, you bind it to the active texture unit. The texture unit number is set as the sampler uniform in the GLSL shader. There is no need to enable/disable GL_TEXTURE_xD; that is for fixed functionality only. I had no problems with this "feature" on ATI cards.

It is similar to how you don't need glEnable/Disable(GL_LIGHTING) when using GLSL shaders: you either perform lighting calculations in the shader or you don't. The fixed-function boolean switch GL_LIGHTING doesn't influence it.
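A minimal sketch of the shader path described above, assuming a linked GLSL `program` and two existing texture objects `diffuseTex` and `cubeTex` (hypothetical names; a live GL context is required):

```c
/* Bind textures to units with glActiveTexture/glBindTexture and tell
 * each GLSL sampler which unit to read from. Note that no
 * glEnable(GL_TEXTURE_2D) or glEnable(GL_TEXTURE_CUBE_MAP) appears. */
glUseProgram(program);

/* unit 0: a 2D texture, read by "uniform sampler2D diffuseMap" */
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, diffuseTex);
glUniform1i(glGetUniformLocation(program, "diffuseMap"), 0);

/* unit 1: a cube map, read by "uniform samplerCube envMap" */
glActiveTexture(GL_TEXTURE1);
glBindTexture(GL_TEXTURE_CUBE_MAP, cubeTex);
glUniform1i(glGetUniformLocation(program, "envMap"), 1);
```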

http://www.opengl.org/wiki/GLSL_:_common_mistakes
Sig: http://glhlib.sourceforge.net
an open source GLU replacement library. Much more modern than GLU.
float matrix[16], inverse_matrix[16];
glhLoadIdentityf2(matrix);
glhTranslatef2(matrix, 0.0, 0.0, 5.0);
glhRotateAboutXf2(matrix, angleInRadians);
glhScalef2(matrix, 1.0, 1.0, -1.0);
glhQuickInvertMatrixf2(matrix, inverse_matrix);
glUniformMatrix4fv(uniformLocation1, 1, GL_FALSE, matrix);
glUniformMatrix4fv(uniformLocation2, 1, GL_FALSE, inverse_matrix);
Just asking since my ATI card did some really strange things if I did not properly glEnable the right texture target ( texture 2D or some cube map face ).


Quote:Original post by RPTD
Just asking since my ATI card did some really strange things if I did not properly glEnable the right texture target ( texture 2D or some cube map face ).
Beware that if you mix shaders with fixed functionality in the same application (for instance, with a fixed-function GUI library), you still need to set all the texture state correctly for the fixed-function section of the program.

Usually that results in code like this before rendering your GUI:
glUseProgram(0);
glActiveTexture(GL_TEXTURE0);
glClientActiveTexture(GL_TEXTURE0);
glDisable(GL_TEXTURE_3D);
glEnable(GL_TEXTURE_2D);

Tristam MacDonald. Ex-BigTech Software Engineer. Future farmer. [https://trist.am]

Quote:Original post by RPTD
Just asking since my ATI card did some really strange things if I did not properly glEnable the right texture target ( texture 2D or some cube map face ).


I haven't had that problem. I run the same code on nvidia and ati, and everything is done with shaders. I don't use glLight, glEnable(GL_TEXTURE_2D), glMatrixMode, glFrustum, glOrtho, glRotate, glTexEnv, or many other classical calls.
Various problems occurred, among others these two:

1) Attaching a 2D texture, running a shader, then detaching the 2D texture, attaching a cube texture, and running another shader => GL_INVALID_OPERATION ( with glEnable and company it works )

2) Attaching a depth texture without a 2D texture set => garbled shadow maps ( with glEnable and company it works )
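For reference, case (1) corresponds roughly to this sequence (a hedged sketch; `firstProgram`, `secondProgram`, `tex2D` and `texCube` are hypothetical names, and a live GL context is assumed):

```c
/* Switch one texture unit from a 2D texture to a cube map between
 * two shader passes. Per the post above, on that ATI driver this
 * raised GL_INVALID_OPERATION unless the matching glEnable/glDisable
 * calls were also made for each target. */
glActiveTexture(GL_TEXTURE0);

glBindTexture(GL_TEXTURE_2D, tex2D);         /* attach 2D texture  */
glUseProgram(firstProgram);
/* ... draw first pass ... */

glBindTexture(GL_TEXTURE_2D, 0);             /* detach 2D texture  */
glBindTexture(GL_TEXTURE_CUBE_MAP, texCube); /* attach cube map    */
glUseProgram(secondProgram);
/* ... draw second pass ... */
```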


