GL_MAX_TEXTURE_UNITS is wrong?

Started by
8 comments, last by GreyHound 14 years, 8 months ago
I am using a GTX 260, and when I called glGetIntegerv on GL_MAX_TEXTURE_UNITS, I got back 4. I thought this was a bit low (I expected at least 8 on this card), so I checked the DirectX caps viewer, which reported a maximum of 8. I was wondering whether this is just a case of the OpenGL drivers exposing less, or whether something is wrong with the result. Regardless, I decided to download the latest drivers. Any thoughts on this?

Stupid details.....always get in the way....it's time to write the dwim(x) function ("do what i mean to")
You're looking at the wrong value: GL_MAX_TEXTURE_UNITS describes the maximum number of fixed-function MULTITEXTURING units. To get the maximum number of texture units available to shaders (which is probably what you're after), query GL_MAX_VERTEX_TEXTURE_IMAGE_UNITS, GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS, and GL_MAX_TEXTURE_IMAGE_UNITS instead.
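A minimal sketch of querying all of these limits in one place (this assumes an OpenGL context is already current, e.g. created via GLFW or SDL, since glGetIntegerv is undefined without one; on some platforms the GL 2.0 enums may also require glext.h or a loader such as GLEW):

```c
#include <stdio.h>
#include <GL/gl.h>

/* Call only after an OpenGL context has been made current. */
void print_texture_limits(void)
{
    GLint ff_units = 0, frag_units = 0, vert_units = 0, combined_units = 0;

    glGetIntegerv(GL_MAX_TEXTURE_UNITS, &ff_units);                /* fixed-function; often just 4 */
    glGetIntegerv(GL_MAX_TEXTURE_IMAGE_UNITS, &frag_units);        /* fragment shader samplers */
    glGetIntegerv(GL_MAX_VERTEX_TEXTURE_IMAGE_UNITS, &vert_units); /* vertex shader samplers */
    glGetIntegerv(GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS, &combined_units);

    printf("fixed-function: %d, fragment: %d, vertex: %d, combined: %d\n",
           ff_units, frag_units, vert_units, combined_units);
}
```

On hardware of that era you would typically see 4 for the fixed-function limit and 16 or more for the shader limits, which is exactly the discrepancy described above.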
As Hunts,am said:
glGetIntegerv(GL_MAX_TEXTURE_IMAGE_UNITS_ARB, &max_texture_image_units);

Also, you should expect more than 8: the earliest NVIDIA shader cards support 16, and my card (a generation earlier than yours) supports 32.
There are three different variants of the GTX 260: the original with 64 TMUs (192 shader units), a 216-shader-unit version with 72 TMUs, and a die-shrunk revision of the 216-shader model, also with 72 TMUs.
Hi,

I only recently discovered the answer to this... by accident.
Basically, GL_MAX_TEXTURE_UNITS will always return 4 intentionally!
Instead, use GL_MAX_TEXTURE_IMAGE_UNITS_ARB

This nvidia FAQ has a great explanation:
http://developer.nvidia.com/object/General_FAQ.html#t6


So, gg opengl, you truly do suck.

Quote:Original post by meatcow
Hi,

I only recently discovered the answer to this... by accident.
Basically, GL_MAX_TEXTURE_UNITS will always return 4 intentionally!
Instead, use GL_MAX_TEXTURE_IMAGE_UNITS_ARB

This nvidia FAQ has a great explanation:
http://developer.nvidia.com/object/General_FAQ.html#t6


So, gg opengl, you truly do suck.


I know what you mean! We don't need no backwards compatibility!!1
Quote:Original post by meatcow
So, gg opengl, you truly do suck.


lol
Sig: http://glhlib.sourceforge.net
an open source GLU replacement library. Much more modern than GLU.
float matrix[16], inverse_matrix[16];
glhLoadIdentityf2(matrix);
glhTranslatef2(matrix, 0.0, 0.0, 5.0);
glhRotateAboutXf2(matrix, angleInRadians);
glhScalef2(matrix, 1.0, 1.0, -1.0);
glhQuickInvertMatrixf2(matrix, inverse_matrix);
glUniformMatrix4fv(uniformLocation1, 1, GL_FALSE, matrix);
glUniformMatrix4fv(uniformLocation2, 1, GL_FALSE, inverse_matrix);
Quote:So, gg opengl, you truly do suck.

Yet another great constructive conclusion.

GL_MAX_TEXTURE_UNITS became deprecated with OpenGL 3, so I'm pretty sure the OpenGL authors are fine with the fact that it may return something different from what some developers would "expect" it to return.
New section added to Wiki
http://www.opengl.org/wiki/Textures_-_more

"Max Texture Units" has been added
Quote:Original post by V-man
New section added to Wiki
http://www.opengl.org/wiki/Textures_-_more

"Max Texture Units" has been added


Excellent job, just perfect; that's the exact reason I visited this thread. I wanted to learn how many texture coordinates are available.



This topic is closed to new replies.
