Max Texture Slots in OpenGL

Started by
3 comments, last by L. Spiro 11 years, 8 months ago
All I want to know is the correct cross-platform/cross-machine way to get the highest texture slot available to me in OpenGL. By this I mean the highest slot that can be passed to ::glActiveTexture() and which can be read by a shader.

Before you say how well documented and trivial this is, allow me to explain why it is not.
The documentation says very clearly that I should be able to pass any value from 0 to (GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS - 1) to ::glActiveTexture().

On one machine, GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS returns 128; however, if I pass any unit index equal to or above 8 (the actual number of texture units available on that machine) I get GL_INVALID_ENUM, in direct contradiction of the spec.

From here on, let me keep things simple by just saying that all 4 of my test machines actually have 8 texture units. Putting a texture into any unit higher than index 7 results in a black texture, as it is no longer readable by the shader (and probably not even applied at all).

Machine #1:
GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS = 32
GL_MAX_TEXTURE_UNITS = 8 // Correct!
GL_MAX_TEXTURE_COORDS = 16
GL_MAX_TEXTURE_IMAGE_UNITS = 16

Machine #2:
GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS = 32
GL_MAX_TEXTURE_UNITS = 4
GL_MAX_TEXTURE_COORDS = 8 // Correct!
GL_MAX_TEXTURE_IMAGE_UNITS = 32

I don’t have data for the other 2 machines right now. One is a Mac running OS X and does not define GL_MAX_TEXTURE_COORDS at all.

On the other hand, for iOS OpenGL ES 2.0, GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS works as expected and returns the correct number of slots available.
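For reference, each of the limits above comes from glGetIntegerv. A minimal sketch of the query; the enum values are copied from glext.h, and the query function is taken as a function pointer purely so the sketch can be exercised without a live GL context (pass the real glGetIntegerv in practice):

```c
/* Enum values as defined in glext.h. */
#define GL_MAX_TEXTURE_UNITS                0x84E2
#define GL_MAX_TEXTURE_COORDS               0x8871
#define GL_MAX_TEXTURE_IMAGE_UNITS          0x8872
#define GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS 0x8B4D

/* Query all four texture-unit limits through a glGetIntegerv-shaped
 * function pointer (pass the real glGetIntegerv when a context is current). */
static void query_texture_limits(void (*get_iv)(unsigned pname, int *params),
                                 int *units, int *coords,
                                 int *image_units, int *combined) {
    get_iv(GL_MAX_TEXTURE_UNITS, units);
    get_iv(GL_MAX_TEXTURE_COORDS, coords);
    get_iv(GL_MAX_TEXTURE_IMAGE_UNITS, image_units);
    get_iv(GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS, combined);
}
```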



What is the deal here?
I am not willing to do min( GL_MAX_TEXTURE_UNITS, GL_MAX_TEXTURE_COORDS ) because that leaves one machine with only 4 units when it actually has 8.
Is the only true way to know how many units there are to simply call ::glActiveTexture() until it throws GL_INVALID_ENUM?
That seems ridiculous. What is the correct way to get the actual number of units available to both ::glActiveTexture() and the shaders?
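For what it’s worth, the brute-force probe described above can at least be written cheaply: walk the units and stop at the first GL_INVALID_ENUM. A sketch, with the two GL calls taken as function pointers purely so the loop can be tested without a live context (pass the real glActiveTexture and glGetError in practice):

```c
#define GL_TEXTURE0     0x84C0
#define GL_NO_ERROR     0
#define GL_INVALID_ENUM 0x0500

/* Return the number of units glActiveTexture() actually accepts,
 * probing units 0..reported_max-1 and stopping at the first error. */
static int probe_active_texture_limit(void (*active_texture)(unsigned unit),
                                      unsigned (*get_error)(void),
                                      int reported_max) {
    int i;
    for (i = 0; i < reported_max; ++i) {
        active_texture(GL_TEXTURE0 + (unsigned)i);
        if (get_error() != GL_NO_ERROR) break;
    }
    active_texture(GL_TEXTURE0);  /* Restore a sane active unit. */
    return i;
}
```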


L. Spiro

I restore Nintendo 64 video-game OSTs into HD! https://www.youtube.com/channel/UCCtX_wedtZ5BoyQBXEhnVZw/playlists?view=1&sort=lad&flow=grid

GL_MAX_TEXTURE_UNITS I believe is the enum that you want.

*edit, read more of your post.

http://www.opengl.org/wiki/Textures_-_more
Search for Max Texture Units

NBA2K, Madden, Maneater, Killing Floor, Sims http://www.pawlowskipinball.com/pinballeternal

According to that site, GL_MAX_TEXTURE_IMAGE_UNITS would be the correct choice, but the results are not consistent.
In both cases, I can’t actually sample from texture unit 15 in the fragment shader. Anything above 7 is black.

Which is also inconsistent: DirectX 11 has no problem sampling from unit 15. This confirms that I have not made any stupid mistake in the pipeline that artificially prevents reading from texture units 8 and above. A single variable dictates the highest texture unit available, and the only difference between the APIs I support is the number put into that variable; the rest of the engine is completely agnostic, so there is no chance that something on just the OpenGL side is skewing my results.

16 is the actual number of units on this hardware (machine #1), but the fragment shader can only read up to index 7 in OpenGL.


L. Spiro


GL_MAX_TEXTURE_UNITS is essentially the fixed-function limit.
GL_MAX_TEXTURE_IMAGE_UNITS is texture-map access from the fragment shader.
And another one:
GL_MAX_VERTEX_TEXTURE_IMAGE_UNITS is texture-map access from the vertex shader.

So I think you’re looking at:

GL_MAX_TEXTURE_COORDS
GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS

and the spec says the larger of the two, minus 1, is the limit for ActiveTexture. Myself, I go by GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS, although I must admit I haven’t tried testing the limit.

GL_MAX_TEXTURE_COORDS is the max tex coord sets available to vertex and fragment shader.
GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS is the max that can be accessed between vertex and fragment shaders together.
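One detail worth spelling out, since it is a classic cause of black textures: glActiveTexture() wants the enum GL_TEXTURE0 + i, while the sampler uniform (glUniform1i) wants the bare index i. A minimal sketch of the index-to-enum mapping, with the usual bind path shown in a comment (texture and uniform names there are hypothetical, and a current context is assumed):

```c
#define GL_TEXTURE0 0x84C0

/* glActiveTexture() takes the enum GL_TEXTURE0 + i, while the sampler
 * uniform set via glUniform1i() takes the bare index i; mixing the two
 * up is a common source of black textures. */
static unsigned texture_unit_enum(int i) {
    return GL_TEXTURE0 + (unsigned)i;
}

/* In a real bind path (context current; tex and sampler_loc valid):
 *   glActiveTexture(texture_unit_enum(i));
 *   glBindTexture(GL_TEXTURE_2D, tex);
 *   glUniform1i(sampler_loc, i);   // bare index, NOT the enum
 */
```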

The language could be clearer; it is pretty confusing.
I have solved the issue.
My conclusion that I was not doing something to screw up my own results was wrong, although I don’t actually know why.
I had a deprecated enum for my own maximum texture support, used to keep track of redundant texture states; it was supposed to be a different enum.
It is used on all platforms, yet it only causes problems in OpenGL. That part I don’t understand, but I am getting rid of that enum and switching to the proper one.

This fixes the issue and I am now able to access texture unit 15 as I should in OpenGL.


Thank you for the replies.


L. Spiro


