L. Spiro

Max Texture Slots in OpenGL


All I want to know is the correct cross-platform/cross-machine way to get the highest texture slot available to me in OpenGL. By this I mean the highest slot that can be passed to ::glActiveTexture() and which can be read by a shader.

Before you say how well documented and trivial this is, allow me to explain why it is not.
The documentation says very clearly that I should be able to pass any value from 0 to (GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS - 1) to ::glActiveTexture().

On one machine, GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS returns 128; however, if I pass any value of 8 or above (8 being the actual number of texture units on that machine), I get GL_INVALID_ENUM, in direct contradiction of the spec.

From here on, to keep things simple: all four of my test machines do in fact have 8 texture units. Putting a texture into any unit above index 7 results in a black texture, as it is no longer readable by the shader (and probably not applied at all).

Machine #1:
GL_MAX_TEXTURE_UNITS = 8 // Correct!

Machine #2:

I don’t have data for the other two machines right now. One is a Mac OS X machine and does not define GL_MAX_TEXTURE_COORDS at all.

On the other hand, for iOS OpenGL ES 2.0, GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS works as expected and returns the correct number of slots available.

What is the deal here?
I am not willing to do min( GL_MAX_TEXTURE_UNITS, GL_MAX_TEXTURE_COORDS ) because that leaves one machine with only 4 units when it actually has 8.
Is the only reliable way to learn how many units there are simply to call ::glActiveTexture() with increasing indices until it produces GL_INVALID_ENUM?
That seems ridiculous. What is the correct way to get the actual number of units available to both ::glActiveTexture() and the shaders?

L. Spiro

GL_MAX_TEXTURE_UNITS is, I believe, the enum that you want.

*Edit: read more of your post.

Search for Max Texture Units. Edited by dpadam450

According to that site, GL_MAX_TEXTURE_IMAGE_UNITS would be the correct choice, but the results are not consistent.
In both cases, I can’t actually sample from texture unit 15 in the fragment shader. Anything above 7 is black.

Which is also inconsistent: DirectX 11 has no problem sampling from unit 15. That confirms I have not made any silly mistakes in the pipeline that artificially prevent reading from texture units 8 and above. A single variable dictates the highest texture unit available, and the only difference between the APIs I support is the number stored in that variable. The rest of the engine is completely API-agnostic, so nothing on the OpenGL side alone could be skewing my results.

16 is the actual number of units on this hardware (machine #1), but the fragment shader can only read up to index 7 in OpenGL.

L. Spiro Edited by L. Spiro

GL_MAX_TEXTURE_UNITS is essentially the fixed-function limit.
GL_MAX_TEXTURE_IMAGE_UNITS is texture-map access from the fragment shader.
GL_MAX_VERTEX_TEXTURE_IMAGE_UNITS is texture-map access from the vertex shader.

So I think you're looking at


and the spec says the larger of the two, minus 1, is the limit for ::glActiveTexture(). Myself, I go by GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS, although I must admit I haven't tried testing the limit.

GL_MAX_TEXTURE_COORDS is the max tex coord sets available to vertex and fragment shader.
GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS is the max that can be accessed between vertex and fragment shaders together.

The language could be clearer; it is pretty confusing. Edited by NumberXaero

I have solved the issue.
My conclusion that I was not doing anything to skew my own results turned out to be wrong, although I still don't know exactly why.
I had a deprecated enum for my engine's maximum texture support, which was used to track redundant texture states; it was supposed to have been replaced by a different enum long ago.
It is used on all platforms, yet it only causes problems in OpenGL. That part I don't understand, but I am getting rid of the enum and switching to the proper one.

This fixes the issue and I am now able to access texture unit 15 as I should in OpenGL.

Thank you for the replies.

L. Spiro
