OpenGL: using data outside the texture coordinate range when enabling filtering

Posted by ProtectedMode


I couldn't find anything about my issue online, maybe because I don't know how to express my question.

 

I'm using a texture atlas (a tilemap) so I can draw with several different textures in the same draw call. But when I enable, for example, linear filtering instead of nearest, my textures start to look weird. The filtering seems to take pixels from outside the tile's region of the atlas (the area selected by the texture coordinates)...
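To illustrate, this is roughly how I select a tile inside the atlas (a simplified sketch; tileIndex, tilesPerRow, tileSize and atlasSize are placeholder names, my real code differs):

   // Simplified sketch of the atlas lookup (placeholder names).
   // Each tile occupies a tileSize x tileSize region of an atlasSize x atlasSize atlas.
   // With GL_NEAREST this works; with GL_LINEAR, samples near a tile edge are
   // blended with texels of the neighbouring tile, because the hardware filters
   // across the whole atlas rather than just the tile's sub-rectangle.
   struct TileUV { float u0, v0, u1, v1; };

   TileUV tileCoords(int tileIndex, int tilesPerRow, float tileSize, float atlasSize)
   {
       const int   col   = tileIndex % tilesPerRow;
       const int   row   = tileIndex / tilesPerRow;
       const float scale = tileSize / atlasSize;  // tile extent in [0,1] UV space
       TileUV uv;
       uv.u0 = col * scale;
       uv.v0 = row * scale;
       uv.u1 = uv.u0 + scale;  // this edge touches the next tile's texels
       uv.v1 = uv.v0 + scale;
       return uv;
   }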

 

Is there an easy solution for this that doesn't involve hacking? Maybe an option to make the filtering wrap around within the specified region, i.e. use the texels from the other side of the tile instead of the texels outside its texture-coordinate range?


 


Is there an easy solution for this that doesn't involve hacking?

 

Absolutely. Use an array texture. They've been available since OpenGL 3.0.
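Creation and upload look roughly like this (a minimal sketch; tileW, tileH, layerCount and pixelData are placeholders for your own values):

   // Sketch: allocate a GL_TEXTURE_2D_ARRAY with layerCount slices of
   // tileW x tileH texels each, then upload one image per slice.
   GLuint tex = 0;
   ::glGenTextures(1, &tex);
   ::glBindTexture(GL_TEXTURE_2D_ARRAY, tex);
   ::glTexImage3D(GL_TEXTURE_2D_ARRAY, 0, GL_RGBA8,
                  tileW, tileH, layerCount, 0,
                  GL_RGBA, GL_UNSIGNED_BYTE, nullptr);      // allocate all slices
   for (int layer = 0; layer < layerCount; ++layer)
       ::glTexSubImage3D(GL_TEXTURE_2D_ARRAY, 0,
                         0, 0, layer,                       // offset: x, y, slice
                         tileW, tileH, 1,                   // one slice at a time
                         GL_RGBA, GL_UNSIGNED_BYTE, pixelData[layer]);
   ::glTexParameteri(GL_TEXTURE_2D_ARRAY, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
   ::glTexParameteri(GL_TEXTURE_2D_ARRAY, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

Each slice is filtered in isolation, so GL_LINEAR can no longer bleed into a neighbouring tile, and the wrap modes also apply per slice.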

 

 

Thanks for your reply. :) I hadn't heard of array textures before; I'm pretty new to OpenGL.

 

 

but be aware that if you are limited by the number of hardware texture units, you may still be better off using atlases (perhaps even in conjunction with array textures).

This quote from the page you linked suggests, however, that you're still limited by the number of hardware texture units. I want the possibility of using more than a hundred different textures, but since most GPUs only support 32 or so texture units (according to some pages I found), I doubt this is a solution... Am I wrong?


This quote from the page you linked suggests, however, that you're still limited by the number of hardware texture units. I want the possibility of using more than a hundred different textures, but since most GPUs only support 32 or so texture units (according to some pages I found), I doubt this is a solution... Am I wrong?

A single array texture (GL_TEXTURE_2D_ARRAY in your case) is bound to a single texture unit / sampler. You do not consume additional units just because the array texture has more than one slice.
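In the shader this means a single sampler2DArray on one unit serves all slices. Roughly (a sketch; how the layer index reaches the fragment stage, e.g. as a per-instance attribute, is an assumption about your setup):

   // Fragment shader sketch, stored as a C++ string: one sampler2DArray
   // occupies one texture unit no matter how many slices the texture has.
   const char* fragmentSrc = R"(
       #version 330 core
       uniform sampler2DArray uTiles;  // a single texture unit
       in  vec2  vUV;
       in  float vLayer;               // slice index, assumed to come from the vertex stage
       out vec4  fragColor;
       void main()
       {
           fragColor = texture(uTiles, vec3(vUV, vLayer));
       }
   )";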

 

But note that 1D/2D array textures may have a different maximum size than "ordinary" 1D/2D textures. On my computer, for example, an array texture can be at most 2k by 2k texels with 2k slices, while a 2D texture can be up to 8k by 8k texels. Use the following queries to look up the respective limits:

   GLint intValue = 0;
   ::glGetIntegerv(GL_MAX_TEXTURE_SIZE, &intValue);           // max width/height of 1D/2D (array) textures
   ::glGetIntegerv(GL_MAX_RECTANGLE_TEXTURE_SIZE, &intValue); // max size of rectangle textures
   ::glGetIntegerv(GL_MAX_3D_TEXTURE_SIZE, &intValue);        // max width/height/depth of 3D textures
   ::glGetIntegerv(GL_MAX_ARRAY_TEXTURE_LAYERS, &intValue);   // max number of slices in an array texture
Edited by haegarr
