

GL_TEXTURE_3D or GL_TEXTURE_2D?


Old topic!
Guest, the last post of this topic is over 60 days old and at this point you may not reply in this topic. If you wish to continue this conversation start a new topic.

6 replies to this topic

#1 Retsu90   Members   -  Reputation: 208


Posted 09 July 2012 - 11:43 AM

Hi!
I'm developing a game that manages a lot of textures per frame. To avoid so many glBindTexture calls, I realized these textures could be unified into a single big texture, because many of them share the same size. Now the theoretical problem:
1) Using a 2D texture: if I have 8 256x256 textures, I can pack them into a single 256x2048 texture. This can be a problem because some video cards don't support big textures (mine supports 8192x8192). If an object has only one texture, its texture stays 256x256 - pretty simple.
2) Using a 3D texture: if I have 8 256x256 textures, I can pack them into a single 256x256x8 texture. I think there are no size problems this way. But if an object has only one texture, the 3D texture can't be 256x256x1, because the official OpenGL site tells me that a texture can't have odd dimensions, so I'd need to put it in a 2D texture, call glEnable(GL_TEXTURE_2D), apply it to my 3D model, and call glEnable(GL_TEXTURE_3D) again when I next need the 3D texture (I read somewhere that you must call glEnable every time you switch between 1D, 2D and 3D textures).
3) Keep using my current method: a lot of 256x256 textures, a lot of glBindTexture calls, glUniform to pass the texture unit to the fragment shader, and sacrifice performance to avoid the problems in points 1 and 2.

What am I supposed to do in this situation? Are there other simple ways to improve the engine? Which is the best choice?
Looking forward to your replies.


#2 mark ds   Members   -  Reputation: 1482


Posted 09 July 2012 - 01:44 PM

If you're using OpenGL 3.0 or above, 'texture arrays' (search for them) are designed specifically to do this. They're a bit like 3D textures, except that filtering only occurs on the layer you are using.

#3 Retsu90   Members   -  Reputation: 208


Posted 09 July 2012 - 02:21 PM

mark ds, on 09 July 2012 - 01:44 PM, said: "If you're using OpenGL 3.0 or above, 'texture arrays' (search for them) are designed specifically to do this. [...]"

That page, http://www.opengl.org/wiki/Array_Texture, describes texture2DArray for exactly this purpose, but isn't z = (1.0f/texDepth*index) the same thing, without breaking compatibility with OpenGL < 3.0? And the problem of 3D textures needing at least 2px of depth remains...

#4 mark ds   Members   -  Reputation: 1482


Posted 09 July 2012 - 02:38 PM

Don't forget that you can't mipmap your 2D textures if stored in 3D.

Edit - unless you want to manually calculate the correct mipmap by interpolating across two 3D textures, that is.

Also, you can have one layer in a 3D texture, but depending on your hardware, the depth may need to be a power of 2.

What hardware are you running on?

Edited by mark ds, 09 July 2012 - 02:42 PM.


#5 Retsu90   Members   -  Reputation: 208


Posted 09 July 2012 - 02:54 PM

mark ds, on 09 July 2012 - 02:38 PM, said: "Don't forget that you can't mipmap your 2D textures if stored in 3D. [...] What hardware are you running on?"


I'm using an ATI Radeon 4870 and it supports textures of any size up to 8192x8192. That's fine, but I want to be sure that compatibility with older video cards won't break the executable through unportable code (for example, I had to rewrite part of my shaders because my video card supports bitwise operations but the Intel HD 3000 in my notebook won't compile them). I'm planning to finish the core in OpenGL and then port the code to OpenGL ES, and I want to be sure that most things will remain unchanged.

#6 mhagain   Crossbones+   -  Reputation: 8275


Posted 09 July 2012 - 06:21 PM

Texture arrays would be the preferred route, yes, but even if you can't use them on account of your minimum specs, it's not all bad news.

Firstly, older hardware isn't that crap - really. The last desktop video card that could only handle 256x256 textures was a 3dfx back in the 1990s - I'm reasonably certain that you don't want to aim that low!

A more reasonable starting point would be somewhere in the GL 1.5 to 2.x class, which more or less guarantees you a minimum of 2048x2048 (there's one Intel in this class that only does 1024x1024, but I don't know of any other). You say that you're using shaders so that puts you in the guaranteed 2048x2048 bracket.

If you're still in doubt, a good place to jump off from might be this OpenGL capabilities database: http://feedback.wildfiregames.com/report/opengl/ - here's the page for GL_MAX_TEXTURE_SIZE: http://feedback.wildfiregames.com/report/opengl/feature/GL_MAX_TEXTURE_SIZE.

You'll see that ~2% of their listed hardware supports less than 2048x2048. Jumping down to the actual cards (http://feedback.wildfiregames.com/report/opengl/feature/GL_MAX_TEXTURE_SIZE#1024) you'll see that a huge majority (495 out of 553) of this 2% are "Microsoft GDI Generic" - in other words they don't even have an OpenGL driver installed. Not supporting 2048x2048 textures is going to be the least of their problems. So you can quite confidently set that as a minimum requirement, and be certain to have 100% coverage of the end-users you care about.

For mobile platforms I'm not sure how things work out, so you can probably run with a hybrid solution. Instead of restricting yourself to a choice between a single 2048x2048 texture or 64 256x256 textures, why not create 4 1024x1024 textures? You'll still get much of the benefit of reduced draw calls, but still also fit within the texture budget of the hardware. Likewise, if you've determined that the hardware only supports 512x512 textures then create 16 textures at this resolution.

Specifically for 3D textures as a solution to this, there are some things to watch out for. The max 3D texture size may be smaller than the max 2D size - I don't think that OpenGL specifies this, but it can and does happen. Also, you need to be careful of your filtering modes - basically, set up any kind of linear filter and you'll be filtering between different slices in the 3D texture, which is not what you want in this case (a texture array wouldn't have this problem) - it's GL_NEAREST all the way here. Finally, and only if you're really being ultra-conservative, 3D textures are an OpenGL 1.2 feature, so they're not going to be available on some of the more prehistoric hardware (which nobody really has any more so it's a non-issue; just mentioning for info). Are 3D textures available on mobile platforms? Don't know - someone else will have to tell you that.



#7 Retsu90   Members   -  Reputation: 208


Posted 11 July 2012 - 04:43 AM

mhagain, on 09 July 2012 - 06:21 PM, said: "Texture arrays would be the preferred route, yes, but even if you can't use them on account of your minimum specs, it's not all bad news. [...] Are 3D textures available on mobile platforms? Don't know - someone else will have to tell you that."


You opened my eyes.
I'd been searching for a website like that! It's interesting how much was already supported by 10-year-old cards. Maybe I care too much about backwards compatibility.
Well, my framework is for a 2D port where I already have almost all the graphic resources, placed on 256x256 textures. Another guy is working on double-size assets, so the final textures will be 512x512 (but the player can choose to play in the old Megadrive style, which internally uses the 256x256 textures), so I need to be sure that even with quad-size textures my framework will still run on graphics cards with a 2048x2048 maximum. I chose to grow the texture only in height because I need to recreate the texture every 2 frames, and rather than re-pack 8 256x256 textures into a single 1024x1024 every time, I prefer to stack them into a 256x2048 texture.
GL_EXT_texture_array seems to be supported only by OpenGL 3+ and not by OpenGL ES, so I decided not to use it.
GL_ARB_texture_non_power_of_two is supported by 91% of video cards, so I've decided to use 3D textures with a depth of 1 pixel when an object has only one texture. A lot of current smartphones also support GL_OES_texture_npot.
The hybrid solution is a good idea, but at this point 3D textures look like the best solution for high compatibility. As I replied to mark ds, can't z = (1.0f/texDepth*index) be a replacement for texture2DArray? In any case I'm using GL_NEAREST on every texture; I don't like filters on a 2D game =P and GL_LINEAR breaks CLUT textures.



