iNsAn1tY

OpenGL Non-power-of-two textures point blank refusing to work


Recommended Posts

I'm at a tear-your-hair-out stage with these goddamn texture rectangles. OpenGL absolutely refuses to use them. I specify a texture size like (512, 513) as a test, and bam, a nice helpful "invalid operation" error. Every time, with any combination of internal format, format, and type supplied to glTexImage2D. I'm specifying GL_TEXTURE_RECTANGLE_ARB to glTexImage2D, and I'm running a GeForce 7800 GTX with the latest drivers.

The most annoying thing of all is that I've found an NVIDIA demo which uses GL_TEXTURE_RECTANGLE_NV (which I know is an alias for the ARB extension; OpenGL Extension Viewer tells me). And it works, for no good reason. I even tried specifying the same parameters in my application, and it wasn't having any of it. Any idea at all what I'm doing wrong? Here's some code:
glBindTexture( meTarget, muID );

glTexImage2D( meTarget,             // GL_TEXTURE_BINDING_RECTANGLE_ARB
              0,
              lTD.meInternalFormat, // GL_RGBA8 (tried GL_RGB like NVIDIA example)
              lTD.mDimensions.x,    // 512
              lTD.mDimensions.y,    // 513
              0,
              lTD.meFormat,         // GL_RGBA (tried GL_RGB like NVIDIA example)
              lTD.meType,           // GL_UNSIGNED_BYTE (tried GL_FLOAT like NVIDIA example)
              0 );




All of this works fine when you specify GL_TEXTURE_2D and change 513 back to 512. I'm beginning to suspect problems with the driver. I also get the invalid operation error when I specify GL_TEXTURE_2D with a texture height of 513, even though my card supports the ARB non-power-of-two extension. It's ridiculous. Thanks in advance for any replies. [Edited by - iNsAn1tY on May 2, 2007 11:15:14 AM]

If one works and one doesn't, but the parameters are the same, then something else is wrong. How about the rendering context? Is it activated at the time you upload the texture?

edit: Never mind. You said it worked for power of two regular texture, so can't be it...

Never assume non-power-of-two textures will work. Out of habit you should avoid them; sticking with power-of-two textures is both faster and more widely supported.
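If you do fall back to power-of-two textures, a 513-pixel dimension has to be padded up to 1024. A small helper for that (a sketch, not from the original posts; `next_pow2` is a made-up name) uses the classic bit-smearing trick:

```c
/* Round v up to the next power of two (for 32-bit unsigned values).
 * Propagates the highest set bit into every lower bit, then adds 1.
 * Exact powers of two map to themselves because of the initial v--. */
static unsigned int next_pow2(unsigned int v)
{
    if (v == 0)
        return 1;
    v--;
    v |= v >> 1;
    v |= v >> 2;
    v |= v >> 4;
    v |= v >> 8;
    v |= v >> 16;
    return v + 1;
}
```

So a 512x513 image would be uploaded into a 512x1024 texture, with the texture coordinates scaled to cover only the used region.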

This is mostly guessing, but it might work. I guess you're using Windows. Use a library like GLEW to load all the extensions, so that the non_power_of_two extension is loaded, and try again. This shouldn't make any difference, because you're not calling any new functions, but since Windows ships a really old version of OpenGL (which shouldn't matter either, since your driver is new), it might be the problem.

Even though this is a very rare cause, you never know :P

I would expect this kind of ridiculousness from ATI's OpenGL drivers, but nVidia's? What is the world coming to? [headshake]

What I would do is grab a program like gDEBugger (free 30-day trial), GLExpert, GLIntercept, NVPerfKit, etc., and see if you can't get some more information on why the call is failing. There's absolutely nothing wrong with the code you posted, but it's possible that OpenGL might be in some crazy state. I feel your pain though; I just got done with a major OpenGL project and I don't have any hair left to pull!

That's bizarre, I've never encountered any problems with NPT textures on nvidia hardware.

I find it interesting that it works with the NV demo, but not with your own code. There must be a difference somewhere, probably in the current OpenGL state, as others have mentioned.

Have you tried isolating the NPT texture creation call, maybe by placing it directly after the context creation? If that works (and it should), then try moving it around until it doesn't work anymore. Even though there shouldn't be any, there definitely are a few GL calls that can put OpenGL in an undefined state, especially when working with GLSL.

And while this may sound silly, make 100% sure that the error state is clean before calling glTexImage, maybe by just calling glGetError prior to teximage. I once tried to hunt down a 'driver bug', and finally realized (by using gDEBugger) that the API call I thought had triggered the error wasn't actually the one...
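Draining the error queue as suggested above might look like the sketch below. glGetError returns queued errors one at a time until it reports GL_NO_ERROR; the numeric values are the standard ones from <GL/gl.h>, and `gl_error_name` is a made-up helper for readable logs, not a GL entry point:

```c
/* Error-code values as defined in <GL/gl.h>. */
#define GL_NO_ERROR          0x0000
#define GL_INVALID_ENUM      0x0500
#define GL_INVALID_VALUE     0x0501
#define GL_INVALID_OPERATION 0x0502
#define GL_OUT_OF_MEMORY     0x0505

/* Map a glGetError result to a printable name (hypothetical helper). */
static const char *gl_error_name(unsigned int err)
{
    switch (err) {
    case GL_NO_ERROR:          return "GL_NO_ERROR";
    case GL_INVALID_ENUM:      return "GL_INVALID_ENUM";
    case GL_INVALID_VALUE:     return "GL_INVALID_VALUE";
    case GL_INVALID_OPERATION: return "GL_INVALID_OPERATION";
    case GL_OUT_OF_MEMORY:     return "GL_OUT_OF_MEMORY";
    default:                   return "unknown GL error";
    }
}

/* In the real application, drain stale errors *before* the call under
 * suspicion, then check again right after it:
 *
 *   GLenum err;
 *   while ((err = glGetError()) != GL_NO_ERROR)
 *       printf("stale error before glTexImage2D: %s\n", gl_error_name(err));
 *   glTexImage2D(...);
 *   if ((err = glGetError()) != GL_NO_ERROR)
 *       printf("glTexImage2D raised: %s\n", gl_error_name(err));
 */
```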

Quote:

This is mostly guessing, but it might work. I guess youre using windows, use some lib like glew to load all the extensions, so that the non_power_of_two extension is loaded, and try now, however, this shouldnt make any difference, because youre not using any new methods, but since windows is using a really old version of opengl (shouldnt make any difference either, since your driver is new) it might be causing the problem.

The NPT extensions don't have to be 'loaded', they don't expose new entry points. They're just a bunch of enumerants. The ARB_texture_non_power_of_two extension doesn't define anything at all, it's just a capability flag.
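Since the extension is just a capability flag, the whole check boils down to finding its name in the string returned by glGetString(GL_EXTENSIONS). A sketch of a whole-token matcher (`has_extension` is a hypothetical helper, not a GL call); a naive strstr check is a classic bug here, because one extension name can be a prefix of another:

```c
#include <string.h>

/* Check for a token like "GL_ARB_texture_non_power_of_two" in a
 * space-separated extension string. Matches whole tokens only, so
 * "GL_ARB_texture" does not falsely match "GL_ARB_texture_rectangle". */
static int has_extension(const char *ext_string, const char *name)
{
    size_t len = strlen(name);
    const char *p = ext_string;

    while ((p = strstr(p, name)) != NULL) {
        int starts = (p == ext_string) || (p[-1] == ' ');
        int ends   = (p[len] == '\0') || (p[len] == ' ');
        if (starts && ends)
            return 1;
        p += len;   /* skip past the partial match and keep searching */
    }
    return 0;
}
```

In the real application you would pass `(const char *)glGetString(GL_EXTENSIONS)` as the first argument.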

GL_TEXTURE_BINDING_RECTANGLE_ARB? Shouldn't that just be GL_TEXTURE_RECTANGLE_ARB?

Quote:
Original post by Yann L
Quote:

This is mostly guessing, but it might work. I guess you're using Windows. Use a library like GLEW to load all the extensions, so that the non_power_of_two extension is loaded, and try again. This shouldn't make any difference, because you're not calling any new functions, but since Windows ships a really old version of OpenGL (which shouldn't matter either, since your driver is new), it might be the problem.

The NPT extensions don't have to be 'loaded', they don't expose new entry points. They're just a bunch of enumerants. The ARB_texture_non_power_of_two extension doesn't define anything at all, it's just a capability flag.


Yeah, you're right, that's why I said I was only guessing :)

Long shot, but what is the type of meTarget? GL_TEXTURE_RECTANGLE_ARB (which you should be using, not GL_TEXTURE_BINDING_RECTANGLE_ARB) has a value of 0x84F5 and GL_TEXTURE_2D has a value of 0xDE1. If you're using shorts then you won't be able to store the GL_TEXTURE_RECTANGLE_ARB value correctly. meTarget should ideally be GLuint.

Perhaps try replacing the glTexImage2D parameters with the actual values you want instead of the variables in case that is where the problem is coming from.
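To make the overflow concrete: 0x84F5 is 34037, which doesn't fit in a signed 16-bit short (max 32767). This sketch shows what a GL call would actually receive after the value round-trips through a short, assuming the usual two's-complement platform with 32-bit int (`target_after_short_storage` is a made-up name for illustration):

```c
/* GL_TEXTURE_RECTANGLE_ARB = 0x84F5 = 34037 overflows a signed short.
 * When the stored value is widened back to GLenum (an unsigned int in
 * gl.h), it sign-extends into a huge, invalid target enum. */
static unsigned int target_after_short_storage(void)
{
    short stored = (short)0x84F5;   /* overflows: becomes negative */
    return (unsigned int)stored;    /* sign-extends: no longer 0x84F5 */
}
```

On typical platforms the result is 0xFFFF84F5 rather than 0x84F5, which OpenGL would reject outright.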
