Texture Border
Under certain conditions I'm getting GL_INVALID_VALUE when I call glCopyTexSubImage2D() with "(yoffset + height) == h" (where h is the GL_TEXTURE_HEIGHT). The OpenGL Red Book states that this error is returned when "(yoffset + height) > (h - b)" (where b is the GL_TEXTURE_BORDER). Is there a way I can explicitly disable GL_TEXTURE_BORDER?
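For reference, the Red Book condition can be sketched as plain C (the helper name and signature are mine, not part of the GL API):

```c
/* Sketch of the bounds check the Red Book describes for
 * glCopyTexSubImage2D. GL raises GL_INVALID_VALUE when
 * (yoffset + height) > (h - b), where h is GL_TEXTURE_HEIGHT
 * and b is GL_TEXTURE_BORDER. */
int copy_rows_in_bounds(int yoffset, int height, int tex_height, int border)
{
    /* with border == 0, copying up to the full height is allowed */
    return (yoffset + height) <= (tex_height - border);
}
```

So with a border of 0, "(yoffset + height) == h" should be legal; only a nonzero border would make it fail.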
You are setting the border yourself, as the sixth parameter of glTexImage2D() or similar functions. Pass 0 for the border and there will be none!
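A minimal allocation sketch, assuming a current GL context (the function name, texture size, and format here are illustrative, not taken from the poster's code):

```c
#include <GL/gl.h>

/* Allocate a texture with GL_TEXTURE_BORDER = 0. border is the sixth
 * argument of glTexImage2D; passing 0 means the error condition
 * (yoffset + height) > (h - b) reduces to (yoffset + height) > h. */
void create_copy_target(void)
{
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8,
                 2048, 2048, /* border = */ 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);
}
```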
I am passing 0 to the "border" parameter of glTexImage2D. Most settings are the defaults, with the exception of the min/mag filters, which I've set to GL_NEAREST.
Ugh... I just observed another odd behavior, and it has convinced me this is not a border issue after all.
The specific condition is when the window size crosses the next power-of-two threshold; as the window grows past one of these thresholds, the underlying off-screen buffer I render to has to be resized.
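The resize-on-threshold logic described above can be sketched like this (the function name is mine):

```c
/* Round a window dimension up to the next power of two, so the
 * off-screen buffer only needs reallocating when the window crosses
 * one of those thresholds (e.g. 1025 rounds up to 2048). */
unsigned next_pow2(unsigned n)
{
    unsigned p = 1;
    while (p < n)
        p <<= 1;   /* double until p covers n */
    return p;
}
```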
When the destination texture's GL_TEXTURE_HEIGHT is hardcoded to 1024 (a 1024x1024 texture), everything is fine, even when "(yoffset + height) == h". However, when I hardcode the GL_TEXTURE_HEIGHT to 2048 (2048x2048), the error occurs.
My ATI Radeon HD 4650 is supposed to be able to support 8192x8192 textures, though, so this is kinda weird. :|
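One way to check what the driver will actually accept is to query the limit and dry-run the allocation with a proxy texture; this sketch assumes a current GL context (the function name is mine):

```c
#include <GL/gl.h>

/* Query the driver's texture-size limit, then test the exact
 * width/height/format combination with GL_PROXY_TEXTURE_2D. If the
 * proxy's reported width comes back 0, the equivalent real
 * glTexImage2D call would have failed for that format/size. */
void check_texture_support(void)
{
    GLint max_size = 0, proxy_w = 0;
    glGetIntegerv(GL_MAX_TEXTURE_SIZE, &max_size);

    glTexImage2D(GL_PROXY_TEXTURE_2D, 0, GL_RGBA8,
                 2048, 2048, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    glGetTexLevelParameteriv(GL_PROXY_TEXTURE_2D, 0,
                             GL_TEXTURE_WIDTH, &proxy_w);
    /* proxy_w == 0 means 2048x2048 GL_RGBA8 is not supported,
     * regardless of what max_size advertises */
}
```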