glTexSubImage2D

5 comments, last by kusma 18 years, 10 months ago
I'm passing this function a set of pixels, 800*600*3. The texture is also 800*600*3. When the texture is this size, the function returns an error, but it works when the texture is 1600*1200*3. Is there any way to get this function working with pixel data the same size as the texture? Thank you
What error does the function return?
How are you calling it?
I'm calling it like this:-

glTexSubImage2D(GL_TEXTURE_2D, nMipMap, nXOff, nYOff, w, h, dwFormat, GL_UNSIGNED_BYTE, pPixels);

The error I get is this:-

GLI | GL ERROR - Function glTexSubImage2D(GL_TEXTURE_2D,0,0,0,800,600,GL_RGB,GL_UNSIGNED_BYTE,0x4900040) generated error GL_INVALID_VALUE
Have you first called glTexImage2D() on the texture object that's currently bound when you call glTexSubImage2D()? And is that texture big enough to fit the 800x600 image?
enum Bool { True, False, FileNotFound };
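
For what it's worth, the required order looks like this (a minimal sketch, not the poster's actual code; the 512x512 size is just illustrative):

/* glTexImage2D() establishes the bound texture's size and format;
   glTexSubImage2D() may then update any rectangle that fits inside it.
   A rectangle extending past the allocated size gives GL_INVALID_VALUE. */
glBindTexture(GL_TEXTURE_2D, tex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 512, 512, 0,
             GL_RGB, GL_UNSIGNED_BYTE, NULL);       /* allocate 512x512  */
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 512, 512,
                GL_RGB, GL_UNSIGNED_BYTE, pixels);  /* OK: fits          */
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 800, 600,
                GL_RGB, GL_UNSIGNED_BYTE, pixels);  /* GL_INVALID_VALUE  */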
Yes. The texture is 800*600. But as I say, if I double the size of the texture, it works fine. I just don't want to double it.
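
A quick way to double-check what size the driver actually allocated for the bound texture (a diagnostic sketch, assuming mipmap level 0):

GLint texW = 0, texH = 0;
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_WIDTH,  &texW);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_HEIGHT, &texH);
/* glTexSubImage2D() fails with GL_INVALID_VALUE whenever
   xoffset + width > texW or yoffset + height > texH. */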
Correct me if I'm wrong, but you need a power-of-two (POT) texture to use glTexSubImage2D().
If I'm not mistaken, you need a texture with power-of-two dimensions for the GL_TEXTURE_2D target. GL_TEXTURE_RECTANGLE_ARB can, however, use 800x600 textures.
You could always use a 1024x1024 texture and only use the lower-left 800x600 texels, as in the sketch below.
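
That padding approach could look something like this (a minimal sketch, assuming tightly packed GL_RGB data and no mipmapping; pPixels is the 800x600 image from above):

/* Allocate a POT texture without uploading any data yet (NULL pixels).
   No mipmaps are uploaded, so pick a non-mipmap minification filter. */
GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 1024, 1024, 0,
             GL_RGB, GL_UNSIGNED_BYTE, NULL);

/* Upload the 800x600 image into the lower-left corner. */
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 800, 600,
                GL_RGB, GL_UNSIGNED_BYTE, pPixels);

/* When drawing, scale the texture coordinates so only the used
   region is sampled: s in [0, 800/1024], t in [0, 600/1024]. */
const float sMax = 800.0f / 1024.0f;   /* 0.78125   */
const float tMax = 600.0f / 1024.0f;   /* 0.5859375 */
glBegin(GL_QUADS);
glTexCoord2f(0.0f, 0.0f); glVertex2f(-1.0f, -1.0f);
glTexCoord2f(sMax, 0.0f); glVertex2f( 1.0f, -1.0f);
glTexCoord2f(sMax, tMax); glVertex2f( 1.0f,  1.0f);
glTexCoord2f(0.0f, tMax); glVertex2f(-1.0f,  1.0f);
glEnd();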
