OpenGL doesn't support 16-bit??

I have a 16-bit image that I'm reading in and trying to pass to:

glTexImage2D(GL_TEXTURE_2D, 0, 3, objData->PaddedWidth(), objData->PaddedHeight(), 0, GL_RGB, GL_UNSIGNED_BYTE, rawImage);

I've tried almost every combination, from an internal format of GL_RGB5 to an external type of GL_UNSIGNED_SHORT, and so on. OpenGL takes 32-bit images fine, but I have a 16-bit image I need to import. Any ideas? Does OpenGL require me to import it as a 32-bit image? What's wrong with OpenGL? By messing with the formats I was finally able to see something, but only a grayscale image and something like a doubled image. Weird!

Thanks, Jeff.
Try this:
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB16, objData->PaddedWidth(), objData->PaddedHeight(), 0, GL_RGB, GL_UNSIGNED_SHORT, rawImage);
Make sure you have glext.h (available from opengl.org) and then use one of:
GL_UNSIGNED_SHORT_5_6_5
GL_UNSIGNED_SHORT_5_6_5_REV
GL_UNSIGNED_SHORT_4_4_4_4
GL_UNSIGNED_SHORT_4_4_4_4_REV
GL_UNSIGNED_SHORT_5_5_5_1
GL_UNSIGNED_SHORT_1_5_5_5_REV
as your format.
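
For example, if the file really is packed 5-6-5 RGB, the upload could look like the sketch below (objData and rawImage are taken from the original post; if the data is actually stored as 1-5-5-5 or 4-4-4-4 you need the matching type enum instead):

/* 16 bits per pixel, packed as 5-6-5 RGB */
glTexImage2D(GL_TEXTURE_2D,
             0,                        /* mipmap level */
             GL_RGB5,                  /* request a 16-bit internal format */
             objData->PaddedWidth(),
             objData->PaddedHeight(),
             0,                        /* no border */
             GL_RGB,                   /* channel order of the source data */
             GL_UNSIGNED_SHORT_5_6_5,  /* 16-bit packed source type */
             rawImage);

The packed *_SHORT_* types are core only from OpenGL 1.2 onwards, which is why you need glext.h (or an equivalent header) on Windows.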

Also, you shouldn't be using 3 as the internal format. Such use is deprecated. Instead use one of the symbolic constants (GL_RGB is the symbolic equivalent to 3).

Enigma
Quote: Original post by Kwizatz
Try this:
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB16, objData->PaddedWidth(), objData->PaddedHeight(), 0, GL_RGB, GL_UNSIGNED_SHORT, rawImage);
That would indicate 16-bits *per-channel*, not per-pixel.
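A minimal sketch of the difference (w, h, pixels and pixels16 are placeholder names, not from the thread):

/* 16 bits per pixel: one packed short per texel */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB5, w, h, 0,
             GL_RGB, GL_UNSIGNED_SHORT_5_6_5, pixels);

/* GL_RGB16 + GL_UNSIGNED_SHORT: 16 bits per CHANNEL,
   i.e. three shorts (48 bits) per texel in the source data */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB16, w, h, 0,
             GL_RGB, GL_UNSIGNED_SHORT, pixels16);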

