webjeff

OpenGL doesn't support 16bit??


I have a 16-bit image that I'm reading in and trying to pass to:

glTexImage2D(GL_TEXTURE_2D, 0, 3, objData->PaddedWidth(), objData->PaddedHeight(), 0, GL_RGB, GL_UNSIGNED_BYTE, rawImage);

I have tried almost every combination of internal format (GL_RGB5, etc.) and external type (GL_UNSIGNED_SHORT, etc.). OpenGL takes 32-bit images fine, but I have a 16-bit image I need to import. Any ideas? Does OpenGL require me to import it as a 32-bit image? What's wrong with OpenGL? By messing with the formats I was finally able to see something, but only a grayscale image, and doubled up. Weird!

Thanks, Jeff.

Make sure you have glext.h (available from opengl.org) and then use one of:
GL_UNSIGNED_SHORT_5_6_5
GL_UNSIGNED_SHORT_5_6_5_REV
GL_UNSIGNED_SHORT_4_4_4_4
GL_UNSIGNED_SHORT_4_4_4_4_REV
GL_UNSIGNED_SHORT_5_5_5_1
GL_UNSIGNED_SHORT_1_5_5_5_REV

as your type (the parameter where you currently pass GL_UNSIGNED_BYTE).

Also, you shouldn't be using 3 as the internal format. Such use is deprecated. Instead use one of the symbolic constants (GL_RGB is the symbolic equivalent to 3).
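
For example (just a sketch; this assumes your file holds packed 5:6:5 RGB, one unsigned short per pixel, so adjust the type constant to match your actual layout):

#include <GL/gl.h>
#include "glext.h"  // for the packed GL_UNSIGNED_SHORT_* constants

// rawImage holds PaddedWidth()*PaddedHeight() unsigned shorts,
// one per pixel, packed as 5 red / 6 green / 5 blue bits.
glTexImage2D(GL_TEXTURE_2D,
             0,                        // mipmap level
             GL_RGB5,                  // 16-bit internal format
             objData->PaddedWidth(),
             objData->PaddedHeight(),
             0,                        // border
             GL_RGB,                   // pixel format
             GL_UNSIGNED_SHORT_5_6_5,  // pixel type: one packed 16-bit value per pixel
             rawImage);

The packed pixel types were added in OpenGL 1.2, which is why you need glext.h for the constants on older headers.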

Enigma

Quote:
Original post by Kwizatz
Try this:
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB16,objData->PaddedWidth(),objData->PaddedHeight(),0,GL_RGB, GL_UNSIGNED_SHORT, rawImage);
That would indicate 16-bits *per-channel*, not per-pixel.
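
In other words (a sketch only; w, h and rawImage stand in for the poster's objData values), the two calls expect very different amounts of source data:

// 16 bits per channel: OpenGL reads three unsigned shorts
// (48 bits) of source data per pixel.
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB16, w, h, 0,
             GL_RGB, GL_UNSIGNED_SHORT, rawImage);

// 16 bits per pixel: OpenGL reads one packed unsigned short per pixel.
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB5, w, h, 0,
             GL_RGB, GL_UNSIGNED_SHORT_5_6_5, rawImage);

So with a genuinely 16-bit-per-pixel image, the first call would read three times as much memory as the buffer actually contains.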

