This may seem silly but....

Started by
12 comments, last by mameman 16 years, 12 months ago
Does OpenGL support 16-bit pixel formats? MAMEman.
Well, there's no way to pass 16 bpp data to glTexImage2D; however, you can tell OpenGL to use the GL_RGBA4 format internally. I think on modern nVidia cards at least, this is ignored and GL_RGBA8 is used anyway.
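For illustration, a rough sketch of what that looks like (width, height and pixels here are placeholders for your own image data):

GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
/* Ask for a 16 bpp internal format (GL_RGBA4) while still supplying
   ordinary 8-bit RGBA source data; the driver may silently promote
   it to GL_RGBA8. */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA4, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, pixels);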
Yes, it does, but your question needs to be more specific. Texture? Framebuffer?
Sig: http://glhlib.sourceforge.net
An open source GLU replacement library. Much more modern than GLU.
float matrix[16], inverse_matrix[16];
glhLoadIdentityf2(matrix);
glhTranslatef2(matrix, 0.0, 0.0, 5.0);
glhRotateAboutXf2(matrix, angleInRadians);
glhScalef2(matrix, 1.0, 1.0, -1.0);
glhQuickInvertMatrixf2(matrix, inverse_matrix);
glUniformMatrix4fv(uniformLocation1, 1, GL_FALSE, matrix);
glUniformMatrix4fv(uniformLocation2, 1, GL_FALSE, inverse_matrix);
Quote:Original post by ZQJ
Well, there's no way to pass 16 bpp data to glTexImage2D; however, you can tell OpenGL to use the GL_RGBA4 format internally. I think on modern nVidia cards at least, this is ignored and GL_RGBA8 is used anyway.

There certainly are ways to specify 16 bpp image formats. Check out the packed pixel formats introduced in OpenGL 1.2.
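As a sketch of the packed pixel path (again, width, height and pixels stand in for your own data), each element of pixels is a single 16-bit value holding all the channels:

/* OpenGL 1.2 packed pixels: one GLushort per pixel, 4 bits per channel. */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA4, width, height, 0,
             GL_RGBA, GL_UNSIGNED_SHORT_4_4_4_4, pixels);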
My question is specifically for targeting 16-bit displays. I want to be able to display a 16-bit texture on a 16-bit display. I'm not worried about the various 16-bit formats that are available; I just want to know if it CAN be done.

Thanks,

MAMEman.

From OpenGL's point of view, any bit depth works. What matters is what the particular implementation (your driver) can handle. But what do you mean by 16 bits anyway? 16 bits per color channel, or 16 bits per pixel? 16 bits per pixel has been supported by most common implementations for ages.
Quote:Original post by mameman
My question is specifically for targeting 16-bit displays. I want to be able to display a 16-bit texture on a 16-bit display. I'm not worried about the various 16-bit formats that are available; I just want to know if it CAN be done.

Thanks,

MAMEman.


If you create a GL_RGBA texture (without specifying which color depth), the driver gets to determine which color depth it should use.
So don't worry; OpenGL will probably handle anything you can throw at it.
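For example, a rough sketch (width, height and pixels are placeholders for your own data):

/* Unsized internal format: the driver picks the color depth it prefers. */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, pixels);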
Hmmm...

I'm not so sure. Say I decide to load in a 16-bit image, and I want to create a texture with a colour depth of 16 bits. WHAT format parameter do I specify in glTexImage2D() to get the correct pixel type? Assuming R5G6B5.

Thanks,

MAMEman.
Depends a little bit on the exact channel order. Try GL_RGB or GL_BGR for the format parameter, and GL_UNSIGNED_SHORT_5_6_5 or GL_UNSIGNED_SHORT_5_6_5_REV for the type parameter.
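A sketch for the R5G6B5 case (width, height and pixels are placeholders for your own image; GL_RGB5 is only a request, the driver may store it differently):

/* Each texel in pixels is one 16-bit value: 5 bits red, 6 bits green,
   5 bits blue. */
glTexImage2D(GL_TEXTURE_2D,
             0,                        /* mipmap level */
             GL_RGB5,                  /* requested 16-bit internal format */
             width, height,
             0,                        /* border */
             GL_RGB,                   /* format of the source data */
             GL_UNSIGNED_SHORT_5_6_5,  /* type of the source data */
             pixels);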
AT LAST! Many thanks, Brother Bob. I could not find information on these format parameters in any of the OpenGL references that I have :(. Mind you, it is VERY old, as I'm using VC++ 6.0 with the MSDN that came with it.

I've downloaded the latest version of the specification (1st Dec' 2006) from the OGL site, but there does not appear to be any documentation on them. How long have these been around for? Could you give me any links to this information?

Once again thanks,

MAMEman.

This topic is closed to new replies.
