mameman

OpenGL This may seem silly but....


Well, there's no way to pass 16 bpp data to glTexImage2D; however, you can tell OpenGL to use the GL_RGBA4 format internally. I think at least on modern nVidia cards this is ignored and GL_RGBA8 is used anyway.
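For what it's worth, a rough sketch of that approach (the helper name, 'pixels', 'width' and 'height' are just placeholders; it assumes a current GL context with a texture bound):

#include <GL/gl.h>

/* Upload ordinary 32-bit RGBA pixels but ask the driver to store them as
   16-bit GL_RGBA4. As noted above, the driver may silently promote this to
   GL_RGBA8 internally. */
void UploadAsRGBA4(const unsigned char *pixels, int width, int height)
{
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA4,    /* requested internal format */
                 width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE,     /* source data stays 32 bpp  */
                 pixels);
}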

Quote:
Original post by ZQJ
Well, there's no way to pass 16 bpp data to glTexImage2D; however, you can tell OpenGL to use the GL_RGBA4 format internally. I think at least on modern nVidia cards this is ignored and GL_RGBA8 is used anyway.

There certainly are ways to specify 16 bpp image formats. Check out the packed pixel formats introduced in OpenGL 1.2.
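If it helps, a quick sketch of how you might check for that at runtime (assumes a current GL context; the function name is just an example):

#include <stdio.h>
#include <GL/gl.h>

/* The packed pixel types (GL_UNSIGNED_SHORT_5_6_5 and friends) are only core
   from OpenGL 1.2 onwards, so check the runtime version before relying on them. */
int HasPackedPixels(void)
{
    int major = 0, minor = 0;
    const char *version = (const char *)glGetString(GL_VERSION);

    if (version != NULL)
        sscanf(version, "%d.%d", &major, &minor);

    return (major > 1) || (major == 1 && minor >= 2);
}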

My question is specifically about targeting 16-bit displays. I want to be able to display a 16-bit texture on a 16-bit display. I'm not worried about the various 16-bit formats that are available; I just want to know if it CAN be done.

Thanks,

MAMEman.

From OpenGL's point of view, any bit depth works; what matters is what the particular implementation (your driver) can handle. But what do you mean by 16 bits anyway? 16 bits per color channel or 16 bits per pixel? 16 bits per pixel has been supported by most common implementations for ages.

Quote:
Original post by mameman
My question is specifically about targeting 16-bit displays. I want to be able to display a 16-bit texture on a 16-bit display. I'm not worried about the various 16-bit formats that are available; I just want to know if it CAN be done.

Thanks,

MAMEman.


If you create a GL_RGBA texture (without specifying a color depth), the driver gets to determine which color depth it should use.
So don't worry - OpenGL will probably handle everything you can throw at it.
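And if you want to see what the driver actually picked, something like this should work (just a sketch; assumes the texture is currently bound):

#include <stdio.h>
#include <GL/gl.h>

/* Query the component sizes the driver actually allocated for mip level 0,
   so you can tell whether a 16-bit request was honoured or promoted to 32 bits. */
void PrintTextureBits(void)
{
    GLint red = 0, green = 0, blue = 0, alpha = 0;

    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_RED_SIZE,   &red);
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_GREEN_SIZE, &green);
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_BLUE_SIZE,  &blue);
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_ALPHA_SIZE, &alpha);

    printf("stored as R%dG%dB%dA%d\n", (int)red, (int)green, (int)blue, (int)alpha);
}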

Hmmm...

I'm not so sure. Say I decide to load in a 16-bit image and I want to create a texture with a colour depth of 16 bits. Now WHAT format parameter do I specify in glTexImage2D() to get the correct pixel type? Assuming R5G6B5.

Thanks,

MAMEman.

It depends a little on the exact channel order. Try GL_UNSIGNED_SHORT_5_6_5 or GL_UNSIGNED_SHORT_5_6_5_REV for the type parameter, and GL_RGB or GL_BGR for the format parameter.
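Roughly like this, for example ('pixels565', 'width' and 'height' are placeholder names; it assumes red sits in the top five bits of each 16-bit texel - if blue is in the top bits, use the _REV type instead):

#include <GL/gl.h>

/* The 1.2 tokens are missing from the old gl.h that ships with VC++ 6.0, so
   define them yourself (or include glext.h) if your header doesn't have them. */
#ifndef GL_UNSIGNED_SHORT_5_6_5
#define GL_UNSIGNED_SHORT_5_6_5     0x8363
#define GL_UNSIGNED_SHORT_5_6_5_REV 0x8364
#endif

/* Upload raw R5G6B5 texels and ask for a 16-bit internal format.
   Assumes a current GL context with the target texture bound. */
void UploadRGB565(const unsigned short *pixels565, int width, int height)
{
    glPixelStorei(GL_UNPACK_ALIGNMENT, 2);      /* rows start on 16-bit boundaries */
    glTexImage2D(GL_TEXTURE_2D,
                 0,                             /* mip level                  */
                 GL_RGB5,                       /* request 16-bit storage     */
                 width, height, 0,
                 GL_RGB,                        /* format: channel order      */
                 GL_UNSIGNED_SHORT_5_6_5,       /* type: packed 5-6-5 texels  */
                 pixels565);
}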

AT LAST! Many thanks Brother Bob. I couldn't find information on these parameters in any of the OpenGL references I have :(. Mind you, they're VERY old as I'm using VC++ 6.0 with the MSDN that came with it.

I've downloaded the latest version of the specification (1st Dec 2006) from the OGL site, but there doesn't appear to be any documentation on these. How long have they been around for? Could you give me any links to this information?

Once again thanks,

MAMEman.
