Archived

This topic is now archived and is closed to further replies.

phong666

GLUT depth buffer


Recommended Posts

I'm using the GLUT library to create my OpenGL window, and I set the display mode via glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA | GLUT_DEPTH). The standard size for the depth buffer is 16 bits, but I need 32 bits of precision. How can I change the depth buffer size? Thanks in advance.
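For context, here is a minimal sketch of the setup described above (window size, title, and the elided callback wiring are illustrative, not from the original post):

    #include <GL/glut.h>

    int main(int argc, char **argv)
    {
        glutInit(&argc, argv);
        /* GLUT_DEPTH only requests *a* depth buffer; it says nothing about its size. */
        glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA | GLUT_DEPTH);
        glutInitWindowSize(640, 480);
        glutCreateWindow("depth test");
        /* ... register display/reshape callbacks, then call glutMainLoop() ... */
        return 0;
    }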

You obviously don't understand why GLUT_DEPTH is set to 16. GLUT_DEPTH is a bit flag, the fifth bit in an integer, which happens to have the decimal value 16. If you changed it to 32, it would represent the sixth bit, which means asking for a stencil buffer (GLUT_STENCIL is defined as 32).
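For reference, these are the relevant display-mode bit masks as they appear in a standard GLUT 3.7 glut.h header:

    /* Display-mode bit masks (values from a standard glut.h, GLUT 3.7). */
    #define GLUT_RGBA     0    /* default colour model                            */
    #define GLUT_DOUBLE   2    /* double buffering                                */
    #define GLUT_DEPTH    16   /* bit 5: "give me a depth buffer", not "16 bits"  */
    #define GLUT_STENCIL  32   /* bit 6: passing 32 would request a stencil buffer */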

glutInitDisplayString seems to do what you want.

http://flex.ee.uec.ac.jp/texi/glut/glutInitDisplayString.3xglut.html
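A rough sketch of how that could look; the string is a capability request, so the driver may still hand back fewer depth bits than you asked for:

    #include <GL/glut.h>

    static void display(void)
    {
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        glutSwapBuffers();
    }

    int main(int argc, char **argv)
    {
        glutInit(&argc, argv);
        /* Request double-buffered RGBA with at least 24 depth bits.
           "depth>=32" is valid syntax too, but most hardware will not honour it. */
        glutInitDisplayString("rgba double depth>=24");
        glutCreateWindow("deep depth buffer");
        glutDisplayFunc(display);
        glutMainLoop();
        return 0;
    }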

By the way, many cards don't support a 32-bit depth buffer; for example, on all NVIDIA cards 24-bit is the maximum (see below for a way to query the precision you actually got).

http://uk.geocities.com/sloppyturds/kea/kea.html
http://uk.geocities.com/sloppyturds/gotterdammerung.html
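Whatever you request, you can check how many depth bits the driver actually gave you once the window exists; glGetIntegerv with GL_DEPTH_BITS is the classic way in GLUT-era OpenGL:

    #include <stdio.h>
    #include <GL/glut.h>   /* pulls in <GL/gl.h> */

    /* Call this after glutCreateWindow(), so that a GL context is current. */
    static void reportDepthBits(void)
    {
        GLint depthBits = 0;
        glGetIntegerv(GL_DEPTH_BITS, &depthBits);
        printf("depth buffer precision: %d bits\n", depthBits);
    }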

