
Glut depth buffer

I'm using the GLUT library to create my OpenGL window, and I set the display mode via glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA | GLUT_DEPTH). The standard depth buffer size is 16 bits, but I need 32-bit precision. How can I change the depth buffer size? Thanks in advance.

You obviously don't understand why GLUT_DEPTH is set to 16. GLUT_DEPTH is a flag (the fifth bit in an integer), which happens to have the decimal value 16; the value has nothing to do with how many bits of depth precision you get. If you change it to 32, you set the sixth bit instead, which means asking for a stencil buffer (GLUT_STENCIL is defined as 32).
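
To make the flag point concrete, here is a small stand-alone C sketch. The numeric values below match a typical glut.h, but the MY_ names are illustrative stand-ins rather than the real header, so check your own glut.h:

#include <stdio.h>

/* Illustrative stand-ins: these match the values a typical glut.h uses,
   but they are NOT the real header, hence the MY_ prefix. */
#define MY_GLUT_RGBA    0   /* default colour model              */
#define MY_GLUT_DOUBLE  2   /* bit 1: double buffering           */
#define MY_GLUT_DEPTH   16  /* bit 4: "give me a depth buffer"   */
#define MY_GLUT_STENCIL 32  /* bit 5: "give me a stencil buffer" */

int main(void)
{
    unsigned int mode = MY_GLUT_RGBA | MY_GLUT_DOUBLE | MY_GLUT_DEPTH;

    /* 0 | 2 | 16 == 18: the 16 only means "a depth buffer was requested";
       it says nothing about how many bits of depth you will receive. */
    printf("mode = %u\n", mode);

    /* Writing 32 instead of GLUT_DEPTH would set bit 5, i.e. request a
       stencil buffer, not a 32-bit depth buffer. */
    printf("32 is MY_GLUT_STENCIL = %u\n", (unsigned int)MY_GLUT_STENCIL);
    return 0;
}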

glutInitDisplayString seems to do what you want.

http://flex.ee.uec.ac.jp/texi/glut/glutInitDisplayString.3xglut.html
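
For example, something along these lines (an untested sketch; the capability-string syntax is described at the link above) should ask GLUT for at least a 24-bit depth buffer:

#include <GL/glut.h>

void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glutSwapBuffers();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);

    /* Replaces glutInitDisplayMode(): the string describes the framebuffer.
       "depth>=24" requests at least 24 bits of depth precision; try
       "depth>=32" if you want to ask for 32 (whether you get it depends
       on the card, see the next reply). */
    glutInitDisplayString("rgba double depth>=24");

    glutCreateWindow("depth test");
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}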

BTW, many cards don't support a 32-bit depth buffer; e.g. on all NVIDIA cards 24 bits is the maximum.
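
If you want to see how much depth precision you actually got, one way (a small sketch, assuming an OpenGL context is already current, e.g. right after glutCreateWindow) is to query GL_DEPTH_BITS:

#include <stdio.h>
#include <GL/glut.h>

/* Call this once a context exists (e.g. just after glutCreateWindow):
   it prints the depth precision the driver actually granted, which may
   be less than what was requested. */
void report_depth_bits(void)
{
    GLint bits = 0;
    glGetIntegerv(GL_DEPTH_BITS, &bits);
    printf("depth buffer: %d bits\n", (int)bits);
}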


