Glut depth buffer

Started by
4 comments, last by phong666 21 years, 2 months ago
I'm using the GLUT library to create my OpenGL window, and I set the display mode via glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA | GLUT_DEPTH). The default depth buffer size is 16 bits, but I need 32 bits of precision. How can I change the depth buffer size? Thanks in advance

The simplest way would be to change the constant GLUT_DEPTH in glut.h from 16 to 32.

Edo
You obviously don't understand why GLUT_DEPTH is set to 16. GLUT_DEPTH is a flag (the fifth bit in an integer), which happens to have the decimal value 16. If you change it to 32, it will represent the sixth bit, which means asking for a stencil buffer (GLUT_STENCIL is defined as 32).
glutInitDisplayString seems to do what you want.

http://flex.ee.uec.ac.jp/texi/glut/glutInitDisplayString.3xglut.html
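For reference, a minimal sketch of how glutInitDisplayString can request a deeper buffer (the "depth>=24" comparator syntax is from the GLUT 3.7 API; whether you actually get more than 24 bits depends on the driver):

```c
#include <GL/glut.h>

static void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glutSwapBuffers();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    /* Instead of glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA | GLUT_DEPTH),
       ask for any visual with at least a 24-bit depth buffer */
    glutInitDisplayString("rgba double depth>=24");
    glutCreateWindow("deep depth buffer");
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}
```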
Thanks for your help.

A call to glutInitDisplayString() did the trick.

BTW, many cards don't support a 32-bit depth buffer;
e.g. with all NVIDIA cards, 24-bit is the max.
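You can check what the driver actually gave you by querying GL_DEPTH_BITS once a context exists (a sketch; requires a working display connection):

```c
#include <stdio.h>
#include <GL/glut.h>

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA | GLUT_DEPTH);
    glutCreateWindow("depth check");  /* context must exist before glGet */

    GLint bits;
    glGetIntegerv(GL_DEPTH_BITS, &bits);
    printf("depth buffer bits: %d\n", bits);  /* typically 16 or 24 */
    return 0;
}
```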

http://uk.geocities.com/sloppyturds/kea/kea.html
http://uk.geocities.com/sloppyturds/gotterdammerung.html

This topic is closed to new replies.
