Stencil buffer on a GeForce 2?

Hi,

Got an odd problem with a program I'm working on that tries to use the stencil buffer. Basically it doesn't work on my GeForce2 Ultra, but it has no problems on my Radeon 9800. It's a little tricky, as I'm trying to add this feature by manipulating the OpenGL stream, so either I'm missing some setting or there is something special about using stencil buffers on the GeForce2 that I'm not aware of. I'd be grateful for any ideas on how to track down the issue.

I've tried every solution I can think of, but no matter what, a call to glGetIntegerv(GL_STENCIL_BITS, &tresult) always returns 0 on the GeForce and 8 on the Radeon. There is no path code, so both systems are running the same code, yet clearly the stencil buffer is not available on the GeForce. This suggests there is nothing wrong with the stencil buffer code itself, but possibly something at the hardware level, or in the initialisation of the buffer. Oh, I've also tested various other OpenGL apps on the GeForce that use the stencil buffer and these all work fine, so it's not a question of drivers.

Once the app is running, these are the results I get for the pfd:

iPixelFormat: 3
describePixelFormat: 34
pfd.cStencilBits: 0

Now, like I said, I'm doing this through the OpenGL stream, but I believe the underlying application does not request stencil bits when it creates the OpenGL context, as the feature is not used. This begs the question of why/how I get a stencil buffer on any Radeon (several were tested) but not on the GeForce. I can only surmise that ATI sets up the hardware to automatically provide a stencil buffer, presumably because their hardware works better that way.

So, because the pfd cannot be set twice (any further attempts are ignored, according to the specs), I decided to intercept the app's call to ChoosePixelFormat, add the stencil bits there, and pre-empt the call to SetPixelFormat with my own. These are the results I get:

iPixelFormat: 5 -- pfd chosen format
SetPixelFormat: 1 -- true, function was successful
describePixelFormat: 34
pfd.cStencilBits: 8

As you can see, the results from DescribePixelFormat tell me I have an 8-bit stencil. Yet subsequent calls to glGetIntegerv(GL_STENCIL_BITS) still return 0! So now I'm pretty confused. When I add stencilBits = 8 and choose a pixel format, it returns format 5, which presumably supports the stencil, yet later on when I check the same property I get 3 (see above). This suggests that although I get no errors from ChoosePixelFormat, SetPixelFormat or DescribePixelFormat, the results are clearly wrong.

Just to double-check, I tried to retrieve the pfd before setting anything, to see whether it had been set previously. The results would indicate that prior to the intercepted call to wglChoosePixelFormat the device context has not yet been given a pfd:

iPixelFormat: 0 -- call failed
describePixelFormat: 0 -- call failed
pfd.cStencilBits: 76 -- odd number, presumably just random data

Well, this just leaves me confused. As far as I can tell, at the point I intercept the call to wglChoosePixelFormat the device's pfd has not yet been set. I can alter the pfd to request stencil bits, all the function calls appear to succeed, and I get a valid pixel format. Yet even though these are set, subsequent tests indicate that the pixel format is not what it should be (3 instead of 5) and cStencilBits returns 0, not 8.

I'd be most grateful for any ideas on where to go from here.

[edited by - noisecrime on December 14, 2003 8:33:18 AM]
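For reference, the checks described above amount to roughly the following sketch in C (the wrapper function name is mine; hdc is assumed to be the application's device context with a current GL context):

#include <windows.h>
#include <stdio.h>
#include <GL/gl.h>

/* Query the pixel format actually bound to the DC, then ask GL itself
   how many stencil bits the current context really has. */
void DumpStencilState(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd;
    int iPixelFormat = GetPixelFormat(hdc);   /* 0 on failure */
    int maxFormat = DescribePixelFormat(hdc, iPixelFormat, sizeof(pfd), &pfd);
    GLint stencilBits = 0;
    glGetIntegerv(GL_STENCIL_BITS, &stencilBits);

    printf("iPixelFormat: %d\n", iPixelFormat);
    printf("describePixelFormat: %d\n", maxFormat);  /* highest format index on success */
    printf("pfd.cStencilBits: %d\n", pfd.cStencilBits);
    printf("GL_STENCIL_BITS: %d\n", stencilBits);
}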
I didn't read your whole post, but you need a 32-bit depth buffer to get a stencil buffer on the GeForce 2 (and many other cards). Make sure your desktop display depth is set to 32 bits.
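On GeForce-class hardware the stencil bits live in the same 32-bit plane as a 24-bit depth buffer, so the usual request is 24 depth bits plus 8 stencil bits, with cColorBits (and the desktop) at 32. A minimal sketch, assuming this runs before the GL context is created (function name is mine):

#include <windows.h>

BOOL SetupPixelFormat(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd = {0};
    pfd.nSize        = sizeof(pfd);
    pfd.nVersion     = 1;
    pfd.dwFlags      = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType   = PFD_TYPE_RGBA;
    pfd.cColorBits   = 32;
    pfd.cDepthBits   = 24;   /* packed with the stencil into one 32-bit plane */
    pfd.cStencilBits = 8;

    int iPixelFormat = ChoosePixelFormat(hdc, &pfd);
    if (iPixelFormat == 0)
        return FALSE;
    return SetPixelFormat(hdc, iPixelFormat, &pfd);
}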

____________________________________________________________
www.elf-stone.com | Automated GL Extension Loading: GLee 5.00 for Win32 and Linux

Well, I'll double-check the pfd is set to a 24-bit depth buffer and 32-bit colour, but the monitor is definitely set at 32, so I don't think this is the problem.

Thanks anyway

[edited by - noisecrime on December 14, 2003 12:27:29 PM]
quote:Original post by noisecrime
Well, I'll double-check the pfd is set to a 24-bit depth buffer and 32-bit colour, but the monitor is definitely set at 32, so I don't think this is the problem.

Try setting the pfd to a 32-bit depth buffer.
Yep, tried that, and tried setting every other combination I could think of between colour/depth and stencil buffer bits.

Still no joy ;(
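One further thing worth checking (a sketch of my own, not something tried in the thread): enumerate every pixel format the GeForce driver actually exposes and see whether any of them report stencil bits at the current desktop depth. The function name is mine:

#include <windows.h>
#include <stdio.h>

void ListStencilFormats(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd;
    /* DescribePixelFormat returns the highest format index the driver supports */
    int count = DescribePixelFormat(hdc, 1, sizeof(pfd), &pfd);
    for (int i = 1; i <= count; ++i) {
        DescribePixelFormat(hdc, i, sizeof(pfd), &pfd);
        if (pfd.cStencilBits > 0)
            printf("format %d: color %d, depth %d, stencil %d\n",
                   i, pfd.cColorBits, pfd.cDepthBits, pfd.cStencilBits);
    }
}

If no exposed format pairs 8 stencil bits with 32-bit colour, the driver simply doesn't offer one at that desktop depth, and ChoosePixelFormat will have matched a stencil-less format instead, which would explain the 3-versus-5 discrepancy above.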

