Jump to content
skow

Pbuffers and reading the depth?


I check to see if something is in front of a given point by doing the following. I do a gluProject with the point to get its screen x and y, plus the depth at that point. Then I do a glReadPixels at that same x and y to get the depth of anything drawn so far. If glu's depth is less than the one I get from glReadPixels, I draw. The only problem is that when I do this with a pbuffer, the depth values I'm getting from glReadPixels are completely off; sometimes it will be 0, other times it will be some value other than it should be. Does anyone have any idea what could be causing this?


just a thought:

the second parameter of wglMakeContextCurrentARB is the current read target; is it the same as the draw target?

Yeah, for the pbuffer I have both reading and drawing set to the front buffer.

wglMakeContextCurrentARB(hPBufferDC, hPBufferDC, hPBufferRC);
glDrawBuffer(GL_FRONT);
glReadBuffer(GL_FRONT);


I create it using the recommended settings from the ATI paper:

int attr[] =
{
    WGL_SUPPORT_OPENGL_ARB, TRUE,                   // pbuffer will be used with GL
    WGL_DRAW_TO_PBUFFER_ARB, TRUE,                  // enable render to pbuffer
    //WGL_BIND_TO_TEXTURE_RECTANGLE_RGB_NV, TRUE,   // non-power-of-2
    WGL_BIND_TO_TEXTURE_RGBA_ARB, TRUE,             // pbuffer will be used as a texture
    WGL_RED_BITS_ARB, 8,                            // at least 8 bits for RED channel
    WGL_GREEN_BITS_ARB, 8,                          // at least 8 bits for GREEN channel
    WGL_BLUE_BITS_ARB, 8,                           // at least 8 bits for BLUE channel
    WGL_ALPHA_BITS_ARB, 8,                          // at least 8 bits for ALPHA channel
    WGL_DEPTH_BITS_ARB, 24,                         // at least 24 bits for depth buffer
    WGL_DOUBLE_BUFFER_ARB, FALSE,                   // we don't require double buffering
    0                                               // zero terminates the list
};




Is there something wrong with this?

Anyone else have any ideas?

Does this maybe not work with WGL_BIND_TO_TEXTURE_RGBA_ARB?

I just did a quick test and it appears on my graphics card (GeForce FX 5600 w/ 66.93 drivers) glReadPixels will only return valid depth information from a PBuffer if GL_DEPTH_TEST is enabled on the PBuffer.

Enigma

That is odd, Enigma, but alas, I made sure it was enabled when I read the pixel, and it is still giving me bogus values :(

A bit of further testing shows it needs to be enabled for drawing as well. If you don't want depth testing, it looks like you need to enable depth testing anyway and just use a depth func of GL_ALWAYS.
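The workaround — keep the test enabled so depth writes still happen, but make every fragment pass — would look like this (a sketch; GL_LESS is assumed as the comparison to restore, since it is the GL default):

```c
glEnable(GL_DEPTH_TEST);  /* keep depth writes active                      */
glDepthFunc(GL_ALWAYS);   /* every fragment passes, so no effective test   */
/* ... draw the geometry you did not want depth-tested ... */
glDepthFunc(GL_LESS);     /* restore the default comparison afterwards     */
```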

Enigma

Thanks, Enigma, for spending the time to help me out.

OK, I draw most of my stuff with depth testing, but I add some "after effect" billboarded stuff after drawing everything else — and that is exactly when I test the depth with glReadPixels to decide whether to draw — and I don't use depth testing for those. Would drawing these last few things without depth testing enabled cause glReadPixels to return bad depths?

On my system, anything rendered with GL_DEPTH_TEST disabled does not write to the depth buffer of the PBuffer. For example, if I render a screen-size quad at depth 0.5 with the depth test enabled, then disable the depth test and draw a screen-size quad at depth 0.25, then re-enable the depth test and read a value, I get 0.5. If I leave depth testing enabled when I render the second quad, I get 0.25.
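The experiment just described, sketched in GL calls (context setup omitted; `drawScreenQuad` is a hypothetical helper that draws a screen-sized quad at the given window-space depth, and x, y are any pixel covered by both quads):

```c
glEnable(GL_DEPTH_TEST);
drawScreenQuad(0.5f);    /* writes 0.5 into the depth buffer          */

glDisable(GL_DEPTH_TEST);
drawScreenQuad(0.25f);   /* depth writes are bypassed along with test */

glEnable(GL_DEPTH_TEST);
float depth;
glReadPixels(x, y, 1, 1, GL_DEPTH_COMPONENT, GL_FLOAT, &depth);
/* depth is still 0.5: the second quad never touched the depth buffer */
```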

The same thing actually happens when rendering to the screen instead of a PBuffer, and I think I remember reading that this is part of the spec (i.e. disabling the depth test implicitly turns off depth writes), but I haven't been able to find anything to confirm it.

EDIT: Found it. From the OpenGL 2.0 spec:
Quote:
The depth buffer test discards the incoming fragment if a depth comparison fails. The comparison is enabled or disabled with the generic Enable and Disable commands using the symbolic constant DEPTH_TEST. When disabled, the depth comparison and subsequent possible updates to the depth buffer value are bypassed and the fragment is passed to the next operation.

(Emphasis mine)

Enigma
