FERNANDO-BRASIL

glReadPixels problem


Hi there! Look to my code:
Quote:
glMatrixMode(GL_MODELVIEW);
glClearDepth(1.0);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT | GL_STENCIL_BUFFER_BIT);
glLoadIdentity();
glFinish();
glFlush();
SwapBuffers(DC);
glReadBuffer(GL_BACK);
glReadPixels(50, 50, 1, 1, GL_DEPTH_COMPONENT, GL_FLOAT, &depth);
I'm running in windowed mode, and my viewport is 300x300. Since both the back and front buffers are cleared, the float 'depth' should hold 1.0 after the glReadPixels call, but it always returns 0.0. As you can see, I'm using glClearDepth(1.0) to indicate that I want the depth buffer cleared with ones (1.0). So why am I getting 0.0 when I should get 1.0? Does anybody know what's happening? Thank you. Fernando.

Blast! Missed the other reply.

The glReadBuffer(GL_BACK) call implies you want to read from the back buffer, but I'm pretty sure (as I implied in my first reply) that the data you want will be in the front buffer after the swap.
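A minimal sketch of the suggested change, reusing the DC and depth variables from the original post (assumed to be set up as shown there):

/* After SwapBuffers the presented image lives in the front buffer,
   so select it as the read source before calling glReadPixels. */
SwapBuffers(DC);
glReadBuffer(GL_FRONT);
glReadPixels(50, 50, 1, 1, GL_DEPTH_COMPONENT, GL_FLOAT, &depth);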

After a buffer swap, the content of the back buffer is undefined. Since you're reading the back buffer after a buffer swap without writing new values to it, you should never expect to get correct values. The image is in the front buffer, so you should read the front buffer.

edit: Hmm, didn't see your post saying you did try reading from the front buffer, sorry. Well, the only thing I can think of then is: do you have a depth buffer at all? If you don't ask for a depth buffer when creating the pixel format, you're not getting one. And if you have one, is it enabled?
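A quick way to check both from code (a sketch using the standard GL 1.x queries glGetIntegerv and glIsEnabled; call it while your GL context is current):

#include <windows.h>   /* must come before GL/gl.h on Win32 */
#include <GL/gl.h>
#include <stdio.h>

/* Reports the depth buffer size of the current pixel format (0 means no
   depth buffer was allocated) and whether depth testing is enabled. */
void CheckDepthBuffer(void)
{
    GLint depthBits = 0;
    glGetIntegerv(GL_DEPTH_BITS, &depthBits);
    printf("depth bits: %d\n", (int)depthBits);
    printf("depth test enabled: %s\n",
           glIsEnabled(GL_DEPTH_TEST) ? "yes" : "no");
}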

Here is how I'm creating it:

Quote:

DC = GetDC(Handle);

ZeroMemory(&pfd, sizeof(pfd));  // clear the fields that aren't set below
pfd.nSize = sizeof(PIXELFORMATDESCRIPTOR);
pfd.nVersion = 1;
pfd.dwFlags = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
pfd.iPixelType = PFD_TYPE_RGBA;
pfd.cColorBits = 24;
pfd.cAlphaBits = 8;
pfd.cAccumBits = 8;
pfd.cDepthBits = 8;
pfd.cStencilBits = 8;
pfd.iLayerType = PFD_MAIN_PLANE;
pf = ChoosePixelFormat(DC, &pfd);  // pf must receive the chosen format index

SetPixelFormat(DC, pf, &pfd);


Is there something wrong?

Thank you.
Fernando.
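A hedged aside on verifying this: DescribePixelFormat is the standard Win32 call for inspecting the format ChoosePixelFormat actually selected, which can differ from the one requested. A minimal sketch, reusing DC and pfd from the post above (needs <windows.h> and <stdio.h>):

/* Ask Windows what pixel format was actually matched; the fields of the
   filled-in PIXELFORMATDESCRIPTOR (cDepthBits in particular) show what
   you really got, not what you asked for. */
PIXELFORMATDESCRIPTOR actual;
int pf = ChoosePixelFormat(DC, &pfd);
DescribePixelFormat(DC, pf, sizeof(actual), &actual);
printf("got %d depth bits, %d stencil bits\n",
       actual.cDepthBits, actual.cStencilBits);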

Hi people!!

I discovered a way to avoid the problem.

The code works if I disable the stencil buffer by setting pfd.cStencilBits = 0 in the PIXELFORMATDESCRIPTOR.

Why?

Does anybody know?
Maybe I'm not using the stencil buffer properly?
Maybe my graphics card doesn't have enough memory to allocate this buffer?

It's working fine now, but I need stencil, so I can't remove it. Does anybody have any idea?

Fernando.

Well, you're asking for 8-bit depth and 8-bit stencil, and I seriously doubt your graphics card supports 8-bit depth buffers, so the closest supported pixel format is probably one with cDepthBits=0 and cStencilBits=8. Request the format you posted earlier and see what glGetIntegerv with GL_DEPTH_BITS returns; I would guess zero if my theory is correct.

When you don't request a stencil buffer, the closest supported pixel format is now cDepthBits=24 (or something like that) and cStencilBits=0, so now you have a depth buffer.

That's just my guess. I suggest you ask for a saner depth buffer, like 24 bits, along with 8-bit stencil if you need it.
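A sketch of that suggestion, reusing the pfd, pf, and DC names from the earlier post (24-bit depth with 8-bit stencil is the combination most hardware supports, typically as a single packed depth/stencil buffer):

/* Request a commonly supported combination: 24-bit depth, 8-bit stencil. */
pfd.cDepthBits = 24;
pfd.cStencilBits = 8;
pf = ChoosePixelFormat(DC, &pfd);
SetPixelFormat(DC, pf, &pfd);

/* ... create and activate the GL context, then verify what you got: */
GLint depthBits = 0, stencilBits = 0;
glGetIntegerv(GL_DEPTH_BITS, &depthBits);
glGetIntegerv(GL_STENCIL_BITS, &stencilBits);
/* Expect 24 and 8; 0 depth bits would mean the format still lacks a depth buffer. */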
