what does enabling depth testing do?

5 comments, last by AndreTheGiant 20 years ago
I thought it made it so that, no matter what order you drew stuff in your code, it would always look right on the display, because it made sure that the closer stuff was drawn over the further away stuff if necessary. I have depth testing enabled, but my app displays objects based on the order they are drawn, rather than how far away from the viewer they are. Have I misunderstood what depth testing should do?
Your thoughts on the depth buffer are correct, that is how it's supposed to work, unless you abuse the depth buffer by setting the near clip plane to a ridiculously low value, effectively disabling the depth buffer. Or, you simply don't have a depth buffer. Check the pixel format also.
quote:Original post by Brother Bob
Your thoughts on the depth buffer are correct, that is how it's supposed to work, unless you abuse the depth buffer by setting the near clip plane to a ridiculously low value, effectively disabling the depth buffer. Or, you simply don't have a depth buffer. Check the pixel format also.


Lots of good info, thanks.

By near clip plane, do you mean the near end of the view frustum? If so, I have it set to 1 (my far plane is 1000). I didn't know the depth buffer had anything to do with the near plane.

What do you mean, I don't have a depth buffer? I enabled it with this code:
glEnable (GL_DEPTH_TEST);
Is there anything else required to enable it?

Pixel format, eh? Maybe this is where the problem lies then, because I don't know much about it. I pretty much just used NeHe's base code for this part. Here is what I have:

PIXELFORMATDESCRIPTOR pfd =
{                                   // pfd Tells Windows How We Want Things To Be
    sizeof (PIXELFORMATDESCRIPTOR), // Size Of This Pixel Format Descriptor
    1,                              // Version Number
    PFD_DRAW_TO_WINDOW |            // Format Must Support Window
    PFD_SUPPORT_OPENGL |            // Format Must Support OpenGL
    PFD_DOUBLEBUFFER,               // Must Support Double Buffering
    PFD_TYPE_RGBA,                  // Request An RGBA Format
    16,                             // Select Our Color Depth (bpp)
    0, 0, 0, 0, 0, 0,               // Color Bits Ignored
    0,                              // No Alpha Buffer
    0,                              // Shift Bit Ignored
    0,                              // No Accumulation Buffer
    0, 0, 0, 0,                     // Accumulation Bits Ignored
    16,                             // 16Bit Z-Buffer (Depth Buffer)
    0,                              // No Stencil Buffer
    0,                              // No Auxiliary Buffer
    PFD_MAIN_PLANE,                 // Main Drawing Layer
    0,                              // Reserved
    0, 0, 0                         // Layer Masks Ignored
};


For the field labeled "Z-Buffer (Depth Buffer)", I have 16. Is this the problem? I don't think so...

Anyway, thanks for your help. Still need more help, though!
quote:Original post by AndreTheGiant
By near clip plane, do you mean the near end of the view frustum? If so, I have it set to 1 (my far plane is 1000). I didn't know the depth buffer had anything to do with the near plane.

Yes, that's what I mean. Looks like you're not abusing it, those are very reasonable values.

The depth buffer is indeed affected by the values you choose. It's because of the finite precision of the depth buffer (usually 16 or 24 bits), and the fact that the distribution of the precision isn't linear, and the non-linearity depends on the values you choose. The further away you get from the near plane, the less precision you get (the distance between two consecutive depth values increases). If you set the near plane too close, the precision will drop too fast as you move out from the viewpoint, and you will get depth fighting. See here for more info on the subject.

quote:
What do you mean, I don't have a depth buffer? I enabled it with this code:
glEnable (GL_DEPTH_TEST);
Is there anything else required to enable it?

All you have to do is enable it, assuming you have a depth buffer. And by having a depth buffer, I mean you explicitly tell Windows you want a depth buffer when creating the pixel format. If you don't tell Windows you want a depth buffer, you won't get one.

quote:
Pixel format, eh? Maybe this is where the problem lies then, because I don't know much about it. I pretty much just used NeHe's base code for this part. Here is what I have:

code...

For the field labeled "Z-Buffer (Depth Buffer)", I have 16. Is this the problem? I don't think so...

Looks correct, so you probably do have a depth buffer.

The next question would be: do you, perhaps, try to enable it before the rendering context is created? It's important to do it after.
quote:Original post by Brother Bob
The next question would be: do you, perhaps, try to enable it before the rendering context is created? It's important to do it after.


No. I enable it in my init() function, which is called by the WM_CREATE message, which is after doing the PIXELFORMATDESCRIPTOR stuff.

Thanks a lot for your help, but any other ideas?
glDepthFunc(GL_LEQUAL);

Try using this before rendering your objects.

quote from MSDN:
The glDepthFunc function specifies the function used to compare each incoming pixel z value with the z value present in the depth buffer. The comparison is performed only if depth testing is enabled.

So if you used glDepthFunc(GL_ALWAYS), your app would display objects based on the order they are drawn.

I hope you can understand my broken English!
Unless ophir is right, I think we need some code. Post a minimal, but working, program that demonstrates the behaviour. No fancy stuff, just enough to demonstrate the problem.

This topic is closed to new replies.
