Depth-Buffer

Started by
6 comments, last by eq 18 years, 11 months ago
Hi guys, I have a question for the experts. I just found this text in the Managed DirectX online help:

"Due to the mathematics involved, the generated z values in a z-buffer tend not to be distributed evenly across the z-buffer range (typically 0.0 to 1.0, inclusive). Specifically, the ratio between the far and near clipping planes strongly affects how unevenly z values are distributed. Using a far-plane distance to near-plane distance ratio of 100, 90 percent of the depth buffer range is spent on the first 10 percent of the scene depth range. Typical applications for entertainment or visual simulations with exterior scenes often require far-plane/near-plane ratios of anywhere between 1000 and 10000. At a ratio of 1000, 98 percent of the range is spent on the first 2 percent of the depth range, and the distribution becomes worse with higher ratios."

Does anybody know WHY the z values in a z-buffer tend not to be distributed evenly across the z-buffer range? Thanks.
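The figures quoted from the help can be checked with a quick numerical sketch of the standard D3D-style perspective depth mapping (the function names here are mine; the exact percentages depend on how the near plane is folded into the ratio, so they come out close to, but not exactly, the quoted values):

```python
# Standard D3D-style perspective depth mapping: an eye-space depth Z in
# [zNear, zFar] maps to a buffer value in [0, 1] via d = a + b/Z.

def depth_buffer_value(z, z_near, z_far):
    a = z_far / (z_far - z_near)
    b = z_far * z_near / (z_near - z_far)
    return a + b / z

def buffer_fraction_used(scene_fraction, ratio):
    """Buffer range consumed by the first `scene_fraction` of the scene
    depth range, with zFar = ratio * zNear (zNear fixed at 1)."""
    z_near, z_far = 1.0, float(ratio)
    z = z_near + scene_fraction * (z_far - z_near)
    return depth_buffer_value(z, z_near, z_far)

print(buffer_fraction_used(0.10, 100))   # ~0.92: first 10% of the scene uses ~90% of the buffer
print(buffer_fraction_used(0.02, 1000))  # ~0.95: first 2% of the scene uses most of the buffer
```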
This is just a guess, but I reckon it's because overlapping is way more noticeable near the near plane.
You definitely want to push your near plane out as far as you can get it, which would probably be about as far as the player's collision bounding radius for an FPS cam. The further you can push it out, the better z quality you get at far distances.
Close to a year ago I did some research on this myself and found that the fairly standard algorithm for calculating depth values is the following.
Assuming Z is a depth value between zNear and zFar:

DepthBufferVal = a + b/Z

where

a = zFar / (zFar - zNear)
b = zFar * zNear / (zNear - zFar)
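A quick sketch of that mapping and its inverse shows just how hyperbolic the spacing is (function names are illustrative; a and b are the constants above):

```python
# The mapping d = a + b/Z and its inverse Z = b / (d - a).

def depth_to_buffer(z, z_near, z_far):
    a = z_far / (z_far - z_near)
    b = z_far * z_near / (z_near - z_far)
    return a + b / z

def buffer_to_depth(d, z_near, z_far):
    a = z_far / (z_far - z_near)
    b = z_far * z_near / (z_near - z_far)
    return b / (d - a)

# With zNear = 1 and zFar = 1000, the first half of the buffer range
# only covers eye depths from 1 to about 2:
for d in (0.0, 0.5, 0.9, 0.99, 1.0):
    print(d, buffer_to_depth(d, 1.0, 1000.0))
```

Half of all representable buffer values end up spent between Z = 1 and Z ≈ 2; that is exactly the unevenness the help text describes.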

You can read the thread I started at the time here.

The WHY is that, because of the perspective projection, things in the foreground are larger and thereby more apt to produce depth-buffer precision errors (parts of geometry showing up in front of other geometry that is actually closer to the camera). Using this kind of hyperbolic distribution of depth values, more precision is kept in the foreground and less far off in the distance, where no one will notice small errors anyway.
Hack my projects! Oh Yeah! Use an SVN client to check them out. BlockStacker
Quote: Original post by staaf
Using this kind of hyperbolic distribution of depth values, more precision is kept in the foreground and less far off in the distance, where no one will notice small errors anyway.


The errors are often quite noticeable, and the resulting flicker is called z-fighting (though it has several other names too). Of course this is usually the programmer's problem -- setting the distance between the near and far planes to too high a value.

Though it is not well documented in DirectX, some hardware supports what is called a w-buffer, which distributes the depth values in a much more linear way, allowing for a very large value range between the near and far planes. There may be cases where the w-buffer produces other sorts of artifacts, but I have found it to work beautifully on hardware that supports it.
Quote: Original post by turnpast
Of course this is usually the programmer's problem -- setting the distance between the near and far planes to too high a value.

This is not entirely true. It's not the distance between them. For example, 0 to 500 may produce a lot of z-fighting in medium zones, where 20 to 1000 would have no noticeable problems at all. That's saying a distance of 500 causes more problems than a distance of 980. The near plane (its ratio relative to the far plane) is the most important factor for optimizing depth values.
Quote: Original post by turnpast
Though it is not well documented in DirectX, some hardware supports what is called a w-buffer, which distributes the depth values in a much more linear way, allowing for a very large value range between the near and far planes. There may be cases where the w-buffer produces other sorts of artifacts, but I have found it to work beautifully on hardware that supports it.

I don't have any facts, but I've heard that most major card manufacturers have dropped their support for w-buffers to concentrate on z-buffers. Might be worth considering.
Yes, W-buffers are awesome.
For a 24-bit W-buffer with a view distance of 10000 m, the resolution is roughly 0.6 mm.
Thus spacing polygons 1 mm in front of each other shouldn't cause problems even 10 km away!
I bet that a Z-buffer will get into big problems after a couple of hundred metres.
On the X-box I found them really nice to use.
I NEVER had any problems with Z-fighting on the X-box (admittedly we only had an 800 m view distance, thus having ~0.05 mm resolution).
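These figures can be sanity-checked with a rough precision model (a sketch: it assumes a plain 24-bit integer buffer and ignores the floating-point depth formats some hardware uses; the function names are mine):

```python
# Rough precision model for a 24-bit integer depth buffer.
STEPS = 2 ** 24

def w_buffer_resolution(z_far):
    # W-buffer: depth is stored linearly, so resolution is uniform.
    return z_far / STEPS

def z_buffer_resolution(z, z_near, z_far):
    # Z-buffer: d = a + b/Z, so one buffer step near depth Z spans
    # roughly |dZ/dd| / STEPS = Z^2 / |b| / STEPS metres.
    b = z_far * z_near / (z_near - z_far)
    return z * z / abs(b) / STEPS

print(w_buffer_resolution(10000.0))               # ~0.6 mm everywhere
print(z_buffer_resolution(100.0, 1.0, 10000.0))   # ~0.6 mm at 100 m
print(z_buffer_resolution(1000.0, 1.0, 10000.0))  # ~6 cm at 1 km
print(z_buffer_resolution(10000.0, 1.0, 10000.0)) # metres at the far plane
```

Under this model the z-buffer matches the w-buffer's worst-case resolution at around 100 m and falls off quadratically beyond that, which lines up with the "big problems after a couple of hundred metres" estimate.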

It's too bad that support seems to be going away (I think w-buffers cost more in terms of transistor usage).


Just my 2c

Edit: Btw, if we're getting the ability to use a depth buffer as a source texture, it's much easier to get the actual distance from the near plane using W-buffers (if this happens, though, the HW would probably do some conversion for us).
We did this on the X-box for some deferred lighting stuff; it works perfectly.
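A sketch of why the w-buffer is convenient here: recovering linear eye distance from a z-buffer sample means undoing the hyperbolic mapping, while a w-buffer sample is linear already (assuming it stores eye depth normalized by the far plane; the function names are illustrative):

```python
# Recovering linear eye-space distance from a depth-buffer sample,
# e.g. for deferred lighting.

def linear_depth_from_w(sample, z_far):
    # W-buffer (assumed to store Z/zFar): a single multiply.
    return sample * z_far

def linear_depth_from_z(sample, z_near, z_far):
    # Z-buffer: invert d = a + b/Z  =>  Z = b / (d - a).
    a = z_far / (z_far - z_near)
    b = z_far * z_near / (z_near - z_far)
    return b / (sample - a)
```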
