Axiverse

Does D3D12 default to a logarithmic depth buffer?


I was trying to debug some z-fighting issues in my D3D12 app. I was using a depth stencil with the D32_FLOAT format, and my camera was set to a perspective projection with near at 0.0001f and far at 20.0f. The camera was 5 units away and the 3D model in question was approximately 1 unit deep. I was experiencing z-fighting, which I found strange since the camera range was so narrow. When I used the graphics debugger, I found that the depth buffer only used values in the range 0.9999 to 1, which seems like a really narrow range given how the camera was set up. Changing the near to 0.01f fixed the z-fighting, but the z-buffer still only utilizes 0.998 to 1. I thought depth buffers were linear by default; have they changed to logarithmic, or how is depth calculated?


With a perspective projection, depth has always been non-linear; nothing has changed in D3D12. You probably just haven't noticed it before.

If your units are metres, then your original near clip of 0.0001 is 0.1 millimetres, which is excessively small. Even your new 1 cm near clip is smaller than you probably need. Try to keep the ratio between the far and near clip planes as small as possible.

The trick you should start using is the reversed depth buffer (Google "Reversed Depth Buffer"). This should significantly reduce any z-fighting you're getting, even with larger far/near ratios.

Here's one such link: https://developer.nvidia.com/content/depth-precision-visualized

