yahastu

Depth buffer problem

I am getting some weird flickering between pixels of the wrong depth. Depth buffering is enabled with a 16-bit buffer (which seems to be the most my system can handle, because initializing the device fails if I use anything larger). Here are all of the relevant lines of code I could find:

//depth buffer
dx_PresParams.EnableAutoDepthStencil = TRUE;
dx_PresParams.AutoDepthStencilFormat = D3DFMT_D16;

dxDevice->SetRenderState(D3DRS_ZENABLE, D3DZB_TRUE);    //enable z buffering
dxDevice->SetRenderState(D3DRS_CULLMODE, D3DCULL_CCW);  //backface cull

D3DXMatrixPerspectiveFovLH(&m_Projection, D3DX_PI/4, aspect, 0.1f, 2500.0f);

As you can see in the screenshot, the far clipping plane is pretty close relative to the size of this object, so I would not expect to be having any such problems.

Screenshots:

You're only using a 16-bit depth buffer for a start, which severely limits precision. Secondly, the Z buffer's depth range isn't linear: something like 75% of the Z range is used up in the first 25% of the scene.

You could try:
  • Using a 24 bit depth buffer
  • Moving your near clip plane further away to something like 1.0f or more - the further away the better
  • Bringing your far clip plane forwards
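
To make the non-linearity concrete, here is a minimal sketch (not from the thread; it just plugs the near/far values from the original post into the standard perspective depth formula) of the value the Z buffer ends up storing for a given view-space depth:

#include <cstdio>

// Normalized value written to the depth buffer for view-space depth z,
// given near plane n and far plane f (standard LH perspective projection,
// as produced by D3DXMatrixPerspectiveFovLH).
float StoredDepth(float z, float n, float f)
{
    return (f / (f - n)) * (1.0f - n / z);
}

int main()
{
    printf("%f\n", StoredDepth(0.4f,  0.1f, 2500.0f)); // ~0.750: 75% of the range is gone by z = 0.4
    printf("%f\n", StoredDepth(25.0f, 0.1f, 2500.0f)); // ~0.996
    printf("%f\n", StoredDepth(25.0f, 1.0f, 2500.0f)); // ~0.960: moving zNear to 1.0 frees up range
    return 0;
}

With zNear = 0.1f and zFar = 2500.0f, three quarters of the representable depth values are spent before z reaches 0.4, so distant geometry has to share the few values that remain; roughly speaking, it is the far/near ratio, not the absolute far distance, that governs how the precision is distributed.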

    Ooooh, I assumed it was linear, so I never bothered changing the zNear... Increasing the zNear gets rid of all my problems at the far end. Thanks for that tip.

    Well, I would like to have more than 16 bits in a depth buffer...


    D3DPRESENT_PARAMETERS dx_PresParams;
    ZeroMemory( &dx_PresParams, sizeof(dx_PresParams) );
    dx_PresParams.Windowed = TRUE;
    dx_PresParams.SwapEffect = D3DSWAPEFFECT_DISCARD;

    //antialiasing 4x
    dx_PresParams.MultiSampleType = D3DMULTISAMPLE_4_SAMPLES;
    dx_PresParams.PresentationInterval = D3DPRESENT_INTERVAL_IMMEDIATE;

    dx_PresParams.BackBufferFormat = D3DFMT_UNKNOWN;

    //depth buffer
    dx_PresParams.EnableAutoDepthStencil = TRUE;
    dx_PresParams.AutoDepthStencilFormat = D3DFMT_D16;

    NOVA_TRY( p_dx_Object->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hMainWindow, D3DCREATE_HARDWARE_VERTEXPROCESSING, &dx_PresParams, &dxDevice),
              "Failure to CreateDevice", "InitializeDevice");


    The call to CreateDevice fails if AutoDepthStencilFormat is set to *ANY* format other than "D3DFMT_D16" (it won't even allow the other 16-bit ones).


    D3DFMT_D16_LOCKABLE 70 16-bit z-buffer bit depth.
    D3DFMT_D32 71 32-bit z-buffer bit depth.
    D3DFMT_D15S1 73 16-bit z-buffer bit depth where 15 bits are reserved for the depth channel and 1 bit is reserved for the stencil channel.
    D3DFMT_D24S8 75 32-bit z-buffer bit depth using 24 bits for the depth channel and 8 bits for the stencil channel.
    D3DFMT_D24X8 77 32-bit z-buffer bit depth using 24 bits for the depth channel.
    D3DFMT_D24X4S4 79 32-bit z-buffer bit depth using 24 bits for the depth channel and 4 bits for the stencil channel.
    D3DFMT_D32F_LOCKABLE 82 A lockable format where the depth value is represented as a standard IEEE floating-point number.
    D3DFMT_D24FS8 83 A non-lockable format that contains 24 bits of depth (in a 24-bit floating point format - 20e4) and 8 bits of stencil.
    D3DFMT_D16 80 16-bit z-buffer bit depth.
    D3DFMT_VERTEXDATA 100 Describes a vertex buffer surface.
    D3DFMT_INDEX16 101 16-bit index buffer bit depth.
    D3DFMT_INDEX32 102 32-bit index buffer bit depth.


    I have looked in the D3DCAPS9 struct and cannot find anything that really indicates that a card should not support certain depth buffer formats.

    Am I missing something?

    For reference, I am using a GeForce 6600GT 128MB...

    Quote:
    Original post by yahastu
    I have looked in the D3DCAPS9 struct and cannot find anything that really indicates that a card should not support certain depth buffer formats.


    Support for different depth surface types is not explicitly listed in the CAPS structure. Instead, you query the IDirect3D9 interface to find depth buffers that match a rendering configuration. The CapsViewer enumerates all supported combinations this way, as does the sample framework code, which you can use as a reference.

    I discuss this in Chapter 2 (Direct3D) and Chapter 3 (Direct3D Devices) of my book.
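
    For example, here is a minimal sketch of that query against the IDirect3D9 interface created in the code above (p_dx_Object); the 32-bit desktop format, the A8R8G8B8 render target and the D3DFMT_D24S8 candidate are assumptions for illustration, not values established in this thread:

    // Is D24S8 available as a depth/stencil surface on this adapter at all?
    HRESULT hr = p_dx_Object->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        D3DFMT_X8R8G8B8,                         // adapter/display format (assumed 32-bit desktop)
        D3DUSAGE_DEPTHSTENCIL, D3DRTYPE_SURFACE,
        D3DFMT_D24S8);                           // candidate depth format

    if (SUCCEEDED(hr))
    {
        // Can it be paired with the render target format actually in use?
        hr = p_dx_Object->CheckDepthStencilMatch(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
            D3DFMT_X8R8G8B8, D3DFMT_A8R8G8B8, D3DFMT_D24S8);
    }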

    As for the reason this happens, it's likely that your display is 16-bit and your card requires the backbuffer depth to match the depth buffer depth (passing D3DFMT_UNKNOWN causes the backbuffer format to be set to the display format).
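
    A quick way to test that theory is to ask the runtime what the desktop format actually is, and to set the backbuffer format explicitly instead of leaving it at D3DFMT_UNKNOWN. A minimal sketch, reusing p_dx_Object and dx_PresParams from the code posted above:

    D3DDISPLAYMODE mode;
    p_dx_Object->GetAdapterDisplayMode(D3DADAPTER_DEFAULT, &mode);
    // mode.Format is typically D3DFMT_R5G6B5 on a 16-bit desktop
    // and D3DFMT_X8R8G8B8 on a 32-bit desktop.

    dx_PresParams.BackBufferFormat = D3DFMT_X8R8G8B8;  // explicit 32-bit backbuffer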

    Thanks legalize, I'll have a look there when I get a chance.

    Evil Steve,

    Note that D3DFMT_INDEX16 and D3DFMT_D15S1 are both 16-bit formats, and they are refused also.

    I tried manually setting my backbuffer format to something 32-bit:

    dx_PresParams.BackBufferFormat = D3DFMT_A8R8G8B8;

    This does not cause any problems. However, it still refuses to allow me to use anything other than D3DFMT_D16 as the depth buffer :/

    Quote:
    Original post by yahastu
    For reference, I am using a GeForce 6600GT 128MB...


    Just so you know, this card should support D3DFMT_D24X8 and D3DFMT_D24S8. I was using it until recently and could run in those modes without problems. Try to tweak the other params and you should get it working.
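
    For what it's worth, one of the "other params" in the code posted earlier that interacts with the depth format is the 4x multisample setting: the auto depth/stencil surface has to support the requested multisample type as well. Whether that is the culprit here isn't established in this thread, but a minimal sketch of checking candidate depth formats against it (reusing p_dx_Object and dx_PresParams from above) could look like this:

    // Candidate depth formats, best first; fall back to D16 if nothing better passes.
    const D3DFORMAT candidates[] = { D3DFMT_D24S8, D3DFMT_D24X8, D3DFMT_D16 };

    for (int i = 0; i < 3; ++i)
    {
        if (SUCCEEDED(p_dx_Object->CheckDeviceMultiSampleType(
                D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, candidates[i],
                dx_PresParams.Windowed, dx_PresParams.MultiSampleType, NULL)))
        {
            dx_PresParams.AutoDepthStencilFormat = candidates[i];
            break;
        }
    }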

