Ripiz

DX11 [C++, DX11] Drawing depth buffer


Hello,

I'm trying to draw the depth buffer (DB), but it seems the shader doesn't read the DB values correctly; to be more exact, it reads 1 all the time.
If I clear the DB with a value other than 1, then it does read that other value. I thought maybe the DB wasn't working correctly, but all geometry is drawn correctly.
Could anyone help me resolve this issue?

Depth Buffer initialization:

// modified DXUT11 code

// The texture is created TYPELESS so it can be viewed both as
// D32_FLOAT (depth-stencil view) and as R32_FLOAT (shader resource view).
ID3D11Texture2D* pDepthStencil = NULL;
D3D11_TEXTURE2D_DESC descDepth;
descDepth.Width = backBufferSurfaceDesc.Width;
descDepth.Height = backBufferSurfaceDesc.Height;
descDepth.MipLevels = 1;
descDepth.ArraySize = 1;
descDepth.Format = DXGI_FORMAT_R32_TYPELESS;
descDepth.SampleDesc.Count = pDeviceSettings->d3d11.sd.SampleDesc.Count;
descDepth.SampleDesc.Quality = pDeviceSettings->d3d11.sd.SampleDesc.Quality;
descDepth.Usage = D3D11_USAGE_DEFAULT;
descDepth.BindFlags = D3D11_BIND_SHADER_RESOURCE | D3D11_BIND_DEPTH_STENCIL;
descDepth.CPUAccessFlags = 0;
descDepth.MiscFlags = 0;
hr = pd3dDevice->CreateTexture2D( &descDepth, NULL, &pDepthStencil );

D3D11_DEPTH_STENCIL_VIEW_DESC descDSV;
descDSV.Format = DXGI_FORMAT_D32_FLOAT; // typed depth view of the R32_TYPELESS texture
descDSV.Flags = 0;
if( descDepth.SampleDesc.Count > 1 )
    descDSV.ViewDimension = D3D11_DSV_DIMENSION_TEXTURE2DMS;
else
    descDSV.ViewDimension = D3D11_DSV_DIMENSION_TEXTURE2D;
descDSV.Texture2D.MipSlice = 0;
hr = pd3dDevice->CreateDepthStencilView( pDepthStencil, &descDSV, &pDSV );

// ShaderResourceView
ID3D11DepthStencilView *pDSView = DXUTGetD3D11DepthStencilView();
ID3D11Resource *pTexture = NULL;
pDSView->GetResource( &pTexture );

D3D11_SHADER_RESOURCE_VIEW_DESC srdesc;
srdesc.Format = DXGI_FORMAT_R32_FLOAT; // R32_FLOAT view of the R32_TYPELESS depth texture
srdesc.ViewDimension = D3D11_SRV_DIMENSION_TEXTURE2D;
srdesc.Texture2D.MostDetailedMip = 0; // was TextureCube.*, which only worked because the union members alias
srdesc.Texture2D.MipLevels = 1;
pDevice->CreateShaderResourceView(pTexture, &srdesc, &m_pDepthBuffer);
SAFE_RELEASE(pTexture);



Rendering:

// Unbind the depth-stencil view first: a resource cannot be bound as an SRV
// and a DSV at the same time, or the runtime will NULL out the conflicting binding.
ID3D11RenderTargetView *view = DXUTGetD3D11RenderTargetView();
pDeviceContext->OMSetRenderTargets(1, &view, 0);
pDeviceContext->PSSetShader(m_pPixelShader1, 0, 0);
pDeviceContext->PSSetShaderResources(0, 1, &m_pDepthBuffer);
pDeviceContext->Draw(4, 0); // fullscreen quad, triangle-strip topology
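
The Draw(4, 0) assumes a triangle-strip fullscreen quad fed by a pass-through vertex shader along these lines (a sketch only; the SV_VertexID trick is just one way to do it, not necessarily the original setup):

// Sketch of a fullscreen-quad vertex shader: positions and texcoords are
// generated from SV_VertexID alone, so no vertex buffer is needed.
// Triangle-strip order: top-left, top-right, bottom-left, bottom-right.
float4 VS(uint id : SV_VertexID, out float2 texcoord : TEXCOORD0) : SV_POSITION {
    texcoord = float2(id & 1, id >> 1); // (0,0) (1,0) (0,1) (1,1)
    return float4(texcoord * float2(2, -2) + float2(-1, 1), 0, 1);
}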


Shader:
SamplerState texSampler : register(s0);
Texture2D<float> depthTex : register(t0);

float4 PS(float4 position : SV_POSITION, float2 texcoord : TEXCOORD0) : SV_TARGET {
return float4(depthTex.Sample(texSampler, texcoord).xxx, 1);
}



If anything else is needed, let me know. Thank you.

Have you tried applying some bias to the value in the pixel shader, like:
return pow(depth,10)
// or

return pow(depth*0.01,10)
// play around with these two values



You could also check it in PIX, but it is important that you adjust the depth range sliders (the left one must go _almost_ all the way to the right).


It seems pow(depth, 100) does the trick, but when I change the near/far planes it's wrong again, not to mention that a power of 100 probably takes a lot of time.




Are you sure it's *actually* 1.0, or is it just very close to 1.0? Because a depth buffer from a perspective projection will contain values very close to 1.0 for most of the depth range. I gave a more detailed explanation here if you're interested: http://www.gamedev.n...58#entry4831558
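
For a concrete sense of scale (assuming the standard D3D perspective projection, where the stored value is d = f*(z - n) / (z*(f - n))): with n = 0.1 and f = 5000, a point at view-space depth z = 50, only 1% of the way to the far plane, already stores roughly 0.998.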

Now that you mention it, I noticed a problem... With n3Xus's suggestion to try pow(depth, 100); I can clearly see darker areas, and PIX shows it darker as well, but when I debug a dark pixel, it returns (1.0, 1.0, 1.0, 1.0), even though the pixel is actually (0.3, 0.3, 0.3, 1.0).




Is there any more efficient way to get a proper value than raising to some high power? With my usual depth range of 0.1 - 5000.0f I had to use pow(depth, 500); to get a decent color difference.
Basically I'm trying to do depth-based edge detection, but with such a high power value even a flat surface near the camera gets detected as an edge. Or will I have to make a linear depth buffer?

Usually you just rescale so that 0.9 in your depth buffer is mapped to 0.0 in your output color, so output = (depth - 0.9f) * 10.0f or something like that. A more robust way would be to convert the depth value back to linear Z and then divide by your far clip plane.
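
A minimal HLSL sketch of that second approach, assuming the standard (non-reversed) D3D perspective projection; nearZ and farZ here are parameters you would supply yourself:

// Invert depth = f*(z - n) / (z*(f - n)) to recover view-space Z,
// then normalize by the far plane so the output spans 0..1.
float LinearDepth(float depth, float nearZ, float farZ) {
    float viewZ = (nearZ * farZ) / (farZ - depth * (farZ - nearZ));
    return viewZ / farZ; // 0 near the camera, 1 at the far plane
}

// e.g. with the 0.1 - 5000.0f range mentioned above:
// return float4(LinearDepth(depthTex.Sample(texSampler, texcoord), 0.1f, 5000.0f).xxx, 1);

A linear gradient like this should also be a much better input for depth-based edge detection than the pow() trick.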
