yoelshoshan

d3d 11 Stretch DepthStencil


Hi,

I'm trying to find a way to stretch a depth-stencil resource (change its size while maintaining the aspect ratio).
I'm aware that this is uncommon, but it's part of a larger algorithm that needs to be done. (The algorithm revolves around Intel's Dynamic Resolution Change method.)

The method can involve a few steps and doesn't need to be super-fast, but the CPU must not be involved in the process.

Reading the values isn't a problem; however, I'm stuck for now on writing into the depth-stencil :/

Is anyone here familiar with such a method, or has an idea how to do it?

Cheers,
Yoel.

To output depth values, you can make your pixel shader return a struct instead of a float4: in the struct, add a float4 member tagged with the SV_Target semantic (if you want to stretch the colour buffer at the same time), and add a float member tagged with the SV_Depth semantic.

...however, it would be interesting to know why you need to do this, as interpolating a depth buffer generally doesn't produce "valid" results.
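The struct-based output described above can be sketched as an HLSL pixel shader for a full-screen pass. This is a minimal sketch, not code from the thread: the texture, sampler, and function names are assumptions, and it presumes the source depth is bound as an SRV (e.g. DXGI_FORMAT_R24_UNORM_X8_TYPELESS over an R24G8_TYPELESS texture).

```hlsl
// Hypothetical resource names -- bind your own SRVs/samplers.
Texture2D<float>  gSrcDepth     : register(t0);
Texture2D<float4> gSrcColor     : register(t1);
SamplerState      gPointSampler : register(s0);

struct PSOutput
{
    float4 color : SV_Target; // optional: stretch the colour buffer too
    float  depth : SV_Depth;  // written into the bound depth-stencil view
};

PSOutput StretchPS(float4 pos : SV_Position, float2 uv : TEXCOORD0)
{
    PSOutput o;
    o.color = gSrcColor.Sample(gPointSampler, uv);
    // Point sampling on purpose: interpolating depth values across
    // surface boundaries generally gives invalid results, as noted above.
    o.depth = gSrcDepth.Sample(gPointSampler, uv);
    return o;
}
```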

Have you tried binding the depth-stencil texture as an ordinary render target? Try creating two render target views, one with type DXGI_FORMAT_R24_UNORM_X8_TYPELESS and the other with DXGI_FORMAT_X24_TYPELESS_G8_UINT and then render to them as ordinary render targets. I don't know if it will work, but I think it should. You will probably have to make a two-pass render as the driver will likely not allow you to bind both targets at the same time.
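For reference, the view setup being suggested would look roughly like the sketch below (hypothetical variable names, error handling omitted). Note that, as the follow-up in this thread reports, the runtime may reject these formats for render-target binding, so treat this strictly as an illustration of the idea, not a working recipe.

```cpp
// Typeless storage so both depth- and stencil-typed views can be created.
D3D11_TEXTURE2D_DESC desc = {};
desc.Width = width;
desc.Height = height;
desc.MipLevels = 1;
desc.ArraySize = 1;
desc.Format = DXGI_FORMAT_R24G8_TYPELESS;
desc.SampleDesc.Count = 1;
desc.Usage = D3D11_USAGE_DEFAULT;
desc.BindFlags = D3D11_BIND_RENDER_TARGET; // reported to fail validation
ID3D11Texture2D* tex = nullptr;
device->CreateTexture2D(&desc, nullptr, &tex);

D3D11_RENDER_TARGET_VIEW_DESC rtv = {};
rtv.ViewDimension = D3D11_RTV_DIMENSION_TEXTURE2D;

rtv.Format = DXGI_FORMAT_R24_UNORM_X8_TYPELESS;  // depth component
ID3D11RenderTargetView* depthRTV = nullptr;
device->CreateRenderTargetView(tex, &rtv, &depthRTV);

rtv.Format = DXGI_FORMAT_X24_TYPELESS_G8_UINT;   // stencil component
ID3D11RenderTargetView* stencilRTV = nullptr;
device->CreateRenderTargetView(tex, &rtv, &stencilRTV);
```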

Thanks for your quick answers!

Promethium:
I've tried similar things, but I get errors that the type is not supported:
("ID3D11Device::CreateTexture2D: The format (0x2f, X24_TYPELESS_G8_UINT) cannot be bound as a RenderTarget")
I tried the rest of that format "family" as well.
However, I only have a 9600 GT here, so I'll be able to test on newer hardware (a GTX 480) tomorrow.

Hodgman:
I will try this and report the performance I get with it. I somehow remembered that outputting depth in the pixel shader was a big no-no, but a few years and DX API versions have passed since then, so maybe the performance won't be terrible now =)

An update will follow :)

Pixel shaders can directly output depth to a depth buffer. You use the SV_Depth semantic to do so. There's no way to output stencil, however.

Outputting depth is not recommended when actually rendering geometry, since it disables the early depth-culling hardware. For a full-screen pass you won't be using depth culling anyway, so that isn't an issue.
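The API-side setup for such a full-screen depth-write pass might look like the sketch below. This is an assumption-laden illustration, not code from the thread: variable names (srcDepthTex, dstDSV, etc.) are hypothetical, and it assumes the shader writes SV_Depth as discussed above.

```cpp
// Depth-stencil state: test always passes, depth writes enabled,
// stencil untouched (there's no way to output stencil from the shader).
D3D11_DEPTH_STENCIL_DESC dsDesc = {};
dsDesc.DepthEnable = TRUE;
dsDesc.DepthWriteMask = D3D11_DEPTH_WRITE_MASK_ALL;
dsDesc.DepthFunc = D3D11_COMPARISON_ALWAYS;
ID3D11DepthStencilState* writeDepth = nullptr;
device->CreateDepthStencilState(&dsDesc, &writeDepth);

// Read the source depth through an SRV over a typeless texture.
D3D11_SHADER_RESOURCE_VIEW_DESC srvDesc = {};
srvDesc.Format = DXGI_FORMAT_R24_UNORM_X8_TYPELESS;
srvDesc.ViewDimension = D3D11_SRV_DIMENSION_TEXTURE2D;
srvDesc.Texture2D.MipLevels = 1;
ID3D11ShaderResourceView* srcDepthSRV = nullptr;
device->CreateShaderResourceView(srcDepthTex, &srvDesc, &srcDepthSRV);

// Bind the destination depth-stencil view (no colour target needed),
// set the SV_Depth-writing pixel shader, and draw a full-screen triangle.
context->OMSetRenderTargets(0, nullptr, dstDSV);
context->OMSetDepthStencilState(writeDepth, 0);
context->PSSetShaderResources(0, 1, &srcDepthSRV);
context->Draw(3, 0);
```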

OK, thanks for your help, people!

I eventually decided to avoid stretching the depth-stencil altogether and to use an alternative method that now works (which involves patching HLSL assembly ...)
