
D3D11 Stretch DepthStencil



5 replies to this topic

#1 yoelshoshan   Members   -  Reputation: 229


Posted 03 December 2011 - 05:42 AM

Hi,

I'm trying to find a way to stretch a depth-stencil resource (change its size while maintaining the aspect ratio).
I'm aware that this is not common, but it is part of a bigger algorithm that needs to be done. (The algorithm revolves around Intel's Dynamic Resolution Change method.)

The method can involve a few steps and doesn't need to be super-fast, but the CPU should not be involved in the process.

Reading the values isn't a problem; however, I'm stuck for now on writing into the depth-stencil :/

Is anyone here familiar with such a method, or has an idea how to do it?

Cheers,
Yoel.


#2 Hodgman   Moderators   -  Reputation: 32036


Posted 03 December 2011 - 05:52 AM

To output depth values, make your pixel shader return a struct instead of a float4: in the struct, add a float4 member tagged with the COLOR semantic (if you want to stretch the colour buffer at the same time) and a float member tagged with the DEPTH semantic.

...however, it would be interesting to know why you need to do this, as interpolating a depth buffer generally doesn't produce "valid" results.
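For illustration, a minimal HLSL sketch of that struct-output approach, assuming a full-screen pass and placeholder resource names (SV_Target / SV_Depth are the D3D11-era names for the COLOR / DEPTH semantics mentioned above):

Texture2D<float4> SourceColor : register(t0); // SRV over the source colour buffer (assumed name)
Texture2D<float>  SourceDepth : register(t1); // SRV over the source depth buffer, e.g. through an R24_UNORM_X8_TYPELESS view
SamplerState      PointClamp  : register(s0); // point sampling; filtered depth is generally not meaningful

struct PSOutput
{
    float4 color : SV_Target; // stretched colour
    float  depth : SV_Depth;  // stretched depth, written to the currently bound depth-stencil
};

PSOutput StretchPS(float4 pos : SV_Position, float2 uv : TEXCOORD0)
{
    PSOutput o;
    o.color = SourceColor.Sample(PointClamp, uv);
    o.depth = SourceDepth.Sample(PointClamp, uv);
    return o;
}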

#3 Promethium   Members   -  Reputation: 580


Posted 03 December 2011 - 05:54 AM

Have you tried binding the depth-stencil texture as an ordinary render target? Try creating two render target views, one with type DXGI_FORMAT_R24_UNORM_X8_TYPELESS and the other with DXGI_FORMAT_X24_TYPELESS_G8_UINT, and then render to them as ordinary render targets. I don't know if it will work, but I think it should. You will probably have to do two passes, as the driver will likely not allow you to bind both targets at the same time.

#4 yoelshoshan   Members   -  Reputation: 229


Posted 03 December 2011 - 08:30 AM

Thanks for your quick answers!

Promethium:
I've tried similar stuff, but I get errors that this type is not supported
("ID3D11Device::CreateTexture2D: The format (0x2f, X24_TYPELESS_G8_UINT) cannot be bound as a RenderTarget").
I tried the rest of that format "family" as well.
However, I only have a 9600 GT here, so I'll be able to test with newer hardware tomorrow (a GTX 480).

Hodgman:
I will try this and report the performance I get with it. I somehow remembered that outputting depth in the pixel shader was a big no-no, but a few years and several DX API versions have passed since then, so maybe the performance won't be terribly slow now =)

Update will follow :)

#5 MJP   Moderators   -  Reputation: 11844


Posted 03 December 2011 - 03:16 PM

Pixel shaders can directly output depth to a depth buffer. You use the SV_Depth semantic to do so. There's no way to output stencil, however.

Outputting depth is not recommended when actually rendering geometry, since it disables early depth-culling hardware. For a full screen pass you won't even be using depth culling, so that is not an issue.
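A hedged depth-only sketch of such a full-screen pass, where the source register, constant-buffer layout and scale factor are all assumptions rather than anything from this thread:

Texture2D<float> SourceDepth : register(t0); // SRV over the original depth buffer

cbuffer StretchParams : register(b0)
{
    float2 gDstToSrcScale; // e.g. sourceSize / destinationSize
};

// Depth-only pixel shader: colour writes can be disabled, and stencil cannot be written this way.
float StretchDepthPS(float4 pos : SV_Position) : SV_Depth
{
    int2 srcPixel = (int2)(pos.xy * gDstToSrcScale); // map the destination pixel back to a source pixel
    return SourceDepth.Load(int3(srcPixel, 0));      // nearest-neighbour fetch of the raw depth value
}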

#6 yoelshoshan   Members   -  Reputation: 229


Posted 08 December 2011 - 07:21 AM

OK, thanks for your help, people!

I've eventually decided to avoid stretching the depth-stencil altogether and to use an alternative method, which works now (it involves patching HLSL assembly ...)






