[DX11] Writing depth buffer from PS using SV_Depth


I'm trying to render to a depth buffer (D24S8) using SV_Depth in the pixel shader (for doing a selective downsample), but the value isn't written (I'm 100% sure of this).

The value itself is correct (I output it to a color buffer as well).
Depth writes are enabled, depth testing is disabled.
Enabling / disabling stencil doesn't seem to matter.
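
For reference, a depth-stencil state that writes depth without testing it would look roughly like the sketch below (the device/context names are placeholders). One thing worth double-checking: in D3D11, DepthEnable = FALSE disables depth writes as well as the test, so "testing disabled" has to be expressed as an always-passing DepthFunc instead.

// Sketch: write depth without testing it. DepthEnable must stay TRUE,
// otherwise the depth buffer is never updated; the test is neutralized
// with D3D11_COMPARISON_ALWAYS instead.
D3D11_DEPTH_STENCIL_DESC dsDesc = {};
dsDesc.DepthEnable    = TRUE;
dsDesc.DepthWriteMask = D3D11_DEPTH_WRITE_MASK_ALL;
dsDesc.DepthFunc      = D3D11_COMPARISON_ALWAYS;
dsDesc.StencilEnable  = FALSE;

ID3D11DepthStencilState* dsState = nullptr;
device->CreateDepthStencilState(&dsDesc, &dsState);
context->OMSetDepthStencilState(dsState, 0);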

Does anyone know of any limitations that might cause this? Or anything else that might go wrong?

If there's some limitation, are there any alternatives for doing (selective) downsamples of a depth buffer? I need a stencil buffer with the downsampled result.

Thanks

Already read it, I am doing none of that.

Also, the shader is nothing fancy: it's four depth-texture reads, min and max instructions, and then it outputs the same value to SV_Target's R component and to SV_Depth. I've even tried just outputting the texture fetch result, which still didn't work (even though the color buffer contained the correct value).
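
For context, a sketch of what such a shader might look like (the resource names, sample offsets, and the choice of min here are assumptions, not the actual code):

// Sketch of a selective depth-downsample pixel shader.
Texture2D<float> DepthTex     : register(t0);
SamplerState     PointSampler : register(s0);

cbuffer DownsampleCB : register(b0)
{
    float2 InvSourceSize; // 1.0 / source depth buffer dimensions
};

struct PSOutput
{
    float4 color : SV_Target; // same value echoed to R for verification
    float  depth : SV_Depth;  // value that should land in the depth buffer
};

PSOutput PSDownsampleDepth(float4 pos : SV_Position, float2 uv : TEXCOORD0)
{
    // Four depth reads around the low-res pixel centre.
    float d0 = DepthTex.Sample(PointSampler, uv + float2(-0.5, -0.5) * InvSourceSize);
    float d1 = DepthTex.Sample(PointSampler, uv + float2( 0.5, -0.5) * InvSourceSize);
    float d2 = DepthTex.Sample(PointSampler, uv + float2(-0.5,  0.5) * InvSourceSize);
    float d3 = DepthTex.Sample(PointSampler, uv + float2( 0.5,  0.5) * InvSourceSize);

    // "Selective" choice: keep the nearest sample (min; use max for farthest).
    float d = min(min(d0, d1), min(d2, d3));

    PSOutput o;
    o.color = float4(d, 0.0, 0.0, 1.0);
    o.depth = d;
    return o;
}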

Does the final result have to be a depth buffer? Couldn't you just output to a floating-point render target?

Cheers!


You really don't have to do anything special, just output a [0,1] value. If you're doing something majorly wrong then the runtime will usually yell at you in the debug output (assuming you created the device with the DEBUG flag). I assume that you've verified in PIX that the output depth buffer isn't getting the values that you expect?
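
For reference, the debug layer is enabled at device creation, roughly like this (a minimal sketch):

// Sketch: create the device with the debug layer so the runtime
// reports errors and warnings to the debug output window.
UINT flags = 0;
#if defined(_DEBUG)
flags |= D3D11_CREATE_DEVICE_DEBUG;
#endif

ID3D11Device*        device  = nullptr;
ID3D11DeviceContext* context = nullptr;
D3D11CreateDevice(nullptr,                   // default adapter
                  D3D_DRIVER_TYPE_HARDWARE,
                  nullptr,                   // no software rasterizer
                  flags,
                  nullptr, 0,                // default feature levels
                  D3D11_SDK_VERSION,
                  &device, nullptr, &context);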

Does the final result have to be a depth buffer? Couldn't you just output to a floating-point render target?

Unfortunately, yes: I'm doing a low-resolution pass for which I need depth-tested stencil buffer masking.

You really don't have to do anything special, just output a [0,1] value. If you're doing something majorly wrong then the runtime will usually yell at you in the debug output (assuming you created the device with the DEBUG flag). I assume that you've verified in PIX that the output depth buffer isn't getting the values that you expect?

That's what I thought, and why I'm so surprised I can't get it to work.

I checked PIX (and PerfHUD). The depth buffer values stay the same; the value I write to oDepth when I debug the pixel shader is correct (though oDepth seems to be listed as Name TBD, Value (_,_,_,_) and Type TBD in the Registers tab).
I enabled the DX11 debug layer, and I get no errors or (related) warnings.

Maybe you have a wrong viewport (it also defines a depth range). Anyway, can you show your relevant code and the shader, please?
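
For what it's worth, the depth range in question is the viewport's MinDepth/MaxDepth pair; values written through SV_Depth are clamped into that range. A typical full-range setup looks like this (a sketch, with assumed target dimensions):

// Sketch: viewport covering the full [0,1] depth range. A degenerate
// range (MinDepth == MaxDepth) would clamp every SV_Depth write to a
// single constant value.
D3D11_VIEWPORT vp = {};
vp.TopLeftX = 0.0f;
vp.TopLeftY = 0.0f;
vp.Width    = (FLOAT)targetWidth;  // assumed render target size
vp.Height   = (FLOAT)targetHeight;
vp.MinDepth = 0.0f;
vp.MaxDepth = 1.0f;
context->RSSetViewports(1, &vp);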
