DX11 How to offset pixel depth in the pixel shader by a constant value?


I'm trying to offset the depth value of every pixel written by an HLSL pixel shader by a constant view-space amount (fighting games like Guilty Gear and Street Fighter V use this to simulate 2D layering effects, and I'd like to do something similar). The projection matrix is generated in SharpDX using a standard perspective projection (PerspectiveFovLH, which produces a matrix similar to the one described at the bottom there).

My pixel shader looks like this

struct PSoutput {
    float4 color : SV_TARGET;
    float depth : SV_DEPTH;
};

PSoutput PShaderNormalDepth(VOutColorNormalView input)
{
    PSoutput output;
    output.color = BlinnPhong(input.color, input.normal, input.viewDirection);
    output.depth = input.position.z; // input.position is just the standard SV_POSITION
    return output;
}

This gives me exactly the same results as before I added the depth output, which makes sense, since I'm just writing back the depth the rasterizer would have written anyway. Given a view-space offset value passed in a constant buffer, how do I compute the correct depth to output?

EDIT: I'd been stuck on this for weeks, but of course shortly after posting I figured it out, after reading this.

So, with a standard projection, the clip-space position.z (after the perspective divide) contains D = a * (1/z) + b, where a is element _43 of the projection matrix, b is element _33, and z is the view-space depth. The view-space depth can therefore be recovered with z = a / (D - b).
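A quick numeric sanity check of that inversion (in Python, since it's pure arithmetic), using the _33/_43 values a PerspectiveFovLH-style left-handed perspective matrix produces; the near/far planes here are arbitrary example values:

```python
# Check that z = a/(D - b) inverts D = a*(1/z) + b, using the _33/_43
# terms of a left-handed perspective matrix (as built by PerspectiveFovLH).
zn, zf = 0.1, 100.0       # example near/far planes
b = zf / (zf - zn)        # projectionMatrix._33
a = -zn * zf / (zf - zn)  # projectionMatrix._43

def view_to_depth(z):
    """Post-divide depth-buffer value for view-space depth z."""
    return a / z + b

def depth_to_view(d):
    """Recover view-space depth from the depth-buffer value."""
    return a / (d - b)

# Round-tripping any view-space depth between the planes returns it exactly.
assert abs(depth_to_view(view_to_depth(5.0)) - 5.0) < 1e-9
```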

So to add a given view space depth offset in the pixel shader, you do this:

float trueZ = projectionMatrix._43 / (input.position.z - projectionMatrix._33); // recover view-space depth
output.depth = projectionMatrix._43 / (trueZ + zOffset) + projectionMatrix._33; // re-project with the offset applied
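As a self-contained numeric check that those two lines do what's intended, here is a Python translation (the _33/_43 values below are examples from a PerspectiveFovLH-style matrix, not anything the shader itself defines):

```python
# Python translation of the two HLSL lines: recover view-space depth from
# the depth-buffer value, add the offset, and re-project. Verifies that the
# adjusted depth corresponds to pushing the pixel z_offset units away.
zn, zf = 0.1, 100.0          # example near/far planes
m33 = zf / (zf - zn)         # projectionMatrix._33
m43 = -zn * zf / (zf - zn)   # projectionMatrix._43

def offset_depth(depth, z_offset):
    true_z = m43 / (depth - m33)            # float trueZ = ...
    return m43 / (true_z + z_offset) + m33  # output.depth = ...

view_z, z_offset = 5.0, 2.0
d = m43 / view_z + m33                       # depth-buffer value for view_z
recovered = m43 / (offset_depth(d, z_offset) - m33)
assert abs(recovered - (view_z + z_offset)) < 1e-9  # lands at view_z + z_offset
```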
Edited by chevluh
Solved my problem


If you're always going to offset the depth in one direction, you can use SV_DepthGreaterEqual or SV_DepthLessEqual instead of SV_DEPTH. These can be faster on GPUs that perform depth testing before the pixel shader runs (pretty much all desktop GPUs do this), since they let the hardware keep part of its early depth test enabled.


