


Topics I've Started

Position and Normals from depth revisited (artifacts/banding)

07 May 2013 - 02:26 PM

I am trying to recover position and normals from depth stored in a render-to-texture. I store the depth like this (the view-space position is written out by the vertex shader, and the pixel shader stores its length):

    // vertex shader: transform the object-space position into view space
    outObjPosViewSpace = mul(worldView, inPos);

    // pixel shader: distance from the eye, stored in the w component of the RGBA texture
    float depth = length(inObjPosViewSpace);
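To make the stored quantity concrete, here is a minimal CPU sketch of the same math, with numpy standing in for the shader's `mul`/`length` (the function name and the identity view matrix are mine, purely for illustration):

```python
import numpy as np

def view_space_distance(world_view: np.ndarray, pos_obj: np.ndarray) -> float:
    """Transform an object-space point into view space and return its
    distance from the eye -- the value the pixel shader writes to .w."""
    p = world_view @ np.append(pos_obj, 1.0)   # mul(worldView, inPos)
    return float(np.linalg.norm(p[:3]))        # length(inObjPosViewSpace)

# Example: identity view matrix (eye at origin), point 5 units away.
print(view_space_distance(np.eye(4), np.array([0.0, 0.0, -5.0])))  # 5.0
```

Note that this is a radial distance from the eye, not the z-buffer style depth along the view axis; the reconstruction below has to match that convention.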


I am trying to recover the normals/depth like so:

    // distance from the eye, as stored above
    float fDepth = tex2D(rtt, screenCoords).w;

    // world-space position: eye position plus the (normalized) per-pixel
    // view ray scaled by the stored distance
    float3 vPosition = eyePosW.xyz + viewRay * fDepth;

    // normal from the screen-space derivatives of the reconstructed position
    float3 vNormal = cross(normalize(ddy(vPosition.xyz)), normalize(ddx(vPosition.xyz)));
    vNormal = normalize(vNormal);
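The reconstruction can be checked on the CPU. The sketch below (my own code, assuming `viewRay` is the normalized ray from the eye through the pixel) mirrors the position recovery and imitates the `cross(ddy, ddx)` normal with explicit neighbour differences:

```python
import numpy as np

def reconstruct_position(eye: np.ndarray, view_ray: np.ndarray, depth: float) -> np.ndarray:
    """eyePosW + normalize(viewRay) * storedDistance."""
    ray = view_ray / np.linalg.norm(view_ray)
    return eye + ray * depth

def normal_from_neighbors(p: np.ndarray, p_dx: np.ndarray, p_dy: np.ndarray) -> np.ndarray:
    """Normal from screen-space position differences, like
    cross(ddy(vPosition), ddx(vPosition)) in the shader."""
    n = np.cross(p_dy - p, p_dx - p)
    return n / np.linalg.norm(n)

# Three reconstructed points on a plane facing the eye should give a
# normal along the z axis.
p    = reconstruct_position(np.zeros(3), np.array([0.0, 0.0, -1.0]), 5.0)
p_dx = p + np.array([0.1, 0.0, 0.0])
p_dy = p + np.array([0.0, 0.1, 0.0])
print(normal_from_neighbors(p, p_dx, p_dy))
```

Because the normal comes from differences of reconstructed positions, any quantization step in the stored depth is amplified into a visible facet in the normal, which is why banding in the depth shows up so strongly here.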
The depth read back from the texture, and hence the recovered normals and positions, show banding/artifacts. See the attached images (I am trying to render a swimming pool, and also to recover the pool-bottom position for caustics). I am unsure why this banding occurs.
I have tried/read several things. For example:

- It makes no difference whether I use a 16-bit or a 32-bit RGBA format for the RTT.
- Changing the filtering mode does not change anything.

As a side note, I was playing around in Vision Engine's vForge editor, and when I debugged the normals/depth there by outputting the same values from its shaders as I do from mine, I got similar artifacts and banding. I would assume that VE's math is correct, since their deferred renderer is 'battle tested'.