Position and Normals from depth revisited (artifacts/banding)


I am trying to recover position and normals from depth stored in a render-to-texture target. I store the depth like so:

VS:


outObjPosViewSpace = mul(worldView, inPos);

PS:


float depth = length(inObjPosViewSpace); // stored in component w of rgba texture
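
For reference, here is roughly the whole depth-write pass as a self-contained sketch. Only the two lines above are from my actual shaders; the semantics, the clip-space output and the worldViewProj matrix are assumptions:

float4x4 worldView;      // world * view (from the snippet above)
float4x4 worldViewProj;  // assumed: needed for the clip-space position

void DepthVS(float4 inPos : POSITION,
             out float4 outPos : POSITION,
             out float3 outObjPosViewSpace : TEXCOORD0)
{
    outObjPosViewSpace = mul(worldView, inPos).xyz; // view-space position
    outPos = mul(worldViewProj, inPos);
}

float4 DepthPS(float3 inObjPosViewSpace : TEXCOORD0) : COLOR0
{
    // Distance from the eye to the fragment (radial, not the planar view-space z).
    float depth = length(inObjPosViewSpace);
    return float4(0, 0, 0, depth); // stored in the w component of the RGBA target
}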

I am trying to recover the normals/depth like so:


float fDepth = tex2D(rtt, screenCoords).w;          // stored view-space distance
float3 vPosition = eyePosW.xyz + viewRay * fDepth;  // reconstructed position
float3 vNormal = cross(normalize(ddy(vPosition.xyz)), normalize(ddx(vPosition.xyz)));
vNormal = normalize(vNormal);
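
In case it helps, here is roughly how the full-screen reconstruction pass is set up, as a self-contained sketch. Only the four lines above are verbatim; the per-vertex frustumCornerWS input, the UV math, and the output format are assumptions. Since the stored depth is a radial distance, viewRay has to be the eye-to-pixel direction normalized per pixel after interpolation, not a ray scaled to the far plane:

float4 eyePosW;          // camera position in world space
sampler2D rtt;           // the depth render target from above

void ReconstructVS(float4 inPos : POSITION,            // full-screen quad, already in clip space
                   float3 frustumCornerWS : TEXCOORD0, // assumed: world-space far-plane corner
                   out float4 outPos : POSITION,
                   out float3 outViewRay : TEXCOORD0,
                   out float2 outUV : TEXCOORD1)
{
    outPos = inPos;
    outUV = inPos.xy * float2(0.5, -0.5) + 0.5;        // clip space -> texture space (half-pixel offset omitted)
    outViewRay = frustumCornerWS - eyePosW.xyz;        // un-normalized; normalized per pixel below
}

float4 ReconstructPS(float3 viewRayIn : TEXCOORD0,
                     float2 screenCoords : TEXCOORD1) : COLOR0
{
    float fDepth = tex2D(rtt, screenCoords).w;
    float3 viewRay = normalize(viewRayIn);             // radial distance needs a unit direction
    float3 vPosition = eyePosW.xyz + viewRay * fDepth;
    float3 vNormal = normalize(cross(ddy(vPosition), ddx(vPosition)));
    return float4(vNormal * 0.5 + 0.5, 1.0);           // visualize the normal
}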

The depth read back from the texture, and hence the recovered positions/normals, show banding/artifacts. See the images below (I am rendering a swimming pool and trying to recover the pool-bottom position for caustics). I am unsure why this banding occurs.

[Screenshots of the banding: 2013_05_07_161315.png, 2013_05_07_161328.png, 2013_05_07_161658.png, 2013_05_07_161727.png]

I have tried/read several things. For example:

http://www.gamedev.net/topic/480031-world-space---view-space-transformation/

http://mynameismjp.wordpress.com/2010/03/

- It does not matter whether I use 16-bit or 32-bit RGBA for the RTT.

- The filtering mode makes no difference.

As a side note, I was playing around in Vision Engine's vForge editor, and when I debugged the normals/depth there by outputting the same values from the shader as in my own code, I got similar artifacts and banding. I would assume Vision Engine's math is correct, since their deferred renderer is 'battle tested'.


I would just render the world-space positions and normals directly using multiple render targets, since you might want normal mapping anyway. If you need the depth buffer for a simple post effect, you can compute the linear depth as long as you have the camera projection matrix (see the sketch below). http://www.gamedev.net/topic/604984-linear-depth-buffer-for-rendering-shadows/
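
Roughly something like this (a sketch assuming a standard D3D-style projection; nearZ/farZ are placeholders for your camera's clip planes):

// Sketch: recover linear view-space z from a hardware depth-buffer value,
// assuming a standard D3D-style projection. nearZ/farZ are placeholders.
float LinearizeDepth(float hwDepth, float nearZ, float farZ)
{
    // hwDepth = farZ/(farZ - nearZ) - nearZ*farZ / ((farZ - nearZ) * z)
    // Solving for z:
    return nearZ * farZ / (farZ - hwDepth * (farZ - nearZ));
}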

Thanks for your reply. However, my question is more about why this happens in the first place.

How do you create your depth texture and render to it?

I use Ogre. It's a render-to-texture render target with RGBA components, each component a 32-bit float, so a total of 4 x 32 = 128 bits per texel.

