reconstruct depth from z/w?


Hello, I am fooling around with WebGL and I would like to add fog as a post-process effect.

So I'm trying to get the original depth value to apply a linear fog.

I've already read through a lot of forum posts but I can't get it to work.

In my depth pass I output the depth as:


float depth = gl_FragCoord.z / gl_FragCoord.w;   // linear eye-space depth for a perspective projection
gl_FragColor = vec4( vec3(depth), 1.0 );

And in the post-processing pass I try to reconstruct it using this method: http://www.geeks3d.com/20091216/geexlab-how-to-visualize-the-depth-buffer-in-glsl/.

(my near plane is 0.1 and the far plane of the camera is 20000.0)


float my_z = (-0.1 * 20000.0) / (depth1 - 20000.0);   // depth1 = value sampled from the depth texture

Using my_z as the depth, my output isn't the same as when I just visualize the depth from my depth pass with:


float depth = gl_FragCoord.z / gl_FragCoord.w;
float color = 1.0 - smoothstep( 1.0, 200.0, depth );
gl_FragColor = vec4( vec3(color), 1.0 );

So I expected to get the same result when using the reconstructed Z in my post-processing pass:


float my_z = (-0.1 * 20000.0) / (texture2D( tDepth, texCoord ).x - 20000.0);
float color = 1.0 - smoothstep( 1.0, 200.0, my_z );
gl_FragColor = vec4( vec3(color), 1.0 );

Outputting my_z gives me this result: http://i.imgur.com/8C3reNd.png

So what am I doing wrong here?


Of course, just after posting this I found another error; now I just output z/w and use it in the other shader.
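
For completeness, here is roughly what the two passes look like now. This is a sketch rather than my exact code: it assumes the depth target is a floating-point texture (since z/w can exceed 1.0), and tScene and fogColor are hypothetical names for the scene color input and the fog color.


// --- depth pass fragment shader ---
void main() {
    float depth = gl_FragCoord.z / gl_FragCoord.w;   // linear eye-space depth
    gl_FragColor = vec4( vec3(depth), 1.0 );
}

// --- fog post-process fragment shader ---
uniform sampler2D tDepth;   // the texture written by the pass above
uniform sampler2D tScene;   // hypothetical: the rendered scene color
varying vec2 texCoord;

void main() {
    float depth = texture2D( tDepth, texCoord ).x;
    float fog = smoothstep( 1.0, 200.0, depth );     // fog ramps in from 1 to 200 units
    vec3 fogColor = vec3( 0.7 );                     // hypothetical fog color
    vec3 scene = texture2D( tScene, texCoord ).rgb;
    gl_FragColor = vec4( mix( scene, fogColor, fog ), 1.0 );
}

(smoothstep gives a smooth falloff rather than strictly linear fog; for truly linear fog, replace it with clamp( (depth - 1.0) / (200.0 - 1.0), 0.0, 1.0 ).)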

 

...in the postprocessing pass I try to reconstruct it using this method http://www.geeks3d.com/20091216/geexlab-how-to-visualize-the-depth-buffer-in-glsl/.

Re the geeks3d page: yeah, I tried that years ago and it doesn't work.

Try this (for an arbitrary perspective frustum, with glDepthRange of 0..1):


// near/far, left/right/top/bottom (the frustum planes), and widthInv/heightInv
// (the reciprocal viewport size) are assumed to be uniforms
vec3 PositionFromDepth_DarkHelmet(in float depth)
{
  vec2 ndc;             // Reconstructed NDC-space position
  vec3 eye;             // Reconstructed EYE-space position
 
  eye.z = near * far / ((depth * (far - near)) - far);
 
  ndc.x = ((gl_FragCoord.x * widthInv) - 0.5) * 2.0;
  ndc.y = ((gl_FragCoord.y * heightInv) - 0.5) * 2.0;
 
  eye.x = ( (-ndc.x * eye.z) * (right-left)/(2*near)
            - eye.z * (right+left)/(2*near) );
  eye.y = ( (-ndc.y * eye.z) * (top-bottom)/(2*near)
            - eye.z * (top+bottom)/(2*near) );
 
  return eye;
}
Note: "depth" is your 0..1 window-space depth. Of course, if you assume a "symmetric" perspective frustum (but not necessarily one that is 90 deg FOV), the eye.x/.y lines simplify down to:


  eye.x = (-ndc.x * eye.z) * right/near;
  eye.y = (-ndc.y * eye.z) * top/near;
Now of course, for mere depth buffer visualization, all you really want from this is "eye.z", which is the linear depth value. So nuke the rest. Just map this eye.z value from -n..-f to 0..1, use that as your fragment intensity, and you're done:


  intensity = (-eye.z - near) / (far-near);
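
Putting the two pieces together, a minimal visualization fragment shader might look like this (a sketch: tDepth and texCoord reuse the names from the first post, near/far are assumed to be uniforms, and the depth texture must store gl_FragCoord.z, not z/w):


uniform sampler2D tDepth;   // stores gl_FragCoord.z (0..1 window-space depth)
uniform float near;         // e.g. 0.1
uniform float far;          // e.g. 20000.0
varying vec2 texCoord;

void main() {
    float depth = texture2D( tDepth, texCoord ).x;
    float eyeZ = near * far / ((depth * (far - near)) - far);   // negative eye-space z
    float intensity = (-eyeZ - near) / (far - near);            // -near..-far mapped to 0..1
    gl_FragColor = vec4( vec3(intensity), 1.0 );
}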

vec3 PositionFromDepth_DarkHelmet(in float depth)
{
  vec2 ndc;             // Reconstructed NDC-space position
  vec3 eye;             // Reconstructed EYE-space position

  eye.z = near * far / ((depth * (far - near)) - far);

  ndc.x = ((gl_FragCoord.x * widthInv) - 0.5) * 2.0;
  ndc.y = ((gl_FragCoord.y * heightInv) - 0.5) * 2.0;

  eye.x = ( (-ndc.x * eye.z) * (right-left)/(2*near)
            - eye.z * (right+left)/(2*near) );
  eye.y = ( (-ndc.y * eye.z) * (top-bottom)/(2*near)
            - eye.z * (top+bottom)/(2*near) );

  return eye;
}

Fixed.


L. Spiro

I restore Nintendo 64 video-game OSTs into HD! https://www.youtube.com/channel/UCCtX_wedtZ5BoyQBXEhnVZw/playlists?view=1&sort=lad&flow=grid

F_depth should be output simply as gl_FragCoord.z.
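
That is, the depth pass fragment shader reduces to something like this (a sketch):


gl_FragColor = vec4( vec3( gl_FragCoord.z ), 1.0 );   // raw 0..1 window-space depth, no divide by w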

Then reconstruction:

vert:


varying mat4 inv;
// inside main(): inverse() needs GLSL 1.40+; under WebGL 1, compute the
// inverse projection on the CPU and pass it in as a uniform instead
inv = inverse(gl_ProjectionMatrix);

frag:


vec4 worldPosition;   // note: inv is the inverse *projection*, so this is eye-space, not world-space
worldPosition.x = f_TexCoord.x * 2.0 - 1.0;
worldPosition.y = f_TexCoord.y * 2.0 - 1.0;
worldPosition.z = F_depth * 2.0 - 1.0;   // window-space 0..1 back to NDC -1..1
worldPosition.w = 1.0;                   // (no "f" suffix in GLSL)
worldPosition = inv * worldPosition;
worldPosition /= worldPosition.w;
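
From there the fog distance falls out directly; a hedged sketch, reusing the fog range from the first post (sceneColor and fogColor are hypothetical inputs):


float dist = length( worldPosition.xyz );   // radial distance; use -worldPosition.z for planar fog
float fog = smoothstep( 1.0, 200.0, dist );
gl_FragColor = vec4( mix( sceneColor, fogColor, fog ), 1.0 );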

