PascalGrosset

Fragment Depth in GLSL

Hello

I'm looking for a way to get the correct depth value in GLSL. I am rendering a cube and I am stepping inside that cube (it's a volume rendering application).

In the picture below
[sharedmedia=core:attachments:7420]

The blue dots should have a deeper z value than the red one. However, when I display the value of z as a color, I get the same color at the blue dots (on the outside) as at the red dot, which is in front (meaning I am getting the same depth). I am trying gl_FragCoord.z (and have tried several other values too). Does anyone have a suggestion for what I could use?

Thanks

Fragment depth is stored nonlinearly between 0 and 1. All the depths close to the camera are going to come out to nearly the same 0-255 value when displayed as a color. You can easily visualize the differences by doing pow(z, 50). If you need the exact z, you're going to want to figure out the inverse of whatever transformation it goes through and use that formula to transform it back.
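
For illustration, a minimal fragment-shader sketch of that visualization trick (not from the original post; gl_FragColor assumes the compatibility profile used elsewhere in this thread):

// Fragment shader: exaggerate the depth differences so they show up as a color.
// gl_FragCoord.z is the nonlinear window-space depth in [0, 1]; most values
// cluster near 1.0, and pow() spreads them back out.
void main()
{
    float z = gl_FragCoord.z;
    gl_FragColor = vec4(vec3(pow(z, 50.0)), 1.0);
}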

Do ModelView * Vertex in your vertex shader and send that to your fragment shader. From the fragment shader, write that value to the output. Also, attach a render buffer with a GL_RGBA32F format (or GL_R32F if your system supports it; I think that requires GL 3.0).
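
A minimal sketch of that idea, using the same compatibility-profile built-ins as the rest of the thread (the varying name is a placeholder, not from the original post):

// Vertex shader
varying vec3 eyePos;    // eye-space position of the vertex

void main()
{
    eyePos      = vec3(gl_ModelViewMatrix * gl_Vertex);
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}

// Fragment shader
varying vec3 eyePos;

void main()
{
    // Eye space looks down -z, so -eyePos.z is the distance in front of the camera.
    // Writing it to a GL_R32F / GL_RGBA32F attachment keeps full float precision
    // instead of quantizing to 8 bits.
    gl_FragColor = vec4(-eyePos.z);
}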

As zacaj said, the depth values are stored in a nonlinear fashion.

This article explains how to get them to be linear.
[url="http://olivers.posterous.com/linear-depth-in-glsl-for-real"]http://olivers.posterous.com/linear-depth-in-glsl-for-real[/url]

Thanks for the replies!

I am currently working in eye space. In this space, the camera is at (0,0,0) and we are looking down the negative z axis.

The only thing that seems to give me something that looks correct is:

// in the vertex shader
eyeVec = vec3(gl_ModelViewMatrix * gl_Vertex);


// in the fragment shader
depth = -1.0/((eyeVec.z-gl_DepthRange.near)/(gl_DepthRange.far-gl_DepthRange.near));

or

depth = -1.0/eyeVec.z;

I am not satisfied with the range of z values that I am getting, though it looks reasonably correct:
[sharedmedia=core:attachments:7424]

This is my perspective projection setup:
gluPerspective(45, (width/height), 1, 10.0);
and I do several other translations and rotations too.

My question is: how do I get the values of my near and far planes in the fragment shader while working in eye space? Is gl_DepthRange.near correct?
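
For what it's worth, gl_DepthRange holds the window-space range set by glDepthRange (0 to 1 by default), not the near/far planes given to gluPerspective. A minimal sketch of one alternative, assuming those planes are simply passed in as uniforms (the uniform names are placeholders):

// Application side, same values as gluPerspective(45, width/height, 1, 10.0):
//   glUniform1f(glGetUniformLocation(program, "zNear"), 1.0f);
//   glUniform1f(glGetUniformLocation(program, "zFar"), 10.0f);

// Fragment shader
uniform float zNear;
uniform float zFar;
varying vec3 eyeVec;    // eye-space position from the vertex shader, as above

void main()
{
    // -eyeVec.z is the eye-space distance; map it linearly onto [0, 1].
    float depth = (-eyeVec.z - zNear) / (zFar - zNear);
    gl_FragColor = vec4(vec3(depth), 1.0);
}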

OpenGL version used:
OpenGL Version: 4.2.0 NVIDIA 290.10
OpenGL Shading Language Version: 4.20 NVIDIA via Cg compiler

I'm not sure what -1.0 / anything is supposed to give you.

I would list exactly what effect you are trying to achieve and why you need the depth. Using gl_FragCoord.z, your box should be almost completely white. I would move the box really close to the eye of the camera so that you can see the difference in colors, because you will only see color differences within roughly the first 10% of your depth range. With near at 1 and far at 10, you will probably only see a difference in colors from about 1 to 1.5 units away from the camera.
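
To put numbers on that, a small sketch of the window-space depth this projection produces (n = 1, f = 10, default glDepthRange of [0, 1]; d is the eye-space distance):

// depth(d) = f * (d - n) / (d * (f - n))
float windowDepth(float d)
{
    float n = 1.0;
    float f = 10.0;
    return f * (d - n) / (d * (f - n));
}

// windowDepth(1.0)  = 0.0
// windowDepth(1.5)  ~ 0.37
// windowDepth(1.8)  ~ 0.49   (about half the depth range used within 0.8 units of the near plane)
// windowDepth(5.0)  ~ 0.89
// windowDepth(10.0) = 1.0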

