Fragment Depth in GLSL



#1 zetwalnwar   Members   -  Reputation: 100

Posted 22 February 2012 - 07:48 PM

Hello

I'm looking for a way to get the correct depth value in GLSL. I am rendering a cube and stepping through the inside of that cube (it's a volume-rendering application).

In the picture below
[attached image: drawing.png]

The blue dots should have a deeper z value than the red one. However, when I display the value of z as a color, I get the same color at the blue dots (on the outside) as at the red dot in front, meaning I am getting the same depth. I have tried gl_FragCoord.z (and several other approaches too). Does anyone have a suggestion for what I could use?

Thanks

#2 zacaj   Members   -  Reputation: 643

Posted 22 February 2012 - 08:27 PM

Fragment depth is stored nonlinearly between 0 and 1. All the depths close to you are going to come out to the same 0-255 value when you display them. You can easily visualize it by doing pow(z, 50). If you need the exact z, you're going to want to figure out the inverse of whatever transformation it goes through and use that formula to transform it back.
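
A minimal fragment-shader sketch of that visualization (the exponent 50 is just an arbitrary choice to spread out the values near 1):

// fragment shader sketch: stretch the nonlinear depth so nearby differences become visible
void main()
{
    float z = gl_FragCoord.z;                      // window-space depth in [0,1], mostly close to 1
    gl_FragColor = vec4(vec3(pow(z, 50.0)), 1.0);  // exaggerate differences for display
}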

#3 V-man   Members   -  Reputation: 805

Posted 23 February 2012 - 07:25 AM

Do Modelview * Vertex in your vertex shader and send that to your fragment shader. In the fragment shader, write that value to the output. Also, attach a render buffer with a GL_RGBA32F format (or GL_R32F if your system supports it; I think that needs GL 3.0).
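
A rough sketch of that approach in the same old-style GLSL the thread is already using (names and layout here are illustrative, not V-man's exact code):

// vertex shader
varying vec3 eyePos;
void main()
{
    eyePos = vec3(gl_ModelViewMatrix * gl_Vertex);            // eye-space position
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}

// fragment shader
varying vec3 eyePos;
void main()
{
    // -eyePos.z is the distance in front of the camera; writing it to a float
    // attachment (GL_R32F / GL_RGBA32F) keeps values outside [0,1] from being lost
    gl_FragColor = vec4(-eyePos.z, 0.0, 0.0, 1.0);
}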
Sig: http://glhlib.sourceforge.net
an open source GLU replacement library. Much more modern than GLU.
float matrix[16], inverse_matrix[16];
glhLoadIdentityf2(matrix);
glhTranslatef2(matrix, 0.0, 0.0, 5.0);
glhRotateAboutXf2(matrix, angleInRadians);
glhScalef2(matrix, 1.0, 1.0, -1.0);
glhQuickInvertMatrixf2(matrix, inverse_matrix);
glUniformMatrix4fv(uniformLocation1, 1, GL_FALSE, matrix);
glUniformMatrix4fv(uniformLocation2, 1, GL_FALSE, inverse_matrix);

#4 Sparty   Members   -  Reputation: 100

Posted 23 February 2012 - 07:41 AM

As zacaj said, the depth values are stored in a nonlinear fashion.

This article explains how to get them to be linear.
http://olivers.posterous.com/linear-depth-in-glsl-for-real
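
For reference, the usual inversion for a standard perspective projection looks roughly like this (a sketch only; uNear and uFar are assumed uniforms holding the same planes used to build the projection matrix):

// fragment shader sketch: recover linear eye-space distance from the stored depth
uniform float uNear;
uniform float uFar;
float linearDepth(float fragZ)
{
    float ndcZ = fragZ * 2.0 - 1.0;   // window [0,1] -> NDC [-1,1]
    return (2.0 * uNear * uFar) / (uFar + uNear - ndcZ * (uFar - uNear));
}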

#5 zetwalnwar   Members   -  Reputation: 100

Posted 23 February 2012 - 05:10 PM

Thanks for the replies!

I am currently working in eye space. In this space, the camera is at (0,0,0) and we are looking down the negative z axis.

The only thing that seems to be giving me something which looks correct is:

// in the vertex shader
eyeVec = vec3(gl_ModelViewMatrix * gl_Vertex);


// in the fragment shader
depth = -1.0/((eyeVec.z-gl_DepthRange.near)/(gl_DepthRange.far-gl_DepthRange.near));

or

depth = -1.0/eyeVec.z;

I am not satisfied with the range of z values that I am getting, though it looks reasonably correct (attached image: cube.png).

This is my perspective projection setup:
gluPerspective(45, (width/height), 1, 10.0);
and I do several other translations and rotations too.

My question is: how do I get the values of my near plane and far plane in the fragment shader while working in eye space? Is gl_DepthRange.near correct?
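
For what it's worth, gl_DepthRange holds the values set by glDepthRange (0 and 1 by default), not the gluPerspective planes, so one option is to pass the planes in yourself as uniforms. A sketch, with made-up uniform names that would be uploaded via glUniform1f:

// fragment shader sketch: linear 0..1 depth between the projection planes
uniform float nearPlane;   // e.g. 1.0, the same value passed to gluPerspective
uniform float farPlane;    // e.g. 10.0
varying vec3 eyeVec;       // eye-space position from the vertex shader above

void main()
{
    float depth = (-eyeVec.z - nearPlane) / (farPlane - nearPlane);
    gl_FragColor = vec4(vec3(depth), 1.0);
}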

OpenGL version used:
OpenGL Version: 4.2.0 NVIDIA 290.10
OpenGL Shading Language Version: 4.20 NVIDIA via Cg compiler


#6 dpadam450   Members   -  Reputation: 886

Posted 23 February 2012 - 05:37 PM

Not sure what -1.0 / anything is supposed to be.

I would list what exact effect you are trying to achieve and why you need the depth. Using gl_FragCoord.z, your box should be almost completely white. I would move the box really close to the eye of the camera so that you can see the difference in colors, because you will only see color differences within roughly the first 10% of your depth range. So with your 1 to 10 setup, you will probably only see a difference in colors from about 1 to 1.5 units away from the camera.
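
As a rough sanity check of that compression: plugging the near = 1, far = 10 planes from this thread into the standard depth mapping shows that about half of the [0,1] depth range is consumed within the first unit or so of the frustum (a GLSL-style sketch; the commented numbers are just the evaluated results):

// sketch: window-space depth for a point d units in front of the camera
float windowDepth(float d, float n, float f)
{
    float ndcZ = (f + n) / (f - n) - (2.0 * f * n) / ((f - n) * d);   // depth after perspective divide
    return ndcZ * 0.5 + 0.5;                                          // NDC [-1,1] -> window [0,1]
}
// windowDepth( 1.0, 1.0, 10.0)  = 0.00
// windowDepth( 2.0, 1.0, 10.0) ~= 0.56
// windowDepth( 5.0, 1.0, 10.0) ~= 0.89
// windowDepth(10.0, 1.0, 10.0)  = 1.00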



