Why doesn't gl_ClipDistance[] work?

Started by
7 comments, last by yan_qingsong 9 years, 8 months ago
I just can't understand why gl_ClipDistance doesn't work. The result is the same as if I hadn't set gl_ClipDistance at all.

I call glEnable(GL_CLIP_DISTANCE0 + 2) in the application.

The vertex shader looks like this:

#version 410

uniform mat4 mvpMatrix;
// ...some other uniforms here
in vec2 vVertex;
uniform vec4 clip_plane = vec4(1.0, 1.0, 0.0, 0.85);
uniform vec4 clip_sphere = vec4(0.0, 0.0, 0.0, 10.0);

void main(void)
{
    // ...some other code, including vec2 worldPos
    vec4 worldPosition = vec4(worldPos.x, height, worldPos.y, 1.0);
    gl_ClipDistance[0] = dot(worldPosition, clip_plane);
    gl_ClipDistance[1] = length(worldPosition.xyz / worldPosition.w - clip_sphere.xyz) - clip_sphere.w;
    gl_Position = mvpMatrix * position;
}

Clipping works perfectly, but you have to enable it. ;)

By setting glEnable(GL_CLIP_DISTANCE0+2), you have enabled gl_ClipDistance[2], not 0 and 1.

Also, I'm not sure whether your math is correct or not. It depends on your algorithm.

Just so you know: if gl_ClipDistance[x] >= 0.0, the vertex is kept; if it is < 0.0, it is clipped.

By setting glEnable(GL_CLIP_DISTANCE0+2), you have enabled gl_ClipDistance[2], not 0 and 1.
This.

Thank you! I tried "glEnable(GL_CLIP_DISTANCE0); glEnable(GL_CLIP_DISTANCE1);", but it still didn't work. I guess something else is wrong.

Then your code is wrong, as I presumed.

There is no need for multiple clip-distances. Set just gl_ClipDistance[0].

And, in order to prove that it works, set gl_ClipDistance[0] = -1.0;

If your model disappears, clipping works. :)
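This sanity check can be reduced to a trivial vertex shader. A hedged sketch (the attribute name vPosition is illustrative, not from the original code):

```glsl
#version 410

uniform mat4 mvpMatrix;
in vec4 vPosition;

void main(void)
{
    // Force every vertex outside the clip volume; if
    // glEnable(GL_CLIP_DISTANCE0) is active in the application,
    // nothing should be drawn at all.
    gl_ClipDistance[0] = -1.0;
    gl_Position = mvpMatrix * vPosition;
}
```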

I set "gl_ClipDistance[0] = -1.0;" as you said, but what confuses me is that the result didn't change; my model didn't disappear.

I hope you have coupled it with glEnable(GL_CLIP_DISTANCE0); in the host application, and that the shader is actually executing.

Maybe you should read a bit more about the topic. There is plenty of material in the OpenGL-related books:

  • OpenGL SuperBible 5th Ed. – pg.528
  • OpenGL SuperBible 6th Ed. – pg.276-281.
  • OpenGL Programming Guide 8th Ed. – pg.238
  • OpenGL Shading Language 3rd Ed. – pg.112, 286

Actually, I had read the related content in OpenGL SuperBible 5th Ed., OpenGL SuperBible 6th Ed., and OpenGL Shading Language 3rd Ed. And of course I set glEnable(GL_CLIP_DISTANCE0) in the application.

The problem is solved. The code is right; the GPU is the key. When I swapped the AMD card for an NVIDIA card, the code worked well.

That, however, is no solution. A third of your customers have AMD cards, so your code must work on them.

The phenomenon "works on NVIDIA, doesn't on AMD" usually comes from the fact that AMD's GLSL implementation follows the specification rather rigidly, whereas NVIDIA passes GLSL through their Cg compiler, which is more relaxed and accepts things that are actually not allowed if you are strict about the spec. That isn't as good as it sounds; in fact it's rather bad, because a compiler should complain when something isn't quite right.

Be absolutely sure that you do not see any warnings in your shader log, and run your shader code through the reference compiler. Only if the reference compiler emits no error or warning (and I'm almost sure it will emit one!) might you consider the problem "solved". Even then, you should file a driver bug report, since if your shader really is correct, it must execute correctly, too.

This topic is closed to new replies.
