glClipPlane not working
Hi all,
I'm using shaders in OpenGL, and I've found that glClipPlane doesn't work in my game. When I remove the shader, it works! So what should I do?
Thanks.
Your shader needs to write either gl_ClipVertex or gl_ClipDistance[].
There's a chance it's a driver bug, too. Mixing shaders and fixed-function state is a can of worms, so I wouldn't consider it a safe path.
Shaders override the fixed-function pipeline, so you have to "emulate" the clip planes in your shaders. Just write the eye-space vertex position into gl_ClipVertex inside your vertex shader and you should be fine.
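A minimal GLSL 1.20 sketch of that idea: the planes themselves are still set on the GL side with glClipPlane and enabled with glEnable(GL_CLIP_PLANE0); the shader only supplies the eye-space position they are tested against.

```glsl
#version 120

// Vertex shader: feed the fixed-function clip planes by writing
// the eye-space vertex position into gl_ClipVertex.
void main()
{
    // glClipPlane stores planes transformed into eye space, so the
    // clip test happens in eye space -- supply the position there.
    gl_ClipVertex = gl_ModelViewMatrix * gl_Vertex;

    gl_Position = ftransform();
}
```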
Also, you can do clip planes entirely in GLSL.
I use Cg for my shaders; is there any info about clip planes in Cg?
By the way, I think clipping happens after the vertex shader: some fixed operations run on the vertex shader's output, the resulting data then drives the pixel shader, some more operations run on the pixel shader's output, and in the end the data is written to the framebuffer.
Yes, it is best that you do the clip plane algorithm yourself, preferably in the fragment shader. Check out the OpenGL spec to see how it is done.
During the GF2 and GF4 period, a guy from NVIDIA said they used a 1D texture to do the clipping, so it consumed one texture unit. I know that doesn't compare with today's shader-based GPUs, but I don't see it as a big problem to do this in a fragment shader.
I believe when I tried this, I wrote to gl_Position and gl_ClipVertex.
Ex:
gl_Position = ftransform();
gl_ClipVertex = gl_ModelViewMatrix * gl_Vertex;
but this caused artifacts on a Radeon 9700 with some drivers. With another driver, it ran extremely slowly, with no warning that it had dropped to software mode.
So I gave up on gl_ClipVertex.
Quote: Original post by V-man
During the GF2 and GF4 period, a guy from NVIDIA said they used a 1D texture to do the clipping, so it consumed one texture unit. I know that doesn't compare with today's shader-based GPUs, but I don't see it as a big problem to do this in a fragment shader.
I believe when I tried this, I wrote to gl_Position and gl_ClipVertex.
Ex:
gl_Position = ftransform();
gl_ClipVertex = gl_ModelViewMatrix * gl_Vertex;
but this caused artifacts on a Radeon 9700 with some drivers. With another driver, it ran extremely slowly, with no warning that it had dropped to software mode.
So I gave up on gl_ClipVertex.
Thanks, but I use Cg, so I used another method: I enable alpha testing, and in the pixel shader I compute dot(ClipPlane, Point) and output it as alpha. It works!
Yeah, I think my method is easy to understand.
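For reference, the same alpha-test trick can be sketched in GLSL (the poster used Cg, but the math is identical). Here clipPlane is a hypothetical eye-space plane uniform and eyePos a varying the vertex shader would fill in; alpha testing must be enabled on the GL side with glEnable(GL_ALPHA_TEST) and glAlphaFunc(GL_GREATER, 0.0).

```glsl
#version 120

// Hypothetical eye-space clip plane (a, b, c, d).
uniform vec4 clipPlane;
// Eye-space position, interpolated from the vertex shader.
varying vec4 eyePos;

void main()
{
    // Signed distance from the fragment to the plane: fragments on the
    // negative side get alpha <= 0 and are rejected by the alpha test.
    float d = dot(clipPlane, eyePos);
    gl_FragColor = vec4(gl_Color.rgb, d);
}
```

Note that this hijacks the alpha channel, so it conflicts with alpha blending; the discard-based variant avoids that.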
That's what I suggested. Do your own clipping in the fragment shader, which translates to killing fragments, or discarding fragments, or whatever you want to call it :)
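A sketch of that discard-based approach in GLSL, under the same assumptions (clipPlane is a hypothetical eye-space plane uniform; eyePos would be written by the vertex shader):

```glsl
#version 120

// Hypothetical eye-space clip plane (a, b, c, d).
uniform vec4 clipPlane;
// Eye-space position, interpolated from the vertex shader.
varying vec4 eyePos;

void main()
{
    // Kill fragments on the negative side of the plane.
    if (dot(clipPlane, eyePos) < 0.0)
        discard;

    gl_FragColor = gl_Color;
}
```

Unlike the alpha-test trick, this leaves the alpha channel free, at the cost of the usual early-z implications of discard.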