
SOLVED (Crappy ATI): varying variables in GLSL


Assembler015    122
EDIT: I posted my solution as post #11.

Hello - I'm pretty new to pixel shaders and have been having some trouble lately. All I'm trying to do is shade a sphere, but I can only seem to get it to 'work' in a hacked-up way. I think I've tracked the problem down to the normals being used. So if I do something like this in the vertex shader:
varying vec3 n;

void main (void)
{
    n = gl_Normal;
    gl_Position = ftransform();
}


and use that variable 'n' in the fragment shader, it appears to be wrong. However, if I use a uniform variable in the fragment shader and set my normal manually on a per-face basis, it looks correct. (So I do something like this in the program:)
while (vertices to render)
{
    glUniform3fARB(my_normal, nx, ny, nz);
    /* glNormal3f(nx, ny, nz); */

    glBegin(GL_TRIANGLES);

    glVertex3f(...); glVertex3f(...); glVertex3f(...);

    glEnd();
}


and then access the my_normal variable in the fragment shader in place of the normal specified via glNormal3f, it shades correctly.

So I'm wondering, how does a varying variable work? I have an idea of how it works, but I'm thinking I must be wrong, because otherwise my shader would work. If I set a varying variable to 0.0 for all vertices in the vertex shader, will it always be 0.0 in the fragment shader? Also, does a varying variable interpolate on a per-triangle basis?

Thanks
--Andrew

[Edited by - Assembler015 on November 13, 2005 12:47:31 PM]

rollo    366
You forgot to transform your normal in the vertex shader. You transform the position, but not the normal:

void main (void)
{
    n = gl_Normal;
    gl_Position = ftransform();
}

should read

void main (void)
{
    n = gl_NormalMatrix * gl_Normal;
    gl_Position = ftransform();
}

ftransform() does the following calculation: gl_ModelViewProjectionMatrix * gl_Vertex. The reason there is a special function for it is to guarantee compatibility with the fixed-function pipeline. It will probably give the same result on your machine.
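In other words, the two lines below should produce the same position; the second form is just guaranteed to agree bit-for-bit with the fixed-function pipeline:

gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;   // written out by hand
gl_Position = ftransform();                               // invariant with fixed function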

EDIT: oh, btw, gl_NormalMatrix is the transpose of the inverse of the upper 3x3 part of gl_ModelViewMatrix (phew!).
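For reference, a complete minimal pair that shades with the transformed normal might look like this (just a sketch; the directional light along the eye-space z axis is my assumption, not your actual light setup):

vertex shader:

varying vec3 n;

void main (void)
{
    n = gl_NormalMatrix * gl_Normal;   // normal into eye space
    gl_Position = ftransform();
}

fragment shader:

varying vec3 n;

void main (void)
{
    vec3 nn = normalize(n);            // renormalize after interpolation
    vec3 l = vec3(0.0, 0.0, 1.0);      // assumed eye-space light direction
    float d = max(dot(nn, l), 0.0);
    gl_FragColor = vec4(d, d, d, 1.0); // simple diffuse shading
}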

Assembler015    122
Thanks for the reply,

I've tried multiplying by the normal matrix as well, with no success. I must be doing something obviously wrong... I just tried this test program, and the results weren't what I expected:


vertex shader:


varying vec3 n;

void main ()
{
    n = vec3(0,0,0);
    gl_Position = ftransform();
}




fragment shader:


varying vec3 n;

void main ()
{
    if (n.x == 0.0 && n.y == 0.0 && n.z == 0.0)
        gl_FragColor = vec4(1,0,0,1);
    else
        gl_FragColor = vec4(0,1,0,1);
}





I expected my model to always be red, but it's not. It's solid green, and periodically it flashes red.

Does that make sense? :-/

--Andrew

zedzeek    528
In the shader, if you use gl_Normal then it reads whatever the current normal is,
i.e. the one last set with glNormal3fv( normal );
it doesn't take a uniform as input like you're doing.

Assembler015    122
Quote:
Original post by zedzeek
In the shader, if you use gl_Normal then it reads whatever the current normal is, i.e. the one last set with glNormal3fv( normal ); it doesn't take a uniform as input like you're doing.


I was only using the uniform as a test. I set a uniform to the normal vector *instead of* calling glNormal3f and using gl_Normal in the shader.

However, it's looking like it's the varying variables that are causing the problem. I thought I understood them, but I guess I don't. For instance, if I do...



varying vec3 n;

void main (void)
{
    n = vec3(0, 0, 0);

    gl_Position = ftransform();
}




and



varying vec3 n;

void main (void)
{
    if (n.x == 0.0 && n.y == 0.0 && n.z == 0.0)
        gl_FragColor = vec4(1,0,0,1);
    else
        gl_FragColor = vec4(0,1,0,1);
}




I see the model color as GREEN. If I change the code to...



varying float nx;
varying float ny;
varying float nz;

void main (void)
{
    nx = 0.0;
    ny = 0.0;
    nz = 0.0;

    gl_Position = ftransform();
}




and



varying float nx;
varying float ny;
varying float nz;

void main (void)
{
    if (nx == 0.0 && ny == 0.0 && nz == 0.0)
        gl_FragColor = vec4(1,0,0,1);
    else
        gl_FragColor = vec4(0,1,0,1);
}



It works as I expect, and the result is a RED model.

Can someone plllleeeeaaassseee explain this to me? :-)

Thanks for all your help so far,
--Andrew

Black Knight    769
I have the same problem. I set the texcoord in the vertex shader like this:

gl_TexCoord[1] = gl_TextureMatrix[1] * gl_Vertex;
ProjTexCoord = gl_TextureMatrix[1] * gl_Vertex;

Then in the fragment shader I use it. This works:

vec4 reflectionValue = vec4(texture2DProj(reflection, gl_TexCoord[1]));

but this does not:

vec4 reflectionValue = vec4(texture2DProj(reflection, ProjTexCoord));

ProjTexCoord is a varying variable:

varying vec4 ProjTexCoord;

I don't know why it doesn't work :(
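For reference, the complete pair looks roughly like this (a sketch from memory; reflection is my sampler2D uniform, and the texture matrix for unit 1 is assumed to be loaded already):

vertex shader:

varying vec4 ProjTexCoord;

void main (void)
{
    gl_TexCoord[1] = gl_TextureMatrix[1] * gl_Vertex;   // this route works
    ProjTexCoord   = gl_TextureMatrix[1] * gl_Vertex;   // same value, but doesn't work
    gl_Position    = ftransform();
}

fragment shader:

varying vec4 ProjTexCoord;
uniform sampler2D reflection;

void main (void)
{
    // texture2DProj divides by the coordinate's w component before the lookup
    gl_FragColor = texture2DProj(reflection, ProjTexCoord);
}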

Did you try
n = vec3(0.0, 0.0, 0.0);
?




Assembler015    122
Quote:
Original post by Black Knight
I have the same problem. I set the texcoord in the vertex shader like this:

gl_TexCoord[1] = gl_TextureMatrix[1] * gl_Vertex;
ProjTexCoord = gl_TextureMatrix[1] * gl_Vertex;

Then in the fragment shader I use it. This works:

vec4 reflectionValue = vec4(texture2DProj(reflection, gl_TexCoord[1]));

but this does not:

vec4 reflectionValue = vec4(texture2DProj(reflection, ProjTexCoord));

ProjTexCoord is a varying variable:

varying vec4 ProjTexCoord;

I don't know why it doesn't work :(

Did you try
n = vec3(0.0, 0.0, 0.0);
?


I have tried setting the normal a dozen different ways, including...


n = vec3(0, 0, 0);

and

n.x = 0;
n.y = 0;
n.z = 0;

and

n[0] = 0;
n[1] = 0;
n[2] = 0;


The only way I can get n.xyz to stay 0.0 is by declaring the components as individual floats.

Also, does anyone know what position gl_FragCoord represents? Does it represent an interpolated point inside the triangle specified by the three vertices, or is it an adjusted coordinate?

--Andrew

rollo    366
Watch out with float equality! Maybe the fragment shader can't represent 0.0 exactly, or the way it's represented in vectors and scalars is different. You really should put in some epsilons to be safe. So try this instead:

varying vec3 n;

void main (void)
{
    if (length(n) < 0.001)
        gl_FragColor = vec4(1,0,0,1);
    else
        gl_FragColor = vec4(0,1,0,1);
}

Assembler015    122
Quote:
Original post by rollo
Watch out with float equality! Maybe the fragment shader can't represent 0.0 exactly, or the way it's represented in vectors and scalars is different. You really should put in some epsilons to be safe. So try this instead:
*** Source Snippet Removed ***


I tried something similar to that. After some playing around, I found that when the variable is declared as:

varying vec3 n;

and set in the vertex shader as:

n.x = 0.0;
n.y = 0.0;
n.z = 0.0;

n.x and n.y are always 0 in the fragment shader. However, n.z is a larger value. I tried ruling out rounding error with:

if(abs(n.z) < 0.0001){}

But the value was significantly larger than that. So why would n.x and n.y stay constant (like I expected) but n.z change?


EDIT: Also, the output is inconsistent. The model renders solid green one second and solid red the next.

rollo    366
OK, I was a bit rushed when I typed my first reply and didn't really answer all your questions, so here goes...

Quote:

So I'm wondering, how does the varying variable work? I have an idea of how it works, but I'm thinking I must be wrong because otherwise my shader would work. If I set a varying variable to 0.0 for the vertices in the vertex shader, will it always be 0.0 in the fragment shader? Also, does the varying variable interpolate on a per triangle basis?

Varying variables are passed from the vertex shader to the fragment shader, and they are interpolated across the triangle. So yes, it should always be 0.0 in the fragment shader.
Remember that you need to renormalize your normal in the fragment shader, since it might not be unit-length after interpolation.
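For example, a quick way to sanity-check an interpolated normal is to visualize it directly (a minimal sketch, assuming 'n' is the varying written by your vertex shader):

varying vec3 n;

void main (void)
{
    // Interpolation can shorten the vector between vertices, so renormalize first.
    vec3 nn = normalize(n);
    gl_FragColor = vec4(nn * 0.5 + 0.5, 1.0);   // map [-1,1] to [0,1] for display
}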

Quote:

Also, does any one know what position gl_FragCoord represents? Does it represent an interpolated point inside the triangle specified by the 3 vertices or is it an adjust coordinate?

About gl_FragCoord. The spec says:
Quote:

The built-in gl_FragCoord holds the window relative coordinates x, y, z, and 1/w for the fragment. The "z" component of gl_FragCoord undergoes an implied conversion to floating point. This conversion must leave the values 0 and 1 invariant. Note that this "z" component already has a polygon offset added in, if enabled. The 1/w value is computed from the Wc coordinate (see Section 2.10), which is the result of the product of the projection matrix and the vertex's eye coordinates.


To figure out your problem, it would help to see a bit more of the code you use to send data to the shaders. Have you tried calling glNormal3f for each vertex? Are you checking for errors and messages from shader compilation, as well as glGetError()?

Assembler015    122
Thanks everyone for your replies...

I solved the problem: it was a driver issue.

I had my brother run the shader on his GeForce FX 5000-something, and it ran exactly like it was supposed to. So this is the 4th time ATI's drivers have screwed me... I should have learned the first time.

Anyway, I was running ATI's latest Catalyst drivers on my ATI Xpress 200M graphics card (in my laptop), and those drivers just don't behave right with GLSL shaders (though they work okay with DX HLSL shaders). I verified this by running additional sample code (some of it released by ATI).

So I grabbed the latest Omega drivers, and they worked out of the box, running my shader flawlessly.

Thanks for everyone's help,
ATI sucks,
-Andrew

_the_phantom_    11250
I should probably point out that Omega's drivers are very much ATI's drivers; they tend to lag behind and are mostly registry tweaks and a few other minor adjustments, but the code is purely ATI's. As such, any fixes could in fact have been down to a driver install problem.

Assembler015    122
Hmm, that is interesting.

When I first encountered the problem I did try a few sets of drivers, but not the Omega drivers. (I tried the ATI drivers that came with my laptop, the Windows defaults, and the latest Catalyst drivers. Between each install I ran ATI's software removal tool.)

I'm content with blaming it on ATI, though. Previous experiences with them led to random freezes of my computer (while I was just sitting at the desktop), random crashes of 3D-intensive games, and random rejection of the video card by my system (the BIOS complaining about strange things and Windows reverting to the default video driver).

I've been an Nvidia fan ever since all those problems - their drivers have always seemed very stable. Unfortunately, I had to choose a laptop with ATI.

-Andrew

rollo    366
Nvidia's GLSL parser was awful until just recently too. It would basically treat shaders as Cg code and not give any warnings about things like:

vec4 v = vec3(1,2,3);

I got bitten by this many times when I was developing code at home on Nvidia and showing it off on my professor's ATI card at university (ATI's parser was/is much stricter). Nvidia seem to have fixed most of this by now, which is nice, and there is always 3Dlabs' test parser to make sure you are actually writing valid GLSL.
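For the record, the strictly valid version of that line needs an explicit vec4 constructor (the fourth component here is arbitrary):

vec4 v = vec4(1.0, 2.0, 3.0, 0.0);   // no implicit vec3 -> vec4 conversion in GLSL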

V-man    813
Quote:
Original post by Assembler015
Thanks for everyone's help,
ATI sucks,
-Andrew


I just tested your code and it works fine on my 9500 with Cat 5.9.

I'm using 5.9 because 5.10 seems to crash when I'm debugging and I abort my program.

I would have to say that ATI is very lazy about correcting bugs in their drivers. Many Catalyst versions pass, even years pass, and certain bugs go unchanged. Their GLSL vertex shader still doesn't support looping, so it's not possible to take full advantage of VS 2.0.

Their drivers are quite good, but I don't understand why they don't improve them. It's like driver development is at a standstill.
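For example, a uniform-bounded loop in the vertex shader is the sort of thing that would fail on those drivers (a hypothetical sketch; lightCount is a made-up uniform):

uniform int lightCount;
varying vec3 color;

void main (void)
{
    vec3 c = vec3(0.0);
    // A loop whose bound is not a compile-time constant is what the driver can't handle.
    for (int i = 0; i < lightCount; ++i)
        c += gl_LightSource[i].diffuse.rgb;
    color = c;
    gl_Position = ftransform();
}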

