# SOLVED (Crappy ATI): varying variables in GLSL


## Recommended Posts

EDIT: I posted my solution as post #11.

Hello - I'm pretty new to pixel shaders and have been having some trouble lately. All I'm trying to do is shade a sphere, but I can only get it to 'work' in a hacked-up way. I think I've tracked the problem down to the normals being used. So if I do something like this in the vertex shader:
varying vec3 n;

void main (void)
{
    n = gl_Normal;
    gl_Position = ftransform();
}


and use that variable 'n' in the fragment shader, it appears to be wrong. However, if I use a uniform variable in the fragment shader and set my normal manually on a per-face basis, it looks correct. (So I do something like this in the program:)
while(vertices to render)
{
    glUniform3fARB(my_normal, nx, ny, nz);
    /* glNormal3f(nx, ny, nz); */

    glBegin(GL_TRIANGLES);

    glVertex3f()...glVertex3f()...glVertex3f()

    glEnd();
}


and then access the my_normal variable in the fragment shader in place of the normal specified via glNormal3f, it shades correctly.

So I'm wondering: how does a varying variable work? I have an idea of how it works, but I must be wrong, because otherwise my shader would work. If I set a varying variable to 0.0 for every vertex in the vertex shader, will it always be 0.0 in the fragment shader? Also, does a varying variable interpolate on a per-triangle basis?

Thanks
--Andrew

[Edited by - Assembler015 on November 13, 2005 12:47:31 PM]

##### Share on other sites
You forgot to transform your normal in the vertex shader. You transform the position, but not the normal:
void main (void)
{
    n = gl_Normal;
    gl_Position = ftransform();
}

should be:

void main (void)
{
    n = gl_NormalMatrix * gl_Normal;
    gl_Position = ftransform();
}

ftransform() does the following calculation: gl_ModelViewProjectionMatrix * gl_Vertex. The reason there is a special function for it is to ensure compatibility with fixed-function hardware. It will probably produce the same result on your machine.
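For reference, that multiply can be sketched outside GLSL. A minimal C example of a 4x4 matrix applied to a column vector (the function name and test values are made up for illustration; drivers may use a higher-precision internal path, which is the reason ftransform() exists):

```c
#include <assert.h>

/* What ftransform() computes: clip position =
 * gl_ModelViewProjectionMatrix * gl_Vertex, i.e. a 4x4 matrix
 * applied to a column vector. */
static void mat4_mul_vec4(const float m[4][4], const float v[4], float out[4])
{
    for (int i = 0; i < 4; i++)
        out[i] = m[i][0] * v[0] + m[i][1] * v[1]
               + m[i][2] * v[2] + m[i][3] * v[3];
}
```

With a pure translation matrix, the last column shifts the position, since gl_Vertex arrives with w = 1.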

EDIT: oh, btw, gl_NormalMatrix is the transpose of the inverse of the upper 3x3 of gl_ModelViewMatrix (phew!).
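A minimal C sketch of that inverse-transpose, purely to illustrate the math (the driver supplies gl_NormalMatrix for you; the helper name here is mine):

```c
#include <assert.h>
#include <math.h>

/* Normal matrix = transpose(inverse(upper 3x3 of modelview)).
 * Since inverse(M) = cofactor(M)^T / det(M), the inverse-transpose
 * is simply cofactor(M) / det(M). */
static void normal_matrix(const float m[3][3], float out[3][3])
{
    float c[3][3];
    c[0][0] = m[1][1] * m[2][2] - m[1][2] * m[2][1];
    c[0][1] = m[1][2] * m[2][0] - m[1][0] * m[2][2];
    c[0][2] = m[1][0] * m[2][1] - m[1][1] * m[2][0];
    c[1][0] = m[0][2] * m[2][1] - m[0][1] * m[2][2];
    c[1][1] = m[0][0] * m[2][2] - m[0][2] * m[2][0];
    c[1][2] = m[0][1] * m[2][0] - m[0][0] * m[2][1];
    c[2][0] = m[0][1] * m[1][2] - m[0][2] * m[1][1];
    c[2][1] = m[0][2] * m[1][0] - m[0][0] * m[1][2];
    c[2][2] = m[0][0] * m[1][1] - m[0][1] * m[1][0];
    float det = m[0][0] * c[0][0] + m[0][1] * c[0][1] + m[0][2] * c[0][2];
    for (int i = 0; i < 3; i++)
        for (int j = 0; j < 3; j++)
            out[i][j] = c[i][j] / det;
}
```

For a pure rotation the result equals the input matrix, so the distinction only shows up under non-uniform scale or shear.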

##### Share on other sites

I've tried multiplying by the normal matrix as well, with no success. I must be doing something obviously wrong... I just tried this test program and the results weren't what I expected:

varying vec3 n;

void main ()
{
    n = vec3(0,0,0);
    gl_Position = ftransform();
}

varying vec3 n;

void main ()
{
    if(n.x == 0.0 && n.y == 0.0 && n.z == 0.0)
        gl_FragColor = vec4(1,0,0,1);
    else
        gl_FragColor = vec4(0,1,0,1);
}

I expected my model to always be red, but it's not. It's solid green, and periodically it flashes red.

Does that make sense? :-/

--Andrew

##### Share on other sites
In the shader, if you use gl_Normal then it reads whatever the current normal is,
i.e. glNormal3fv( normal );
It doesn't take a uniform as input like you're doing.

##### Share on other sites
Quote:
 Original post by zedzeek
 In the shader, if you use gl_Normal then it reads whatever the current normal is, i.e. glNormal3fv( normal ); it doesn't take a uniform as input like you're doing.

I was only using the uniform as a test. I would set a uniform to the normal vector *instead of* calling glNormal3f and using gl_Normal in the shader.

However, it looks like it is the varying variables that are causing the problem. I thought I understood them, but I guess I don't. For instance...

If I do..

varying vec3 n;

void main (void)
{
    n = vec3(0, 0, 0);
    gl_Position = ftransform();
}

and

varying vec3 n;

void main (void)
{
    if(n.x == 0.0 && n.y == 0.0 && n.z == 0.0)
        gl_FragColor = vec4(1,0,0,1);
    else
        gl_FragColor = vec4(0,1,0,1);
}

I see the model color as GREEN. If I change the code to...

varying float nx;
varying float ny;
varying float nz;

void main (void)
{
    nx = 0.0;
    ny = 0.0;
    nz = 0.0;
    gl_Position = ftransform();
}

and

varying float nx;
varying float ny;
varying float nz;

void main (void)
{
    if(nx == 0.0 && ny == 0.0 && nz == 0.0)
        gl_FragColor = vec4(1,0,0,1);
    else
        gl_FragColor = vec4(0,1,0,1);
}

It works as I expect and the result is a RED model.

Can someone plllleeeeaaassseee explain this to me? :-)

Thanks for all your help so far,
--Andrew

##### Share on other sites
I have the same problem.
I set the texcoord in the vertex shader like this:

gl_TexCoord[1] = gl_TextureMatrix[1] * gl_Vertex;
ProjTexCoord = gl_TextureMatrix[1] * gl_Vertex;

Then in the fragment shader I use it. This works:

vec4 reflectionValue = vec4(texture2DProj(reflection, gl_TexCoord[1]));

but this does not:

vec4 reflectionValue = vec4(texture2DProj(reflection, ProjTexCoord));

ProjTexCoord is a varying variable:

varying vec4 ProjTexCoord;

I don't know why it doesn't work :(

Did you try?

n = vec3(0.0, 0.0, 0.0);

##### Share on other sites
Quote:
 Original post by Black Knight
 I have the same problem. I set the texcoord in the vertex shader like gl_TexCoord[1] = gl_TextureMatrix[1] * gl_Vertex; ProjTexCoord = gl_TextureMatrix[1] * gl_Vertex; Then in the fragment shader I use it. This works: vec4 reflectionValue = vec4(texture2DProj(reflection, gl_TexCoord[1])); but this does not: vec4 reflectionValue = vec4(texture2DProj(reflection, ProjTexCoord)); ProjTexCoord is a varying variable: varying vec4 ProjTexCoord; I don't know why it doesn't work :( Did you try? n = vec3(0.0, 0.0, 0.0);

I have tried setting the normal a dozen different ways, including...

n = vec3(0, 0, 0);

and

n.x = 0;
n.y = 0;
n.z = 0;

and

n[0] = 0;
n[1] = 0;
n[2] = 0;

The only way I can get n.xyz = 0.0 is by declaring them as individual floats.

Also, does anyone know what position gl_FragCoord represents? Does it represent an interpolated point inside the triangle specified by the 3 vertices, or is it an adjusted coordinate?

--Andrew

##### Share on other sites
I think vec3(0,0,0) and vec3(0.0,0.0,0.0) are different.
My shaders don't work if I specify 0.0 as 0.

##### Share on other sites
Watch out with float equality! Maybe the fragment shader can't represent 0.0 properly, or the way it's represented in vectors and scalars is different. You really should put in some epsilons to be safe. So try this instead:
void main (void)
{
    if (length(n) < 0.001)
        gl_FragColor = vec4(1,0,0,1);
    else
        gl_FragColor = vec4(0,1,0,1);
}
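The same epsilon idea in plain C, for anyone who wants to experiment outside a shader (the helper name and the 0.001 tolerance are illustrative choices, not anything from the thread):

```c
#include <assert.h>

/* Exact float equality is fragile: interpolation and differing
 * internal precisions can leave tiny nonzero residue.  Compare the
 * squared length against an epsilon instead (no sqrt needed). */
static int is_zero_vec3(float x, float y, float z)
{
    const float eps = 0.001f;               /* tolerance, tune per use */
    return x * x + y * y + z * z < eps * eps;
}
```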

##### Share on other sites
Quote:
 Original post by rollo
 Watch out with float equality! Maybe the fragment shader can't represent 0.0 properly, or the way it's represented in vectors and scalars is different. You really should put in some epsilons to be safe. So try this instead: *** Source Snippet Removed ***

I tried something similar to that. After some playing around I found that when the variable is declared as:

varying vec3 n;

and set in the vertex shader as:

n.x = 0.0;
n.y = 0.0;
n.z = 0.0;

That n.x and n.y are always 0 in the fragment shader. However, n.z is a larger value. I tried to rule out rounding error with:

if(abs(n.z) < 0.0001){}

But the value was significantly larger than that. So why would n.x and n.y stay constant (as I expected) while n.z changes?

EDIT: Also, the output is inconsistent. The model renders solid green one second and solid red the next.

##### Share on other sites
Ok, I was a bit rushed when I typed my first reply and didn't really answer all your questions, so here goes...

Quote:
 So I'm wondering, how does the varying variable work? I have an idea of how it works, but I'm thinking I must be wrong because otherwise my shader would work. If I set a varying variable to 0.0 for the vertices in the vertex shader, will it always be 0.0 in the fragment shader? Also, does the varying variable interpolate on a per triangle basis?

Varying variables are passed from the vertex shader to the fragment shader, and they are interpolated across the triangle. So yes, it should always be 0.0 in the fragment shader.
Remember that you need to renormalize your normal in the fragment shader, since it may not be unit-length after interpolation.

Quote:
 Also, does any one know what position gl_FragCoord represents? Does it represent an interpolated point inside the triangle specified by the 3 vertices or is it an adjust coordinate?

Quote:
 The built-in gl_FragCoord holds the window relative coordinates x, y, z, and 1/w for the fragment. The "z" component of gl_FragCoord undergoes an implied conversion to floating point. This conversion must leave the values 0 and 1 invariant. Note that this "z" component already has a polygon offset added in, if enabled. The 1/w value is computed from the Wc coordinate (see Section 2.10), which is the result of the product of the projection matrix and the vertex's eye coordinates.

To figure out your problem it would help to see a bit more of the code you use to send data to the shader. Have you tried calling glNormal3f for each vertex? Are you checking for errors and messages from shader compilation, as well as glGetError()?

##### Share on other sites

I solved the problem, it was a driver issue.

I had my brother run the shader on his GeForce FX 5000-something and it ran exactly as it was supposed to. So this is the 4th time ATI's drivers have screwed me... I should have learned the first time.

But anyways, I was running ATI's latest catalyst drivers on my ATI Xpress 200M graphics card (on my laptop) and those drivers just didn't behave right with GLSL shaders (but they worked okay with DX HLSL shaders). I verified this by running additional sample code (some even released by ATI).

So I went and grabbed the latest Omega drivers, and they work out of the box, running my shader flawlessly.

Thanks for everyone's help,
ATI sucks,
-Andrew

##### Share on other sites
I should probably point out that the Omega drivers are very much ATI's drivers; they tend to lag behind and are mostly registry tweaks and a few other minor adjustments, but the code is purely ATI's. As such, any fix could in fact have been down to a driver install problem.

##### Share on other sites
Hmm, that is interesting.

When I first encountered the problem I did try a few sets of drivers, but not the Omega drivers. (I tried the ATI drivers that came with my laptop, the Windows default, and the latest Catalyst drivers. Between each install I ran ATI's 'ati software removal' tool.)

I'm content with blaming it on ATI though. Previous experiences with them led to random freezes of my computer (while I was just sitting at the desktop), random crashes of 3d intensive games, and the random rejection of the video card by my system (BIOS complaining about strange things and Windows reverting back to the default video driver).

I've been an Nvidia fan ever since all those problems...their drivers have always seemed very stable. Unfortunately, I had to choose a laptop with ATI.

-Andrew

##### Share on other sites
*shrugs* It's up to you. On the flip side, though, I've been using ATI hardware since the 9700 Pro came out and I've never had any driver problems (nor GLSL problems), so chances are you've just been unlucky.

##### Share on other sites
He wasn't just unlucky. ATI's OpenGL drivers are notoriously buggy. This condition has caused me and many other developers a lot of pain.

##### Share on other sites
Again, in all the years I've been using ATI hardware I've never come across these bugs; at least none spring to mind that haven't been fixed pretty quickly (with the exception of the context-switch memory leak).

##### Share on other sites
Nvidia's GLSL parser was awful until just recently too. It would basically treat shaders as Cg code and not give any warnings about things like:
vec4 v = vec3(1,2,3);

I got bitten by this many times when I was developing code at home on Nvidia and showing it off on my professor's ATI card at university (ATI's parser was/is much stricter). Nvidia seems to have fixed most of this by now, which is nice, and there is always 3DLabs' test parser to make sure you are actually writing valid GLSL.

##### Share on other sites
Quote:
 Original post by Assembler015
 Thanks for everyone's help, ATI sucks, -Andrew

I just tested your code and it works fine on my 9500 with cat 5.9

I'm using 5.9 because 5.10 seems to crash when I'm debugging and I abort my program.

I would have to say that ATI is very lazy about correcting bugs in their drivers. Many Catalyst versions pass, even years pass, and certain bugs go unfixed.
Their GLSL vertex shader still doesn't support looping, so it's not possible to take full advantage of VS 2.0.

Their drivers are quite good but I don't understand why they don't improve them. It's like driver development is at a standstill.