StiNKy

GLSL issue


Recommended Posts

Hi. I've just recently dived head first into Cg and GLSL, and a small issue has popped up for me. I'm putting a unit sphere through the vertex shader, here's the shader:
void main()
{
    gl_Position = ftransform();
    gl_FrontColor = normalize(gl_Vertex);
}
Now the output (tested in RenderMonkey and my own engine) is rather odd. You'd THINK you'd see a sphere with bright colours at the axis extremities, but instead I see a dimly lit sphere. If I remove the "normalize(..)" and just use "gl_Vertex" I get what I'm supposed to see. My question is this: why on earth is normalize() cutting the vertices from the unit sphere so dramatically? It *should* have no effect on the vertices. Thanks in advance.

It's not cutting the vertices at all; it's doing precisely what you are telling it to do, that is, taking the location of the vertex and normalising it so that the length = 1.

This is going to produce the dim colours you see, as all the colour values summed up are going to equal 1.0f (or 255), and if you have alpha blending enabled it's going to dim things even more as it blends into the background (depending on the blending rules).

That's exactly my point. The length already is 1, so normalize() technically shouldn't be doing a thing, but for some reason it is.
I don't quite understand what you mean by "all the colour values summed up are going to equal 1.0f". Do you mean for the fragment? Oh, and no, I don't have alpha blending enabled :P

Radius :P But technically it shouldn't matter, since normalize() would be doing its job...
What's even MORE strange is that if I put in a sphere with radius 50.0, the results are fine!

Yay I've figured it out.

After going through the OpenGL GLSL manual, I noticed this definition:
syntax:
float length (genType x)
description:
Returns the length of vector, i.e., sqrt(x[0]*x[0] + x[1]*x[1] + ...)


Naturally, gl_Vertex is a 4D vector (with w = 1.0), so normalizing it requires the length, which would return:
sqrt(x*x + y*y + z*z + w*w)


How very irritating that is...
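To put concrete numbers on the effect (a sketch with hypothetical values, assuming the usual fixed-function convention that gl_Vertex.w is 1.0):

```glsl
void main()
{
    gl_Position = ftransform();
    // A unit-sphere vertex such as (1.0, 0.0, 0.0) reaches the shader as
    // gl_Vertex = (1.0, 0.0, 0.0, 1.0), so:
    //   length(gl_Vertex)    = sqrt(1.0 + 0.0 + 0.0 + 1.0) = sqrt(2.0), about 1.414
    //   normalize(gl_Vertex) is roughly (0.707, 0.0, 0.0, 0.707)
    // Every colour channel ends up scaled by ~0.707: hence the dim sphere.
    gl_FrontColor = normalize(gl_Vertex);
}
```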

Yeah, I was going to come back and mention that 'W' coord [smile]

Simplest way would just be to convert it to a vec3 and then tack a 1.0 onto the resulting normalised vector to give you your four colour components.
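For reference, a sketch of that fix (untested; assumes gl_Vertex.w is 1.0 and that an alpha of 1.0 is what you want):

```glsl
void main()
{
    gl_Position = ftransform();
    // Normalize only the xyz part so the w component doesn't
    // inflate the length, then tack a 1.0 back on for alpha.
    gl_FrontColor = vec4(normalize(gl_Vertex.xyz), 1.0);
}
```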
