Simple vertex shader

Started by
16 comments, last by amtri 11 years, 8 months ago
Normally important variables get calculated in the vertex program. With your custom vertex program (i.e., one that just sets the vertex position), you don't have that extra information. For example, typically gl_Normal is transformed by gl_NormalMatrix. Normals are important for lighting in the fragment program, but you're not calculating them in the vertex program! This isn't in itself a problem, but it does explain why, for example, the default OpenGL fragment program can't calculate lighting.

Similarly, your vertex program doesn't calculate texture coordinates and color. The reason gl_Color is zero in your fragment program is because it also is a variable calculated in and interpolated from the vertex program--but you're not setting it in the vertex program. If you want "gl_FragColor = gl_Color;" to work, you should, in your vertex program, add "gl_FrontColor = gl_Color;" When you call glColor3f(...) or whatnot in your main program, what this does is to specify the value of gl_Color in the vertex program. The vertex program then must pass it to the fragment program--that's what setting gl_FrontColor does. gl_FrontColor gets interpolated across the polygon and stored for each fragment. Then, in the fragment program, you can read the interpolated value as gl_Color.
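A minimal sketch of that color pass-through, using the legacy GLSL built-ins discussed above (pre-GLSL-1.30 style, where these built-ins exist):

```glsl
// Vertex shader (legacy GLSL): transform the vertex and forward the color.
void main()
{
    gl_Position   = gl_ModelViewProjectionMatrix * gl_Vertex;
    gl_FrontColor = gl_Color;  // value set by glColor*() in the application
}
```

```glsl
// Fragment shader (legacy GLSL): gl_Color here is the interpolated
// front color written by the vertex shader, not the application value.
void main()
{
    gl_FragColor = gl_Color;
}
```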

If you want to make your custom vertex program emulate OpenGL's default vertex program, that's certainly possible (though there aren't any shortcuts--and there are a lot of outputs OpenGL sets). My personal recommendation is to not do that, and instead concentrate on implementing what you care about--typically lighting and texturing. The per-pixel Phong model is better than the per-vertex Gouraud shading OpenGL provides. You probably only care about one texture, maybe two. Just Google "lighting fragment program" or whatever it is you want to implement, and you'll find out how to do what you want to do.

And a Unix user said rm -rf *.* and all was null and void... | There's no place like 127.0.0.1 | The Application "Programmer" has unexpectedly quit. An error of type A.M. has occurred.

Geometrian,

Thanks! That was a true eye opener.

I know I can set the new coordinate location with gl_Position. But what is the output for the normal variable? In other words, what variable will take the value gl_NormalMatrix * gl_Normal?
Typically, you use your own defined variables in your vertex and fragment shaders. Before your main function, you can declare variables, such as "out vec3 normal" in your vertex shader. This needs a matching "in vec3 normal" in your fragment shader. Then you can write to your normal variable in your vertex shader and read the value in your fragment shader. You can put as many as you like and use different types as you choose.
There are also other kinds of variables you can declare when writing glsl programs. Uniform variables are read-only globals for both your vertex and fragment shader, which can be set from your program. These are often used to pass additional constant information, like light positions or texture information.
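To illustrate both points, here is a sketch of a vertex shader with a user-defined output and a couple of uniforms (GLSL 1.40-style; the attribute and uniform names here are illustrative, not required by GLSL):

```glsl
#version 140

in vec3 in_position;      // per-vertex attribute supplied by the application
in vec3 in_normal;

out vec3 normal;          // must match an "in vec3 normal" in the fragment shader

uniform mat4 mvp;         // set from the application, e.g. via glUniformMatrix4fv
uniform mat3 normalMatrix;

void main()
{
    normal      = normalMatrix * in_normal;       // interpolated per fragment
    gl_Position = mvp * vec4(in_position, 1.0);   // gl_Position is still built in
}
```

Note that gl_Position remains a predefined output even in modern GLSL; it is the user-defined variables like "normal" above that replace the old built-ins such as gl_FrontColor.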
I'm somewhat confused... I never really declared gl_Position; that's because this is a reserved name used to pass the vertex location. If I simply declared a variable, say, "pos", how would the program know to use this as the new position?

Likewise, the normal. I was expecting a variable called something like gl_NewNormal - a reserved word that GLSL would know to use as the new normal.

How is it that, just by declaring a variable called "normal", GLSL will know to use it as the new normal? Is "normal" a reserved word?

I feel like I'm missing some important concept here...
Usually you would write both a vertex and a fragment shader. Writing just a vertex shader, while technically allowed, might be such a rarely used corner case that it does not work in a lot of driver implementations. Just something to keep in mind.

There might be a predefined variable for normals, but since all those predefined variables are gone in the GLSL versions I'm interested in, you would have to dig through the appropriate documentation.
Usually you would declare something like "smooth out vec3 position;" in the vertex shader and "smooth in vec3 position;" in the fragment shader. Analogous for different attributes, types and interpolations.

That aside, if you really have to try going the vertex shader-only way, try clearing the back buffer to all red and disabling blending. If the issue is lighting evaluating to 0 or alpha being 0, you should then at least see a black triangle.
BitMaster:

Interesting that you mentioned that writing just a vertex shader is a corner case. I say this because right now in my application I can think of no need for a fragment shader. A typical use for me is to interpolate two frames with identical topologies for animation (tweening??).
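For what it's worth, that tweening use case fits naturally in a vertex shader. A sketch, assuming two position attributes (one per keyframe) and a blend factor driven by the application (all names here are illustrative):

```glsl
#version 140

in vec3 positionFrameA;   // vertex position in the first keyframe
in vec3 positionFrameB;   // the same vertex in the second keyframe

uniform mat4  mvp;
uniform float t;          // blend factor in [0, 1], advanced by the application

void main()
{
    // Linear interpolation between the two keyframes.
    vec3 p = mix(positionFrameA, positionFrameB, t);
    gl_Position = mvp * vec4(p, 1.0);
}
```

Even here, though, you would typically still pair it with a small fragment shader once you want lighting, as discussed below in the thread.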

Most important for me is your comment about "GLSL versions I'm interested in". I know that GLSL is at about version 4.3 now, but on my machine I only have version 1.4. So if I write my code based on the latest version wouldn't I be shutting out a lot of users with older machines? What is a "reasonable" GLSL version to base my code on? Maybe others can comment on this also.

Thanks.
OpenGL 1.4 is from 2002. That is stone age. Sure you want to limit yourself to that?

The minimum to go for is OpenGL 2.1 from 2006, but you miss out on a lot of goodies. Preferable would be OpenGL 3.3 from 2010. OpenGL 4+ is not important for you right now.

Check a FAQ for some basic answers you need: http://www.opengl.org/wiki/Getting_started

And have a look at Learning Modern 3D Graphics Programming for how to do modern 3D graphics programming.
Current project: Ephenation.
Sharing OpenGL experiences: http://ephenationopengl.blogspot.com/
Well, I have everything "almost" working as I want it. The problem I have is that all geometry going through my vertex shader is not being lit - it all looks flat. Here are the important facts:

1) My vertex shader is generating the geometry just as I want it.

2) I do supply normals when I draw the geometry.

3) If I draw the geometry without any shaders, lighting works fine (though of course I then don't get the changes from the vertex shader).

4) Now comes the tricky part: up to now I did NOT have a fragment shader - thinking I wouldn't need one. But maybe I do. Just as there is a gl_Position standard output from the vertex shader, I wish there were a standard normal output. But there isn't one. My hope is that the default fragment shader would just take the normals defined in the application and use them in the default fragment shader - but I guess that's not the case.

So it appears I need a fragment shader after all, just to set gl_FragColor. This will be based on gl_Color - the interpolated pixel color - and the normal, for which I will have to define my own output variable in the vertex shader and use it as input in the fragment shader.

Could somebody confirm with me that this is correct? This means I need to check for all lights defined in the application and do all the dot products in the shader.
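Yes, that is essentially correct. A minimal sketch of the pair, using the legacy built-ins (gl_NormalMatrix, gl_LightSource) and assuming a single positional light in eye space; with more lights you would loop and accumulate:

```glsl
// Vertex shader: pass the eye-space normal and position to the fragment stage.
varying vec3 normal;
varying vec3 eyePos;

void main()
{
    normal        = gl_NormalMatrix * gl_Normal;
    eyePos        = vec3(gl_ModelViewMatrix * gl_Vertex);
    gl_FrontColor = gl_Color;
    gl_Position   = gl_ModelViewProjectionMatrix * gl_Vertex;
}
```

```glsl
// Fragment shader: simple per-pixel diffuse term for light 0.
varying vec3 normal;
varying vec3 eyePos;

void main()
{
    vec3 n = normalize(normal);  // interpolation denormalizes; renormalize here
    vec3 l = normalize(vec3(gl_LightSource[0].position) - eyePos);
    float diffuse = max(dot(n, l), 0.0);
    gl_FragColor  = vec4(diffuse * gl_Color.rgb, gl_Color.a);
}
```

This omits ambient and specular terms for brevity; the same dot-product pattern extends to those and to additional lights.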

This topic is closed to new replies.
