Creating a GLSL Library

Published October 29, 2007 by Don Olmstead, posted by Myopic Rhino

Introduction

There are plenty of examples of GLSL shaders available in print and on the web. In this article, rather than presenting another a la carte example, we will go through the creation of a shader library. Since the two most likely requirements in an OpenGL program are texturing and lighting the scene, those are the two things we'll be doing. Specifically, we'll cover texturing and, as an extension of it, multi-texturing, along with per-pixel lighting for directional, point, and spot lights.

There are a couple of assumptions made in this article. First, it is assumed that you already know how to compile, link, and use a shader in OpenGL. Second, it is assumed that you do not need an explanation of the Phong lighting model. The references for this article cover both topics, so look there if you want a thorough treatment of either.

The first thing we're going to do is create the texturing library. This library is fairly simple, and admittedly overkill, but it provides a basis for moving on to the more complex lighting library.

A Texturing Example

The vertex shader used in this example isn't very exciting, so let's just get it out of the way.


/**
 * \file Texturing.vert
 */
void main()
{
    gl_TexCoord[0] = gl_MultiTexCoord0;
   
    gl_Position = ftransform();
}
All the vertex shader does is copy the multi-texture coordinate over for the fragment shader to use. The attribute gl_MultiTexCoord0 is not available in the fragment shader, so we need to populate gl_TexCoord[0] with it. Then the vertex position is transformed by the current modelview and projection matrices via ftransform().

Now it's time to write our texturing library function.

There are six options for how to texture geometry in OpenGL:

  • GL_REPLACE
  • GL_MODULATE
  • GL_DECAL
  • GL_BLEND
  • GL_ADD
  • GL_COMBINE
Each of these options has a different texturing calculation that can be replicated in GLSL. So let's create a function that does just that.

/**
 * \file Texturing.frag
 */
const int REPLACE  = 0;
const int MODULATE = 1;
const int DECAL    = 2;
const int BLEND    = 3;
const int ADD      = 4;
const int COMBINE  = 5;

void applyTexture2D(in sampler2D texUnit, in int type, in int index, inout vec4 color)
{
    // Read from the texture
    vec4 texture = texture2D(texUnit, gl_TexCoord[index].st);
   
    if (type == REPLACE)
    {
        color = texture;
    }
    else if (type == MODULATE)
    {
        color *= texture;
    }
    else if (type == DECAL)
    {
        vec3 temp = mix(color.rgb, texture.rgb, texture.a);
       
        color = vec4(temp, color.a);
    }
    else if (type == BLEND)
    {
        vec3 temp = mix(color.rgb, gl_TextureEnvColor[index].rgb, texture.rgb);
       
        color = vec4(temp, color.a * texture.a);
    }
    else if (type == ADD)
    {
        color.rgb += texture.rgb;
        color.a   *= texture.a;
       
        color = clamp(color, 0.0, 1.0);
    }
    else
    {
        color = clamp(texture * color, 0.0, 1.0);
    }
}
At the top of the code we create a set of constant integers to denote what type of texturing we want to apply to the geometry. The function itself takes four arguments. The first is a sampler2D, which is the texture unit holding the texture we want to apply. The second is the type, which is one of the constant integers we defined. The third is the index of the texture coordinate set we wish to use. Finally, color is the current color of the fragment, which is modified in place and handed back through the inout parameter. The code itself is taken from the Orange Book and can also be generated using 3Dlabs' ShaderGen tool.

So now we have a function that replicates OpenGL's texturing facility. All that's left is to write a fragment shader that uses it.


/**
 * \file SingleTexture.frag
 */

void applyTexture2D(in sampler2D texUnit, in int type, in int index, inout vec4 color);

uniform sampler2D TexUnit0;
uniform int TexturingType;

void main()
{
    vec4 color = gl_Color;
   
    applyTexture2D(TexUnit0, TexturingType, 0, color);
   
    gl_FragColor = color;
}
The most important thing to remember when creating a shader library is to use forward declarations. The applyTexture2D function is in a different file, so when this file is compiled the compiler needs to know that a function called applyTexture2D exists and what its parameters are. When the shaders are linked together, the linker figures out the rest.
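To make that concrete, here is a rough sketch of the application-side setup: each .frag file is compiled as its own shader object and both are attached to the same program object before linking. The file names match the listings in this article; loadFileIntoString is a hypothetical helper that reads a file into a null-terminated string, and an extension loader such as GLEW is assumed to expose the GL 2.0 entry points. Error checking is omitted for brevity.

#include <GL/glew.h>   /* or any loader that exposes the GL 2.0 entry points */

/* Hypothetical helper: reads the whole file into a null-terminated string. */
const char* loadFileIntoString(const char* path);

static GLuint compileShader(GLenum type, const char* path)
{
    const char* source = loadFileIntoString(path);
    GLuint shader = glCreateShader(type);

    glShaderSource(shader, 1, &source, NULL);
    glCompileShader(shader);

    return shader;
}

GLuint buildSingleTextureProgram(void)
{
    GLuint vert    = compileShader(GL_VERTEX_SHADER,   "Texturing.vert");
    GLuint library = compileShader(GL_FRAGMENT_SHADER, "Texturing.frag");
    GLuint user    = compileShader(GL_FRAGMENT_SHADER, "SingleTexture.frag");
    GLuint program = glCreateProgram();

    glAttachShader(program, vert);
    glAttachShader(program, library);
    glAttachShader(program, user);

    /* The linker resolves the forward-declared applyTexture2D here. */
    glLinkProgram(program);

    return program;
}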

Other than that, the shader is fairly straightforward. There are two uniforms: the sampler2D, which points to the texture unit we wish to use, and the integer TexturingType, which denotes which OpenGL texturing mode to emulate. The main function takes the color of the vertex and passes it, along with TexUnit0, TexturingType, and the index of the texture coordinates, to the library function. The result is used as the fragment color.

It's worth noting that OpenGL does not provide a way for a shader to query the current texture environment mode (only the texture environment color, gl_TextureEnvColor[], is available). So if you request GL_ADD via glTexEnv, there is no way inside the shader to determine that the program asked for it; the application has to pass that information along itself, which is exactly what the TexturingType uniform does.
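A minimal sketch of what that looks like on the C side, assuming the program object from the previous sketch and mirroring the constants defined in Texturing.frag (the enum names here are just illustrative):

/* Mirror of the constants defined at the top of Texturing.frag. */
enum TexturingMode
{
    TEX_REPLACE, TEX_MODULATE, TEX_DECAL, TEX_BLEND, TEX_ADD, TEX_COMBINE
};

void setupSingleTexture(GLuint program, GLuint texture)
{
    glUseProgram(program);

    /* Bind the texture to unit 0 and point the sampler uniform at it;
     * sampler uniforms take the texture unit index, not the texture name. */
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, texture);
    glUniform1i(glGetUniformLocation(program, "TexUnit0"), 0);

    /* Stand-in for glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE),
     * which the shader has no way of seeing on its own. */
    glUniform1i(glGetUniformLocation(program, "TexturingType"), TEX_MODULATE);
}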

A Multi-Texturing Example

Next up is multi-texturing. There is no longer any need for OpenGL extensions to do multi-texturing; all of it can be done within a fragment shader. You can blend together as many textures as your video card supports (call glGetIntegerv with GL_MAX_TEXTURE_UNITS to determine that number), but in our case we'll only be doing two. No changes are required to our texturing library or the vertex shader we were using, and only a few lines need to be added to our fragment shader to get two textures blended together.
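As an aside, the capability query mentioned above looks like the following. GL_MAX_TEXTURE_IMAGE_UNITS is the more relevant limit for fragment shaders, but the idea is the same.

GLint maxFixedFunctionUnits = 0;
GLint maxFragmentSamplers   = 0;

glGetIntegerv(GL_MAX_TEXTURE_UNITS, &maxFixedFunctionUnits);      /* fixed-function texture units */
glGetIntegerv(GL_MAX_TEXTURE_IMAGE_UNITS, &maxFragmentSamplers);  /* samplers usable in a fragment shader */

Now, on to the shader itself.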


/**
 * \file MultiTexture.frag
 */
void applyTexture2D(in sampler2D texUnit, in int type, in int index, inout vec4 color);

uniform sampler2D TexUnit0;
uniform int TexturingType0;
uniform sampler2D TexUnit1;
uniform int TexturingType1;

void main()
{
    vec4 color = gl_Color;
   
    applyTexture2D(TexUnit0, TexturingType0, 0, color);
    applyTexture2D(TexUnit1, TexturingType1, 0, color);
   
    gl_FragColor = color;
}
So all that needed to be added was the forward declaration, another sampler2D, and another integer specifying which texturing mode to use. Note that both applyTexture2D calls use texture coordinate set 0, since our vertex shader only copies gl_MultiTexCoord0 across. Hopefully you can see how creating a library can make your shaders more concise and more powerful. Now we can move on to creating a lighting library for per-pixel lighting.
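On the application side, each sampler uniform needs to be pointed at a texture unit and a texture needs to be bound to that unit. A minimal sketch, reusing the enum from the single-texture sketch above and assuming tex0 and tex1 are existing 2D texture objects:

void setupMultiTexture(GLuint program, GLuint tex0, GLuint tex1)
{
    glUseProgram(program);

    /* Bind one texture to each of the first two texture units. */
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, tex0);

    glActiveTexture(GL_TEXTURE1);
    glBindTexture(GL_TEXTURE_2D, tex1);

    /* Point the sampler uniforms at those units and pick a mode for each. */
    glUniform1i(glGetUniformLocation(program, "TexUnit0"), 0);
    glUniform1i(glGetUniformLocation(program, "TexturingType0"), TEX_REPLACE);

    glUniform1i(glGetUniformLocation(program, "TexUnit1"), 1);
    glUniform1i(glGetUniformLocation(program, "TexturingType1"), TEX_MODULATE);
}

A typical frame would call this once before drawing the geometry that uses the two-texture program.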

A One-Sided Per-Pixel Lighting Example

This example is fairly complicated, so the discussion is broken up into parts. The first part is the vertex shader. Since we're doing all the work in the fragment shader, the vertex shader is fairly simple.


/**
 * \file Lighting.vert
 */
varying vec3 normal;
varying vec3 vertex;

void main()
{
    // Calculate the normal
    normal = normalize(gl_NormalMatrix * gl_Normal);
   
    // Transform the vertex position to eye space
    vertex = vec3(gl_ModelViewMatrix * gl_Vertex);
       
    gl_Position = ftransform();
}
All that's happening in the vertex shader is passing along the normal and the position of the vertex to the fragment shader. These values are needed for the lighting calculations in the fragment shader.

/**
 * \file OneSidedLighting.frag
 */
void calculateLighting(in int numLights, in vec3 N, in vec3 V, in float shininess,
                       inout vec4 ambient, inout vec4 diffuse, inout vec4 specular);

varying vec3 normal;
varying vec3 vertex;

void main()
{
    // Normalize the normal. A varying variable CANNOT
    // be modified by a fragment shader. So a new variable
    // needs to be created.
    vec3 n = normalize(normal);
   
    vec4 ambient  = vec4(0.0);
    vec4 diffuse  = vec4(0.0);
    vec4 specular = vec4(0.0);

    // In this case the built in uniform gl_MaxLights is used
    // to denote the number of lights. A better option may be passing
    // in the number of lights as a uniform or replacing the current
    // value with a smaller value.
    calculateLighting(gl_MaxLights, n, vertex, gl_FrontMaterial.shininess,
                      ambient, diffuse, specular);
   
    vec4 color = gl_FrontLightModelProduct.sceneColor  +
                 (ambient  * gl_FrontMaterial.ambient) +
                 (diffuse  * gl_FrontMaterial.diffuse) +
                 (specular * gl_FrontMaterial.specular);
                
    color = clamp(color, 0.0, 1.0);
   
    gl_FragColor = color;
}
Now, if this were somebody else's shader library that we were integrating into our own shaders, the only thing we'd need to know is the interface of the function. From there we'd assume it does the proper calculations and leave it at that. But since this is a library we're creating, let's go over the functions of the lighting library.

The Phong Lighting Shader

This library is a lot more complicated than our texturing library. There are a couple of auxiliary functions used by this library to simplify the code. The first determines whether a light is enabled.


/**
 * \file PhongLighting.frag
 */
const vec4 AMBIENT_BLACK = vec4(0.0, 0.0, 0.0, 1.0);
const vec4 DEFAULT_BLACK = vec4(0.0, 0.0, 0.0, 0.0);

bool isLightEnabled(in int i)
{
    // A separate variable is used to get
    // rid of a linker error.
    bool enabled = true;
   
    // If all the colors of the Light are set
    // to BLACK then we know we don't need to bother
    // doing a lighting calculation on it.
    if ((gl_LightSource[i].ambient  == AMBIENT_BLACK) &&
        (gl_LightSource[i].diffuse  == DEFAULT_BLACK) &&
        (gl_LightSource[i].specular == DEFAULT_BLACK))
        enabled = false;
       
    return(enabled);
}
A good chunk of the state information that is set by an OpenGL program is available within a shader. Unfortunately, there is no flag for whether or not a light is enabled. We can, however, determine whether a light is enabled by examining its ambient, diffuse, and specular contributions. OpenGL defines the default ambient color of a light to be {0.0, 0.0, 0.0, 1.0} and the default diffuse and specular colors to be {0.0, 0.0, 0.0, 0.0} (GL_LIGHT0 is the one exception; its diffuse and specular default to white), so if a light has those values we can be sure it contributes nothing to the scene. Otherwise, its contribution needs to be calculated. This information is used in the calculateLighting function to cut down the number of lights that need to be processed.
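One practical consequence is that the application can effectively "turn off" a light for the shader by setting its colors back to those defaults, since glEnable/glDisable of GL_LIGHTn itself is invisible to GLSL. A minimal sketch of a hypothetical helper:

/* Make a light invisible to the shader's isLightEnabled() test by
 * resetting all of its colors to the "disabled" defaults. */
void disableLightForShader(GLenum light)
{
    const GLfloat ambient[] = { 0.0f, 0.0f, 0.0f, 1.0f };
    const GLfloat black[]   = { 0.0f, 0.0f, 0.0f, 0.0f };

    glLightfv(light, GL_AMBIENT,  ambient);
    glLightfv(light, GL_DIFFUSE,  black);
    glLightfv(light, GL_SPECULAR, black);
}

With that in mind, here is calculateLighting itself.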

/**
 * \file PhongLighting.frag
 */
// Forward declarations for the per-light functions defined later in this
// file, so calculateLighting can call them.
void directionalLight(in int i, in vec3 N, in float shininess,
                      inout vec4 ambient, inout vec4 diffuse, inout vec4 specular);
void pointLight(in int i, in vec3 N, in vec3 V, in float shininess,
                inout vec4 ambient, inout vec4 diffuse, inout vec4 specular);
void spotLight(in int i, in vec3 N, in vec3 V, in float shininess,
               inout vec4 ambient, inout vec4 diffuse, inout vec4 specular);

void calculateLighting(in int numLights, in vec3 N, in vec3 V, in float shininess,
                       inout vec4 ambient, inout vec4 diffuse, inout vec4 specular)
{
    // Just loop through each light, and if it's enabled add
    // its contribution to the color of the pixel.
    for (int i = 0; i < numLights; i++)
    {
        if (isLightEnabled(i))
        {
            if (gl_LightSource[i].position.w == 0.0)
                directionalLight(i, N, shininess, ambient, diffuse, specular);
            else if (gl_LightSource[i].spotCutoff == 180.0)
                pointLight(i, N, V, shininess, ambient, diffuse, specular);
            else
                spotLight(i, N, V, shininess, ambient, diffuse, specular);
        }
    }
}
OpenGL uses some cues to determine what type of light is being passed to it. Directional lights have a position whose w value is 0.0. A point light has a w value of 1.0 in its position and a value of 180.0 for its spotCutoff. Finally, a spot light has a spotCutoff of something other than 180.0. So if the light is enabled, we use these cues to determine which type of light we're dealing with. Now let's take a look at the actual lighting calculations for each type of light. The Phong lighting equations themselves are not derived here, as they are covered in much greater detail in the references.

/**
 * \file PhongLighting.frag
 */
float calculateAttenuation(in int i, in float dist)
{
    return(1.0 / (gl_LightSource[i].constantAttenuation +
                  gl_LightSource[i].linearAttenuation * dist +
                  gl_LightSource[i].quadraticAttenuation * dist * dist));
}

void directionalLight(in int i, in vec3 N, in float shininess,
                      inout vec4 ambient, inout vec4 diffuse, inout vec4 specular)
{
    vec3 L = normalize(gl_LightSource[i].position.xyz);
   
    float nDotL = dot(N, L);
   
    if (nDotL > 0.0)
    {   
        vec3 H = gl_LightSource[i].halfVector.xyz;
       
        float pf = pow(max(dot(N,H), 0.0), shininess);

        diffuse  += gl_LightSource[i].diffuse  * nDotL;
        specular += gl_LightSource[i].specular * pf;
    }
   
    ambient  += gl_LightSource[i].ambient;
}

void pointLight(in int i, in vec3 N, in vec3 V, in float shininess,
                inout vec4 ambient, inout vec4 diffuse, inout vec4 specular)
{
    vec3 D = gl_LightSource[i].position.xyz - V;
    vec3 L = normalize(D);

    float dist = length(D);
    float attenuation = calculateAttenuation(i, dist);

    float nDotL = dot(N,L);

    if (nDotL > 0.0)
    {   
        vec3 E = normalize(-V);
        vec3 R = reflect(-L, N);
       
        float pf = pow(max(dot(R,E), 0.0), shininess);

        diffuse  += gl_LightSource[i].diffuse  * attenuation * nDotL;
        specular += gl_LightSource[i].specular * attenuation * pf;
    }
   
    ambient  += gl_LightSource[i].ambient * attenuation;
}

void spotLight(in int i, in vec3 N, in vec3 V, in float shininess,
               inout vec4 ambient, inout vec4 diffuse, inout vec4 specular)
{
    vec3 D = gl_LightSource[i].position.xyz - V;
    vec3 L = normalize(D);

    float dist = length(D);
    float attenuation = calculateAttenuation(i, dist);

    float nDotL = dot(N,L);

    if (nDotL > 0.0)
    {   
        float spotEffect = dot(normalize(gl_LightSource[i].spotDirection), -L);
       
        if (spotEffect > gl_LightSource[i].spotCosCutoff)
        {
            attenuation *= pow(spotEffect, gl_LightSource[i].spotExponent);

            vec3 E = normalize(-V);
            vec3 R = reflect(-L, N);
       
            float pf = pow(max(dot(R,E), 0.0), shininess);

            diffuse  += gl_LightSource[i].diffuse  * attenuation * nDotL;
            specular += gl_LightSource[i].specular * attenuation * pf;
        }
    }
   
    ambient  += gl_LightSource[i].ambient * attenuation;
}
The attenuation factor of the light is calculated in a different function since it is used when calculating the contribution of both the Point and Spot lights.
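For completeness, here is roughly how the application would feed each of these code paths from the fixed-function light state. This is only a sketch; the positions, colors, cutoff, exponent, and attenuation coefficients are made-up example values.

void setupExampleLights(void)
{
    const GLfloat white[] = { 1.0f, 1.0f, 1.0f, 1.0f };

    /* Directional light: w == 0.0, so the shader calls directionalLight(). */
    const GLfloat sunDirection[] = { 0.0f, 1.0f, 1.0f, 0.0f };
    glLightfv(GL_LIGHT0, GL_POSITION, sunDirection);
    glLightfv(GL_LIGHT0, GL_DIFFUSE,  white);
    glLightfv(GL_LIGHT0, GL_SPECULAR, white);

    /* Point light: w == 1.0 and the default spot cutoff of 180.0. */
    const GLfloat lampPosition[] = { 2.0f, 4.0f, 0.0f, 1.0f };
    glLightfv(GL_LIGHT1, GL_POSITION, lampPosition);
    glLightfv(GL_LIGHT1, GL_DIFFUSE,  white);
    glLightf (GL_LIGHT1, GL_CONSTANT_ATTENUATION,  1.0f);
    glLightf (GL_LIGHT1, GL_LINEAR_ATTENUATION,    0.05f);
    glLightf (GL_LIGHT1, GL_QUADRATIC_ATTENUATION, 0.01f);

    /* Spot light: w == 1.0 and a cutoff other than 180.0. */
    const GLfloat spotPosition[]  = { 0.0f, 6.0f, 0.0f, 1.0f };
    const GLfloat spotDirection[] = { 0.0f, -1.0f, 0.0f };
    glLightfv(GL_LIGHT2, GL_POSITION,       spotPosition);
    glLightfv(GL_LIGHT2, GL_SPOT_DIRECTION, spotDirection);
    glLightf (GL_LIGHT2, GL_SPOT_CUTOFF,    30.0f);
    glLightf (GL_LIGHT2, GL_SPOT_EXPONENT,  8.0f);
    glLightfv(GL_LIGHT2, GL_DIFFUSE,        white);
}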

Now we have a library that handles per-pixel Phong lighting, so let's take it a step further.

A Two-Sided Per-Pixel Lighting Example

OpenGL can also do two-sided lighting, meaning lighting is calculated for both the front and back faces of a polygon (in the fixed-function pipeline this is enabled with glLightModel and GL_LIGHT_MODEL_TWO_SIDE rather than being on by default). This is really useful for something that is lit from the back; an everyday example is a lamp shade.

The only thing that needs to change to achieve this is our fragment shader's main function.


/**
 * \file TwoSidedLighting.frag
 */
void calculateLighting(in int numLights, in vec3 N, in vec3 V, in float shininess,
                       inout vec4 ambient, inout vec4 diffuse, inout vec4 specular);

varying vec3 normal;
varying vec3 vertex;

void main()
{
    // Normalize the normal. A varying variable CANNOT
    // be modified by a fragment shader. So a new variable
    // needs to be created.
    vec3 n = normalize(normal);
   
    vec4 ambient, diffuse, specular, color;

    // Initialize the contributions.
    ambient  = vec4(0.0);
    diffuse  = vec4(0.0);
    specular = vec4(0.0);
   
    // In this case the built in uniform gl_MaxLights is used
    // to denote the number of lights. A better option may be passing
    // in the number of lights as a uniform or replacing the current
    // value with a smaller value.
    calculateLighting(gl_MaxLights, n, vertex, gl_FrontMaterial.shininess,
                      ambient, diffuse, specular);
   
    color  = gl_FrontLightModelProduct.sceneColor  +
             (ambient  * gl_FrontMaterial.ambient) +
             (diffuse  * gl_FrontMaterial.diffuse) +
             (specular * gl_FrontMaterial.specular);

    // Re-initialize the contributions for the back
    // pass over the lights
    ambient  = vec4(0.0);
    diffuse  = vec4(0.0);
    specular = vec4(0.0);
          
    // Now calculate the back contribution. All that needs to be
    // done is to flip the normal.
    calculateLighting(gl_MaxLights, -n, vertex, gl_BackMaterial.shininess,
                      ambient, diffuse, specular);

    color += gl_BackLightModelProduct.sceneColor  +
             (ambient  * gl_BackMaterial.ambient) +
             (diffuse  * gl_BackMaterial.diffuse) +
             (specular * gl_BackMaterial.specular);

    color = clamp(color, 0.0, 1.0);
   
    gl_FragColor = color;
}
The only things we needed to do were reset the ambient, diffuse, and specular values, flip the normal, and use the back material values. Now we have an even more accurate lighting calculation available in our programs.
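On the application side, the back material values the shader reads through gl_BackMaterial come from glMaterial calls with the GL_BACK face (and GL_LIGHT_MODEL_TWO_SIDE is worth enabling so any fixed-function rendering matches). A minimal sketch with example colors:

void setupTwoSidedMaterial(void)
{
    const GLfloat frontDiffuse[] = { 0.8f, 0.2f, 0.2f, 1.0f };
    const GLfloat backDiffuse[]  = { 0.2f, 0.2f, 0.8f, 1.0f };
    const GLfloat specular[]     = { 1.0f, 1.0f, 1.0f, 1.0f };

    /* Keep the fixed-function path consistent with the shader,
     * which always lights both sides. */
    glLightModeli(GL_LIGHT_MODEL_TWO_SIDE, GL_TRUE);

    glMaterialfv(GL_FRONT, GL_DIFFUSE,   frontDiffuse);
    glMaterialfv(GL_FRONT, GL_SPECULAR,  specular);
    glMaterialf (GL_FRONT, GL_SHININESS, 32.0f);

    glMaterialfv(GL_BACK,  GL_DIFFUSE,   backDiffuse);
    glMaterialfv(GL_BACK,  GL_SPECULAR,  specular);
    glMaterialf (GL_BACK,  GL_SHININESS, 16.0f);
}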

Conclusion

Hopefully this article has given you an idea of how to write shader libraries. With the enclosed libraries you could combine lighting and texturing into a single pass. You could change the lighting model from Phong to something else, such as toon shading. The possibilities with GLSL are enormous.

Just remember, if you come up with something really cool, to share it with the rest of us. Thanks for reading.

References

GLSL Tutorial - Lighthouse3D, August 2007
Lighting with GLSL: Phong Model - oZone3D.net, August 2007
Per Fragment Lighting - ClockworkCoders, August 2007
Rost, Randi J. OpenGL Shading Language. 1st ed. Boston: Addison-Wesley, 2004.

Useful Links

ShaderGen - Fixed Functionality Shader Generation Tool (used to emulate OpenGL texturing)
ShaderDesigner - A GLSL IDE available in the OpenGL SDK (used for development of the enclosed shaders)
