Shader Templates - Simple But Useful

Posted by Tocs, 25 February 2014 · 1,968 views

For the longest time I've struggled with how I wanted to handle materials in my graphics framework. When searching around for existing solutions I found basically two things.

A: Shaders with strict inputs:
A single shader with specific, strictly typed inputs: textures, floats, and so on.

B: Node-based shaders:
Crazy-flexible graphical editors for materials constructed from graphs of various building-block elements.

'A' wasn't flexible enough for me, and 'B' seemed like something way too big and time-consuming to properly create myself.

So I decided on something somewhere in between...

I built a templated shader generator that preprocesses GLSL. It takes in GLSL code with extra markup: instead of specifying inputs as strictly typed GLSL uniform variables, I tell the generator the input names and the desired final data type (float, vec2, vec3, or vec4). Then a set of inputs is given to the generator, and it creates a shader that samples the textures, looks up values, and makes sure there's a variable with the requested name containing the appropriate value.

It's easier to show what I mean... Here's my Blinn-Phong material shader template.

Templates
#version 140
#include "shaders/brdf/blinnphong.hglsl"
#include "shaders/brdf/NormalMapping.hglsl"

<%input vec4 DiffuseColor%>
<%input vec4 SpecularColor%>
<%input float SpecularPower%>
<%input float SpecularIntensity%>
<%input vec3 NormalMap optional%>
<%input vec4 Emissive optional%>

in vec2 TextureCoordinate;
in vec3 GeometryNormal;
in vec3 GeometryTangent;

<%definitions%>

void ShadePrep ()
{
	<%init%>
}

vec4 ConstantShading(vec4 AmbientLight)
{
	vec4 result = AmbientLight * DiffuseColor;
	<%ifis not Emissive: None%>
		result += Emissive;
	<%endif%>
	return result;
}

vec4 Shade (vec3 LightDir, vec3 ViewDir, vec3 LightColor, float Attenuation)
{
	vec3 normal = normalize(GeometryNormal);
	<%ifis NormalMap: Texture %>
		normal = NormalMapping(normal,normalize(GeometryTangent),NormalMap);
	<%endif%>
	return BlinnPhong (LightDir, ViewDir, LightColor, Attenuation, normal, DiffuseColor, SpecularColor, SpecularPower, SpecularIntensity);
}

<%input ... %> : Specifies input names and desired types to the generator. Some inputs can be marked optional; the generator won't raise an error if these aren't supplied.

<%definitions%> : This tells the generator where to define any extra variables it may need. It's technically redundant, since those variables could be placed where the <%input%> tags are, but I wrote it with the <%definitions%> tag and didn't bother to remove it.

<%init%> : This tells the generator where to put the code that's needed to get the proper final value, such as sampling a texture.

<%ifis ...%> : This lets the generator do different things based on the types of input you supply. In the above shader I add an extra line of code to transform the normal if a NormalMap texture is supplied, and I apply an Emissive term if the shader is given one.
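To make the expansion step concrete, here's a minimal sketch of how the <%definitions%> and <%init%> markers could be filled in. This is a hypothetical illustration, not the actual generator: the Input struct, ExpandTemplate function, and the one-sampler-per-input assumption are all simplifications (the real generator also handles non-texture inputs, optionals, and <%ifis%> branches).

```cpp
#include <cassert>
#include <map>
#include <regex>
#include <sstream>
#include <string>

// Hypothetical sketch: each texture-backed input contributes a sampler
// uniform plus a typed variable to <%definitions%>, and a sampling line
// (with its swizzle) to <%init%>.
struct Input { std::string type; std::string swizzle; };

std::string ExpandTemplate(const std::string& source,
                           const std::map<std::string, Input>& inputs) {
    std::ostringstream defs, init;
    int unit = 0;
    for (const auto& [name, in] : inputs) {
        defs << "uniform sampler2D Texture_" << unit << ";\n"
             << in.type << " " << name << ";\n";
        init << "\t" << name << " = texture2D(Texture_" << unit
             << ", TextureCoordinate)." << in.swizzle << ";\n";
        ++unit;
    }
    std::string out = source;
    out = std::regex_replace(out, std::regex("<%definitions%>"), defs.str());
    out = std::regex_replace(out, std::regex("<%init%>"), init.str());
    return out;
}
```

Given the template above, an input like DiffuseColor mapped to a vec4 texture sample would expand into the uniform declaration, the vec4 variable, and the texture2D line in ShadePrep, much like the generated output shown below.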

Input Sets

Sets can be constructed in code, or loaded from file. The file contains text like this:
DiffuseColor tex("sword/diffuse.png")
SpecularPower 6.0
SpecularIntensity tex("sword/specular.png").r
SpecularColor color(255,255,255)
NormalMap tex("sword/normal.png")
Emissive tex("sword/glow.png")
It's pretty easy to tell what this does: the generator takes the input set and generates a shader that can use it. You might notice I specify a swizzle for SpecularIntensity; you can pick different channels out of a texture for a given input. If you specify the same texture twice, the generator is smart enough to sample it only once and swizzle the sample in the shader.
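The sample-each-texture-once rule boils down to assigning texture units by file path rather than by input. A small sketch of that idea (hypothetical names; the real generator does this as part of building the input set):

```cpp
#include <cassert>
#include <map>
#include <string>
#include <vector>

// Hypothetical sketch: two inputs referencing the same image share one
// texture unit; each input then applies its own swizzle to the shared sample.
struct TexInput { std::string name; std::string path; std::string swizzle; };

// Assigns one unit per distinct path, reusing the unit for repeats.
std::map<std::string, int> AssignUnits(const std::vector<TexInput>& inputs) {
    std::map<std::string, int> unitForPath;
    for (const auto& in : inputs) {
        if (!unitForPath.count(in.path)) {
            int next = (int)unitForPath.size();
            unitForPath[in.path] = next;
        }
    }
    return unitForPath;
}
```

So if both SpecularIntensity (.r) and some other input pulled from "sword/specular.png", only one sampler would be declared and sampled; the swizzles diverge afterwards.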

When I plug those inputs in, this is what it generates (I fixed the whitespace up though...)
#version 140
#include "shaders/brdf/blinnphong.hglsl"
#include "shaders/brdf/NormalMapping.hglsl"

uniform sampler2D Texture_0;
uniform vec4 SpecularColor;
uniform float SpecularPower;
uniform sampler2D Texture_4;
uniform sampler2D Texture_2;
uniform sampler2D Texture_1;

in vec2 TextureCoordinate;
in vec3 GeometryNormal;
in vec3 GeometryTangent;

vec4 DiffuseColor;
vec4 Emissive;
vec3 NormalMap;
float SpecularIntensity;


void ShadePrep ()
{
	vec4 Sample0 = texture2D(Texture_0, TextureCoordinate);
	DiffuseColor = Sample0.xyzw;
	vec4 Sample1 = texture2D(Texture_1, TextureCoordinate);
	Emissive = Sample1.xyzw;
	vec4 Sample2 = texture2D(Texture_2, TextureCoordinate);
	NormalMap = Sample2.xyz;
	vec4 Sample4 = texture2D(Texture_4, TextureCoordinate);
	SpecularIntensity = Sample4.x;
}

vec4 ConstantShading(vec4 AmbientLight)
{
	vec4 result = AmbientLight * DiffuseColor;
	result += Emissive;
	return result;
}

vec4 Shade (vec3 LightDir, vec3 ViewDir, vec3 LightColor, float Attenuation)
{
	vec3 normal = normalize(GeometryNormal);
	normal = NormalMapping(normal,normalize(GeometryTangent),NormalMap);
	return BlinnPhong (LightDir, ViewDir, LightColor, Attenuation, normal, DiffuseColor, SpecularColor, SpecularPower, SpecularIntensity);
}

These are just simple inputs, but you can do more interesting things with them as well. For particle systems I can connect the material inputs to values passed in from the particle data, for instance an index into an array of different particle textures. If you wanted an animated texture, you could have an input type represent all the frames and switch between them.

Additionally, the input sets generate a hash code that is (probably) unique to the generated shader, so if you have similar input sets that produce the same generated shader, it's only created once. I was also hoping to cache the compiled shader binaries. However, even in the latest OpenGL, the spec says glShaderBinary must be supported, but an implementation doesn't have to expose any binary format to save them in. So it's pretty useless and disappointing; it is possible to cache the linked programs, though.
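The compile-once behaviour can be sketched as a hash-keyed cache in front of the compile call. This is an illustration, not the actual implementation: std::hash over the generated source stands in for the input-set hash, and the compile callback stands in for the real glCreateShader/glCompileShader path (note that like any hash scheme, collisions are possible, hence "probably" unique).

```cpp
#include <cassert>
#include <functional>
#include <map>
#include <string>

using ShaderId = unsigned;

// Hypothetical sketch: compile each distinct generated shader only once,
// keyed by a hash of its source.
class ShaderCache {
public:
    ShaderId Get(const std::string& generatedSource,
                 const std::function<ShaderId(const std::string&)>& compile) {
        size_t key = std::hash<std::string>{}(generatedSource);
        auto it = cache.find(key);
        if (it != cache.end())
            return it->second;              // reuse the compiled shader
        ShaderId id = compile(generatedSource);  // compile only on first use
        cache[key] = id;
        return id;
    }
private:
    std::map<size_t, ShaderId> cache;
};
```

Caching linked programs would work the same way, keyed by the combination of shader hashes, with glGetProgramBinary as the persistence path where the driver exposes a format.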

It was fairly easy to implement, allows decent flexibility, and cuts down on the time I have to spend writing little differences between shaders. There are obviously a lot of improvements I could make (as with anything), but I'm getting a lot of mileage out of its current state.

What do you think? I'd love some feedback.

Also, Obligatory Progress Screenshot:
[progress screenshot]




Quite a nice concept you have here, I like the pseudo-dynamic aspect of it.

I have a somewhat similar setup which revolves around using pre-processor macros and compiling the shader with different configurations ('sets' in your case).

 

Does it also work for non-uniforms (e.g. you wanted to fetch DiffuseColor not from a texture, but from the vertex colour itself) or is your vertex input pretty much static for the moment ?

 

Anyway, thanks for sharing, this has given ideas to push my own design further. 

I added a new "input type" for vertex attributes because I needed them for particle effects. Individual particle information is stored in a VBO so it has to be passed in through the vertex shader.

 

So if my vertex shader has 

out vec4 VertexColor;

In my material file I can type 

Color vertex_input("VertexColor")

And the generator will create 

in vec4 VertexColor;
vec4 Color;

and 

Color = VertexColor;

That adds an extra step, but I could add a check to see if the desired name and the input name are the same and, if so, generate only the input variable.

 

It's pretty easy with my implementation to add new ways of accepting input since it just involves deriving from a base input class and writing a handful of methods.
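That extension point might look something like the following. This is a hypothetical sketch of the base-class idea described above, not the actual framework code: the class and method names are invented, and the real base class presumably has more methods (uniform binding, value upload, etc.).

```cpp
#include <cassert>
#include <string>

// Hypothetical sketch: each kind of input derives from a base class and
// emits its own declaration code (for <%definitions%>) and init code
// (for <%init%>).
class MaterialInput {
public:
    virtual ~MaterialInput() = default;
    virtual std::string Declaration() const = 0;
    virtual std::string InitCode() const = 0;
};

// The vertex-attribute input type from the reply above: declares the
// vertex shader output as an input and copies it into the desired name.
class VertexAttributeInput : public MaterialInput {
public:
    VertexAttributeInput(std::string name, std::string attr, std::string type)
        : name(std::move(name)), attr(std::move(attr)), type(std::move(type)) {}
    std::string Declaration() const override {
        return "in " + type + " " + attr + ";\n" + type + " " + name + ";\n";
    }
    std::string InitCode() const override {
        return name + " = " + attr + ";\n";
    }
private:
    std::string name, attr, type;
};
```

With that shape, the Color vertex_input("VertexColor") line from the material file maps to VertexAttributeInput("Color", "VertexColor", "vec4"), and a new input source (texture, constant, particle attribute) is just another subclass.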
