We have a game here using a straightforward deferred lighting approach, but we'd like to get some lighting onto our translucent objects. In an attempt to avoid recreating the horrible combinatorial explosion of shader variants for every possible light combination, I've been trying to implement something similar to the technique Bungie described in their presentation on Destiny's lighting.
The idea is to collapse the light environment at various probe points into a spherical harmonic representation that the shader then uses to compute lighting. Currently it's doing all of this on the CPU, but I've run into what seems to be a fundamental issue with projecting a directional light into SH.
After digging through all of the foundational papers, everything seems to agree that the way to project a directional light into SH, convolved with the cosine response, is:
// SH is a 4-entry array of float3, one coefficient per band/channel;
// Y constants are sqrt(1/(4*pi)) and sqrt(3/(4*pi)), cosine lobe is pi and 2*pi/3
void project_directional( float3 SH[4], float3 color, float3 dir )
{
    SH[0] =  0.282095f * color * pi;
    SH[1] = -0.488603f * color * dir.y * (pi * 2.0f/3.0f);
    SH[2] =  0.488603f * color * dir.z * (pi * 2.0f/3.0f);
    SH[3] = -0.488603f * color * dir.x * (pi * 2.0f/3.0f);
}
float3 eval_normal( float3 SH[4], float3 dir )
{
    float3 result = SH[0] * 0.282095f;
    result += SH[1] * -0.488603f * dir.y;
    result += SH[2] *  0.488603f * dir.z;
    result += SH[3] * -0.488603f * dir.x;
    return result;
}
// the result is then scaled by the diffuse albedo
There's a normalization term or two I'm glossing over, but the problem I keep running into, and haven't seen any decent way to avoid, is that ambient term in SH[0]. If I plug in a simple light pointing down Z, normals pointing directly at it or directly away from it behave reasonably, but a normal pointing down, say, the X axis will always be lit by at least 1/4 of the light color. The result is a directional light that generates significant amounts of light at 90 degrees off-axis.
I'm not seeing how this could ever behave differently. I can get vaguely reasonable results if I ignore the ambient term while merging diffuse lights in, but that breaks down the moment I try summing two lights pointing in opposite directions. Expanding out to the 9-term quadratic form does not help much either.
I get the feeling I've missed some fundamental thing that trims down the off-axis directional light response, but I'll be damned if I can see where it would come from. Is this just a basic artifact of using a single light as a test case? Is this likely to behave better if I keep the main directional lights out of the SH set and use it only to collapse point lights in, treated as sphere lights or attenuated directionals? Or have I just royally screwed up my understanding of how to project a directional light into SH?
The usual pile of papers and articles from SCEE, Tom Forsyth, Sebastien Lagarde, etc. has not helped. Someone had a random shadertoy that looked like it worked better in posted screenshots, but actually running it produces results much like what I've described.