# Spherical Mapping for lighting


## Recommended Posts

Hello everyone,

I'm trying to implement a lighting technique that uses something similar to spherical environment reflection mapping. The light would be represented by a texture. The center of the texture would be bright, and would become darker around the edges. The closer each vertex normal is to the light direction, the closer to the center of the texture it should be mapped.

[attachment=18368:Sphere.png]

Note: "Polygon normal" should instead read "vertex normal". In addition, the math in the image is slightly off; see the code below instead.

The code I'm currently using does what I want to some extent, but has the unwanted side effect of mapping back-facing vertices (nearly) the same as front-facing vertices:

```hlsl
u = ( dot(vertex.normal, Light.Right) *  0.5 ) + 0.5;
v = ( dot(vertex.normal, Light.Up)    * -0.5 ) + 0.5;
```
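To see why that snippet can't tell front from back: the light's forward axis never enters the formula, so a normal pointing at the light and one pointing directly away produce identical UVs. A quick Python sketch for illustration (the function name and the orthonormal light basis here are made up, not from any engine):

```python
def uv_naive(normal, right, up):
    """The mapping from the snippet above: the forward axis never appears."""
    dot = lambda a, b: sum(p * q for p, q in zip(a, b))
    u = dot(normal, right) * 0.5 + 0.5
    v = dot(normal, up) * -0.5 + 0.5
    return (u, v)

# A hypothetical orthonormal light basis.
right, up = (1, 0, 0), (0, 1, 0)

front = (0.0, 0.0, 1.0)   # normal pointing toward the light
back  = (0.0, 0.0, -1.0)  # normal pointing directly away

print(uv_naive(front, right, up))  # (0.5, 0.5)
print(uv_naive(back, right, up))   # also (0.5, 0.5): indistinguishable
```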

My goal is to map vertices from the center of the texture, outward, based on how far away the vertex normal points from the light direction. Vertex normals facing away from the light should be mapped to the very edge of the texture. In addition, the direction-axis that changes should control the direction the texture coordinates move away from the center - if the light spins left-right, the uv coordinate should change left/right (as opposed to up/down, or some arbitrary direction).

One simple idea I came up with was to use dot(vertex.normal, Light.Forward) as some kind of scale factor for the other two axes, but I couldn't come up with any decent math to make it work.

This conversion is implemented in a vertex shader, so I'm trying to limit conditional expressions/code. I'm sure others have done this before for the same or similar purposes. I'm pretty stumped at the moment and could really use some outside thinking. I really appreciate any input or advice anyone may have. Thank you very much.

Edited by Stephany

##### Share on other sites

Your image is symmetrical in X and Y, so there is no need to use a 2D texture (i.e. a 1D gradient will encode the same amount of information).

At that point, making the 1D texture run from fully lit to fully unlit, and using saturate(dot(normal, light)) as the texture coordinate ought to do the trick.
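Concretely, that 1D coordinate might be computed like this (a Python sketch for illustration; `saturate` mirrors the HLSL intrinsic, `to_light` is assumed to be the direction from the surface toward the light, and which end of the gradient is "lit" depends on how you author the texture):

```python
def saturate(x):
    """Clamp to [0, 1], like the HLSL intrinsic."""
    return max(0.0, min(1.0, x))

def gradient_coord(normal, to_light):
    """1D texture coordinate: 1.0 facing the light, clamped to 0.0 past 90 degrees."""
    d = sum(p * q for p, q in zip(normal, to_light))
    return saturate(d)

to_light = (0.0, 0.0, 1.0)  # direction from surface toward the light

print(gradient_coord((0, 0, 1), to_light))   # 1.0, fully lit
print(gradient_coord((0, 1, 0), to_light))   # 0.0, edge-on
print(gradient_coord((0, 0, -1), to_light))  # 0.0, back-facing (clamped)
```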

You might also want to look at a more complex variation on this technique...

##### Share on other sites

Thank you for the information. I actually do need the texture coordinates to lay out on a full 2D grid. That specific texture would work as a 1D gradient, but I would like to implement different materials and effects that make use of the full texture image.

However, on the topic of 1D gradients, one could probably just plug in a slope formula to filter output and do away with the texture completely. I considered this as I was trying to brainstorm a way around this system, but it won't handle some of the things I hope to accomplish with the full spherical texturing.

Thanks again! Does anyone have any suggestions for the full 2D system?

##### Share on other sites

> Thanks again! Does anyone have any suggestions for the full 2D system?

Did you follow the link in my last post? It has sample code and all for a variant of the 2D case.

##### Share on other sites

We used these kinds of sphere light maps on the Wii for cheap lighting. But we did them per object, drawing multiple lights into each object's texture, which then let us draw the object with any number of lights for the same cost.

There's an explanation of the math for general sphere mapping here:

Before you use the spheremap formula though, in your case you want to rotate the normal into the coordinate frame of the light:

```hlsl
x = dot(vertex.normal, Light.Right);
y = dot(vertex.normal, Light.Up);
z = dot(vertex.normal, Light.Forward);
```

N.B. this code is exactly equivalent to multiplying the vertex normal by the light's 3x3 rotation matrix, e.g.:

```hlsl
xyz = vertex.normal * Light.Rotation;
```
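The equivalence is easy to check numerically. A Python sketch (the basis vectors below are arbitrary but orthonormal, chosen just for this example):

```python
# Light basis vectors (columns of the light's rotation matrix).
right   = (0.0, 0.0, 1.0)
up      = (0.0, 1.0, 0.0)
forward = (-1.0, 0.0, 0.0)
n = (0.6, 0.0, 0.8)  # a unit vertex normal

dot = lambda a, b: sum(p * q for p, q in zip(a, b))

# Three dot products, as in the snippet above.
by_dots = (dot(n, right), dot(n, up), dot(n, forward))

# Row vector times the 3x3 matrix whose columns are right/up/forward.
rows = [(right[i], up[i], forward[i]) for i in range(3)]
by_matrix = tuple(sum(n[i] * rows[i][j] for i in range(3)) for j in range(3))

print(by_dots)    # same three numbers either way
print(by_matrix)
```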

##### Share on other sites

Thank you both for your help. I was able to finally map the vertices somewhat the way I intended by using arctangent:

```hlsl
float x =  dot( normal, Light.Right );
float y =  dot( normal, Light.Up );
float z = -dot( normal, Light.Forward );

return float2( ( atan2(x, z) * 0.15915f ) + 0.5f,
               ( atan2(y, z) * 0.15915f ) + 0.5f );
```

Note: 0.15915f is 1.0 / (PI * 2).
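As a sanity check of that mapping, here's the same math in Python (a sketch with a made-up orthonormal light basis; `forward` is taken as the direction the light shines, so a surface facing the light has a normal opposite to it):

```python
import math

INV_TWO_PI = 1.0 / (2.0 * math.pi)  # the 0.15915f constant

def uv_spherical(normal, right, up, forward):
    """Texture center when facing the light, edges when facing away."""
    dot = lambda a, b: sum(p * q for p, q in zip(a, b))
    x = dot(normal, right)
    y = dot(normal, up)
    z = -dot(normal, forward)
    return (math.atan2(x, z) * INV_TWO_PI + 0.5,
            math.atan2(y, z) * INV_TWO_PI + 0.5)

right, up, forward = (1, 0, 0), (0, 1, 0), (0, 0, 1)

print(uv_spherical((0, 0, -1), right, up, forward))  # facing the light: (0.5, 0.5)
print(uv_spherical((0, 0, 1), right, up, forward))   # facing away: texture edge
```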

However, there is a really bothersome problem with the entire implementation, and I don't know if there is any fixing it. The way this works, vertices that are facing away from the light get mapped to the outside edges of the texture. This means that on the back of a round object, the vertex closer to the left gets mapped to the left texture edge, and the vertex adjacent to it is closer to the right, and so gets mapped to the right texture edge. This causes the polygon that shares these two vertices to get mapped with the entire sphere texture, because it must interpolate between the two edges. The result is a bright area on a single polygon/strip on the back of the object.
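The seam is easy to see numerically: two adjacent back-facing normals straddling the straight-back direction land on opposite edges of the texture, so the polygon between them interpolates across nearly the whole [0, 1] range. A Python sketch (the numbers are illustrative, not from any particular mesh):

```python
import math

INV_TWO_PI = 1.0 / (2.0 * math.pi)

def u_coord(x, z):
    """The u half of the atan2 mapping above."""
    return math.atan2(x, z) * INV_TWO_PI + 0.5

# Two adjacent back-facing normals, just left and right of straight-back
# (z = -dot(normal, forward) is about -1 for both).
u_left = u_coord(-0.1, -0.995)
u_right = u_coord(0.1, -0.995)

print(round(u_left, 3), round(u_right, 3))  # 0.016 0.984
# A triangle sharing these two vertices interpolates u across almost the
# entire texture, producing the bright streak on the back of the object.
```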

I may actually end up going with a 1D gradient :)

Thanks again

Edited by Stephany
