And my code in GLSL is written below:

```glsl
uniform sampler2D normal;
uniform vec2 mouse;
uniform float Za;

vec4 effect(vec4 color, sampler2D tex, vec2 tc, vec2 pc)
{
    vec4 img_color   = texture2D(tex, tc);
    vec4 normalColor = texture2D(normal, tc);

    // Vector from this pixel to the mouse (the light source)
    float X = mouse.x - pc.x;
    float Y = mouse.y - pc.y;
    float Z = Za;

    // Dot product of the light vector and the normal,
    // divided by both lengths (i.e. the cosine of the angle)
    float dotProduct = X * normalColor.r + Y * normalColor.g + Z * normalColor.b;
    dotProduct /= sqrt(X * X + Y * Y + Z * Z)
                * sqrt(normalColor.r * normalColor.r
                     + normalColor.g * normalColor.g
                     + normalColor.b * normalColor.b);

    float factor = dotProduct;
    img_color.r += factor;
    img_color.g += factor;
    img_color.b += factor;
    return img_color;
}
```

So I have a few questions:

**Is my math correct?**

What I'm trying to do is take the vector from the mouse (the light source, for now) to the pixel, take the normal of that pixel, compute the dot product between them, normalize it, and use that as the brightness factor. However, the angle *seems* to be a bit off. I just wanted to double-check that my math is right, because debugging on the GPU isn't easy.
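For comparison, the same math written with GLSL's built-in vector functions is easier to audit. One caveat worth flagging (and a common cause of an "angle looks off" symptom): normal maps are usually stored with components remapped from [-1, 1] into the [0, 1] colour range, so the sampled normal generally needs to be unpacked with `* 2.0 - 1.0` first; the screen's Y axis pointing down can cause a similar mismatch. A sketch, assuming the same uniforms as above:

```glsl
vec4 effect(vec4 color, sampler2D tex, vec2 tc, vec2 pc)
{
    vec4 img_color = texture2D(tex, tc);

    // Unpack the normal from the [0,1] colour range back to [-1,1]
    vec3 n = normalize(texture2D(normal, tc).rgb * 2.0 - 1.0);

    // Normalized direction from this pixel to the light (the mouse)
    vec3 lightDir = normalize(vec3(mouse - pc, Za));

    // Cosine of the angle between them; clamped so surfaces
    // facing away from the light clamp to dark instead of going negative
    float factor = max(dot(n, lightDir), 0.0);

    img_color.rgb += factor;
    return img_color;
}
```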

**For the usual lighting methods, do you multiply by a distance factor?**

Naturally when a light is further away the object should be dimmer. But this doesn't seem to be the case with this approach. Should I just multiply by some arbitrary distance factor?
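For what it's worth, the factor used in classic fixed-function lighting isn't arbitrary: attenuation is conventionally modelled as 1 / (Kc + Kl·d + Kq·d²), with constant, linear, and quadratic coefficients tuned per light. A sketch of how that could plug into the shader above (the coefficient values here are illustrative, not canonical):

```glsl
// Classic attenuation: 1 / (Kc + Kl*d + Kq*d^2)
float Kc = 1.0;     // constant term: keeps brightness finite at d = 0
float Kl = 0.01;    // linear falloff
float Kq = 0.0001;  // quadratic falloff (dominates at large distances)

float d = length(vec3(mouse - pc, Za));   // distance from pixel to light
float attenuation = 1.0 / (Kc + Kl * d + Kq * d * d);

factor *= attenuation;  // dim the brightness factor with distance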

Finally, I'm basically using 3D lighting for a 2D game, though the implementation seems pretty simple. I'm told the **Phong reflection model** may be better suited in this case. Any thoughts on that?
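For context, the dot-product term described above is exactly the diffuse part of Phong; the full model just adds a constant ambient floor and a view-dependent specular highlight. A rough sketch of the three terms, assuming `n` is the unpacked surface normal, `lightDir` the normalized pixel-to-light direction, and `lightColor` a light colour uniform (the coefficients and shininess exponent are illustrative):

```glsl
vec3 ambient = 0.1 * lightColor;                    // constant base light

float diff   = max(dot(n, lightDir), 0.0);          // the existing dot-product term
vec3 diffuse = diff * lightColor;

vec3 viewDir    = vec3(0.0, 0.0, 1.0);              // camera faces a 2D scene head-on
vec3 reflectDir = reflect(-lightDir, n);            // lightDir mirrored about the normal
float spec      = pow(max(dot(viewDir, reflectDir), 0.0), 32.0);  // 32 = shininess
vec3 specular   = 0.5 * spec * lightColor;

img_color.rgb *= ambient + diffuse + specular;
```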

Thanks in advance! Any help or tips on anything would be appreciated!