I'm writing a shader for a 2D point-light shadow mapper.

I render the depth values of the light's right frustum into a texture that is 1024x1 pixels in size.

I project the vertices myself, so I don't use the w coordinate.

This is my projection into the shadow map. `vs.lightSpace` means that all vertices are in the coordinate system of the light, with (0,0) at its origin.

gl_Position = vec4( vs.lightSpace.y / vs.lightSpace.x, 0.0f, depthToHom(vs.lightSpace.x), 1.0f);

float depthToHom(float depth) {
    float near = 0.5f;
    float far = 10.0f;
    return depth * 2.0f / (far - near) + (-far - near) / (far - near);
}

As you can see, the depth values are not interpolated correctly: on long edges there is a curved bend in the shadow. Is this because I don't do the perspective divide correctly? (In the screenshot, the big white circle is supposed to be the point light.)

To sample the depth I use:

float depth = texture(shadowMap, vec2((lightSpace.y / lightSpace.x + 1.0f) * 0.5f, 0.5f)).x;
if (depthToNorm(lightSpace.x) > depth) {
    //... ... ...
}

float depthToNorm(float depth) {
    float near = 0.5f;
    float far = 10.0f;
    return depth * (1.0f / (far - near)) + (-near) / (far - near);
}

**Edited by stefthedrummer, 11 April 2014 - 01:30 PM.**