Help with Shadow Mapping and Multi-Texturing

22 comments, last by Terin 11 years, 9 months ago
Beans,

THANK YOU for pointing out my BIG mistake. Lookie what I got working...

[Image: 2012.07.21 Shadow Mapping III (Shadows Working)]

Mind you, it's still not perfectly correct. But it's closer! While I get a shadow, it does not seem to land correctly when rendering, so I need to figure out any other gotchas. The other problem prevalent in this image is that the water is not transparent at all -- I assume the alpha channel isn't being preserved. Perhaps I should be multiplying my color by only XYZ instead of XYZW. XYZW should correspond to RGBA, yes? I'd assume that if an image is in ARGB format (specified at texture creation), then XYZW would map A => X, and so forth, yes?

Thanks for all of your help!
I don't think the mapping would change in the shader. XYZW/RGBA/STPQ are interchangeable component sets: W = A = Q. You can use any of those sets to address any slot of a vec4, and the component order a sampler returns is always RGBA regardless of how the image data was laid out on upload. I'm not sure you can mix sets like .xgq, but why not use .a for alpha and the rest when referring to color? Also, when you multiply in the diffuse component from the texture, you don't need any swizzles there (wxyz on both sides also looks strange).

What does MinimumShadow do? That line usually prevents the Lambert term from going negative, with zero in place of MinimumShadow.
I'm curious how your lighting works. Usually I see something like (ambient * DiffuseTexel) + (lambert * DiffuseTexel). I'm going to guess that MinimumShadow is there to prevent things from going all black when lambert would otherwise be zero?

New C/C++ Build Tool 'Stir' (doesn't just generate Makefiles, it does the build): https://github.com/space222/stir

OK, so after a bit of cleanup, I've managed to maintain my transparency, but I believe the shadows are not correctly projecting onto the image...

Yes, MinimumShadow prevents the Lambert term from reaching zero, so the lighting never scales completely to black.

Here's what my end-result of a Shader looks like for the Fragment Shader (everything else has remained the same in code):

[source lang="cpp"]uniform sampler2D DiffuseMap;
uniform sampler2D ShadowMap;
uniform mat4 WorldMatrix;
uniform float MinimumShadow;

varying vec3 Normal;
varying vec4 LightCoordinate;
varying vec4 WorldPosition;

void main()
{
    // Direct lighting
    // ------------------------------
    vec4 Color = gl_LightSource[0].ambient;

    vec3 View_Light = normalize(gl_LightSource[0].position.xyz);
    vec3 View_Normal = Normal;

    float Lambert = max(dot(View_Normal, View_Light), MinimumShadow);
    Color *= Lambert;

    // Blend in color from the primary texture unit
    Color *= texture2D(DiffuseMap, vec2(gl_TexCoord[0]));

    // Shadow mapping
    // ------------------------------
    vec4 lcoord = LightCoordinate;      // Fragment position in light clip space
    lcoord /= lcoord.w;                 // Perspective divide to cartesian space
    lcoord.xy = lcoord.xy * 0.5 + 0.5;  // Light clip space is [-1,1]; texture space is [0,1]

    float fragmentDepth = lcoord.z;                            // Depth of the fragment in light space
    float shadowMapDepth = texture2D(ShadowMap, lcoord.xy).z;  // Depth stored in the shadow map

    float eps = 0.001;  // Depth bias
    float shadow = fragmentDepth - eps > shadowMapDepth ? 0.5 : 1.0;

    Color.rgb *= shadow;

    gl_FragColor = Color;
}[/source]

The alpha channel is now preserved, thankfully. However, still some issues...

I'm passing in the ShadowMapTexture (well, the active texture is what I'm passing in, but I'm BINDING the shadow map) as the sampler2D ShadowMap, which appears to be correct. It seems like the shadow's coordinates aren't generating correctly, or that I'm doing something wrong still. Using the DepthMap does not ever make it look correct.

When I setup the scene for shadows, I do:

PushMatrix
LoadLightProjection
LoadLightModelView
PushMatrix
Translate into Scene Center
RenderScene
PopMatrix
PopMatrix

When I get the Light ModelView and Projection, I do...

PushMatrix
LoadLightProjection
LoadLightModelView
Translate into Scene Center
GetProjection into Variable
GetModelview into Variable

Multiply the two matrices together to form an uber matrix that will be sent in.
PopMatrix

I'm assuming this approach isn't correct, as whenever I move the camera, the entire set of shadows disappears. Another interesting thing: if I pull out the Translate, the shadows stay, but they don't move with the scene...

Here's the code for my matrix multiplication, so I don't have to do it in GLSL:

[source lang="csharp"]public static void MultiplyMatrices( int pDimension, float[] pMatrixA, float[] pMatrixB, ref float[] pResult )
{
    for ( int i = 0; i < pDimension; i++ )
    {
        for ( int j = 0; j < pDimension; j++ )
        {
            int index = i * pDimension + j;

            pResult[index] = 0.0f;

            for ( int k = 0; k < pDimension; k++ )
            {
                pResult[index] += pMatrixA[i * pDimension + k] * pMatrixB[k * pDimension + j];
            }
        }
    }
}[/source]

I believe it's correct; I lifted it from another site... Any thoughts?
Please keep in mind that Push/PopMatrix only affect the current matrix mode; ModelView and Projection have completely separate stacks. I might also be confused about which of those sets of steps creates the depth map and which renders the final scene. In a few of those steps it looks like Push/Pop don't match up. Basically, that list of steps has me confused (sorry 'bout that).

If the camera is causing a problem like that, it sounds like Push/Pop aren't matched up correctly, or the wrong MatrixMode is sticking around past where it's supposed to. Otherwise, the main camera shouldn't affect shadow map generation.


I have a feeling that the Light Projection and View Matrix are not correct.

So, I have a "Debug" mode that loads the Light Projection and View matrices (instead of the scene's), and when I use it, things still look wrong. For example, if the sun is rising in the East, I get shadows on the Eastern side of the scene, not the Western side. This uses the same code and steps as rendering the scene from the light's view. I have now replicated the steps so that the Push/Pop and Translate calls all work together correctly.

If I do a Translate on the Light to center it with my scene (when updating the Light Matrix that is sent into the Rendering Shader), the second I move from center, everything goes DARK or LIGHT, which leads me to think that there HAS to be something wrong with the Light's Position or the Light's Matrix or something like that.

I believe the Ambient Lighting for the scene is fine and without any flaws. That has to mean that something is wrong with how either I calculate the LightCoordinate or the Light's Matrix.

I'll post my code again and maybe you'll see a problem... Again, I've made changes...

[source lang="cpp"]// CAPTURE

// VERTEX
varying vec4 LightPosition;

void main()
{
    gl_Position = ftransform();
    LightPosition = gl_Position;
}

// FRAGMENT
varying vec4 LightPosition;

void main()
{
    gl_FragColor = vec4(LightPosition.z / LightPosition.w);
}

// RENDER

// VERTEX
uniform mat4 LightModelViewProjectionMatrix;
uniform mat4 WorldMatrix;

varying vec3 Normal;
varying vec4 LightCoordinate; // Position in model-view-projection space from the light's view
varying vec4 WorldPosition;   // Position in world space

void main()
{
    Normal = gl_Normal;
    WorldPosition = WorldMatrix * gl_Vertex;
    LightCoordinate = LightModelViewProjectionMatrix * gl_Vertex;
    gl_Position = ftransform(); // Transform via fixed function into the viewer's view
    gl_TexCoord[0] = gl_MultiTexCoord0;
}

// FRAGMENT
uniform sampler2D DiffuseMap;
uniform sampler2D ShadowMap;
uniform mat4 WorldMatrix;
uniform float MinimumShadow;

varying vec3 Normal;
varying vec4 LightCoordinate;
varying vec4 WorldPosition;

void main()
{
    // Direct lighting
    // ------------------------------
    vec4 Color = gl_LightSource[0].ambient;

    vec3 View_Light = normalize(gl_LightSource[0].position.xyz);
    vec3 View_Normal = normalize(Normal);

    float Lambert = max(dot(View_Normal, View_Light), MinimumShadow);
    Color *= Lambert;

    // Blend in color from the primary texture unit
    Color *= texture2D(DiffuseMap, vec2(gl_TexCoord[0]));

    // Shadow mapping
    // ------------------------------
    vec4 lcoord = LightCoordinate;      // Fragment position in light clip space
    lcoord /= lcoord.w;                 // Perspective divide to cartesian space
    lcoord.xy = lcoord.xy * 0.5 + 0.5;  // Light clip space is [-1,1]; texture space is [0,1]

    float fragmentDepth = lcoord.z;                            // Depth of the fragment in light space
    float shadowMapDepth = texture2D(ShadowMap, lcoord.xy).r;  // Depth stored in the shadow map

    float eps = 0.001;  // Depth bias
    float shadow = fragmentDepth - eps > shadowMapDepth ? 0.5 : 1.0;

    Color.rgb *= shadow;

    gl_FragColor = Color;
}[/source]
Whenever I set the matrices, I make sure I LoadIdentity() before doing any operations on them, so they should be cleared and set up as desired. The order of operations is correct now as well...

I've noticed the code I borrowed also makes ABSOLUTELY NO USE of the DepthMapTexture... They attach it to the FBO but use it nowhere else. There must be some calculation I need to do on that texture to figure out where a fragment falls on the ShadowMapTexture, or to relate it to the coordinates being used...

Thoughts?
Ambient is supposed to be added in without being involved in the Lambert calculation; part of its purpose is keeping things from being completely black. But that's obviously not the priority while the shadows are being stubborn.

As for the two textures, there's really no way to know what another coder was thinking if they didn't leave any notes. All shadow code I've seen disables color-buffer drawing, leaves the color attachment out of the FBO entirely, and uses only the depth-component texture in the shadow-checking code.

Other than trying gluLookAt to build the light's camera/view matrix, I'm out of ideas.


Hey Beans,

OK, so I've been working on my engine and noticed I GROSSLY screwed up the normals.

That being said, I now do something like this:

[source lang="csharp"]public static Vertex CalculateNormal( Double pPoint1X, Double pPoint1Y, Double pPoint1Z, Double pPoint2X, Double pPoint2Y, Double pPoint2Z, Double pPoint3X, Double pPoint3Y, Double pPoint3Z )
{
    Vertex A, B, Result;

    // Edge vectors from point 1 to points 2 and 3
    A.X = pPoint2X - pPoint1X;
    A.Y = pPoint2Y - pPoint1Y;
    A.Z = pPoint2Z - pPoint1Z;

    B.X = pPoint3X - pPoint1X;
    B.Y = pPoint3Y - pPoint1Y;
    B.Z = pPoint3Z - pPoint1Z;

    // Cross product A x B
    Result.X = ( A.Y * B.Z ) - ( A.Z * B.Y );
    Result.Y = ( A.Z * B.X ) - ( A.X * B.Z );
    Result.Z = ( A.X * B.Y ) - ( A.Y * B.X );

    // Normalize (divide by the vector's length)
    Double UnitLength = Math.Sqrt( ( Result.X * Result.X ) + ( Result.Y * Result.Y ) + ( Result.Z * Result.Z ) );

    if ( UnitLength != 0 )
    {
        Result.X /= UnitLength;
        Result.Y /= UnitLength;
        Result.Z /= UnitLength;
    }

    return Result;
}[/source]
As you can see, I really just copied something from the OpenGL wiki and converted it to C#.

And then to actually add in the Normals I do...

[source lang="csharp"]//T1
TerrainTileVertices.Add( x );
TerrainTileVertices.Add( y );
TerrainTileVertices.Add( pTiles[Offset].Height );

TerrainTileVertices.Add( x );
TerrainTileVertices.Add( y + 1 );
TerrainTileVertices.Add( pTiles[Offset + Diameter].Height );

TerrainTileVertices.Add( x + 1 );
TerrainTileVertices.Add( y + 1 );
TerrainTileVertices.Add( pTiles[Offset + Diameter + 1].Height );

Normal = Auxiliary.CalculateNormal( TerrainTileVertices[TerrainTileVertices.Count - 9], TerrainTileVertices[TerrainTileVertices.Count - 8], TerrainTileVertices[TerrainTileVertices.Count - 7],
TerrainTileVertices[TerrainTileVertices.Count - 6], TerrainTileVertices[TerrainTileVertices.Count - 5], TerrainTileVertices[TerrainTileVertices.Count - 4],
TerrainTileVertices[TerrainTileVertices.Count - 3], TerrainTileVertices[TerrainTileVertices.Count - 2], TerrainTileVertices[TerrainTileVertices.Count - 1] );

for ( int i = 0; i < 3; i++ )
{
    TerrainTileNormals.Add( Normal.X );
    TerrainTileNormals.Add( Normal.Y );
    TerrainTileNormals.Add( Normal.Z );
}

//T2
TerrainTileVertices.Add( x );
TerrainTileVertices.Add( y );
TerrainTileVertices.Add( pTiles[Offset].Height );

TerrainTileVertices.Add( x + 1 );
TerrainTileVertices.Add( y );
TerrainTileVertices.Add( pTiles[Offset + 1].Height );

TerrainTileVertices.Add( x + 1 );
TerrainTileVertices.Add( y + 1 );
TerrainTileVertices.Add( pTiles[Offset + Diameter + 1].Height );

Normal = Auxiliary.CalculateNormal( TerrainTileVertices[TerrainTileVertices.Count - 9], TerrainTileVertices[TerrainTileVertices.Count - 8], TerrainTileVertices[TerrainTileVertices.Count - 7],
TerrainTileVertices[TerrainTileVertices.Count - 6], TerrainTileVertices[TerrainTileVertices.Count - 5], TerrainTileVertices[TerrainTileVertices.Count - 4],
TerrainTileVertices[TerrainTileVertices.Count - 3], TerrainTileVertices[TerrainTileVertices.Count - 2], TerrainTileVertices[TerrainTileVertices.Count - 1] );

for ( int i = 0; i < 3; i++ )
{
    TerrainTileNormals.Add( Normal.X );
    TerrainTileNormals.Add( Normal.Y );
    TerrainTileNormals.Add( Normal.Z );
}[/source]

T1 and T2 are the two triangles that compose one quad/tile.

As you pointed out, my directional-light calculation was also horribly incorrect (oops!). So I've updated it...

[source lang="cpp"]// VERTEX
uniform mat4 LightModelViewProjectionMatrix;
uniform mat4 WorldMatrix;

varying vec3 Normal;         // The eye-space normal of the current vertex
varying vec3 LightDirection; // The eye-space direction of the light

varying vec4 LightCoordinate; // Position in model-view-projection space from the light's view
varying vec4 WorldPosition;   // Position in world space

void main()
{
    Normal = normalize(gl_NormalMatrix * gl_Normal);
    LightDirection = normalize(vec3(gl_LightSource[0].position));

    WorldPosition = WorldMatrix * gl_Vertex;
    LightCoordinate = LightModelViewProjectionMatrix * gl_Vertex;
    gl_Position = ftransform(); // Transform via fixed function into the viewer's view
    gl_TexCoord[0] = gl_MultiTexCoord0;
}

// FRAGMENT
uniform sampler2D DiffuseMap;
uniform sampler2D ShadowMap;
uniform mat4 WorldMatrix;
uniform float MinimumShadow;

varying vec3 Normal;         // The eye-space normal of the current vertex
varying vec3 LightDirection; // The eye-space direction of the light
varying vec4 LightCoordinate;
varying vec4 WorldPosition;

void main()
{
    vec4 Texel = texture2D(DiffuseMap, vec2(gl_TexCoord[0]));

    // Directional lighting
    // ------------------------------
    // Ambient term
    vec4 AmbientElement = gl_LightSource[0].ambient;

    // Diffuse term
    float Lambert = max(dot(Normal, LightDirection), 0.0);
    vec4 DiffuseElement = gl_LightSource[0].diffuse * Lambert;

    // Cap at 1.0 so the texel color is never exceeded
    vec4 LightingColor = min(DiffuseElement + AmbientElement, vec4(1.0));

    LightingColor *= Texel;

    // Shadow mapping (disabled for now)
    // ------------------------------
    // vec4 lcoord = LightCoordinate;      // Fragment position in light clip space
    // lcoord /= lcoord.w;                 // Perspective divide to cartesian space
    // lcoord.xy = lcoord.xy * 0.5 + 0.5;  // Light clip space is [-1,1]; texture space is [0,1]
    //
    // float fragmentDepth = lcoord.z;                            // Depth of the fragment in light space
    // float shadowMapDepth = texture2D(ShadowMap, lcoord.xy).r;  // Depth stored in the shadow map
    //
    // float eps = 0.001;  // Depth bias
    // float shadow = fragmentDepth - eps > shadowMapDepth ? 0.5 : 1.0;
    //
    // LightingColor.rgb *= shadow;

    gl_FragColor = LightingColor;
}[/source]

Only problem is I think I still haven't quite gotten the scene lighting right, since some triangles appear to be incorrectly lit -- i.e. I'm expecting both triangles in a quad to be nearly identically lit. Unfortunately, that does not seem to be the case -- even for light shooting down from directly above.

I'm not sure what a good ambient/diffuse color set might be -- that may be my problem. My understanding is that diffuse and ambient add together. I also wanted to make sure the scene doesn't get too bright -- I don't want light to be super-reflective and show the light color itself. At most I want the true RGBA of the texel to be shown, which is why I capped each element of LightingColor at 1.0 (there may be a better way).

i.e.

[Image: 2012.07.26 Directional Lighting (Incorrect)]

The ambient and diffuse are the same color at the moment, to attempt to "build" this out... And sure, it makes sense that they add together here. But it looks like the normals may still be off for some triangles that should, in reality, be the same. The tiles on the lower-left side are all flat on the ground.

Any thoughts?
The only reason I haven't posted anything is that your pictures are ridiculous. I have no idea what I'm looking at. I know it's some terrain, but it's insane. Get rid of the texturing/tiles if you're just working on the lighting alone.

Does your lighting shader work on other objects like a sphere?

Tips:
Make a function called CrossProduct that takes two vectors instead of this:
Result.X = ( A.Y * B.Z ) - ( A.Z * B.Y );
Result.Y = ( A.Z * B.X ) - ( A.X * B.Z );
Result.Z = ( A.X * B.Y ) - ( A.Y * B.X );

It looks like you are only doing face normals instead of blended vertex normals.

NBA2K, Madden, Maneater, Killing Floor, Sims http://www.pawlowskipinball.com/pinballeternal

I was hoping to do something flat, like face normals. Is that a problem? Good point on the method -- done.

I'm sorry the pictures are "ridiculous." I figured you'd still be able to tell even with textures on. I haven't pushed anything else into the engine to test with; I figured a flat surface with occasional changes in height would be a good test.
If your quads are supposed to be evenly lit, then you are probably crossing the wrong vertices, or crossing them in the wrong order, causing some normals to come out negated.


This topic is closed to new replies.
