Normal mapping doesn't work on flat surfaces?

7 comments, last by Lifepower 17 years ago
After successfully implementing a normal mapping algorithm, I've found a rather disturbing problem: it doesn't seem to work on flat surfaces such as cubes or planes, but it works perfectly on round surfaces like spheres. Here is how normal mapping looks when applied to the surface of a sphere (screenshot from FX Composer):

Normal mapped sphere

However, when the same effect is applied to a simple cube, the result is all wrong:

Normal mapped cube

Initially I thought that perhaps my code was wrong, so I downloaded the "Relief Mapping" package from the NVIDIA Shader Library. It makes the sphere look like this:

Parallax mapped sphere

Unfortunately, when applied to the same cube, it gives an equally wrong result:

Parallax mapped cube

I also thought that maybe FX Composer generates the tangent and binormal vectors in the meshes incorrectly, so I tested my shader in a simple app of my own. Again, everything works perfectly with a round shape (torus knot), but when applied to a simple plane it is all wrong:

Normal mapped torus knot and XZ plane

Since I was using D3DXComputeTangentFrameEx to calculate the binormal and tangent vectors, I thought that maybe this function didn't work with flat surfaces, so I specified the tangent and binormal vectors for the plane manually. The result was exactly the same. The curious part is that when using WRONG tangent and binormal vectors (just a random combination I tried), namely Tangent = {1.0f, 1.0f, 1.0f} and Binormal = {-1.0f, -1.0f, -1.0f} (normalized, of course), the result seems to improve quite a bit! Look:

Wrong tangent and binormal vectors lead to better result

Has anyone experienced these artifacts before? Is there any way to fix them, or at least reduce them?

Lastly, while playing in FX Composer, I noticed that there is one side of the cube that in fact looks correct (!!!) Look:

One side of cube looks correctly

The mystery is that only THAT side of the cube looks fine; the rest is garbled.
I remember seeing bump-mapped flat surfaces in games that looked correct, so the question is: how is it done?
Hi, I just took a quick look (I'll check again when I'm home), but it looks like your tangent space transformation is off. How does your shader code look?

Edit: Okay, I think that when you're transforming your normal/binormal/tangent into world space, you might be translating them too. Make sure your W is 0 when you mul with the world matrix.
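The point about W being 0 is easy to verify outside the shader. Here's a small Python sketch (with made-up matrix values) showing that a normal transformed with w = 1 picks up the world matrix's translation, while w = 0 leaves it alone:

```python
# Why w must be 0 for directions: a 4x4 world matrix with a translation
# component. Hypothetical numbers, just to illustrate the point.

def transform(v, m):
    """Row-vector times 4x4 matrix, as in HLSL's mul(float4, World)."""
    return [sum(v[i] * m[i][j] for i in range(4)) for j in range(4)]

# Identity rotation plus a translation of (5, 0, 0), row-major with the
# translation in the last row (D3D convention).
world = [
    [1, 0, 0, 0],
    [0, 1, 0, 0],
    [0, 0, 1, 0],
    [5, 0, 0, 1],
]

normal = (0, 1, 0)

as_point     = transform([*normal, 1], world)  # w = 1: translation applied
as_direction = transform([*normal, 0], world)  # w = 0: translation ignored

print(as_point[:3])      # [5, 1, 0] -- the normal got shifted, wrong
print(as_direction[:3])  # [0, 1, 0] -- unchanged, correct
```

A normal that has been translated is no longer a unit direction at all, which produces exactly the kind of garbled lighting seen in the screenshots.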

[Edited by - MetaKnight on April 22, 2007 12:43:46 AM]
I don't transform the normal, binormal and tangent vectors. Instead, I transform the light direction and the vertex-to-eye vector by the world inverse, and then into tangent space.

Here's my vertex shader:
void ShadowBumpPhongVS(
  float3 InPos     : POSITION0,
  float3 InTangent : TANGENT0,
  float3 InBinormal: BINORMAL0,
  float3 InNormal  : NORMAL0,
  float2 InTex     : TEXCOORD0,
  out float4 OutPos     : POSITION,
  out float2 OutTex     : TEXCOORD0,
  out float4 ProjTex    : TEXCOORD1,
  out float3 OutToEye   : TEXCOORD2,
  out float3 OutLightDir: TEXCOORD3)
{
  // Compute projected texture coordinates.
  float4 PosW = mul(float4(InPos, 1.0f), World);
  ProjTex = mul(PosW, LightWVP);

  // Pass skin texture coordinates.
  OutTex = InTex;

  // Build matrix for transforming to tangent space.
  float3x3 ToTangent = transpose(float3x3(InTangent, InBinormal, InNormal));

  // Compute vertex-to-eye position in tangent space.
  float3 EyePosL = mul(float4(EyePos, 1.0f), WorldInverse);
  OutToEye = mul(EyePosL - InPos, ToTangent);

  // Compute light direction in tangent space.
  float3 LightDirL = mul(float4(LightDir, 0.0f), WorldInverse);
  OutLightDir = mul(LightDirL, ToTangent);

  // Transform vertex position.
  OutPos = mul(float4(InPos, 1.0f), WorldViewProjection);
}


And here's my pixel shader:
float4 ShadowBumpPhongPS(
  float2 InTex   : TEXCOORD0,
  float4 ProjTex : TEXCOORD1,
  float3 ToEye   : TEXCOORD2,
  float3 LightDir: TEXCOORD3) : COLOR
{
  float ShadowCoef = ApplyShadowMap(ProjTex);

  float3 NormalT = tex2D(BumpSampler, InTex);
  NormalT = normalize(2.0f * NormalT - 1.0f);

  return ApplyPhong(InTex, normalize(ToEye), -normalize(LightDir),
    NormalT, ShadowCoef);
}
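One thing worth noting about the vertex shader above: the transpose(float3x3(Tangent, Binormal, Normal)) trick only acts as a valid to-tangent-space transform while the three vectors are orthonormal, because for an orthonormal matrix the transpose equals the inverse. A quick numeric sanity check in Python, using a made-up orthonormal basis:

```python
# For an orthonormal TBN basis, transpose == inverse, so
# mul(v, transpose(TBN)) maps v into tangent space.
# Sketch with a hypothetical orthonormal basis (a 90-degree rotation).

def mat_mul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def transpose(m):
    return [[m[j][i] for j in range(3)] for i in range(3)]

# Rows are tangent, binormal, normal -- orthonormal by construction.
tbn = [
    [0, 0, 1],   # tangent
    [0, 1, 0],   # binormal
    [-1, 0, 0],  # normal
]

product = mat_mul(tbn, transpose(tbn))
print(product)  # the identity matrix: transpose really is the inverse here
```

If the basis stops being orthonormal (for example, after a non-uniform scale), the transpose is no longer the inverse and the resulting tangent-space vectors are skewed.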
Judging by the lack of responses, I assume that nobody uses bump mapping, or nobody cares? Come on guys, any suggestions on this topic?

As an update, I've checked the shader code from the "Programming Vertex and Pixel Shaders" book by Wolfgang Engel, and his normal mapping code doesn't work with flat surfaces either. Is there some dirty trick required to make bump mapping work on a simple plane or cube?
It's hard to see in the images whether things are actually wrong, but I believe you if you say so :) I guess you precompute the tangents and binormals in the mesh; are they computed correctly? You could try to compute the TBN matrix inside the shader. Also try sending the matrix to the pixel shader and performing the transformation there.

// Inside the vertex shader
float3 Up = float3(0.0f, 1.0f, 0.0f);
if (abs(inNormal.y) > abs(inNormal.x))   // pick an Up not parallel to the normal
    Up = float3(1.0f, 0.0f, 0.0f);
float3 Tangent = normalize(cross(Up, inNormal));
float3 Binormal = cross(Tangent, inNormal);
// Send Tangent, Binormal and Normal to the pixel shader as texture
// coordinates and build the TBN matrix there
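The same construction can be checked offline. A Python sketch of the snippet above follows; note that a basis derived this way ignores the mesh's UVs entirely, so the normal map's X/Y axes may not line up with the texture. It is fine for debugging, but not a general replacement for UV-derived tangents.

```python
# Build an orthogonal tangent basis from a normal alone, mirroring the
# HLSL snippet: pick a safe Up vector, then two cross products.
import math

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    n = math.sqrt(dot(v, v))
    return tuple(x / n for x in v)

def tangent_basis(normal):
    # Pick an Up vector not parallel to the normal.
    up = (0.0, 1.0, 0.0)
    if abs(normal[1]) > abs(normal[0]):
        up = (1.0, 0.0, 0.0)
    tangent = normalize(cross(up, normal))
    binormal = cross(tangent, normal)
    return tangent, binormal

n = normalize((0.3, 0.9, 0.1))
t, b = tangent_basis(n)
# All three vectors are mutually perpendicular: the dot products are ~0.
print(dot(t, n), dot(b, n), dot(t, b))
```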
Journal entry with pictures of per-pixel lighting on a simple 2-triangle flat quad - seems to work just fine for me [razz]

Seriously though, I'd point the finger at your tangent/bitangent/normal vectors - you can get some very odd results when they're wrong. Given that they're generated from the UV coordinates, any sort of clever mirroring/wrapping modifiers tends to explode the TBN generators.
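To illustrate what "generated from the UV coordinates" means: per triangle, a TBN generator solves the edge equations e1 = du1*T + dv1*B and e2 = du2*T + dv2*B for T and B. A minimal Python sketch with hypothetical triangle data:

```python
# Per-triangle tangent/binormal from positions and UVs, the standard
# construction that tools like D3DXComputeTangentFrameEx are based on.

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def scale(v, s): return tuple(x * s for x in v)
def add(a, b): return tuple(x + y for x, y in zip(a, b))

def triangle_tangent(p0, p1, p2, uv0, uv1, uv2):
    e1, e2 = sub(p1, p0), sub(p2, p0)
    du1, dv1 = uv1[0] - uv0[0], uv1[1] - uv0[1]
    du2, dv2 = uv2[0] - uv0[0], uv2[1] - uv0[1]
    r = 1.0 / (du1 * dv2 - du2 * dv1)  # blows up when UVs are degenerate
    tangent  = scale(add(scale(e1, dv2), scale(e2, -dv1)), r)
    binormal = scale(add(scale(e2, du1), scale(e1, -du2)), r)
    return tangent, binormal

# An axis-aligned triangle in the XZ plane, with UVs following X and Z:
t, b = triangle_tangent((0, 0, 0), (1, 0, 0), (0, 0, 1),
                        (0, 0), (1, 0), (0, 1))
print(t, b)  # tangent along +X, binormal along +Z
```

When the UVs are mirrored, the determinant du1*dv2 - du2*dv1 flips sign and the tangent flips with it, which is exactly the kind of mirroring/wrapping failure described above.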

If you've got the geometry loaded from a file, try loading it in DXViewer - it has a way of visualizing the per-vertex vectors, so you can do a quick visual inspection and see if there's something wrong.

hth
Jack

Jack Hoxley [ Forum FAQ | Revised FAQ | MVP Profile | Developer Journal ]

I have finally found the source of the problem. Since I apply a non-uniform scale to my objects, this part of the code no longer generates a correct result:

// Compute vertex-to-eye position in tangent space.
float3 EyePosL = mul(float4(EyePos, 1.0f), WorldInverse);
OutToEye = mul(EyePosL - InPos, ToTangent);

// Compute light direction in tangent space.
float3 LightDirL = mul(float4(LightDir, 0.0f), WorldInverse);
OutLightDir = mul(LightDirL, ToTangent);


Is there a way to reverse transform that works with non-uniform scaling?
I do all my lighting in world space - it's easier to write orthogonal shader code that way, it seems. If you're doing scaling (uniform or not), you'd still presumably need to re-normalize after transform, which does preclude some tricks (like encoding ambient occlusion in the normals' lengths).
I have been checking the Parallax Occlusion Mapping example from the DX SDK and learned that it transforms from world space to tangent space, as opposed to the local-space-to-tangent-space transform I was using. That code, again, assumed uniform scaling. However, I do know how to correctly transform normals to world space under non-uniform scaling, so I modified the example code and ported it to my vertex shader.
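The inverse-transpose rule for normals is easy to demonstrate numerically. A Python sketch with a hypothetical non-uniform scale, checking that a normal stays perpendicular to its surface only when transformed by the inverse-transpose of the world matrix:

```python
# Why normals need the inverse-transpose under non-uniform scale:
# transform a surface direction and its normal, then check whether
# they are still perpendicular.

def mat_vec(m, v):
    return tuple(sum(m[i][j] * v[j] for j in range(3)) for i in range(3))

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

# Non-uniform scale: doubled along X, unchanged elsewhere.
world = [[2, 0, 0],
         [0, 1, 0],
         [0, 0, 1]]

# Its inverse-transpose (diagonal matrix, so just the reciprocals).
world_it = [[0.5, 0, 0],
            [0, 1, 0],
            [0, 0, 1]]

# A surface direction and its normal on a 45-degree slope.
surface = (1, 1, 0)
normal = (1, -1, 0)
assert dot(surface, normal) == 0          # perpendicular before transform

s = mat_vec(world, surface)               # (2, 1, 0)
n_wrong = mat_vec(world, normal)          # (2, -1, 0): no longer perpendicular
n_right = mat_vec(world_it, normal)       # (0.5, -1, 0): still perpendicular

print(dot(s, n_wrong), dot(s, n_right))   # 3 0.0
```

With a uniform scale the two results differ only in length, which is why the bug stayed hidden until a non-uniform scale was applied.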

In case someone is having the same trouble I had, here is the fixed vertex shader code:
void ShadowBumpPhongVS(
  float3 PositionOS: POSITION0,
  float3 TangentOS : TANGENT0,
  float3 BinormalOS: BINORMAL0,
  float3 NormalOS  : NORMAL0,
  float2 InTex     : TEXCOORD0,
  out float4 OutPos     : POSITION,
  out float2 OutTex     : TEXCOORD0,
  out float4 ProjTex    : TEXCOORD1,
  out float3 OutToEye   : TEXCOORD2,
  out float3 OutLightDir: TEXCOORD3)
{
  // Compute projected texture coordinates.
  float4 PosWS = mul(float4(PositionOS, 1.0f), World);
  ProjTex = mul(PosWS, LightWVP);

  // Transform normal, binormal and tangent vectors to world space.
  float3 NormalWS   = mul(float4(NormalOS, 0.0f), WorldInverseTranspose).xyz;
  float3 TangentWS  = mul(float4(TangentOS, 0.0f), WorldInverseTranspose).xyz;
  float3 BinormalWS = mul(float4(BinormalOS, 0.0f), WorldInverseTranspose).xyz;

  NormalWS   = normalize(NormalWS);
  TangentWS  = normalize(TangentWS);
  BinormalWS = normalize(BinormalWS);

  // Compute matrix for transforming from world space to tangent space.
  float3x3 ToTangent = transpose(float3x3(TangentWS, BinormalWS, NormalWS));

  // Transform light direction and vertex-to-eye vector to tangent space.
  OutLightDir = mul(LightDir, ToTangent);
  OutToEye    = mul(EyePos - PosWS, ToTangent);

  // Transform vertex position.
  OutPos = mul(float4(PositionOS, 1.0f), WorldViewProjection);

  // Pass skin texture coordinates.
  OutTex = InTex;
}


The above code works properly for any kind of surface, with both uniform and non-uniform scaling. Here is the correctly generated image:
Bump-mapping fixed!

Thanks everyone! [smile]

This topic is closed to new replies.
