
alexmac

Cg bump mapping confusion


I'm getting a bit confused trying to get bump mapping to work using Cg and OpenGL. To test it out I generated a sphere in Blender (an icosphere) which my program can load; that part works fine. I also have it texture mapped and it looks OK. I have a simple diffuse lighting Cg shader up and running, so that is also OK. The way I'm doing the bump mapping is as follows:

program: pre-generate the tangent vector per polygon at the start.

vertex shader: using the surface normal and the tangent vector, take the cross product to work out the binormal vector (this should be OK since the polygons are all uniformly textured). Then form a 3x3 rotation matrix out of the tangent, binormal and normal vectors. Multiply light_pos, eye_pos and polygon_pos by this matrix to get everything into tangent space (from object space). Pass these values to the fragment shader.

fragment shader: for now, ignore the normal map and just do simple diffuse lighting, to test that everything in tangent space works the same as it did before.

problem: I thought this would just give me standard diffuse lighting (albeit slower due to the conversion to tangent space), but it seems like lots of the polygons are getting rotated the wrong way and appear a lot darker than they should. I think this has something to do with the following case, where two polygons side by side have very different tangent and binormal vectors, but I don't see why this should be a problem:
*-----------------*
|    T            |
|    |            |
|    |            |
|    *____B       |
|                 |
|                 |
*-----------------*
|                 |
|   B____*        |
|        |        |
|        |        |
|        T        |
|                 |
*-----------------*
 
Any help would be much appreciated! [edited by - alexmac on June 11, 2004 11:42:31 AM]
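For anyone following along, the per-polygon tangent generation described above can be sketched roughly like this. This is a minimal illustration, not the poster's actual code; the `vec3` type and helper names are invented for the example:

```c
typedef struct { float x, y, z; } vec3;

static vec3 v3_sub(vec3 a, vec3 b)
{
    vec3 r = { a.x - b.x, a.y - b.y, a.z - b.z };
    return r;
}

/* Per-face tangent/binormal from one triangle's positions (p0..p2) and
   texture coordinates (u,v). The tangent points along increasing u, the
   binormal along increasing v, both expressed in object space. */
static void face_tangent_basis(vec3 p0, vec3 p1, vec3 p2,
                               float u0, float v0,
                               float u1, float v1,
                               float u2, float v2,
                               vec3 *tangent, vec3 *binormal)
{
    vec3 e1 = v3_sub(p1, p0);            /* edge vectors in object space */
    vec3 e2 = v3_sub(p2, p0);
    float du1 = u1 - u0, dv1 = v1 - v0;  /* the same edges in uv space */
    float du2 = u2 - u0, dv2 = v2 - v0;

    /* Solve e1 = du1*T + dv1*B and e2 = du2*T + dv2*B for T and B. */
    float det = du1 * dv2 - du2 * dv1;
    float r = (det != 0.0f) ? 1.0f / det : 0.0f;

    tangent->x  = (dv2 * e1.x - dv1 * e2.x) * r;
    tangent->y  = (dv2 * e1.y - dv1 * e2.y) * r;
    tangent->z  = (dv2 * e1.z - dv1 * e2.z) * r;
    binormal->x = (du1 * e2.x - du2 * e1.x) * r;
    binormal->y = (du1 * e2.y - du2 * e1.y) * r;
    binormal->z = (du1 * e2.z - du2 * e1.z) * r;
}
```

On a triangle whose UV mapping lines up with object space (e.g. positions (0,0,0), (1,0,0), (0,1,0) with UVs (0,0), (1,0), (0,1)) this yields T = (1,0,0) and B = (0,1,0), matching the diagram above.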

Some more info, just in case someone can spot any errors:

vertex shader (uses the tangent, binormal and normal vectors to transform world space/object space into tangent space):

struct outputdata
{
    float4 pos : POSITION;
    float4 color : COLOR;
    float3 texcoord : TEXCOORD0;
    float3 normal : TEXCOORD1;
    float3 eye_pos : TEXCOORD2;
    float3 obj_pos : TEXCOORD3;
    float3 light_pos : TEXCOORD4;
    float3 light_color : TEXCOORD5;
    float3 light_ambient : TEXCOORD6;
    float3 light_specular : TEXCOORD7;
};

outputdata main(float4 position : POSITION,
                float4 color : COLOR,
                float3 normal : NORMAL,
                float3 texcoord : TEXCOORD0,
                float3 tangent : TEXCOORD1,
                float3 binormal : TEXCOORD2,

                uniform float3 light_pos,
                uniform float3 light_color,
                uniform float3 light_ambient,
                uniform float3 light_specular,
                uniform float3 eye_pos,
                uniform float4x4 mx_modelview_proj,
                uniform float4x4 mx_modelview_it)
{
    outputdata OUT;

    OUT.pos = mul(mx_modelview_proj, position);
    OUT.texcoord = texcoord;
    OUT.light_color = light_color;
    OUT.light_ambient = light_ambient;
    OUT.light_specular = light_specular;

    float3 tang = normalize(tangent);
    float3 binorm = normalize(binormal);

    float3x3 rotation = float3x3(tang, binorm, normal);

    OUT.normal = normalize(mul(rotation, normal));
    OUT.eye_pos = mul(rotation, eye_pos);
    OUT.light_pos = mul(rotation, light_pos);
    OUT.obj_pos = mul(rotation, position.xyz);

    return OUT;
}
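One property worth noting about the transform above: diffuse lighting only survives the change of basis if the matrix built from (tangent, binormal, normal) is a pure rotation, i.e. all three rows are unit length and mutually perpendicular, in which case dot(N, L) is unchanged. A small CPU-side check of that invariant (the `vec3` helpers are invented for illustration, not taken from the poster's code):

```c
typedef struct { float x, y, z; } vec3;

static float dot3(vec3 a, vec3 b)
{
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

/* Same row layout as mul(float3x3(tang, binorm, norm), v) in Cg:
   each output component is the dot of one matrix row with v. */
static vec3 mul_rows(vec3 row0, vec3 row1, vec3 row2, vec3 v)
{
    vec3 out = { dot3(row0, v), dot3(row1, v), dot3(row2, v) };
    return out;
}
```

With orthonormal rows, dot products (and hence the N·L term) are preserved; if any row is scaled, the results are scaled too, which would brighten or darken polygons.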


fragment shader (should do simple diffuse lighting):

struct inputdata
{
    float4 pos : POSITION;
    float4 color : COLOR;
    float3 texcoord : TEXCOORD0;
    float3 normal : TEXCOORD1;
    float3 eye_pos : TEXCOORD2;
    float3 obj_pos : TEXCOORD3;
    float3 light_pos : TEXCOORD4;
    float3 light_color : TEXCOORD5;
    float3 light_ambient : TEXCOORD6;
    float3 light_specular : TEXCOORD7;
};

void main(inputdata IN,
          out float4 outcolor : COLOR,
          uniform float3 mat_e,
          uniform float3 mat_a,
          uniform float3 mat_d,
          uniform float3 mat_s,
          uniform float mat_shine,
          uniform float3 global_ambient,
          uniform sampler2D tex_main,
          uniform sampler2D tex_normal,
          uniform float4x4 mx_modelview_proj,
          uniform float4x4 mx_modelview_it)
{
    float3 texcol = tex2D(tex_main, IN.texcoord).rgb;
    float3 texnorm = tex2D(tex_normal, IN.texcoord).rgb;

    float3 norm = normalize(IN.normal);
    float3 light_dir = normalize(IN.light_pos - IN.obj_pos);
    float3 eye_dir = normalize(IN.eye_pos - IN.obj_pos);
    float light_dist = length(IN.light_pos - IN.obj_pos);
    float3 half_angle = normalize(light_dir + eye_dir);

    float3 ambient = mat_a * global_ambient;
    float3 emissive = mat_e;

    float3 diffuse = texcol * IN.light_color * max(dot(norm, light_dir), 0);

    outcolor.rgb = ambient + emissive + diffuse;
    outcolor.a = 1;
}
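For reference, the diffuse term at the end of the shader above is plain Lambertian N·L; stripped of the texture lookup it reduces to the following (illustrative helper, not the poster's code):

```c
typedef struct { float x, y, z; } vec3;

/* Lambertian diffuse factor: cosine of the angle between the unit surface
   normal and the unit direction to the light, clamped at zero so that
   surfaces facing away from the light go dark instead of negative. */
static float diffuse_factor(vec3 n, vec3 light_dir)
{
    float d = n.x * light_dir.x + n.y * light_dir.y + n.z * light_dir.z;
    return (d > 0.0f) ? d : 0.0f;
}
```

Both inputs must be normalized and in the same space, which is exactly what the tangent-space conversion is supposed to guarantee.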

I guess you are creating your vertex tangents/binormals by averaging those from the adjacent faces, right?
That way, tangents and binormals from oppositely mapped faces will cancel each other out, end up with zero length, and result in wrong shading.

I think the only ways around this are:
a) don't create texture mappings where this occurs
b) duplicate the vertices at these points

There's a tool (with source) at developer.nvidia.com that creates the proper tangent bases for the vertices (NVMeshMender).
Maybe this helps as well.

Regards,

Jan

Your method seems fine to me; it should work. Probably there is some small mistake in the way you compute the TBN matrix; check that again.
Just some other ideas:
"multiply the light_pos, eye_pos and polygon_pos by this matrix"
I'm a little confused by that. light_pos and polygon_pos (you mean vertex_pos, right?) are probably in eye space (I assume you use your modelview matrix to do the rotation). So eye_pos is always (0,0,0). Or am I getting something wrong?

Also, when you say "ignore the normal map", this means that you always use (0,0,1) as the normal, right? Not an interpolated normal.

[edited by - mikeman on June 11, 2004 2:59:41 PM]

quote:
Original post by jeickmann
There's a tool (with source) at developer.nvidia.com that creates the proper tangent bases for the vertices (NVMeshMender).


MeshMender uses a similar method to the one he is already using. Plus, it has problems with mirrored texture coordinates/mappings.



You should never let your fears become the boundaries of your dreams.

Yes, I just checked my tangents and binormals and they were slightly wrong, so I've fixed that now. But here's another problem:

to better visualise what the light source is doing I put the following at the end of the fragment shader:

if(light_dist < 500)
{
    outcolor.r = 1;
}
else
{
    outcolor.r = 0;
}


This should make a red circle when the light gets close to a surface since it is a point light.

Using my previously posted vertex shader, which turns everything into tangent space, the resulting picture is wrong:
http://www.alexmac.cc/~alex/tangent-space.png

by switching the bit of the vertex shader that says:

OUT.normal = normalize(mul(rotation, normal));
OUT.eye_pos = mul(rotation, eye_pos);
OUT.light_pos = mul(rotation, light_pos);
OUT.obj_pos = mul(rotation, position.xyz);


so that it says

OUT.normal = normal;
OUT.eye_pos = eye_pos;
OUT.light_pos = light_pos;
OUT.obj_pos.xyz = position.xyz;


it leaves the coordinates in object space and the result is correct:
http://www.alexmac.cc/~alex/object-space.png




All the surrounding polygons have different tangent and binormal vectors, but since I rotate everything (light, camera and polygon) by the same matrix, it really shouldn't make any difference to anything... at least that's how I see it.

The only thing I can think of is that maybe the matrix made up from the tangent, binormal and normal vectors is doing more than just rotation, and is also doing some scaling or shearing somehow...


[edited by - alexmac on June 11, 2004 4:15:49 PM]

Hmm... should the tangent and binormal vectors be calculated per vertex or per polygon? Currently I have them per polygon, since I don't understand how they could be different at each vertex... surely normal mapping doesn't work if you stretch the texture in weird ways such that you would require per-vertex tangent and binormal vectors...

Also, I did a quick test where I set the fragment color to (0.5 + length(vertexpos) - length(rotated_vertexpos)) so that it would show the difference between the original object-space vertex and the tangent-space vertex after it had been rotated. If my TBN matrix were correct, then surely they would be the same length after rotation, and hence the color would be set to 0.5 (gray). This was not the case, as can be seen in this picture:

http://www.alexmac.cc/~alex/vector_length.png

