[Resolved]Bumpmapping, calculating tangents


For bumpmapping, it seems I can't do it without moving everything to texture (tangent) space, as N.L lighting modified to N.(BumpMap.L) doesn't work properly. So, given only vertices, how can I calculate the tangents? I am not using meshes; I am using MDX's CustomVertex.PositionNormalTextured. If possible, I would like to calculate the tangent in the HLSL shader; if not, I can store the value and pass it to the HLSL shader at runtime. [Edited by - CadeF on October 8, 2005 8:26:00 AM]

You will need another vector in your vertex format, because the normal alone isn't enough: you need at least two of the three basis vectors to define tangent space. Google for "tangent basis construction" or "tangent space calculation" and you'll get lots of good results. If you send in the normal and tangent, you can create the binormal with a cross product.

Typically, you don't want to take the eye into tangent space; instead, you want to take the tangent space normal (from the normal map) into object space; you can do that by multiplying by the inverse of the tangent space matrix. The inverse is typically the transpose (for unity scale without shear). That, in turn, means that three DP3 instructions turn instead into three MAD instructions.
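To make that transpose trick concrete, here's a minimal sketch (in Python rather than shader code, with illustrative helper names) of taking a normal-map sample into object space. Each component of the sample scales one basis vector, which is exactly what the three MAD instructions compute:

```python
# Sketch: tangent-space normal -> object space, assuming an
# orthonormal TBN basis (unit length, no shear), so the inverse
# of the TBN matrix is its transpose.

def tangent_to_object(n_ts, T, B, N):
    """Each component of the sampled normal scales one basis
    vector: three multiply-adds per output component."""
    return tuple(n_ts[0] * T[i] + n_ts[1] * B[i] + n_ts[2] * N[i]
                 for i in range(3))

# Flat quad facing +Z with standard UVs: TBN is the identity basis,
# so an unperturbed normal-map sample (0,0,1) maps to (0,0,1).
T, B, N = (1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)
print(tangent_to_object((0.0, 0.0, 1.0), T, B, N))  # (0.0, 0.0, 1.0)
```

In HLSL this is typically written as a mul against a float3x3 built from the interpolated tangent, binormal, and normal.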

This is the standard answer given when people ask about tangents. NVIDIA has a tool called NVMeshMender which calculates tangents. You can also load a mesh as a D3DXMesh and ask D3DX to compute tangents for you (D3DXComputeTangent).

Tangents are based on the mesh positions, normals, and the change of the UVs across each face. When a tangent algorithm asks for UVs, give it the UVs you'll use to sample your normal map.
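As a sketch of what those algorithms do per triangle, here is the usual construction (plain Python; function names are illustrative, not from any library). It solves for the object-space directions along which U and V increase across the face:

```python
# Sketch: per-triangle tangent/bitangent from positions and UVs,
# the standard "tangent basis construction" the post suggests googling.

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def triangle_tangent(p0, p1, p2, uv0, uv1, uv2):
    """Return (tangent, bitangent): the object-space directions in
    which U and V increase across this triangle."""
    e1, e2 = sub(p1, p0), sub(p2, p0)              # position edges
    du1, dv1 = uv1[0] - uv0[0], uv1[1] - uv0[1]    # UV edge 1
    du2, dv2 = uv2[0] - uv0[0], uv2[1] - uv0[1]    # UV edge 2
    r = 1.0 / (du1 * dv2 - du2 * dv1)              # inverse UV determinant
    tangent   = tuple(r * (dv2 * a - dv1 * b) for a, b in zip(e1, e2))
    bitangent = tuple(r * (du1 * b - du2 * a) for a, b in zip(e1, e2))
    return tangent, bitangent

# Corner of a quad facing +Z with standard UV mapping:
t, b = triangle_tangent((0,0,0), (1,0,0), (0,1,0), (0,0), (1,0), (0,1))
print(t, b)  # (1.0, 0.0, 0.0) (0.0, 1.0, 0.0)
```

Per-vertex tangents are then typically averaged over each vertex's adjacent triangles and orthonormalized against the vertex normal.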

So, a tangent is a line from the corners of a texture, pointing in the direction of the UV?

0,0 0,1
|X |
| |
1,0 1,1

So, from X, the tangent goes down-right?

Edit: I'm not using meshes, I'm using lit vertices with normals and textures. I can pass in the tangent via an HLSL shader's input.

Edit again: Since I am using Unreal's .t3d map format, I think tangents are in it. For example:
Origin -00272.614380,-00609.927612,+00312.000000
Normal +00000.923880,-00000.382683,+00000.000000
TextureU -00000.377476,-00000.911309,-00000.164399
TextureV -00000.062913,-00000.151885,+00000.986394
Vertex -00266.677765,-00595.595337,+00343.026306
Vertex -00272.614380,-00609.927612,+00312.000000
Vertex -00272.614410,-00609.927612,+00248.000000
Vertex -00266.677765,-00595.595337,+00216.973694

Are TextureU and TextureV the tangents? Because I have to make the UV coordinates for each vertex from them. Or are they different for each vertex?

[Edited by - CadeF on October 7, 2005 10:57:21 PM]

The vectors supplied are probably UV mapping axes. You can make tangents using the UVs, positions, and normals. I'll try to explain what tangent vectors are, what they're for, and how the math works out.

There are two vectors, tangent, and bitangent (or binormal, the industry can't decide). Both of these vectors are at 90 degrees from the vertex normal. The tangent, bitangent, and normal all change per vertex and represent the "tangent space" basis vectors. It's like a local X axis, Y axis, and Z axis. What that means is, however your texture is mapped onto your mesh, tangent points along the U coordinate, bitangent points along the V coordinate, and your normal points directly out of your texture.

tangent = change in U in object space.
bitangent = change in V in object space.
normal = away from surface in object space.

Since bitangent is at 90 degrees to the tangent and the normal, you often pass in just the tangent and do a cross product of the tangent and the normal in your shader to find the bitangent.
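That cross-product step is tiny; a sketch in plain Python (names are illustrative):

```python
# Sketch: reconstruct the bitangent from normal and tangent.
# cross(N, T) is perpendicular to both, completing the TBN basis.

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

# Flat quad facing +Z, tangent along +X:
N, T = (0.0, 0.0, 1.0), (1.0, 0.0, 0.0)
print(cross(N, T))  # (0.0, 1.0, 0.0) -- the bitangent along +V
```

If your UV mapping is mirrored, the reconstructed vector needs a sign flip; that handedness factor is often stored in the tangent's w component.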

Let's see how the math behind this works:

Imagine a bulge on a wall (heightmap row = 0 0 1 0 0).
Converting this to a normal map we'd have something like this:
(0,0,1) (-.707,0,.707) (0,0,1) (.707,0,.707) (0,0,1)
i.e., the surface briefly points left, then briefly points right. But that's left and right only in the texture. When that texture is mapped onto a mesh, who knows how it will be oriented. That's where tangents come in.
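For the curious, here is a rough sketch of how such a heightmap row becomes normals. The slope convention used (difference of the two neighbors, unaveraged) is just one choice among several, picked because it reproduces the numbers above:

```python
# Sketch: 1D heightmap row -> normals, one common convention.
import math

def heights_to_normals(h):
    """Slope per texel = right neighbor minus left neighbor (edges
    clamped); normal = normalize(-slope, 0, 1)."""
    normals = []
    for i in range(len(h)):
        left = h[max(i - 1, 0)]
        right = h[min(i + 1, len(h) - 1)]
        slope = float(right - left)
        inv = 1.0 / math.sqrt(slope * slope + 1.0)
        normals.append((-slope * inv, 0.0, inv))
    return normals

for n in heights_to_normals([0, 0, 1, 0, 0]):
    print(tuple(round(c, 3) + 0.0 for c in n))
# (0.0, 0.0, 1.0)
# (-0.707, 0.0, 0.707)
# (0.0, 0.0, 1.0)
# (0.707, 0.0, 0.707)
# (0.0, 0.0, 1.0)
```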

For a flat quad, with standard UV mapping, facing along Z, the tangent would be (1,0,0), bitangent would be (0,1,0), and your normal would be (0,0,1).

If our mesh had its UVs rotated 90 degrees, our tangent would become (0,1,0), our bitangent would become (-1,0,0), and our normal would remain (0,0,1).

Now lets say we have a light pointing along X (1,0,0).
For the first quad, our TBN matrix would be the identity, with rows (1,0,0), (0,1,0), (0,0,1). Transform our light vector by that and we get (1,0,0). Our light vector is now in tangent space. If you read the normal map and dot3 with this light vector, you'll get a bit of light on the bump.

For the second quad, our TBN matrix would have rows (0,1,0), (-1,0,0), (0,0,1). Transform our light vector by that and we get (0,-1,0): the light now lies entirely along the V axis (the sign depends on the bitangent's handedness). Our light vector is now in tangent space. Notice how, since we mapped the UVs at 90 degrees, the light got rotated to compensate. Our light will now produce no light when it hits that bump. The bump was a purely horizontal bump in the texture, but it was rotated to be a purely vertical bump by our UV mapping. A light shining along X won't light a surface that only varies in Y and Z.
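The two transforms above can be checked numerically; a sketch in plain Python (the sign of the V component depends on handedness conventions, but either way the dot product with the purely horizontal bump normal is zero):

```python
# Sketch: transforming a light into tangent space and lighting the
# bump's right-facing slope normal (.707, 0, .707) from the example.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def to_tangent_space(v, T, B, N):
    """Rows of the TBN matrix are T, B, N; multiplying by it
    projects v onto each basis vector."""
    return (dot(v, T), dot(v, B), dot(v, N))

light = (1.0, 0.0, 0.0)
bump = (0.707, 0.0, 0.707)  # right-facing slope of the bump

# First quad: identity basis, the light is unchanged and hits the slope.
l1 = to_tangent_space(light, (1, 0, 0), (0, 1, 0), (0, 0, 1))
print(l1, round(max(dot(l1, bump), 0.0), 3))  # (1.0, 0.0, 0.0) 0.707

# Second quad: UVs rotated 90 degrees, the light lands on the V axis
# and the horizontal bump receives nothing.
l2 = to_tangent_space(light, (0, 1, 0), (-1, 0, 0), (0, 0, 1))
print(l2, round(max(dot(l2, bump), 0.0), 3))  # (0.0, -1.0, 0.0) 0.0
```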

Thanks a lot for that post :)

But it seems I just needed to use the TextureU in the map as the tangent for each vertex in that poly, and the TextureV as the binormal. Oops

