# Compressed tangent space vectors


## Recommended Posts

Hi, is it possible to store tangent space vectors in textures instead of the vertex buffer? Normalized vectors always lie in the interval [-1, 1], so couldn't this interval be mapped to the range [0, 255]?
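The mapping described above can be sketched as follows (an illustrative Python snippet; the helper names `encode_unit` and `decode_unit` are mine, not from any API). A shader would undo the encoding with `n * 2.0 - 1.0`:

```python
def encode_unit(x: float) -> int:
    """Map a component in [-1, 1] to a byte in [0, 255]."""
    return round((x * 0.5 + 0.5) * 255.0)

def decode_unit(b: int) -> float:
    """Map a byte in [0, 255] back to approximately [-1, 1]."""
    return (b / 255.0) * 2.0 - 1.0

# Example: encode a unit tangent, then recover it with quantization error.
tangent = (0.707107, 0.0, -0.707107)
encoded = tuple(encode_unit(c) for c in tangent)   # bytes suitable for an RGB texel
decoded = tuple(decode_unit(b) for b in encoded)   # within ~1/255 of the original
```

The round trip loses at most about 1/255 per component, which is the precision cost mentioned in the replies below.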

##### Share on other sites
In theory you can *always* store per-vertex data in a texture map.

This is especially true nowadays, since you are no longer forced to store data in the classic RGBA format.

In practice I wonder why you would want a "tangent map", since compared to per-vertex interpolation it has some weaknesses:

- higher bandwidth needed
- higher video memory usage
- less precision
- needs mipmapping

##### Share on other sites
You could also store the whole tangent space (tangent, bitangent, and normal) as a single quaternion (x, y, z, w) and regenerate the three vectors from it: a bit more computational work, a bit less storage.

I wouldn't suggest storing it in a texture in general, either.
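The quaternion idea above can be sketched like this (an illustrative Python snippet; the function names are mine): the tangent, bitangent, and normal are recovered by rotating the standard basis vectors by the stored unit quaternion. Note that a single quaternion cannot represent a mirrored (left-handed) UV basis, so implementations that use this trick usually reserve the sign of w or an extra bit for handedness.

```python
def rotate(q, v):
    """Rotate vector v by unit quaternion q = (x, y, z, w)."""
    qx, qy, qz, qw = q
    # t = 2 * cross(q.xyz, v)
    tx = 2.0 * (qy * v[2] - qz * v[1])
    ty = 2.0 * (qz * v[0] - qx * v[2])
    tz = 2.0 * (qx * v[1] - qy * v[0])
    # v' = v + w * t + cross(q.xyz, t)
    return (
        v[0] + qw * tx + (qy * tz - qz * ty),
        v[1] + qw * ty + (qz * tx - qx * tz),
        v[2] + qw * tz + (qx * ty - qy * tx),
    )

def tbn_from_quaternion(q):
    """Tangent, bitangent, normal as the rotated standard basis."""
    tangent   = rotate(q, (1.0, 0.0, 0.0))
    bitangent = rotate(q, (0.0, 1.0, 0.0))
    normal    = rotate(q, (0.0, 0.0, 1.0))
    return tangent, bitangent, normal
```

The identity quaternion (0, 0, 0, 1) yields the standard basis unchanged; any other unit quaternion yields an orthonormal, right-handed frame.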

##### Share on other sites
I'm rendering my terrain with a single shared vertex buffer. (This is possible when using a vertex texture for the height values.)

So if I need additional vertex data (like tangents), I need to use multistreaming, because the vertex position is calculated in the shader.

I'm trying to avoid extra vertex streams.

@davepermen
How do I generate the tangent space vectors from a quaternion?

##### Share on other sites
What exactly do you need the tangent space vectors for? If it's just for bump mapping the terrain, then maybe you can get away without computing them at all.

##### Share on other sites
Do you have a normal vector? I would start from that, because you will need it for simple shading anyway.

If not, you could compute it from the heightmap in the vertex shader, or you could use a vertex normal map the same way you are already using the heightmap.

With the normal you can easily compute the tangent vector, as long as the terrain is a regular grid. I guess you are computing the texture coordinates from the vertex position too.

##### Share on other sites
Yes, I need tangents just for normal mapping. I've tried to do normal mapping without computing tangents, but I'm getting weird results. (I need more details on how to do this correctly.)

I'm computing vertex normals in the vertex shader; I think this is very easy and saves memory. (So I'm not using a vertex normal map.)
Yes, the texture coordinates are computed from the vertex position.

[Edited by - Hiyar on December 21, 2008 12:00:29 PM]

##### Share on other sites
You only need tangent space when you are working with deformable models (generally skinned characters); otherwise, use object space. There is no need to store the binormal and tangent, so you save space, and no need for the extra transform from tangent space to object space.

##### Share on other sites
The normals in a normal map are relative to texture space, while the light vector is in world space. For correct lighting I need to transform the light vector into tangent space. I'm not using one global normal map for the entire terrain; the normal maps I'm using are for detail textures. So how do I use object space for detail normal mapping?

(For distant lighting I'm using the normals computed in the vertex shader.)

##### Share on other sites

The tangent and binormal can be calculated like this:

```hlsl
Tangent  = cross(float3(0, 1, 0), Normal);
Binormal = cross(Normal, Tangent);
```

Maybe you will need to invert a vector, but that should be it.
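As a sanity check, here is the same construction in Python (illustrative only; the function name `tangent_basis` is mine). It also shows the caveat: the construction degenerates when the normal is parallel to (0, 1, 0), which is fine for a z-up heightfield whose normals stay near +z.

```python
def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def normalize(v):
    length = (v[0] ** 2 + v[1] ** 2 + v[2] ** 2) ** 0.5
    return (v[0] / length, v[1] / length, v[2] / length)

def tangent_basis(normal):
    """Tangent/binormal from the normal, as in the post above.
    Degenerates (zero-length cross) if normal is parallel to (0, 1, 0)."""
    tangent = normalize(cross((0.0, 1.0, 0.0), normal))
    binormal = normalize(cross(normal, tangent))
    return tangent, binormal
```

Both resulting vectors are orthogonal to the normal, so together they form an orthonormal frame; whether you need to flip one of them depends on your UV projection and handedness, as noted above.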

##### Share on other sites
Why the cross product of (0, 1, 0) and the normal?

Maybe it is possible to compute the tangent space in the pixel shader, like so:

```hlsl
inNormal = normalize(inNormal);
float3 normalSample = tex2D(normalMap, tex0).rgb;
float3 normal = normalSample * 2.0f - 1.0f;

float3 q0 = ddx(pos);
float3 q1 = ddy(pos);
float2 st0 = ddx(tex0);
float2 st1 = ddy(tex0);

float3 T = normalize( q0 * st1.y - q1 * st0.y);
float3 B = normalize(-q0 * st1.x + q1 * st0.x);

float3x3 TBN;
TBN[0] = T;
TBN[1] = B;
TBN[2] = inNormal;

normal = mul(normal, TBN);
```
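For reference, the ddx/ddy code above is the screen-space form of the standard per-triangle tangent computation from position and UV deltas. A sketch of that per-triangle math in Python (illustrative; the function name is mine):

```python
def triangle_tangent(p0, p1, p2, uv0, uv1, uv2):
    """Unnormalized tangent/bitangent of one triangle from its position
    and texture-coordinate deltas (the same math the ddx/ddy trick
    applies to screen-space derivatives)."""
    e1 = tuple(b - a for a, b in zip(p0, p1))
    e2 = tuple(b - a for a, b in zip(p0, p2))
    du1, dv1 = uv1[0] - uv0[0], uv1[1] - uv0[1]
    du2, dv2 = uv2[0] - uv0[0], uv2[1] - uv0[1]
    det = du1 * dv2 - du2 * dv1  # zero for a degenerate UV mapping
    tangent   = tuple((e1[i] * dv2 - e2[i] * dv1) / det for i in range(3))
    bitangent = tuple((e2[i] * du1 - e1[i] * du2) / det for i in range(3))
    return tangent, bitangent
```

For an axis-aligned triangle whose UVs match its x/y positions, this returns the x and y axes, as expected.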

I'm wondering if I can use the vertex normal for this approach. I'm computing the vertex normal by sampling four neighboring elevations in the vertex texture and then just doing this:

```hlsl
float3 n;
n.x = h3 - h4;
n.y = h1 - h2;
n.z = g_NormalScale;

return normalize(n);
```

Do I need to transform the normal and the light vector in the vertex shader, and if so, into which space?

##### Share on other sites
Quote:
 Original post by Lui
 Have you solved it already? The tangent and binormal can be calculated like this:
 Tangent = cross(float3(0, 1, 0), Normal);
 Binormal = cross(Normal, Tangent);
 Maybe you will need to invert a vector, but that should be it.

This may be incorrect because it makes a lot of assumptions. Computing tangents can be a pain, but if you understand exactly what the tangents are, there is a chance you can take some shortcuts (like Lui's code does).

Somehow you know the normal (I'd be really interested to know how you're doing this inside a vertex shader). Assuming your terrain is an x/z plane and you're projecting u,v coordinates with u on the x axis and v on the z axis, you can find the tangent by taking the cross product of the normal and unit z. Similarly you can get the bitangent from the cross product of the normal and unit x.

Of course, this is just the concept. You'll have to work out the proper cross product order and possibly some negations based on your exact projection and handedness.

##### Share on other sites
Quote:
 Original post by realkiran
 Somehow you know the normal (I'd be really interested to know how you're doing this inside a vertex shader).

Like I said in the post above: four neighboring height values can be sampled in the vertex shader, and then this formula is used:

n = (west - east, 2, south - north);

But I don't know whether this normal vector can be used in the pixel shader code above (the one using partial derivatives). That is my question in the earlier post.
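For illustration, the formula from this post can be sketched in Python as follows (assuming a y-up heightfield sampled on a unit grid; the function name and `[row][col]` indexing convention are mine):

```python
def heightmap_normal(height, x, y):
    """Central-difference normal, n = (west - east, 2, south - north),
    normalized, for a y-up heightfield. `height` is indexed [y][x]
    and the constant 2 corresponds to twice the grid spacing."""
    west, east = height[y][x - 1], height[y][x + 1]
    south, north = height[y - 1][x], height[y + 1][x]
    n = (west - east, 2.0, south - north)
    length = (n[0] ** 2 + n[1] ** 2 + n[2] ** 2) ** 0.5
    return (n[0] / length, n[1] / length, n[2] / length)
```

On flat terrain this yields straight-up (0, 1, 0); where the ground rises toward +x, the normal leans toward -x, as expected.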