# Baking normal maps from models


## Recommended Posts

Short version: I'm trying to bake a normal map from a UV map with displacement values for each texel. I can quite easily calculate the normal of each texel in 'texture space', however it seems (I may be wrong) that I need to rotate this normal to get it from texture space to tangent space. What is a good way of doing this? :wacko:

....

Having got normal mapping to seemingly render OK with a few models, I now find myself having to understand a bit better how it works, as getting the combination of rendering and baking with mirrored UVs right is not super easy!

My shaders are doing the usual kind of thing, I believe: constructing a TBN matrix and using this to transform the light vector passed to the fragment shader, then doing a dot product against the stored normal. I am testing for mirrored faces and flipping the bitangent (or other flipping methods), as seems to be suggested for dealing with mirrored UVs.
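For reference, the per-triangle tangent/bitangent computation I mean is essentially the standard one (Lengyel-style), including the handedness sign that tells you when the UVs are mirrored and the bitangent needs flipping. A minimal Python/NumPy sketch (function and argument names are purely illustrative, not from any engine):

```python
import numpy as np

def triangle_tangent_basis(p0, p1, p2, uv0, uv1, uv2, normal):
    """Per-triangle tangent/bitangent from positions and UVs.

    Returns (tangent, bitangent, handedness). handedness is -1.0 when
    the UVs are mirrored -- exactly the case where the bitangent must
    be flipped in the shader."""
    e1, e2 = p1 - p0, p2 - p0                       # position edges
    du1, dv1 = uv1[0] - uv0[0], uv1[1] - uv0[1]     # UV edges
    du2, dv2 = uv2[0] - uv0[0], uv2[1] - uv0[1]
    # The UV determinant changes sign when the island is mirrored.
    r = 1.0 / (du1 * dv2 - du2 * dv1)
    tangent = (e1 * dv2 - e2 * dv1) * r
    bitangent = (e2 * du1 - e1 * du2) * r
    # Gram-Schmidt: make the tangent orthogonal to the vertex normal.
    t = tangent - normal * np.dot(normal, tangent)
    t /= np.linalg.norm(t)
    # Handedness: does N x T agree with the computed bitangent?
    w = 1.0 if np.dot(np.cross(normal, t), bitangent) > 0.0 else -1.0
    b = np.cross(normal, t) * w
    return t, b, w
```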

It seems to render fine with pre-made normal maps. But now I'm trying to bake my own, from a displacement map stored per texel.

I had foolishly assumed from the confusing documents on the interwebs that I could just calculate the normal relative to the texture (i.e. in UV space: up is green, down is lack of green, right is red, left is lack of red, etc.), and I was hoping the tangent space would somehow deal with the orientation. But after a bit of frustration, I'm getting the impression that the orientation per texel has to be calculated on the basis of the orientation of the model and UV space (the tangent basis, I think, is the right terminology?). So when you rotate a UV island, you also have to rotate the texel normal vector...

Anyway, here is a pic illustrating the type of problem. Here I have drawn a single line across between two UV islands (and it is mirrored on the left and right sides). The tangent basis seems to be different between the two, despite the flipping, resulting in the shading being on opposite sides of the right-hand line. (Incidentally, red is the normal, green is the tangent and blue is the bitangent.)

And here is the normal map. I'm currently just using a Sobel filter to get the normal from the displacement of neighbouring texels:
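For anyone curious, this is roughly the kind of Sobel bake I mean, as a Python/NumPy sketch. The sign conventions are an assumption (I've written it green-up, OpenGL-style); flip signs to match your engine:

```python
import numpy as np

def sobel_normal_map(height, strength=1.0):
    """Texture-space normal map from a heightfield using 3x3 Sobel filters.

    Convention (an assumption -- adjust to taste): red = +u (texture
    right), green = +v (texture up), blue = out of the texture. Rows of
    `height` run top-to-bottom, as in an image."""
    h = np.pad(height.astype(np.float64), 1, mode='edge')
    # Sobel kernels for the horizontal and vertical height gradients.
    gx = ((h[:-2, 2:] - h[:-2, :-2])
          + 2.0 * (h[1:-1, 2:] - h[1:-1, :-2])
          + (h[2:, 2:] - h[2:, :-2]))
    gy = ((h[2:, :-2] - h[:-2, :-2])
          + 2.0 * (h[2:, 1:-1] - h[:-2, 1:-1])
          + (h[2:, 2:] - h[:-2, 2:]))
    # Normal of z = h(u, v) is proportional to (-dh/du, -dh/dv, 1).
    # gy is measured down the image, so for a green-up map the sign
    # flip is already built in.
    n = np.dstack([-gx * strength, gy * strength, np.ones_like(gx)])
    n /= np.linalg.norm(n, axis=2, keepdims=True)
    return n  # remap with n * 0.5 + 0.5 before writing to 8-bit RGB
```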

I've found an article here:

http://http.developer.nvidia.com/GPUGems3/gpugems3_ch22.html

suggesting to convert a normal from world space to tangent space:

```hlsl
// Project the world-space normal onto each basis vector to
// express it in tangent space.
float3 normalTS;
normalTS.x = dot(normal, Input.binormal);
normalTS.y = dot(normal, Input.tangent);
normalTS.z = dot(normal, Input.normal);
normalTS = normalize(normalTS);
// Remap from [-1, 1] to [0, 1] for storage in an RGB texture.
return float4(normalTS * 0.5 + 0.5, 1);
```


They are using this for a high-poly to low-poly bake. However, given that I'm just trying to go from a displacement map, is there a simpler calculation I can use? Can I simply twist the r,g,b normal vector to get the correct tangent basis, and if so, any ideas how?

I know I should be able to do it the long way: compute the world-space position of each texel using e.g. barycentric coords, push it out along the normal by the displacement, then use the nvidia formula. But this seems overly complex and inefficient... :blink:
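The per-texel part of that long way would look something like this (illustrative Python sketch; `displaced_texel_position` is a made-up helper name, and the nvidia tangent-space projection would follow it):

```python
import numpy as np

def displaced_texel_position(bary, verts, normals, disp):
    """World-space position of one texel for the 'long way' bake:
    interpolate position and normal with barycentric weights, then
    push out along the normal by the sampled displacement.

    bary:    (w0, w1, w2) barycentric weights, summing to 1
    verts:   3x3 array of triangle vertex positions
    normals: 3x3 array of vertex normals
    disp:    displacement value sampled at this texel"""
    w = np.asarray(bary, dtype=np.float64)
    p = w @ np.asarray(verts, dtype=np.float64)    # interpolated position
    n = w @ np.asarray(normals, dtype=np.float64)  # interpolated normal
    n /= np.linalg.norm(n)
    return p + n * disp
```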

Edited by lawnjelly

##### Share on other sites

OK, things are getting more interesting. I've loaded the baked normal map that is not rendering correctly in my shader, and it renders correctly in Blender. This suggests it is simply a problem with my shader / tangent calculations with mirrored UVs, and that my original assumption may be correct:

That you can calculate normals directly from a displacement / heightfield in TEXTURE SPACE and have them render correctly over a model (even with rotated UV islands, because the tangent space takes care of the rotation). Can anyone more mathematically minded confirm this? :unsure:

##### Share on other sites

I *think* I have answered my own question. :) I figured out that Blender displayed the normal maps correctly in normal mode, but not in edit mode. Then an online viewer did not display them correctly, depending on the RGB polarity. It turned out the answer may simply be that the shader needed the normal map's Y flipping, and it seems to be working now. It does look better at some angles than others, but I guess that is always the case with normal mapping.
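For reference, the Y flip is just inverting the green channel of the stored map, which is equivalent to negating y on the decoded normal. A tiny illustrative Python sketch:

```python
import numpy as np

def flip_normal_map_y(rgb):
    """Swap between green-up (OpenGL-style) and green-down
    (DirectX-style) normal maps by inverting the green channel of
    an 8-bit RGB image."""
    out = rgb.copy()
    out[..., 1] = 255 - out[..., 1]
    return out
```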

So my tentative conclusion is that the tangent space conversion should automagically deal with UV island rotation. I hope that is the case anyway.

I need to test the robustness with different models, and there are also a few issues where I need to split verts at seams, but that should not be too bad.