# Trying to understand normalize() and tex2D() with normal maps


## Recommended Posts

Hello

I'm learning the normal mapping process in HLSL and am a little confused by an example in the book I'm learning from ("Introduction to Shader Programming" by Pope Kim).

What confuses me is:

```hlsl
float3 tangentNormal = tex2D(NormalSampler, Input.mUV).xyz;
tangentNormal = normalize(tangentNormal * 2 - 1);
```

I don't know why we are normalizing tangentNormal... I thought that normalize() "converted" units into a value within a 0 to 1 range? Now that I read that back it sounds wildly inaccurate...

I'm also wondering if anyone can describe what exactly tex2D(sampler, uv) is doing? (the more detail the better)
- all I know right now is that I can supply tex2D an image and it will apply that image to a 0 to 1 texture coordinate

Thanks for any help or clarification
Edited by digs

##### Share on other sites
Thanks! I think I understand that better now

```hlsl
float3 tangentNormal = tex2D(NormalSampler, Input.mUV).xyz;
tangentNormal = normalize(tangentNormal * 2 - 1);
```

tex2D() samples an RGB color value from the NormalSampler at the current texel address ...

This means that if that texture were a 10px by 10px image, at some point tex2D() would return the color value of the texel at UV (0.7, 0.4) -- we store the RGB value of that texel in the tangentNormal vector because this is the normal direction we want
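
Those two lines can be sketched in plain Python to show just the math (a hedged illustration: tex2D, samplers, and hardware filtering are not modeled here, and decode_normal is a made-up helper name):

```python
import math

def decode_normal(rgb):
    # rgb plays the role of the texel color tex2D returns: each channel in [0, 1].
    # The * 2 - 1 step remaps each channel from [0, 1] into [-1, 1].
    x, y, z = (2.0 * c - 1.0 for c in rgb)
    # normalize() divides the vector by its own length so the result is unit length
    length = math.sqrt(x * x + y * y + z * z)
    return (x / length, y / length, z / length)

# The common normal-map color (0.5, 0.5, 1.0) decodes to the
# "straight out of the surface" tangent-space normal (0, 0, 1)
print(decode_normal((0.5, 0.5, 1.0)))  # -> (0.0, 0.0, 1.0)
```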

"fetching right in the middle between ( -0.7071, 0.7071, 0 ) and ( 0.7071, 0.7071, 0 ) which are both unit length vectors will result in the interpolated vector ( 0, 0.7071, 0 ); which is not unit length"

why is ( 0, 0.7071, 0 ) not unit length? Is it because this value is < 1?
Edited by digs

##### Share on other sites

"Unit length" means x * x + y * y + z * z = 1.  It might help if you try to visualize this as points on a sphere of radius 1.  When you do linear interpolation between 2 such points you're actually taking a straight line between them, not a curve, and hence the distance from the center to the interpolated point is no longer 1.
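
The quoted example can be checked numerically; here is a small plain-Python sketch (not shader code) using the same two vectors:

```python
import math

def length(v):
    return math.sqrt(sum(c * c for c in v))

def lerp(a, b, t):
    # Straight-line (linear) interpolation between two points
    return tuple(a_i + (b_i - a_i) * t for a_i, b_i in zip(a, b))

a = (-0.7071, 0.7071, 0.0)  # unit length (on the radius-1 sphere)
b = ( 0.7071, 0.7071, 0.0)  # unit length
mid = lerp(a, b, 0.5)       # the straight line cuts *inside* the sphere
print(mid, length(mid))     # (0.0, 0.7071, 0.0), length ~0.7071 -- not 1
```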

##### Share on other sites

I still don't understand why it's necessary to normalize, but that's probably because I'm missing some basic theory?

For example, I don't understand why a normal's length is relevant when calculating lighting. To me it makes sense that the position and direction of that normal are the important factors.

also, can anyone provide an example of a time when you might not want to normalize something?

Edited by digs

##### Share on other sites

> I still don't understand why it's necessary to normalize, but that's probably because I'm missing some basic theory perhaps?
>
> For example, I don't understand why a normal's length is relevant when calculating lighting. To me it makes sense that the position and direction of that normal are the important factors

A normal is a direction only.

Therefore its length isn't relevant, which is exactly why you normalize it: fixing the length to 1 keeps it from influencing your results.

If you didn't normalize it, the length of the vector would make calculations with it more tricky.

The main reason I think is the dot product.

If you have vectors A and B, with length |A| and |B|, the dot product is equivalent to |A|*|B|*cos(angle).

So if both vectors have length one, the dot product is 1*1*cos(angle) = cos(angle).

This means if you want to use the dot product to find the angle between two vectors, you have to use normalized vectors.

In lighting, cos(angle) is also a perfect number to use to decide how bright something is depending on the angle of the incoming light. (L dot N)
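
That identity can be sketched in plain Python (illustrative only, not shader code): with unit-length vectors, dot() hands you cos(angle) without any trig call.

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

n = (0.0, 1.0, 0.0)  # unit-length surface normal
# unit-length light direction, 45 degrees away from the normal
l = (math.sin(math.pi / 4), math.cos(math.pi / 4), 0.0)

# Because |n| = |l| = 1, dot(n, l) = 1 * 1 * cos(angle) = cos(45 deg)
print(dot(n, l))                # ~0.7071
print(math.cos(math.pi / 4))    # same value, computed the expensive way
```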

Edited by Olof Hedman

##### Share on other sites

Right! Ok, so... cos() is more expensive than dot(); we normalize the vectors so we can use dot() instead of doing the more expensive cos() operation (if I'm remembering correctly)

One of the reasons I think I'm getting so mixed up is that I often see variables being normalized in the vertex shader before being sent to the pixel shader, yet other times they are normalized in the pixel shader. However, now that I think about it more, the only time I normalize a value in the vertex shader (or pixel shader) is when that variable is being used in a calculation (like dot)!

I think another thing that's tripping me up is I'm not at all sure what's happening when my data is sent to the rasterizer... I know that it is linearly interpolated across the triangle, but I really don't know what that means (or perhaps more accurately, I can't picture what's happening with the data during this stage)
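
One way to picture "linearly interpolated across the triangle" is as a weighted average: conceptually, every pixel inside the triangle gets a blend of the three vertex values, with weights (barycentric coordinates) that sum to 1. A plain-Python sketch of that idea (a conceptual model, not the actual hardware):

```python
import math

def bary_interp(v0, v1, v2, w0, w1, w2):
    # Conceptual model of what the rasterizer does for each vertex attribute:
    # a weighted average of the three vertex values, where w0 + w1 + w2 == 1
    return tuple(w0 * a + w1 * b + w2 * c for a, b, c in zip(v0, v1, v2))

def length(v):
    return math.sqrt(sum(c * c for c in v))

# Three unit-length vertex normals of one triangle
n0, n1, n2 = (1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)

# A pixel near the triangle's center gets roughly equal weights
n = bary_interp(n0, n1, n2, 1/3, 1/3, 1/3)
print(length(n))  # ~0.577 -- shorter than 1, which is why the pixel shader re-normalizes
```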

Thanks for all the help everyone, it has cleared some roadblocks I hit; maybe I should research what happens during the rasterization stage

Edited by digs
