
# Normals and camera/eye direction computations (HLSL)

### #1
Members - Reputation: **354**

Posted 05 April 2012 - 04:45 AM

I'm not experienced in vector maths related computations so I need some help here.

In the game engine I'm working with (xRay 1.0) I can sample the normal vector of a pixel in a pixel shader like this.

```hlsl
float3 N = tex2D(s_normal, texCoord);
```

However, the result does not seem to be the absolute surface normal; it depends on the camera/eye direction. For example, when I output the normal vector as a color, horizontal surfaces are green while the camera points straight ahead, but as soon as I tilt the camera toward the sky or the ground, the color changes to blue or red, respectively. This indicates that the "normal" N sampled above is relative to the camera direction rather than absolute. The same thing happens with vertical surfaces when I turn the camera left or right.

I have the camera/eye direction vector, so it should be possible to recover the absolute surface normal with some vector math, but I don't know which formula or HLSL function(s) to use. I tried several combinations of scalar multiplications, divisions, subtractions, and additions, as well as the dot and mul functions, but did not get a correct result.

So, how do I get the absolute surface normal independent from the camera direction out of the above values?

### #2
Crossbones+ - Reputation: **2337**

Posted 05 April 2012 - 05:38 AM

It is absolutely possible to calculate lighting in view space. There may even be some performance advantages.

Just transform the light position and direction vectors into view space, and the lighting math is exactly the same as if it were done in world space. The advantage is that the camera position is always at (0, 0, 0) in view space. Storing normals in view space is also useful in deferred shading implementations.

Practically, you can also transform the normals back to world space using the 3x3 part of the view matrix, but why would you?
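A minimal sketch of what view-space lighting looks like (the names `mView`, `LightPosWS`, `positionVS`, and `N` are illustrative assumptions, not engine-provided identifiers):

```hlsl
// View-space diffuse lighting sketch (identifiers are illustrative).
// LightPosWS is the light position in world space; mView transforms
// world space -> view space, so the light only has to be transformed
// once, typically on the CPU or in the vertex shader.
float3 lightPosVS = mul(float4(LightPosWS, 1.0), mView).xyz;

// In the pixel shader: positionVS and N are the pixel's view-space
// position and normal (e.g. read from G-buffer textures).
float3 L       = normalize(lightPosVS - positionVS);
float  diffuse = saturate(dot(N, L));

// The eye sits at the origin in view space, which simplifies speculars:
float3 V = normalize(-positionVS);
```

The point is that once every input is in the same space, the lighting equations are unchanged; view space just makes the eye position trivial.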

Cheers!

### #5
Crossbones+ - Reputation: **2337**

Posted 05 April 2012 - 12:30 PM

A simple normal transformation in HLSL goes something like

```hlsl
float3 WorldNormal = mul(ViewSpaceNormal, (float3x3)mView);
```

where ViewSpaceNormal contains the normal you sample from your normal texture, and mView holds a 4x4 matrix that transforms from view/camera space to world space (i.e. the inverse of the typical camera matrix, which transforms from world space to view space).

The WorldNormal can then be used for lighting in world space.

Still, I'd stay with view-space lighting. If vector/matrix multiplies give you a hard time, I'd spend a few hours studying them.

[edit] You'll also need to make sure that the normal you sample from the normal texture isn't packed.
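If the texture stores normals packed into the [0, 1] range (a common convention, though whether xRay's `s_normal` does this is an assumption), the usual unpacking step is:

```hlsl
// Remap a [0,1]-packed normal back to the [-1,1] range and renormalize.
float3 packedN = tex2D(s_normal, texCoord).xyz;
float3 N       = normalize(packedN * 2.0 - 1.0);
```

If the engine stores normals unpacked (signed formats), this step must be skipped, or the result will be wrong in a different way.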

### #6
Members - Reputation: **354**

Posted 06 April 2012 - 11:08 AM

Now there's one issue with this method: the results do not seem to be very precise. Verticals can easily be distinguished from horizontals now, but planes inclined at about 30-40 degrees still deliver a surface normal equal to that of an entirely flat (0 degrees) plane. Might some part of that formula be responsible for this lack of precision?

### #7
Members - Reputation: **271**

Posted 07 April 2012 - 12:22 AM

Make sure your inverse transform is exactly what kauna wrote.

```hlsl
float3 WorldNormal = mul(ViewSpaceNormal, (float3x3)mView);
```

mView (the inverse of the world -> view matrix) HAS to be cast to a 3x3 matrix (for beginner safety reasons).

ViewSpaceNormal should ideally be a float3 (for beginner safety reasons).

Also, components (x, y, z) of your normal can be negative. So when debugging them, either abs() them, or negate them to find out what's happening.
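One way to do that debugging (a sketch; the exact output convention depends on your pipeline) is to visualize the components directly in the pixel shader:

```hlsl
// Visualize the normal as a color. abs() folds negative components
// into the visible range; without it, negative x/y/z render as black.
return float4(abs(N), 1.0);

// Alternatively, remap [-1,1] to [0,1] to keep the sign information:
// return float4(N * 0.5 + 0.5, 1.0);
```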

### #8
Members - Reputation: **354**

Posted 07 April 2012 - 01:25 PM

```hlsl
// s_normal is the engine-provided normal sampler state, uv the float2 tex-coord
float3 viewspace_N = tex2D(s_normal, uv);

// m_WV is the engine-provided transformation matrix
float3 N = mul(viewspace_N, (float3x3)m_WV);
```

Note that I've also tried both normalizing and unpacking the vectors without success.

Might the transformation matrix m_WV provided by the engine be responsible for the incorrect (or rather, imprecise) result?

### #9
Members - Reputation: **354**

Posted 07 April 2012 - 02:10 PM

So I have to find some approach to detect whether a pixel lies on a flat surface without relying on surface normals. I tried taking the depth/distance value from the position sampler into account for that purpose, but it seems that it is no more precise than the normal sampler. What other options do I have in this case, if any?

### #10
Members - Reputation: **271**

Posted 07 April 2012 - 05:08 PM

Use ddx/ddy (the partial derivative, i.e. 'change', of a value with respect to viewport X and Y).

To get world-space normals, try the following:

```hlsl
// worldPos is the pixel's world-space position (e.g. from a G-buffer).
// ddx/ddy give the change of worldPos between neighboring pixels along
// the viewport X and Y axes; their cross product is the face normal.
float3 v0 = ddx(worldPos);   // (xdx, ydx, zdx)
float3 v1 = ddy(worldPos);   // (xdy, ydy, zdy)
float3 N  = normalize(cross(v0, v1));
```

You have to decide what you want to do at edges (and inside special loops). Also, be careful of N's direction.

### #12
Crossbones+ - Reputation: **2337**

Posted 10 April 2012 - 01:47 AM

Also, I'm not 100% convinced that the matrix you used for the transform is the correct one. Typically the mView matrix is used for world-to-view space transform and what you need is the inverse of that matrix.

Cheers!

### #13
Members - Reputation: **354**

Posted 12 April 2012 - 09:16 AM

> Can you verify the surface format of the normal texture? There are lots of ways to store inaccurate normals.

How would I verify the surface format? I only see what I get when sampling some position, normal, or color value, but what "format" would it have?

> Also, I'm not 100% convinced that the matrix you used for the transform is the correct one. Typically the mView matrix is used for world-to-view space transform and what you need is the inverse of that matrix.

I also thought of that. Given that my engine doesn't provide that inverse matrix, is there a way to compute it from the mView matrix?
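One possible approach (a sketch, not something confirmed in this thread): a typical view matrix is a rigid transform (rotation plus translation, no scaling), and the inverse of a pure rotation is its transpose. For transforming normals only the 3x3 rotation block matters, so no full matrix inverse is needed. In HLSL, `mul(M, v)` treats `v` as a column vector and is equivalent to `mul(v, transpose(M))`, so the transpose can be applied just by swapping the operand order:

```hlsl
// mView: world -> view. For a rigid-body view matrix (no scaling), the
// 3x3 rotation block R satisfies R^-1 == R^T, so transforming a
// view-space normal back to world space needs only the transpose.
float3x3 R = (float3x3)mView;             // world -> view rotation
float3 WorldNormal = mul(R, viewspace_N); // == mul(viewspace_N, transpose(R))
```

This only holds if mView contains no scaling; with non-uniform scaling, the proper normal transform is the inverse-transpose of the matrix.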