# Planar mapping and normals

## Recommended Posts

Hi everybody :) Excuse my English... I'm implementing lightmaps in my 3D engine using the planar mapping algorithm. I will explain it for whoever is interested.

Steps (for each polygon):

1- Obtain the polygon normal, choose its element with the greatest absolute value, and generate the texture coordinates from the remaining two elements. For a polygon (suppose a triangle) [A B C], I get the normal [N] and compare:
if (Abs(N.x) > Abs(N.y)) and (Abs(N.x) > Abs(N.z)) then
   Make UV coordinates from the YZ elements of the polygon.
   Set variable "Flag = 1"
else if (Abs(N.y) > Abs(N.x)) and (Abs(N.y) > Abs(N.z)) then
   Make UV coordinates from the XZ elements of the polygon.
   Set variable "Flag = 2"
else
   Make UV coordinates from the XY elements of the polygon.
   Set variable "Flag = 3"


2- The next step in the algorithm is to obtain the maximum and minimum of these coordinates, in order to remap them to the range [0..1]. For each coordinate:
   Save the minimum U into U_Min and the maximum U into U_Max
   Save the minimum V into V_Min and the maximum V into V_Max
   PolygonVertex.U = (PolygonVertex.U - U_Min) / (U_Max - U_Min)
   PolygonVertex.V = (PolygonVertex.V - V_Min) / (V_Max - V_Min)
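Putting steps 1 and 2 together, a minimal sketch could look like this (Python; the vertex/normal layout and all names are mine, not the engine's actual code):

```python
def planar_uvs(vertices, normal):
    """Project 3D vertices onto the axis plane most aligned with the
    normal, then remap the resulting UVs to [0..1]."""
    ax, ay, az = abs(normal[0]), abs(normal[1]), abs(normal[2])
    if ax > ay and ax > az:          # drop X, keep YZ
        flag, uvs = 1, [(v[1], v[2]) for v in vertices]
    elif ay > ax and ay > az:        # drop Y, keep XZ
        flag, uvs = 2, [(v[0], v[2]) for v in vertices]
    else:                            # drop Z, keep XY
        flag, uvs = 3, [(v[0], v[1]) for v in vertices]

    u_min = min(u for u, _ in uvs); u_max = max(u for u, _ in uvs)
    v_min = min(v for _, v in uvs); v_max = max(v for _, v in uvs)
    uvs = [((u - u_min) / (u_max - u_min), (v - v_min) / (v_max - v_min))
           for u, v in uvs]
    return flag, uvs, (u_min, u_max, v_min, v_max)
```

For a triangle lying in a plane with normal (0, 1, 0), this picks the XZ projection (Flag = 2) and remaps the coordinates into the unit square.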


3- Then, the lightmap position vectors are transformed into world space by using the equation of the plane (N.x*x + N.y*y + N.z*z + D = 0) to obtain the remaining unknown element:
if Flag = 1 then
// Vertex 1
V1.x = - (N.y * U_Min + N.z * V_Min + D) / N.x; // Plane equation
V1.y = U_Min
V1.z = V_Min
// Vertex 2
V2.x = - (N.y * U_Max + N.z * V_Min + D) / N.x; // Plane equation
V2.y = U_Max
V2.z = V_Min
// Vertex 3
V3.x = - (N.y * U_Min + N.z * V_Max + D) / N.x; // Plane equation
V3.y = U_Min
V3.z = V_Max
if Flag = 2 then
// Vertex 1
V1.x = U_Min
V1.y = - (N.x * U_Min + N.z * V_Min + D) / N.y; // Plane equation
V1.z = V_Min
// Vertex 2
V2.x = U_Max
V2.y = - (N.x * U_Max + N.z * V_Min + D) / N.y; // Plane equation
V2.z = V_Min
// Vertex 3
...
if Flag = 3 then
...


Now, to achieve standard per-pixel lighting, I need to compute the product N.L for every pixel (or lumel, in this case). I can easily get the L factor from the previous algorithm, just by interpolating across the triangle through the vertices:
Edge1 = V2 - V1
Edge2 = V3 - V1
for each pixel U
for each pixel V
Pixel = V1 + Edge1 * U + Edge2 * V
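As a sketch of step 3 plus this lumel loop, here is the Flag = 1 case (X was dropped) in Python; the function and parameter names are placeholders of mine, not from the engine above:

```python
def lumel_positions_flag1(n, d, u_min, u_max, v_min, v_max, width, height):
    """World-space position of every lumel when the UVs came from the
    YZ elements (Flag = 1). n is the plane normal, d the plane constant
    in n.x*x + n.y*y + n.z*z + d = 0."""
    def world(y, z):
        # Solve the plane equation for the dropped X coordinate.
        x = -(n[1] * y + n[2] * z + d) / n[0]
        return (x, y, z)

    lumels = []
    for j in range(height):
        for i in range(width):
            u = i / (width - 1)          # u, v in [0..1] across the map
            v = j / (height - 1)
            y = u_min + (u_max - u_min) * u
            z = v_min + (v_max - v_min) * v
            lumels.append(world(y, z))
    return lumels
```

For the plane x = 1 (normal (1, 0, 0), d = -1), every lumel lands at x = 1 while y and z sweep the [U_Min..U_Max] × [V_Min..V_Max] rectangle.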


Finally (!), here comes my problem: I don't know how to get the same interpolation for the N (normal) factor that I used for the L factor. That's because I only have the normal of the plane, but what I need is a normal for each vertex V1, V2 and V3. The planar mapping algorithm projects the vectors onto the UV plane and changes their orientation depending on the normal N, so I can't see how to recover my original normals. Is there an equation that lets me get the original vectors from the equation of the plane, the normal of the plane and/or the transformed vectors (V1, V2, V3)? Better still, is there a matrix that represents the transformation done, so I can get the original vectors from the new ones (V1, V2 and V3)? I will appreciate any help :) Thanks guys!

##### Share on other sites
Quote:
 I don't know how to get the same interpolation factor for the N (normal) factor just like I did for the L factor. That's because I have only the normal of the plane, and what I need is the normal for each vertex V1, V2, and V3.

I don't quite understand what you're asking in this sentence. You want vertex normals?
Why can't you just sum up the plane normals around each vertex?

##### Share on other sites
Oops, I know that sometimes (almost all the time :) ) I'm not clear...
I will be more graphical now:

After getting the texture coordinates with planar mapping, I have only one normal for the entire triangle, the normal of the plane the polygon lies on. When I compute the N.L product my result is:

Instead, I need the normals of the original vertices of the triangle, in order to make an interpolation between them for Phong Shading:

(This last capture uses dynamic lighting instead of lightmaps).

But after the world space transform I don't know how to get them.
If I make the interpolation from the normal 1 to 3 (in the order of the vertex buffer) my results are wrong:

I hope you can help me... Thanks!

##### Share on other sites
From what I remember:

For a triangle, the Phong (?) interpolated normal is:

N = N0*w0 + N1*w1 + N2*w2

where barycentric weights:
w0 = 1 - u - v
w1 = u
w2 = v

N0,N1,N2 are the vertex normals of your triangle.
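That formula as a tiny runnable helper (Python, names are mine); renormalizing after the weighted sum keeps the result unit-length:

```python
def interp_normal(n0, n1, n2, u, v):
    """Barycentric interpolation of three vertex normals, renormalized."""
    w0 = 1.0 - u - v
    n = tuple(w0 * a + u * b + v * c for a, b, c in zip(n0, n1, n2))
    length = sum(x * x for x in n) ** 0.5
    return tuple(x / length for x in n)
```

At (u, v) = (0, 0) the result is exactly N0; at (1, 0) it is N1, and so on.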

[Edited by - JakeM on May 10, 2008 7:33:24 PM]

##### Share on other sites
Thanks for your answer, but I already knew that formula :(. In fact, that's the one I'm applying in the third screenshot.

I will try to show my problem in another way.

By creating the lightmap for each triangle, I get the minimum and maximum U and V, and the vertices of the plane for the lightmap are created from these values, which generates a projection that may be, for example:

Case 1:

Case 2:

Where N1, N2 and N3 are the vertex normals, and Np is the plane normal.

This makes me unable to interpolate the vertex normals, as the projection on the plane for the lightmap may vary depending on the position values for each vertex.

On the other hand, I know that it is possible to apply Phong Shading on a lightmap.
Has anyone done this?
Is it necessary to implement a different technique than planar mapping?

##### Share on other sites
For a vertex normal, I average the normals of all the triangles that share the vertex. It comes out looking nice and smooth.

##### Share on other sites
Well, I kinda do that now, by using the plane normal (screenshot 1)... But it doesn't come out looking nice and smooth in my case :(

How do you achieve that with just an average? I assume that you're talking about cubic meshes or spheres with a high poly count...?

##### Share on other sites
Quote:
Original post by jpventoso
Well, I kinda do that now, by using the plane normal (screenshot 1)... But it doesn't come out looking nice and smooth in my case :(
How do you achieve that with just an average? I assume that you're talking about cubic meshes or spheres with a high poly count...?

Well, on the first one it looks like the entire polygon is lighted evenly based on that polygon's normal.

What I was referring to was how to find the vertex normals, although you still have to interpolate between them. The entire process would work something like this:

//pseudocode
for every polygon p {
  for every vertex i in polygon p {
    for every polygon q which uses vertex i {
      vertex_normal[i] += polygon_normal[q] / number_of_polygons_that_use_vertex_i
    }
  }
  for every point x in polygon p {
    for every vertex i in polygon p {
      point_normal[x] += weight[i] * vertex_normal[i]
    }
  }
}

Where weight[i] is inversely proportional to the distance between point[x] and vertex[i], and the sum of all weight[i] = 1.
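A runnable sketch of the averaging pass above (Python; the mesh layout, triangle index lists plus per-face normals, is my own assumption, and the weighted pass works the same way once these vertex normals exist):

```python
def vertex_normals(triangles, face_normals, vertex_count):
    """Average the face normals of every triangle sharing each vertex,
    then renormalize the sums."""
    sums = [[0.0, 0.0, 0.0] for _ in range(vertex_count)]
    for tri, fn in zip(triangles, face_normals):
        for i in tri:                       # each vertex of this triangle
            for k in range(3):
                sums[i][k] += fn[k]
    normals = []
    for s in sums:
        length = (s[0] ** 2 + s[1] ** 2 + s[2] ** 2) ** 0.5 or 1.0
        normals.append((s[0] / length, s[1] / length, s[2] / length))
    return normals
```

A vertex shared by two faces with normals (0, 0, 1) and (0, 1, 0) ends up with the smoothed normal (0, 1/√2, 1/√2).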

##### Share on other sites
Wow, at a first glimpse I didn't understand a thing but that algorithm seems to be just the solution I'm looking for! :)

I'm going to print it and think about it carefully; Thanks for the help!

I'll post any news tomorrow. Thanks again!

##### Share on other sites
Quote:
 This makes me unable to interpolate the vertex normals, as the projection on the plane for the lightmap may vary depending on the position values for each vertex.

How do you plan to use this interpolated normal? Are you trying to project it to 2D? If so, why?

##### Share on other sites
Quote:
Original post by JakeM
Quote:
 This makes me unable to interpolate the vertex normals, as the projection on the plane for the lightmap may vary depending on the position values for each vertex.

How do you plan to use this interpolated normal? Are you trying to project it to 2D? If so, why?

He wants to know how bright a particular spot should be, using a particular lighting technique, and you need to know what the normal for that particular spot is. The more the normal points towards the light source, the brighter it should be. Even though the mesh isn't smooth, this method can give the illusion of smoothness.

##### Share on other sites
Quote:
Original post by erissian
He wants to know how bright a particular spot should be, using a particular lighting technique, and you need to know what the normal for that particular spot is. The more the normal points towards the light source, the brighter it should be. Even though the mesh isn't smooth, this method can give the illusion of smoothness.

I know, you can do all of that without projecting the plane to 2D. I still don't see the problem he's talking about.

##### Share on other sites
Quote:
Original post by JakeM
Quote:
Original post by erissian
He wants to know how bright a particular spot should be, using a particular lighting technique, and you need to know what the normal for that particular spot is. The more the normal points towards the light source, the brighter it should be. Even though the mesh isn't smooth, this method can give the illusion of smoothness.

I know, you can do all of that without projecting the plane to 2D. I still don't see the problem he's talking about.

He's doing this, ultimately, for a light map, so he needs to be able to take this information and use it in texture-space.

##### Share on other sites
Your algorithm greatly improved things (I'm now a step closer to a real phong lighting :) ), but as it can be noticed in this capture, a lighting difference between each triangle still remains:

I think this is due to the location of each triangle within the Lightmap (cases of the diagrams on my previous post), and I couldn't find a solution to this problem yet...

Erissian, do you use planar mapping to generate your Lightmaps (if you're using any)?

Thanks again!

##### Share on other sites
Quote:
Original post by jpventoso
Your algorithm greatly improved things (I'm now a step closer to a real phong lighting :) ), but as it can be noticed in this capture, a lighting difference between each triangle still remains:
I think this is due to the location of each triangle within the Lightmap (cases of the diagrams on my previous post), and I couldn't find a solution to this problem yet...

This last bit is now a problem with the interpolation. The triangles appear discretely because they are still treated as a plane defined by the vertexes, although the light is generated smoothly across that plane. For the final step, it will be a little harder because you now have to interpolate not across the shape of the model, but across the shape that the model represents. This will require spherical linear interpolation (slerp) between the three vertexes.

Quote:
 Erissian, do you use planar mapping to generate your Lightmaps (if you're using any)?Thanks again!

Sometimes, yes. It depends on the project. For instance, if I were rendering planets I would use two lightmaps that represent the planet's surface and the light from the star(s), the latter of which rotates with time, and multiply their values. Shadows from most objects are a non-issue, and until you are very close to the surface, lighting just isn't that dynamic.

Also, building interiors with fixed lighting. There's no sense in calculating the lighting in a scene every time you render several dozen static objects.

Also, no problem!

##### Share on other sites
Quote:
Original post by erissian
This last bit is now a problem with the interpolation. The triangles appear discretely because they are still treated as a plane defined by the vertexes, although the light is generated smoothly across that plane. For the final step, it will be a little harder because you now have to interpolate not across the shape of the model, but across the shape that the model represents. This will require spherical linear interpolation (slerp) between the three vertexes.

OK, but when I use per-pixel lighting with shaders, I thought I was only doing a linear interpolation. I'm asking because this problem does not occur when I use shaders.

Or perhaps it is happening because lightmaps are calculated at a lower resolution and thus lose definition at the edges?

Quote:
Original post by erissian
Also, building interiors with fixed lighting. There's no sense in calculating the lighting in a scene every time you render several dozen static objects.

I wanted to implement lightmaps for the same reason, to increase performance by reducing the lighting calculations when the environment is static, and also to generate soft shadows (I'm having problems with this last item but that will be a new topic :) ).

##### Share on other sites
Quote:
Original post by erissian
This will require spherical linear interpolation (slerp) between the three vertexes.

What's the equation for slerping 3 vertex normals?

##### Share on other sites
Quote:
Original post by JakeM
What's the equation for slerping 3 vertex normals?

I'm interested in that as well...

##### Share on other sites
The trick isn't slerping between three points (you slerp between two, and then slerp between that result and the third point); it's doing it so that the points are evenly distributed. This is how you iteratively slerp through a triangle using two coordinates:

Let's call our verts A, B and C. We'll use two coordinates, t and u. n determines how many rows of points you sample, giving ½(n+1)(n+2) individual points in total.

//pseudocode, returns R
n = some integer
f = 1/n
for t = 0:n
  if t == 0
    R = A
    continue
  P = slerp(A, B, t*f)
  Q = slerp(A, C, t*f)
  g = 1/t
  for u = 0:t
    R = slerp(P, Q, u*g)

It may help to look at a ternary plot to get an idea of how it walks through the triangle. Also, it would be foolish to implement this literally - you can optimize it quite a bit by working through the math and reducing.
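For the curious, here is a literal, unoptimized Python version of that walk, with a standard great-circle slerp for unit vectors (all names are mine, not from the post above):

```python
import math

def slerp(a, b, t):
    """Great-circle interpolation between two unit vectors."""
    dot = max(-1.0, min(1.0, sum(x * y for x, y in zip(a, b))))
    theta = math.acos(dot)
    if theta < 1e-6:                      # vectors (nearly) parallel
        return a
    s = math.sin(theta)
    ka, kb = math.sin((1 - t) * theta) / s, math.sin(t * theta) / s
    return tuple(ka * x + kb * y for x, y in zip(a, b))

def triangle_slerp_samples(A, B, C, n):
    """Walk the triangle row by row: slerp along A->B and A->C to get the
    row endpoints P and Q, then slerp across the row."""
    samples = []
    for t in range(n + 1):
        if t == 0:
            samples.append(A)
            continue
        P = slerp(A, B, t / n)
        Q = slerp(A, C, t / n)
        for u in range(t + 1):
            samples.append(slerp(P, Q, u / t))
    return samples
```

With n = 1 and the three coordinate axes as verts, the walk visits exactly A, B and C.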

##### Share on other sites
Quote:
Original post by erissian
Let's call our verts A, B and C.

Are ABC vertex normals or quaternions? If they're quats, how did they get from vector to quat?

##### Share on other sites
Let me add another question: in order to do the slerp, I need to know the angle between the vectors, right? How can I get it?

##### Share on other sites
Well, if nobody is going to step up, I'll try to answer it.
Yes, you can slerp vectors!

Apparently, you can convert 3D vectors into 4D quaternions, by simply setting w=0:
quat = VEC4( normal.x, normal.y, normal.z, 0 );

Then you can use this quat in any slerp function intended for quaternions.

To convert back, simply drop the w component:
vect = VEC3( quat.x, quat.y, quat.z );

Dropping the w component shouldn't be a problem, because if you're interpolating 3 normals,
then your result should still point relatively in the same direction. I hope.

So, to interpolate three vectors, v1,v2,v3:

result1 = slerp( v1, v2, t1 );
result2 = slerp( result1, v3, t2 );

I would suppose t1 and t2 relate to the barycentric weights chosen for each vertex normal.

But as erissian pointed out, this isn't the best way to do it. The final result from slerping 3 rotations
is dependent on the order! So, slerping r1,r2, and then r3, produces a different result than r3,r2 and r1.

The way to get slerping that is independent of rotation order, is to use an iterative slerp algorithm.
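To make the two-step idea concrete, here is a small Python sketch; the mapping from the barycentric weights (w0, w1, w2) to t1 and t2 is my own guess, not from the thread, and it still inherits the order-dependence mentioned above. Note that the angle needed for the slerp comes straight from the dot product:

```python
import math

def slerp(a, b, t):
    """Great-circle interpolation between two unit vectors."""
    dot = max(-1.0, min(1.0, sum(x * y for x, y in zip(a, b))))
    theta = math.acos(dot)        # the angle, recovered from the dot product
    if theta < 1e-6:
        return a
    s = math.sin(theta)
    return tuple((math.sin((1 - t) * theta) * x + math.sin(t * theta) * y) / s
                 for x, y in zip(a, b))

def slerp3(v1, v2, v3, w0, w1, w2):
    """Two-step slerp of three unit normals using barycentric weights."""
    t1 = w1 / (w0 + w1) if (w0 + w1) > 0 else 0.0   # v2's share of the first pair
    r = slerp(v1, v2, t1)
    return slerp(r, v3, w2)                          # then blend toward v3
```

With all the weight on one vertex, the result collapses to that vertex's normal, as you would expect.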

[Edited by - JakeM on May 19, 2008 2:19:19 PM]

##### Share on other sites
Oh! OK, I imagined that the order would be a problem...

That's why I think I'm going to skip the slerping and use lightmaps only on cubic figures for now...

Besides that, another problem would be determining whether each triangle forms a cubic or spherical joint, to avoid using the slerp in cases where it isn't needed...
