```
Translating the texture origin to the origin of 3D space:
Let C = Point on the polygon to be the texture map's origin.
Let P = Any Point on the polygon which will be linked to a texture coordinate.
[1, 0, 0, -Cx] [Px] [Px - Cx]
[0, 1, 0, -Cy] * [Py] = [Py - Cy]
[0, 0, 1, -Cz] [Pz] [Pz - Cz]
[0, 0, 0, 1 ] [ 1] [ 1 ]
```
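As a quick sketch in code (helper name is my own; once applied to a homogeneous point, the 4x4 matrix collapses to a plain subtraction):

```python
def translate_to_origin(P, C):
    """Move point P so that the texture origin C lands at (0, 0, 0).

    Equivalent to multiplying the homogeneous column (Px, Py, Pz, 1)
    by the translation matrix above: each component becomes Pi - Ci.
    """
    return (P[0] - C[0], P[1] - C[1], P[2] - C[2])
```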

Now that the texture map's origin sits at the origin of the 3D coordinate system, we need to rotate the polygon so that it lies on the x-y plane (taking it out of 3D space). This can be done with three orthogonal vectors that describe how the texture is aligned on the polygon: one is the normal to the polygon's plane, the second is the vector that points along the direction of increasing x in texture space, and the third is the vector that points along the direction of increasing y. Given any two of these, the third can be calculated with a cross product. All three should be normalized (unit vectors).
```
Rotating a point using three orthogonal unit vectors:
Let P = Point to be rotated around the origin.
Let N = Normalized vector that is orthogonal to the polygon's plane.
Let U = Normalized vector on the polygon's plane that points in the
direction of increasing x in the texture space.
Let V = Normalized vector on the polygon's plane that points in the
direction of increasing y in the texture space.
[Ux, Vx, Nx, 0] [Px] [Ux*Px + Vx*Py + Nx*Pz]
[Uy, Vy, Ny, 0] * [Py] = [Uy*Px + Vy*Py + Ny*Pz]
[Uz, Vz, Nz, 0] [Pz] [Uz*Px + Vz*Py + Nz*Pz]
[ 0, 0, 0, 1] [ 1] [ 1 ]
```

A way to convince yourself that this actually works is to plug in the points P = (1,0,0), P = (0,1,0), and P = (0,0,1). Notice that each identity point is rotated to point along one of the normalized vectors, and note which vector goes with each point. This matrix, however, goes in the wrong direction: it moves a point from texture space to polygon space, while we want to move from polygon space to texture space. To fix this, we take the transpose of the rotation matrix, which (for an orthonormal matrix) reverses the direction of the rotation.
```
Rotating a point from the polygon's space to texture space:
Let P = Point in polygon space.
Let N = Normalized vector that is orthogonal to the polygon's plane.
Let U = Normalized vector on the polygon's plane that points in the
direction of increasing x in the texture space.
Let V = Normalized vector on the polygon's plane that points in the
direction of increasing y in the texture space.
[Ux, Uy, Uz, 0] [Px] [Ux*Px + Uy*Py + Uz*Pz]
[Vx, Vy, Vz, 0] * [Py] = [Vx*Px + Vy*Py + Vz*Pz]
[Nx, Ny, Nz, 0] [Pz] [Nx*Px + Ny*Py + Nz*Pz]
[ 0, 0, 0, 1] [ 1] [ 1 ]
```
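A minimal sketch of this rotation (function name is my own), assuming U, V, and N are already orthonormal: each output component is just a dot product of P with one row of the transposed matrix.

```python
def rotate_to_texture_space(P, U, V, N):
    # Rows of the transposed rotation matrix are U, V, and N, so the
    # rotated point is simply the dot product of P with each of them.
    dot = lambda a, b: a[0]*b[0] + a[1]*b[1] + a[2]*b[2]
    return (dot(U, P), dot(V, P), dot(N, P))
```

Plugging U itself in returns (1, 0, 0), mirroring the sanity check described above.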

Now we put them together: translate to the origin, then rotate, to get the final transformation from polygon space to texture space. (Don't forget that we are moving the polygon onto the x-y plane, so the z-coordinate should either be zero or approach zero, and it can be ignored.)
```
Moving from polygon space to texture space:
Let P = Any Point on the polygon which will be linked to a texture coordinate.
Let C = Point on the polygon to be the texture map's origin.
Let N = Normalized vector that is orthogonal to the polygon's plane.
Let U = Normalized vector on the polygon's plane that points in the
direction of increasing x in the texture space.
Let V = Normalized vector on the polygon's plane that points in the
direction of increasing y in the texture space.
[Ux, Uy, Uz, 0] [1, 0, 0, -Cx] [Px]
[Vx, Vy, Vz, 0] * [0, 1, 0, -Cy] * [Py] =
[Nx, Ny, Nz, 0] [0, 0, 1, -Cz] [Pz]
[ 0, 0, 0, 1] [0, 0, 0, 1 ] [ 1]
[Ux*(Px-Cx) + Uy*(Py-Cy) + Uz*(Pz-Cz)]
[Vx*(Px-Cx) + Vy*(Py-Cy) + Vz*(Pz-Cz)]
[Nx*(Px-Cx) + Ny*(Py-Cy) + Nz*(Pz-Cz)]
[ 1 ]
```
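Putting the two steps into one routine (a sketch with made-up names, assuming an orthonormal U, V, N frame):

```python
def polygon_to_texture(P, C, U, V, N):
    # Translate so the texture origin C sits at the origin, then rotate
    # into the (U, V, N) frame.  The last component (the N direction)
    # comes out zero for any P that actually lies on the polygon's plane.
    q = (P[0] - C[0], P[1] - C[1], P[2] - C[2])
    dot = lambda a, b: a[0]*b[0] + a[1]*b[1] + a[2]*b[2]
    return (dot(U, q), dot(V, q), dot(N, q))
```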

[size="5"]Transforming from screen space to polygon space
Now let's go from screen space to polygon space. To do this, we use the equation of the plane and the equations that translate a point from 3D space to screen space.
```
Transforming a point from 3D space to screen space:
Let d = The distance between the screen and the focal point.
Let P = The 3D point.
[d, 0, 0, 0] [Px] [d * Px] [(d * Px) / Pz]
[0, d, 0, 0] * [Py] = [d * Py] = [(d * Py) / Pz]
[0, 0, 0, 1] [Pz] [ 1 ] [ 1 / Pz ]
[0, 0, 1, 0] [ 1] [ Pz ] [ 1 ]
```
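In code, the projection (after the homogeneous divide) reduces to a single scale-and-divide per axis (a sketch; names are my own):

```python
def project(P, d):
    # Perspective projection onto a screen at distance d from the
    # focal point: divide the scaled x and y by the depth Pz.
    return (d * P[0] / P[2], d * P[1] / P[2])
```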

We can now ignore the z-coordinate, since the projected point lies on the screen, and say that x' = (d * x) / z, y' = (d * y) / z, and z' = 0. Rearranging these equations and plugging them into the equation of the plane, we can find where rays poked through the screen intersect the plane, giving us the point that is seen.
```
Finding the point on the polygon's plane:
Let N = The plane's normal vector.
Let P = Any point on the plane.
Let (Nx*x) + (Ny*y) + (Nz*z) = (Nx*Px + Ny*Py + Nz*Pz) be the equation of
a plane.
x' = ((d * x) / z)
x = (x' * z) / d
y' = ((d * y) / z)
y = (y' * z) / d
(Nx*((x'*z)/d)) + (Ny*((y'*z)/d)) + (Nz*z) = (Nx*Px + Ny*Py + Nz*Pz)
z * (((Nx * x') / d) + ((Ny * y') / d) + Nz) = Nx*Px + Ny*Py + Nz*Pz
z = (Nx*Px + Ny*Py + Nz*Pz) / (((Nx*x')/d) + ((Ny*y')/d) + Nz)
```
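As a sketch (hypothetical function name), the solved-for z per screen point looks like:

```python
def depth_at_pixel(xs, ys, d, N, P):
    # z = (Nx*Px + Ny*Py + Nz*Pz) / ((Nx*xs)/d + (Ny*ys)/d + Nz)
    k = N[0]*P[0] + N[1]*P[1] + N[2]*P[2]
    return k / (N[0]*xs/d + N[1]*ys/d + N[2])
```

For example, for the plane z = 5 (N = (0,0,1), P = (0,0,5)) every screen point yields z = 5, as it should.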

Now we can plug in a screen coordinate (x', y') and get the point of intersection with the polygon's plane at (x, y, z). Neat.
[size="5"]Transforming from screen space to texture space
The last step is to put all of our pieces together and simplify the equation as much as possible. Using the coordinate found by intersecting the ray from the focal point (through the screen) with the polygon's plane, we can transform the intersected point into texture space, which gives us the texture position that is being seen. I'll be using [u,v] to represent the dot product of u and v.
```
Moving from screen space to texture space:
Let (x', y') = The screen coordinates.
Let P = Any Point on the polygon's plane.
Let T = Texture point in texture coordinates.
Let C = Point on the polygon to be the texture map's origin.
Let N = Normalized vector that is orthogonal to the polygon's plane.
Let U = Normalized vector on the polygon's plane that points in the
direction of increasing x in the texture space.
Let V = Normalized vector on the polygon's plane that points in the
direction of increasing y in the texture space.
Tx = Ux * ((x'*z)/d - Cx) + Uy * ((y'*z)/d - Cy) + Uz * (z - Cz)
d * Tx = Ux*x'*z + Uy*y'*z + Uz*d*z - d*(Ux*Cx + Uy*Cy + Uz*Cz)
= z*(Ux*x' + Uy*y' + Uz*d) - d*(Ux*Cx + Uy*Cy + Uz*Cz)
= z*[U,(x', y', d)] - d*[U,C]
z = (Nx*Px + Ny*Py + Nz*Pz) / (((Nx*x')/d) + ((Ny*y')/d) + Nz)
= [N,P] / [N,(x'/d, y'/d, 1)]
d * Tx = ([N,P] / [N,(x'/d, y'/d, 1)])*[U,(x', y', d)] - d*[U,C]
= d * ([N,P] / [N,(x', y', d)])*[U, (x', y', d)] - d*[U,C]
Tx = (([N,P] / [N,(x', y', d)]) * [U, (x', y', d)]) - [U,C]
Ty = (([N,P] / [N,(x', y', d)]) * [V, (x', y', d)]) - [V,C]
Tz = 0
```
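The final equations translate almost directly into code (a sketch; the name is my own, and d, N, P, C, U, V are as defined above):

```python
def screen_to_texture(xs, ys, d, N, P, C, U, V):
    # Tx = ([N,P] / [N,r]) * [U,r] - [U,C], where r = (xs, ys, d);
    # Ty is the same with V in place of U.
    dot = lambda a, b: a[0]*b[0] + a[1]*b[1] + a[2]*b[2]
    r = (xs, ys, d)
    k = dot(N, P) / dot(N, r)
    return (k * dot(U, r) - dot(U, C), k * dot(V, r) - dot(V, C))
```

Note that only the dot products with r change per pixel; [N,P], [U,C], and [V,C] are fixed for a given polygon.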

This is the basic equation for going from screen coordinates to texture coordinates. Notice that you don't have to normalize N, because its magnitude cancels in the division. The magnitudes of U and V act as scale factors on the texture, which you may actually want, so they may not need to be normalized either. Otherwise, the rest of the optimization and simplification is up to you. If you have any ideas, don't be afraid to e-mail me.
[size="5"]Advanced analysis of the result
It is worth pointing out some of the nice features that appear in the result. The constant terms [U,C] and [V,C] offset the texture, so you could use these numbers to slide the texture around on the polygon's plane. Also notice that [N,P] appears in both expressions, so it only needs to be calculated once per polygon. The real disturbing part of the result is the division necessary per point: the two quotients share the denominator [N,(x', y', d)], so a single reciprocal per point suffices, but division is slow and cannot be easily approximated, so it would be nice to somehow dodge doing it for every point.
As of now, my method is a simple linear interpolation between two exactly calculated texture points on each scanline. As long as the number of interpolated points stays small, it is hard to notice the difference between a non-interpolated texture and an interpolated one. Part of the reason is that everything is straight when you look directly at the plane of a polygon, and when the polygon's face isn't pointed directly at you, the points become more crammed together. This works in our favor: the objects in which you are most likely to notice discrepancies are the ones that are easiest to see (those viewed head on), and those are exactly the objects least affected by the division by z.
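The scanline interpolation described above might look like this (a sketch; the exact texture coordinates at the two span endpoints are assumed to have already been computed with the full divide):

```python
def interpolate_scanline(t_left, t_right, width):
    # Linearly step the (u, v) texture coordinate across a scanline,
    # given exactly computed endpoint coordinates.  Only the endpoints
    # pay for the divisions; every pixel in between is a cheap lerp.
    (u0, v0), (u1, v1) = t_left, t_right
    out = []
    for i in range(width):
        f = i / (width - 1) if width > 1 else 0.0
        out.append((u0 + f * (u1 - u0), v0 + f * (v1 - v0)))
    return out
```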
One colleague, Andrew Jewett, suggests that it may be possible to use general scanlines that follow contours of constant z on the 3D plane. This would allow you to do only one division by z per scanline. If the lines could be lined up properly so as not to overlap, this might provide a quick method for near-perfectly drawn textured polygons. (I'll take a look at this method when I have time.)
[size="5"]Applying the equations in a program
Even with the equations, there still has to be a way of using them practically. Not only can this help one better understand the usefulness of the results, but it also provides the joy of tangible success! To complement this study, I have produced a WWW page on Applying the Texture Mapping Technique.
[size="5"]Special thanks to:
Professor Forsyth, for teaching CS184, where much of what I learned went into the derivation of this technique. All of the matrix techniques came from what I learned during lecture.