# Planar texture mapping/vector transformation


## Recommended Posts

Hi, I'm trying to implement a basic form of planar texture mapping in my open-source skyscraper simulator app, and I'm having a lot of trouble figuring out some things related to it (mainly the 3D vector transformation). Basically, what I'm trying to do is take an existing polygon, rotate it so that it lines up with an axis-aligned plane (generally the Z=0 plane), calculate the coordinate extents to create a square around the polygon (top-left, top-right, and bottom-right), and then rotate/transform those back to use as the texture mapping coordinates. Everything's in place except for the rotation transformation - I've spent quite a while trying to figure it out, but I can't seem to get it right. My simulator uses the Crystal Space engine, which has a lot of transformation-related functions, but nothing that does exactly this.

Right now I'm setting up a transformation matrix with the Y vector being the polygon's normal, the X vector being perpendicular to the Y vector (I found a function in POV-Ray which appears to do this correctly, and translated it to C++), and the Z vector being the cross product of the X and Y vectors. Then I'm normalizing those and feeding them into Crystal Space's csReversibleTransform class with an origin vector of <0, 0, 0>. I'm thinking that the matrix I created needs to be combined with the identity matrix first, but I don't really know (I'm very new to matrix transformations). I can easily get the related polygon's plane, but don't really know how I'd use it for the transformation.

Has anyone here done anything similar to this, or has a better understanding of 3D vector transformation than I do? Thanks :) Here's my project if anyone's interested: http://www.skyscrapersim.com

-eventhorizon
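The basis construction described here (Y = the polygon normal, X perpendicular to Y, Z = X × Y) can be sketched with plain vectors. This is a minimal illustration, not Crystal Space's actual API; `Vec3` and `basisFromNormal` are hypothetical stand-ins for `csVector3` and the POV-Ray-derived helper:

```cpp
#include <cassert>
#include <cmath>

// Hypothetical stand-in for Crystal Space's csVector3 (illustration only).
struct Vec3 { double x, y, z; };

Vec3 cross(const Vec3& a, const Vec3& b) {
    return { a.y * b.z - a.z * b.y,
             a.z * b.x - a.x * b.z,
             a.x * b.y - a.y * b.x };
}

double dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

Vec3 normalize(const Vec3& v) {
    double len = std::sqrt(dot(v, v));
    return { v.x / len, v.y / len, v.z / len };
}

// Build an orthonormal basis whose Y axis is the polygon normal.
// The helper axis is picked so it can't be (nearly) parallel to the
// normal, which would make the cross product degenerate.
// Z = X x Y completes the right-handed frame.
void basisFromNormal(const Vec3& normal, Vec3& X, Vec3& Y, Vec3& Z) {
    Y = normalize(normal);
    Vec3 helper = (std::fabs(Y.y) < 0.99) ? Vec3{0, 1, 0} : Vec3{1, 0, 0};
    X = normalize(cross(helper, Y));
    Z = cross(X, Y);  // already unit length: X and Y are orthonormal
}
```

The three resulting vectors are mutually perpendicular and unit length, which is exactly what a rotation matrix needs - no extra "correlation with the identity matrix" is required.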

The texture UVs are 2D, so once you rotate the 3D polygon into the 2D (Z=0) plane, the next step is to compute the UVs there. To rotate properly, you need to build a matrix for the polygon and rotate the vertices by the transpose of that matrix. In a previous post I outlined the steps to build a matrix from a vector (in a quaternion post where matrix-to-quaternion was the last step). In this case the vector would be the normal of the polygon (the normalized cross product of two of its edges).
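A minimal sketch of that rotate-by-the-transpose step, assuming a hand-rolled `Vec3` in place of the engine's vector class: if U, V, N form an orthonormal basis with N the polygon normal, then using them as the *rows* of the matrix (i.e. the transpose of the local-to-world matrix whose *columns* are U, V, N) maps every polygon vertex to the same z, flattening the polygon into a z = const plane:

```cpp
#include <cassert>

// Hypothetical stand-in for csVector3 (illustration only).
struct Vec3 { double x, y, z; };

double dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// Rotate a world-space point into the polygon's local frame.
// Taking dot products with the rows U, V, N is exactly a multiply by
// the transpose of the matrix whose columns are U, V, N. For points on
// the polygon, the z result is the (constant) plane distance, so the
// polygon ends up flat in a z = const plane, ready for 2D UV math.
Vec3 rotateIntoPlane(const Vec3& U, const Vec3& V, const Vec3& N, const Vec3& p) {
    return { dot(U, p), dot(V, p), dot(N, p) };
}
```

For an orthonormal basis the transpose equals the inverse, which is why this "undoes" the polygon's orientation.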

Note that you may want the bounding box for the whole mesh containing the polygon, not just the one polygon. Either way, given the box corners (ulc and lrc, for the upper-left and lower-right corners), the UVs are computed as follows.

For each vertex `pt` of the polygon, which is now in 2D (z = 0) space:

```
u = (pt.x - ulc.x) / (lrc.x - ulc.x)
v = (pt.y - ulc.y) / (lrc.y - ulc.y)
```

In other words, u and v represent the fractional distance from the upper-left corner to the lower-right corner, in the x and y dimensions respectively.

There is no need to rotate the UVs back to 3D.
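The UV formula above as a tiny self-contained helper (assuming a simple 2D point struct rather than any engine type):

```cpp
#include <cassert>

// 2D point in the flattened (z = const) plane -- illustration only.
struct Vec2 { double x, y; };

// u,v are the fractional position of pt between the box corners,
// exactly the formula given above. Points inside the box map to [0,1].
Vec2 computeUV(const Vec2& pt, const Vec2& ulc, const Vec2& lrc) {
    return { (pt.x - ulc.x) / (lrc.x - ulc.x),
             (pt.y - ulc.y) / (lrc.y - ulc.y) };
}
```

The corners themselves map to (0,0) and (1,1), so the texture stretches exactly across the bounding box.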

Also, make sure the camera is looking in the proper direction to see the Z=0 plane, or you may see the "side" of the projection instead of viewing it head-on, in which case the UVs converge to a line. A texture can be projected to front-face along any look-at vector.
