Calculating .map projection data from UV coordinates [SOLVED]

So I want to avoid having to remap textures from imported architectural/level geometry in order to convert it into .map format (Half-Life map version 220). The formulas for getting UVs from the values stored in a .map are:

u = x * u_axis.x + y * u_axis.y + z * u_axis.z + u_offset
v = x * v_axis.x + y * v_axis.y + z * v_axis.z + v_offset

So basically what I want to do is invert them: I would know the values for x, y, z, u and v, and I would like to find out what values go in u_axis.x, u_axis.y, u_axis.z and u_offset, and the same for the v counterparts.

Assuming the UVs are orthographically projected onto the polygon, I think the calculation is feasible, right? But solving for one unknown and substituting back quickly becomes a nightmare, since there are four unknowns in each equation. I am sure this is one of the reasons matrices were invented, so has anyone ever done this, or could you help me get a nice clean piece of code?

If anything above needs some elaboration, let me know and I'll elaborate; for example, x, y, z is an arbitrary point on a face/polygon.

Thanks in advance. [smile]

[Edited by - Kwizatz on August 22, 2008 3:30:55 PM]
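In code, the forward mapping I already have looks roughly like this (a rough sketch; the Vec3 type and names are just placeholders for whatever your own math library uses):

struct Vec3 { float x, y, z; };

// The forward mapping from the formulas above: project a world-space point
// onto the texture axes and add the offsets to get texture coordinates.
void WorldToUV(const Vec3& p,
               const Vec3& u_axis, float u_offset,
               const Vec3& v_axis, float v_offset,
               float& u, float& v)
{
    u = p.x * u_axis.x + p.y * u_axis.y + p.z * u_axis.z + u_offset;
    v = p.x * v_axis.x + p.y * v_axis.y + p.z * v_axis.z + v_offset;
}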
Rearranging the equations gives you:

u = [x,y,z] . u_axis + u_offset;
v = [x,y,z] . v_axis + v_offset;

You can interpret u_axis and u_offset together as a plane definition (call it 'U'), as well as v_axis and v_offset (call it 'V'). What these equations basically tell us is that we have some point [x,y,z], and that the distance to this point from plane 'U' is 'u', and the distance from this point to plane 'V' is 'v'. The point is a constant, but you can monkey around with the planes as much as you want. Since each plane has two degrees of freedom (the normal and the distance), there is no single solution. You pick a normal, and then solve for the distance, or vice versa.

So to answer your question, yes it's solvable, but there is no single solution. What you want to do is choose u_axis and v_axis to be perpendicular to each other, as well as coplanar with the face. This isn't strictly necessary, as we've just seen, but it makes for sane texture axes (when's the last time you worked with a non-orthonormal basis?). What I usually do is take two adjacent edges of a face, let one of them be the first axis, and orthogonalize the other edge (with respect to the first) so that it can be the second axis. Once that's done, it's easy to solve for u_offset and v_offset.
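In rough (untested) code, with a hypothetical Vec3 type and helpers just for illustration, the idea looks something like this:

struct Vec3 { float x, y, z; };

static Vec3  Sub(const Vec3& a, const Vec3& b) { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
static Vec3  Mul(const Vec3& a, float s)       { return { a.x * s, a.y * s, a.z * s }; }
static float Dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Build two perpendicular, coplanar texture axes from two adjacent edges of
// the face, then solve the offsets from one vertex with known UVs.
void AxesFromEdges(const Vec3& p0, const Vec3& p1, const Vec3& p2,
                   float u0, float v0,
                   Vec3& u_axis, float& u_offset,
                   Vec3& v_axis, float& v_offset)
{
    u_axis = Sub(p1, p0);                 // first edge becomes the first axis
    Vec3 e = Sub(p2, p0);                 // adjacent edge
    v_axis = Sub(e, Mul(u_axis, Dot(e, u_axis) / Dot(u_axis, u_axis))); // orthogonalize

    // With the axes fixed, the offsets follow from the forward equations.
    u_offset = u0 - Dot(p0, u_axis);
    v_offset = v0 - Dot(p0, v_axis);
}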
Makes sense. I was fiddling with Mathomatic and realized that, as you said, there is no single solution. No matter, all I need is a solution that gives me the same results for u and v when the original equations are computed.

Thanks for putting it into perspective, I'll give it a spin [smile].

Edit: Now I am thinking the solution would be good for that one point on the face, but may not be for the next one; how would I take the other points into account?

example:

Point 1 (P1): (x1,y1,z1) = (0,0,0), (u1,v1) = (0,0)
Point 2 (P2): (x2,y2,z2) = (1,0,0), (u2,v2) = (0.5,0)

How can I find a solution that is consistent for both sets of values?
I am thinking I take the vector P2 - P1 as the first axis, normalize it, and multiply by u2 - u1 or v2 - v1 (depending on which equation I am working on) to set its magnitude. Would that do it?

[Edited by - Kwizatz on August 20, 2008 5:51:06 PM]
Ok, I've given this some thought, and now I believe what I should calculate is the axes for both U and V, duh! Please bear with me.

What I mean is two vectors coplanar with the face, independent of each other but very likely perpendicular, and not necessarily normalized since they carry scale information, which point in the directions in which U and V increase respectively.

So, given that I have at least 3 points that form a triangle or polygon, and UV info for each point, is it possible to calculate these UV axes, and if so, how?

Once I get this part, I think the offset should be easy.

Thanks again. [smile]
Ignore my last post; for some reason I was assuming that each point would get its own u_offset and v_offset values, which of course doesn't make any sense since there's only one for each axis, not one per point [rolleyes]

Essentially what we have here is a 2D problem in 3D space. This extra degree of freedom leads to all sorts of problems if you try and solve it directly, so what we need to do first is project the problem into 2D space, solve it there, and expand the solution back into 3D. Sounds tricky, but it really isn't so bad :)

Step 1: Projection to 2D

Start with three non-collinear points P0, P1, and P2. Generate two vectors V0 = (P1 - P0) and V1 = (P2 - P0). Next, orthogonalize V1 by doing:

V1 = V1 - [V0 * (V1.V0) / (V0.V0)]

Now normalize both V0 and V1. Finally, convert the three points P0-P2 into 2D (we'll let lowercase 'p' indicate the 2D points and uppercase 'P' indicate the 3D points):

pn.x = Pn.V0
pn.y = Pn.V1
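A rough, untested sketch of this step, with a minimal Vec3/Vec2 and helpers just for illustration:

#include <cmath>

struct Vec3 { float x, y, z; };
struct Vec2 { float x, y; };

static Vec3  Sub(const Vec3& a, const Vec3& b) { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
static Vec3  Mul(const Vec3& a, float s)       { return { a.x * s, a.y * s, a.z * s }; }
static float Dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Build an orthonormal 2D basis (V0, V1) in the plane of the face and
// project the three points into it.
void ProjectTo2D(const Vec3 P[3], Vec3& V0, Vec3& V1, Vec2 p[3])
{
    V0 = Sub(P[1], P[0]);
    V1 = Sub(P[2], P[0]);
    V1 = Sub(V1, Mul(V0, Dot(V1, V0) / Dot(V0, V0)));   // Gram-Schmidt step
    V0 = Mul(V0, 1.0f / std::sqrt(Dot(V0, V0)));        // normalize both
    V1 = Mul(V1, 1.0f / std::sqrt(Dot(V1, V1)));

    for (int i = 0; i < 3; ++i)
    {
        p[i].x = Dot(P[i], V0);
        p[i].y = Dot(P[i], V1);
    }
}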

Step 2: Find the 2D texture axis

Now that we're in 2D, we can find the texture axes and convert them to 3D later. With the three 2D points we can take the equations from your original post and form a system of equations:
[ p0.x  p0.y  1 ]   [ ux ]   [ P0.u ]
[ p1.x  p1.y  1 ] x [ uy ] = [ P1.u ]
[ p2.x  p2.y  1 ]   [ uo ]   [ P2.u ]
Above, 'ux' and 'uy' are the coordinates of the unknown U-axis, since we have Pn.u on the right-hand side (you'd use Pn.v on the right-hand side for the V-axis), and 'uo' is the offset. From here we apply matrix algebra rules: invert the matrix and multiply the inverse by the right-hand side vector. The result gives you the U-axis (with offset) in 2D! Rinse and repeat with Pn.v on the right-hand side (and 'vx'/'vy'/'vo' on the left-hand side) for the V-axis.
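A rough, untested sketch of that solve; note it uses Cramer's rule rather than building the inverse explicitly, which amounts to the same thing for a 3x3 system:

#include <cmath>

struct Vec2 { float x, y; };

// Solve  | p0.x p0.y 1 |   | ax |   | rhs[0] |
//        | p1.x p1.y 1 | x | ay | = | rhs[1] |
//        | p2.x p2.y 1 |   | ao |   | rhs[2] |
// for one texture axis. Returns false if the points are collinear.
bool SolveAxis2D(const Vec2 p[3], const float rhs[3],
                 float& ax, float& ay, float& ao)
{
    const float det =
        p[0].x * (p[1].y - p[2].y) -
        p[0].y * (p[1].x - p[2].x) +
        (p[1].x * p[2].y - p[1].y * p[2].x);
    if (std::fabs(det) < 1e-6f)
        return false;

    ax = (rhs[0] * (p[1].y - p[2].y) -
          p[0].y  * (rhs[1] - rhs[2]) +
          (rhs[1] * p[2].y - rhs[2] * p[1].y)) / det;

    ay = (p[0].x * (rhs[1] - rhs[2]) -
          rhs[0] * (p[1].x - p[2].x) +
          (p[1].x * rhs[2] - p[2].x * rhs[1])) / det;

    ao = (p[0].x * (p[1].y * rhs[2] - rhs[1] * p[2].y) -
          p[0].y * (p[1].x * rhs[2] - rhs[1] * p[2].x) +
          rhs[0] * (p[1].x * p[2].y - p[1].y * p[2].x)) / det;

    return true;
}

You'd call it once with the three per-vertex u values as the right-hand side, and once with the three v values.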

Step 3: Convert the axis back into 3D

Probably the easiest step. To find the 3D texture axes, do:

u_axis = ux * V0 + uy * V1
v_axis = vx * V0 + vy * V1

The offsets, as it turns out, are the same in 3D as they were in 2D (u_offset = uo and v_offset = vo), so nothing special has to be done and you're all finished!
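A rough (untested) sketch of that last step, again with a minimal Vec3 just for illustration:

struct Vec3 { float x, y, z; };

static Vec3 Add(const Vec3& a, const Vec3& b) { return { a.x + b.x, a.y + b.y, a.z + b.z }; }
static Vec3 Mul(const Vec3& a, float s)       { return { a.x * s, a.y * s, a.z * s }; }

// Expand the 2D axes back into world space using the same basis V0/V1
// that was used for the projection; the offsets carry over unchanged.
void AxesTo3D(const Vec3& V0, const Vec3& V1,
              float ux, float uy, float uo,
              float vx, float vy, float vo,
              Vec3& u_axis, float& u_offset,
              Vec3& v_axis, float& v_offset)
{
    u_axis   = Add(Mul(V0, ux), Mul(V1, uy));
    v_axis   = Add(Mul(V0, vx), Mul(V1, vy));
    u_offset = uo;
    v_offset = vo;
}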
Alright, I'll have to give the matrix part some thought, but I get the basic idea, thanks.
Alright! It works. My 3x3 matrix inverse was no good, but after fixing it, it works. Just one thing though: since the axes must keep scale, I had to skip the V0 and V1 normalization step. Yay, fewer FPU cycles!

Thanks yet again!
Hmm, V0 and V1 are really just helper vectors used to transform the points into 2D, so they really should be normalized so that the points have the same scale in 2D as they did in 3D. The matrix inversion should find the correct scale for the texture axis. Is this not the case for you?
Quote:Original post by Zipster
Hmm, V0 and V1 are really just helper vectors used to transform the points into 2D, so they really should be normalized so that the points have the same scale in 2D as they did in 3D. The matrix inversion should find the correct scale for the texture axis. Is this not the case for you?


No, for most points it would find really close approximations, which could be blamed on floating-point error, but for some the difference is noticeable; for example, a UV value of 0.4 turns into 0.6 when the original functions are evaluated with the found axes.

If I don't normalize, the functions return the exact number with no floating-point error, at least on a 2-unit axis-aligned cube.

Edit: I thought about normalizing the 2D vectors, but really, leaving V0 and V1 untouched seems to work as intended.

