Point in Box using Affine Combination of Box Vertices

Started by
2 comments, last by Emergent 16 years ago
I'm looking for a solution, similar to barycentric coordinates for a triangle, where any point inside or on a box can be defined by the eight vertices of the box, with the weights of all vertices adding up to one. I have been looking through a couple of books and online references and I've been unable to turn anything up.

An even better solution would be one in which a point could be defined by an affine combination of any number of vertices, as long as that point is contained on or within the object defined by those verts. This means anything as simple as a line to something as complex as my original box problem, or beyond. Any nudge in the right direction would be greatly appreciated. Thanks!
This might be naive (I haven't seen this before, and if there's a 'standard' solution I don't know it), but here are some ways to get what you're looking for:

You have an underdetermined system of equations,

[ x1  x2 ... x8 ] [ w1 ]   [ x ]
[ y1  y2 ... y8 ] [ w2 ]   [ y ]
[ z1  z2 ... z8 ] [ .  ] = [ z ]
[  1   1 ...  1 ] [ .  ]   [ 1 ]
                  [ w8 ]


where (x1, y1, z1), (x2, y2, z2), ... etc are the vertices of your cube.

More compactly, I'll write the above equation,

P W = X

where P is the matrix, and W and X are the vectors.
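As a concrete sketch of setting up this system (the unit cube and the target point here are my own example, not from the thread; numpy is assumed):

```python
import numpy as np

# The 8 vertices of the unit cube as columns of P, with a row of ones
# appended so that the weights are forced to sum to 1 (affine combination).
verts = np.array([[x, y, z]
                  for x in (0.0, 1.0)
                  for y in (0.0, 1.0)
                  for z in (0.0, 1.0)])   # shape (8, 3): one vertex per row

P = np.vstack([verts.T, np.ones(8)])     # shape (4, 8): the matrix above
X = np.array([0.25, 0.5, 0.75, 1.0])     # target point (x, y, z) plus the 1

print(P.shape)  # (4, 8)
```

Any W with P W = X then gives weights that reproduce the point and sum to one.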

Now, a whole family of 'W' vectors will satisfy this, so we need to pick one. I can think of two ways to do this:

1 - Add more constraints (Perhaps you want to require that w1+w2+w3+w4 = 1, and w5+w6+w7+w8 = 1, etc. You might be able to find a nice geometrical justification for this: You might want the coordinates to add to one in each plane, for instance.)

2 - Choose the least-squares solution.

Start with the equation above and multiply each side by P'. So you have,

P' P W = P' X

now, P' P is invertible, so,

W = (P' P)^(-1) P' X

(Just recognize that this is not the only solution to PW=X. You can add to W any vector in the null space of P and get another solution.)

This is called the Pseudoinverse, by the way.

A more numerically stable way to compute the above is with the singular value decomposition. This and more about the Pseudoinverse is covered in this PDF.
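In numpy the pseudoinverse (computed internally via the SVD) is a one-liner; here is a minimal sketch, again assuming unit-cube vertices as an example:

```python
import numpy as np

# Least-squares / minimum-norm weights via the pseudoinverse.
verts = np.array([[x, y, z]
                  for x in (0.0, 1.0)
                  for y in (0.0, 1.0)
                  for z in (0.0, 1.0)])
P = np.vstack([verts.T, np.ones(8)])     # 4x8 system matrix
X = np.array([0.25, 0.5, 0.75, 1.0])     # point (x, y, z) plus the affine 1

W = np.linalg.pinv(P) @ X                # minimum-norm solution of P W = X

print(np.allclose(P @ W, X))   # True: the weights reproduce the point
print(np.isclose(W.sum(), 1))  # True: the weights form an affine combination
```

Of the whole family of solutions, `pinv` picks the one of smallest norm; adding any null-space vector of P gives another valid W.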

Cheers.
Thanks for the reply.

After reading the wiki article it sounds like the pseudo inverse is going to be quite an undertaking to code. It seems as if it will accomplish what I am asking though, so that may be my only hope.

Thanks again.
Quote:Original post by Peter5897
Thanks for the reply.

After reading the wiki article it sounds like the pseudo inverse is going to be quite an undertaking to code. It seems as if it will accomplish what I am asking though, so that may be my only hope.

Thanks again.


I don't want to discourage you from writing linear algebra code -- if you do, it will be a very good learning experience (linear algebra is INCREDIBLY useful) -- but I'll also point out a few things:

1 - There are lots of linear algebra libraries out there if you don't want to roll your own. For instance, the famous LAPACK.

2 - If you want to do this yourself, I can think of some slower but potentially easier-to-implement solutions. Here's one (I'm not sure if it works [it might not converge], but I think it does):

- Start with an initial guess for W. Say W0 = [0, 0, ..., 0]'.
- Iterate the following equation:
- Wk+1 = (I - g P'P) Wk + g P'X

Where g is some small positive number. I think this will be stable so long as you choose a small enough value for g. (Precisely, I think it needs to be small enough so that abs(1 - g*lambda) < 1 for every nonzero eigenvalue 'lambda' of P'P. Someone else may want to double-check me.)

[Where did this equation come from? I simply defined an 'error,' e = P W - X (this should be zero), and then applied linear feedback to, hopefully, get rid of the error: Wk+1 = Wk - g P'e. The P' is needed to map the 4-dimensional error back into the 8-dimensional space of weight vectors.]
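A sketch of this iteration (gradient descent on the squared error e = P W - X, with a multiplication by P' so the dimensions match; the unit-cube vertices and step size are my own example):

```python
import numpy as np

verts = np.array([[x, y, z]
                  for x in (0.0, 1.0)
                  for y in (0.0, 1.0)
                  for z in (0.0, 1.0)])
P = np.vstack([verts.T, np.ones(8)])       # 4x8 system matrix
X = np.array([0.25, 0.5, 0.75, 1.0])       # target point plus the affine 1

g = 0.05                                   # step size; small enough for this P
W = np.zeros(8)                            # initial guess W0
for _ in range(5000):
    W = W - g * P.T @ (P @ W - X)          # Wk+1 = Wk - g P'(P Wk - X)

print(np.allclose(P @ W, X, atol=1e-6))    # True
```

This converges much more slowly than the pseudoinverse but needs nothing beyond matrix-vector products.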
