Eigenvectors and Covariance matrix

12 comments, last by jjd 18 years, 8 months ago
I'm currently implementing a matrix class for my engine and I want to include a method to get the eigenvectors of a matrix, for use with an OBB tree class I intend to write. I only need to do this with symmetric matrices. I implemented a method using the QR decomposition algorithm, storing the product of all the Q matrices calculated during the decomposition and then taking the column vectors of that product as my eigenvectors. Is this the correct way of doing this? Is there a better algorithm for determining eigenvectors?
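For reference, the accumulate-the-Qs approach described in the question can be sketched like this. This is a minimal pure-Python illustration (all function names here are made up for the example, not from any library); the idea is that A_{k+1} = R_k Q_k is similar to A_k, the iterates converge to a diagonal matrix for a symmetric input, and the product of the Q's collects the eigenvectors as its columns.

```python
def qr_decompose(a):
    """Gram-Schmidt QR factorization of a small square matrix (list of rows)."""
    n = len(a)
    cols = [[a[i][j] for i in range(n)] for j in range(n)]  # columns of a
    q_cols = []
    r = [[0.0] * n for _ in range(n)]
    for j in range(n):
        v = cols[j][:]
        for k, qk in enumerate(q_cols):
            r[k][j] = sum(qk[i] * cols[j][i] for i in range(n))
            v = [v[i] - r[k][j] * qk[i] for i in range(n)]
        r[j][j] = sum(x * x for x in v) ** 0.5
        q_cols.append([x / r[j][j] for x in v])
    q = [[q_cols[j][i] for j in range(n)] for i in range(n)]  # back to rows
    return q, r

def mat_mul(a, b):
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def symmetric_eigen(a, iterations=100):
    """Return (eigenvalues, V) of a symmetric matrix via unshifted QR
    iteration. V's columns are the eigenvectors; eigenvalue i pairs
    with column i of V."""
    n = len(a)
    v = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    for _ in range(iterations):
        q, r = qr_decompose(a)
        a = mat_mul(r, q)   # similar matrix; converges to diagonal
        v = mat_mul(v, q)   # accumulate the Q's
    return [a[i][i] for i in range(n)], v
```

So yes, taking the columns of the accumulated Q product is a valid way to get the eigenvectors of a symmetric matrix; the caveat is that unshifted QR converges slowly when eigenvalues are close in magnitude, which is why production code adds shifts or uses the methods discussed below.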
All I can say is: try to get a peek at the RAPID collision detection library (source & docs). It relies heavily on exactly this sort of math, and it does exactly what you are trying to achieve (an OBB tree structure).

Last time I checked, the code was publicly available. I don't pretend to understand the matrix calcs (finding the covariance matrix, then finding the eigenvectors; I just skipped that part), but you may find it useful. It also has discussions on finding the most balanced tree, which again is right up your alley.

Everything is better with Metal.

You may want to check Householder reflections, or Jacobi factorization, of the type
A = Trans(M) * E * M
Both methods are well implemented in Numerical Recipes, and are an order of magnitude faster than QR factorization.
Jacobi is an iterative method and generally converges in about n passes, where n is the row count; for a 3 x 3 matrix that is about 3 passes.

Another solution is not to use any of these methods and just apply the eigenvector definition directly, finding the eigenvalues as roots of the characteristic polynomial.
Since the matrix is 3 x 3 the process is very fast, about 100 times faster than any of the iterative methods.

BTW, in the RAPID library, if the Jacobi decomposition method is replaced by the analytic eigenvector decomposition, the whole build process becomes about two to three times faster.

Look here. I haven't tried it yet, but it seems to be a rather good way to get eigenvectors for 3x3 matrices in a non-iterative way.
Actually, my apologies: I said to look into the Householder method, but that is not correct on its own.

There are 4 methods for solving the eigenvalue / eigenvector problem:
1- Root finding
2- Power method
3- Jacobi factorization
4- QR factorization

For a 3 x 3 symmetric matrix the fastest would be root finding.
I would not consider the power method for anything.
The Jacobi method is very fast and accurate, and you can find an extremely good version in Numerical Recipes in C. This is the same version used in the RAPID library (verbatim), and it is general.
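A minimal sketch of the Jacobi method for a symmetric matrix (an illustrative pure-Python version, not the Numerical Recipes routine; the function name is made up): each Givens rotation zeroes one off-diagonal pair, the sweeps repeat until the matrix is diagonal, and the accumulated rotations form the eigenvector matrix.

```python
import math

def jacobi_eigen(a, sweeps=10):
    """Cyclic Jacobi: annihilate each off-diagonal element in turn with
    a Givens rotation A' = J^T A J. Returns (eigenvalues, V) where the
    columns of V are the eigenvectors."""
    n = len(a)
    a = [row[:] for row in a]  # work on a copy
    v = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    for _ in range(sweeps):
        for p in range(n - 1):
            for q in range(p + 1, n):
                if abs(a[p][q]) < 1e-15:
                    continue
                # angle that zeroes a[p][q]: tan(2t) = 2*a_pq / (a_qq - a_pp)
                theta = 0.5 * math.atan2(2.0 * a[p][q], a[q][q] - a[p][p])
                c, s = math.cos(theta), math.sin(theta)
                for k in range(n):          # columns: A <- A J
                    akp, akq = a[k][p], a[k][q]
                    a[k][p] = c * akp - s * akq
                    a[k][q] = s * akp + c * akq
                for k in range(n):          # rows: A <- J^T A
                    apk, aqk = a[p][k], a[q][k]
                    a[p][k] = c * apk - s * aqk
                    a[q][k] = s * apk + c * aqk
                for k in range(n):          # accumulate V <- V J
                    vkp, vkq = v[k][p], v[k][q]
                    v[k][p] = c * vkp - s * vkq
                    v[k][q] = s * vkp + c * vkq
    return [a[i][i] for i in range(n)], v
```

Note that each rotation only touches two rows and two columns, which is why a careful implementation never performs a full matrix multiplication.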

QR is the most robust and powerful, but it is at the same time the slowest; however, QR works in almost linear time if the matrix is diagonal, and that's where the Householder method is useful.
By combining the Householder method with QR, it is possible to come up with a version that is more robust than Jacobi, but it is only practical for very large systems.

Conclusion:
For 3 x 3, use root finding or Jacobi; use Jacobi for everything else.
If Jacobi fails (it will for matrices with a very high condition number), use
QR with Householder reflections.
PS: Sorry again, these are heavily loaded topics; one forgets the details easily when not on top of them, which is difficult.
The sentence:
Quote: Original post by Anonymous Poster
QR is the most robust and powerful, but it is at the same time the slowest; however, QR works in almost linear time if the matrix is diagonal, and that's where the Householder method is useful.

Should be:
"QR is the most robust and powerful, but it is at the same time the slowest; however, QR works in almost linear time if the matrix is tridiagonal, and that's where the Householder method is useful."



Hi,

the original poster didn't specify whether the matrices were 3x3 or not. If you are limiting yourself to 3x3 matrices then finding the eigenvalues is easy since you have a 3rd order polynomial and you can directly solve it (see http://mathworld.wolfram.com/CubicFormula.html).
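The direct 3x3 route can be sketched as follows, using the trigonometric form of the cubic formula (a symmetric matrix always has real roots, so the `acos` branch applies; this is illustrative code and the function name is made up):

```python
import math

def eigenvalues_sym3(a):
    """Closed-form eigenvalues of a symmetric 3x3 matrix (list of rows),
    via the trigonometric solution of the characteristic cubic.
    Returns them sorted in descending order."""
    p1 = a[0][1]**2 + a[0][2]**2 + a[1][2]**2
    q = (a[0][0] + a[1][1] + a[2][2]) / 3.0          # mean eigenvalue
    if p1 == 0.0:                                    # already diagonal
        return sorted((a[0][0], a[1][1], a[2][2]), reverse=True)
    p2 = (a[0][0]-q)**2 + (a[1][1]-q)**2 + (a[2][2]-q)**2 + 2.0*p1
    p = math.sqrt(p2 / 6.0)
    # B = (A - q I) / p; for symmetric A, det(B)/2 lies in [-1, 1]
    b = [[(a[i][j] - (q if i == j else 0.0)) / p for j in range(3)]
         for i in range(3)]
    detb = (b[0][0]*(b[1][1]*b[2][2] - b[1][2]*b[2][1])
            - b[0][1]*(b[1][0]*b[2][2] - b[1][2]*b[2][0])
            + b[0][2]*(b[1][0]*b[2][1] - b[1][1]*b[2][0]))
    r = max(-1.0, min(1.0, detb / 2.0))              # clamp for safety
    phi = math.acos(r) / 3.0
    e1 = q + 2.0*p*math.cos(phi)                     # largest root
    e3 = q + 2.0*p*math.cos(phi + 2.0*math.pi/3.0)   # smallest root
    e2 = 3.0*q - e1 - e3                             # trace identity
    return sorted((e1, e2, e3), reverse=True)
```

Once the eigenvalues are known, each eigenvector can be recovered by solving (A - lambda*I)v = 0, e.g. as a cross product of two rows of A - lambda*I.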

One option would be to get the appropriate routine from LAPACK. It'll be fast :)

For this kind of problem I would consult "matrix computations" by Golub and Van Loan. They have a chapter on the symmetric eigenvalue problem and the method they start with is QR. That's a good sign :) The approach they use is to make the matrix tridiagonal using the Householder transform and then a bunch of Givens rotations to diagonalize. However, be warned, this book is not light on the math.
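For a symmetric 3x3 matrix, the Householder tridiagonalization step amounts to a single reflection that zeroes the (0,2)/(2,0) entries. A sketch (illustrative only; it forms P explicitly for clarity rather than using the optimized rank-two update Golub and Van Loan describe, and the function name is made up):

```python
import math

def tridiagonalize_sym3(a):
    """One Householder reflection reduces a symmetric 3x3 matrix to
    tridiagonal form T = P A P, with P symmetric orthogonal."""
    x, y = a[1][0], a[2][0]
    # reflect (x, y) onto (alpha, 0); sign chosen to avoid cancellation
    alpha = -math.copysign(math.hypot(x, y), x)
    v = [x - alpha, y]
    vnorm2 = v[0]*v[0] + v[1]*v[1]
    if vnorm2 == 0.0:
        return [row[:] for row in a]        # already tridiagonal
    # H = I - 2 v v^T / (v^T v) acts on indices 1..2; P = diag(1, H)
    h = [[(1.0 if i == j else 0.0) - 2.0*v[i]*v[j]/vnorm2
          for j in range(2)] for i in range(2)]
    p = [[1.0, 0.0,     0.0],
         [0.0, h[0][0], h[0][1]],
         [0.0, h[1][0], h[1][1]]]
    def mul(m1, m2):
        return [[sum(m1[i][k]*m2[k][j] for k in range(3)) for j in range(3)]
                for i in range(3)]
    return mul(mul(p, a), p)
```

The resulting tridiagonal matrix is then cheap to diagonalize with Givens rotations, which is exactly the QR pipeline the book describes.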


-Josh

--www.physicaluncertainty.com
--linkedin
--irc.freenode.net#gdnet

I believe the question stated that this is for calculating oriented bounding boxes for a graphics engine. If this is the case then the covariance matrices are strictly 3 x 3 symmetric positive semi-definite, which simplifies the problem. Also, I think Gorwooken was looking for some quick and simple routine, not a big math package like LAPACK, so I think cutting and pasting the routine from Numerical Recipes will do the job just fine.
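For completeness, the covariance matrix of a point cloud (whose eigenvectors give the OBB axes) is simple to build; a sketch of the standard construction, with illustrative names:

```python
def covariance3(points):
    """3x3 covariance matrix of a list of (x, y, z) points.
    Its eigenvectors are the natural axes for an OBB over the points."""
    n = float(len(points))
    mean = [sum(p[i] for p in points) / n for i in range(3)]
    cov = [[0.0] * 3 for _ in range(3)]
    for p in points:
        d = [p[i] - mean[i] for i in range(3)]   # deviation from centroid
        for i in range(3):
            for j in range(3):
                cov[i][j] += d[i] * d[j] / n
    return cov
```

The result is symmetric positive semi-definite by construction, which is what guarantees real eigenvalues and orthogonal eigenvectors for the methods discussed above. (RAPID actually weights the covariance by triangle area rather than using raw vertices, which is more robust for meshes.)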

I know it is off topic, but about the routines provided by LAPACK: a long time ago I used these routines extensively for calculating training vectors for block-based vector quantization, where the block sizes were 16 x 16 x 3 and 32 x 32 x 3, leading to matrices of sizes 768 x 768 and 3072 x 3072 respectively.

I experimented with ATLAS, LAPACK, and MATLAB. All of these packages use a variant of the routines from EISPACK, which implement a Householder / QR method. However, because the implementation is general and not only for symmetric matrices, the Householder pass does not reduce the matrix to tridiagonal form but to an upper Hessenberg matrix (one that is like a tridiagonal matrix, except the elements above the superdiagonal are not guaranteed to be zero).
Therefore the QR pass has very much a generic complexity, somewhere between O(n^2) and O(n^3).
The reason is that the routine has to work with all kinds of square matrices, not just PSD ones.

The Jacobi method is based on applying Givens rotations row by row, but it performs the rotations in place, exploiting the zeros and avoiding full matrix multiplications; a 3072 x 3072 matrix, say, can be reduced in about 20 or 30 rotations for an error bound of about 1.0e-16, beating a generic QR by a lot when the matrices are large.
Sorry, I didn't know what OBB meant :) If the problem relates to 3x3 matrices, then numerical routines are overkill. Use the direct solution I gave.

Also, I'm unsure how long ago you used LAPACK, but you can download the individual routines, you don't need to install the entire library. Also, there *are* routines in LAPACK that are specifically for symmetric eigenvalue problems. See http://www.cs.colorado.edu/~jessup/lapack/drivers.html

Finally, the Jacobi method is interesting because it is "inherently parallel". And of course any method that employs direct matrix multiplication is going to be slow. The Householder transformation that I quoted from Golub and Van Loan also avoids this problem. I also fail to see how the Jacobi method -- a method that is applied "row-by-row" -- can reduce a 3072x3072 matrix in 20-30 rotations. I guess I am missing something.


-Josh

--www.physicaluncertainty.com
--linkedin
--irc.freenode.net#gdnet

This topic is closed to new replies.
