Creating OBB

Hi all,

I'm trying to create an OBB for my engine but have failed to do so.

I have read good articles/tutorials online, and I think I understand the concept, but the OBB still does not look right.

The algorithm I use:

for each vertex in the mesh vertices do:
    m += vertex

m /= vertices.size()

Now, calculating the covariance matrix:

for each vertex p in the mesh vertices do:
{
    upperRight3x3Triangle[0] += p.x*p.x - m.x*m.x;
    upperRight3x3Triangle[1] += p.x*p.y - m.x*m.y;
    upperRight3x3Triangle[2] += p.x*p.z - m.x*m.z;
    upperRight3x3Triangle[3] += p.y*p.y - m.y*m.y;
    upperRight3x3Triangle[4] += p.y*p.z - m.y*m.z;
    upperRight3x3Triangle[5] += p.z*p.z - m.z*m.z;
}

Basically, I have computed the upper right triangle of the covariance matrix.

Then I just create the matrix from those values, so that the lower triangle of the matrix is symmetric to the upper triangle.

My problem is that, according to the resources I have read, this matrix has the form [ Right, Up, Forward ] (a 3x3 rotation matrix).

The 3 axes that represent the OBB orientation should be orthogonal to each other (because the matrix is symmetric). I have drawn the box axes, and they are not orthogonal to each other.

What am I missing?

Any help/reference would be great.

Thanks a lot
Your covariance calculation is incorrect. The covariance of (x,y) is E[(x - µ_x)(y - µ_y)] = E[xy - xµ_y - yµ_x + µ_xµ_y], where µ_x and µ_y are the mean values of x and y, respectively, which is not equal to E[xy - µ_xµ_y].
Hi,
Thanks for your reply.
If I understand you right, for a vec3 I need something like (x-a)(y-b)(z-c), where x,y,z are the coordinates of some vertex and a,b,c are the coordinates of the mean.

I have followed this paper:

[attachment=12026:Screen Shot 2012-10-31 at 6.28.07 PM.png]

which seems to take the product of a (3x1) and a (1x3) vector, which gives the matrix.

Please correct me if I'm wrong.

Thanks
In my post, x and y are scalar random processes. They correspond to p.x and p.y, or p.y and p.z, or whatever pair of coordinate axes you want to estimate the covariance of. In vector form, the equations are just what you posted. But the point of my post is that, if you expand the product in the summation, you don't get what your code shows. Expand the product, do the calculations for every component of the vectors, and see what you get.
Hi,

Well I thought about it, and decided to change the covariance code.


float xx = 0, xy = 0, xz = 0, yy = 0, yz = 0, zz = 0;
for (size_t i = 0; i < vertices.size(); ++i)
{
    const vec3 &p = vertices[i];
    const vec3 diff = p - midPoint; // deviation from the mean
    xx += diff.x * diff.x;
    xy += diff.x * diff.y;
    xz += diff.x * diff.z;
    yy += diff.y * diff.y;
    yz += diff.y * diff.z;
    zz += diff.z * diff.z;
}
upperRight3x3Triangle[0] = xx;
upperRight3x3Triangle[1] = xy;
upperRight3x3Triangle[2] = xz;
upperRight3x3Triangle[3] = yy;
upperRight3x3Triangle[4] = yz;
upperRight3x3Triangle[5] = zz;



But still I'm getting odd results:
[attachment=12038:Untitled.png]

I'm not sure what the problem is. I don't care about how the box looks (for now), I just want to get the axes orthogonal to each other :(
If you want the orthogonal basis of the distribution that has the calculated covariance matrix, then you need to further process the covariance matrix. Look up principal component analysis. The basic idea is that the orthogonal basis is made up of the eigenvectors of the covariance matrix. The power along each basis vector is given by the eigenvalue of the corresponding eigenvector.
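
To make that concrete, here is a minimal sketch of the idea, assuming the Eigen library is available (my assumption, not something used in the thread): build the symmetric 3x3 covariance matrix from the six accumulated sums and take its eigenvectors as the box axes.

#include <Eigen/Dense>

// Builds the symmetric covariance matrix from the six sums accumulated in the
// loop above and returns its eigenvectors, one orthonormal OBB axis per column.
Eigen::Matrix3f obbAxesFromCovariance(float xx, float xy, float xz,
                                      float yy, float yz, float zz)
{
    Eigen::Matrix3f cov;
    cov << xx, xy, xz,
           xy, yy, yz,
           xz, yz, zz;

    // SelfAdjointEigenSolver is meant for symmetric matrices; its eigenvectors
    // are orthonormal, so they can serve directly as the box axes.
    Eigen::SelfAdjointEigenSolver<Eigen::Matrix3f> solver(cov);
    return solver.eigenvectors();
}

The half-extents of the box then come from projecting every vertex (minus the mean) onto those three axes and taking the min/max along each axis.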
Thank you very much.
This is what I was looking for.
Just a note: in general, computing the covariance matrix in order to find its eigenvectors is a grave mistake (it may lead to very serious round-off error).
A much better approach is to find the SVD directly, that is, right after you have QR'd the Nx3 deviations matrix into Q (Nx3) and R (3x3).
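
As an illustration of that route (again with Eigen assumed purely for this sketch, and vec3 with x/y/z members as in the thread's code), the right singular vectors of the Nx3 deviations matrix give the axes without the covariance matrix ever being formed:

#include <Eigen/Dense>
#include <vector>

// Computes the OBB axes straight from the Nx3 deviations matrix via SVD,
// avoiding the covariance matrix (which would square the condition number).
Eigen::Matrix3f obbAxesFromDeviations(const std::vector<vec3> &vertices, const vec3 &midPoint)
{
    Eigen::MatrixXf deviations(vertices.size(), 3);
    for (size_t i = 0; i < vertices.size(); ++i)
    {
        const vec3 d = vertices[i] - midPoint;
        deviations.row(i) << d.x, d.y, d.z;
    }

    // The right singular vectors (columns of V) are the principal axes.
    Eigen::JacobiSVD<Eigen::MatrixXf> svd(deviations, Eigen::ComputeThinV);
    return svd.matrixV();
}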
HI,

max343: can you please point me to a reference on that?


I'm trying to update the OBB according to the bone movement. Each OBB has a corresponding bone, which deforms it. The problem is that the coordinate system of the bone does not match the OBB coordinate system, which causes wrong rotations. Setting the OBB rotation axes to be the same as the bone's gives the correct rotation, but then the OBB loses its original shape. Any ideas?
I'm not sure whether your problem is with finding the eigenvectors or something else.

Either way, implementing the general SVD is rather complex (references generally describe only this case), so your best way out is to use a math library like LAPACK, GNUSL, MKL, etc.
If you don't want to use external math libraries, you can implement a special-case 3x3 SVD with the following steps:
1. Implement the QR decomposition (this is easy); there is plenty of documentation on how to do it correctly. Apply it to your deviations matrix, and toss Q away, as you won't be needing it anymore. (A rough sketch of this step appears after the list.)
2. Bidiagonalize your R matrix. This is very easy since R is a 3x3 upper triangular matrix. All you'll need to do is to apply a single Householder transformation in order to zero the (1,3) element. Keep the Householder matrix (or vector, it doesn't matter) besides the obvious bidiagonal matrix (let's call it B).
3. Here comes the tricky part. You'll need to implement a variant of the QR algorithm that works on B, not to find its eigenvalues/eigenvectors, but rather its singular values and right singular vectors. The easiest (right) way to do this is to diagonalize the matrix H defined in MATLAB-ish notation as H=[0,B';B,0]. But in order to do that efficiently you'll need to apply a permutation so that it becomes a tridiagonal 6x6 matrix. The permutation for the 3x3 base case is known: P=[e1,e4,e2,e5,e3,e6]. So you'll start with P'HP (this is basically one vector of the distinct elements of B, in order, duplicated along the secondary diagonals of H). Now run the normal QR algorithm on this tridiagonal matrix to find its eigenvalues/eigenvectors. Afterwards, apply the inverse of P (trivial) to the eigenvectors. Take the non-negative eigenvalues (there will be 3, zeroes notwithstanding) and the corresponding eigenvectors. These 3 eigenvalues are actually the singular values of R. Now truncate the taken eigenvectors to 3 coordinates so they form a 3x3 matrix. Multiply the Householder matrix from the previous step by this matrix to obtain the right singular vectors.
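
For reference, here is a rough sketch of step 1 only, in plain C++ with no external libraries. The flat row-major storage, the Mat3 struct, and the function name are assumptions made just for this sketch, not something from the thread.

#include <cmath>
#include <vector>

// Householder QR of the Nx3 deviations matrix A (each row is vertex - mean),
// stored row-major in a flat array. A is overwritten; only the 3x3 upper
// triangular factor R is returned. Q is never formed, as suggested above.
struct Mat3 { double m[3][3]; };

Mat3 qrOfDeviations(std::vector<double> &A, std::size_t N) // A holds N*3 values
{
    for (int k = 0; k < 3; ++k)
    {
        // Norm of column k from row k downwards.
        double norm = 0.0;
        for (std::size_t i = k; i < N; ++i) norm += A[i*3 + k] * A[i*3 + k];
        norm = std::sqrt(norm);
        if (norm == 0.0) continue;

        // Householder vector v = x - alpha*e1, sign chosen to avoid cancellation.
        double alpha = (A[k*3 + k] > 0.0) ? -norm : norm;
        std::vector<double> v(N, 0.0);
        for (std::size_t i = k; i < N; ++i) v[i] = A[i*3 + k];
        v[k] -= alpha;
        double vnorm2 = 0.0;
        for (std::size_t i = k; i < N; ++i) vnorm2 += v[i] * v[i];
        if (vnorm2 == 0.0) continue;

        // Apply the reflection (I - 2*v*v^T / v^T*v) to columns k..2.
        for (int j = k; j < 3; ++j)
        {
            double dot = 0.0;
            for (std::size_t i = k; i < N; ++i) dot += v[i] * A[i*3 + j];
            const double s = 2.0 * dot / vnorm2;
            for (std::size_t i = k; i < N; ++i) A[i*3 + j] -= s * v[i];
        }
    }

    // R is the upper triangular 3x3 block left in the top rows of A.
    Mat3 R = {};
    for (int r = 0; r < 3; ++r)
        for (int c = r; c < 3; ++c)
            R.m[r][c] = A[r*3 + c];
    return R;
}

Steps 2 and 3 then operate only on this 3x3 R, so their cost no longer depends on the vertex count.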

As you can see, even for this relatively simple case the SVD implementation is not that easy.

