I'm not sure whether your problem is with finding the eigenvectors or something else.

Either way, implementing the general SVD is rather complex (references generally describe only the general case), so your best way out is to use a math library like LAPACK, GSL (the GNU Scientific Library), MKL, etc.

If you don't want to use external math libraries, you can implement a special-case 3x3 SVD with the following steps:

1. Implement the QR decomposition (this is easy). There is plenty of documentation on how to do it correctly. Apply it to your deviations matrix, and toss Q away, as you won't need it again.
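For illustration, here is a minimal sketch of step 1 in pure Python (plain lists, no libraries; the function name is my own). It accumulates only R through two Householder reflections and never forms Q:

```python
import math

def householder_qr_r(a):
    """Return the upper-triangular factor R of a 3x3 matrix `a` (list of rows).
    Q is never formed, since the SVD steps that follow don't need it."""
    r = [row[:] for row in a]
    for k in range(2):  # two reflections suffice for a 3x3 matrix
        # Householder vector that zeroes column k below the diagonal.
        x = [r[i][k] for i in range(k, 3)]
        nx = math.sqrt(sum(t * t for t in x))
        if nx == 0.0:
            continue
        v = x[:]
        v[0] += math.copysign(nx, x[0])  # sign choice avoids cancellation
        nv = math.sqrt(sum(t * t for t in v))
        v = [t / nv for t in v]
        # Apply the reflection (I - 2 v v^T) to the trailing block of R.
        for j in range(k, 3):
            d = sum(v[i - k] * r[i][j] for i in range(k, 3))
            for i in range(k, 3):
                r[i][j] -= 2.0 * v[i - k] * d
    return r
```

Since Q is orthogonal, A'A = R'R, which gives an easy sanity check on the result.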

2. Bidiagonalize your R matrix. This is very easy since R is a 3x3 upper triangular matrix: all you need is a single Householder transformation applied from the right to zero the (1,3) element. Note that this creates a small fill-in at (3,2), which one additional left-side rotation removes; you don't need to store that left rotation, since left transformations affect neither the singular values nor the right singular vectors. Keep the right-side Householder matrix (or vector, it doesn't matter) alongside the resulting bidiagonal matrix (let's call it B).
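A sketch of step 2 in Python (function name my own; a Givens rotation is used here, which for two components plays the same role as the Householder reflection). It applies the right-side rotation to zero (1,3), removes the (3,2) fill-in this creates with one left-side rotation (not stored, since left transformations don't affect the singular values or right singular vectors), and returns both B and the right-side transformation:

```python
import math

def bidiagonalize_r(r):
    """Reduce an upper-triangular 3x3 `r` to upper bidiagonal B.
    Returns (B, V1), where V1 is the right-side rotation, so the right
    singular vectors of R are V1 times those of B."""
    b = [row[:] for row in r]
    # Right-side rotation on columns 2 and 3 zeroes the (1,3) entry.
    h = math.hypot(b[0][1], b[0][2])
    if h == 0.0:
        return b, [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
    c, s = b[0][1] / h, b[0][2] / h
    v1 = [[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]]
    for i in range(3):
        t1 = b[i][1] * c + b[i][2] * s
        t2 = -b[i][1] * s + b[i][2] * c
        b[i][1], b[i][2] = t1, t2
    # The rotation fills in (3,2); one left-side rotation on rows 2 and 3
    # removes it. It needn't be stored: left transformations change neither
    # the singular values nor the right singular vectors.
    g = math.hypot(b[1][1], b[2][1])
    if g != 0.0:
        c2, s2 = b[1][1] / g, b[2][1] / g
        for j in range(3):
            t1 = c2 * b[1][j] + s2 * b[2][j]
            t2 = -s2 * b[1][j] + c2 * b[2][j]
            b[1][j], b[2][j] = t1, t2
    return b, v1
```

Both transformations are orthogonal, so the Frobenius norm of B must match that of R, which makes a convenient check.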

3. Here comes the tricky part. You'll need to implement a variant of the QR algorithm that works on B, finding not its eigenvalues/eigenvectors but its singular values and right singular vectors. The easiest (correct) way to do this is to diagonalize the matrix H defined, in MATLAB-ish notation, as H=[0,B';B,0]. But in order to do that efficiently you'll need to apply a permutation so that it becomes a tridiagonal 6x6 matrix. The permutation for the 3x3 base case is known: P=[e1,e4,e2,e5,e3,e6]. So you'll start with P'HP (this is basically the vector of the distinct elements of B, in order, duplicated along the sub- and superdiagonals). Now run the normal QR algorithm on this tridiagonal matrix to find its eigenvalues/eigenvectors, then apply the inverse of P (trivial) to the eigenvectors. Take the non-negative eigenvalues (there will be 3, counting any zeros) and the corresponding eigenvectors. Those 3 eigenvalues are the singular values of R. Now truncate the chosen eigenvectors to their first 3 coordinates and renormalize them (each half of a unit eigenvector of H has norm 1/sqrt(2)), so that they form a 3x3 matrix. Multiply the Householder matrix from the previous step by this matrix to obtain the right singular vectors.
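The permutation trick is easier to see in code. This small sketch (function name my own) builds H=[0,B';B,0] for an upper bidiagonal B and applies the symmetric permutation with P=[e1,e4,e2,e5,e3,e6]; the result is a symmetric tridiagonal 6x6 matrix with zero diagonal and the distinct entries of B interleaved along the off-diagonals:

```python
def permuted_h(b):
    """Build H = [[0, B^T], [B, 0]] for a 3x3 upper bidiagonal B and
    return P^T H P for the permutation P = [e1, e4, e2, e5, e3, e6]."""
    h = [[0.0] * 6 for _ in range(6)]
    for i in range(3):
        for j in range(3):
            h[i][3 + j] = b[j][i]  # top-right block: B^T
            h[3 + i][j] = b[i][j]  # bottom-left block: B
    # Column k of P is e_{perm[k]} (0-indexed here), so
    # (P^T H P)[i][j] = H[perm[i]][perm[j]].
    perm = [0, 3, 1, 4, 2, 5]
    return [[h[perm[i]][perm[j]] for j in range(6)] for i in range(6)]
```

Running the ordinary symmetric tridiagonal QR algorithm on this output is then exactly the eigenproblem described above.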

As you can see, even for this relatively simple case the SVD implementation is not that easy.
