Another matrix identity

Given an arbitrary matrix A and a diagonal matrix D, and let AT be the transpose of A. Can it be shown that the product A * D * AT results in a diagonal matrix? In other words, for B = A * D * AT, is B always a diagonal matrix, and is there a proof? Regards, -Dirk
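The question can be settled with a small numeric counterexample; here is a sketch in plain Python (the particular A and D are made up for illustration):

```python
def matmul(X, Y):
    """Multiply two matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def transpose(X):
    return [list(row) for row in zip(*X)]

A = [[1, 1],
     [0, 1]]   # arbitrary (non-orthogonal) matrix
D = [[2, 0],
     [0, 3]]   # diagonal matrix

B = matmul(matmul(A, D), transpose(A))
print(B)  # [[5, 3], [3, 3]] -- not diagonal, though it is symmetric
```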

Can't you just do the multiplication yourself with a general matrix A?

The diagonal matrix simply scales the columns of one of the matrices; after that it's an ordinary product of two matrices.
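To see the scaling concretely: right-multiplying by a diagonal matrix scales each column of A by the corresponding diagonal entry (a plain-Python sketch with illustrative values):

```python
def matmul(X, Y):
    """Multiply two matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

A = [[1, 2],
     [3, 4]]
D = [[10, 0],      # D[j][j] scales column j of A
     [0, 100]]

AD = matmul(A, D)
print(AD)  # [[10, 200], [30, 400]] -- column j of A scaled by D[j][j]
```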

If it weren't the ninth of July, I'd say it's homework...

Please give some background on your problem.
/def

Off the top of my head:
Since A*D is a scaled version of A, the multiplication commutes:
A * D * AT = A * AT * D = I * D = D, which is a diagonal matrix.

A * AT = I only if A is an orthogonal matrix. In general, AT != A^-1.
(EDIT: in reply to the previous poster)
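A quick check in plain Python: for an orthogonal matrix such as a rotation, R * RT gives the identity, while a generic matrix does not (both example matrices are made up for illustration):

```python
def matmul(X, Y):
    """Multiply two matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def transpose(X):
    return [list(row) for row in zip(*X)]

R = [[0, -1],
     [1,  0]]   # 90-degree rotation: orthogonal, so RT == R^-1
A = [[1, 1],
     [0, 1]]    # shear: not orthogonal

print(matmul(R, transpose(R)))  # [[1, 0], [0, 1]] -- identity
print(matmul(A, transpose(A)))  # [[2, 1], [1, 1]] -- not identity
```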

Oh, shoot, yeah... I read inverse where it said transpose... Thanks for the wake-up call! :-)

Thanks for the help, guys. Of course the resulting matrix is not diagonal, at least not in general. The background is solving a system of DAEs, as you have to do in constrained dynamics. In one step, both ODE and OpenTissue just calculate the reciprocal elements of A, where A is the product described above:

A = J * W * JT // J = Jacobian, W = Inverse mass matrix

The reason that only the reciprocal is needed is that the Gauss-Seidel solver divides by A[i][i].
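For context, a minimal (unprojected) Gauss-Seidel iteration for A x = b looks like this; note that the only division is by the diagonal entry A[i][i], which is why precomputing its reciprocal suffices. The function and test values are my own sketch, not code from ODE or OpenTissue:

```python
def gauss_seidel(A, b, iterations=50):
    """Solve A x = b iteratively; assumes A[i][i] != 0 and convergence."""
    n = len(b)
    x = [0.0] * n
    for _ in range(iterations):
        for i in range(n):
            s = sum(A[i][j] * x[j] for j in range(n) if j != i)
            x[i] = (b[i] - s) / A[i][i]  # only the diagonal of A is inverted
    return x

# Diagonally dominant 2x2 system; converges to x ~= [1/11, 7/11].
x = gauss_seidel([[4.0, 1.0], [1.0, 3.0]], [1.0, 2.0])
```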

So what is actually done to prepare for solving the DAE is to transform the equation so that the transpose of J is never formed explicitly.

For those who are interested:

A = J * W * JT

1) W is block diagonal => W = WT
2) Matrix multiplication is associative

<=> A = J * ( W * JT )
<=> A = J * ( WT * JT )
<=> A = J * Transposed( J * W )

This way I save creating the transpose of the Jacobian...
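A quick numeric sanity check of the identity above, in plain Python (the J and W values are made up; W is diagonal, hence symmetric):

```python
def matmul(X, Y):
    """Multiply two matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def transpose(X):
    return [list(row) for row in zip(*X)]

J = [[1, 0, 2],
     [0, 3, 1]]      # 2x3 stand-in for the Jacobian
W = [[2, 0, 0],
     [0, 1, 0],
     [0, 0, 4]]      # diagonal => W == WT

direct  = matmul(matmul(J, W), transpose(J))   # J * W * JT
rewrite = matmul(J, transpose(matmul(J, W)))   # J * Transposed(J * W)
print(direct == rewrite)  # True -- JT itself is never formed
```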

-Dirk
