
An extremely basic (I think) question about matrices...


Hello guys, and thanks in advance. Well, I'm at school right now (high school, to be more exact) and there's a thing I really wanted to know: how come multiplication isn't commutative when it comes to matrices (e.g. A*B != B*A)? It doesn't make any sense to me - everyone knows that multiplying any number 'a' by any number 'b' gives the same result as multiplying 'b' by 'a'. Some of you may laugh while reading this, but I'm really interested in understanding why it works the way it does.

By the way, for any of you thinking of giving me a fully detailed mathematical explanation, don't bother - I'm not that advanced when it comes to math (actually, now that I think of it, I don't even know what an imaginary number is!). That will be it, thanks a lot in advance (again) - any comments will be appreciated.

P.S. - Is every mathematical operation between two matrices non-commutative, or does this rule only apply to multiplication? Which mathematical operations can you apply to matrices? Can you apply any of them (e.g. can you add two matrices? Subtract two matrices? Divide two matrices? And so on...)?

[Edited by - Phillipe on May 31, 2005 7:50:06 AM]

Try multiplying

A = [1 2 3]

    [4]
B = [5]
    [6]

AB = [1*4 + 2*5 + 3*6] = [32]

     [4  8 12]
BA = [5 10 15]
     [6 12 18]

You can easily see that AB != BA here.
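A quick sketch of the same computation in Python, using plain lists (no libraries), just to confirm the two products above really come out different:

```python
A = [1, 2, 3]   # the 1x3 row vector above
B = [4, 5, 6]   # the 3x1 column vector above, stored flat here

# AB: row times column gives a single number (a 1x1 matrix)
AB = sum(a * b for a, b in zip(A, B))

# BA: column times row gives a 3x3 matrix (each row of B scaled by A)
BA = [[b * a for a in A] for b in B]

print(AB)   # 32
print(BA)   # [[4, 8, 12], [5, 10, 15], [6, 12, 18]]
```

The two results don't even have the same shape, so they certainly aren't equal.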


I don't know how to explain why, in general, AB != BA for matrices. But since multiplication of matrices involves summing and multiplying scalars in a way that treats the two matrices differently, I would be surprised if matrix multiplication were commutative.


You can add and subtract matrices. Addition of matrices is commutative, as it is entry-wise addition. You can multiply two matrices A and B provided that
A is p by n
B is n by q
and AB will be p by q

Multiplying a matrix by a vector is just matrix multiplication, since a vector can be regarded as a 1-by-p matrix (row vector) or a p-by-1 matrix (column vector).

You can also multiply a matrix by a scalar, in which case you multiply every entry by that scalar.

You can also take the inverse of some square matrices; these matrices are called invertible. Multiplying a matrix by its inverse gives the identity matrix, i.e. if B is the inverse of A, then AB = BA = I, where CI = IC = C for any square matrix C of the same dimension as I.

"Divide" one matrix by another can be seen as multiplying suitable inverse, much like divding 3 is really multiplying 1/3.

Hope that helps. Maybe someone more knowledgeable can elaborate and give more details.

Well written, Peter :) Thanks a lot!
I'm starting to get the hang of it, although I didn't quite understand how you did the multiplication between the two matrices there...
If there's anyone else who wants to reply, please do - as I said above, any comments would be appreciated!
Thank you all :).

P.S. - When I asked above about which mathematical operations you can apply to matrices, I meant the four basic operations (i.e. +, -, *, /) and the power/square root operations. Thanks again :).

[Edited by - Phillipe on May 31, 2005 8:08:26 AM]

A less mathematical answer, for a specific case - a 3x3 matrix can represent a rotation in 3D, and to apply two rotations, you multiply them together.

If I have a matrix which represents a rotation around the y axis (look 90 degrees left, say), and another which represents a rotation about the x axis (look up 90 degrees), then the order in which I apply them will give different results:

If I rotate around the X axis so that I am looking straight up, and then rotate around the Y axis, I will still be looking straight up, but my head will twist to the side.

If I rotate around the Y axis first, so that I am looking to the left, then the rotation around the X axis will leave me looking in the same direction, but with my head tipped to one side.
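A little Python sketch of that head-turning example, using 90-degree rotation matrices about the X and Y axes (right-handed convention, column vectors; the matrices and starting direction are my own illustrative choices):

```python
Rx = [[1, 0,  0],
      [0, 0, -1],
      [0, 1,  0]]   # rotate 90 degrees about the X axis

Ry = [[ 0, 0, 1],
      [ 0, 1, 0],
      [-1, 0, 0]]   # rotate 90 degrees about the Y axis

def apply(M, v):
    """Multiply a 3x3 matrix M by a 3D column vector v."""
    return tuple(sum(M[i][k] * v[k] for k in range(3)) for i in range(3))

v = (0, 0, 1)   # "looking down the +Z axis"

x_then_y = apply(Ry, apply(Rx, v))   # rotate about X first, then Y
y_then_x = apply(Rx, apply(Ry, v))   # rotate about Y first, then X

print(x_then_y)   # (0, -1, 0)
print(y_then_x)   # (1, 0, 0) - a completely different direction
```

Applying the same two rotations in opposite orders leaves you facing different ways, which is exactly the non-commutativity being discussed.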

Notation:
A_{i,j} stands for the component in row i and column j.

For example, in the matrix
A =
[00 01 02]
[10 11 12]
[20 21 22]
A_{2,1} = 21.
In some books indexing starts with 0, in others with 1.

Multiplication:
Scalar multiplication:
multiply each element by the scalar (number). Commutative, that is,
Matrix*scalar = scalar*Matrix

Matrix multiplication:
Let C = A*B.
Then C_{i,j} = the dot product of row i of matrix A and column j of matrix B.
(The dot product is the sum of per-component products.)

It requires that the number of columns in the first matrix equal the number of rows in the second.

Some pseudocode
(assuming indexing starts with 0, as in programming):
Let A be an MxN matrix and B an NxO matrix.
Then C will be MxO; that is, the result of the multiplication has as many rows as the first matrix and as many columns as the second.

for(i=0;i<M;++i){
  for(j=0;j<O;++j){
    sum=0;
    for(k=0;k<N;++k){
      sum += A[i][k]*B[k][j];
    }
    C[i][j]=sum;
  }
}
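The same triple loop as runnable Python, with plain nested lists (the 2x2 matrices at the end are just made-up example values to show the two orders disagree):

```python
def matmul(A, B):
    """Multiply an MxN matrix A by an NxO matrix B, giving an MxO result."""
    M, N, O = len(A), len(B), len(B[0])
    assert len(A[0]) == N, "columns of A must match rows of B"
    C = [[0] * O for _ in range(M)]
    for i in range(M):
        for j in range(O):
            s = 0
            for k in range(N):
                s += A[i][k] * B[k][j]
            C[i][j] = s
    return C

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(matmul(A, B))   # [[19, 22], [43, 50]]
print(matmul(B, A))   # [[23, 34], [31, 46]] - not the same
```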


****************************
Say we have the matrices
M =
[A B C]
[D E F]
[G H I]
and
N =
[a b c]
[d r f]
[g h i]
Then M*N is
[Aa+Bd+Cg , Ab+Br+Ch , Ac+Bf+Ci]
[Da+Ed+Fg , Db+Er+Fh , Dc+Ef+Fi]
[Ga+Hd+Ig , Gb+Hr+Ih , Gc+Hf+Ii]
It is easy to see that in the general case this is not equal to N*M (the same thing with the cases flipped).
****************************************

Addition: per-component, therefore commutative (A+B = B+A).
A*(B+C) = A*B + A*C holds true.
Subtraction: per-component, non-commutative (A-B = -(B-A)).
A*(B-C) = A*B - A*C holds true.

Transpose: it's like swapping i and j (flipping the matrix across its diagonal).
Written as A^T.
Special property:
(A*B)^T = B^T * A^T
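A quick numeric check of that transpose rule in Python (the 2x2 matrices are arbitrary example values):

```python
def matmul(X, Y):
    """Plain-list matrix product."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def transpose(M):
    """Swap rows and columns."""
    return [list(row) for row in zip(*M)]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]

lhs = transpose(matmul(A, B))                   # (A*B)^T
rhs = matmul(transpose(B), transpose(A))        # B^T * A^T
print(lhs == rhs)   # True
```

Note the order flips: the transpose of a product is the product of the transposes in reverse.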

Inverse:
Written as
A^{-1}
Properties:
A^{-1}*A = A*A^{-1} = identity
(A^{-1})^{-1} = A
If the matrix is orthonormal (rotation only), A^{-1} = A^T.

Inverses exist only for square matrices, and not for all of them: some matrices (the ones with zero determinant) don't have an inverse (it's undefined).
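A sketch of the orthonormal case in Python, using a 2D rotation matrix (the 0.3-radian angle is an arbitrary choice): multiplying a rotation matrix by its transpose gives back the identity, which is what A^{-1} = A^T means.

```python
import math

c, s = math.cos(0.3), math.sin(0.3)
R = [[c, -s],
     [s,  c]]   # 2D rotation matrix (orthonormal)

# Transpose of R:
Rt = [[R[j][i] for j in range(2)] for i in range(2)]

# R * R^T should be the identity, up to floating-point rounding:
prod = [[sum(R[i][k] * Rt[k][j] for k in range(2)) for j in range(2)]
        for i in range(2)]

print(prod[0][0], prod[1][1])   # both approximately 1.0
print(prod[0][1], prod[1][0])   # both approximately 0.0
```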

An interesting thing about multiplication:
when you transform a bunch of vectors (say, 100 3D vectors) by a matrix, you are in fact performing one matrix multiplication, with a 3x100 or 100x3 matrix (the first if the vectors are columns, the second if they are rows).


Special square matrices:
The identity is equal to
[1 0 0]
[0 1 0]
[0 0 1]

(similarly for sizes other than 3x3)
also, use the power... err, [google]

[Edited by - Dmytry on May 31, 2005 10:24:44 AM]

Quote:
Original post by Squirm
A less mathematical answer, for a specific case - a 3x3 matrix can represent a rotation in 3D, and to apply two rotations, you multiply them together.


I had a coworker (a university graduate in CS, no less) who was trying to prove to me that rotation is commutative.

Wow, thank you all for your replies :) (especially you, Dmytry).
By reading what you guys have said I understand it somewhat better. I also borrowed a math-class notebook from a friend of mine and used the notes written in it, which also helped a lot.
Well, that will be it :). Thanks a lot again to everyone who replied, I really appreciate it :).

@serg3d: Did he manage it? ;)

Matrix multiplication is the way it is because of function composition. Matrices often represent linear transformations (linear functions), so A*B != B*A in general, just as f(g(x)) != g(f(x)) in general.
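That analogy can be made concrete in Python: treat each 2x2 matrix as a function on 2D vectors, and composing them in opposite orders gives different answers (the matrices here are my own example values, a rotation and a stretch):

```python
A = [[0, -1],
     [1,  0]]   # rotate 90 degrees counterclockwise
B = [[2,  0],
     [0,  1]]   # stretch x by a factor of 2

def apply(M, v):
    """Treat the 2x2 matrix M as a function acting on the vector v."""
    return (M[0][0]*v[0] + M[0][1]*v[1],
            M[1][0]*v[0] + M[1][1]*v[1])

v = (1, 1)
print(apply(A, apply(B, v)))   # (-1, 2): stretch first, then rotate (A*B acting on v)
print(apply(B, apply(A, v)))   # (-2, 1): rotate first, then stretch (B*A acting on v)
```

The two compositions land in different places, which is exactly f(g(x)) != g(f(x)).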
