Is this multiplication formula in linear algebra true?

Started by
11 comments, last by JohnnyCode 9 years, 9 months ago

There is a thread containing an equation that I believe is not true. I am quite sure it is not true, and I would like to hear opposing arguments proving that it holds.

Thanks. The original thread is here, but you do not need to read it (just for curiosity): http://www.gamedev.net/topic/657986-why-does-this-matrix-multiplication-order-matter/

I will begin:

It stemmed originally from the formula

projection * (view * (model * position))=projection * view * model * position

where position is a 4-dimensional vector and the rest are 4x4 matrices.

I have narrowed it to

P*(V*(M*p))=P*V*M*p; where p is a vector treated as a matrix and P, V, M are multipliable matrices

Now I provide the proof, which I will just copy-paste here without modification, since it is quite sufficient.

claim: P*V*M*p=P*(V*(M*p)), where p is a vector (a matrix of dimensions [x,1] or [1,x]) and P, V, M are multipliable matrices

a helping true algebraic statement:

P*V*M*p = p*M*V*P (because P*V*M = (M*V*P)^T)

I will now show that the claim contradicts the true statement if the claim is taken to be true:

P*V*M*p=P*(V*(M*p)) => P*(V*(M*p))=((p*M)*V)*P

Do you agree with P*(V*(M*p))=((p*M)*V)*P? I just used associativity in the helping true statement, but it yields

P*V*(M*p) = (p*M)*V*P (now multiply both sides by P^-1 and V^-1)

M*p = p*M (a transposed result)

The claim would be right if p were not present; it would then yield M = M.

So the conclusion is that P*(V*(M*p)) equals p*P*V*M, not P*V*M*p.

In words: a column vector times M times V times P equals a row vector times transpose(M times V times P).

thus

P*(V*(M*p)) = p*P*V*M

and this means

P*(V*(M*p)) =!= P*V*M*p

The following objection I received does not, I believe, refute any part of my proof, and I would like to show why.

How is that even supposed to work? A matrix multiplication M1 * M2 is only defined if M1 is an (n x k) matrix and M2 is a (k x m) matrix. The result is then an (n x m) matrix. One side effect of this is that you can transpose M, V and P, but the only place the vector (a (4 x 1) matrix) can be is at the right-most side. You could multiply it from the left as a row vector p^T (a (1 x 4) matrix), but then you would have to test a row and column vector for equality, which just does not work.

Yes, it results in an n x m matrix. The reason the 1 x n vector flips its orientation to n x 1 is that the n x m matrix transposes to m x n. Because m x n * n x 1 = m x 1 is supposed to equal(?!) 1 x n * n x k * k x m = 1 x n * n x m = 1 x m. Is that not the situation here?

I will quote again: "but then you would have to test a row and column vector for equality, which just does not work." I am comparing vectors of the same orientation (dimensions).
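As an aside, the dimension rule the objection cites is easy to verify mechanically. Here is a minimal numpy sketch (numpy and its @ operator are my illustration, not something from the thread):

```python
import numpy as np

# (n x k) @ (k x m) -> (n x m): the inner dimensions must match.
A = np.ones((2, 3))       # 2 x 3
B = np.ones((3, 5))       # 3 x 5
print((A @ B).shape)      # (2, 5)

# Transposing reverses both shapes, so B^T @ A^T is also defined:
print((B.T @ A.T).shape)  # (5, 2)
```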

And I think the following objection will help:

(P*V*M*p)^T = p^T*(P*V*M)^T

Notice your equations were missing the transpose that makes them true.

The formula above is true, but so, I claim, is this one:

(X*p)^T = p*X

I think so because the following is true of the formula:

(1 x n * n x k)^T = k x n * n x 1 (a disputable point) [EDIT] Successfully disputed; I am now aware that vectors cannot be compared for equality if they have different dimensions (even if only the order of the dimensions differs).

In case of confusion about why X and p are transposed and in the opposite order, I will invoke the rule of matrix multiplication: they are transposed simply because they changed order within the multiplication operator.

I will support the disputable point with the following true statement of linear algebra, which is equivalent to (not just implied by) my disputable point:


(A*B)^T = B^T * A^T = B * A

This is true because if we suppose the right-hand equation does not equal the left one, then (B^T * A^T)^T = A*B =!= ((A*B)^T)^T.

[EDIT] The above is nonsense, stemming from the fact that a vector cannot be "premultiplied" (see the n x k * k x m condition required for matrix multiplication), while I thought it could because I accepted the formula v*A = A^T*v (which does not hold at all in mathematics).

That is why I rewrote

(X*p)^T = p*X

to

(1 x n * n x k)^T = k x n * n x 1

because if it were (1 x n * n x k)^T = n x k * 1 x n (only the order changed, not the orientation), then

(X*p)^T = p^T*X^T =!= p*X; what would be true is (1 x n * n x k)^T = k x n * n x 1 =!= n x k * 1 x n.

That is why I compared matrices of the same dimensions thanks only to switching the order in the multiplication, without keeping the transpose relation. If I had kept the transpose relation, then yes, I would still be comparing matrices of the same dimensions.

Thus, by all this, to avoid falling for the non-relation A^T * B^T = B * A, we can write about the original formulas:

(P*V*M*p)^T = p^T*(P*V*M)^T = p*(P*V*M) =!= (M*V*P)*p = P*(V*(M*p))

becouse that would mean that

(A*B)^T = A^T * B

My conclusion is that I believe I am comparing matrices of the same dimensions in the original proof, and also that the formulas are not true by the purely mathematical definition of the multiplication operation between matrices and of the transpose operation's relation to it.


If you can find a case where both sides are not equal then you're right, you don't need all that algebra to prove that something is false. If you want to prove it's true then you do need the algebra, but it looks like you're pretty convinced that it's false, so I would go that way first in your place.

Anyway, matrix multiplication is associative, (AB)C = A(BC), and you can find proofs all over the internet or in any basic algebra book. For example, look here: https://proofwiki.org/wiki/Matrix_Multiplication_is_Associative
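To complement the proof link, here is a minimal numerical check of the disputed equality (a sketch using numpy and random 4x4 matrices, which is my choice of tooling, not something from the thread):

```python
import numpy as np

rng = np.random.default_rng(0)
P, V, M = (rng.random((4, 4)) for _ in range(3))
p = rng.random((4, 1))        # column vector

lhs = P @ (V @ (M @ p))       # grouped right to left
rhs = ((P @ V) @ M) @ p       # grouped left to right
print(np.allclose(lhs, rhs))  # True: matrix multiplication is associative
```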


a helping true algebraic statement:
P*V*M*p = p*M*V*P (because P*V*M = (M*V*P)^T)

You're starting off from an incorrect assumption. (P*V*M*p)^T = p^T*(P*V*M)^T = p^T*M^T*V^T*P^T != p*M*V*P.

The key here is that (B^T)(A^T) does not equal BA.

If you claim that Transpose(A*B) = B * A, you just proved every square matrix is equal to its transpose, by letting either A or B be the identity.
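To make that concrete, here is a small numpy sketch (my illustration, not from the thread) checking the actual transpose rule against the claimed one:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.random((4, 4))
B = rng.random((4, 4))

print(np.allclose((A @ B).T, B.T @ A.T))  # True: the correct rule
print(np.allclose((A @ B).T, B @ A))      # False in general

# With B = identity, the claimed rule would force A to be symmetric:
I = np.eye(4)
print(np.allclose((A @ I).T, I @ A))      # False unless A equals A.T
```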
"Most people think, great God will come from the sky, take away everything, and make everybody feel high" - Bob Marley
Anyway, your confusion is caused by not being rigorous when it comes to treating a vector as a matrix. Let's say M is a 4x4 matrix. Let p be a 4d vector. If p is a column vector, then p * M is not defined, but M * p is, since you can't multiply a 4x1 matrix by a 4x4 matrix. You seem to be transposing column vectors as you see fit with no justification.
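Here is a minimal sketch of that point, assuming numpy's @ operator stands in for matrix multiplication:

```python
import numpy as np

M = np.arange(16.0).reshape(4, 4)  # 4 x 4 matrix
p = np.arange(4.0).reshape(4, 1)   # 4 x 1 column vector

print((M @ p).shape)  # (4, 1): defined, a column vector again
try:
    p @ M             # (4 x 1) @ (4 x 4): inner dimensions 1 != 4
except ValueError as err:
    print("p @ M is not defined:", err)
```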
"Most people think, great God will come from the sky, take away everything, and make everybody feel high" - Bob Marley

Well, that was my argument to him in the original thread, but as he now says:

I will quote again: "but then you would have to test a row and column vector for equality, which just does not work." I am comparing vectors of the same orientation (dimensions).

which has me a bit at a loss for words.

Indeed. Row major vs column major doesn't affect matrices and their associativity, it's just a decision about how you represent vectors. You can't chop and change how you represent vectors to make undefined equations "work".
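For illustration, the two conventions look like this in a numpy sketch (numpy is my assumption; the point is that each convention is internally consistent and the results differ only by a transpose):

```python
import numpy as np

rng = np.random.default_rng(2)
P, V, M = (rng.random((4, 4)) for _ in range(3))
p_col = rng.random((4, 1))  # column-vector convention: p' = P V M p
p_row = p_col.T             # row-vector convention:    p' = p M^T V^T P^T

col_result = P @ V @ M @ p_col
row_result = p_row @ M.T @ V.T @ P.T

# Same numbers either way; the convention just has to be used consistently.
print(np.allclose(col_result, row_result.T))  # True
```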
"Most people think, great God will come from the sky, take away everything, and make everybody feel high" - Bob Marley

If you claim that Transpose(A*B) = B * A, you just proved every square matrix is equal to its transpose, by letting either A or B be the identity.

You mean (A*I)^T = I*A <=> A^T = A?

So a square matrix would equal its transpose if vector matrices could be premultiplied. This cannot be performed in linear algebra because of the m x k * k x n requirement?

I think you were right then, that it holds in OpenGL only because it does not distinguish row and column vectors. And I have also spotted a mistake in the claim you quoted.

That's why I was careful to say "square matrices".

The problem is you are using the vector p as a matrix and transposing it as you see fit and not checking that the multiplication is even defined.
"Most people think, great God will come from the sky, take away everything, and make everybody feel high" - Bob Marley

The problem is you are using the vector p as a matrix and transposing it as you see fit and not checking that the multiplication is even defined.

Yes, I believed that if the multiplication order shifts, then the dimensions of a vector matrix shift, but the dimensions of the other matrices do not.

But that is not possible, since x^T =!= x. What I find strange, though: the dot product of x and x^T is defined, which would suggest their subtraction is defined, yet the subtraction of x and x^T can never be a zero vector.
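As a side note on that last point: the product x^T * x being defined does not make x - x^T defined, because multiplication only needs matching inner dimensions while subtraction needs identical shapes. A quick numpy sketch (worth knowing that numpy's broadcasting actually masks the shape error rather than raising it):

```python
import numpy as np

x = np.arange(4.0).reshape(4, 1)  # column vector

print((x.T @ x).shape)  # (1, 1): the dot product is defined

# In linear algebra, x - x.T is undefined (4 x 1 vs 1 x 4).
# numpy silently broadcasts it to 4 x 4 instead of raising an error:
print((x - x.T).shape)  # (4, 4): not a vector at all
```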

Though this kind of modification is performed in HLSL/GLSL operations, it is not the mathematical theory of linear algebra, so I was wrong.
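For reference, GLSL does define both M * v and v * M: the second form treats v as a row vector, which amounts to multiplying by the transpose of M rather than reordering alone. A numpy sketch of the same idea (numpy stands in for the shader code here):

```python
import numpy as np

rng = np.random.default_rng(3)
M = rng.random((4, 4))
v = rng.random(4)  # plain 1-D vector, as in shading languages

# "M * v" (column convention) vs "v * M" (row convention):
print(np.allclose(v @ M, M.T @ v))  # True: v*M is really transpose(M)*v
print(np.allclose(v @ M, M @ v))    # False in general
```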

This topic is closed to new replies.
