

Is this multiplication formula in linear algebra true?


12 replies to this topic

#1 JohnnyCode   Members   -  Reputation: 578

-3 Likes

Posted 24 June 2014 - 10:44 AM

There is a thread containing an equation I believe is not true. I am quite sure it is false, and I would like to hear opposing arguments proving that it does hold.

 

Thanks. The original thread is here, but you do not need to look at it (just for curiosity): http://www.gamedev.net/topic/657986-why-does-this-matrix-multiplication-order-matter/

 

I will begin:

 

It stemmed originally from the formula

projection * (view * (model * position))=projection * view * model * position

 

where position is a 4-dimensional vector and the rest are 4x4 matrices.

 

I have narrowed it down to

P*(V*(M*p)) = P*V*M*p, where p is a vector (a one-column or one-row matrix) and P, V, M are matrices that can be multiplied.
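The disputed identity can be checked numerically. A minimal NumPy sketch (with random 4x4 matrices and a random 4x1 column vector; the values are illustrative, not from the thread):

```python
import numpy as np

# Evaluate both groupings of the disputed identity on random inputs.
rng = np.random.default_rng(0)
P, V, M = (rng.random((4, 4)) for _ in range(3))
p = rng.random((4, 1))            # column vector, shape (4, 1)

lhs = P @ (V @ (M @ p))           # fully parenthesized grouping
rhs = (P @ V @ M) @ p             # left-to-right grouping

print(np.allclose(lhs, rhs))      # True: the two groupings agree
```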

 

Now I provide a proof, which I will just copy-paste here without modification, since it is sufficient:

 

 

claim: P*V*M*p = P*(V*(M*p)), where p is a vector (a matrix of dimensions [x,1] or [1,x]) and P, V, M are matrices that can be multiplied

 

a helping true algebraic statement:

P*V*M*p = p*M*V*P  (because P*V*M = (M*V*P)^T)

 

I will now show that the claim contradicts the true statement, if the claim is assumed to be true:

 

P*V*M*p = P*(V*(M*p))  =>  P*(V*(M*p)) = ((p*M)*V)*P

 

Do you agree that P*(V*(M*p)) = ((p*M)*V)*P? I just used associativity in the helping true statement, but it yields

 

P*V*(M*p) = (p*M)*V*P       (multiply both sides by P^-1 and V^-1)

 

M*p = p*M   (transposed result)

 

The claim would be right if p were not present; it would yield M = M.

 

So the conclusion is that P*(V*(M*p)) equals p*P*V*M, and not P*V*M*p.

 

In words: a column vector times M times V times P equals a row vector times transpose(M times V times P).

 

thus

P*(V*(M*p)) = p*P*V*M

and this means

P*(V*(M*p)) =!= P*V*M*p

 

The following objection I received does not, I think, negate any step of my proof, and I would like to show why.

 

How is that even supposed to work? A matrix multiplication M1 * M2 is only defined if M1 is a (n x k) matrix and M2 is a (k x m) matrix. The result is then a (n x m) matrix. One side effect of this is that you can transpose M, V and P but the only place the vector (a (4 x 1) matrix) can be is at the right-most side. You could multiply it from the left as a row-vector p^T (a (1 x 4) matrix) but then you would have to test a row and column vector for equality, which just does not work.

Yes, it results in an n x m matrix. The reason the 1 x n vector flips its orientation to n x 1 is that the n x m matrix transposes to m x n. Because m x n * n x 1 = m x 1, which equals(?!) 1 x n * n x k * k x m = 1 x n * n x m = 1 x m. Is that not the situation here?

 

I would cite now: "but then you would have to test a row and column vector for equality, which just does not work." I am comparing vectors of the same orientation (dimensions).

 

And I think the following objection will help:

 

 

(P*V*M*p)^T = p^T*(P*V*M)^T

 notice your equations were missing the transpose bit that makes them true

 

The upper formula is true, and, I claim, so is this one:

(X*p)^T = p*X

I think so, because the following is true about the formula:

 

(1 x n * n x k)^T = k x n * n x 1   (disputable point) [Edit] Successfully refuted; I am now aware that vectors cannot be examined for equality if they have different dimensions (even if only the order of those dimensions differs).

 

I will invoke here the rule of matrix multiplication in case of a dispute over why X and p are transposed and in opposite order merely because they changed order within the multiply operator.

 

I will support the disputable point with the following statement of linear algebra, which is equivalent to (not merely implied by) my disputable point:

 


(AB)^T = B^T A^T = B A

This is true because, if we take the right equation to be unequal to the left one, then (B^T A^T)^T = A B =!= ((AB)^T)^T.

[EDIT] The above is nonsense, stemming from the fact that a vector cannot be "premultiplied" (see the n x k * k x m condition required to multiply matrices), while I thought it could because I accepted the formula v*A = A^T*v (which does not hold at all in mathematics).

 

That is why I rewrote

(X*p)^T = p*X

to

(1 x n * n x k)^T = k x n * n x 1

because if it were (1 x n * n x k)^T = n x k * 1 x n (only the order changed, not the orientation), then

(X*p)^T = p^T*X^T =!= p*X; what would be true is (1 x n * n x k)^T = k x n * n x 1 =!= n x k * 1 x n.

 

That is why I compared matrices of the same dimensions only thanks to switching the operands of the multiplication, without keeping the transpose relation. If I had kept the transpose relation, I would still be comparing matrices of the same dimensions.

 

Thus, by all this, to not fall for the non-relation A^T B^T = B A, we can write about the original formulas:

(P*V*M*p)^T = p^T*(P*V*M)^T = p*(P*V*M) =!= (M*V*P)*p = P*(V*(M*p))

 

because the opposite would mean that

(AB)^T = A^T B

 

 

My conclusion is that I believe I am comparing matrices of the same dimensions in the original proof, and that the formulas are not true under the purely mathematical definition of the multiplication operation between matrices and of the transpose relation to it, each of which carries a point that excludes the other.


Edited by JohnnyCode, 05 July 2014 - 06:33 AM.



#2 DiegoSLTS   Members   -  Reputation: 2085

1 Like

Posted 24 June 2014 - 10:59 AM

If you can find a case where both sides are not equal then you're right, you don't need all that algebra to prove that something is false. If you want to prove it's true then you do need the algebra, but it looks like you're pretty convinced that it's false, so I would go that way first in your place.

 

Anyway, matrix multiplication is associative, (AB)C = A(BC), and you can find proofs all over the internet or in any basic algebra book. For example, look here: https://proofwiki.org/wiki/Matrix_Multiplication_is_Associative
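The case-testing approach suggested above can be sketched as a small random search (hypothetical NumPy code, assuming 4x4 matrices as in the thread):

```python
import numpy as np

# Look for a counterexample to associativity: if (AB)C ever differed
# from A(BC), one of these random trials would expose it.
rng = np.random.default_rng(42)
for _ in range(1000):
    A, B, C = (rng.random((4, 4)) for _ in range(3))
    assert np.allclose((A @ B) @ C, A @ (B @ C))
print("no counterexample in 1000 random trials")
```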


Edited by DiegoSLTS, 24 June 2014 - 11:25 AM.


#3 Samith   Members   -  Reputation: 2395

2 Likes

Posted 24 June 2014 - 11:06 AM


a helping true algebraic statement:
P*V*M*p = p*M*V*P  (because P*V*M = (M*V*P)^T)

 

You're starting off on an incorrect assumption. PVMp = p(PVM)^T = p(M^T)(V^T)(P^T) != pMVP.

 

The key here is that (B^T)(A^T) does not equal BA.
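The distinction between the actual transpose rule and the claimed one can be illustrated with a short NumPy sketch (random matrices; illustrative only):

```python
import numpy as np

# (AB)^T equals B^T A^T; it does not, in general, equal B A.
rng = np.random.default_rng(1)
A, B = rng.random((3, 3)), rng.random((3, 3))

print(np.allclose((A @ B).T, B.T @ A.T))  # True: the transpose rule
print(np.allclose((A @ B).T, B @ A))      # False for generic A and B
```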


Edited by Samith, 24 June 2014 - 11:08 AM.


#4 Paradigm Shifter   Crossbones+   -  Reputation: 5600

6 Likes

Posted 24 June 2014 - 11:07 AM

If you claim that Transpose(A*B) = B * A, you just proved every square matrix is equal to its transpose, by letting either A or B be the identity.
"Most people think, great God will come from the sky, take away everything, and make everybody feel high" - Bob Marley

#5 Paradigm Shifter   Crossbones+   -  Reputation: 5600

1 Like

Posted 24 June 2014 - 01:55 PM

Anyway, your confusion is caused by not being rigorous when it comes to treating a vector as a matrix. Let's say M is a 4x4 matrix. Let p be a 4d vector. If p is a column vector, then p * M is not defined, but M * p is, since you can't multiply a 4x1 matrix by a 4x4 matrix. You seem to be transposing column vectors as you see fit with no justification.
"Most people think, great God will come from the sky, take away everything, and make everybody feel high" - Bob Marley
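The point about undefined products can be made concrete with a NumPy sketch (illustrative values): with p a 4x1 column vector, only M * p is defined.

```python
import numpy as np

# With p a 4x1 column vector, M @ p is defined but p @ M is not:
# (4x4)*(4x1) -> (4x1), whereas (4x1)*(4x4) has mismatched inner dimensions.
M = np.arange(16.0).reshape(4, 4)
p = np.ones((4, 1))

print((M @ p).shape)   # (4, 1)
try:
    p @ M
except ValueError as e:
    print("undefined:", e)
```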

#6 BitMaster   Crossbones+   -  Reputation: 5726

1 Like

Posted 24 June 2014 - 02:16 PM

Well, that was my argument to him in the original thread, but as he says now

I would cite now: "but then you would have to test a row and column vector for equality, which just does not work." I am comparing vectors of the same orientation (dimensions).

which has me a bit at a loss for words.



#7 Paradigm Shifter   Crossbones+   -  Reputation: 5600

1 Like

Posted 24 June 2014 - 02:24 PM

Indeed. Row major vs column major doesn't affect matrices and their associativity, it's just a decision about how you represent vectors. You can't chop and change how you represent vectors to make undefined equations "work".
"Most people think, great God will come from the sky, take away everything, and make everybody feel high" - Bob Marley
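The relationship between the two conventions can be sketched in NumPy (hypothetical random values): the row-vector form is exactly the transpose of the column-vector form, not a free reshuffling of operands.

```python
import numpy as np

# Column convention: result = M @ p. Row convention: result = p^T @ M^T.
# The two agree up to a transpose; they cannot be mixed mid-expression.
rng = np.random.default_rng(2)
M = rng.random((4, 4))
p = rng.random((4, 1))

col_result = M @ p          # shape (4, 1)
row_result = p.T @ M.T      # shape (1, 4)

print(np.allclose(row_result, col_result.T))  # True
```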

#8 JohnnyCode   Members   -  Reputation: 578

-3 Likes

Posted 24 June 2014 - 02:27 PM

If you claim that Transpose(A*B) = B * A, you just proved every square matrix is equal to its transpose, by letting either A or B be the identity.

you mean, (A*I)^T = I*A      <=>       A^T = A

 

 

A square matrix would be equal to its transpose if vector matrices could be premultiplied. This cannot be performed in linear algebra because of the m x k * k x n requirement?

I think you were right, then, that it holds for OpenGL only because it does not distinguish row and column vectors. And I have spotted a mistake in the claim you quoted.



#9 Paradigm Shifter   Crossbones+   -  Reputation: 5600

1 Like

Posted 24 June 2014 - 02:35 PM

That's why I was careful to say "square matrices".

The problem is you are using the vector p as a matrix and transposing it as you see fit and not checking that the multiplication is even defined.
"Most people think, great God will come from the sky, take away everything, and make everybody feel high" - Bob Marley

#10 JohnnyCode   Members   -  Reputation: 578

-3 Likes

Posted 24 June 2014 - 05:56 PM

The problem is you are using the vector p as a matrix and transposing it as you see fit and not checking that the multiplication is even defined.

Yes, I believed that if the multiplication order shifts, then the dimensions of a vector matrix shift, but the dimensions of the other matrices do not.

But that is not possible, since x^T =!= x. What I find strange, though: the dot product of x and x^T is defined, thus their subtraction is defined, but the subtraction of x and x^T can never be a zero vector.

 

Though this modification is performed in HLSL/GLSL operations, it is not linear-algebra mathematical theory, so I was wrong.


Edited by JohnnyCode, 05 July 2014 - 06:26 AM.


#11 Álvaro   Crossbones+   -  Reputation: 16155

5 Likes

Posted 24 June 2014 - 08:33 PM

Can we put an end to this nonsense? Go learn some math. If you have some genuine questions, have some humility and we'll be happy to help you learn. But don't come with this attitude of "I don't believe what everyone says" or "some of theory axioms [sic] are excluding others".

Does anyone else smell troll?

#12 BitMaster   Crossbones+   -  Reputation: 5726

4 Likes

Posted 25 June 2014 - 12:12 AM

But that is not possible, since x^T =!= x. What I find strange, though: the dot product of x and x^T is defined, thus their subtraction is defined, but the subtraction of x and x^T can never be a zero vector.

First, you cannot have a dot product of \(x\) and \(x^T\) (unless \(x \in \mathbb R\)). The dot product for two column vectors \(u, v\) is commonly defined as \(dot(u, v) := u^T \cdot v\). Trying \(dot(x^T, x)\) would once again be impossible, because you would have to multiply an element of \(\mathbb R^{n \times 1}\) with an element of \(\mathbb R^{n \times 1}\), which is not defined for \(n \neq 1\).
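That definition can be sketched in NumPy with explicit column vectors (illustrative values):

```python
import numpy as np

# dot(u, v) := u^T * v for column vectors u, v: (1x3)*(3x1) -> (1x1).
u = np.array([[1.0], [2.0], [3.0]])
v = np.array([[4.0], [5.0], [6.0]])

d = u.T @ v
print(d.shape)    # (1, 1)
print(d.item())   # 32.0 = 1*4 + 2*5 + 3*6
```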
 

For every vector (matrix) there exists an addition-neutral vector (matrix) Z
For every vector (matrix) there exists a multiplication-neutral vector (matrix) I such that V*I = V
For every vector (matrix) there exists a multiplication-inverse vector (matrix) J such that V*J = I

You are thinking of the ring axioms but that applies only to square matrices. You cannot even multiply general vectors together like that (as already said above), so talking about multiplicative neutral and multiplicative inverse elements for vectors does not make any sense.
In some fields (like GLSL) it can be convenient to define a component-wise vector multiplication. However, that is something completely orthogonal to standard linear algebra. The presence of this extra operator does not change anything about the theory of matrices nor does it negate associativity of matrix multiplication.
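The distinction can be sketched in NumPy (illustrative values), where `*` is component-wise, as in GLSL, and `@` is the linear-algebra product:

```python
import numpy as np

# Component-wise multiplication is a separate operator from the
# matrix/dot product; one says nothing about the other.
u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])

print(u * v)   # component-wise: elements [4., 10., 18.]
print(u @ v)   # dot product: 32.0
```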

I can only reinforce Álvaro's suggestion to "go learn some math", ideally under supervision. You have apparently heard of several concepts, but you lack the attention to detail and rigor required in the field.

Álvaro: I often smell troll on these forums but with him I'm more inclined to believe he actually believes what he is saying.

#13 JohnnyCode   Members   -  Reputation: 578

0 Likes

Posted 25 June 2014 - 07:39 AM

Can we put an end to this nonsense? Go learn some math. If you have some genuine questions, have some humility and we'll be happy to help you learn. But don't come with this attitude of "I don't believe what everyone says" or "some of theory axioms [sic] are excluding others".

Does anyone else smell troll?

I just wanted to talk about linear algebra with more competent people.
I was bringing my concerns to find out what I misunderstand, not for attitude, trolling, or fun. For that I thank you.

 

I now understand that making multiplication relate to dimension shifting, the way transposing does, is a bad idea in general, because linear algebra is a much wider discipline.

 

If I happen to have such concerns again, I will of course post them in a new topic.

 

Thank you





