
Is upper division Linear Algebra a good idea or necessary?


I have a few options for mathematics this spring and I would like your input. I have posted something like this before but now I will be a bit more specific.

 

I've taken linear algebra and did very well -- it was probably about 30-40% proofs, and the rest was practical problem solving. Now I can take the following:

  • Calc 3 (multivariable)
  • Differential Equations
  • Linear Algebra (upper div with lots of proofs) (last chance to take it also)
  • Number Theory

Which, if any or all, might best serve me for game dev, specifically graphics and/or AI? Is another proof-laden linear algebra class necessary or overkill? Thanks.


I found multivariable calculus to be far more useful than one would initially imagine. I like linear algebra, but the later material tended to be (as you mentioned) more proofs and less applicable to games.


I scored a 1 in Linear Algebra in my first semester at MatFyz (I surprised myself). Some people here argued with me over the axiom that a vector has an inverse vector, yielding the multiplicative neutral vector when the two are multiplied; they countered with the claim that there is no such thing as "multiplication of vectors", missing that multiplication and addition are operations to be defined on a space (that is how you obtain a vector space, or not).

 

A vector space over a scalar field is a space where:

- a vector added to a vector is a vector of the same space
- for each vector there exists a unique vector that, when added to it, yields the additive neutral vector (the zero vector)
- for each vector there exists a unique multiplicative inverse vector (which, when multiplied with it, yields the multiplicative neutral vector I, so that I*x = x)
- a vector multiplied by a vector is a vector of the space
- ... and a few more axioms

 

In OpenGL, multiplication is defined component-wise, i.e. v * v1 = (v.x*v1.x, v.y*v1.y, v.z*v1.z).
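To make that concrete, here is a minimal C++ sketch of the component-wise product (the Vec3 type and operator are stand-ins of my own; GLSL's built-in vec3 behaves this way for *):

#include <cstdio>

// A minimal stand-in for GLSL's vec3. In GLSL, `v * v1` on two vectors
// is component-wise, exactly as modeled here.
struct Vec3 {
    float x, y, z;
};

// Component-wise (Hadamard) product: v * v1 = (v.x*v1.x, v.y*v1.y, v.z*v1.z)
Vec3 operator*(const Vec3& a, const Vec3& b) {
    return { a.x * b.x, a.y * b.y, a.z * b.z };
}

int main() {
    Vec3 v  = { 1.0f, 2.0f, 3.0f };
    Vec3 v1 = { 4.0f, 5.0f, 6.0f };
    Vec3 p  = v * v1;                       // (4, 10, 18)
    std::printf("(%g, %g, %g)\n", p.x, p.y, p.z);
}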

 

This breaks associativity of vector/matrix multiplication in the GLSL/HLSL vector space, which I was recently unable to explain, being forced to accept the less general rules of linear algebra.

 

I have to accept this, though, since linear algebra in the US studies only 3-dimensional spaces, while at my school it studied spaces in general; I am forced to work with a much narrower definition.


Computer graphics is like 70% linear algebra, 20% calculus, 10% misc. But that doesn't necessarily mean a course in linear algebra is useful, partly because a lot of courses and books on linear algebra are !@#$ing terrible. The proofs in particular are completely pointless unless you have aspirations towards a master's degree or PhD in mathematics.


 

The people who told you "there is no such thing as multiplication of vectors" were correct. The axioms that define what a vector space is talk about sum and difference of vectors and about multiplying a vector by a scalar. You can look it up anywhere.

I am not nitpicking words; I will rephrase again: if one defines multiplication of vectors on a vector space, it will stop being a vector space if the multiplication operation breaks the axioms mentioned.

 

If it is so that vectors do not multiply and there is no rule for doing so, then why were we talking all the time about matrix multiplication being associative, a vector being a special case of a matrix, and distinguishing column and row vectors and their actual dimensions? Those are relations that exist only if multiplication of vectors is defined.

 

 

But I insist that in linear algebra generally, there are axioms about the multiplication operation if multiplication gets introduced onto a vector space, which the new space and operations must fulfill for it to remain a vector space.

 

In OpenGL, the operation is a component-wise multiply. Though it makes no sense for geometric purposes, it yields a vector space where column and row vectors are the same, implying a certain impropriety in talking about transposing a matrix, since a matrix transposes according to its order in the multiply operation (that is impossible in classical linear algebra because of the NxM requirement for multiplying matrices/vectors, which implies vectors are nx1 or 1xm; but that algebra has its own definition of the operation, which OpenGL differs from, since it will multiply vectors without their dimensions introduced). Thus OpenGL has a vector space with different properties, where multiplication is not associative, but it is a vector space.


If it is so that vectors do not multiply and there is no rule for doing so, then why were we talking all the time about matrix multiplication being associative, a vector being a special case of a matrix, and distinguishing column and row vectors and their actual dimensions? Those are relations that exist only if multiplication of vectors is defined.


I can tell you that. In modern mathematics (after Grothendieck, I believe), you don't just study objects like vector spaces: You also study the morphisms between them that preserve the structure. In the case of vector spaces, the corresponding morphisms are linear mappings(*). When we talk about matrix multiplication being associative and all those things you mention, we are looking at matrices as representing linear mappings between vector spaces. A column vector represents a vector, and a row vector represents a co-vector, which is a linear mapping from the vector space to the field of scalars. The whole thing makes perfect sense without introducing a mechanism to multiply vectors.


(*)- Other examples are sets and functions, topological spaces and continuous functions, differentiable manifolds and smooth functions, and groups and group morphisms (they are just called that). Edited by Álvaro
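As a concrete sketch of that row/column distinction, here is a minimal C++ example (the Covec3 type and its names are my own illustration, not from the thread): a covector consumes a vector and produces a scalar, which is exactly the row-times-column product, with no vector-times-vector operation anywhere.

#include <array>
#include <cstdio>

using Vec3 = std::array<double, 3>;  // a (column) vector in R^3

// A covector is a linear map R^3 -> R. Representing it by three
// coefficients, "applying" it is exactly the row-times-column product.
struct Covec3 {
    std::array<double, 3> w;
    double operator()(const Vec3& v) const {
        return w[0] * v[0] + w[1] * v[1] + w[2] * v[2];
    }
};

int main() {
    Covec3 f{{1.0, 0.0, 2.0}};   // the row vector (1 0 2)
    Vec3   v{3.0, 4.0, 5.0};     // the column vector (3 4 5)^T
    std::printf("f(v) = %g\n", f(v));  // 1*3 + 0*4 + 2*5 = 13
}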


 

looking at matrices as representing linear mappings between vector spaces.

I understand it similarly. A vector of N dimensions has a space, and a space is well defined by N vectors of N dimensions. If a vector is to functionally relate to those base vectors, it must have a multiplication relation with a vector. Then a vector can be expressed in terms of other base vectors (a transform).

 

At this point a matrix is just a set of vectors, which does not introduce further properties; the properties emerge only from the definition of the multiplication relation on vectors.

 

Of course, there is no requirement for a multiplication relation on vectors to be defined, so I know that when one says "linear algebra", there are too many subsets with almost no common ground.
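For what it's worth, here is a small 2D sketch of "expressing a vector in other base vectors" (the names b0, b1 and the Cramer's-rule helper are my own illustration; note that only scalar-times-vector and vector-plus-vector structure is used):

#include <cstdio>

struct Vec2 { double x, y; };

// Solve c0*b0 + c1*b1 = v for the coordinates (c0, c1) of v in the basis
// {b0, b1}, via Cramer's rule on the 2x2 system.
bool coordinatesIn(const Vec2& b0, const Vec2& b1, const Vec2& v,
                   double& c0, double& c1) {
    double det = b0.x * b1.y - b1.x * b0.y;   // zero iff {b0, b1} is not a basis
    if (det == 0.0) return false;
    c0 = (v.x * b1.y - b1.x * v.y) / det;
    c1 = (b0.x * v.y - v.x * b0.y) / det;
    return true;
}

int main() {
    Vec2 b0{1.0, 1.0}, b1{-1.0, 1.0};   // a non-standard basis of R^2
    Vec2 v{3.0, 5.0};
    double c0, c1;
    if (coordinatesIn(b0, b1, v, c0, c1))
        std::printf("v = %g*b0 + %g*b1\n", c0, c1);  // v = 4*b0 + 1*b1
}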


You are not a career.  Study what you're most interested in.

This. You have a rare opportunity; make the most of it. You'll have time later in life to study things you're not interested in just to make deadlines and/or practical targets.


In OpenGL, the operation is a component-wise multiply. Though it makes no sense for geometric purposes, it yields a vector space where column and row vectors are the same, implying a certain impropriety in talking about transposing a matrix, since a matrix transposes according to its order in the multiply operation (that is impossible in classical linear algebra because of the NxM requirement for multiplying matrices/vectors, which implies vectors are nx1 or 1xm; but that algebra has its own definition of the operation, which OpenGL differs from, since it will multiply vectors without their dimensions introduced). Thus OpenGL has a vector space with different properties, where multiplication is not associative, but it is a vector space.

OpenGL uses the same data type for several different things, not just geometric vectors. In particular, it uses them for colors. The component-wise multiplication is in fact very useful in this case. Code implementations of mathematical concepts are rarely completely equivalent to their theoretical version.
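A minimal C++ sketch of that use case (the Color type and modulate function are my own illustration; in GLSL this would simply be `surface * light` on two vec3 colors):

#include <cstdio>

struct Color { float r, g, b; };  // RGB channels in [0, 1]

// Component-wise multiply: the operation GLSL's `*` performs on two
// vectors. For colors it means "filter each channel independently".
Color modulate(const Color& surface, const Color& light) {
    return { surface.r * light.r, surface.g * light.g, surface.b * light.b };
}

int main() {
    Color texel = { 1.0f, 0.5f, 0.25f };  // surface color sampled from a texture
    Color light = { 0.8f, 0.8f, 0.6f };   // warm light color
    Color lit   = modulate(texel, light); // (0.8, 0.4, 0.15)
    std::printf("(%g, %g, %g)\n", lit.r, lit.g, lit.b);
}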


The people who told you "there is no such thing as multiplication of vectors" were correct. The axioms that define what a vector space is talk about sum and difference of vectors and about multiplying a vector by a scalar. You can look it up anywhere.

I am not nitpicking words; I will rephrase again: if one defines multiplication of vectors on a vector space, it will stop being a vector space if the multiplication operation breaks the axioms mentioned.
 
If it is so that vectors do not multiply and there is no rule for doing so, then why were we talking all the time about matrix multiplication being associative, a vector being a special case of a matrix, and distinguishing column and row vectors and their actual dimensions? Those are relations that exist only if multiplication of vectors is defined.


When we speak about matrix multiplication in the context of graphics programming, we are speaking practically almost always about endomorphisms only (that is, mappings from a vector space \(V\) into \(V\) itself). The matrices are in this case always square. Square matrices form a ring, which includes the properties you are alluding to (like some matrices having a multiplicative inverse or associativity).
These are properties of square matrices only, not any matrix. Again, linear algebra only defines a multiplication between matrices for:
\( \cdot: \mathbb R^{n \times m} \times \mathbb R^{m \times k} \rightarrow \mathbb R^{n \times k}\)
It should be obvious it is not possible to multiply just any matrices (or vectors if we chose to model them as a special case of matrix) together any way you please. It should also be noted that matrix multiplication is always associative (that is \( (A\cdot B)\cdot C = A\cdot (B\cdot C) \)) provided the dimensions of the matrices allow that multiplication in the first place. It is left as an exercise to the reader to show that the associativity does not interfere with the restricted multiplication operator above. Edited by BitMaster
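A minimal C++ sketch of both points (the Mat type, mul function, and the specific matrices are my own illustration): the multiply asserts that the inner dimensions agree, mirroring \( \cdot: \mathbb R^{n \times m} \times \mathbb R^{m \times k} \rightarrow \mathbb R^{n \times k}\), and the test checks \((AB)C = A(BC)\) on a case where all products are defined.

#include <cassert>
#include <cstdio>
#include <vector>

// A dense n-by-m matrix in row-major storage. The multiply below only
// accepts operands whose inner dimensions agree.
struct Mat {
    int n, m;                  // rows, columns
    std::vector<double> a;
    double& at(int i, int j) { return a[i * m + j]; }
    double  at(int i, int j) const { return a[i * m + j]; }
};

Mat mul(const Mat& A, const Mat& B) {
    assert(A.m == B.n && "inner dimensions must agree");
    Mat C{A.n, B.m, std::vector<double>(A.n * B.m, 0.0)};
    for (int i = 0; i < A.n; ++i)
        for (int k = 0; k < A.m; ++k)
            for (int j = 0; j < B.m; ++j)
                C.at(i, j) += A.at(i, k) * B.at(k, j);
    return C;
}

int main() {
    // A is 2x3, B is 3x2, C is 2x2: both (AB)C and A(BC) are defined.
    Mat A{2, 3, {1, 2, 3,  4, 5, 6}};
    Mat B{3, 2, {7, 8,  9, 10,  11, 12}};
    Mat C{2, 2, {1, 0,  0, 1}};
    Mat lhs = mul(mul(A, B), C);   // (A*B)*C
    Mat rhs = mul(A, mul(B, C));   // A*(B*C)
    for (int i = 0; i < lhs.n * lhs.m; ++i)
        assert(lhs.a[i] == rhs.a[i]);   // associativity holds
    std::printf("(AB)C == A(BC)\n");
}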


 

They're all useful. Number theory tends to be a little more esoteric, but is useful for understanding things like the minimum number of bits needed to store (complex) information, or how much accuracy you can rely on maintaining through a series of floating-point operations (and strategies to maximize accuracy by ordering operations differently).
 


The exact course contents vary between colleges/universities, but usually number theory is about integers, e.g., prime numbers and such. In my university, a course that deals with floating-point approximations of real numbers and the like is named Numerical Analysis.

 

 

That is the usual definition of number theory: the study of the properties of integers. As such it is useful in cryptography, but not very useful for games. Numerical Analysis (also called Numerical Methods) is useful for physics programming and the like.
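As a toy illustration of the ordering point raised in the quoted post above (the constants here are contrived, a minimal C++ sketch): in float precision, adding small values to a large one first versus last gives different answers.

#include <cstdio>

int main() {
    // Adding a tiny value to a huge one loses the tiny value entirely in
    // float precision, so the order of operations changes the result.
    float big = 1.0e8f, tiny = 1.0f;

    // Large-first: each `tiny` is swallowed by `big` and vanishes.
    float a = big;
    for (int i = 0; i < 100; ++i) a += tiny;

    // Small-first: the tiny values accumulate before meeting `big`.
    float b = 0.0f;
    for (int i = 0; i < 100; ++i) b += tiny;
    b += big;

    std::printf("large-first: %.1f\nsmall-first: %.1f\n", a, b);
    // Typical IEEE-754 result: large-first loses the added 100 entirely.
}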


 

which includes the properties you are alluding to (like some matrices having a multiplicative inverse or associativity).
These are properties of square matrices only, not any matrix. Again, linear algebra only defines a multiplication between matrices for:

I pointed out the rules only for multiplication of a vector; that does not automatically imply rules about matrices. There is no need for matrices to be invertible in order to have a multiplication operation on a vector space.

 

The particular multiplication of a vector defined in linear algebra considers a vector as 2-dimensional, nx1 or 1xn, and the multiplication operation thus follows that more complex pattern. But you can validly define a multiplication operation that is not like that (as in GLSL or HLSL) and still have a vector space, making the multiplication of vectors commutative.

 

You surely agree, then, that plenty of differences arise in morphisms and transformations as well. Yes, the multiplication of matrices is not associative then, with multiplication of matrices understood as F(L(D(v))), where F, L, D are linear functions of the vector v (linear functions can be multiplied into one function).


 

They're all useful. Number theory tends to be a little more esoteric, but is useful for understanding things like the minimum number of bits needed to store (complex) information, or how much accuracy you can rely on maintaining through a series of floating-point operations (and strategies to maximize accuracy by ordering operations differently).
 


The exact course contents vary between colleges/universities, but usually number theory is about integers, e.g., prime numbers and such. In my university, a course that deals with floating-point approximations of real numbers and the like is named Numerical Analysis.

 

Yes, you're right, I was mixed up.


 

Linear Algebra (upper div with lots of proofs) (last chance to take it also)

Take this. You will examine questions down to their reasoning, with proofs that are honest (rare), and if you do, you will master plenty. I am not smart myself (whether or not I have strengths in a smart field), but I handled it and it catalyzed me to master everything (except the paranormal world we live in).

 

Linear Algebra is fair and flexibly powerful, making you able to approximate the paranormal real world. Linear Algebra is simple and honest and passionate.

 

(I think I need a poem about linear algebra; it is an orgastic discipline, dive into it. It is not hard, while being the hardest.)

Edited by JohnnyCode


The particular multiplication of a vector defined in linear algebra considers a vector as 2-dimensional, nx1 or 1xn, and the multiplication operation thus follows that more complex pattern. But you can validly define a multiplication operation that is not like that (as in GLSL or HLSL) and still have a vector space, making the multiplication of vectors commutative.

As the last time we talked about this: you can certainly add additional stuff (like operators) to a vector space. You might even be able to do more with it then (although the polite way in mathematics is to give it its proper name), but that doesn't suddenly give all vector spaces that property.

Even so, if I remember correctly, the last time this discussion came up you needed something like getting rid of a vector \(v\) on the right in an expression \(Av\) for a suitable matrix \(A\). Even if you have a product \(*: V \times V \rightarrow V\), that does not work: first, not every vector has an inverse element regarding \(*\). Second, even if you had the expression \(Ae\) (where \(e\) is the identity element regarding \(*\)), you cannot make the \(e\) vanish. It is the multiplicative neutral element for \(*\), not for (matrix, vector) multiplication. The result of \(Ae\) is a vector, not another matrix, and definitely not \(A\) itself (except in the degenerate case of one-dimensional vector spaces).
 

You surely agree, then, that plenty of differences arise in morphisms and transformations as well. Yes, the multiplication of matrices is not associative then, with multiplication of matrices understood as F(L(D(v))), where F, L, D are linear functions of the vector v (linear functions can be multiplied into one function).

I have no clue what you are talking about. Function composition is always associative. You don't even need linear functions for that, it's a basic property. Changing the representation to \((F \circ L \circ D)(x)\) instead of matrices has no influence on that, you still have \((F \circ L) \circ D = F \circ (L \circ D)\).
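A minimal C++ sketch of that property (the compose helper and the example functions F, L, D are my own, chosen to echo the names above; note that L is affine rather than linear, which is fine since associativity needs no linearity):

#include <cstdio>
#include <functional>

using Fn = std::function<double(double)>;

// Composition: (f . g)(x) = f(g(x)). Associativity of composition is a
// basic property of functions; linearity is not needed.
Fn compose(Fn f, Fn g) {
    return [f, g](double x) { return f(g(x)); };
}

int main() {
    Fn F = [](double x) { return 2.0 * x; };   // scaling
    Fn L = [](double x) { return x + 3.0; };   // translation (affine, not linear)
    Fn D = [](double x) { return -x; };        // reflection

    Fn lhs = compose(compose(F, L), D);   // (F . L) . D
    Fn rhs = compose(F, compose(L, D));   // F . (L . D)

    for (double x : {0.0, 1.0, -2.5})
        std::printf("x=%g: %g == %g\n", x, lhs(x), rhs(x));
}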
