
# Prove that every non-zero vector u in R^n is an element of some orthogonal basis.


10 replies to this topic

### #1warnexus  Prime Members   -  Reputation: 1197


Posted 11 December 2013 - 09:14 AM

I need a verification of my proof for the above question.

Proof:

Vectors are orthogonal or perpendicular if their dot product is zero.

Suppose a vector u has components (u1, u2, ..., un), at least one of which is non-zero, where each component can be positive or negative, and a vector v likewise has components (v1, v2, ..., vn), at least one of which is non-zero.

Then the dot product u1v1 + u2v2 + ... + unvn = 0 if the components are inverse identity of each other.

Edited by warnexus, 11 December 2013 - 09:23 AM.

### #2Álvaro  Crossbones+   -  Reputation: 9952


Posted 11 December 2013 - 09:25 AM

No, you didn't prove anything, as far as I can tell.

### #3warnexus  Prime Members   -  Reputation: 1197


Posted 11 December 2013 - 09:48 AM

> No, you didn't prove anything, as far as I can tell.

I'm having trouble with proofs. May I use the definitions to back up my claim?

### #4Álvaro  Crossbones+   -  Reputation: 9952


Posted 11 December 2013 - 09:53 AM

Of course you may.

But I don't see the overall structure of your proof at all. What I mean by that is that I expect your proof to say something like "given a non-zero vector v, this is a procedure to find an orthogonal basis that contains it", or "if we had a non-zero vector v that weren't part of any orthogonal basis, the following deduction would lead to a contradiction".

### #5warnexus  Prime Members   -  Reputation: 1197


Posted 11 December 2013 - 10:16 AM

Let's see if a more concrete example helps: take the two vectors (1, -1, 1) and (-1, 1, 2).

Their dot product is (1)(-1) + (-1)(1) + (1)(2) = -1 + (-1) + 2 = 0. The dot product is zero, so we know the two vectors are orthogonal (perpendicular) to each other.
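The arithmetic above can be checked mechanically. Here is a minimal Python sketch; the helper name `dot` is my own, not from the thread:

```python
# Verify that (1, -1, 1) and (-1, 1, 2) have dot product zero,
# i.e. that the two example vectors are orthogonal.
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

u = (1, -1, 1)
v = (-1, 1, 2)
print(dot(u, v))  # 0
```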

Edited by warnexus, 11 December 2013 - 10:19 AM.

### #6Pink Horror  Members   -  Reputation: 793


Posted 11 December 2013 - 10:43 AM

1) Isn't there some rule against homework?

2) It's been a while since I've had to do anything like this, but if you use the "components" of a vector, I believe you're already counting on the fact it has an orthogonal basis, so I don't think you can prove it has one by using its components for anything. Conceptually, I think you're supposed to be proving that every vector has meaningful components.

Edit: Oh, it's in "R^n" already, so you're starting with components, so never mind the above. As I've said, it's been a while. I suppose the only part left would be, given some n, to lay out what the basis is:

Something like: for R^n, I can construct n vectors of the form (0, 0, ..., 1, ..., 0, 0). Each of these is orthogonal to the others, and any vector in R^n can be constructed from them.
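Pink Horror's construction can be sketched as follows. Note that it produces the standard basis, which in general does not contain the given vector u, so on its own it shows that R^n has an orthogonal basis rather than proving the full theorem. The helper names are illustrative:

```python
# Build the standard basis e_1, ..., e_n of R^n: e_i has a 1 in
# position i and 0 elsewhere. These are pairwise orthogonal, and any
# vector (x_1, ..., x_n) equals x_1*e_1 + ... + x_n*e_n.
def standard_basis(n):
    return [[1 if j == i else 0 for j in range(n)] for i in range(n)]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

basis = standard_basis(3)
# Every distinct pair of basis vectors has dot product zero.
print(all(dot(basis[i], basis[j]) == 0
          for i in range(3) for j in range(3) if i != j))  # True
```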

Edited by Pink Horror, 11 December 2013 - 11:23 AM.

### #7King of Men  Members   -  Reputation: 391


Posted 11 December 2013 - 11:06 AM

You assert that, given a vector u, there exists an orthogonal vector v if "the components are inverse identity of each other", which is not meaningful - what is an "inverse identity"? Perhaps you meant "inverse" (or "negative inverse"), but that still wouldn't be true, as demonstrated by your own concrete example, in which the components are not inverses. In addition, you have not shown that the vector with "inverse identity" components exists, only that if it exists then there is an orthogonal vector. (Perhaps, if you clarified what the "inverse identity" is supposed to be, it would be obvious how to construct it, and then you would at least have demonstrated that the orthogonal vector exists.) But even if your assertion were accurate, it would not demonstrate that u and v form a basis, only that an orthogonal vector exists.

Back to the drawing board.

Edited by King of Men, 11 December 2013 - 11:09 AM.

To win one hundred victories in one hundred battles is not the acme of skill. To subdue the enemy without fighting is the acme of skill.

### #8warnexus  Prime Members   -  Reputation: 1197


Posted 11 December 2013 - 11:11 AM

> 1) Isn't there some rule against homework?
>
> 2) It's been a while since I've had to do anything like this, but if you use the "components" of a vector, I believe you're already counting on the fact it has an orthogonal basis, so I don't think you can prove it has one by using its components for anything. Conceptually, I think you're supposed to be proving that every vector has meaningful components.

I just need some direction with it. I already provided my thought process.

### #9Álvaro  Crossbones+   -  Reputation: 9952


Posted 11 December 2013 - 03:26 PM

Reread my previous post: you need an overall structure of the argument that has the theorem as its conclusion. What you wrote so far says something about two vectors, which has nothing to do with the statement of the theorem.

### #10Paradigm Shifter  Crossbones+   -  Reputation: 4755


Posted 11 December 2013 - 04:01 PM

Yeah, and I think constructing the basis is the way to go: you have to construct a set of linearly independent orthogonal vectors which span the space.

It's Gram-Schmidt with orthonormality relaxed to just orthogonal, innit?

EDIT: Although with Gram-Schmidt you start with a set of vectors which span the space IIRC.

Edited by Paradigm Shifter, 11 December 2013 - 04:11 PM.

"Most people think, great God will come from the sky, take away everything, and make everybody feel high" - Bob Marley

### #11Álvaro  Crossbones+   -  Reputation: 9952


Posted 11 December 2013 - 06:04 PM

Since Paradigm Shifter has given away the main idea, this is how I would do it.

We'll build the basis B incrementally, thusly:

```
Initialize B = {v}.
while (the span of B is not the whole space) {
    Let w be a vector that is not in the span of B.
    Let u be the orthogonal projection of w onto the span of B.
    The vector w - u is then perpendicular to the span of B,
    so it is orthogonal to all the vectors in B. Add w - u to B.
}
```

This procedure will stop after n-1 steps if we are doing this in R^n. At the end, B will be an orthogonal basis that contains v.
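Here is a minimal Python sketch of the procedure above, under the simplifying assumption that the candidate vectors w are drawn from the standard basis (any spanning set would do). All helper names (`dot`, `sub`, `scale`, `extend_to_orthogonal_basis`) are my own, not from the thread:

```python
# Extend a non-zero vector v to an orthogonal basis of R^n by
# repeatedly subtracting orthogonal projections, as described above.
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def sub(u, v):
    return [a - b for a, b in zip(u, v)]

def scale(c, u):
    return [c * a for a in u]

def extend_to_orthogonal_basis(v):
    """Return an orthogonal basis of R^n whose first vector is v (v non-zero)."""
    n = len(v)
    basis = [list(v)]
    for i in range(n):
        w = [1 if j == i else 0 for j in range(n)]  # i-th standard basis vector
        # Subtract the orthogonal projection of w onto span(basis).
        for b in basis:
            w = sub(w, scale(dot(w, b) / dot(b, b), b))
        # If a non-zero remainder survives, it is orthogonal to every
        # vector already in the basis; add it.
        if any(abs(x) > 1e-12 for x in w):
            basis.append(w)
    return basis

B = extend_to_orthogonal_basis([1, -1, 1])
print(len(B))  # 3
```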

If you know about Gram-Schmidt orthogonalization, you can do it with a shorter description, like this:

```
Initialize B = {v}.
while (the span of B is not the whole space) {
    Let w be a vector that is not in the span of B. Add w to B.
}
Use the Gram-Schmidt procedure to turn B into an orthogonal basis
whose first vector is v.
```
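The shorter variant can be sketched the same way: extend {v} with a spanning set, then orthogonalize, discarding any vector whose remainder is (numerically) zero. Again, the helper names are illustrative, not from the thread:

```python
# Gram-Schmidt with orthonormality relaxed to orthogonality: process a
# list of vectors, subtracting projections onto the vectors kept so far
# and dropping linearly dependent ones.
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def sub(u, v):
    return [a - b for a, b in zip(u, v)]

def scale(c, u):
    return [c * a for a in u]

def gram_schmidt(vectors):
    """Orthogonalize `vectors`, keeping only independent directions."""
    ortho = []
    for w in vectors:
        for b in ortho:
            w = sub(w, scale(dot(w, b) / dot(b, b), b))
        if any(abs(x) > 1e-12 for x in w):
            ortho.append(w)
    return ortho

v = [1, -1, 1]
# {v} followed by the standard basis certainly spans R^3.
spanning = [v] + [[1 if j == i else 0 for j in range(3)] for i in range(3)]
B = gram_schmidt(spanning)
print(len(B))  # 3; the first vector of B is v itself
```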

EDIT: With a little bit of work, I am sure one can generalize the theorem to any inner product space, even ones of infinite dimension.

Edited by Álvaro, 11 December 2013 - 06:07 PM.
