Prove that every non-zero vector u in R^n is an element of some orthogonal basis.

Started by
9 comments, last by alvaro 10 years, 4 months ago

I need a verification on my proof for the above question.

Proof:

Vectors are orthogonal or perpendicular if their dot product is zero.

Suppose a vector u with components (u1, u2, ..., un) has at least one non-zero component, where these components can be positive or negative, and a vector v with components (v1, v2, ..., vn) also has at least one non-zero component, where these components are positive or negative.

Then the dot product u1v1 + u2v2 + ... + unvn = 0 if the components are inverse identity of each other.


No, you didn't prove anything, as far as I can tell.


I'm having trouble with proofs. May I use the definitions to back up my claim?

Of course you may.

But I don't see the overall structure of your proof at all. What I mean by that is that I expect your proof to say something like "given a non-zero vector v, this is a procedure to find an orthogonal basis that contains it", or "if we had a non-zero vector v that weren't part of any orthogonal basis, the following deduction would lead to a contradiction".

Let's see if we can make this more concrete with an example: take two given vectors (1,-1,1) and (-1,1,2).

Their dot product is (1)(-1) + (-1)(1) + (1)(2) = -1 - 1 + 2 = 0. Since the dot product is zero, we know the two vectors are orthogonal (perpendicular) to each other.
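As a quick numeric check of that arithmetic (just a sketch, assuming NumPy is available):

```python
import numpy as np

u = np.array([1, -1, 1])
v = np.array([-1, 1, 2])

# (1)(-1) + (-1)(1) + (1)(2) = -1 - 1 + 2 = 0
print(np.dot(u, v))  # prints 0, so u and v are orthogonal
```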

1) Isn't there some rule against homework?

2) It's been a while since I've had to do anything like this, but if you use the "components" of a vector, I believe you're already counting on the fact that it has an orthogonal basis, so I don't think you can prove it has one by using its components for anything. Conceptually, I think you're supposed to be proving that every vector has meaningful components.

Edit: Oh, it's in "R^n" already, so you're starting with components, so never mind the above. As I've said, it's been a while. I suppose the only part left would be, given some n, to lay out what the basis is:

Something like: for R^n, I can construct n vectors of the form (0, 0, 0, ..., 1, 0, 0, 0, ...). Each of these is orthogonal to all the others, and any vector in R^n can be constructed from them.
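A small sketch of that construction (assuming NumPy; the helper name standard_basis is made up for illustration):

```python
import numpy as np

def standard_basis(n):
    # e_i has a 1 in position i and 0 everywhere else
    return [np.eye(n)[i] for i in range(n)]

n = 4
basis = standard_basis(n)

# Pairwise orthogonal: e_i . e_j = 0 whenever i != j
assert all(np.dot(basis[i], basis[j]) == 0
           for i in range(n) for j in range(n) if i != j)

# Any vector in R^n is a combination of them: x = sum_i x_i * e_i
x = np.array([3.0, -1.0, 2.0, 5.0])
assert np.allclose(x, sum(x[i] * basis[i] for i in range(n)))
```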

You assert that, given a vector u, there exists an orthogonal vector v if "the components are inverse identity of each other", which is not meaningful - what is an "inverse identity"? Perhaps you meant "inverse" (or "negative inverse"), but that still wouldn't be true, as demonstrated by your concrete example, in which the components are not inverses.

In addition, you have not shown that the vector with "inverse identity" components exists, just that if it exists then there is an orthogonal vector. (Perhaps, if you clarified what the "inverse identity" is supposed to be, it would be obvious how to construct it, and then you would at least have demonstrated that the orthogonal vector exists.)

But even if your assertion were accurate, this would not demonstrate that u and v form a basis, just that there exists an orthogonal vector.

Back to the drawing board.

To win one hundred victories in one hundred battles is not the acme of skill. To subdue the enemy without fighting is the acme of skill.


I just need some directions with it. I already provided my thought process.

Reread my previous post: you need an overall structure of the argument that has the theorem as a conclusion. What you wrote so far says something about two vectors, which has nothing to do with the statement of the theorem.

Yeah, and I think constructing the basis is the way to go: you have to construct a set of linearly independent orthogonal vectors which span the space.

It's Gram-Schmidt with orthonormality relaxed to just orthogonal, innit?

EDIT: Although with Gram-Schmidt you start with a set of vectors which span the space IIRC.
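A minimal sketch of that approach (assuming NumPy; the function name and tolerance are made up for illustration): start from u followed by the standard basis vectors, run Gram-Schmidt without the normalization step, and drop anything that projects away to zero. Since u goes first, it survives untouched, and what is left is an orthogonal basis of R^n containing u.

```python
import numpy as np

def orthogonal_basis_containing(u, tol=1e-12):
    """Gram-Schmidt without normalization on [u, e_1, ..., e_n],
    discarding vectors that reduce to (numerically) zero."""
    u = np.asarray(u, dtype=float)
    n = len(u)
    candidates = [u] + [np.eye(n)[i] for i in range(n)]
    basis = []
    for v in candidates:
        w = v.copy()
        for b in basis:
            w -= (np.dot(w, b) / np.dot(b, b)) * b  # strip the component along b
        if np.dot(w, w) > tol:  # keep only genuinely new directions
            basis.append(w)
    return basis  # n pairwise-orthogonal vectors, the first of which is u

u = np.array([1.0, -1.0, 1.0])
basis = orthogonal_basis_containing(u)
assert np.allclose(basis[0], u)
for i in range(len(basis)):
    for j in range(i + 1, len(basis)):
        assert abs(np.dot(basis[i], basis[j])) < 1e-9
```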

"Most people think, great God will come from the sky, take away everything, and make everybody feel high" - Bob Marley

This topic is closed to new replies.
