
# Matrix "sign"


5 replies to this topic

### #1 Doublefris (Members, Reputation: 316)

Posted 04 May 2014 - 06:19 PM

These matrices ( A & B ) both qualify as being orthogonal.

$\left [ \begin{matrix} 1 &0 &0 \\ 0 &1 &0 \\ 0 &0 &1 \end{matrix} \right ]$

$\left [ \begin{matrix} -1 &0 &0 \\ 0 &1 &0 \\ 0 &0 &1 \end{matrix} \right ]$

Furthermore, my code says their quaternion representation is actually the same (both the identity quaternion). Curiously, glm gives me { 1/sqrt(2), 0, 0, 0 } for B.

Also I know that det(A) = 1 and det(B) = -1

So I guess that orthogonal matrices have some sort of "sign" property to them, that gives the handedness of the basis? And how does this work with quaternions? I suspect it might have some relation to the famous 'double cover' property of quaternions that people always mention. Basically the fact that Q and -Q correspond to the same rotation. I also wonder if this works for complex numbers as well, as they're the 2D equivalent of quaternions.

Are there any other properties involving this 'sign'? I have a vague feeling it has something to do with the diagonal as well.

Maybe someone here has some deeper insights or facts to share about this.

### #2 Bacterius (Crossbones+, Reputation: 9961)

Posted 04 May 2014 - 06:50 PM

That second matrix is not orthogonal and its determinant is zero. Did you mean this one?

$\left [ \begin{matrix} -1 &0 &0 \\ 0 &1 &0 \\ 0 &0 &1 \end{matrix} \right ]$

> The slowsort algorithm is a perfect illustration of the multiply and surrender paradigm, which is perhaps the single most important paradigm in the development of reluctant algorithms. The basic multiply and surrender strategy consists in replacing the problem at hand by two or more subproblems, each slightly simpler than the original, and continue multiplying subproblems and subsubproblems recursively in this fashion as long as possible. At some point the subproblems will all become so simple that their solution can no longer be postponed, and we will have to surrender. Experience shows that, in most cases, by the time this point is reached the total work will be substantially higher than what could have been wasted by a more direct approach.
>
> - Pessimal Algorithms and Simplexity Analysis

### #3 Doublefris (Members, Reputation: 316)

Posted 04 May 2014 - 06:55 PM

> That second matrix is not orthogonal and its determinant is zero. Did you mean this one?
>
> $\left [ \begin{matrix} -1 &0 &0 \\ 0 &1 &0 \\ 0 &0 &1 \end{matrix} \right ]$

Corrected. Also, I did not know we could use LaTeX in here.

### #4 Aressera (Members, Reputation: 1637)

Posted 04 May 2014 - 10:29 PM

One of the eigenvalues of the second matrix is -1, whereas all of the first matrix's eigenvalues are 1.

### #5 cdoubleplusgood (Members, Reputation: 851)

Posted 05 May 2014 - 01:32 AM

> So I guess that orthogonal matrices have some sort of "sign" property to them, that gives the handedness of the basis?

Definitely no.

The math doesn't care where the thumbs are attached to your hands. "Handedness" comes into play when you visualize the results in the real world, e.g. on the screen. The identity matrix always looks like your 1st matrix.

Your 2nd matrix performs a mirror operation. In general, if the determinant is negative, the transformation includes mirroring.

In this sense, there is a relationship between this "sign" and handedness: to convert vectors from left-handed to right-handed coordinates, you can use a mirror operation.

### #6 Álvaro (Crossbones+, Reputation: 14869)

Posted 05 May 2014 - 04:11 AM

> These matrices ( A & B ) both qualify as being orthogonal.
>
> $\left [ \begin{matrix} 1 &0 &0 \\ 0 &1 &0 \\ 0 &0 &1 \end{matrix} \right ]$
> $\left [ \begin{matrix} -1 &0 &0 \\ 0 &1 &0 \\ 0 &0 &1 \end{matrix} \right ]$

So far, so good.

> Furthermore, my code says their quaternion representation is actually the same ( both the identity quaternion).

I don't know what code you are referring to, but the second one cannot be represented by a quaternion, so things might not make a lot of sense.

> glm gives me { 1/sqrt(2), 0, 0, 0 } for B curiously.

Perhaps the fact that the quaternion you get doesn't have length 1 is an indication that something went wrong?

> Also I know that det(A) = 1 and det(B) = -1

Correct.

> So I guess that orthogonal matrices have some sort of "sign" property to them, that gives the handedness of the basis?

Yes. Orthogonal matrices are invertible, and the sign of the determinant of an invertible matrix is an important number. If you interpret the matrix as representing a change of basis, positive determinant means the bases have the same orientation.

> And how does this work with quaternions?

It doesn't. Unit-length quaternions can only represent rotations, whose determinant is 1.

> I suspect it might have some relation to the famous 'double cover' property of quaternions that people always mention. Basically the fact that Q and -Q correspond to the same rotation.

I suspect you are wrong.

> I also wonder if this works for complex numbers as well, as they're the 2D equivalent of quaternions.

I am not sure what "this" is.

> Are there any other properties involving this 'sign'? I have a vague feeling it has something to do with the diagonal as well.
>
> Maybe someone here has some deeper insights or facts to share about this.

No, sorry. I don't have anything deep to say about this.

EDIT: If you want to read something about it, this sign you are talking about is what separates the group O(3) into two halves, one of which is the subgroup of 3D rotations, SO(3).

Edited by Álvaro, 05 May 2014 - 04:18 AM.

