
# Interesting commutative operation on quaternions


## Recommended Posts

So I was musing over multiplication order for a skeletal pose-building thing that I'm doing, and this operation came to mind:

lim(n->infinity) (q1^(1/n) * q2^(1/n))^n

The basic idea being: alternate teeny bits of q1 and q2 instead of multiplying all of one by all of the other. Some quick MATLAB tests confirm that the expression converges and is commutative, and suggest that it's numerically stable for reasonable n. It also seems like a rather obvious thing to try. Has anyone come across this operation before? Does it have a name? Is it actually useful for anything?
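For anyone who wants to reproduce the experiment outside MATLAB, here's a quick numerical sketch in Python (the particular test quaternions, tolerances, and helper names are my own choices, not from the post), computing q^(1/n) as exp((1/n)*log q):

```python
import math

def qmul(a, b):
    # Hamilton product of quaternions stored as (w, x, y, z)
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def qlog(q):
    # logarithm of a unit quaternion; result is the vector part of (0, v)
    w, x, y, z = q
    s = math.sqrt(x*x + y*y + z*z)
    if s < 1e-12:
        return (0.0, 0.0, 0.0)
    t = math.atan2(s, w) / s
    return (t*x, t*y, t*z)

def qexp(v):
    # exponential of a pure quaternion (0, v)
    th = math.sqrt(v[0]*v[0] + v[1]*v[1] + v[2]*v[2])
    if th < 1e-12:
        return (1.0, 0.0, 0.0, 0.0)
    k = math.sin(th) / th
    return (math.cos(th), k*v[0], k*v[1], k*v[2])

def qpow(q, t):
    lx, ly, lz = qlog(q)
    return qexp((t*lx, t*ly, t*lz))

def alternating(q1, q2, n):
    # (q1^(1/n) * q2^(1/n))^n, by repeated multiplication
    step = qmul(qpow(q1, 1.0/n), qpow(q2, 1.0/n))
    out = (1.0, 0.0, 0.0, 0.0)
    for _ in range(n):
        out = qmul(out, step)
    return out

# rotations of 0.8 rad about x and 0.5 rad about y (arbitrary test values)
q1 = (math.cos(0.4), math.sin(0.4), 0.0, 0.0)
q2 = (math.cos(0.25), 0.0, math.sin(0.25), 0.0)
```

With n = 2000 the result agrees closely with exp(log q1 + log q2), and swapping q1 and q2 gives the same answer, while the plain products q1*q2 and q2*q1 differ visibly in the z component.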

##### Share on other sites
Doesn't it just converge to (q1^0*q2^0)^0 though? Which means it will effectively converge to 1 in the same way that (sin x)/x -> 1 as x -> 0?

[EDIT: No, will tend to 1^inf but since the bracketed term is tending to 1 and 1^x == 1 for all x it will still tend to 1 anyways, I reckon]

I think the bottom line is that very small powers of quats are effectively equal to 1, since a tiny power means almost no rotation occurs; that's why I mentioned sin x, which is effectively x for small x.

[EDIT2: Replaced "converge to" with "tend to"]
[EDIT3: Duh, I meant 1^inf]

The thing to bear in mind is that very small powers of anything non-zero (such as a unit quat) will be near 1 in a (non-finite) ring, and 1, being the multiplicative identity, always commutes, in any ring.

[EDIT4: Added "non-finite" since powers of numbers in a finite ring could obviously alternate].

I've not got a proof for the "near 1" statement but it seems to make sense to me. I have had a few beers though. It's easy to show that the multiplicative identity always commutes (and is unique) in a ring, though.

[Edited by - Paradigm Shifter on September 28, 2006 7:55:38 PM]

##### Share on other sites
Quote:
Original post by Paradigm Shifter
Doesn't it just converge to (q1^0*q2^0)^0 though?

No, because the final exponent is not 1/n but n.

##### Share on other sites
This operation has a name, and it's "addition". What is happening, essentially, is that you take two elements of the Lie group of unit quaternions, take preimages in the corresponding Lie algebra (this involves a choice, and does not always exist in the case of general Lie groups), and add them. In a small neighbourhood of 1 you have a 1-1 mapping exp: g -> G and its inverse ln: G -> g, and ln(Q_1) + ln(Q_2) differs from ln(Q_1*Q_2) only by O(|Q_1|*|Q_2|) (search for "Baker, Campbell, Hausdorff" to learn about the whole series). So when you divide the logarithms by N (taking an Nth root), add them, multiply by N, and then send the result back with exp, it differs from what you describe in your post only by an O(N^(-1)*(|Q_1|+|Q_2|)) error term; hence when you take the limit, you get exp(ln(Q_1)+ln(Q_2)).

edit: sorry, can't help myself adding that q^(1/n) is, of course, far from being unique, and is only meaningful because the exponential map of the quaternions is onto. And I would have added the example of SL_2(R), but it's in the Wikipedia, so you can look it up yourself ;)
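To make the error-term argument above concrete, here's a small numerical check in Python (a sketch; the helper functions and test quaternions are illustrative choices, not from the thread): the "defect" ||ln(Q1*Q2) - (ln Q1 + ln Q2)|| is roughly the size of the commutator of the logarithms, and replacing each Q by its Nth root picks up two factors of 1/N, so the per-step defect shrinks like 1/N^2 and the accumulated error over N steps like 1/N.

```python
import math

def qmul(a, b):
    # Hamilton product of quaternions (w, x, y, z)
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def qlog(q):
    # logarithm of a unit quaternion, as the vector part of (0, v)
    w, x, y, z = q
    s = math.sqrt(x*x + y*y + z*z)
    if s < 1e-12:
        return (0.0, 0.0, 0.0)
    t = math.atan2(s, w) / s
    return (t*x, t*y, t*z)

def qexp(v):
    # exponential of a pure quaternion (0, v)
    th = math.sqrt(v[0]*v[0] + v[1]*v[1] + v[2]*v[2])
    if th < 1e-12:
        return (1.0, 0.0, 0.0, 0.0)
    k = math.sin(th) / th
    return (math.cos(th), k*v[0], k*v[1], k*v[2])

def defect(a, b):
    # || ln(a*b) - (ln a + ln b) || -- zero exactly when a and b commute
    la, lb, lab = qlog(a), qlog(b), qlog(qmul(a, b))
    return math.sqrt(sum((lab[i] - la[i] - lb[i])**2 for i in range(3)))

def nth_root(q, n):
    lx, ly, lz = qlog(q)
    return qexp((lx/n, ly/n, lz/n))

q1 = (math.cos(0.4), math.sin(0.4), 0.0, 0.0)   # 0.8 rad about x
q2 = (math.cos(0.25), 0.0, math.sin(0.25), 0.0) # 0.5 rad about y

d1 = defect(q1, q2)                               # ~ |[ln q1, ln q2]| / 2
d10 = defect(nth_root(q1, 10), nth_root(q2, 10))  # ~ d1 / 100
```

The leading BCH term predicts d10 to be about a hundredth of d1, which is what the numbers show.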

edit2: heck, you can even see it with the orthogonal matrices, very explicitly (forget the double cover with the unit quaternions, it does nothing locally anyway); exp assigns e^A = I + A + (1/2)*A*A + ... to every skew-symmetric matrix A, and where e^A is of the form

1    0       0
0  cos(a)  sin(a)
0 -sin(a)  cos(a)

we can choose A to be

0  0  0
0  0  a
0 -a  0

and you add those A matrices for the two rotations and exp them back to orthogonal matrices.
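As a sanity check of the matrix version, here's a Python sketch with a truncated-Taylor matrix exponential (fine for small angles; scipy.linalg.expm would be the usual tool, but this keeps it dependency-free). The generator convention matches the matrices above; the specific angles are arbitrary test values.

```python
import math

def matmul(A, B):
    # 3x3 matrix product on plain nested lists
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def madd(A, B):
    return [[A[i][j] + B[i][j] for j in range(3)] for i in range(3)]

def expm(A, terms=25):
    # e^A = I + A + (1/2)A^2 + ..., truncated; accurate for small |A|
    out = [[float(i == j) for j in range(3)] for i in range(3)]
    term = [[float(i == j) for j in range(3)] for i in range(3)]
    for k in range(1, terms):
        term = matmul(term, [[v / k for v in row] for row in A])
        out = madd(out, term)
    return out

def gen_x(a):
    # skew-symmetric generator of rotation by a about x, as in the post
    return [[0.0, 0.0, 0.0], [0.0, 0.0, a], [0.0, -a, 0.0]]

def gen_y(a):
    # same convention, rotation about y
    return [[0.0, 0.0, -a], [0.0, 0.0, 0.0], [a, 0.0, 0.0]]

# add the skew-symmetric generators, then exp back to an orthogonal matrix
R = expm(madd(gen_x(0.3), gen_y(0.2)))
```

expm(gen_x(a)) reproduces exactly the 3x3 rotation displayed above, and R, the exp of the summed generators, is still orthogonal (R*R^T = I), i.e. a valid combined rotation; by construction the sum of the generators doesn't care about order.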

[Edited by - Darkstrike on September 29, 2006 8:22:46 AM]