# Matrix math help


## Recommended Posts

I can't believe I haven't needed this before... How would you lerp between 2 matrices?

Is there a function already out there or would I need to make my own? If I make my own, how?

##### Share on other sites

Well, ApochPiQ's answer is probably what you needed to hear, but in fact lerping between 2 matrices is a perfectly well-defined operation. Matrices can be added, subtracted and scaled (i.e., multiplied by constants). Verify a couple more axioms, and it turns out they form a vector space. You can always compute t*v + (1-t)*w, and that's what lerp is. Notice that the non-commutative nature of matrix multiplication doesn't enter the picture, because we are only multiplying the matrices by scalars, and that works just fine.

##### Share on other sites

You can't guarantee (i.e. it usually won't be true) that the result will be orthonormal (each row/column perpendicular and normalised), orthogonal (each row/column perpendicular), or even that its rows/columns stay linearly independent, though. You normally have to re-orthonormalise your matrix after a lerp (e.g. via Gram-Schmidt), but at that point you may as well decompose into quaternion/scale/translation components instead.
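As a sketch, re-orthonormalising the rotation rows of a matrix with classical Gram-Schmidt might look like this. `Vec3` and the helper functions are illustrative assumptions, not any engine's API:

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3  sub(Vec3 a, Vec3 b)   { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3  scale(Vec3 v, float s){ return {v.x * s, v.y * s, v.z * s}; }
static float dot(Vec3 a, Vec3 b)   { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3  normalize(Vec3 v)     { return scale(v, 1.0f / std::sqrt(dot(v, v))); }

// Re-orthonormalise the three rows of a (nearly) rotation matrix
// via classical Gram-Schmidt.
void orthonormalize(Vec3& r0, Vec3& r1, Vec3& r2)
{
    r0 = normalize(r0);
    r1 = normalize(sub(r1, scale(r0, dot(r1, r0))));   // remove r0 component
    r2 = normalize(sub(sub(r2, scale(r0, dot(r2, r0))),
                       scale(r1, dot(r2, r1))));        // remove r0 and r1 components
}
```
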

Edited by Paradigm Shifter

##### Share on other sites

It'd be interesting to know what your intentions are.

##### Share on other sites

> Well, ApochPiQ's answer is probably what you needed to hear, but in fact lerping between 2 matrices is a perfectly well-defined operation. Matrices can be added, subtracted and scaled (i.e., multiplied by constants). Verify a couple more axioms, and it turns out they form a vector space. You can always compute t*v + (1-t)*w, and that's what lerp is. Notice that the non-commutative nature of matrix multiplication doesn't enter the picture, because we are only multiplying the matrices by scalars, and that works just fine.

Yes, lerping matrices is defined, but it will violate the semantics of the matrix in the general case, as Paradigm Shifter elaborated on.

##### Share on other sites

The original question didn't say anything about orthonormal matrices. It is very likely that the OP is interested in that case, which is why I said that ApochPiQ's answer is probably what he needed to hear.

There are contexts where the naive lerp between matrices is a perfectly reasonable thing to use.

Edited by Álvaro

##### Share on other sites

> It'd be interesting to know what your intentions are.

Animation. I created a function to load a mesh (.x) with animation matrices, and I need a way to move/rotate the object smoothly. Given that each frame has its own matrix, animating very slowly poses a problem. The difference between, say, frame0 and frame1 could be a rotation of 22.5 degrees, but I want it to move in 0.001-degree increments. Without a way to lerp, I could only show frame0 or frame1. With lerping, I could show an incremental rotation of 0.001 degrees each frame (smooth).

> orthonormal

Wow. I think I understand what that means (wiki). I'm still not sure I need to worry about it (considering how I want to lerp). I'm pretty sure I can lerp the location part of the matrix, but the rotation I may have a problem with. As I understand, the "rotation" parts of the matrix is -1 to +1. Could I simply make a quaternion (after removing the "location"), lerp between the quaternions, and then assemble the final matrix?

##### Share on other sites

Yes, use quaternions for that.

EDIT: Kind of. This bit doesn't make sense: "As I understand, the "rotation" parts of the matrix is -1 to +1"

The rotation part (assuming no scale/shear/other shenanigans) is the upper-left 3x3 block of the 4x4 matrix; if it is orthonormal, you can extract the rotation as a quaternion and slerp.
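A hedged sketch of the slerp step, assuming unit quaternions stored as w/x/y/z. The struct and layout here are illustrative, not D3DX's or any specific library's:

```cpp
#include <cmath>

struct Quat { float w, x, y, z; };

// Spherical linear interpolation between unit quaternions.
// Flips b's sign when the dot product is negative so we take the shortest arc.
Quat slerp(Quat a, Quat b, float t)
{
    float d = a.w * b.w + a.x * b.x + a.y * b.y + a.z * b.z;
    if (d < 0.0f) { d = -d; b = {-b.w, -b.x, -b.y, -b.z}; }

    if (d > 0.9995f) {  // nearly parallel: fall back to normalised lerp
        Quat q{ a.w + t * (b.w - a.w), a.x + t * (b.x - a.x),
                a.y + t * (b.y - a.y), a.z + t * (b.z - a.z) };
        float n = std::sqrt(q.w * q.w + q.x * q.x + q.y * q.y + q.z * q.z);
        return { q.w / n, q.x / n, q.y / n, q.z / n };
    }

    float theta = std::acos(d);         // angle between the quaternions
    float s  = std::sin(theta);
    float wa = std::sin((1.0f - t) * theta) / s;
    float wb = std::sin(t * theta) / s;
    return { wa * a.w + wb * b.w, wa * a.x + wb * b.x,
             wa * a.y + wb * b.y, wa * a.z + wb * b.z };
}
```

Slerping this way keeps the result a unit quaternion at every `t`, which is exactly what the naive matrix lerp loses.
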

Edited by Paradigm Shifter

##### Share on other sites

Scaling and location can be a simple vector lerp, right?
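For reference, a plain component-wise vector lerp like the following handles translation, and scale too as long as both endpoints have the same sign. `Vec3` is an illustrative type:

```cpp
struct Vec3 { float x, y, z; };

// Component-wise lerp: fine for translation, and for scale provided
// the endpoints' scale components don't change sign.
Vec3 lerp(Vec3 a, Vec3 b, float t)
{
    return { a.x + t * (b.x - a.x),
             a.y + t * (b.y - a.y),
             a.z + t * (b.z - a.z) };
}
```
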
