# Matrix LookAt problem


## Recommended Posts

So as a learning process I'm trying to create some matrix functions to use in place of the XNA Math functions in DirectX.

Right now I'm stuck on MatrixLookAtLH which would replace XMMatrixLookAtLH.

Here is my function:

```cpp
inline void MatrixLookAt( Matrix4F& matrixOut, Vector3F& eye, Vector3F& at, Vector3F& up )
{
    Vector3F zAxis( at.x() - eye.x(), at.y() - eye.y(), at.z() - eye.z() );
    VectorNormalize( zAxis );

    Vector3F xAxis = VectorCrossProduct( up, zAxis );
    VectorNormalize( xAxis );

    Vector3F yAxis = VectorCrossProduct( zAxis, xAxis );
    VectorNormalize( yAxis );

    MatrixSetRow( matrixOut, 0, xAxis );
    MatrixSetRow( matrixOut, 1, yAxis );
    MatrixSetRow( matrixOut, 2, zAxis );

    Vector3F wAxis( -VectorDotProduct( xAxis, eye ), -VectorDotProduct( yAxis, eye ), -VectorDotProduct( zAxis, eye ) );
    MatrixSetRow( matrixOut, 3, wAxis );
}
```

And here is some code that calls it and the XM function:

```cpp
XMVECTOR Eye = XMVectorSet( 0.0f, 3.0f, -6.0f, 0.0f );
XMVECTOR At  = XMVectorSet( 0.0f, 1.0f, 0.0f, 0.0f );
XMVECTOR Up  = XMVectorSet( 0.0f, 1.0f, 0.0f, 0.0f );

Vector3F Eye2 = Vector3F( 0.0f, 3.0f, -6.0f );
Vector3F At2  = Vector3F( 0.0f, 1.0f, 0.0f );
Vector3F Up2  = Vector3F( 0.0f, 1.0f, 0.0f );

Matrix4F view;
MatrixLookAt( view, Eye2, At2, Up2 );

XMMATRIX xmView = XMMatrixLookAtLH( Eye, At, Up );
```


The results of the two functions are similar but not identical.

This is the matrix "xmView" after calling the XNA Math XMMatrixLookAtLH function:

```
1  0           0           0
0  0.94868326 -0.31622776  0
0  0.31622776  0.94868326  0
0 -0.94868326  6.6407828   1
```

And this is the matrix "view" after calling my MatrixLookAt function:

```
1  0           0           0
0  0.94868326  0.31622776  0
0 -0.31622776  0.94868326  0
0 -0.94868326  6.6407828   1
```

As you can see, elements [1][2] and [2][1] have their signs flipped in the output from my code, and everything else is the same.

Is there something fundamentally wrong in my calculation, or is the step I'm missing simply negating those two values?

Thanks!

Edited by codecandy2k


The view matrix transforms from world space to view (camera) space. It is the inverse of the camera's world matrix.

We know that the inverse of a rotation matrix is its transpose. Since what you are calculating is the view matrix, you need to transpose the upper 3x3 portion of the matrix to get the desired result. Notice that if you do that with your result, you get the same thing as the library's matrix (don't just negate the two entries; that isn't a general fix). You can either transpose the matrix after setting the three axis rows, or just write the axis vectors into the columns directly.

What you are doing now is right for the translation, but the rotation you build is the camera's world-space rotation (when really we want its inverse). Use your left hand: given your input, point it from the "eye" to the "at". Hopefully it then becomes clear how the properly calculated view matrix undoes that transformation.