
Matrix LookAt problem


So as a learning process I'm trying to create some matrix functions to use in place of the XNA Math functions in DirectX.


Right now I'm stuck on MatrixLookAtLH which would replace XMMatrixLookAtLH.


Here is my function:



inline void MatrixLookAt( Matrix4F& matrixOut, Vector3F& eye, Vector3F& at, Vector3F& up )
{
    Vector3F zAxis( at.x() - eye.x(), at.y() - eye.y(), at.z() - eye.z() );
    VectorNormalize( zAxis );

    Vector3F xAxis = VectorCrossProduct( up, zAxis );
    VectorNormalize( xAxis );

    Vector3F yAxis = VectorCrossProduct( zAxis, xAxis );

    MatrixSetRow( matrixOut, 0, xAxis );
    MatrixSetRow( matrixOut, 1, yAxis );
    MatrixSetRow( matrixOut, 2, zAxis );

    Vector3F wAxis( -VectorDotProduct( xAxis, eye ), -VectorDotProduct( yAxis, eye ), -VectorDotProduct( zAxis, eye ) );
    MatrixSetRow( matrixOut, 3, wAxis );
}

And here is some code that calls it and the XM function:


    XMVECTOR Eye = XMVectorSet( 0.0f, 3.0f, -6.0f, 0.0f );
    XMVECTOR At = XMVectorSet( 0.0f, 1.0f, 0.0f, 0.0f );
    XMVECTOR Up = XMVectorSet( 0.0f, 1.0f, 0.0f, 0.0f );

    Vector3F Eye2 = Vector3F(0.0f, 3.0f, -6.0f);
    Vector3F At2 = Vector3F(0.0f, 1.0f, 0.0f);
    Vector3F Up2 = Vector3F(0.0f, 1.0f, 0.0f);

    Matrix4F view;
    MatrixLookAt(view, Eye2, At2, Up2);

    XMMATRIX xmView = XMMatrixLookAtLH( Eye, At, Up );


The results of the two functions are similar, but not exactly the same.

This is the matrix "xmView" after calling the XNA XMMatrixLookAtLH function:


1 0 0 0
0 0.94868326 -0.31622776 0
0 0.31622776  0.94868326 0
0 -0.94868326 6.6407828 1



And this is the matrix "view" after calling my MatrixLookAt function:


1 0 0 0
0 0.94868326 0.31622776 0
0 -0.31622776 0.94868326 0
0 -0.94868326 6.6407828 1


As you can see, elements [1][2] and [2][1] appear to be swapped in my output, and everything else is the same.


Is there something fundamentally wrong in my calculation, or is the step I'm missing simply swapping those two values?



Edited by codecandy2k


The view matrix transforms from world space to view (camera) space. It is the inverse of the camera's world matrix.

We know that the inverse of a rotation matrix is its transpose. Since what you are calculating is the view matrix, you need to transpose the upper 3x3 portion of the matrix to get the desired result. Notice that if you do that with your results, you get the same thing as the library's matrix (don't just negate the two values — that only happens to look right for this particular input). You can either transpose the matrix after setting the three rows, or just store the axes as columns directly.

What you are doing now is right for the translation, but the rotation you're building is the camera's world-space rotation, when really we want its inverse. Using your left hand, point from the "eye" to the "at" given your inputs; hopefully it then becomes clear how the properly calculated view matrix undoes that transformation.
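To make that concrete, here's a minimal self-contained sketch of a left-handed look-at using plain stand-in types (a simple Vec3 and a row-major float[16]) rather than the Vector3F/Matrix4F API from the original post — the names here are placeholders, not anyone's actual library. The only change from the posted version is that the basis axes are written into the columns of the upper 3x3 instead of the rows:

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3  sub(Vec3 a, Vec3 b)  { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
static float dot(Vec3 a, Vec3 b)  { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3  cross(Vec3 a, Vec3 b) {
    return { a.y * b.z - a.z * b.y,
             a.z * b.x - a.x * b.z,
             a.x * b.y - a.y * b.x };
}
static Vec3 normalize(Vec3 v) {
    float len = std::sqrt(dot(v, v));
    return { v.x / len, v.y / len, v.z / len };
}

// Left-handed look-at, row-major, element (row, col) at out[row * 4 + col],
// matching the layout printed by XMMatrixLookAtLH in the post above.
void LookAtLH(float out[16], Vec3 eye, Vec3 at, Vec3 up)
{
    Vec3 z = normalize(sub(at, eye));   // forward
    Vec3 x = normalize(cross(up, z));   // right
    Vec3 y = cross(z, x);               // true up

    // The camera axes go into the COLUMNS of the rotation part --
    // the transpose (inverse) of the camera's world-space rotation.
    out[0] = x.x; out[1] = y.x; out[2]  = z.x; out[3]  = 0.0f;
    out[4] = x.y; out[5] = y.y; out[6]  = z.y; out[7]  = 0.0f;
    out[8] = x.z; out[9] = y.z; out[10] = z.z; out[11] = 0.0f;

    // Translation: project -eye onto each camera axis.
    out[12] = -dot(x, eye); out[13] = -dot(y, eye); out[14] = -dot(z, eye);
    out[15] = 1.0f;
}
```

With eye = (0, 3, -6), at = (0, 1, 0), up = (0, 1, 0), this reproduces the xmView matrix quoted earlier in the thread, including the -0.31622776 / 0.31622776 pair in the rotation.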


Brilliant, thanks so much for the answer and the explanation. I created a function to transpose the 3x3 portion of a 4x4 matrix in-place, and it works like a champ now.
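For anyone landing here later: the in-place transpose of the upper 3x3 of a row-major 4x4 comes down to three swaps. A sketch, assuming a flat float[16] with element (row, col) at index row * 4 + col (the original poster's Matrix4F may be laid out differently):

```cpp
#include <utility>

// Transpose the upper-left 3x3 of a row-major 4x4 matrix in place,
// leaving the fourth row (translation) and fourth column untouched.
void Transpose3x3InPlace(float m[16])
{
    std::swap(m[1], m[4]);  // (0,1) <-> (1,0)
    std::swap(m[2], m[8]);  // (0,2) <-> (2,0)
    std::swap(m[6], m[9]);  // (1,2) <-> (2,1)
}
```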
