Matrix LookAt problem



#1 codecandy2k - Members - Reputation: 106


Posted 12 April 2013 - 04:23 PM

So as a learning process I'm trying to create some matrix functions to use in place of the XNA Math functions in DirectX.

 

Right now I'm stuck on MatrixLookAtLH which would replace XMMatrixLookAtLH.

 

Here is my function:

 

 

inline void MatrixLookAt( Matrix4F& matrixOut, Vector3F& eye, Vector3F& at, Vector3F& up )
{
    Vector3F zAxis( at.x() - eye.x(), at.y() - eye.y(), at.z() - eye.z() );
    VectorNormalize( zAxis );

    Vector3F xAxis = VectorCrossProduct( up, zAxis );
    VectorNormalize( xAxis );

    Vector3F yAxis = VectorCrossProduct( zAxis, xAxis );
    VectorNormalize(yAxis);


    MatrixSetRow( matrixOut, 0, xAxis );
    MatrixSetRow( matrixOut, 1, yAxis );
    MatrixSetRow( matrixOut, 2, zAxis );

    Vector3F wAxis( -VectorDotProduct(xAxis,eye), -VectorDotProduct(yAxis,eye), -VectorDotProduct(zAxis,eye) );
    MatrixSetRow( matrixOut, 3, wAxis );
}


And here is some code that calls it and the XM function:

 

XMVECTOR Eye = XMVectorSet( 0.0f, 3.0f, -6.0f, 0.0f );
XMVECTOR At  = XMVectorSet( 0.0f, 1.0f, 0.0f, 0.0f );
XMVECTOR Up  = XMVectorSet( 0.0f, 1.0f, 0.0f, 0.0f );

Vector3F Eye2 = Vector3F( 0.0f, 3.0f, -6.0f );
Vector3F At2  = Vector3F( 0.0f, 1.0f, 0.0f );
Vector3F Up2  = Vector3F( 0.0f, 1.0f, 0.0f );

Matrix4F view;
MatrixLookAt( view, Eye2, At2, Up2 );

XMMATRIX xmView = XMMatrixLookAtLH( Eye, At, Up );
 

 

The results of the two functions are similar but not identical.


This is the matrix "xmView" after calling the XNA XMMatrixLookAtLH function:

 

1   0            0           0
0   0.94868326  -0.31622776  0
0   0.31622776   0.94868326  0
0  -0.94868326   6.6407828   1

 

 

And this is the matrix "view" after calling my MatrixLookAt function:

 

1   0            0           0
0   0.94868326   0.31622776  0
0  -0.31622776   0.94868326  0
0  -0.94868326   6.6407828   1

 

As you can see, elements [1][2] and [2][1] appear to be swapped in the output from my code, and everything else is the same.

 

Is there something fundamentally wrong in my calculation, or is the step I'm missing simply to swap those two values?

 

Thanks!


Edited by codecandy2k, 12 April 2013 - 04:26 PM.



#2 l0k0 - Members - Reputation: 278


Posted 13 April 2013 - 02:13 AM

The view matrix transforms from world space to view (camera) space.  It is the inverse of the camera's world matrix.

 

We know that the inverse of a rotation matrix is its transpose.  Since what you are calculating is the view matrix, you need to transpose the upper 3x3 portion of the matrix to get the desired result.  Notice that if you do that to your result, you get the same thing as the library's matrix (don't just negate the two mismatched entries; that only happens to work for this particular input).  You can either transpose the matrix after setting the three rows, or just write the transposed values in directly.

 

What you are doing now is right for the translation, but for the rotation you are writing the camera's world-space rotation (when really we want its inverse).  Take your left hand and, given your input, point it from the "eye" toward the "at"; hopefully it then becomes clear how the properly calculated view matrix undoes that transformation.
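
Here is a rough sketch of what that change looks like, reusing only the helpers already shown in your post (Vector3F, VectorNormalize, VectorCrossProduct, VectorDotProduct, MatrixSetRow) and assuming they behave the way they appear to.  The only difference from your version is that each row of the upper 3x3 now holds one component of each basis vector, i.e. the rotation part is written transposed:

inline void MatrixLookAt( Matrix4F& matrixOut, Vector3F& eye, Vector3F& at, Vector3F& up )
{
    // Forward (z), right (x) and up (y) axes of the camera, as before.
    Vector3F zAxis( at.x() - eye.x(), at.y() - eye.y(), at.z() - eye.z() );
    VectorNormalize( zAxis );

    Vector3F xAxis = VectorCrossProduct( up, zAxis );
    VectorNormalize( xAxis );

    Vector3F yAxis = VectorCrossProduct( zAxis, xAxis );
    VectorNormalize( yAxis );

    // Write the basis vectors into the COLUMNS of the upper 3x3
    // (i.e. the transpose of the camera's world rotation).
    MatrixSetRow( matrixOut, 0, Vector3F( xAxis.x(), yAxis.x(), zAxis.x() ) );
    MatrixSetRow( matrixOut, 1, Vector3F( xAxis.y(), yAxis.y(), zAxis.y() ) );
    MatrixSetRow( matrixOut, 2, Vector3F( xAxis.z(), yAxis.z(), zAxis.z() ) );

    // Translation is unchanged: -dot(axis, eye) for each axis.
    Vector3F wAxis( -VectorDotProduct( xAxis, eye ),
                    -VectorDotProduct( yAxis, eye ),
                    -VectorDotProduct( zAxis, eye ) );
    MatrixSetRow( matrixOut, 3, wAxis );
}

With your sample eye/at/up this should reproduce the XMMatrixLookAtLH matrix you posted.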


<shameless blog plug>
A Floating Point
</shameless blog plug>

#3 codecandy2k - Members - Reputation: 106


Posted 13 April 2013 - 02:31 PM

Brilliant, thanks so much for the answer and the explanation. I created a function to transpose the 3x3 portion of a 4x4 matrix in-place, and it works like a champ now.
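
For anyone finding this later, the transpose helper is along these lines (sketched against a plain row-major float[4][4] for illustration, since element accessors differ between libraries; only the upper 3x3 is touched, so the translation row stays put):

#include <utility> // std::swap

inline void TransposeUpper3x3( float m[4][4] )
{
    for ( int row = 0; row < 3; ++row )
    {
        for ( int col = row + 1; col < 3; ++col )
        {
            // Swap each element above the diagonal with its mirror below it.
            std::swap( m[row][col], m[col][row] );
        }
    }
}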





