direction vector from 2 matrices

This topic is 3462 days old which is more than the 365 day threshold we allow for new replies. Please post a new topic.

If you intended to correct an error in the post then please contact us.

I use two matrices to place my camera in the world:

- a translation matrix to position the camera somewhere in the virtual world
- a rotation matrix to point it in a certain direction

Together these two matrices make up my view matrix. Now I want to move my camera backwards and forwards through the virtual world (backwards/forwards meaning towards or away from the point it is looking at). The question is: how can I use the rotation/translation matrices to get a direction vector that tells me in which direction the camera is looking?

For example:

translation matrix:
1 0 0 0
0 1 0 0
0 0 1 0
-20 0 -20 1

rotation matrix:
0.707 0 0.707 0
0 1 0 0
-0.707 0 0.707 0
0 0 0 1

The camera is looking towards x: 0, y: 0, z: 0 (note that I am not using height (y) yet), so the direction vector should be something like [1 0 1 0].

Your camera looks along its local x axis, so the direction is simply given by the first row of its orientation matrix. If you're only applying one rotation to it, then it will be (0.707 0 0.707 0) in your example above, which is just the normalised version of the [1 0 1 0] you wrote.
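A minimal sketch of the suggestion above (in Python for illustration; the thread itself is DirectX/C++): take the first row of the 4x4 orientation matrix from the original post as the look direction and normalise it.

```python
import math

# Orientation matrix from the original post (row-major).
orientation = [
    [0.707, 0, 0.707, 0],
    [0,     1, 0,     0],
    [-0.707, 0, 0.707, 0],
    [0,     0, 0,     1],
]

# First row (x, y, z only) is the camera's local x axis in world space.
row = orientation[0][:3]
length = math.sqrt(sum(c * c for c in row))
direction = tuple(c / length for c in row)
print(direction)  # roughly (0.707, 0.0, 0.707), already unit length here
```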

Why are you using two matrices? The translation and rotation occupy separate parts of a single 4x4 matrix anyway, so just combine them into one.

Anyway, as to your question, the camera usually looks along the Z-axis. At least, in DirectX and OpenGL that is the case (by convention). So, just transform (0,0,1,0) by the inverse of your view matrix and you get the direction in which it is looking in world space. Don't forget to normalize it afterwards.

Edit: in your case, as MrRowl said, you look along a different axis. Just use that axis instead of (0,0,1,0). For the x axis that would be (1,0,0,0).
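A sketch of the transform described above (Python for illustration, row-vector convention as in D3DX; assuming the rotation matrix from the original post is the rotation part of the view matrix): transform the camera-space forward axis by the inverse of the view rotation. For a pure rotation, the inverse is just the transpose.

```python
def transpose(m):
    # Transpose of a 4x4 matrix; equals the inverse for a pure rotation.
    return [[m[j][i] for j in range(4)] for i in range(4)]

def transform(v, m):
    # Row vector times 4x4 matrix.
    return tuple(sum(v[i] * m[i][j] for i in range(4)) for j in range(4))

# Rotation matrix from the original post.
view_rotation = [
    [0.707, 0, 0.707, 0],
    [0,     1, 0,     0],
    [-0.707, 0, 0.707, 0],
    [0,     0, 0,     1],
]

# Camera-space forward axis (0,0,1,0), taken into world space.
look_dir = transform((0, 0, 1, 0), transpose(view_rotation))
print(look_dir)  # (0.707, 0, 0.707, 0): toward the origin from (-20, 0, -20)
```

Note that w = 0 here, so even with a full view matrix the translation row would drop out; directions are unaffected by translation.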

As I am new to 3D math with matrices: what makes my camera look along the x axis, and how do I change it so that it looks along the z axis, as seems to be the convention?

I am using two matrices for the camera because that is how I add the camera's position and orientation to the scene graph. When rendering the scene, both matrices get combined. For me, two matrices are easier: I can simply move the camera to whichever position I want, and I can attach input controllers to a transformation node (orientation/translation matrix). With two independent matrices I can easily rotate/move the camera in world space. The problem is how to move it forwards/backwards along the view direction.
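The forward/backward move itself is just a step along the unit look direction (a sketch in Python for illustration; the direction value is the one worked out in the replies above, and `move_along` is a hypothetical helper name):

```python
def move_along(position, direction, amount):
    # Step forward (positive amount) or backward (negative amount)
    # along the view direction; direction is assumed unit length.
    return tuple(p + d * amount for p, d in zip(position, direction))

eye = (-20.0, 0.0, -20.0)
direction = (0.707, 0.0, 0.707)        # look direction from the replies above
eye = move_along(eye, direction, 5.0)  # move 5 units toward the target
print(eye)  # roughly (-16.465, 0.0, -16.465)
```

The new position then just goes back into the translation matrix before the next frame.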

Scenegraph:

orientation matrix ==> translation matrix ==> camera


If I combine the translation and orientation matrices given in my first post, I get this matrix:

0.707 0 0.707 0
0 1 0 0
-0.707 0 0.707 0
0 0 -28.28 1

and from this matrix I can't easily get the position of the camera in world space, as the translation is given in camera space.
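The world-space position can in fact be recovered from a combined view matrix. With row vectors, the view matrix has the form [R | 0; t | 1], and the eye point is -t * R^T. A sketch in Python (for illustration; using 28.28 for the translation entry, the sign a later post in this thread corrects):

```python
def eye_from_view(view):
    # Recover the world-space camera position from a row-major,
    # row-vector view matrix: eye = -t * R^T.
    r = [row[:3] for row in view[:3]]   # 3x3 rotation block
    t = view[3][:3]                     # translation row
    # R^T[i][j] == r[j][i], so component j is -sum_i t[i] * r[j][i].
    return tuple(-sum(t[i] * r[j][i] for i in range(3)) for j in range(3))

view = [
    [0.707, 0, 0.707, 0],
    [0,     1, 0,     0],
    [-0.707, 0, 0.707, 0],
    [0,     0, 28.28,  1],
]
print(eye_from_view(view))  # approximately (-20, 0, -20)
```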

Quote:
Original post by Christiaan
As I am new to 3D math with matrices: what makes my camera look along the x axis, and how do I change it so that it looks along the z axis, as seems to be the convention?

Take a look at DirectX's D3DXMatrixLookAtLH. It shows you what kind of matrix it creates.

Quote:
Original post by Christiaan
If I combine the translation and orientation matrices given in my first post, I get this matrix:

0.707 0 0.707 0
0 1 0 0
-0.707 0 0.707 0
0 0 -28.28 1

and from this matrix I can't easily get the position of the camera in world space, as the translation is given in camera space.

It might be easier to just keep separate data for the rotation and position of the camera. Personally, I prefer:

struct Camera {
    Vector3 Position;
    Vector3 Target;
    Vector3 Up;
};


and then recalculate the View matrix (for example with D3DXMatrixLookAtLH) whenever any of these change.

But however you construct your final view matrix, getting the 'look at' direction is simple:

Vector3 direction = normalize(Vector4(0,0,1,0) * inverse(ViewMatrix));

As I said, replace the 0,0,1,0 vector with the vector for the appropriate axis.

Of course, if you have a Camera struct, it's much easier with: normalize(Target - Position);
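A runnable sketch of the struct-based approach suggested above (in Python for illustration; the class and method names are made up for the example): keep Position/Target/Up separately and derive the look direction as normalize(Target - Position).

```python
import math

class Camera:
    def __init__(self, position, target, up=(0.0, 1.0, 0.0)):
        self.position = position
        self.target = target
        self.up = up

    def look_direction(self):
        # normalize(Target - Position), as in the post above.
        d = tuple(t - p for t, p in zip(self.target, self.position))
        length = math.sqrt(sum(c * c for c in d))
        return tuple(c / length for c in d)

cam = Camera(position=(-20.0, 0.0, -20.0), target=(0.0, 0.0, 0.0))
print(cam.look_direction())  # roughly (0.707, 0.0, 0.707)
```

The view matrix would then be rebuilt (e.g. with D3DXMatrixLookAtLH) whenever any of the three vectors change, as the post suggests.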

Quote:
Original post by Mike nl
Quote:
Original post by Christiaan
As I am new to 3D math with matrices: what makes my camera look along the x axis, and how do I change it so that it looks along the z axis, as seems to be the convention?

Take a look at DirectX's D3DXMatrixLookAtLH. It shows you what kind of matrix it creates.



pEye: -20, 0, -20
pAt: 0, 0, 0
pUp: 0, 1, 0

if I then use D3DXMatrixLookAtLH I get the following matrix:
0.707 0 0.707 0
0 1 0 0
-0.707 0 0.707 0
0 0 28.28 1

Which is the EXACT same matrix I get when I multiply my orientation matrix by my translation matrix...

[edit] In my earlier post I said that cameraMatrix._43 was -28.28; it actually is 28.28 (I put the breakpoint in the wrong place).
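The corrected value can be sanity-checked against the formula D3DXMatrixLookAtLH documents: the bottom row of the view matrix is (-dot(xaxis, eye), -dot(yaxis, eye), -dot(zaxis, eye), 1). A quick check in Python, using the eye point and z axis from the posts above:

```python
eye = (-20.0, 0.0, -20.0)
zaxis = (0.707, 0.0, 0.707)  # normalize(At - Eye) from the posts above

# _43 entry of the LookAt view matrix is -dot(zaxis, eye).
m43 = -sum(e * z for e, z in zip(eye, zaxis))
print(m43)  # 28.28 (approximately), confirming the corrected sign
```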

Ah, well, then your final matrix is fine. Take its inverse and multiply it with the z direction to get the viewing direction (or use target - position if you have those).


