Camera Math Question


Hello, all: I have a question regarding some -- hopefully -- simple vector math. I have created a class called CObject with three vector members: position, velocity, and normal. The vectors are three-dimensional. In my scene, I'm placing the camera THREE units behind the object. This I have no problem doing:

Camera.Position = Object.Position + ((Object.Velocity - Object.Position).Normalize() * -3)

As stated, this pseudocode correctly sets the camera position. However, now I would like to set the camera's target -- or look-at -- direction and the camera's up vector. Should Camera.LookAt be equal to Object.Velocity? Should Camera.UpVector be equal to Object.Normal? I don't think this is correct, because when I experimented, it didn't work as expected.

What calculation should I do to obtain the correct Camera.LookAt and the correct Camera.UpVector so that the camera ALWAYS follows the object [IE: if the object turns, I won't notice the rotation, because the camera rotates as well]?

Thank you, all.

Trecius

Hi,

the Up vector defines the current world-space's up, usually (0, 1, 0).

The lookAt is the point the camera looks at, i.e. it defines the direction the camera looks. Only the direction of (lookAt - eye) matters. (eye = camera position)

For example:

eye = (-10, 0, 0) and lookAt = (0, 0, 0)

has the same effect as

eye = (-10, 0, 0) and lookAt = (2, 0, 0)


Mr X

Quote:
Original post by Mr X
the Up vector defines the current world-space's up, usually (0, 1, 0).


While this is normally the case, it does not have to be. The camera's up vector becomes the new Y axis (or whatever your vertical axis happens to be) in camera space. So, if you wanted to tilt the camera, e.g. to make it look like you were leaning out from behind a wall, you would want an up vector not equal to the world's up.

Quote:
Original post by Driv3MeFar
While this is normally the case, it does not have to be. The camera's up vector becomes the new Y axis (or whatever your vertical axis happens to be) in camera space. So, if you wanted to tilt the camera, e.g. to make it look like you were leaning out from behind a wall, you would want an up vector not equal to the world's up.


Of course, you're right. (I had implemented it correctly, but I'd forgotten about it.)

I think the following is correct.
It's C++/Direct3D pseudocode, but it's easy to translate.

We start with initial lookAt and up vectors, and then compute the current lookAt and up vectors according to the current eye position and rotation.


// we want to compute
D3DXVECTOR3 lookAt, up;

// the initial lookAt and up vectors.
D3DXVECTOR3 orgLookAt = D3DXVECTOR3(0.0f, 0.0f, 1.0f);
D3DXVECTOR3 orgUp = D3DXVECTOR3(0.0f, 1.0f, 0.0f);

// the current eye / camera position
D3DXVECTOR3 eye = ...

// the current rotation around x-axis
float directionX = ...

// the current rotation around y-axis
float directionY = ...

D3DXMATRIXA16 t, r, temp;

// compute translation matrix
D3DXMatrixTranslation(&t, eye.x, eye.y, eye.z);

// compute rotation matrix
D3DXMatrixRotationYawPitchRoll(&r, directionY, directionX, 0.0f);

// multiply them
D3DXMatrixMultiply(&temp, &r, &t);

// transform lookAt by the full rotation + translation, but up by the
// rotation only (up is a direction, not a point, so it must not be translated)
D3DXVec3TransformCoord(&lookAt, &orgLookAt, &temp);
D3DXVec3TransformCoord(&up, &orgUp, &r);

// now we have computed the current lookAt and up vectors





Mr X
