Matrix calculated by gluLookAt?


I've converted all my objects' rotations to 4x4 matrices, and I consider my camera an inherited "object" (so I can make it affected by things like forces: gravity, acceleration, etc.). This means the camera inherits an orientation matrix, and I'd like to be able to use its forward, up and right vectors plus its position to make a matrix like the one generated by gluLookAt. How does this function make its matrix?

Wow, I'm still really confused...

edit: found this: http://pyopengl.sourceforge.net/documentation/manual/gluLookAt.3G.html

Silly question, but is that "really" correct? Don't orientation matrices usually go down columns, like:

rightX upX forwardX
rightY upY forwardY
rightZ upZ forwardZ

rather than going across like they have it? And in the example, f is just the normalized forward vector, and up is the normalized up vector... so isn't s just the normalized right vector?

And then they say u=s x f... but wouldn't that just be crossing right with forward, which returns up?

Basically, according to this page, a camera matrix has the up, right and forward vectors across the rows, with the forward vector multiplied by -1. Finally, where would I put a translation into this matrix?

[edited by - sirSolarius on March 24, 2004 3:00:25 PM]

In an OpenGL matrix (column major), you would take an identity matrix, then set these elements:

m[0]=right.x
m[4]=right.y
m[8]=right.z

m[1]=up.x
m[5]=up.y
m[9]=up.z

m[2]=look.x
m[6]=look.y
m[10]=look.z

Notice that right, up and look run across the rows of the matrix (the layout the man page shows), even though each one occupies a column of the array in memory. That's because this is the world-to-camera transform, i.e. the inverse (transpose) of the camera's own orientation matrix.

gluLookAt() then calls glMultMatrix() with this matrix to concatenate it onto the current matrix. Then apply the translation to the eye-point with glTranslatef(-eye.x, -eye.y, -eye.z);

Orthogonal vectors such as right, look and up inherently define the rotation matrix, so there is no tricky math required. Just simple element setting.

edit: Note that the look vector here is eye-center (the vector from the point you're looking at back toward the eye), normalized. gluLookAt() itself computes center-eye and negates it, which comes to the same thing. All the axis vectors must be normalized to unit length to prevent distortion.
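Here's a rough sketch in C of the whole thing put together. The vec3 type and the v3_sub/v3_cross/v3_normalize helpers are made up just for illustration (they aren't part of OpenGL); only glMultMatrixf() and glTranslatef() are real GL calls:

/* Sketch of a gluLookAt()-style helper. vec3 and the v3_* helpers
   are assumed; only glMultMatrixf/glTranslatef are OpenGL calls. */
#include <math.h>
#include <GL/gl.h>

typedef struct { float x, y, z; } vec3;

static vec3 v3_sub(vec3 a, vec3 b) { vec3 r = { a.x - b.x, a.y - b.y, a.z - b.z }; return r; }
static vec3 v3_cross(vec3 a, vec3 b) {
    vec3 r = { a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x };
    return r;
}
static vec3 v3_normalize(vec3 v) {
    float len = sqrtf(v.x*v.x + v.y*v.y + v.z*v.z);
    vec3 r = { v.x/len, v.y/len, v.z/len };
    return r;
}

void myLookAt(vec3 eye, vec3 center, vec3 up)
{
    /* look = eye - center points from the target back toward the eye,
       so no multiply by -1 is needed. */
    vec3 look  = v3_normalize(v3_sub(eye, center));
    vec3 right = v3_normalize(v3_cross(up, look));  /* right = up x look */
    vec3 u     = v3_cross(look, right);             /* recomputed up, guaranteed orthogonal */

    /* Column-major array: right, u and look land across the rows. */
    GLfloat m[16] = {
        right.x, u.x,  look.x, 0.0f,   /* first column in memory */
        right.y, u.y,  look.y, 0.0f,
        right.z, u.z,  look.z, 0.0f,
        0.0f,    0.0f, 0.0f,   1.0f
    };

    glMultMatrixf(m);
    glTranslatef(-eye.x, -eye.y, -eye.z);
}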


Golem
Blender--The Gimp--Python--Lua--SDL
Nethack--Crawl--ADOM--Angband--Dungeondweller


[edited by - VertexNormal on March 24, 2004 3:06:27 PM]

quote:
Original post by sirSolarius
Wow, I'm still really confused...

edit: found this: http://pyopengl.sourceforge.net/documentation/manual/gluLookAt.3G.html

Silly question, but is that "really" correct? Don't orientation matrices usually go down columns, like:

rightX upX forwardX
rightY upY forwardY
rightZ upZ forwardZ

rather than going across like they have it? And in the example, f is just the normalized forward vector, and up is the normalized up vector... so isn't s just the normalized right vector?

And then they say u=s x f... but wouldn't that just be crossing right with forward, which returns up?

Basically, according to this page, a camera matrix has the up, right and forward vectors across the rows, with the forward vector multiplied by -1. Finally, where would I put a translation into this matrix?

[edited by - sirSolarius on March 24, 2004 3:00:25 PM]



OpenGL matrices are column-major rather than row-major, so many OpenGL folks will write a matrix "backwards". It's a little tricky getting used to it. Just remember that if you have a Direct3D matrix, you need to transpose it (swap rows and columns) to use it in OpenGL.

Translation in an OpenGL matrix is specified at M[12]=x, M[13]=y, M[14]=z.

edit: And yes, you are correct. u=s x f, s=f x u and f = u x s. Those three vectors together, as long as they are orthogonal (perpendicular to each other) define an orientation. All the Rotate() calls you make simply use trigonometric functions to, in effect, generate the components of these 3 vectors. gluLookAt() performs this trick just in case the given UP vector is not exactly orthogonal. It will calculate side from look, then recalculate up, to ensure orthogonality.

argh. second edit: The forward vector is multiplied by -1 because gluLookAt calculates it as center-eye. I usually calculate it as eye-center and skip the multiply by -1, but it's the same thing, really. It's just that OpenGL maps the look vector onto the -Z axis.
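And since your camera already keeps its own orientation matrix like any other object, here's a quick sketch of going from that straight to the view transform: transpose the 3x3 rotation part (the inverse of a pure rotation), then translate by the negated position. The function and parameter names are just placeholders for whatever your camera actually stores:

/* Sketch: apply the view transform from a camera object's own
   column-major orientation matrix (right/up/look down the columns).
   The parameter names are hypothetical. */
void applyCameraView(const GLfloat orient[16], float posX, float posY, float posZ)
{
    GLfloat view[16];
    int row, col;

    /* Transpose the upper-left 3x3 (the inverse of a pure rotation). */
    for (row = 0; row < 3; ++row)
        for (col = 0; col < 3; ++col)
            view[col*4 + row] = orient[row*4 + col];

    /* Last row and column stay identity; the translation slots
       M[12..14] are left at zero and handled by glTranslatef below. */
    view[3]  = view[7]  = view[11] = 0.0f;
    view[12] = view[13] = view[14] = 0.0f;
    view[15] = 1.0f;

    glMultMatrixf(view);
    glTranslatef(-posX, -posY, -posZ);
}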


Golem
Blender--The Gimp--Python--Lua--SDL
Nethack--Crawl--ADOM--Angband--Dungeondweller


[edited by - VertexNormal on March 24, 2004 3:12:18 PM]


[edited by - VertexNormal on March 24, 2004 3:20:17 PM]

quote:
Original post by VertexNormal
OpenGL matrices are column-major rather than row-major, so many OpenGL folks will write a matrix "backwards". It's a little tricky getting used to it. Just remember that if you have a Direct3D matrix, you need to transpose it (swap rows and columns) to use it in OpenGL.


Look closer at the matrices in DX and you will see that they are exactly the same. Just because they have their own operator to use "correct" indices doesn't mean anything. If you memcpy the DX matrix to an OpenGL matrix it should work right away.
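For instance, a quick sketch of what I mean (the Matrix4 struct below just mirrors the usual _11.._44 layout from memory, so treat it as illustrative): the translation ends up at the same flat offsets in both APIs, and the raw floats can go straight to glLoadMatrixf() without a transpose:

/* Sketch: a D3D-style matrix handed to OpenGL as-is.
   Matrix4 is a stand-in for the DX matrix layout. */
#include <string.h>
#include <GL/gl.h>

typedef struct {
    float _11, _12, _13, _14;
    float _21, _22, _23, _24;
    float _31, _32, _33, _34;
    float _41, _42, _43, _44;   /* _41.._43: translation in DX */
} Matrix4;

void loadIntoOpenGL(const Matrix4 *dx)
{
    GLfloat m[16];
    memcpy(m, dx, sizeof(m));   /* no transpose needed */
    glLoadMatrixf(m);           /* translation is at m[12..14] here too */
}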

quote:
Original post by Trienco
quote:
Original post by VertexNormal
OpenGL matrices are column-major rather than row-major, so many OpenGL folks will write a matrix "backwards". It's a little tricky getting used to it. Just remember that if you have a Direct3D matrix, you need to transpose it (swap rows and columns) to use it in OpenGL.


Look closer at the matrices in DX and you will see that they are exactly the same. Just because they have their own operator to use "correct" indices doesn't mean anything. If you memcpy the DX matrix to an OpenGL matrix it should work right away.




Ah, I didn't know that. I always thought DX matrices were stored row major, for some reason. Of course, it's also been years since I've used DX...

Thanks, Trienco.
