Quaternion from latitude and longitude

4 comments, last by apatriarca 11 years, 10 months ago
Hi all,

I want to construct a quaternion from a latitude and longitude. Could anyone tell me how to construct the quaternion?


the coordinate frame details:

Given that the Z axis points to the north pole. The X axis points to the intersection of the prime meridian and the equator, in the equatorial plane. The Y axis completes a right-handed coordinate system; it is 90 degrees east of the X axis and also lies in the equatorial plane.

It seems that I can get the quaternion by multiplying QLat = (sin(lat), 0, 0, cos(lat)) and QLon = (0, 0, sin(lon), cos(lon)), i.e. Quat = QLat*QLon, but the result is incorrect when I apply it to the camera. Is there any mistake? Thank you very much!

A unit quaternion represents an orientation or rotation, but you only have a single point on the unit sphere in 3D. What does this point really represent? The composition of two different rotations? One of the axes? You have written two rotations in your post, but the formula you have used is probably wrong. A quaternion representing a rotation is in fact usually given in the form cos(angle/2) + sin(angle/2)*axis.
I don't know if this will help, but it converts a latitude/longitude to a unit vector and back again. You can always play around with the axes to find the orientation that you need. And yeah, I dunno what you're trying to achieve with all that other stuff, so I'll ignore it.


#include <cmath>

const double pi = 3.14159265358979323846;

void latlon_to_xyz(const double lat, const double lon, double &x, double &y, double &z)
{
    double theta = 2*pi*(lon + 180.0)/360.0;
    double phi = pi*(lat + 90.0)/180.0;

    x = -(cos(theta)*sin(phi));
    z = -(sin(theta)*sin(phi));
    y = -cos(phi);
}


void xyz_to_latlon(const double x, const double y, const double z, double &lat, double &lon)
{
    double theta = pi + atan2(z, x);
    double phi = acos(-y);

    lat = phi/pi*180.0 - 90.0;
    lon = theta/(2*pi)*360.0 - 180.0;
}
Thanks for all replies!

Actually, I want to orient a camera in a virtual globe application using OpenGL. In this app, the globe camera is described by latitude, longitude and azimuth. (The lat/lon are not the position of the camera; they are the viewing target of the camera. There are some other parameters such as the distance from the looking target and the tilt.)

At the beginning, the camera is aligned with the world coordinate frame (the viewing direction of the camera is toward the world -Z axis).

When the user specifies the lat, lon and azimuth angles, the camera should be targeted at the specified lat/lon from space at some distance.

Given a (longitude, latitude) as the viewing target of the camera, I first rotate the camera by the longitude angle around the local Z axis.

Then the camera is rotated by the latitude angle around the local X axis of the camera.

So the quaternion of the orientation is Q' = Qlat*Qlon. The angles in Qlat and Qlon have been halved before constructing the quaternions.

I have used Q' in the app, but the result is incorrect.

Is there any problem in using a quaternion to orient a camera described by latitude and longitude? I hope someone can pinpoint the problem in my method.

Thank you very much!



Here is how I convert spherical coordinates to a normalized direction vector: float3::SetFromSphericalCoordinates, and the other way around: float3::ToSphericalCoordinates.

I use spherical coordinates to generate a simple freelook camera input system where the left and right keyboard buttons (and mouse x) increment and decrement the azimuth, generating a yaw rotation effect. Then up and down keyboard buttons (and mouse y) increment and decrement inclination, providing a vertical pitch lookat control.

The code looks like this:

InputSystem input;
Frustum camera; // preinitialized to initial position and orientation.

void MoveCamera(float dt)
{
    float2 spherical = camera.front.ToSphericalCoordinatesNormalized();
    if (input.IsKeyDown(Left)) spherical.x -= dt;
    if (input.IsKeyDown(Right)) spherical.x += dt;
    if (input.IsKeyDown(Up)) spherical.y -= dt;
    if (input.IsKeyDown(Down)) spherical.y += dt;
    camera.front = float3::FromSphericalCoordinates(spherical.x, spherical.y);
    camera.up = float3::unitY;
    float3::Orthonormalize(camera.front, camera.up);

    // If I wanted a quaternion out of the camera transform, I'd do the following:
    Quat q = camera.WorldMatrix().Float3x3Part().ToQuat();
}


The Frustum, float2, float3 and other classes are from MathGeoLib.

Strictly interpreted, a (latitude, longitude) pair can only be converted to a normalized direction vector. To convert this to a full orientation, one needs to fix one additional degree of freedom, namely the twist/roll around the lookat direction. In the above code that is achieved by constraining the camera up direction towards +Y (i.e. in this case, zero roll around the forward lookat direction).
As clb already said, your problem is under-constrained. You have to add some other constraint, like requiring the +Y (or the +Z in your case) axis to point upwards. You should then compute a unit vector from your spherical coordinates and use your additional constraint to compute a complete orthonormal basis. In this way you basically have the corresponding rotation matrix (the matrix whose rows are the basis vectors of the reference frame) and you can use a matrix-to-quaternion conversion to get your quaternion (or maybe you can use some more direct, but equivalent, frame-to-quaternion conversion). In general, I suggest looking at the implementation of gluLookAt and searching for matrix-to-quaternion conversion formulas (for example here).

This topic is closed to new replies.
