For the past few days I have been struggling with a problem in a graphics program I am writing. It displays a 3D sphere with lat/lon coordinates for the user to see, along with an overlay of the US mapped onto the sphere in true 3D. Everything works up to this point.

Here is what I have:

* The globe can be zoomed in and out by adjusting the horizontal field of view angle

* The globe can also be rotated

* The latitude and longitude are shown for the point that the screen is centered on (this works correctly)

Now, here is what I want to have:

* Show the latitude and longitude for the point that the mouse cursor is over (having issues with this)

Now, this is not like other posts where people ask how to unproject a 2D point (the mouse coordinates) into 3D. No. This is different. Instead, I am asking how to retrieve the latitude and longitude at the mouse coordinates, which is feasible since we know the size of the sphere in its 3D space as well as the x,y coordinates and the field of view.

So, here is what I have done so far.

I have managed to correctly calculate the change in horizontal angle and the change in vertical angle from the center. This is accomplished using hx = tan(fov/2) * d, where d is the distance to the center of the sphere. hx gives me the horizontal extent of the view at the current field of view (which is necessary since, again, the sphere can be zoomed in and out). I then take arcsin(((cursor.x - 400) * (hx / 400)) / SPHERE_RADIUS) to get the change in horizontal angle based on the mouse cursor's x deviation from the center of the screen.
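The calculation above can be sketched as a small function. This is an illustrative rendition under my own assumptions (an 800-pixel-wide screen, so 400 is the half-width, a unit `SPHERE_RADIUS`, and angles in radians), not the original code:

```cpp
#include <cassert>
#include <cmath>

// Assumed for illustration: unit sphere and 800px-wide screen (half-width 400).
const double SPHERE_RADIUS = 1.0;

// Change in horizontal angle (radians) for a cursor x position, given the
// horizontal field of view fov (radians) and distance d from the camera to
// the sphere's center. Mirrors hx = tan(fov/2) * d from the text.
double HorizontalAngleFromCursor(double cursorX, double fov, double d)
{
    double hx = std::tan(fov / 2.0) * d;              // half-width of the view at distance d
    double worldX = (cursorX - 400.0) * (hx / 400.0); // cursor offset in world units
    return std::asin(worldX / SPHERE_RADIUS);         // angle subtended on the sphere
}
```

Note the asin argument must stay within [-1, 1]; when zoomed far out, the cursor can sit off the sphere entirely, and that case needs to be rejected before calling asin.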

So you might think, okay, he's got the latitude and longitude of what the screen is centered on, as well as the change in vertical and horizontal angles. So what's the problem? Can't he just subtract the horizontal angle change from the longitude, and similar for latitude?

As it turns out, it's not that simple. For example, when the user is looking directly at latitude 90 degrees (the North Pole), any mouse deviation from the center of the screen produces a change in latitude, even if the mouse has only moved horizontally. But when looking at latitude 0 degrees (the equator), horizontal mouse movement affects only longitude.
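The two extremes can be captured with a little spherical trigonometry. If the horizontal cursor offset corresponds to moving an angular distance delta along the locally horizontal great circle through the screen center, the new latitude works out to asin(sin(lat0) * cos(delta)). This is a sketch under my own convention (z as the polar axis), not taken from the original code:

```cpp
#include <cassert>
#include <cmath>

// New latitude after moving an angular distance delta (radians) from a
// screen center at latitude lat0, along the locally horizontal great circle.
// Derived from p = dir*cos(delta) + east*sin(delta), with
// dir = (cos lat0, 0, sin lat0), east = (0, 1, 0), and lat = asin(p.z).
double LatitudeAfterHorizontalMove(double lat0, double delta)
{
    return std::asin(std::sin(lat0) * std::cos(delta));
}
```

At lat0 = 0 this returns 0 for any delta (the change is pure longitude), while at lat0 = 90 degrees it returns 90 - delta, matching the two extremes described above; in between, both latitude and longitude change.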

Herein lies the problem: what to do when the user is looking somewhere between those two extremes, say latitude 45 degrees? I tried using cos(latitude) * (change in vertical) plus sin(latitude) * (a combination of latitude and longitude) to mix the two, and eventually got it working at lat=0 and lat=90, but never for anything in between.

Going back to how I arrived at displaying the centered point's latitude and longitude in the first place: I use a quaternion for the rotation:

```
// Get orig quaternion
LatLonToQuaternion(g_angRotateX, g_angRotateY, x, y, z, w);
```

That gives me a quaternion from the latitude and longitude of my centered position. Yes, the lat/lon are flipped in the function's input, but I have verified that the output quaternion is correct.

So then, now that I have a quaternion, I should be able to multiply it by quaternions representing my change in latitude and longitude, yes?

```
// Rotate by lonX
RotationXYZToQuaternion(0.0f, lonX, 0.0f, tx, ty, tz, tw);
QuaternionMultiply(tx, ty, tz, tw, x, y, z, w, ux, uy, uz, uw);
x = ux; y = uy; z = uz; w = uw;

// Rotate by latX
RotationXYZToQuaternion(latX, 0.0f, 0.0f, tx, ty, tz, tw);
QuaternionMultiply(tx, ty, tz, tw, x, y, z, w, ux, uy, uz, uw);
QuaternionToLatLon(ux, uy, uz, uw, lonF, latF);
```

The above code uses a function called RotationXYZToQuaternion to produce a quaternion from a given X, Y, Z rotation. So I first get a quaternion for the change in horizontal angle and multiply it by the original quaternion derived from the lat/lon at the center of the screen. Then I do the same for the change in vertical angle, and finally convert the result back into a latitude and longitude.
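One thing worth double-checking in code like this: quaternion multiplication is not commutative, and whether the delta quaternion goes on the left or the right of the original determines whether the delta rotation is applied about world axes or about the camera's local axes. A minimal self-contained check, using my own (w, x, y, z) layout rather than the original helper functions:

```cpp
#include <cassert>
#include <cmath>

// Minimal quaternion in (w, x, y, z) layout, chosen for this sketch only.
struct Quat { double w, x, y, z; };

// Hamilton product a*b: applying a*b to a vector rotates by b first, then a.
Quat Mul(const Quat& a, const Quat& b) {
    return { a.w*b.w - a.x*b.x - a.y*b.y - a.z*b.z,
             a.w*b.x + a.x*b.w + a.y*b.z - a.z*b.y,
             a.w*b.y - a.x*b.z + a.y*b.w + a.z*b.x,
             a.w*b.z + a.x*b.y - a.y*b.x + a.z*b.w };
}

// Quaternion for a rotation of angle t (radians) about unit axis (ax, ay, az).
Quat AxisAngle(double ax, double ay, double az, double t) {
    double s = std::sin(t / 2.0);
    return { std::cos(t / 2.0), ax * s, ay * s, az * s };
}
```

If the result is right only at lat=0, swapping the operand order in QuaternionMultiply (i.e., applying the deltas in the rotated frame instead of the world frame) may be a cheap experiment worth trying.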

In this case the above code works at lat=0, but not at lat=90 or anywhere in between: I do not get back the expected latitude and longitude for the point the user is pointing the mouse cursor at.

So, this is where I am after about four days of struggling with it. I have managed to get this far, but I am really stuck.

Basically it boils down to this problem:

**Given:**

x=longitude at screen center

y=latitude at screen center

qx=change in the sphere's horizontal angle, as calculated from the deviation of the mouse cursor's x position from screen center (*not* the same thing as the change in longitude, unless lat=0)

qy=change in the sphere's vertical angle, as calculated from the deviation of the mouse cursor's y position from screen center (*also not* the same as the change in latitude, unless lat=0)

**Determine** the new latitude and longitude.
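One approach I can sketch that sidesteps mixing the angle deltas into lat/lon directly: treat the screen-center point as a unit vector, rotate it about the camera's local up axis by qx and about the local right axis by qy (Rodrigues' rotation formula), then read the latitude and longitude back off the rotated vector. The conventions here (z as the polar axis, axis orientations, radians) are my own assumptions, not taken from the code above:

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { double x, y, z; };

static Vec3 Cross(const Vec3& a, const Vec3& b) {
    return { a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x };
}
static double Dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Rodrigues' rotation of v about unit axis k by angle t (radians).
static Vec3 RotateAxis(const Vec3& v, const Vec3& k, double t) {
    Vec3 kxv = Cross(k, v);
    double kdv = Dot(k, v), c = std::cos(t), s = std::sin(t);
    return { v.x*c + kxv.x*s + k.x*kdv*(1.0 - c),
             v.y*c + kxv.y*s + k.y*kdv*(1.0 - c),
             v.z*c + kxv.z*s + k.z*kdv*(1.0 - c) };
}

// Given the screen center (lat0, lon0) and angular offsets qx, qy (all in
// radians), compute the lat/lon under the cursor by rotating the center
// point about the camera's local up and right axes.
void CursorLatLon(double lat0, double lon0, double qx, double qy,
                  double& lat, double& lon) {
    Vec3 dir = { std::cos(lat0) * std::cos(lon0),
                 std::cos(lat0) * std::sin(lon0),
                 std::sin(lat0) };                   // screen-center point
    Vec3 up  = { -std::sin(lat0) * std::cos(lon0),
                 -std::sin(lat0) * std::sin(lon0),
                  std::cos(lat0) };                  // camera-local "up"
    Vec3 right = Cross(dir, up);                     // camera-local "right"
    Vec3 p = RotateAxis(dir, up, qx);                // horizontal offset
    p = RotateAxis(p, right, qy);                    // vertical offset
    lat = std::asin(p.z);
    lon = std::atan2(p.y, p.x);
}
```

Because the up/right axes are taken at the view center, this handles lat=0, lat=90, and everything in between uniformly; the pole and equator cases fall out as special cases rather than needing separate handling.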

So I ask, if anyone can help me with this: how do I solve the problem? What do I do when the view is not centered at latitude 0? How are the changes in vertical and horizontal angle applied when, for example, the latitude is 30 degrees?

Any help would be *immensely* appreciated!

Thanks.