OK, first I'm calculating a point like this:
// Unproject the screen corner at window z = 0 (near plane) and z = 1 (far plane)
gluUnProject( 0.0f, 0.0f, 0.0f, mModel, mProj, viewport, &objXn, &objYn, &objZn);
gluUnProject( 0.0f, 0.0f, 1.0f, mModel, mProj, viewport, &objXf, &objYf, &objZf);

// Intersect the near->far ray with the Y = 0 plane
double dist = -objYn/(objYn-objYf);
ScreenLimits.x = (float)(objXn + dist*(objXn-objXf));
ScreenLimits.z = (float)(objZn + dist*(objZn-objZf));
This gives me the world coordinates where the ray through one corner of the screen hits the Y=0 plane. In this case, the value of ScreenLimits.x comes out as 77.6540. Sadly, when I render a point at ScreenLimits.x, 0.0f, ScreenLimits.z, I get nothing. HOWEVER, I can add this line:
ScreenLimits.x=77.6540;
I add this right after the ScreenLimits.z = ... line, so it overwrites ScreenLimits.x (which I've verified is already 77.6540) with the same value, just as a literal constant. Now the point displays correctly.
For some more freakiness, I've verified that the correct value is in ScreenLimits.x when it comes time to render the point, so it isn't getting scrambled through some sort of pointer issue; 77.6540 is the value passed to glVertex3f either way. Additionally, three more corners are calculated the same way after that one, and the last two work correctly. I've also tried stunts like ScreenLimits.x = ScreenLimits.x * 1.0f, but it makes no difference.
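For reference, the surrounding setup and the draw call look roughly like this (simplified; the matrices and viewport come straight from the GL state, and ScreenLimits is just a small struct of floats):

GLdouble mModel[16], mProj[16];
GLint viewport[4];
glGetDoublev(GL_MODELVIEW_MATRIX, mModel);   // matrices for gluUnProject
glGetDoublev(GL_PROJECTION_MATRIX, mProj);
glGetIntegerv(GL_VIEWPORT, viewport);

GLdouble objXn, objYn, objZn, objXf, objYf, objZf;
// ... the two gluUnProject calls and the Y=0 intersection from above go here ...

// ScreenLimits.x = 77.6540;   // the mystery line that makes it work when uncommented

glBegin(GL_POINTS);
glVertex3f(ScreenLimits.x, 0.0f, ScreenLimits.z);   // receives 77.6540 either way
glEnd();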
Any notions?