Freaky glPoint behavior

Ok, first I'm calculating a point like this:

	// Unproject the screen corner (0,0) at the near and far clip planes.
	gluUnProject( 0.0f, 0.0f, 0.0f, mModel, mProj, viewport, &objXn, &objYn, &objZn);
	gluUnProject( 0.0f, 0.0f, 1.0f, mModel, mProj, viewport, &objXf, &objYf, &objZf);

	// Intersect the near->far ray with the Y=0 plane.
	double dist = -objYn/(objYn-objYf);
	ScreenLimits.x = (float)(objXn + dist*(objXn-objXf));
	ScreenLimits.z = (float)(objZn + dist*(objZn-objZf));
 
This grabs the world coordinates where one corner of the screen projects onto the Y=0 plane. In this case, the value for ScreenLimits.x is 77.6540. Sadly, when I render a point using ScreenLimits.x, 0.0f, ScreenLimits.z, I get nothing. HOWEVER, I can add this line:

        ScreenLimits.x=77.6540;
 
I add this right after the ScreenLimits.z = ... line, thus setting ScreenLimits.x (which I verified is 77.6540) to the same value, but using a constant. Now it displays correctly. For some more freakiness, I have verified that the correct value is in ScreenLimits.x when it comes time to render the point (and that it is not getting scrambled through some sort of pointer issue). 77.6540 is the value passed to the glVertex3f call either way. Additionally, three more corners are calculated in the same way after that one, the last two of which work correctly. I experimented with a few stunts like ScreenLimits.x = ScreenLimits.x * 1.0f, but it made no difference. Any notions?
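
For completeness, here's roughly the surrounding code the snippet above assumes; this is only a minimal sketch, and the matrix fetching, the shape of ScreenLimits, and the immediate-mode draw call are assumptions rather than the exact code from the project:

	#include <windows.h>   // needed before the GL headers with MSVC
	#include <GL/gl.h>
	#include <GL/glu.h>

	struct Limits { float x, y, z; };   // assumed shape of ScreenLimits
	Limits ScreenLimits;

	GLdouble mModel[16], mProj[16];
	GLint    viewport[4];
	GLdouble objXn, objYn, objZn, objXf, objYf, objZf;

	void GrabMatrices()
	{
		// Assumed source of the matrices/viewport passed to gluUnProject above.
		glGetDoublev(GL_MODELVIEW_MATRIX, mModel);
		glGetDoublev(GL_PROJECTION_MATRIX, mProj);
		glGetIntegerv(GL_VIEWPORT, viewport);
	}

	void DrawCorner()
	{
		// The draw call in question: the same value (77.6540) goes into
		// glVertex3f whether it was computed or assigned as a constant.
		glPointSize(5.0f);
		glBegin(GL_POINTS);
		glVertex3f(ScreenLimits.x, 0.0f, ScreenLimits.z);
		glEnd();
	}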
Is ScreenLimits.x a double or a float?
gluUnProject(), which arguably should be gluUnproject (I don't know why OpenGL capitalized the 'P'), uses doubles.

I cannot explain why, but when using MS Visual C++ 6 Standard I have often had casting errors between doubles and floats that were only alleviated when I passed doubles into functions asking for doubles and then explicitly cast them to floats when assigning them to my storage variables.
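
In code, the pattern I mean looks something like this (just a sketch; the window-coordinate arguments and the storage variable are placeholders):

	// Keep doubles all the way through the call that asks for doubles...
	GLdouble objX, objY, objZ;
	gluUnProject(winX, winY, winZ, mModel, mProj, viewport, &objX, &objY, &objZ);

	// ...and only cast explicitly to float at the point of storage.
	ScreenLimits.x = (float)objX;
	ScreenLimits.z = (float)objZ;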

This problem occurred randomly for me.

...I hope that helps point you in the right direction.
ScreenLimits.x is a float, which is why I'm explicitly casting the results from UnProject as such.

I did figure out something else. Apparently if a glPoint is "off-screen", then it won't render, no matter the point size (learned something new). If I added the line:

ScreenLimits.x-=0.00001f;

Then it displayed. So I suppose it's a rounding error somewhere, even if it doesn't show during debugging. Could the debugger be truncating the displayed value? That would be the most reasonable explanation, I think.
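
In other words, the workaround boils down to something like this (just a sketch; the epsilon is the tiny offset mentioned above, with whatever sign pulls this particular corner back inside the view):

	// Pull the computed corner slightly inward so the point isn't sitting
	// exactly on the edge of the view volume and getting clipped.
	const float nudge = 0.00001f;
	ScreenLimits.x -= nudge;

	glBegin(GL_POINTS);
	glVertex3f(ScreenLimits.x, 0.0f, ScreenLimits.z);
	glEnd();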

