

This topic is now archived and is closed to further replies.

Brother Erryn

Freaky glPoint behavior


OK, first I'm calculating a point like this:
	// Unproject the screen corner at the near plane (winZ = 0) and far plane (winZ = 1)
	gluUnProject( 0.0, 0.0, 0.0, mModel, mProj, viewport, &objXn, &objYn, &objZn);
	gluUnProject( 0.0, 0.0, 1.0, mModel, mProj, viewport, &objXf, &objYf, &objZf);

	// Parameter along the near-to-far ray where Y = 0
	double dist = -objYn/(objYn-objYf);
	ScreenLimits.x = (float)(objXn + dist*(objXn-objXf));
	ScreenLimits.z = (float)(objZn + dist*(objZn-objZf));
This grabs the world coordinates for Y=0 at one corner of the screen. In this case, the value of ScreenLimits.x is 77.6540. Sadly, when I render a point using (ScreenLimits.x, 0.0f, ScreenLimits.z), I get nothing. HOWEVER, I can add this line:
	ScreenLimits.x = 77.6540f;
I add this right after the ScreenLimits.z = ... line, so ScreenLimits.x (which I verified is 77.6540) is set to the same value, but via a constant. Now the point displays correctly. For some more freakiness, I have verified that the correct value is in ScreenLimits.x when it comes time to render the point, so it is not getting scrambled through some sort of pointer issue; 77.6540 is the value passed to the glVertex3f call either way. Additionally, three more corners are calculated the same way after this one, and the last two of those work correctly. I experimented with a few stunts like ScreenLimits.x = ScreenLimits.x * 1.0f, but it makes no difference. Any notions?

Is ScreenLimits.x a double or a float?
gluUnProject() uses doubles (GLdouble) for all of its coordinate parameters.

I cannot explain why, but with MS Visual C++ 6 Standard I have often had casting problems between doubles and floats that were only alleviated when I passed doubles into functions asking for doubles and then explicitly cast them to floats when assigning them to my storage variables.

This problem occurred randomly for me.

...I hope that helps point you in the right direction.

ScreenLimits.x is a float, which is why I'm explicitly casting the results from gluUnProject as such.

I did figure out something else. Apparently if a glPoint is "off-screen", then it won't render, no matter the point size (learned something new). If I added the line:


Then it displayed. So I suppose it's a rounding error somewhere, even if it doesn't show during debugging. Could the debugger be truncating the displayed value? That would be the most reasonable explanation, I think.
