tomer_sh

OpenGL stupid question about OpenGL



I have a question about these lines:

glVertex3f( 0.0f, 1.0f, 0.0f);
glVertex3f(-1.0f,-1.0f, 0.0f);
glVertex3f( 1.0f,-1.0f, 0.0f);

In all the OpenGL examples I've seen, everything looks normalized. Why? Are the coordinates being scaled by some factor? When I render this triangle in OpenGL it comes out big, but if I draw the same triangle in my software renderer it is tiny. It seems as though every unit of 1 maps to some bigger number of pixels, like 32 or something.

Thanks,
tomer
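For context, those calls normally sit inside an immediate-mode block along these lines. This is only a sketch in the style of the common tutorials; the glTranslatef distance of -6.0 is an assumed typical value, not something quoted in the question:

glLoadIdentity();
glTranslatef(0.0f, 0.0f, -6.0f);    /* move the triangle 6 world units away from the camera */
glBegin(GL_TRIANGLES);
    glVertex3f( 0.0f,  1.0f, 0.0f); /* top */
    glVertex3f(-1.0f, -1.0f, 0.0f); /* bottom left */
    glVertex3f( 1.0f, -1.0f, 0.0f); /* bottom right */
glEnd();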

OpenGL forum, please.

No, there is no reason for everything to be normalised. It's just handy to use unit cubes and the like for tutorials.

The scaling you see is due to the camera's perspective transform: things that are close look bigger than things that are far away (see the sketch at the end of this post).

Father Ted:
<Holds up plastic toy cow>
Okay Dougal, these are small....
<Points to window, through which cows can be seen grazing>
Those, are far away......

Dougal:
<Looks puzzled, shakes his head & grins>
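For what it's worth, here is a minimal sketch of the projection setup those tutorial programs rely on; the 45-degree field of view and the 0.1/100.0 near and far planes are assumed typical values, not taken from any code in this thread:

/* Reshape handler: sets up a perspective projection (legacy fixed-function GL + GLU). */
void resize(int width, int height)
{
    if (height == 0) height = 1;    /* guard against a divide by zero */
    glViewport(0, 0, width, height);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluPerspective(45.0, (double)width / (double)height, 0.1, 100.0);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
}

With a projection like this, the triangle's size on screen depends on how far it is from the camera, not on the fact that its coordinates happen to be 1.0s.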

They are just units in world space, not pixel coordinates. In your software renderer the camera is simply further from the triangle than in your OpenGL sample. Like MrBastard said, using 1.0s is just easier for simple examples.
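To see that it really is camera distance and not the units themselves, draw the same triangle at two different depths. This is only a sketch with arbitrary distances, assuming a perspective projection has already been set up:

/* The same 2-unit-wide triangle drawn twice; under a perspective
   projection the nearer copy covers far more of the window. */
static void draw_triangle(void)
{
    glBegin(GL_TRIANGLES);
        glVertex3f( 0.0f,  1.0f, 0.0f);
        glVertex3f(-1.0f, -1.0f, 0.0f);
        glVertex3f( 1.0f, -1.0f, 0.0f);
    glEnd();
}

/* inside the display function */
glLoadIdentity();
glTranslatef(-1.5f, 0.0f, -6.0f);   /* close to the camera: appears big */
draw_triangle();

glLoadIdentity();
glTranslatef( 1.5f, 0.0f, -30.0f);  /* same triangle, much further away: appears small */
draw_triangle();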
