Why won't the square start from the origin?

Started by
3 comments, last by Seroja 16 years, 8 months ago
Hi, I'm new to OpenGL. I wrote a simple square that is supposed to start at (0.0, 0.0), but when I run it, the square appears offset from the corner, as if it started at (10, 10) or (1, 1) instead (I'm really new to this). (There was an ASCII sketch of the window here, but it didn't survive the editor.) Here's the code:

- (void) drawRect: (NSRect) bounds {
    glClearColor(1, 1, 1, 0);
    glClear(GL_COLOR_BUFFER_BIT);

    glBegin(GL_POLYGON);
    glColor3f(0.0, 0.0, 1.0); glVertex2f(0.0, 0.0);
    glColor3f(0.0, 1.0, 0.0); glVertex2f(0.0, 2.0);
    glColor3f(0.0, 1.0, 1.0); glVertex2f(2.0, 2.0);
    glColor3f(1.0, 0.0, 0.0); glVertex2f(2.0, 0.0);
    glEnd();

    glFlush();
}

By the way, I'm using Xcode and Interface Builder.
My memory may be failing me, but isn't (0,0) mapped to the center of the screen in OpenGL, as opposed to a corner?
How did you set up your projection? If it's perspective, (0,0,0) is actually at the middle of the viewport (window). If it's orthographic, then it depends on the parameters passed to glOrtho.

Anyway, a screenshot could be useful.
I think ToohrVyk is right. How can I set it to ortho?
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0, 1, 0, 1, -1, 1);
glMatrixMode(GL_MODELVIEW);


That would make (0,0) bottom-left and (1,1) top-right.

This topic is closed to new replies.