# glOrtho off-by-one errors?

## Recommended Posts

What's up with glOrtho and drawing primitives? I'm having all sorts of weird off-by-one errors in the 2D section of my rendering loop. They're all easily correctable, but I'd prefer to know WHY rather than just adjust them as they happen.

I first noticed this when I was drawing borders around a few objects on my screen using LINE_LOOP. It would draw everything fine except the very upper-left corner, which never got plotted. For the longest time I thought it wasn't plotting the first pixel for some reason. It was kind of annoying, but I was ignoring it for the time being by patching it up with a GL_POINT.

Then last night I was working on a new piece of the project (some window-drawing code) which required drawing lots of 2D lines and rectangles, and they all had to match my Photoshop template with pixel-perfect precision. That's when I figured out what LINE_LOOP was doing: as I went around the square clockwise from the upper-left corner, the lower-left corner was drawn one pixel to the left of the upper-left corner, so even though I was using the exact same X coordinate for both, the left side of the square showed up one pixel to the left of that X. It happens on other 2D primitives too, but lines are the really weird ones: I could find no apparent pattern for when an endpoint will be inclusive and when it won't.

I did get my window-drawing code to work properly, but the way I did it was to experiment with the different shapes drawn very small (2x2 or less) so that I could easily see when there was an error. Doing this I was able to develop the following set of macros, which draw the lines "properly". I was hoping someone would be able to explain why the values in there are the way they are, because to me they're just "magic numbers". When I say "properly", know that I come from 2D graphics libraries.
i.e., the upper-left corner of the screen is (0,0), and a line drawn from (1,1)-(2,1) is a horizontal line 2 pixels wide, offset 1x1 pixels from the upper-left corner. These macros need backslashes on the end of every line; the forum stripped them when I first posted, so I've re-added them here.

```c
#define glfLineH(X1,Y1,X2) {            \
    glBegin(GL_LINES);                  \
    glVertex2i((X1), (Y1)+1);           \
    glVertex2i((X2)+1, (Y1)+1);         \
    glEnd();                            \
}
#define glfLineV(X1,Y1,Y2) {            \
    glBegin(GL_LINES);                  \
    glVertex2i((X1)+1, (Y1));           \
    glVertex2i((X1)+1, (Y2)+1);         \
    glEnd();                            \
}
#define glfSquare(X1,Y1,X2,Y2) {        \
    glBegin(GL_LINE_LOOP);              \
    glVertex2i((X1), (Y1)+1);           \
    glVertex2i((X2)+1, (Y1)+1);         \
    glVertex2i((X2)+1, (Y2)+1);         \
    glVertex2i((X1)+1, (Y2)+1);         \
    glEnd();                            \
}
#define glfBox(X1,Y1,X2,Y2) {           \
    glBegin(GL_QUADS);                  \
    glVertex2i((X1), (Y1));             \
    glVertex2i((X2)+1, (Y1));           \
    glVertex2i((X2)+1, (Y2)+1);         \
    glVertex2i((X1), (Y2)+1);           \
    glEnd();                            \
}
#define glfPoint(X1,Y1) {               \
    glBegin(GL_POINTS);                 \
    glVertex2i((X1), (Y1)+1);           \
    glEnd();                            \
}
```


I think it's pretty clear from their names and the code what they are supposed to do (and they do work, I just don't know why). You'll notice that the one that draws a filled box out of a QUAD sort of makes sense: presumably GL considers the right and bottom edges exclusive. However, when I draw the exact same shape with a LINE_LOOP I have to use different coordinates, and most of them get a +1, except the first X coordinate, which doesn't for some reason.

For reference, here is my function to enter 2D orthographic mode. The "game.renders" test is for making a smaller message-console window when the program is running as a dedicated multiplayer server (in the case I'm dealing with, game.renders is 1). It's not the culprit, though; I know it's the first thing I homed in on, but I tried commenting it out both ways and it did not change anything. I also thought maybe I was declaring the screen size one pixel too big and it was scaling everything, but that's apparently not the case either: the primitives appear exactly the same way no matter where they are on the screen, and adding +1 or -1 to SCREEN_WIDTH and SCREEN_HEIGHT did not fix it.
```c
void rend_Enter_2D(void)
{
    // switch to 2D drawing mode
    glMatrixMode(GL_PROJECTION);
    glPushMatrix();
    if (!game.renders)
        glOrtho(0, window_width, window_height, 0, -1, 1);
    else
        glOrtho(0, SCREEN_WIDTH, SCREEN_HEIGHT, 0, -1, 1);
    glMatrixMode(GL_MODELVIEW);
    glPushMatrix();
    glDisable(GL_DEPTH_TEST);
    glDisable(GL_LIGHTING);
}
```


One more conundrum on top of all this: when I turn on GL_LINE_STIPPLE, the rules change and my "glf" macros no longer work. I haven't fully investigated the behavior while LINE_STIPPLE is on, but it appears that the line endpoints behave "normally", except that everything I draw is offset by 1x1 pixels, so (0,0) is actually one pixel right and down from the top of the screen. Whereas with LINE_STIPPLE off I can't draw anything at (0,0), because it ends up one pixel past the top of the screen.
Obviously something is Not Right here and it's probably either my code or my understanding of the API, so does anybody have a better version of either?

##### Share on other sites
For the sake of getting you a quick answer and something easy to try:
It probably has to do with the way OpenGL (and really any 3D API) addresses texels/pixels. You are specifying coordinates at whole numbers, which happen to be the borders between two pixels, not dead center on a pixel. That causes some of your lines to pop from one pixel to the other, along with the other alignment issues you're seeing. What you need to do is draw at the center of each pixel: instead of a line from (0, 0) to (2, 2), draw from (0.5, 0.5) to (2.5, 2.5).

You can do this either by appending .5 to each of your coordinates, or the easy way is to simply modify the modelview matrix to shift everything to texel centers before any 2D drawing by doing glTranslatef(0.5f, 0.5f)

Hope this helps

##### Share on other sites
glTranslatef(0.5f, 0.5f) does not work; it is missing the Z coordinate. glTranslatef(0.5f, 0.5f, 0) does not make any difference. I can not append 0.5 to my coordinates while drawing because I am using glVertex2i.

If that is the case, then how come everything shows up the same way no matter where on the screen I draw it? If it were a floating-point error, wouldn't the error come and go as an object moved around the screen, just like it would for scaling?

##### Share on other sites
But I don't want to use glVertex2f! I'm drawing 2D objects at integer positions; there's no reason to use glVertex2f except maybe to test whether it fixes the problem. It's stupid to hold information like that in floats, it's too inefficient and slow. And if the solution is to use glVertex2f, then why did they even bother making glVertex2i if you can't use it because it doesn't work right?

##### Share on other sites
Try modifying the modelview matrix by 0.375f; that's from the opengl.org FAQ and it works for me, I get pixel-perfect drawing.

My init code:

```c
glMatrixMode(GL_PROJECTION);
glOrtho(0, 640, 0, 480, -1, 1);

glMatrixMode(GL_MODELVIEW);
glTranslatef(0.375f, 0.375f, 0);
```

##### Share on other sites
Quote:
 It's stupid to hold information like that in floats, it's too inefficient and slow.

A/ an int and a float contain the same number of bytes, i.e. they're the same size
B/ floats are more likely to be on the fast path for vertex data; using ints can throw you into software, IIRC floats are best for all data except maybe GLubyte for colors

##### Share on other sites
Thanks! That does have an effect. Unfortunately, all it does is change which pixel is missing: when I draw a GL_LINE_LOOP, instead of the upper-left pixel it's now the lower-right one. I did find the place in the OpenGL FAQ that says to add the .375, but it doesn't say why; all it says is "you might want to put a small translation in the ModelView matrix", with no explanation of where the number .375 comes from. So although this is probably related to the matrix somehow, seeing as translating by .375 affected it, I can't figure out how to fix it properly because I have no information about why I should make that transformation.

##### Share on other sites
Quote:
 A/ an int + a float contain the same number of bytes, ie theyre the same size
Well, that's true, but not for my ints; I know most C compilers consider an int to be 4 bytes, but when I say "int" I'm usually talking about an "unsigned short", i.e. 16 bits. What C calls an "int" I would call an "unsigned long"; just one of my programming quirks, I guess, it just seems right that way. (I also store all multibyte values big-endian, incidentally.)
Quote:
 B/ floats WRT vertices are more likely to be on the fast path, using ints for data can throw u into software, IIRC floats are best for all data except maybe GLubyte for color
OK, I can believe that, although it seems counter-intuitive coming from 2D games, where floats are the devil. What are the integer vertex functions for, then? I thought I had read that they were provided because they were supposed to be faster.