
Different rendering results across GL implementations - Problems drawing a GUI


Darragh    308
Hi everyone. I'm currently working on a little GUI system with various widgets and such for an upcoming game I'm making. I originally intended to use a free GUI library, but unfortunately I couldn't find one that fit my needs well, hence the reason I'm writing my own code for it.

Anyhow, to cut a long story short: my widgets are drawing differently on different machines. On the machine I work with at home (which has a Radeon X800 GPU) the GUI renders perfectly, with nothing out of place. However, when I test it on the much lower-end machines they have in college (which ultimately must be my target spec, since this is a college project), some of the lines and shapes I draw are displaced slightly, by a pixel. Those machines have only an Intel integrated graphics chipset, so the renderer is totally different from what I have at home.

You might say, so what? Well, since I'm going to have to work at a fairly low resolution (640x480), even a pixel's worth of difference does count. It might be a small difference, but it affects the overall look of the GUI and ruins the nice clean-cut appearance. It also annoys me quite a bit that if I specify a screen coordinate of, say, (2,2) on one system, then it might really be (2,3) or something else on another. It makes it really hard to get things like this right!

What I'm wondering, I guess, is whether any of you have run into such problems when writing your own GUI systems, and whether you know of any workarounds. I'm not using any fancy antialiasing or anything (which could affect things); I'm just drawing the widgets using GL_LINES and GL_QUADS, and all the coordinates are specified as integers. I'd be very interested to hear from people who have had similar problems, or who may have some ideas about it. Thanks for the help.

Just to demonstrate what I'm on about, here are two screenshots from two different systems running the same version of the executable.
You'll have to examine the borders around the widgets closely (the buttons in particular are where the difference is most noticeable), but you should be able to spot the discrepancy: some of the lines are out of position in the second picture. Rendered by an X800 GPU: (click for full view) Rendered by 'Intel Extreme Graphics 2': (click for full view)

kRogue    100
When you draw, do you call glVertex2f(pixel.x, pixel.y), where pixel.x and pixel.y are the integer pixel position to draw at, or do you call glVertex2f(pixel.x + 0.5f, pixel.y + 0.5f)? If you do the first, there is a chance that the pixel will end up one lower than you expect, because different hardware may go from the floating-point position to the actual integer position on the screen in different ways, i.e. by rounding or by plain truncation. Using glVertex2f(pixel.x + 0.5f, pixel.y + 0.5f) tries to guarantee that the vertex lands on (pixel.x, pixel.y) after all the transformations.


haegarr    7372
Although not written with different implementations in mind, an older Red Book (OpenGL Programming Guide) noted in its "Programming Tips" appendix that
Quote:
Red Book
If exact two-dimensional rasterization is desired, you must carefully specify both the orthographic projection and the vertices of primitives that are to be rasterized. The orthographic projection should be specified with integer coordinates, as shown in the following example:
gluOrtho2D(0, width, 0, height);
where width and height are the dimensions of the viewport. Given this projection matrix, polygon vertices and pixel image positions should be placed at integer coordinates to rasterize predictably. For example, glRecti(0, 0, 1, 1) reliably fills the lower left pixel of the viewport, and glRasterPos2i(0, 0) reliably positions an unzoomed image at the lower left of the viewport. Point vertices, line vertices, and bitmap positions should be placed at half-integer locations, however. For example, a line drawn from (x1, 0.5) to (x2, 0.5) will be reliably rendered along the bottom row of pixels into the viewport, and a point drawn at (0.5, 0.5) will reliably fill the same pixel as glRecti(0, 0, 1, 1).

and further, as a compromise if working with integer coordinates is desired:
Quote:
Red Book
An optimum compromise that allows all primitives to be specified at integer positions, while still ensuring predictable rasterization, is to translate x and y by 0.375, as shown in the following code fragment. Such a translation keeps polygon and pixel image edges safely away from the centers of pixels, while moving line vertices close enough to the pixel centers.

glViewport(0, 0, width, height);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
gluOrtho2D(0, width, 0, height);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glTranslatef(0.375, 0.375, 0.0);
/* render all primitives at integer positions */

(However, there may be some pitfalls with this solution. E.g. Blender (at least 2.40) used that magic 0.375, and one of the developers' comments suggests that s/he wasn't happy with it.)

Darragh    308
Thanks for the responses, guys. Yeah, I have been using an integer-sized viewport and integer coordinates, hence my puzzlement at the discrepancies between the two renderers. I think I'll try adding 0.5 to all my coordinates, though, and see what difference that makes. Hopefully it will do the trick!

