Drawing directly to the screen in OpenGL

Started by
4 comments, last by Mantear 18 years, 6 months ago
Can it be done? Mind you, I'm not talking about drawing to the frame buffer. Whenever I try to do any 2D stuff, it never looks quite right, and I figured that menuing might work better when drawing directly to the screen.
--------
Whoisdoingthis.com - my stupid website about warning labels.
What is 'not quite right'?
I'd say you would be better off drawing to the screen using triangles/quads with an orthographic projection, but you could use glDrawPixels, which copies memory directly to the framebuffer.

-- Jonathan
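For reference, a minimal sketch of the glDrawPixels route mentioned above, assuming a current GL context with a pixel-aligned orthographic projection already set up. The function name and the `pixels` buffer (a tightly packed RGBA image of size w*h) are hypothetical:

```c
#include <GL/gl.h>

/* Copy a CPU-side RGBA image straight into the framebuffer at (x, y). */
void blit_pixels(const unsigned char *pixels, int w, int h, int x, int y)
{
    glRasterPos2i(x, y);                    /* lower-left corner of the image */
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);  /* rows are tightly packed */
    glDrawPixels(w, h, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
}
```

Note that glDrawPixels stalls the pipeline on most drivers, which is why it tends to be slower than drawing a textured quad.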
Quote:Original post by Pipo DeClown
What is 'not quite right'?


I'm mostly trying to draw directly to the screen for User Interface aspects.

The location of the images being blitted to the screen is sometimes off by one pixel in some direction. Possibly my calculations aren't perfect, but it's annoying to have to fight with this when I just want to perform a simple Blt.
(I'm having a hard time getting the position to the EXACT pixel I want.)

glDrawPixels seems to be quite a bit slower than drawing a textured square.

Drawing a textured square tends to yield a blurred image (which is especially noticeable if it contains text) and is even more difficult to place at the exact desired pixel. Also, sometimes part of the 3D scene will draw on top of it.

---

Is there a way to grab the actual video memory or use any of the classic "Blitting" functions?

Thanks for answering my n00b questions.
--------
Whoisdoingthis.com - my stupid website about warning labels.
MSDN:

Quote:
To obtain exact two-dimensional rasterization, carefully specify both the orthographic projection and the vertices of the primitives that are to be rasterized. Specify the orthographic projection with integer coordinates, as shown in the following example:
gluOrtho2D(0, width, 0, height);
The parameters width and height are the dimensions of the viewport. Given this projection matrix, place polygon vertices and pixel image positions at integer coordinates to rasterize predictably. For example, glRecti(0, 0, 1, 1) reliably fills the lower-left pixel of the viewport, and glRasterPos2i(0, 0) reliably positions an unzoomed image at the lower-left pixel of the viewport. However, point vertices, line vertices, and bitmap positions should be placed at half-integer locations. For example, a line drawn from (x1, 0.5) to (x2, 0.5) will be reliably rendered along the bottom row of pixels in the viewport, and a point drawn at (0.5, 0.5) will reliably fill the same pixel as glRecti(0, 0, 1, 1).

An optimum compromise that allows all primitives to be specified at integer positions, while still ensuring predictable rasterization, is to translate x and y by 0.375, as shown in the following code sample. Such a translation keeps polygon and pixel image edges safely away from the centers of pixels, while moving line vertices close enough to the pixel centers.

glViewport(0, 0, width, height);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
gluOrtho2D(0, width, 0, height);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glTranslatef(0.375, 0.375, 0.0);
/* render all primitives at integer positions */


As for the other problems (blurred images and the 3D scene obscuring the 2D drawing), you just need to disable texture filtering and depth testing while you're in 2D mode.
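Putting that together, here is one possible sketch of entering and leaving a "2D mode" for UI drawing, assuming a current GL context. The function names are hypothetical; `width`/`height` are the viewport size and `tex` is a placeholder texture handle for your UI image:

```c
#include <GL/gl.h>
#include <GL/glu.h>

void begin_2d_ui(int width, int height, GLuint tex)
{
    /* Orthographic projection mapping GL units 1:1 to pixels */
    glMatrixMode(GL_PROJECTION);
    glPushMatrix();
    glLoadIdentity();
    gluOrtho2D(0, width, 0, height);
    glMatrixMode(GL_MODELVIEW);
    glPushMatrix();
    glLoadIdentity();

    /* Depth test off so the 3D scene can't draw over the UI */
    glDisable(GL_DEPTH_TEST);

    /* Nearest-neighbour filtering so texels map 1:1 to pixels (no blur) */
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

    /* ... draw UI quads at integer coordinates here ... */
}

void end_2d_ui(void)
{
    /* Restore 3D state */
    glEnable(GL_DEPTH_TEST);
    glMatrixMode(GL_PROJECTION);
    glPopMatrix();
    glMatrixMode(GL_MODELVIEW);
    glPopMatrix();
}
```

Pushing and popping the matrices keeps the 3D camera setup intact around the UI pass.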
Unfortunately, the example from MSDN does not work in all cases. Different video cards and drivers can produce different results, even if you follow what was stated. By fiddling with the translation offset, you can get the pixel-perfect alignment that you desire, but more than likely it won't be perfect on all PCs. I've used a laptop with a GeForce4 and a desktop with a Radeon 9800, and they produce different results. I can get it to look right on both, but not with the same offset values.

This topic is closed to new replies.
