Hardware Antialiasing

Started by
5 comments, last by zedzeek 19 years, 5 months ago
Hi everyone, does anybody know of a way to control the level of hardware antialiasing through an OpenGL or wgl command? Most graphics cards have antialiasing settings that users can set (0x, 2x, 4x, etc.), but there's usually an option to "Allow the application to choose". As an application programmer, how do I specify what my app's preference is? Thanks in advance!
From the OpenGL superbible 3rd edition:

You first have to request a multisampled framebuffer; under GLUT you'd do something like:

glutInitDisplayMode( GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH | GLUT_MULTISAMPLE );

There are operating system dependent ways too.

You can then use glEnable and glDisable to turn multisampling on and off.

The book suggests that you might turn it off when drawing points (which can be antialiased anyway) and such.
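For example, a minimal sketch of that toggle in your render path (assuming the context was created with a multisampled pixel format; GL_MULTISAMPLE is the GL 1.3 token, older headers spell it GL_MULTISAMPLE_ARB, and drawPoints() is just a hypothetical helper):

glDisable( GL_MULTISAMPLE );   // fall back to point smoothing for point primitives
glEnable( GL_POINT_SMOOTH );
drawPoints();                  // hypothetical helper for your point geometry
glDisable( GL_POINT_SMOOTH );
glEnable( GL_MULTISAMPLE );    // back to multisampling for everything else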

By default it doesn't multisample alpha; you can turn this on with
glEnable and one of the GL_SAMPLE_* values.

glSampleCoverage looks like it allows some further (implementation dependent) fine tuning.
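For example (a sketch; these tokens and glSampleCoverage come from ARB_multisample, core since GL 1.3):

glEnable( GL_SAMPLE_ALPHA_TO_COVERAGE );  // derive the coverage mask from fragment alpha
glEnable( GL_SAMPLE_COVERAGE );           // makes glSampleCoverage take effect
glSampleCoverage( 0.5f, GL_FALSE );       // cover roughly half the samples, no mask inversion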
-- Jonathan
I wonder if we should add "please don't cross-post to OpenGL.org" to the forum FAQ/rules. Every so often there seems to be a lot of it going on, and it's not like there was a huge gap between the answering of the question on either board... [rolleyes]
And yet both responses were subjectively different...

However, I apologize, and I won't do it again. I was unaware such things were prohibited.

In response to Lucid's speedy and informative reply:

I'm not using GLUT, so without wanting to trace the GLUT source, is there an easy way to translate: "glutInitDisplayMode( GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH | GLUT_MULTISAMPLE );" into gl commands?
Quote:Original post by DeadPath
And yet both responses were subjectively different...

However, I apologize, and I won't do it again. I was unaware such things were prohibited.


It's not prohibited as such, it's just a case of manners more than anything. Posting in both places can lead to two groups of people answering the question, wasting one group's time if an answer has already been given. No harm done, just something to keep in mind [smile]

Quote:
I'm not using GLUT, so without wanting to trace the GLUT source, is there an easy way to translate: "glutInitDisplayMode( GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH | GLUT_MULTISAMPLE );" into gl commands?


As for the question: the number of sample buffers has to be set up at window creation. However, I've not seen a way to do it via the PIXELFORMATDESCRIPTOR structure; instead it has to be done via the WGL_ARB_pixel_format extension, which lets you select the correct pixel format.
Nvidia has a PDF on how to do this here, and NeHe has a tutorial on setting it up here.
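In case it helps, here's a rough sketch of the wglChoosePixelFormatARB approach (the tokens and the PFNWGLCHOOSEPIXELFORMATARBPROC typedef come from wglext.h; the function name and the attribute values are just illustrative, and remember you need a temporary context current before wglGetProcAddress will return anything):

#include <windows.h>
#include <GL/gl.h>
#include "wglext.h"

int ChooseMultisamplePixelFormat( HDC hdc, int samples )
{
    PFNWGLCHOOSEPIXELFORMATARBPROC wglChoosePixelFormatARB =
        (PFNWGLCHOOSEPIXELFORMATARBPROC)wglGetProcAddress( "wglChoosePixelFormatARB" );
    if ( !wglChoosePixelFormatARB )
        return 0; // extension not available

    const int attribs[] = {
        WGL_DRAW_TO_WINDOW_ARB, GL_TRUE,
        WGL_SUPPORT_OPENGL_ARB, GL_TRUE,
        WGL_DOUBLE_BUFFER_ARB,  GL_TRUE,
        WGL_PIXEL_TYPE_ARB,     WGL_TYPE_RGBA_ARB,
        WGL_COLOR_BITS_ARB,     24,
        WGL_DEPTH_BITS_ARB,     16,
        WGL_SAMPLE_BUFFERS_ARB, 1,       // ask for a multisampled format...
        WGL_SAMPLES_ARB,        samples, // ...with this many samples (e.g. 4)
        0
    };

    int  format     = 0;
    UINT numFormats = 0;
    if ( wglChoosePixelFormatARB( hdc, attribs, NULL, 1, &format, &numFormats ) && numFormats > 0 )
        return format; // hand this to SetPixelFormat() when creating the real window
    return 0;
}

Note that SetPixelFormat can only be called once per window, so the usual pattern (and what the NeHe tutorial does) is: create a throwaway window and context, query the multisampled format, destroy them, then create the real window with that format.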
That's what I need, awesome.

Many thanks to both of you.
If you're using SDL (and why wouldn't you be? :) you can specify this info:

SDL_GL_SetAttribute( SDL_GL_MULTISAMPLEBUFFERS, intended_multisamplebuffers );
SDL_GL_SetAttribute( SDL_GL_MULTISAMPLESAMPLES, intended_multisamplesamples );

when you create the window.
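For completeness, a minimal sketch of the whole sequence, assuming SDL 1.2 (the resolution and bit depth are just placeholders):

#include <SDL/SDL.h>

int main( int argc, char* argv[] )
{
    SDL_Init( SDL_INIT_VIDEO );
    // both attributes must be set *before* the window is created
    SDL_GL_SetAttribute( SDL_GL_MULTISAMPLEBUFFERS, 1 ); // request a multisampled framebuffer
    SDL_GL_SetAttribute( SDL_GL_MULTISAMPLESAMPLES, 4 ); // with 4 samples per pixel
    SDL_SetVideoMode( 800, 600, 32, SDL_OPENGL );        // placeholder resolution/depth
    /* ... render, then SDL_Quit() ... */
    return 0;
}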

This topic is closed to new replies.
