DeadPath

OpenGL Hardware Antialiasing


Hi everyone. Does anybody know of a way to control the level of hardware antialiasing through an OpenGL or wgl command? Most graphics cards have an antialiasing setting that users can set (0x, 2x, 4x, etc.), but there's usually also an option to "Allow the application to choose". As an application programmer, how do I specify what my app's preference is? Thanks in advance!

From the OpenGL SuperBible, 3rd edition:

You have to first request a multisampled framebuffer; under GLUT you'd do something like:

glutInitDisplayMode( GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH | GLUT_MULTISAMPLE );

There are operating system dependent ways too.

You can then use glEnable and glDisable to turn multisampling on and off.

The book suggests that you might turn it off when drawing points (which can be antialiased anyway) and such.
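A minimal sketch of that enable/disable pattern (assuming a multisampled pixel format was already obtained at window creation; `GL_MULTISAMPLE` comes from the ARB_multisample extension, so on Windows the token may need to come from glext.h — the fallback `#define` below covers that case):

```c
#include <GL/gl.h>

#ifndef GL_MULTISAMPLE
#define GL_MULTISAMPLE 0x809D   /* from ARB_multisample / glext.h */
#endif

void draw_scene(void)
{
    /* Multisample the main geometry. */
    glEnable(GL_MULTISAMPLE);
    /* ... draw triangles here ... */

    /* Points can be antialiased the old way, so multisampling
       can be switched off for them as the book suggests. */
    glDisable(GL_MULTISAMPLE);
    glEnable(GL_POINT_SMOOTH);
    /* ... draw points here ... */
    glDisable(GL_POINT_SMOOTH);

    glEnable(GL_MULTISAMPLE);
}
```

This only has an effect if the framebuffer actually has sample buffers; on a non-multisampled framebuffer the enable is silently ignored.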

By default it doesn't multisample alpha. You can turn this on with
glEnable and one of the GL_SAMPLE_* values.

glSampleCoverage looks like it allows some further (implementation-dependent) fine-tuning.
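As a sketch, the alpha-related state from ARB_multisample looks like this (fragment only, to be called with a GL context current; the fallback `#define`s are for older Windows headers):

```c
#ifndef GL_SAMPLE_ALPHA_TO_COVERAGE
#define GL_SAMPLE_ALPHA_TO_COVERAGE 0x809E
#endif
#ifndef GL_SAMPLE_COVERAGE
#define GL_SAMPLE_COVERAGE 0x80A0
#endif

/* Turn fragment alpha into a per-sample coverage mask --
   a cheap way to get soft edges on alpha-tested geometry. */
glEnable(GL_SAMPLE_ALPHA_TO_COVERAGE);

/* Or apply a fixed coverage value to all fragments;
   the second argument inverts the mask when GL_TRUE. */
glEnable(GL_SAMPLE_COVERAGE);
glSampleCoverage(0.5f, GL_FALSE);
```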

And yet both responses were subjectively different...

However, I apologize, and won't do it again. I was unaware such things were prohibited.

In response to Lucid's speedy and informative response:

I'm not using GLUT, so without wanting to trace the GLUT source, is there an easy way to translate: "glutInitDisplayMode( GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH | GLUT_MULTISAMPLE );" into gl commands?

Quote:
Original post by DeadPath
And yet both responses were subjectively different...

However, I apologize, and won't do it again. I was unaware such things were prohibited.


It's not prohibited as such; it's just a matter of manners more than anything. Posting in both places can lead to two groups of people answering the question, wasting one group's time if an answer has already appeared in the other thread. No harm done, just something to keep in mind [smile]

Quote:

I'm not using GLUT, so without wanting to trace the GLUT source, is there an easy way to translate: "glutInitDisplayMode( GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH | GLUT_MULTISAMPLE );" into gl commands?


As for the question: the number of sample buffers has to be set up at window creation. However, I've not seen a way to do it via the PIXELFORMATDESCRIPTOR structure; instead it has to be done via the WGL_ARB_pixel_format extension, which lets you select the correct pixel format.
NVIDIA have a PDF on how to do this here, and NeHe has a tutorial on setting it up here.
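The WGL route looks roughly like the sketch below (the helper name `ChooseMultisampleFormat` is mine, not from any tutorial). It assumes a temporary GL context is already current so `wglGetProcAddress` works, and that wglext.h supplies the `WGL_*` tokens and the function-pointer typedef; error handling is trimmed:

```c
#include <windows.h>
#include <GL/gl.h>
#include "wglext.h"   /* WGL_* tokens and PFNWGLCHOOSEPIXELFORMATARBPROC */

/* Hypothetical helper: returns a multisampled pixel format index,
   or 0 if none is available. Pass the result to SetPixelFormat. */
int ChooseMultisampleFormat(HDC hdc, int samples)
{
    PFNWGLCHOOSEPIXELFORMATARBPROC wglChoosePixelFormatARB =
        (PFNWGLCHOOSEPIXELFORMATARBPROC)
            wglGetProcAddress("wglChoosePixelFormatARB");
    if (!wglChoosePixelFormatARB)
        return 0;   /* extension not supported */

    int attribs[] = {
        WGL_DRAW_TO_WINDOW_ARB, GL_TRUE,
        WGL_SUPPORT_OPENGL_ARB, GL_TRUE,
        WGL_DOUBLE_BUFFER_ARB,  GL_TRUE,
        WGL_PIXEL_TYPE_ARB,     WGL_TYPE_RGBA_ARB,
        WGL_COLOR_BITS_ARB,     24,
        WGL_DEPTH_BITS_ARB,     24,
        WGL_SAMPLE_BUFFERS_ARB, 1,        /* ask for multisampling */
        WGL_SAMPLES_ARB,        samples,  /* e.g. 2, 4, ... */
        0                                 /* terminator */
    };

    int format = 0;
    UINT numFormats = 0;
    if (!wglChoosePixelFormatARB(hdc, attribs, NULL, 1,
                                 &format, &numFormats)
        || numFormats == 0)
        return 0;   /* no multisampled format available */
    return format;
}
```

Note the chicken-and-egg problem: since a pixel format can only be set once per window, the usual approach (as in the NeHe tutorial) is to create a throwaway window and context first, query the extension, then create the real window with the format this returns.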

If you're using SDL (and why wouldn't you be? :)) you can specify this info when you create the window:

SDL_GL_SetAttribute( SDL_GL_MULTISAMPLEBUFFERS, intended_multisamplebuffers );
SDL_GL_SetAttribute( SDL_GL_MULTISAMPLESAMPLES, intended_multisamplesamples );
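In context, a minimal sketch (SDL 1.2-era API, matching the calls above; attributes must be set before the video mode is created, and it's worth querying afterwards because the driver may grant fewer samples than requested):

```c
#include "SDL.h"

int main(int argc, char *argv[])
{
    SDL_Init(SDL_INIT_VIDEO);

    /* Request a 4x multisampled GL visual -- before SDL_SetVideoMode. */
    SDL_GL_SetAttribute(SDL_GL_MULTISAMPLEBUFFERS, 1);
    SDL_GL_SetAttribute(SDL_GL_MULTISAMPLESAMPLES, 4);
    SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);

    SDL_SetVideoMode(640, 480, 32, SDL_OPENGL);

    /* Check what we actually got. */
    int buffers = 0, samples = 0;
    SDL_GL_GetAttribute(SDL_GL_MULTISAMPLEBUFFERS, &buffers);
    SDL_GL_GetAttribute(SDL_GL_MULTISAMPLESAMPLES, &samples);

    /* ... render loop ... */

    SDL_Quit();
    return 0;
}
```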
