serratemplar

glAccum() without GLUT


Recommended Posts

I'm writing a graphical app using SDL and OpenGL, and I want to use the accumulation buffer for fullscreen antialiasing. The only initializations I've seen so far for the accum buffer are glClear(GL_ACCUM_BUFFER_BIT) and glutInitDisplayMode(... | GLUT_ACCUM | ...). I am not using GLUT, so I've dug around through my copy of the Red Book and through docs online, and - so far as I can see - there is no init call for the accum buffer. Is this true? I've got the antialiasing working in GLUT; I just want to move away from GLUT since it was last updated in the late 90s, and I'd like to join the modern world ;) (and also someday potentially use programmable shaders). Any thoughts or links would be very much appreciated; thank you in advance.
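For context, what I have working under GLUT is roughly the Red Book jittered-accumulation pass (just a sketch; NUM_JITTER, setJitteredProjection() and drawScene() stand in for my own code):

/* Accumulate several jittered renders of the scene, then average them. */
const int NUM_JITTER = 8;
glClear(GL_ACCUM_BUFFER_BIT);
for (int i = 0; i < NUM_JITTER; ++i)
{
    setJitteredProjection(i);              /* hypothetical: offsets the projection by a sub-pixel amount */
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    drawScene();                           /* hypothetical scene callback */
    glAccum(GL_ACCUM, 1.0f / NUM_JITTER);  /* add this pass, scaled, into the accum buffer */
}
glAccum(GL_RETURN, 1.0f);                  /* write the averaged result back to the color buffer */
glutSwapBuffers();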

Creating the accumulation buffer is, like any other buffer management task, not done through OpenGL itself. It's up to the windowing API you use to provide it. Check your windowing API's documentation on describing the pixel format and see where the accumulation buffer fits into the API.
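For example, under Win32/WGL the accumulation buffer is requested through the pixel format descriptor (a rough sketch only; it assumes you already have a window device context hdc, and the bit depths are placeholders):

PIXELFORMATDESCRIPTOR pfd = {0};
pfd.nSize      = sizeof(pfd);
pfd.nVersion   = 1;
pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
pfd.iPixelType = PFD_TYPE_RGBA;
pfd.cColorBits = 32;
pfd.cDepthBits = 24;
pfd.cAccumBits = 64;   /* this is where the accumulation buffer is requested */

int format = ChoosePixelFormat(hdc, &pfd);
SetPixelFormat(hdc, format, &pfd);

SDL does the equivalent for you through its own attribute mechanism, as described below.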

Can anyone point me to where I can learn about how this works on a Mac? I've got it working OK in Windows and Linux, but not yet on my MacBook.

You can enable the accumulation buffer with SDL_GL_SetAttribute before you set the video mode with SDL_SetVideoMode. You need to set SDL_GL_ACCUM_RED_SIZE, SDL_GL_ACCUM_GREEN_SIZE, SDL_GL_ACCUM_BLUE_SIZE and SDL_GL_ACCUM_ALPHA_SIZE according to your needs, for example as in the sketch below.
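Something like this (a sketch for SDL 1.2; the mode parameters and the 16-bit accumulation depths are just placeholders):

#include <SDL.h>

int main(int argc, char** argv)
{
    SDL_Init(SDL_INIT_VIDEO);

    /* All of these must be set before SDL_SetVideoMode. */
    SDL_GL_SetAttribute(SDL_GL_ACCUM_RED_SIZE,   16);
    SDL_GL_SetAttribute(SDL_GL_ACCUM_GREEN_SIZE, 16);
    SDL_GL_SetAttribute(SDL_GL_ACCUM_BLUE_SIZE,  16);
    SDL_GL_SetAttribute(SDL_GL_ACCUM_ALPHA_SIZE, 16);
    SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);

    SDL_Surface* screen = SDL_SetVideoMode(800, 600, 32, SDL_OPENGL);

    /* ... render, then SDL_GL_SwapBuffers() each frame ... */

    SDL_Quit();
    return 0;
}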

Be aware that the accumulation buffer isn't hardware accelerated except on the most recent graphics adapters. However, using render to texture (possibly with framebuffer objects) you can get the same effect with hardware acceleration even on older hardware.
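A rough sketch of render-to-texture with the EXT_framebuffer_object extension (this assumes the extension entry points have been loaded, e.g. via GLEW or SDL_GL_GetProcAddress, and the texture size is a placeholder):

/* Create a texture and attach it to an FBO so the scene can be rendered into it. */
GLuint fbo, tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 512, 512, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

glGenFramebuffersEXT(1, &fbo);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                          GL_TEXTURE_2D, tex, 0);
if (glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT) != GL_FRAMEBUFFER_COMPLETE_EXT)
{
    /* this format combination isn't supported on this card */
}

/* Draw here; the output goes into 'tex'. Then switch back to the window: */
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);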

If you want fullscreen antialiasing, you probably want multisampling instead of the accumulation buffer. You can set up multisampling with SDL_GL_SetAttribute (again, before SDL_SetVideoMode) with the attributes SDL_GL_MULTISAMPLEBUFFERS and SDL_GL_MULTISAMPLESAMPLES; see the example below. I have mine set to 1 and 4 respectively. Multisampling does fullscreen antialiasing "automagically", so you won't have to do anything but enable it.
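For example (same caveat: set the attributes before SDL_SetVideoMode; the mode parameters and sample counts are placeholders):

SDL_GL_SetAttribute(SDL_GL_MULTISAMPLEBUFFERS, 1);  /* request a multisample framebuffer */
SDL_GL_SetAttribute(SDL_GL_MULTISAMPLESAMPLES, 4);  /* 4 samples per pixel */
SDL_SetVideoMode(800, 600, 32, SDL_OPENGL);
glEnable(GL_MULTISAMPLE);  /* GL_MULTISAMPLE_ARB on pre-1.3 headers; usually on by default anyway */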

-Riku

Thanks for your responses.

I do set the sizes of the accum buffer(s) with SDL_GL_SetAttribute, but once I do, it feels like it falls back to software rendering. What I mean by that is: when I experienced the exact same effect on my Linux box (extreme slowdown, lots of artifacts), the way I fixed it was by installing my 3D drivers (which I discovered were not properly installed). My Mac, however, is brand new out of the box, and I've run World of Warcraft on it with all of the settings turned up to the top. So I have a functional graphics card and working drivers, which means I'm initializing something incorrectly, or not at all.

Is there some argument I should be passing to SDL_GL_SetAttribute to turn on the buffer? So far my Google digging hasn't revealed such an argument, so I'm honestly not sure what to do. My SDL is up to date. GLUT has precisely the same trouble too (code that definitely works on both Linux and Windows won't run on my Mac).

Any ideas? Thanks everyone.
