jdaniel

glGetIntegerv(GL_ACCUM_* sets parameter to 0


Hi, I am querying my OpenGL implementation and I am getting unexpected results from the bits-per-pixel queries on the accumulation buffer. For example, the following function:
#include <stdio.h>
#include <GL/gl.h>

void queryState(void) {
   GLint color_bpp[4];
   GLint depth_bpp;
   GLint stencil_bpp;
   GLint accum_bpp[4];

   glGetIntegerv(GL_RED_BITS,   &color_bpp[0]);
   glGetIntegerv(GL_GREEN_BITS, &color_bpp[1]);
   glGetIntegerv(GL_BLUE_BITS,  &color_bpp[2]);
   glGetIntegerv(GL_ALPHA_BITS, &color_bpp[3]);
   glGetIntegerv(GL_DEPTH_BITS,   &depth_bpp);
   glGetIntegerv(GL_STENCIL_BITS, &stencil_bpp);

   // why would the accumulation buffer report 0 bits per pixel?
   glGetIntegerv(GL_ACCUM_RED_BITS,   &accum_bpp[0]);
   glGetIntegerv(GL_ACCUM_GREEN_BITS, &accum_bpp[1]);
   glGetIntegerv(GL_ACCUM_BLUE_BITS,  &accum_bpp[2]);
   glGetIntegerv(GL_ACCUM_ALPHA_BITS, &accum_bpp[3]);

   // print OpenGL buffer information for this implementation
   printf("\n\nNumber of bits in the color buffer. R: %d  G: %d  B: %d  A: %d",
          color_bpp[0], color_bpp[1], color_bpp[2], color_bpp[3]);
   printf("\nNumber of bits in the depth buffer. %d", depth_bpp);
   printf("\nNumber of bits in the stencil buffer. %d", stencil_bpp);

   printf("\nNumber of bits in the accumulation buffer. R: %d  G: %d  B: %d  A: %d\n",
          accum_bpp[0], accum_bpp[1], accum_bpp[2], accum_bpp[3]);
}
prints this to stdout:

Number of bits in the color buffer. R: 8  G: 8  B: 8  A: 8
Number of bits in the depth buffer. 24
Number of bits in the stencil buffer. 8
Number of bits in the accumulation buffer. R: 0  G: 0  B: 0  A: 0
I do not understand why 0 bits per pixel is reported for my accumulation buffer. According to the Red Book:
Quote:
At a minimum, you're guaranteed to have one color buffer for use in RGBA mode with associated stencil, depth, and accumulation buffers that have color components of nonzero size, and one color buffer for use in color-index mode with associated depth and stencil buffers.
I've tested this code on both my X850 XT and my laptop's Mobility X600 chips. Any advice or suggestions? Thank you, jdaniel

If you want an accumulation buffer, you have to request one when you create your window. Note that the accumulation buffer is very rarely used, so on most (all?) cards, requesting an accumulation buffer will force software rendering.
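
For reference, here is a minimal sketch of requesting accumulation bits at window-creation time, assuming a Win32/WGL setup (the hdc handle, function name, and the specific bit depths are purely illustrative; a GLUT program would instead add GLUT_ACCUM to glutInitDisplayMode, as shown below in this thread):

#include <windows.h>

/* Sketch: pick a pixel format that asks for a 16-bit-per-channel
   accumulation buffer. This must happen before the GL context is created. */
int choosePixelFormatWithAccum(HDC hdc)
{
   PIXELFORMATDESCRIPTOR pfd = {0};
   pfd.nSize        = sizeof(pfd);
   pfd.nVersion     = 1;
   pfd.dwFlags      = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
   pfd.iPixelType   = PFD_TYPE_RGBA;
   pfd.cColorBits   = 32;
   pfd.cDepthBits   = 24;
   pfd.cStencilBits = 8;
   pfd.cAccumBits   = 64;   /* 16 bits each for R, G, B, A */

   int format = ChoosePixelFormat(hdc, &pfd);
   if (format == 0 || !SetPixelFormat(hdc, format, &pfd))
      return 0;             /* no matching format; caller should fall back */
   return format;
}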

Ah, I see. Testing this with GLUT revealed my error.

Initializing the window with...

glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA | GLUT_DEPTH | GLUT_ACCUM);



printed the output:

Number of bits in the accumulation buffer. R: 16  G: 16  B: 16  A: 16


Very helpful, thank you.

Is there a common way to test for hardware acceleration of the accumulation buffer?

Quote:
Original post by jdaniel
Is there a common way to test for hardware acceleration of the accumulation buffer?


Not that I know of, due to OpenGL's design: the point is that if a feature is not implemented in hardware, you shouldn't need to worry, as it will be transparently emulated in software.

For those of us who want something resembling decent performance, however...

I suppose you could check the driver/renderer string for some known good cards, but that is quite hackish.
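
If it helps, here is a rough sketch of that kind of check, flipped around to look for known software-renderer strings rather than whitelisting good cards (the particular strings are assumptions based on common implementations, not an exhaustive list):

#include <stdio.h>
#include <string.h>
#include <GL/gl.h>

/* Heuristic only: a renderer string like "GDI Generic" means Microsoft's
   software fallback is in use. This cannot tell you which feature (e.g. the
   accumulation buffer) caused the driver to drop to software rendering. */
int looksSoftwareRendered(void)
{
   const char *vendor   = (const char *)glGetString(GL_VENDOR);
   const char *renderer = (const char *)glGetString(GL_RENDERER);

   if (!vendor || !renderer)
      return 1;   /* no current context; assume the worst */

   printf("GL_VENDOR:   %s\n", vendor);
   printf("GL_RENDERER: %s\n", renderer);

   return strstr(renderer, "GDI Generic") != NULL ||
          strstr(renderer, "Software")    != NULL ||
          strstr(vendor,   "Microsoft")   != NULL;
}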
