glReadPixels() & alpha values

I'm trying to read back RGBA values from the frame buffer using glReadPixels(). The RGB values read back correctly, but the alpha value is always 255. I am clearing the color buffer beforehand with glClearColor(0.0f, 0.0f, 0.0f, 0.0f).

I spent some time with Google and after quite a bit of searching I finally came up with: "If the framebuffer does not support alpha values then the A that is obtained is 1.0". I'm using a GeForce2.

At first it seemed odd to me that the framebuffer would actually store alpha values, because an alpha value really tells you how to combine the incoming fragment's color with the existing fragment's color, and there's no need to store it for that. Then I realized that framebuffer alpha is used as the destination alpha when blending... so I searched for that: "Many OpenGL devices don't support destination alpha. In particular, the OpenGL 1.1 software rendering libraries from Microsoft don't support it. The OpenGL specification doesn't require it."

Can someone explain this to me a bit more? I'm not sure whether: (1) my graphics card doesn't store alpha values in the framebuffer, (2) my card does store alpha values but Windows OpenGL won't let me access them (DirectX probably could?), or (3) the alpha values would be accessible if I used some kind of extension.

Thanks
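For reference, the readback looks roughly like this (a simplified sketch, not my exact code; width and height are just the window size):

    // Simplified sketch of the readback: read the whole window back as RGBA bytes.
    unsigned char *pixels = new unsigned char[width * height * 4];

    glClearColor (0.0f, 0.0f, 0.0f, 0.0f);
    glClear (GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    // ... draw scene ...

    glPixelStorei (GL_PACK_ALIGNMENT, 1);   // tightly packed rows
    glReadBuffer (GL_BACK);                 // reading from the back buffer
    glReadPixels (0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
    // pixels[3], pixels[7], ... always come back as 255 here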
Make sure that when you created your rendering context you allowed for 32-bit color (that is, RGBA). Also, you might want to allow for depth buffering.



~Main

==
Colt "MainRoach" McAnlis
Programmer
www.badheat.com/sinewave
==
I'm using a 32-bit color buffer and a 32-bit depth buffer at the moment.

More info, from my app's log file:

Creating Render Context:
Resolution.....: 480x300 x 32 @ 60 hz
VSYNC..........: Off
Z-Buffer Depth.: 32
Window Mode....: Windowed

Hardware Details: NVIDIA Corporation (1.3.1) GeForce2 GTS/PCI/3DNOW!
Maximum Texture Size Supported: 2048x2048
Maximum Hardware Texture Units: 2

OpenGL extensions supported:
+ GL_ARB_imaging
+ GL_ARB_multitexture
....

Thanks for the reply though!

[edited by - Z01 on January 29, 2003 4:52:48 PM]
>>Z-Buffer Depth.: 32
Window Mode....: Windowed

Hardware Details: NVIDIA Corporation (1.3.1) GeForce2 GTS/PCI/3DNOW!<<

No NVIDIA card supports a 32-bit depth buffer, so this info is incorrect!

How are you creating the window?
E.g. if you're using GLUT, you need to add GLUT_ALPHA to the window init line.
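Something like this (just a rough sketch, not your actual setup):

    // GLUT_ALPHA asks for destination alpha bitplanes in the color buffer.
    glutInit (&argc, argv);
    glutInitDisplayMode (GLUT_DOUBLE | GLUT_RGBA | GLUT_ALPHA | GLUT_DEPTH);
    glutInitWindowSize (480, 300);
    glutCreateWindow ("alpha readback test");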

http://uk.geocities.com/sloppyturds/kea/kea.html
http://uk.geocities.com/sloppyturds/gotterdammerung.html
Ahhh, sorry, my mistake: a 32-bit z-buffer is what I requested, not necessarily what I was given. Here is a code snippet from the render window creation function (I had set bpp=32 and zbuffer=32 previously):


    static PIXELFORMATDESCRIPTOR pfd = {
        sizeof (PIXELFORMATDESCRIPTOR),
        1,                          // Version
        PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER,
        PFD_TYPE_RGBA,
        bpp,
        0,0,0,0,0,0,0,0,0,0,0,0,0,
        zbuffer, 0,                 // Depth/Stencil
        0,
        PFD_MAIN_PLANE,
        0,0,0,0
    };

    GLuint pix_fmt = 0;

    if (! (pix_fmt = ChoosePixelFormat (hdc, &pfd))) {
        oo << "ChoosePixelFormat () failed" << endl;
        return false;
    }
    if (! SetPixelFormat (hdc, pix_fmt, &pfd)) {
        oo << "SetPixelFormat () failed" << endl;
        return false;
    }
    if (! (hrc = wglCreateContext (hdc))) {
        oo << "wglCreateContext () failed" << endl;
        return false;
    }
    if (! wglMakeCurrent (hdc, hrc)) {
        oo << "wglMakeCurrent () failed" << endl;
        return false;
    }


(Aside: Is it true that if you request an 8-bit stencil buffer on a GF3/4, you'll get a 16-bit z-buffer instead of a 24-bit one? I read something about this...)
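On the requested-vs-given point, I suppose I could also log what the driver actually granted with DescribePixelFormat(). Just a sketch, reusing hdc, pix_fmt and the oo log stream from the snippet above:

    // Query what the driver actually granted, rather than what was requested.
    PIXELFORMATDESCRIPTOR got;
    DescribePixelFormat (hdc, pix_fmt, sizeof (PIXELFORMATDESCRIPTOR), &got);
    oo << "Color bits..: " << (int) got.cColorBits   << endl;
    oo << "Alpha bits..: " << (int) got.cAlphaBits   << endl;
    oo << "Depth bits..: " << (int) got.cDepthBits   << endl;
    oo << "Stencil bits: " << (int) got.cStencilBits << endl;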
quote:
    static PIXELFORMATDESCRIPTOR pfd = {
        sizeof (PIXELFORMATDESCRIPTOR),
        1,                          // Version
        PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER,
        PFD_TYPE_RGBA,
        bpp,
        0,0,0,0,0,0,
        0,0,                        // <--- alpha bits
        0,0,0,0,0,
        zbuffer, 0,                 // Depth/Stencil
        0,
        PFD_MAIN_PLANE,
        0,0,0,0
    };


This is the pixel format you're requesting. You're not asking for destination alpha (cAlphaBits is set to zero), so you're not getting it. Set cAlphaBits to 8 or something to get a destination alpha channel.
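In other words, the initializer would look something like this (only the alpha fields change, everything else stays as you have it; this is just a sketch of the struct):

    static PIXELFORMATDESCRIPTOR pfd = {
        sizeof (PIXELFORMATDESCRIPTOR),
        1,                          // Version
        PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER,
        PFD_TYPE_RGBA,
        bpp,                        // cColorBits
        0,0,0,0,0,0,                // R/G/B bits and shifts (let the driver pick)
        8, 0,                       // cAlphaBits = 8, cAlphaShift
        0,0,0,0,0,                  // accumulation buffer
        zbuffer, 0,                 // Depth/Stencil
        0,                          // aux buffers
        PFD_MAIN_PLANE,
        0,0,0,0
    };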
(Aside: grrr... don't you hate it when you click reply, get a 500 server error, and even when you hit "back" you've lost your post?)

I tried setting cAlphaBits=8 and cAlphaShift=0, and now glReadPixels() is reading back proper alpha values. Thanks, Brother Rob!

It's something I'd never have suspected, because all the OpenGL books and tutorials I've read set all the bit fields in PIXELFORMATDESCRIPTOR to zero. I guess that means none of them have working destination alpha either?

Anyway, I checked MSDN for PIXELFORMATDESCRIPTOR:
quote:
cAlphaBits
Specifies the number of alpha bitplanes in each RGBA color buffer. Alpha bitplanes are not supported.
cAlphaShift
Specifies the shift count for alpha bitplanes in each RGBA color buffer. Alpha bitplanes are not supported.

...

Remarks
Please notice carefully, as documented above, that certain pixel format properties are not supported in the current generic implementation. The generic implementation is the Microsoft GDI software implementation of OpenGL. Hardware manufacturers may enhance parts of OpenGL, and may support some pixel format properties not supported by the generic implementation.


I guess I'm going to have to check my target cards to see if they support alpha bitplanes before relying on this.
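A runtime check along these lines should do it, I think (just a sketch, once the context is current):

    // Ask GL how many destination alpha bitplanes the context actually has;
    // 0 means no usable destination alpha on this card/driver.
    GLint alpha_bits = 0;
    glGetIntegerv (GL_ALPHA_BITS, &alpha_bits);
    if (alpha_bits == 0)
        oo << "No destination alpha - glReadPixels() alpha will always be 1.0" << endl;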

Thanks again!
