napier

can framebuffer ALPHA_BITS be set



I'm getting odd results with alpha blending on different graphics cards, and I'm hoping somebody out there can check my logic below and tell me if I'm understanding this correctly.

I'm rendering many overlapping textured layers with very faint alpha to blend colors together and create a smoky atmospheric effect. With an alpha value of .01 I get very subtle blending on an ATI Radeon 9700 or a GeForce FX 5500, but the same app on an ATI 9200 (a Mac mini) shows blocky, distorted colors with visible banding. This image shows the 9200 (bad) vs. the 9700 (good): http://potatoland.org/images/mac_mini_posterize.png

I'm guessing that the 9200 has fewer bits of precision for the alpha values, so it can't keep track of very slight color shifts. For example, if a color value is .1 and alpha is .01, the resulting color*alpha value is .001, smaller than the smallest meaningful byte value (1/255 = approx .004). If I raise the alpha value to .04 the colors look better on the 9200, but I lose the atmospheric effect I'm going for.

glGetIntegerv(GL_ALPHA_BITS) on the 9200 returns 8. The FX 5500 returns 0. I don't know what that 0 means, but the 5500 seems to handle much finer alpha blending accurately, as if it's using more bits per pixel in the alpha channel (same for the ATI 9700, though I haven't checked the ALPHA_BITS on that card).

Is there a way to configure the framebuffer to allocate more bits for the alpha channel? Is this hardware dependent, or can it be adjusted programmatically? Is there some other approach to this problem that I'm missing?
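As a quick sanity check, the framebuffer channel depths can be queried once a GL context is current. A minimal C/OpenGL sketch (the function name and printf output are mine, not from the original post); it also spells out the quantization arithmetic behind the banding:

#include <stdio.h>
#include <GL/gl.h>

/* Query the color/alpha bitplanes of the current framebuffer.
   Call with a GL context current, e.g. right after window creation. */
void print_framebuffer_depths(void)
{
    GLint r = 0, g = 0, b = 0, a = 0;
    glGetIntegerv(GL_RED_BITS,   &r);
    glGetIntegerv(GL_GREEN_BITS, &g);
    glGetIntegerv(GL_BLUE_BITS,  &b);
    glGetIntegerv(GL_ALPHA_BITS, &a);   /* destination alpha planes */
    printf("framebuffer bits  R:%d G:%d B:%d A:%d\n", r, g, b, a);

    /* At 8 bits per channel the smallest representable step is 1/255 (~0.004).
       A blended contribution of color * alpha = 0.1 * 0.01 = 0.001 is below
       that step, so it rounds away and shows up as banding. */
    printf("smallest 8-bit step = %f, contribution = %f\n",
           1.0 / 255.0, 0.1 * 0.01);
}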

Set your desktop to 32-bit color first.
Try enabling/disabling dither (it's enabled by default).
When you create the texture, ask for an internal format of, say, RGBA8 (see the sketch below).

Quote:
glGetIntegerv(GL_ALPHA_BITS) on the 9200 returns 8. The FX 5500 returns 0

This is destination alpha; you set it (usually 8 bit) when you create the window.
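A hedged sketch of the dither and internal-format suggestions above, assuming a plain C/OpenGL texture path (the function name, texture size, and pixel pointer are placeholders, not the poster's actual code):

#include <GL/gl.h>

/* Toggle dithering and upload the texture with an explicit 8-bit-per-channel
   internal format, so the driver doesn't silently pick a lower-precision one. */
void upload_layer_texture(GLuint tex, GLsizei width, GLsizei height,
                          const GLubyte *pixels)
{
    glDisable(GL_DITHER);               /* try glEnable(GL_DITHER) too and compare */

    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8,   /* explicit RGBA8 instead of generic GL_RGBA */
                 width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixels);
}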

Thanks for the reply, zedzeek.

1) My display is set to 32-bit color.
2) I'm not sure how to set dithering on a Mac; I'll look into it.
3) My texture internal format is RGBA8 (other formats like RGBA and RGBA16 give the same results).
4) I'll look at how I create the window and try tweaking the alpha bits there (see the sketch below).
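For reference, a minimal sketch of requesting destination alpha bitplanes at window creation, using GLUT as a stand-in for whatever windowing path the app actually uses (on a Mac the equivalent request goes through AGL or NSOpenGLPixelFormat attributes):

#include <stdio.h>
#include <GL/glut.h>

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    /* GLUT_ALPHA asks for alpha bitplanes in the color buffer; without it
       many drivers report GL_ALPHA_BITS == 0. */
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA | GLUT_ALPHA);
    glutInitWindowSize(640, 480);
    glutCreateWindow("alpha bits test");

    GLint alpha_bits = 0;
    glGetIntegerv(GL_ALPHA_BITS, &alpha_bits);
    printf("GL_ALPHA_BITS = %d\n", alpha_bits);
    return 0;
}

Note that GL_ALPHA_BITS only describes destination alpha storage; ordinary GL_SRC_ALPHA / GL_ONE_MINUS_SRC_ALPHA blending never reads it, which is why a card can blend fine while reporting 0 here.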
