ehmdjii

blend modes behaving weird on ATIs


Recommended Posts

Hello, the blend funcs I use work fine on NVIDIA graphics cards, but on ATIs they produce weird results. Objects that have an alpha value other than 1 (which makes them transparent, obviously) seem to shine through light objects in front of them, exactly where the alpha channel of the texture is.

Screenshot: here. The object on the right is completely opaque and in front of the flower, but the flower's alpha channel still shines through in gray. On NVIDIAs this looks perfectly correct.

The blend function used is glBlendFunc(GL_DST_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
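For reference, the blend state setup boils down to this, shown here as a JOGL sketch since that is the binding I am using (a minimal sketch assuming the JOGL 1.x javax.media.opengl API; the class skeleton and names are just illustrative):

    import javax.media.opengl.*;

    public class BlendSetup implements GLEventListener {
        public void init(GLAutoDrawable drawable) {
            GL gl = drawable.getGL();
            gl.glEnable(GL.GL_BLEND);
            // GL_DST_ALPHA reads the alpha already stored in the framebuffer;
            // if the framebuffer has no alpha bits, it behaves as if it were 1.0
            gl.glBlendFunc(GL.GL_DST_ALPHA, GL.GL_ONE_MINUS_SRC_ALPHA);
        }
        public void display(GLAutoDrawable drawable) { /* draw the scene here */ }
        public void reshape(GLAutoDrawable drawable, int x, int y, int w, int h) { }
        public void displayChanged(GLAutoDrawable drawable, boolean modeChanged, boolean deviceChanged) { }
    }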

By the look of it, your rendering context doesn't have its alpha channel configured correctly on every video card. That would explain why DST_ALPHA fails: with no destination alpha channel, it will default to 1.0f.
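A quick way to verify this is to query the context at runtime: if GL_ALPHA_BITS comes back as 0, the framebuffer has no destination alpha and GL_DST_ALPHA will always act as 1.0. A minimal sketch of that check, assuming the JOGL 1.x API (the surrounding callback is illustrative):

    // e.g. inside GLEventListener.init(GLAutoDrawable drawable)
    GL gl = drawable.getGL();
    int[] alphaBits = new int[1];
    gl.glGetIntegerv(GL.GL_ALPHA_BITS, alphaBits, 0);
    System.out.println("framebuffer alpha bits: " + alphaBits[0]);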

Anyone? This is quite important to me. :-) Otherwise my game won't work on ATIs.

BTW: I am using JOGL, and I tried to manually set the bits for RGB and alpha, but without any change.
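For reference, explicitly requesting the bits in JOGL typically looks something like this (a sketch against the JOGL 1.x GLCapabilities/GLCanvas API; this is illustrative rather than my exact code):

    GLCapabilities caps = new GLCapabilities();
    caps.setRedBits(8);
    caps.setGreenBits(8);
    caps.setBlueBits(8);
    caps.setAlphaBits(8);              // ask for destination alpha explicitly
    GLCanvas canvas = new GLCanvas(caps);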

Are you running on Linux? JOGL won't ever create a framebuffer with an alpha component on Linux (unless you're lucky, like with those NVIDIA drivers where you get it without asking for it).

I reported this as a bug ages ago (must have been about two years ago) and the response was that they don't care and won't fix it. That's one of the reasons I switched back to LWJGL.
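With LWJGL you can ask for an alpha-capable framebuffer directly when creating the display, along these lines (a sketch against the classic Display/PixelFormat API; the display mode and bit depths are just example values):

    import org.lwjgl.LWJGLException;
    import org.lwjgl.opengl.Display;
    import org.lwjgl.opengl.DisplayMode;
    import org.lwjgl.opengl.PixelFormat;

    public class AlphaDisplay {
        public static void main(String[] args) throws LWJGLException {
            Display.setDisplayMode(new DisplayMode(800, 600));
            // PixelFormat(alpha, depth, stencil): request 8 alpha bits up front
            Display.create(new PixelFormat(8, 24, 8));
            // ... render loop would go here ...
            Display.destroy();
        }
    }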

If you're running windowed, you may need to switch the desktop to 32-bit colour rather than 16-bit. Or, if you don't want to do that, you might consider running fullscreen. However, JOGL tends to be pretty buggy when it comes to creating fullscreen displays...

How are you setting your pixel format for the GL context? Are you implicitly requesting a 32-bit colour format, or explicitly declaring 8 red bits, 8 green bits, 8 blue bits and 8 alpha bits? I've had a similar problem where my destination alpha worked on ATI cards but not on NVIDIA ones. It seems to be some sort of mismatch in how the contexts are created...

No, I am not setting the pixel format, just relying on the default. But if I query it, I get 8 bits for each of R, G and B, and 0 bits for alpha, both on NVIDIA and ATI.

Should there be 8 alpha bits?

Anyone else got an idea? This is quite annoying, since ATI users can't play my game, and I really want that fixed.
