blend modes behaving weird on ATIs
hello, the blendfuncs i use work fine on NVIDIA graphics cards, but on ATI cards they produce weird results.
objects that have an alpha value other than 1 (which makes them transparent, obviously) seem to shine through light objects in front of them, exactly where the alpha channel of the texture is.
screenshot:
here, the object on the right is completely opaque and in front of the flower, but the flower's alpha channel still shines through in gray.
on NVIDIA cards this looks perfectly correct.
the blend function used is
glBlendFunc(GL_DST_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
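(for reference, this is the arithmetic that blend func performs per colour channel -- a sketch of the fixed-function math, not actual GL code:)

// result = src * dstAlpha + dst * (1 - srcAlpha)
// if the framebuffer has no alpha bits, dstAlpha is stuck at 1.0,
// so the source colour is never attenuated by what is already there.
static float blend(float src, float srcAlpha, float dst, float dstAlpha) {
    return src * dstAlpha + dst * (1.0f - srcAlpha);
}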
Your rendering context doesn't have its alpha channel configured correctly on each video card, by the look of it. That would explain why GL_DST_ALPHA fails: with no destination alpha in the framebuffer, it defaults to 1.0f.
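You could try requesting destination alpha explicitly when creating the drawable. A minimal sketch, assuming the JSR-231-style javax.media.opengl API (class names and packages differ across JOGL versions):

import javax.media.opengl.GLCanvas;
import javax.media.opengl.GLCapabilities;

// Sketch: explicitly ask for a framebuffer with destination alpha.
static GLCanvas createCanvasWithAlpha() {
    GLCapabilities caps = new GLCapabilities();
    caps.setRedBits(8);
    caps.setGreenBits(8);
    caps.setBlueBits(8);
    caps.setAlphaBits(8); // without this, GL_DST_ALPHA reads as 1.0 everywhere
    return new GLCanvas(caps);
}

If the driver can't honour the request you should get a different (or failed) pixel format, which at least tells you where the problem is.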
anyone?
this is quite important to me. :-) otherwise my game won't work on ATI cards.
btw: i am using JOGL and i tried to manually set the bits for RGB and alpha, but it made no difference.
Are you running on Linux? JOGL won't ever create a framebuffer with an alpha component on Linux (unless you're lucky, like with those nVidia drivers where you get it without asking for it).
I reported this as a bug ages ago (must have been about two years ago) and the response was that they don't care and won't fix it. That's one of the reasons I switched back to LWJGL.
thanks for your reply.
but all the computers i tested on were running WinXP with the latest ATI drivers.
If you're running windowed, you may need to switch the desktop to 32-bit colour rather than 16-bit. Or if you don't want to do that, you might consider running fullscreen. However, JOGL tends to be pretty buggy when it comes to creating fullscreen displays...
How are you setting the pixel format for your GL context? Are you relying on an implied 32-bit colour depth, or explicitly declaring 8 red bits, 8 green bits, 8 blue bits and 8 alpha bits? I've had a similar problem where my destination alpha worked on ATI cards but not on NVidia ones. Seems to be some sort of mismatch in how the contexts are created...
no, i am not setting the pixel format, just relying on the default. but if i query it, i get 8 bits each for RGB and 0 bits for alpha, both on NVIDIA and ATI.
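(i query the bit depths roughly like this in init() -- a sketch assuming the JSR-231 GL API; GL_ALPHA_BITS is a legacy query but works on these contexts:)

import javax.media.opengl.GL;
import javax.media.opengl.GLAutoDrawable;

// Sketch: dump the framebuffer bit depths the driver actually gave us.
public void init(GLAutoDrawable drawable) {
    GL gl = drawable.getGL();
    int[] bits = new int[4];
    gl.glGetIntegerv(GL.GL_RED_BITS,   bits, 0);
    gl.glGetIntegerv(GL.GL_GREEN_BITS, bits, 1);
    gl.glGetIntegerv(GL.GL_BLUE_BITS,  bits, 2);
    gl.glGetIntegerv(GL.GL_ALPHA_BITS, bits, 3);
    System.out.println("RGBA bits: " + bits[0] + "/" + bits[1] + "/" + bits[2] + "/" + bits[3]);
}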
should there be 8 bits?