Hi,
Can somebody please tell me how I can stop texture interpolation from blending in fully transparent pixels? I am just using this code for interpolation:
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
At the moment, my sprites that use transparency all have unnaturally coloured edges when I use interpolation, which isn't pretty.
Thanks.
You can't selectively interpolate some texels and not others.
However, this doesn't have to be a problem. If you make all of your transparent pixels black, and then use the "pre-multiplied alpha" blend mode (i.e. src=1, dst=1-srcAlpha), there won't be any artefacts around the edges.
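In OpenGL, that blend mode would be set up with something like:

glEnable(GL_BLEND);
glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA); // src=1, dst=1-srcAlpha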
Interesting, I have the same problem. Please help me understand how it works (I am using pre-multiplied alpha already).
[attachment=11170:TransparentArtifacts_2012-09-09.jpeg]
It is the sampling from the texture that gets a bad color, isn't it? So a position between a white non-transparent pixel and a black transparent pixel would become 50% transparent gray? If so, using pre-multiplied alpha won't help as long as the colors are wrong?
There is another possibility: Using mipmaps, but you have to create the mipmaps yourself. That way, you can use an averaging algorithm that ignores transparent pixels.
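A rough sketch of such an averaging step (my own illustration, not code from this thread): a 2x2 box filter that averages color only over non-transparent texels, while alpha is averaged over all four as usual. Each resulting level would then be uploaded with glTexImage2D at the corresponding mip level.

/* Build one RGBA8 mip level from the level above, ignoring fully
   transparent texels when averaging the color. srcW and srcH are even;
   dst holds (srcW/2) x (srcH/2) RGBA texels. */
void downsample_ignore_transparent(const unsigned char *src, int srcW, int srcH,
                                   unsigned char *dst)
{
    int dstW = srcW / 2, dstH = srcH / 2;
    for (int y = 0; y < dstH; ++y)
        for (int x = 0; x < dstW; ++x) {
            int r = 0, g = 0, b = 0, a = 0, n = 0;
            for (int dy = 0; dy < 2; ++dy)
                for (int dx = 0; dx < 2; ++dx) {
                    const unsigned char *p =
                        src + ((y * 2 + dy) * srcW + (x * 2 + dx)) * 4;
                    a += p[3];
                    if (p[3] > 0) { r += p[0]; g += p[1]; b += p[2]; ++n; }
                }
            unsigned char *q = dst + (y * dstW + x) * 4;
            q[0] = n ? (unsigned char)(r / n) : 0; /* color: opaque texels only */
            q[1] = n ? (unsigned char)(g / n) : 0;
            q[2] = n ? (unsigned char)(b / n) : 0;
            q[3] = (unsigned char)(a / 4);         /* alpha: all four texels */
        }
}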
Yes, so if we've got two pixels A=(1,1,1,1) and B=(0,0,0,0), then sampling exactly between them gives S=A*0.5+B*0.5, or S=(0.5, 0.5, 0.5, 0.5).
For example, let's say the background is light green: G=(0.5, 1, 0.5).
With regular alpha blending (src=srcAlpha, dst=1-srcAlpha), we get:
S*0.5 + G*0.5
=(0.25, 0.25, 0.25) + (0.25, 0.5, 0.25)
=(0.5, 0.75, 0.5) <- darker than the original background, hence the black outline around the sprite.
With premultiplied alpha blending (src=1, dst=1-srcAlpha), we get:
S + G*0.5
=(0.5, 0.5, 0.5) + (0.25, 0.5, 0.25)
=(0.75, 1, 0.75) <- brighter than the original background, no black outline any more.
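(If you want to double-check that arithmetic, here is a tiny standalone program; the numbers are the ones from the example above.)

#include <stdio.h>

int main(void)
{
    float S[4] = {0.5f, 0.5f, 0.5f, 0.5f}; /* sample halfway between A and B */
    float G[3] = {0.5f, 1.0f, 0.5f};       /* light green background */
    for (int i = 0; i < 3; ++i) {
        float regular = S[i] * S[3] + G[i] * (1.0f - S[3]); /* src=srcAlpha, dst=1-srcAlpha */
        float premul  = S[i]        + G[i] * (1.0f - S[3]); /* src=1, dst=1-srcAlpha */
        printf("channel %d: regular=%.2f, premultiplied=%.2f\n", i, regular, premul);
    }
    return 0; /* prints 0.50/0.75, 0.75/1.00, 0.50/0.75 */
}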
[quote name='larspensjo' timestamp='1347177365' post='4978204'](I am using pre-multiplied alpha already)[/quote]
What does the code for setting your blend mode look like?
Can I just ask, does DirectX suffer from this same issue?
Yes. There isn't really an issue ("issue" as in "fault") here - GL/DX do what you tell them to.
Bilinear filtering tells the GPU to blend together the nearest 4 pixels. If the RGB values of your 'transparent' pixels are wrong, then this blending won't work.
[quote]Are there any shaders I could use to deal with this?[/quote]
Well, you could disable the GPU's bilinear filtering and instead re-invent it yourself in the pixel shader. You'd have to take 4 nearest-filtered samples (from the 4 texels closest to the UV coord), calculate the regular bilinear weights as usual, but then set the weights of any transparent pixels to 0 and renormalize the remaining weights.
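A sketch of what that might look like (my own illustration, not tested; assumes the texture is bound with GL_NEAREST filtering and that a "texSize" uniform holds the texture's dimensions):

uniform sampler2D tex;
uniform vec2 texSize;

vec4 bilinearSkipTransparent(vec2 uv)
{
    vec2 st = uv * texSize - 0.5;             // position in texel space
    vec2 f = fract(st);                       // regular bilinear fractions
    vec2 base = (floor(st) + 0.5) / texSize;  // centre of the top-left texel

    vec4 t00 = texture2D(tex, base);
    vec4 t10 = texture2D(tex, base + vec2(1.0, 0.0) / texSize);
    vec4 t01 = texture2D(tex, base + vec2(0.0, 1.0) / texSize);
    vec4 t11 = texture2D(tex, base + vec2(1.0, 1.0) / texSize);

    // The usual bilinear weights...
    vec4 w = vec4((1.0 - f.x) * (1.0 - f.y),
                  f.x         * (1.0 - f.y),
                  (1.0 - f.x) * f.y,
                  f.x         * f.y);

    // ...but zero the weight of any fully transparent texel.
    w *= vec4(t00.a > 0.0 ? 1.0 : 0.0,
              t10.a > 0.0 ? 1.0 : 0.0,
              t01.a > 0.0 ? 1.0 : 0.0,
              t11.a > 0.0 ? 1.0 : 0.0);

    float total = w.x + w.y + w.z + w.w;
    if (total <= 0.0)
        return vec4(0.0);                     // all four texels were transparent

    return (t00 * w.x + t10 * w.y + t01 * w.z + t11 * w.w) / total;
}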
This would be a lot more complex and a lot slower than just ensuring that your transparent pixels are black.
The problem is I don't actually have control over any of this myself, as I am working on a user game development environment: the user could use any textures themselves, and call upon any blend functions for that matter. I ask about DirectX because this problem doesn't occur in another game development environment which I am trying to emulate, and it uses DirectX instead of OpenGL. I'm wondering how on earth the guy that wrote that managed to resolve this issue so nicely.
I apologise for not giving the full circumstances before; I was trying not to complicate things.
grasshop, please excuse me for hijacking your thread; hopefully it is still within your area of interest.
After some debugging, I think I see where my problem is. The magnification and minification filters produce bitmaps with pixels that are effectively pre-multiplied alpha (as long as transparent areas are black), but my original bitmap is not. That is, I have something that is supposed to look like a window, so it is mostly fully transparent, with some white lines in it. The lines have some anti-aliased pixels created by the bitmap editor (GIMP). These are also fully white, but with alpha ranging from 0 to 1.
[attachment=11172:WindowBase_2012-09-09.jpeg]
The shader multiplied all pixels by the alpha, to get pre-multiplied alpha. That works fine for the bitmap itself, but not for areas interpolated by magnification and minification, which will now have the alpha multiplied in twice. If I disable magnification and minification filtering, it looks "fine". I wonder if I can use GIMP to make the colors pre-multiplied by the alpha?
[quote name='grasshop']The problem is I don't actually have control over any of this myself, as I am working on a user game development environment: the user could use any textures themselves, and call upon any blend functions for that matter.[/quote]
Do you have control over the image loading functions? If so, I think you can modify those that handle RGBA formats to apply pre-multiplied alpha at load time. That's the design idea I am looking into just now.
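Something along these lines, applied to the RGBA data right after decoding (a sketch, assuming 8-bit RGBA and that the shader then no longer multiplies by alpha itself):

/* Premultiply each pixel's color by its alpha at load time. As a side
   effect, fully transparent pixels become black, which is exactly what
   the texture filtering needs. */
void premultiply_alpha(unsigned char *pixels, int pixelCount)
{
    for (int i = 0; i < pixelCount; ++i) {
        unsigned char *p = pixels + i * 4;
        p[0] = (p[0] * p[3] + 127) / 255; /* r = round(r * a / 255) */
        p[1] = (p[1] * p[3] + 127) / 255;
        p[2] = (p[2] * p[3] + 127) / 255;
    }
}

That also avoids the double multiplication described above.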
You say you have no control over how they are blended. If that is the case, there simply exists no solution that will always work.