I'm trying to use a GLSL shader to transform a texture and place the result in another one. It mostly works - however, if I try to store any value less than or equal to 0.1 in the alpha channel of any pixel of the texture, no data at all seems to get written to the texture. Very weird. If I ensure that the value I write to the alpha channel is always at least 0.101, everything works as expected (except that I lose valuable precision in my data).
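To put a number on that precision loss: assuming a standard 8-bit RGBA render target with round-to-nearest quantization (an assumption - the actual target format isn't stated), clamping alpha to a minimum of 0.101 makes roughly the bottom 10% of the channel's codes unreachable:

```python
# Count the 8-bit alpha codes that become unreachable when every
# written alpha value is clamped to at least 0.101.
def to_byte(a: float) -> int:
    """Quantize a [0, 1] float to an 8-bit channel value (round to nearest)."""
    return round(max(0.0, min(1.0, a)) * 255)

lowest_written = to_byte(0.101)   # smallest byte the clamped shader can produce
lost_codes = lowest_written       # codes 0 .. lowest_written - 1 are never written

print(lowest_written)  # 26
print(lost_codes)      # 26 of 256 codes (~10%) wasted
```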
Anyone got an idea as to why this happens?
Here's the shader. Note the "special" variable at the end. Just putting vertical.y straight into the alpha channel breaks the entire texture:
#version 130
uniform sampler2D tex;
out vec4 outColor;
void main() {
//translate u and v into [-1, 1] domain
float u0 = (gl_TexCoord[0].x * 2) - 1;
float v0 = (gl_TexCoord[0].y * 2) - 1;
//then, as u0 approaches 0 (the center), v should also approach 0
v0 = v0 * abs(u0);
//convert back from [-1,1] domain to [0,1] domain
v0 = (v0 + 1) * 0.5;
//we now have the coordinates for reading from the initial image
vec2 newCoords = vec2(gl_TexCoord[0].x, v0);
vec4 horizontal = texture2D(tex, newCoords);
vec4 vertical = texture2D(tex, newCoords.yx);
// TODO: WHY DOES vertical.y NOT WORK? Why does it seem alpha is so special?
float special = vertical.y;
if (special < 0.101)
special = 0.101;
outColor = vec4(horizontal.x, horizontal.y, vertical.x, special);
}
I'm not doing anything particularly exotic when rendering with the shader. The calling code looks like this:
GL.Disable(EnableCap.Blend)
// The target FBO for rendering
GL.BindFramebuffer(FramebufferTarget.Framebuffer, ShadowBuffer)
GL.Viewport(0, 0, res, res)
// The target texture attachment in the FBO
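The snippet above cuts off at the attachment comment; for context, a minimal sketch of what typically follows in OpenTK - the texture name ShadowTexture and the exact parameters here are assumptions, not the original code:

```
// Attach the target texture to the FBO's first color attachment
// (ShadowTexture is a hypothetical name for the texture object)
GL.FramebufferTexture2D(FramebufferTarget.Framebuffer,
    FramebufferAttachment.ColorAttachment0,
    TextureTarget.Texture2D, ShadowTexture, 0)
GL.DrawBuffer(DrawBufferMode.ColorAttachment0)
```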
You may be seeing an optimization called "alpha testing". Don't ask me how to disable it in OpenGL - I'd venture a guess that it is possible though.
In normal rendering, a pixel with a low alpha value contributes little to the final image (when alpha blending), so rendering can be sped up by not writing fragments whose alpha falls below a given threshold. A high-quality implementation would set the threshold to the minimum alpha value representable on the render target, so all meaningful pixels would still be written and only the actually invisible ones discarded. In a case like yours, though, the intended behavior is to write the pixels regardless of whether they pass any visual threshold.
In D3D, the threshold can be set directly through device states, but I don't have enough OpenGL expertise to name the equivalent state. However, I hope I've pointed you in the right direction. I guess googling for "alpha test opengl" would yield some results.
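For reference, in fixed-function (compatibility-profile) OpenGL the equivalent state can be controlled like this - a sketch in the OpenTK style of the calling code above; note that the alpha test belongs to the fixed-function pipeline and was removed from core contexts, so it should not affect a pure shader path at all:

```
// Either disable the alpha test entirely...
GL.Disable(EnableCap.AlphaTest)
// ...or make it pass unconditionally
GL.AlphaFunc(AlphaFunction.Always, 0.0f)
```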
Hmm, I tried adding GL.Disable(EnableCap.AlphaTest), but that didn't help.
However, I just now noticed that rendering with the shader program results in an Invalid Operation error if I try removing the special handling of the alpha channel. Still don't know why.
What are your exact code changes when you take out the special handling of the alpha? In particular, check the conditional logic around the special handling...
This doesn't work, and results in no usable data when rendering the texture:
#version 130
uniform sampler2D tex;
out vec4 outColor;
void main() {
//translate u and v into [-1 , 1] domain
float u0 = (gl_TexCoord[0].x * 2) - 1;
float v0 = (gl_TexCoord[0].y * 2) - 1;
//then, as u0 approaches 0 (the center), v should also approach 0
v0 = v0 * abs(u0);
//convert back from [-1,1] domain to [0,1] domain
v0 = (v0 + 1) * 0.5;
//we now have the coordinates for reading from the initial image
vec2 newCoords = vec2(gl_TexCoord[0].x, v0);
//read for both horizontal and vertical direction and store them in separate channels
vec4 horizontal = texture2D(tex, newCoords);
vec4 vertical = texture2D(tex, newCoords.yx);
// TODO: WHY DOES vertical.y NOT WORK? Why does it seem alpha is so special?
float special = vertical.y;
outColor = vec4(horizontal.x, horizontal.y, vertical.x, special);
}
This, however, works:
#version 130
uniform sampler2D tex;
out vec4 outColor;
void main() {
//translate u and v into [-1 , 1] domain
float u0 = (gl_TexCoord[0].x * 2) - 1;
float v0 = (gl_TexCoord[0].y * 2) - 1;
//then, as u0 approaches 0 (the center), v should also approach 0
v0 = v0 * abs(u0);
//convert back from [-1,1] domain to [0,1] domain
v0 = (v0 + 1) * 0.5;
//we now have the coordinates for reading from the initial image
vec2 newCoords = vec2(gl_TexCoord[0].x, v0);
//read for both horizontal and vertical direction and store them in separate channels
vec4 horizontal = texture2D(tex, newCoords);
vec4 vertical = texture2D(tex, newCoords.yx);
// TODO: WHY DOES vertical.y NOT WORK? Why does it seem alpha is so special?
float special = vertical.y;
if (special < 0.101)
special = 0.101;
outColor = vec4(horizontal.x, horizontal.y, vertical.x, special);
}
I suspect I'm invoking undefined behaviour at some point in the shader, but I can't for the life of me see it (and OpenGL reports no errors when compiling the shader).
I don't spot any obvious errors. I'm more used to HLSL (D3D shader language), but I assume that the data type semantics are very similar here.
If it is indeed the shader that causes errors, I would be ready to blame the driver's shader compiler just about now. The shader code itself is not even that complex.
Weird, it should work...
You may also try to play with the alpha function and cutoff level,
for example glAlphaFunc(GL_ALWAYS, 0).
Other than that, it may be a bug in your driver. You could try rearranging the code a bit - for example, instead of newCoords.yx, copy the values to a new vec2 and shuffle manually. You may also try adding some extra shader instructions (like a multiplication by 0.9999) here or there to see if it goes away.
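A sketch of that rearrangement in GLSL, assuming the shader from above - the manual copy replaces the .yx swizzle, and the 0.9999 multiplication is the kind of visually harmless extra instruction meant to nudge the driver's optimizer:

```
//manual shuffle instead of newCoords.yx
vec2 swapped = vec2(newCoords.y, newCoords.x);
vec4 vertical = texture2D(tex, swapped);
//extra no-op-ish instruction to perturb the compiler
float special = vertical.y * 0.9999;
```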