OpenGL color interpolation not linear?

4 comments, last by ReaperSMS 10 years, 3 months ago

I have a rather simple problem that I am positive has a simple solution. I am rendering a quad twice as two layers, each with a color. The first is green, the second is red. I am using the alpha channel of the color component for the quad's vertices to blend the two colors together across the quads.

Attached below are two images. Image A is how it looks when rendered with OpenGL; image B is how it should look. The black in image A is the background bleeding through. If OpenGL interpolated the color values linearly there would be no bleeding: just a smooth transition from red to green, with no black showing. But that's obviously not the case.

I've tried using a GLSL program and setting the color variable to noperspective but it makes no difference. Is there a way to force OpenGL to do plain linear interpolation of my vertex colors, so that the red and green blend evenly across the quad like in image B?
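For reference, this is roughly the shader pair I tried (illustrative only; I'm relying on the compatibility profile, since I submit geometry with glColor4f/glVertex3f):

// Vertex shader: 'noperspective' requests plain screen-space linear
// interpolation instead of perspective-correct interpolation
#version 130
noperspective out vec4 vColour;

void main()
{
    vColour = gl_Color;          // per-vertex color from glColor4f
    gl_Position = ftransform();  // fixed-function transform
}

// Fragment shader: matching qualifier on the input
#version 130
noperspective in vec4 vColour;
out vec4 fragColour;

void main()
{
    fragColour = vColour;
}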

Additional information:


Red layer alpha values:
1.0 -------- 0.0
|             |
|             |
|             |
0.0 -------- 1.0

Green layer alpha values:
0.0 -------- 1.0
|             |
|             |
|             |
1.0 -------- 0.0


Thanks!

In image A your colours have completely faded out before even the halfway point... Something is definitely wrong there -- I don't think this is a feature/design of GL.
Maybe post some more code/explanation of what you're doing?

Also, are you actually rendering a quad, or is it 2 triangles?

What are the vertex colors at each of the 4 corners of the two quads?

What blend mode are you using?

Are you rendering to sRGB?

Off the wall initial guesses, likely one of these or a combination of the two, though some quick envelope math doesn't match up directly:

Gamma is being misunderstood somewhere along the line.

Your vertex colors are (1, 0, 0, 1), and (0, 0, 0, 0) for the red quad, (0, 1, 0, 1) and (0, 0, 0, 0) for the green quad, and you're using alpha blending, so your color ramps drop off faster than you would expect.

For diagnosing, I suggest trying vertex colors of (1, 0, 0, 1) and (1, 0, 0, 0) for red, and (0, 1, 0, 1) and (0, 1, 0, 0) for green. You might also use additive blending instead of src/inv_src. Also try a simple vertical gradient, both as two composited quads and as a single quad going from (1, 0, 0, 1) at the top to (0, 1, 0, 1) at the bottom.
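Something like this, for instance (a sketch only; the coordinates are placeholders for however you lay the quads out):

// Diagnostic: keep RGB constant per layer, vary only alpha, and blend
// the passes additively so they can't darken each other
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE);  // additive instead of src/inv_src

glBegin(GL_TRIANGLES);
glColor4f(1, 0, 0, 1); glVertex3f(0, 0, 0);  // opaque red corner
glColor4f(1, 0, 0, 0); glVertex3f(1, 0, 0);  // transparent red corner
glColor4f(1, 0, 0, 0); glVertex3f(1, 1, 0);
// ...second triangle, then the same again with (0, 1, 0, a) for green...
glEnd();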

Thanks for the help!


Also, are you actually rendering a quad, or is it 2 triangles?

It's being rendered as two triangles. I've read that OpenGL uses barycentric coordinates for interpolation, which might cause the issue as seen here. But that individual was rendering in 2D, so he was able to bypass the problem by generating his texture coordinates from screen space (rather than interpolating them from the vertices).
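His fix looked roughly like this in the fragment shader (my reconstruction; 'viewportWidth' is a hypothetical uniform fed from the application):

// Derive the gradient factor from window coordinates instead of an
// interpolated vertex attribute
#version 120
uniform float viewportWidth;

void main()
{
    float t = gl_FragCoord.x / viewportWidth;      // 0..1 across the screen
    gl_FragColor = mix(vec4(1.0, 0.0, 0.0, 1.0),   // red
                       vec4(0.0, 1.0, 0.0, 1.0),   // to green
                       t);
}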


Maybe post some more code/explanation of what you're doing?

Drawing two quads (one red and one green) that overlay and blend together to create a linear fade from red to green (as seen in image B). Somehow the interpolation of the color channel is non-linear, resulting in the background bleeding through as seen in image A. Code:


// Draw the red quad (as two triangles); in immediate mode the current
// color must be set *before* the vertex it applies to
glBegin(GL_TRIANGLES);

glColor4f(1, 0, 0, 0);
glVertex3f(0, 1, 0);

glColor4f(1, 0, 0, 0);
glVertex3f(1, 1, 1);

glColor4f(1, 0, 0, 1);
glVertex3f(1, 1, 0);

glColor4f(1, 0, 0, 1);
glVertex3f(0, 1, 1);

glColor4f(1, 0, 0, 0);
glVertex3f(1, 1, 1);

glColor4f(1, 0, 0, 0);
glVertex3f(0, 1, 0);

glEnd();

// Draw the green quad (color again set before each vertex)
glBegin(GL_TRIANGLES);

glColor4f(0, 1, 0, 1);
glVertex3f(0, 1, 0);

glColor4f(0, 1, 0, 1);
glVertex3f(1, 1, 1);

glColor4f(0, 1, 0, 0);
glVertex3f(1, 1, 0);

glColor4f(0, 1, 0, 0);
glVertex3f(0, 1, 1);

glColor4f(0, 1, 0, 1);
glVertex3f(1, 1, 1);

glColor4f(0, 1, 0, 1);
glVertex3f(0, 1, 0);

glEnd();
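The blend state in effect for both quads (set up elsewhere before drawing) is:

glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);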


What are the vertex colors at each of the 4 corners of the two quads?
What blend mode are you using?
Are you rendering to sRGB?

See the above for the vertex colors. The blend mode is glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA). I am using sRGB. I am using Ogre3D, so I can't verify for certain whether gamma correction is happening.
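If gamma were the culprit, one quick test would be to turn off sRGB conversion on framebuffer writes and see whether the gradient changes (assuming the context has an sRGB-capable default framebuffer; I haven't confirmed what Ogre sets up):

// Requires GL 3.0 or ARB_framebuffer_sRGB; if the gradient changes,
// a linear-to-sRGB conversion was being applied on write
glDisable(GL_FRAMEBUFFER_SRGB);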

I did some Googling and changed my shader to "fragColour.a = pow(outColour.a, 1.0 / 5.0);" rather than "fragColour.a = outColour.a;" and it seems to have done the trick! The OpenGL registry says, though, that gamma correction shouldn't happen on alpha values for textures. What about vertex colors? That, and a gamma factor of 5 is a lot bigger than the standard 2.2.


I did some Googling and changed my shader to "fragColour.a = pow(outColour.a, 1.0 / 5.0);" rather than "fragColour.a = outColour.a;" and it seems to have done the trick! The OpenGL registry says, though, that gamma correction shouldn't happen on alpha values for textures. What about vertex colors? That, and a gamma factor of 5 is a lot bigger than the standard 2.2.

IMHO there must be a problem elsewhere! First, the fragment alpha hack should not be needed at all when you use glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA); just leave the fragment's alpha alone. Second, alpha is never meaningfully targeted by gamma correction, neither in textures nor in vertex colors nor anywhere else.

How do you handle the two passes with respect to the blend operation? Is glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA) used in both cases? If so, then it is clear that the background has an effect on the result.

You should be using glBlendFunc(GL_SRC_ALPHA, GL_ONE).

At a point where both of them are interpolated to 50% alpha, using SRC_ALPHA, ONE_MINUS_SRC_ALPHA, you will get 0.5 green, 0.25 red, 0.25 background, because you're doing this:

Output = 0.5 * Green + (1-0.5) * ( 0.5 * red + (1-0.5) * background );
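With glBlendFunc(GL_SRC_ALPHA, GL_ONE) instead, and assuming a black background, the same 50% point works out to an even mix with no background term:

Pass 1 (red):   Out1 = 0.5 * Red + Background          // black background, so 0.5 * Red
Pass 2 (green): Out2 = 0.5 * Green + Out1 = 0.5 * Green + 0.5 * Red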

