Thanks! Actually, I'm trying to create a water surface where the fog level (originally the alpha component) depends on the angle at which the surface (the vertices) is seen.
This is a simplified shader, but it produces the same artifact:
#version 330 core
layout(location = 0) in vec3 vertex_pos;
layout(location = 1) in vec3 vertex_color;

uniform mat4 V_rot;
uniform mat4 V_tran;
uniform mat4 P;

out vec4 color_VOUT;

void main(){
    vec4 pos_TRAN_ONLY = V_tran * vec4(vertex_pos, 1.0);
    float pos_dist = length(pos_TRAN_ONLY.xyz);
    float alpha_level = 1.0 - abs(pos_TRAN_ONLY.z) / pos_dist; // 1 - sin(alpha)
    color_VOUT = vec4(vertex_color, 1.0);
    color_VOUT.r = alpha_level; // set the RED component, so it's more visible
    gl_Position = P * (V_rot * pos_TRAN_ONLY);
}
Sorry, I don't get why passing the distance instead of the color would interpolate any better.
...now that I think about it, it may be that the color (which depends on the distance) is interpolated linearly along an edge of a triangle, while the actual distance from the camera is not linear along that edge. For example, with view-space endpoints (-4, 0, 1) and (4, 0, 1), the per-vertex distance is √17 ≈ 4.12 at both ends, so the interpolated value at the midpoint is also 4.12, while the true distance at the midpoint (0, 0, 1) is 1.
Edit:
The fragment shader doesn't do anything now, it just sets the output color.
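If the problem really is that the per-vertex alpha gets blended linearly across each triangle, one way around it is to pass the translated position to the fragment shader and compute the distance/alpha there, per fragment. A sketch of that idea below, based on the simplified shader above (the varying name pos_tran_VOUT is made up here, and I haven't tested this against your setup):

```glsl
// Vertex shader: output the translated position instead of the alpha.
#version 330 core
layout(location = 0) in vec3 vertex_pos;
layout(location = 1) in vec3 vertex_color;
uniform mat4 V_rot;
uniform mat4 V_tran;
uniform mat4 P;
out vec4 color_VOUT;
out vec3 pos_tran_VOUT; // hypothetical varying: view-translated position
void main(){
    vec4 pos_TRAN_ONLY = V_tran * vec4(vertex_pos, 1.0);
    pos_tran_VOUT = pos_TRAN_ONLY.xyz;
    color_VOUT = vec4(vertex_color, 1.0);
    gl_Position = P * (V_rot * pos_TRAN_ONLY);
}

// Fragment shader: the position varying is interpolated across the
// triangle, so length() is evaluated at every fragment instead of the
// alpha being blended linearly between the three vertex values.
#version 330 core
in vec4 color_VOUT;
in vec3 pos_tran_VOUT;
out vec4 frag_color;
void main(){
    float pos_dist = length(pos_tran_VOUT);
    float alpha_level = 1.0 - abs(pos_tran_VOUT.z) / pos_dist; // 1 - sin(alpha)
    frag_color = color_VOUT;
    frag_color.r = alpha_level; // RED again, for visibility
}
```

This costs a length() per fragment instead of per vertex, but the alpha then follows the actual geometry instead of the triangulation.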