I'm trying to implement a shader that does color grading using a 2D lookup texture.
I'm using a 2D texture (rather than a 3D one) because it needs to run on all mobile devices capable of OpenGL ES 2.0.
The lookup table textures used for color grading look like this.
The Y axis contains the green values; the X axis contains both red and blue (each blue value selects a slice along X, and red indexes within that slice).
This is the same format Unreal Engine uses for color grading.
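To make the layout concrete, this is roughly how I generate an identity LUT in this format (a Python sketch; I'm assuming LUT.Height = 16 here, so the texture is 256×16 texels):

```python
# Sketch: generate an identity LUT in the layout described above.
# Assumes LUT.Height = 16, so the texture is (16*16) x 16 = 256 x 16 texels.
N = 16                   # LUT.Height: number of values per channel
width, height = N * N, N

lut = [[None] * width for _ in range(height)]
scale = 255 // (N - 1)   # map index 0..N-1 to byte value 0..255
for b in range(N):           # blue selects the slice along X
    for g in range(N):       # green runs along Y
        for r in range(N):   # red runs within each slice
            x = r + b * N    # same indexing the shader uses
            y = g
            lut[y][x] = (r * scale, g * scale, b * scale)
```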
My shader works on all of my mobile devices, and on my desktop with an AMD card.
It does not work properly on my laptop with an ATI card, though: some pixels get completely wrong colors, while most look perfectly fine.
This is the shader; can anyone find a problem with it?
uniform sampler2D color_table_texture; // LUT texture with nearest filter, mipmaps off
uniform mediump float color_table_elements; // LUT.Height
uniform mediump float color_table_scale; // 256 / LUT.Height
uniform mediump vec3 color_table_clamp; // vec3(LUT.Height-1)
lowp vec3 ColorTableLookup(mediump vec3 color)
{
    mediump float base_value = 255.0;
    mediump vec3 rescaler = vec3(base_value, base_value, base_value);
    color *= rescaler;                   // convert to 0..255 range
    color /= color_table_scale;          // convert to LUT range (0..LUT.Height)
    mediump vec3 delta = fract(color);   // get interpolation value
    mediump vec3 rgb1 = floor(color);    // get lower color
    mediump vec3 rgb2 = ceil(color);     // get higher color
    rgb2 = min(rgb2, color_table_clamp); // clamp to LUT.Height-1

    // look up first color
    mediump vec2 ofs = vec2(rgb1.r + rgb1.b * color_table_elements, rgb1.g);
    ofs.x /= (color_table_elements * color_table_elements);
    ofs.y /= color_table_elements;
    mediump vec3 temp = texture2D(color_table_texture, ofs).rgb;

    // look up second color
    ofs = vec2(rgb2.r + rgb2.b * color_table_elements, rgb2.g);
    ofs.x /= (color_table_elements * color_table_elements);
    ofs.y /= color_table_elements;
    mediump vec3 temp2 = texture2D(color_table_texture, ofs).rgb;

    // interpolate
    return mix(temp, temp2, delta);
}
Note that while here I'm doing two lookups and interpolating between them, I also tried a trimmed-down version that does a single lookup with no interpolation; the problem still happens.
I've tried many things, and it seems that some of the texture reads result in wrong lookups.
I don't know if this is a precision issue when generating the offsets for the lookup.
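To illustrate what I mean by a precision issue: with nearest filtering, the sampled texel is (roughly) floor(u * width), and my offsets land exactly on texel boundaries, so even a tiny floating-point error in the shader's mediump divisions could flip the result to the neighboring texel. A quick sketch in Python (assuming LUT.Height = 16):

```python
# Sketch of the shader's offset math, assuming LUT.Height = 16.
N = 16.0                  # color_table_elements
width = N * N             # LUT width in texels (256)

def texel_sampled(u):
    # With nearest filtering, the sampled texel is roughly floor(u * width).
    return int(u * width)

# My current offset for rgb1 = (3, g, 2) lands exactly on a texel edge:
u_edge = (3.0 + 2.0 * N) / width       # 35/256: boundary between texels 34 and 35
# Adding half a texel would center the coordinate inside texel 35:
u_center = (3.0 + 2.0 * N + 0.5) / width

# In exact arithmetic both pick texel 35, but u_edge has no safety margin:
assert texel_sampled(u_edge) == 35
assert texel_sampled(u_edge - 1e-9) == 34    # a tiny error flips to the wrong texel
assert texel_sampled(u_center - 1e-9) == 35  # the centered coordinate is robust
```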
Is there a way to access a texture using integer texel coordinates (0..Texture.Size-1) instead of normalized 0..1 coordinates?
Or what else could be the problem?