GLSL texture2D problem.
I've asked this elsewhere, but no one has been able to help me. I have a fragment shader:
uniform sampler2D bTexture, tTexture, lTexture;

void main(void)
{
    // Sample the three textures using the built-in texture coordinate sets
    vec4 tColor1 = texture2D(bTexture, gl_TexCoord[0].st);
    vec4 tColor2 = texture2D(tTexture, gl_TexCoord[1].st);
    vec4 lColor  = texture2D(lTexture, gl_TexCoord[2].st) + (gl_Color - 0.5);

    // Blend the two base textures by the second texture's alpha,
    // then modulate the result by the light color
    gl_FragColor = vec4(mix(tColor1.rgb, tColor2.rgb, tColor2.a), tColor1.a + tColor2.a) * lColor;
}
This code works perfectly on GeForce cards. The problem is that it runs slowly on Radeon cards. If I remove the texture2D calls, the slowdown goes away, so I suspect the problem is there, but I'm unsure what's causing it.
You shouldn't compute or look up the UV coords in the fragment shader.
Make a varying variable to store the UV coords, write it in the vertex shader, and then read that varying in the fragment shader.
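For what it's worth, gl_TexCoord[] is itself a built-in varying in this old GLSL dialect, so it already has to be written by the vertex stage. A minimal sketch of the explicit-varying version of the same shaders (the vUV0/vUV1/vUV2 names are made up for illustration):

```glsl
// --- vertex shader ---
varying vec2 vUV0, vUV1, vUV2;

void main(void)
{
    // Pass each texture unit's coordinates through as a custom varying
    vUV0 = gl_MultiTexCoord0.st;
    vUV1 = gl_MultiTexCoord1.st;
    vUV2 = gl_MultiTexCoord2.st;
    gl_FrontColor = gl_Color;
    gl_Position = ftransform();
}

// --- fragment shader ---
uniform sampler2D bTexture, tTexture, lTexture;
varying vec2 vUV0, vUV1, vUV2;

void main(void)
{
    // Same sampling as before, but through the custom varyings
    vec4 tColor1 = texture2D(bTexture, vUV0);
    vec4 tColor2 = texture2D(tTexture, vUV1);
    vec4 lColor  = texture2D(lTexture, vUV2) + (gl_Color - 0.5);

    gl_FragColor = vec4(mix(tColor1.rgb, tColor2.rgb, tColor2.a), tColor1.a + tColor2.a) * lColor;
}
```

The key point is that the coordinates arrive interpolated and unmodified, so the hardware can treat the lookups as non-dependent texture reads.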