bencelot

GLSL runs extremely slow on some computers [solved]


Heyo :) I've noticed that my game runs at less than 1 fps on my friend's computer. This only happens when he enables shaders, which leads me to believe they are causing the problem. He says his card is a Radeon X1300 Series. I've done some research and it sounds as if his computer has decided to run the shaders in software mode. The weird thing is that the shaders compile and link just fine on his computer; there is no warning saying that GLSL is going to revert to software mode, and yet it's still extremely slow.

My questions are:

1) Is it possible for GLSL to switch to software rendering sometime after the program has been compiled and linked? If so, is there a way to detect when this has happened, so I can revert to a rendering path that doesn't involve shaders?

2) Are there any tips you might have to help me improve my shader code? I've heard that if statements, branching, and uniforms can all drastically slow things down on some cards. Is this true for the Radeon X1300 Series?

Here is the shader code:
//THE SHADER CODE
  const char* shadowCode = 
    "#version 100\n"

    "uniform sampler2D baseTexture;"
    "uniform sampler2D shadowTexture;"
    "uniform sampler1D shadowHeightTexture;"
    "uniform bool mapMode;"
    "uniform bool fadeMode;"
    "uniform float contrast;"

    "void main() {"

    "  vec4 baseColour = texture2D(baseTexture, gl_TexCoord[0].st); "

    "  float shadowIntensity = texture2D(shadowTexture, gl_TexCoord[1].st).a; "
    "  if(mapMode || fadeMode) { "
    "    if(gl_TexCoord[2].s < 0.5) { "
    "      shadowIntensity = min(1.0, shadowIntensity + (1.0-texture1D(shadowHeightTexture,gl_TexCoord[2].s).r) ); "
    "    } else { "
    "      shadowIntensity = max(shadowIntensity, (1.0-texture1D(shadowHeightTexture,gl_TexCoord[2].s).r) ); "
    "    } "
    "    shadowIntensity *= shadowIntensity; "
    "  } "

    "  baseColour *= gl_Color; "

    "  float lumin = (baseColour.r*0.3 + baseColour.g*0.59 + baseColour.b*0.11); "
    "  baseColour.gb *= (contrast*((lumin - baseColour.gb)/2.0) + 1.0); "
    "  baseColour.rgb *= (lumin*contrast + 0.5*(2.0-contrast) + 0.2*contrast); "




    "  if(mapMode) { "
    "    baseColour.rgb *= (1.0-shadowIntensity/3.0); "
    "  } "
    
    "  if(fadeMode) { "
    "    baseColour.a *= (1.0 - shadowIntensity); "
    "  } "

    " gl_FragColor = baseColour; "

    "}";



Any help would be greatly appreciated. Cheers! Ben.

OK, I just rewrote the shader to remove all if statements, but that didn't help; it still runs dreadfully slow.

So my first question still remains: is there a way to detect when GLSL switches to software mode, so I can revert to a rendering method that doesn't involve shaders?

Quote:
Original post by bencelot
OK, I just rewrote the shader to remove all if statements, but that didn't help; it still runs dreadfully slow.

So my first question still remains: is there a way to detect when GLSL switches to software mode, so I can revert to a rendering method that doesn't involve shaders?


Not really. Your only chance is to parse the shader info log (glGetShaderInfoLog), looking for keywords like "software"... This might also give you a clue as to what has gone wrong.
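
Something like this might work (just a sketch, using the same ARB entry points you already have; the wording of the log is entirely driver-specific, so treat a hit as a hint rather than a guarantee):

//Heuristic only: scan the info log for the word "software".
//Drivers are not required to report a fallback here at all.
//Needs <string>, <algorithm> and <cctype>.
bool LogMentionsSoftware(GLhandleARB obj) {
  char infoLog[1024] = {0};
  glGetInfoLogARB(obj, sizeof(infoLog) - 1, NULL, infoLog);
  std::string log(infoLog);
  std::transform(log.begin(), log.end(), log.begin(), ::tolower); //case-insensitive match
  return log.find("software") != std::string::npos;
}

You'd call it on both the shader object after compiling and the program object after linking.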

Nothing stands out in your shader; the X1300 is a Shader Model 3.0 card and should be able to cope with this code. Maybe the issue is caused by something outside the shader?

What texture formats and parameters are you using? (The X1x00 series and earlier hardware have several limitations on floating-point textures and non-power-of-two textures.)

It could be the old issue where enabling polygon smoothing or line smoothing causes it to run in software mode.

You might want to optimize your shader as well:

baseColour.gb *= (contrast*((lumin - baseColour.gb)/2.0) + 1.0);

turns into

baseColour.gb *= (contrast*((lumin - baseColour.gb)*0.5) + 1.0);

Multiplying by the reciprocal is cheaper than dividing and lets the compiler fold the expression into multiply-add (MAD) instructions:
http://www.opengl.org/wiki/GLSL_:_common_mistakes#Get_MAD
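
You could also flatten the mapMode/fadeMode branches into straight-line math. A sketch, assuming you swap the bool uniforms for floats set to 1.0 or 0.0 (the mapModeF/fadeModeF names are just for illustration):

uniform float mapModeF;   //1.0 when map mode is on, 0.0 otherwise
uniform float fadeModeF;  //1.0 when fade mode is on, 0.0 otherwise

//mix(x, y, a) = x*(1.0-a) + y*a, so a 0/1 flag just selects one of the two values
baseColour.rgb *= mix(1.0, 1.0 - shadowIntensity/3.0, mapModeF);
baseColour.a   *= mix(1.0, 1.0 - shadowIntensity, fadeModeF);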

Thanks V-man, I'll check that link out now.

Fiddler, I'm already printing the info log, but nothing displays. I assume this is because it's successful.

glGetInfoLogARB does the same thing as glGetShaderInfoLog, right? I would gladly parse the info log for "software" if only it would show up!

Here's the function I'm using to create shaders:


void ImageManager::CreateShader(const char* code, GLenum& program) {
  GLint status;
  char infoLog[500];

  program = glCreateProgramObjectARB();
  GLenum fragShader = glCreateShaderObjectARB(GL_FRAGMENT_SHADER_ARB);

  //Compiling Shaders
  glShaderSourceARB(fragShader, 1, &code, NULL);
  glCompileShaderARB(fragShader);

  glGetObjectParameterivARB(fragShader, GL_OBJECT_COMPILE_STATUS_ARB, &status);
  if(status == GL_TRUE) {
    cout << "Shadow Shader...OK" << endl;
  } else {
    cout << "Couldn't compile shaders!!" << endl;
  }
  glGetInfoLogARB(fragShader, sizeof(infoLog), NULL, infoLog);
  cout << infoLog << endl;
  if(status != GL_TRUE) return;

  //Attaching and Linking Shaders
  glAttachObjectARB(program, fragShader);
  glLinkProgramARB(program);

  //The link status and link log live on the program object, not the shader
  glGetObjectParameterivARB(program, GL_OBJECT_LINK_STATUS_ARB, &status);
  if(status == GL_TRUE) {
    cout << "Shadow Linking...OK" << endl;
  } else {
    cout << "Couldn't Link Shaders" << endl;
  }
  glGetInfoLogARB(program, sizeof(infoLog), NULL, infoLog);
  cout << infoLog << endl;
  if(status != GL_TRUE) return;

  glUseProgramObjectARB(0);
}
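
I call it with something like this (shadowProgram is just the GLenum handle I keep for the program; the name is only for illustration):

CreateShader(shadowCode, shadowProgram);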




All it displays is:

Shadow Shader...OK

Shadow Linking...OK

All my textures are power-of-two. The game runs just fine when he disables shaders, but as soon as he enables them the game freezes and runs at < 1 fps.
No, he hasn't updated his drivers, and when this is all done I'll tell him to do so. The problem is that other players won't have updated their drivers either, and might start playing at < 1 fps.

It's weird because I'm checking for all 3 necessary extensions using GLEE:

allowShaders = false;
if(GLEE_ARB_shader_objects && GLEE_ARB_fragment_shader && GLEE_ARB_shading_language_100) allowShaders = true;

and the info log isn't reporting any errors or messages, so I can't see if it's dropped to software mode either.
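
At least I can log what card and driver each player is on (glGetString is core OpenGL, so it should work everywhere):

//Log the card and driver so slow-performance reports can be matched to hardware
cout << "GL_RENDERER: " << (const char*)glGetString(GL_RENDERER) << endl;
cout << "GL_VENDOR: " << (const char*)glGetString(GL_VENDOR) << endl;
cout << "GL_VERSION: " << (const char*)glGetString(GL_VERSION) << endl;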

Just to clarify: is it possible for GLSL to switch to software mode AFTER the linking stage? If so, you'd hope there'd be a way to test for it, because the difference is huge!

Just another thought: is there something I can put in my shader code that will force it to run in hardware? Or, failing that, at least disable the ability to run in software, forcing an error instead of a warning?

Stick your hand up if you're awesome!!!

V-man, stick your hand up :)

It was:

glEnable(GL_LINE_SMOOTH);

that was causing all my problems. Soo happy!
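
For anyone else who hits this, I'm now just making sure line smoothing is never on while the shader is. A sketch (shadersEnabled is my own flag; the name is only for illustration):

//GL_LINE_SMOOTH can push some ATI drivers onto a software path,
//so keep it off whenever the shader program is in use
if(shadersEnabled) {
  glDisable(GL_LINE_SMOOTH);
} else {
  glEnable(GL_LINE_SMOOTH);
}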

Thank you very much.

Ben.
