bencelot

OpenGL Strange problem linking fragment shaders..


Hey there, I'm trying to link a fragment shader. It works fine for most people, but not for one of my players. When he tries to link the shaders he gets the following output:

Could Not Link Shaders!!!
Fragment shader(s) linked, no vertex shader(s) defined.

My game doesn't use vertex shaders, though I was under the impression that if I didn't supply one, the default fixed-function pipeline would be used. So I'm wondering why the linking fails, despite the error message saying "Fragment shader(s) linked". Here's the relevant section of code (it works on other computers):
  //...

  glAttachObjectARB(program, fragShader);
  glLinkProgramARB(program);

  GLint compileResults[3];
  char infoLog[500];

  glGetObjectParameterivARB(fragShader, GL_LINK_STATUS, compileResults);
  if((*compileResults) != GL_TRUE) {
    cout << "Could Not Link Shaders!!!" << endl;
    cs.AddLog("Could Not Link Shaders - Update Drivers.");

    //Just displaying the Info Log (this is where the error is displayed)
    CheckInfoLog(program);
    CheckInfoLog(fragShader);
  }

  //...


void ImageManager::CheckInfoLog(GLhandleARB obj) {
  int infologLength = 0;
  int charsWritten  = 0;
  char *infoLog;

  glGetObjectParameterivARB(obj, GL_OBJECT_INFO_LOG_LENGTH_ARB,
    &infologLength);

  if (infologLength > 0)
  {
    infoLog = (char *)malloc(infologLength);
    glGetInfoLogARB(obj, infologLength, &charsWritten, infoLog);

    //Displaying the error message!!
    cout << infoLog << endl;

    free(infoLog);
  }
}


Has anyone ever seen this sort of problem before? compileResults != GL_TRUE, and yet the error message says "Fragment shader(s) linked". Weirdness! By the way, this computer's OpenGL version is 2.1.8787. Any help or suggestions would be greatly appreciated. Cheers, Ben.

It seems that this player has buggy drivers that incorrectly require a vertex program whenever you use a fragment program. Unless your fragment program reads user-defined varyings, a vertex program should not be mandatory, though it is of course allowed.

Do you know the graphics card and driver version of your player's hardware?

Not sure about the other guy, but the same problem just popped up for another player who just got a new graphics card.

His card: XFX Radeon HD 5770
His OpenGL Version: 2.1.9116

Not sure about the actual driver version. I'm not using any vertex shaders or any varying variables in any of my shaders. It works fine on most computers even though no vertex shader is defined (as expected).

I'm wondering if the shader actually WILL work, and compileResults != GL_TRUE is just a warning. Does compileResults only ever equal GL_TRUE or GL_FALSE? Because if it can equal something like GL_WARNING then maybe I can check for that and still allow the shaders to run.

Oh deary, is this true? ATI really sucks.

If so, does anyone know what the default vertex shader is so it doesn't mess with anything?

Quote:
His card: XFX Radeon HD 5770
His OpenGL Version: 2.1.9116

ATi generally provides good drivers. However, if the player is using the drivers that came out of the box, chances are the OpenGL driver *may* not be perfect. His best bet would be to install the latest drivers for his graphics card, if he hasn't already.

Quote:
I'm wondering if the shader actually WILL work, and compileResults != GL_TRUE is just a warning. Does compileResults only ever equal GL_TRUE or GL_FALSE? Because if it can equal something like GL_WARNING then maybe I can check for that and still allow the shaders to run.

compileResults will always be either GL_TRUE or GL_FALSE, according to the shader objects specification:
http://www.opengl.org/registry/specs/ARB/shader_objects.txt - § 2.14.2, LinkProgramARB

If the result is GL_FALSE, it means that the linker failed, so you cannot use this shader. Maybe some drivers will still run the shader if you bind it, but in the best case you will get undefined results and in the worst case your application will simply crash, freeze the system, eat your dog, or whatever. In practice, calling glUseProgramObjectARB after a link failure will set the GL error to GL_INVALID_OPERATION, again according to the shader objects spec:
http://www.opengl.org/registry/specs/ARB/shader_objects.txt - § 2.14.2, UseProgramObjectARB
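For what it's worth, a minimal sketch of how that failure shows up at run time, assuming program is the handle from the code above (the fallback logic is just an illustration, not part of the original code):

  glUseProgramObjectARB(program);              //program whose link failed
  if (glGetError() == GL_INVALID_OPERATION) {
    //The driver refused the program object; fall back to fixed function
    glUseProgramObjectARB(0);
    cout << "Link failed, running without shaders." << endl;
  }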

As a side note, this line is not valid:
glGetObjectParameterivARB(fragShader, GL_LINK_STATUS, compileResults);

You should ask for GL_OBJECT_LINK_STATUS_ARB instead:
glGetObjectParameterivARB(fragShader, GL_OBJECT_LINK_STATUS_ARB, compileResults);

In fact this only happens to work because GL_LINK_STATUS and GL_OBJECT_LINK_STATUS_ARB have the same numeric value, but that is a hack that should not be relied on.
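For clarity, a small sketch of the corrected check. Per the shader_objects spec the link status belongs to the program object, so querying program (rather than fragShader) is probably what is intended here; the shader object itself carries a compile status:

  GLint linked = GL_FALSE;
  glGetObjectParameterivARB(program, GL_OBJECT_LINK_STATUS_ARB, &linked);

  GLint compiled = GL_FALSE;
  glGetObjectParameterivARB(fragShader, GL_OBJECT_COMPILE_STATUS_ARB, &compiled);

  if (linked != GL_TRUE || compiled != GL_TRUE) {
    //Same log dump as in the original code
    CheckInfoLog(program);
    CheckInfoLog(fragShader);
  }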

Quote:
Original post by nuno_silva_pt
Unlike nVidia, all ATI cards need a VS when using a FS, at least that's what i heard somewhere a long time ago.

AFAIK that's an old issue that was supposed to have been corrected the year before. However, I haven't tried it since then, so maybe the problem still lurks around.

Quote:
Original post by bencelot
If so, does anyone know what the default vertex shader is so it doesn't mess with anything?

Don't even think about it. First, you won't get it. Secondly, even if you did have it, I'm not sure a single vertex shader would work for all ATi cards, nor would you know how well it performs. Your best bet is to write your own vertex shader which passes exactly what you need, and pray that a single vertex shader is enough for all your fragment-shaded objects (otherwise you will have to write a vertex shader for every case).
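If it comes to that, here is a rough sketch of what such a pass-through vertex shader could look like for simple textured, colored geometry. The names passthroughVS and vertShader are just illustrative, and the shader assumes your fragment shaders only read gl_Color and gl_TexCoord[0]; add more outputs if they need them:

  //Minimal vertex shader emulating the fixed-function transform, passing
  //the vertex colour and one set of texture coordinates straight through
  const char* passthroughVS =
    "void main() {\n"
    "  gl_Position = ftransform();\n"
    "  gl_FrontColor = gl_Color;\n"
    "  gl_TexCoord[0] = gl_TextureMatrix[0] * gl_MultiTexCoord0;\n"
    "}\n";

  GLhandleARB vertShader = glCreateShaderObjectARB(GL_VERTEX_SHADER_ARB);
  glShaderSourceARB(vertShader, 1, &passthroughVS, NULL);
  glCompileShaderARB(vertShader);

  //Attach alongside fragShader before calling glLinkProgramARB(program)
  glAttachObjectARB(program, vertShader);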

ATI cards work very well with a fragment shader alone. They have for years.

Does your player have the latest drivers?

No, I don't think so... but you know, you can't trust players to have the latest drivers, and I'd like to make the game work for as many people as possible.

It still seems strange to me that it'd complain about not having a vertex shader when it shouldn't need one.

I saw a similar topic a week or so ago, and it was solved by adding a vertex shader (the guy *did* have the latest drivers).

Emulating the fixed-function vertex pipeline in a shader shouldn't be a problem; at least, I didn't understand vincoof's point about performance and incompatibilities, since using no shader is, after compilation, effectively the same as using a "default" shader.

I have no experience with GLSL, but the fixed-function shader is on Wikipedia: http://en.wikipedia.org/wiki/GLSL

Good Luck!
