glCreateProgram/glCreateProgramObjectARB not working

18 comments, last by RIZAX 15 years, 11 months ago
I have the oddest problem: would anyone know why both glCreateProgram and glCreateProgramObjectARB refuse to return a valid value on non-NVIDIA hardware? I've tried my shader on 5 machines (2 NVIDIA, 2 ATI, 1 Intel), and only the NVIDIA ones work. The program compiles/links fine on all platforms, but on the non-NVIDIA platforms, when I try to create a program instance, I get nothing but invalid values back.

Platforms:
Dual 7800GT = success
Single 8800GT = success
X1600 = fail
9800xt = fail
Integrated Intel (didn't catch which chipset, but it was a Dell with a Centrino in it) = fail

Thanks for any feedback.
What does the error log say? Checking for errors is a start.
"Works on nVidia but not on all others" is usually due to illegal GLSL code which is using something that's legal in Cg. The nVidia compiler is permissive with those things, as it compiles via Cg anyway. The others aren't and fail (being "permissive" isn't necessarily good in this case, however).

For example, using the "half" datatype or assigning an int to a float (GLSL does no implicit casts) are things to look for.
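If it helps, here is a minimal sketch of pulling the compiler's info log after compiling, which is where the strict compilers report exactly this kind of GLSL error (compileShaderWithLog is just an illustrative name; it assumes a current GL context and an initialized extension loader):

GLuint compileShaderWithLog(GLenum type, const char* source)
{
    GLuint shader = glCreateShader(type);
    glShaderSource(shader, 1, &source, NULL);
    glCompileShader(shader);

    // Strict drivers set GL_COMPILE_STATUS to GL_FALSE and explain why in the log
    GLint status = GL_FALSE;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &status);
    if (status != GL_TRUE)
    {
        char log[1024];
        glGetShaderInfoLog(shader, sizeof(log), NULL, log);
        fprintf(stderr, "shader compile failed:\n%s\n", log);
    }
    return shader;
}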
Thanks for the replies,

The error code is 0x0501 when I try to execute either, so that's GL_INVALID_VALUE.

I understand that my GLSL might not be the greatest (actually it isn't, given that the frag shader is one giant main), and I will look at cleaning it up, but first I need to be able to create a program instance, which is where the error occurs.

The error happens in my loader class, in the constructor of all places.

The constructor code is (typing this at work, so bear with me if there's some syntax issue):
Shader()
{
    GLenum error;
    program = glCreateProgram();
    if ((error = glGetError()) != GL_NO_ERROR)
        exit(error);
}

error is always 0x0501 on non-NVIDIA machines. BTW, I'm using GLEW for the extensions, in case that has a bearing on the problem.

[Edited by - RIZAX on May 7, 2008 6:15:24 AM]
You should try this:

std::cout << gluErrorString(glGetError());

See what is reported. Also, if you haven't already, download RenderMonkey and dump your shader code into it to see if you have errors. It will tell you if you do.
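As a sketch of that idea (checkGL is a made-up name; glGetError can have several errors queued, so it loops until GL_NO_ERROR):

void checkGL(const char* where)
{
    GLenum err;
    while ((err = glGetError()) != GL_NO_ERROR)
        std::cout << where << ": " << gluErrorString(err)
                  << " (0x" << std::hex << err << std::dec << ")" << std::endl;
}

Then sprinkle checkGL("after glCreateProgram"); around the suspect calls.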
It's all very well you people saying 'check the GLSL', but it's the program object creation which is failing; we haven't even got to the shader source code yet.

@OP

The only thing I can say is: check that you have a valid context, as the code posted looks fine to me.
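A quick way to sanity-check that (just a sketch: on most implementations glGetString returns NULL when no context is bound, though strictly that behavior is undefined, so treat it as a hint rather than a guarantee):

const GLubyte* version = glGetString(GL_VERSION);
if (version == NULL)
    fprintf(stderr, "no current GL context?\n");
else
    printf("GL_VERSION: %s\n", (const char*)version);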
Probably he has a class, and this is his constructor:

Shader()
{
    GLenum error;
    program = glCreateProgram();
    if ((error = glGetError()) != GL_NO_ERROR)
        exit(error);
}


so the constructor gets called while there is no GL context or even a window yet. It's the typical C++/Java/other OO-language problem: the object is constructed (often as a global or a member) before the code that creates the context has run.
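If that's what is happening, one common fix (a sketch, assuming a setup like the one posted) is to keep the constructor free of GL calls and move them into an explicit init() that the application calls only after the window and context exist:

class Shader
{
public:
    Shader() : program(0) {}   // no GL calls here

    void init()                // call after glutCreateWindow()/glewInit()
    {
        program = glCreateProgram();
        if (glGetError() != GL_NO_ERROR || program == 0)
            fprintf(stderr, "glCreateProgram failed\n");
    }

private:
    GLuint program;
};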
Sig: http://glhlib.sourceforge.net
an open source GLU replacement library. Much more modern than GLU.
float matrix[16], inverse_matrix[16];
glhLoadIdentityf2(matrix);
glhTranslatef2(matrix, 0.0, 0.0, 5.0);
glhRotateAboutXf2(matrix, angleInRadians);
glhScalef2(matrix, 1.0, 1.0, -1.0);
glhQuickInvertMatrixf2(matrix, inverse_matrix);
glUniformMatrix4fv(uniformLocation1, 1, GL_FALSE, matrix);
glUniformMatrix4fv(uniformLocation2, 1, GL_FALSE, inverse_matrix);
Everything should be up and running. To verify, I put a small main together just to see what was happening:

int main(int argc, char** argv)
{
    GLuint program;

    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE | GLUT_DEPTH);
    glutInitWindowPosition(400, 100);
    glutInitWindowSize(640, 480);
    glutCreateWindow("ExtensionTest");

    GLenum err = glewInit();
    if (err == GLEW_OK)
        logmsg("GLEW WORKS!!!");
    else
    {
        logmsg("GLEW NO INIT\n");
        exit(1);
    }

    if (GLEW_VERSION_2_0)
        logmsg("Opengl 2.0 is supported");
    else
    {
        logmsg("Opengl 2.0 not supported");
        exit(1);
    }

    program = glCreateProgram();
    logmsg("program: %u", program);

    return 0;
}


which produces the output:

GLEW WORKS!!!
Opengl 2.0 is supported
program: 2147483649


The odd thing that I just started to notice is that the value program gets is always the same on every platform that's not NVIDIA... the plot thickens....
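One hedge worth noting: per the spec, 0 is the only failure value glCreateProgram can return, and drivers are free to hand back any nonzero handle, so a big value like 2147483649 (0x80000001) isn't automatically invalid. A quick sanity check (a sketch reusing the logmsg helper from the test program above) might be:

program = glCreateProgram();
if (program == 0)
    logmsg("glCreateProgram really failed");
else if (glIsProgram(program))
    logmsg("program %u is a valid program object", program);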
He must have a valid context; he said the code works fine with Nvidia cards. If you don't have a valid context, nothing should work: not just one vendor, all vendors should fail. Unless I am not reading it clearly that he has gotten it to work on Nvidia hardware...

[Edited by - MARS_999 on May 7, 2008 10:18:26 PM]
Quote: Original post by MARS_999
It must have a valid context, he said the code works fine with Nvidia cards. If you don't have a valid context, nothing will work, not just one vendor. Unless I am not reading it clearly that he has gotten it to work on Nvidia hardware...


You're right, everything works like a charm on my 8800/7800... it dies everywhere else.

This topic is closed to new replies.
