glCreateProgram/glCreateProgramObjectARB not working


I have the oddest problem: would anyone know why both glCreateProgram and glCreateProgramObjectARB refuse to return a valid value on non-NVIDIA hardware? I've tried my shader on 5 machines (2 NVIDIA, 2 ATI, 1 Intel) and only the NVIDIA ones work. The program compiles/links fine on all platforms, but on the non-NVIDIA machines, when I try to create a program instance, I get nothing but invalid values back.

Platforms:
Dual 7800GT = success
Single 8800GT = success
X1600 = fail
9800xt = fail
Integrated Intel (some chipset - didn't catch which one, but it was a Dell with a Centrino in it) = fail

Thanks for any feedback.

"Works on nVidia but not on all others" is usually due to illegal GLSL code which is using something that's legal in Cg. The nVidia compiler is permissive with those things, as it compiles via Cg anyway. The others aren't and fail (being "permissive" isn't necessarily good in this case, however).

For example, using the "half" datatype or assigning an int to a float (GLSL does no implicit casts) are things to look for.
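As an illustration (not the OP's actual shader, which isn't posted), here is the kind of code NVIDIA's Cg-based compiler tends to accept but strict GLSL 1.10 compilers reject:

```glsl
// Rejected by strict GLSL 1.10 compilers, but often accepted by NVIDIA's Cg path:
// half brightness = 0.5;   // "half" is a Cg type, not GLSL
// float scale = 2;         // int-to-float needs an explicit conversion in GLSL 1.10

// Portable GLSL equivalents:
float brightness = 0.5;
float scale = 2.0;
```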

Thanks for the replies,

The error code is 0x0501 when I try to execute either, so that's GL_INVALID_VALUE.

I understand that my GLSL might not be the greatest (actually it isn't, given that the frag shader is one giant main), and I plan to clean it up, but first I need to be able to create a program instance, which is where the error occurs.

The error happens in my loader class in the constructor of all places.

The constructor code is (typing this at work, so bear with me if there's a syntax issue):

Shader()
{
    GLenum error;
    program = glCreateProgram();
    if ((error = glGetError()) != GL_NO_ERROR)
        exit(error);
}



error is always 0x0501 on non-NVIDIA machines. BTW, I'm using GLEW for the extensions, in case that has a bearing on the problem.

[Edited by - RIZAX on May 7, 2008 6:15:24 AM]

You should try this:

std::cout << gluErrorString(glGetError());

See what is reported. Also, if you haven't already, download RenderMonkey, dump your shader code into it, and see if you have errors. It will tell you if you do.

It's all very well you people saying 'check the GLSL', but it's the program object creation that is failing; we haven't even got to the source code yet.

@OP

The only thing I can say is check you have a valid context as the code posted looks fine to me.

He probably has a class, and this is his constructor:


Shader()
{
    GLenum error;
    program = glCreateProgram();
    if ((error = glGetError()) != GL_NO_ERROR)
        exit(error);
}



so the constructor gets called while there is no GL context or even a window. It's the typical C++/Java/other OO language problem.

Everything should be up and running. To verify, I put a small little main together just to see what was happening:


int main(int argc, char** argv)
{
    GLuint program;
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE | GLUT_DEPTH);
    glutInitWindowPosition(400, 100);
    glutInitWindowSize(640, 480);
    glutCreateWindow("ExtensionTest");

    GLenum err = glewInit();
    if (err == GLEW_OK)
        logmsg("GLEW WORKS!!!");
    else
    {
        logmsg("GLEW NO INIT\n");
        exit(1);
    }

    if (GLEW_VERSION_2_0)
        logmsg("Opengl 2.0 is supported");
    else
    {
        logmsg("Opengl 2.0 not supported");
        exit(1);
    }

    program = glCreateProgram();
    logmsg("program: %u", program);
    return 0;
}

which produces the output:


GLEW WORKS!!!
Opengl 2.0 is supported
program: 2147483649




The odd thing that I just started to notice is that the value program gets is always the same on every platform that's not NVIDIA ... the plot thickens....

It must have a valid context; he said the code works fine with Nvidia cards. If you don't have a valid context, nothing should work on any vendor, not just one. Unless I'm misreading his claim that he has gotten it to work on Nvidia hardware...

[Edited by - MARS_999 on May 7, 2008 10:18:26 PM]

Quote:
Original post by MARS_999
It must have a valid context, he said the code works fine with Nvidia cards. If you don't have a valid context, nothing will work, not just one vendor. Unless I am not reading it clearly that he has gotten it to work on Nvidia hardware...


You're right, everything works like a charm on my 8800/7800 ... it dies everywhere else.
