glCreateProgram/glCreateProgramObjectARB not working

Started by
18 comments, last by RIZAX 15 years, 11 months ago
Just because the context is valid on one configuration doesn't mean it is valid on all configurations; that's why it's my go-to question.

So, the questions remain:
does the GLUT code posted 'work' on all your test systems?
what glGetError() values do you get back if not? (a minimal check along these lines is sketched just below)
does the returned value really change? (forget the meaning of the value glCreateProgram() returns; there is no standard for it and it could mean anything depending on the driver)
what does a program such as realtech VR's OpenGL caps viewer tell you about the broken target systems?
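For anyone following along, the first two questions boil down to a check like the one below. This is only a sketch: it assumes a GLEW-based loader (the thread never says which extension loader is in use), and createProgramChecked is a made-up helper name.

/* Sketch only: assumes GLEW has been initialised after a valid context was
 * created; the helper name is hypothetical. */
#include <GL/glew.h>
#include <stdio.h>

GLuint createProgramChecked(void)
{
    /* Bail out early if neither GL 2.0 nor the ARB shader extensions exist. */
    if (!GLEW_VERSION_2_0 && !GLEW_ARB_shader_objects) {
        fprintf(stderr, "No GLSL support on this context\n");
        return 0;
    }

    while (glGetError() != GL_NO_ERROR)   /* drain any stale error state */
        ;

    GLuint prog = glCreateProgram();
    printf("glCreateProgram() -> %u, glGetError() -> 0x%04X\n",
           prog, glGetError());
    return prog;
}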
Ok, I am going to ask the obvious: do you have these functions called somewhere?

glCreateShader
glShaderSource
glCompileShader
glAttachShader
glLinkProgram

Better yet, dump some more code, specifically the code that deals with the GLSL setup. BTW, did you try your shader code in RenderMonkey yet, just to make sure the shader source itself works? (The usual call sequence is sketched below for reference.)
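For reference, the five calls listed above slot into a GL 2.0 sequence roughly like the following. This is a sketch rather than anyone's actual loader; buildProgram and its parameters are made-up names and error checking is omitted for brevity.

/* Rough GL 2.0 setup order; vertSrc/fragSrc are hypothetical null-terminated
 * GLSL source strings. */
GLuint buildProgram(const char *vertSrc, const char *fragSrc)
{
    GLuint vs = glCreateShader(GL_VERTEX_SHADER);
    glShaderSource(vs, 1, &vertSrc, NULL);
    glCompileShader(vs);

    GLuint fs = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(fs, 1, &fragSrc, NULL);
    glCompileShader(fs);

    GLuint prog = glCreateProgram();   /* must succeed before attach/link */
    glAttachShader(prog, vs);
    glAttachShader(prog, fs);
    glLinkProgram(prog);
    return prog;
}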

To Phantom:

That GLUT code works fine on all my systems as far as compiling, linking, and running goes. I added the glGetError() call after the program object creation and it's returning 0, so I started looking elsewhere. As far as the GL caps of the target systems go, GPU Caps Viewer is reporting 102 extensions available, which include the ARB extensions for fragment and vertex shaders. So after double-checking everything and setting everything to use the ARB extensions, I can get everything to run up to glShaderSourceARB(), where I receive 0x0500 (GL_INVALID_ENUM), which for the moment only happens on the vertex shaders (I've run 3 fragment shaders through and it only happens when I try to run the 2 vertex shaders).
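(For illustration, the ARB-suffixed path with a glGetError() check after each call looks roughly like this. It's a sketch, not the actual loader from the thread; loadVertexShaderARB and src are made-up names.)

#include <GL/glew.h>
#include <stdio.h>

GLhandleARB loadVertexShaderARB(const char *src)
{
    GLhandleARB sh = glCreateShaderObjectARB(GL_VERTEX_SHADER_ARB);
    printf("create:  0x%04X\n", glGetError());

    /* count = 1, lengths = NULL means the string is null-terminated */
    glShaderSourceARB(sh, 1, &src, NULL);
    printf("source:  0x%04X\n", glGetError());

    glCompileShaderARB(sh);
    printf("compile: 0x%04X\n", glGetError());
    return sh;
}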

I'll try figuring out some more stuff tomorrow ...

thanks for all the input guys.
Quote:Original post by MARS_999
Ok, I am going to ask the obvious: do you have these functions called somewhere?

glCreateShader
glShaderSource
glCompileShader
glAttachShader
glLinkProgram



Unimportant.

If the problem is with glCreateProgram() as he mentioned, then none of those have even come into play yet. You can't attach and link a program together if you can't create a program object in the first place.

Meant to come back to this earlier, but work and fire alarms have stopped me...

Quote:Original post by RIZAX
To Phantom:

That GLUT code works fine on all my systems as far as compiling, linking, and running goes. I added the glGetError() call after the program object creation and it's returning 0, so I started looking elsewhere. As far as the GL caps of the target systems go, GPU Caps Viewer is reporting 102 extensions available, which include the ARB extensions for fragment and vertex shaders. So after double-checking everything and setting everything to use the ARB extensions, I can get everything to run up to glShaderSourceARB(), where I receive 0x0500 (GL_INVALID_ENUM), which for the moment only happens on the vertex shaders (I've run 3 fragment shaders through and it only happens when I try to run the 2 vertex shaders).

I'll try figuring out some more stuff tomorrow ...

thanks for all the input guys.


OK, so the start of your reply seems to indicate that GLSL works fine elsewhere, which means the problem lies with the setup of the application that is having the problem.

The bit of code you pasted earlier looks fine, which means something isn't being set up sanely prior to that code being called; that's the only way it's likely to fail. As previously mentioned, the context is the main suspect, but it could be something else; without more code it's hard to tell.

But it seems you've got past that now? Getting GL_INVALID_ENUM is a little odd, as it's not an error you'd expect to see from glShaderSourceARB(), mostly because it doesn't take an enum. Posting the source might help there as well.

Anyways, let us know if you figure it out [smile]
UPDATE - RESOLVED!!!

After reading about the differences in how the shaders compile, I re-verified everything yet again. I also went back to the straight 2.0 spec in my loader class and verified that everything was loading correctly. The last error I was reporting traced back to a compile error in my fragment shader, so I went line by line through the fragment shader and found something very, very odd.
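For anyone hitting the same thing, a compile failure like that can be surfaced through the GL 2.0 entry points roughly as follows; a sketch, with printCompileLog being a made-up helper name.

#include <GL/glew.h>
#include <stdio.h>

void printCompileLog(GLuint shader)
{
    GLint ok = GL_FALSE;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
    if (ok != GL_TRUE) {
        char log[1024];
        GLsizei len = 0;
        glGetShaderInfoLog(shader, sizeof(log), &len, log);
        fprintf(stderr, "shader compile failed: %.*s\n", (int)len, log);
    }
}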

I have a varying normal that I pass into the fragment shader from the vertex shader, which I'm using to compute light intensities. Whenever I try to use it directly in a dot product operation, it fails with a compile error, but if I set a local equal to it and then use the local, everything works fine. Aside from that little tidbit, it's running on my X1600 now.
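A sketch of the workaround being described, shown as the C string the fragment shader might be loaded from; lightDir and the surrounding structure are hypothetical.

/* Hypothetical fragment shader illustrating the local-copy workaround. */
static const char *frag_workaround =
    "varying vec3 normal;\n"
    "uniform vec3 lightDir;\n"
    "void main()\n"
    "{\n"
    "    vec3 n = normal;                       // copy the varying into a local\n"
    "    float d = max(dot(n, lightDir), 0.0);  // using 'normal' here directly tripped the compiler\n"
    "    gl_FragColor = vec4(vec3(d), 1.0);\n"
    "}\n";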

Again thanks for all the input guys.
Glad to hear you figured it out. The only thing is, this could have been avoided if you had used RenderMonkey as I pointed out earlier. RM is a nice app for checking your shader code in the future, before you beat your head against the wall in your own code.

Happy coding.
Yeah, I grabbed RM and FX Composer. Since I'm still new to shaders, I'm still trying to figure out the two apps so I can decide which one I want to use.

Also, I figured out why the normal thing was producing an odd problem, and magically it's a type mismatch: int vs. float. It produces a warning in the compiler that is allowed to pass on NVIDIA hardware, but the others complain about it. Also, the logging information that the shader compiler on ATI hardware produces is rather lacking... all I get is random characters out, whereas the NVIDIA compiler produces full strings describing what the problem is. I honestly don't know how all the artists deal with these flimsy standards...
IMO NVIDIA's OpenGL support is better than ATI's, but that will start a flame war. I have used both, and right now ATI's extension support for newer features, e.g. DX10-level functionality, isn't here yet, whereas NVIDIA's is. That is why I use NVIDIA hardware for coding now. Do you mean float vs. int as in writing 1.0f vs. 1.0?
No, I was taking a sum that was a float and dividing it by gl_MaxLights, which is defined as an int. When compiling on NVIDIA hardware you get a warning about the type mismatch, but on ATI hardware it completely fails to compile, and of course it then won't link. At first I thought it had something to do with the normal calculation, which was inside the same code block, but it was really the division.
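That matches GLSL 1.10's rules: the 1.10 spec defines no implicit int-to-float conversion (that arrived in 1.20), so a strict compiler rejects the mixed division while a more lenient one only warns. A sketch of the usual fix, an explicit constructor cast, again shown as a C string with hypothetical surrounding code:

/* Hypothetical fragment shader showing the division fixed with float(). */
static const char *frag_fixed =
    "varying vec3 normal;\n"
    "void main()\n"
    "{\n"
    "    vec3 n = normalize(normal);\n"
    "    float sum = max(dot(n, normalize(vec3(gl_LightSource[0].position))), 0.0)\n"
    "              + max(dot(n, normalize(vec3(gl_LightSource[1].position))), 0.0);\n"
    "    // float avg = sum / gl_MaxLights;     // float / int: warning on NVIDIA, error on ATI\n"
    "    float avg = sum / float(gl_MaxLights); // explicit conversion compiles on both\n"
    "    gl_FragColor = vec4(vec3(avg), 1.0);\n"
    "}\n";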

This topic is closed to new replies.
