jeroenb

Strange GLSL behaviour



My engine should be platform independent, and I use GLSL for the shading functionality. On Windows it works fine, but now I am trying it on Linux, and it doesn't work.

What happens is: for the world I use one shader object (with only a vertex shader, for the light vector calculation etc.) to do bump mapping etc. For my particle engine I use another shader object with a different vertex shader. The code first renders the world and then the particle engines. But in the second loop it keeps using the particle engine's shader object while the world is rendered. When the particle system stops, I disable the shaders with glUseProgramObjectARB(0) and set the world's shader object correctly again (checking it with glGetError) before I render the world the second time.

Btw, I also use vertex buffer objects for faster rendering. If I don't call the glDrawArrays function during rendering of the particles, it does work correctly. But I ensured that I don't write out of the buffer's bounds, so this really is odd.

Has someone seen this behaviour before, or has suggestions how I could solve it? I am working on a GeForce 4 Ti 5200 and it does support the shader objects, etc.
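Roughly, my render loop does this (a simplified sketch, not my actual code; the program handles and VBO names are placeholders, and I'm assuming GLee has resolved the ARB entry points):

#include "GLee.h"

/* Simplified per-frame loop: world pass with one GLSL program object,
 * particle pass with another, then shaders disabled again.
 * worldProg/particleProg are placeholder GLhandleARB program objects
 * compiled and linked at startup; the VBOs hold packed vertex positions. */
void renderFrame(GLhandleARB worldProg, GLuint worldVBO, GLsizei worldCount,
                 GLhandleARB particleProg, GLuint particleVBO, GLsizei particleCount)
{
    glEnableClientState(GL_VERTEX_ARRAY);

    /* World pass */
    glUseProgramObjectARB(worldProg);
    glBindBufferARB(GL_ARRAY_BUFFER_ARB, worldVBO);
    glVertexPointer(3, GL_FLOAT, 0, (const GLvoid *)0);
    glDrawArrays(GL_TRIANGLES, 0, worldCount);

    /* Particle pass */
    glUseProgramObjectARB(particleProg);
    glBindBufferARB(GL_ARRAY_BUFFER_ARB, particleVBO);
    glVertexPointer(3, GL_FLOAT, 0, (const GLvoid *)0);
    glDrawArrays(GL_POINTS, 0, particleCount);

    /* Back to fixed function; the next frame's world pass rebinds worldProg. */
    glUseProgramObjectARB(0);
    glBindBufferARB(GL_ARRAY_BUFFER_ARB, 0);
    glDisableClientState(GL_VERTEX_ARRAY);
}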

Mmm... I noticed that on other computers with Windows but using a GeForce4 it also behaves as described above. Makes me wonder whether there is true support for GLSL on GeForce4 video cards, though glGetString does show that it's supported :-s

EDIT: this is getting very odd. If I change the shader of the particle engine to the same one as the world shader, the world is at least drawn correctly, but then the particles aren't visible :s I have absolutely no idea what is going on here :s

EDIT: I even checked with glValidateProgramARB, glGetHandleARB, etc. to make sure that the correct shader has been bound, and still it uses the wrong one :-(...
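The check I do looks roughly like this (a sketch; "expected" stands in for whatever handle my code thinks it just bound):

#include "GLee.h"
#include <stdio.h>

/* Sanity check: ask the driver which program object is really bound
 * and whether it validates against the current GL state. */
void checkBoundProgram(GLhandleARB expected)
{
    GLint ok = 0;
    GLhandleARB current = glGetHandleARB(GL_PROGRAM_OBJECT_ARB);

    if (current != expected)
        fprintf(stderr, "wrong program bound: got %u, expected %u\n",
                (unsigned)current, (unsigned)expected);

    glValidateProgramARB(current);
    glGetObjectParameterivARB(current, GL_OBJECT_VALIDATE_STATUS_ARB, &ok);
    if (!ok)
        fprintf(stderr, "program %u failed validation\n", (unsigned)current);
}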

[Edited by - jeroenb on May 12, 2005 2:35:34 PM]

But why is it then included in the GL_EXTENSIONS string? And with one shader it actually does work :s

You might be confusing the extensions. There are several "shader" extensions for OpenGL. There are GL_ARB_vertex_program and GL_ARB_fragment_program, but those are not GLSL; they are the assembly-style shader extensions. The ones you need to look for are GL_ARB_shading_language_100, GL_ARB_vertex_shader, GL_ARB_fragment_shader, and GL_ARB_shader_objects.
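For example, a rough capability check along these lines (naive substring match; a strict version would tokenize the space-separated extension list):

#include <GL/gl.h>
#include <string.h>

/* Returns nonzero only if the GLSL extensions (not the assembly
 * GL_ARB_*_program ones) are all advertised by the driver. */
int hasGLSLSupport(void)
{
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    return ext != NULL
        && strstr(ext, "GL_ARB_shading_language_100") != NULL
        && strstr(ext, "GL_ARB_shader_objects") != NULL
        && strstr(ext, "GL_ARB_vertex_shader") != NULL
        && strstr(ext, "GL_ARB_fragment_shader") != NULL;
}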

Besides that, it's hard to tell what's wrong. At the moment you're the person who knows your own code best, and you're the person best able to debug it, so I suggest you do just that.

I also check for GLEE_ARB_shading_language_100, which returns true (it's also listed in the extensions string).
