Fragment Shader problems

Started by
16 comments, last by taby 18 years ago
Place #version 110 at the top of the shader file to explicitly tell the compiler which version of GLSL to compile for (i.e. 1.10).
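For example, a minimal fragment shader with the version directive on its very first line (the body here is just illustrative):

```glsl
#version 110

void main()
{
    // GLSL 1.10 writes the fragment colour to the built-in gl_FragColor.
    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
}
```

Note that the `#version` directive must precede everything except comments and whitespace, or the compiler will reject it.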

As an aside: recursive calls are not defined behaviour in GLSL, and will cause all kinds of trouble.
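A hypothetical sketch of the usual workaround: rewrite the recursion as a loop. In GLSL 1.10 the loop bound must also be a compile-time constant, so a fixed upper limit is used here:

```glsl
#version 110

// Illustrative only: a recursive factorial is not legal GLSL,
// so it is unrolled into a loop with a static bound.
float factorial(float n)
{
    float result = 1.0;
    for (int i = 2; i <= 10; ++i)  // static bound required by GLSL 1.10
    {
        if (float(i) <= n)
            result *= float(i);
    }
    return result;
}
```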
Oh! Thanks, I'll try that when I get home. Is the GLSL version tied to the shader model used, or is that purely a hardware issue? In other words, is it possible that my problem could be solved by compiling to a specific version, or is it already targeting my hardware?

Sorry for all of the questions. :-)
I can't honestly tell you what the behaviour is, though the nvidia developer's site may contain information directly related to this.

However, I suspect that the compiler comes bundled with your driver, so it may very well be hardware-optimized, etc. I've always assumed that this is why shaders are compiled at runtime (for now).
You don't have to compile at run-time, though, do you? Well, not from a fully high-level shading language?
Steven Tovey (SPUify | Twitter)
Pre-compiled shaders won't be supported until the next version of OpenGL, according to the materials gleaned from the GDC 06 conference.
Ahhh, cheers for that :-)
Steven Tovey (SPUify | Twitter)
Can you use the Cg runtime to load shaders written in GLSL but compiled with cgc? I noticed that nVidia uses the cgc compiler to compile GLSL anyway.
If they use the same shader/uniform data input model, I can't see why not.

As far as I recall, the driver I have (77.77) uses the 3DLabs compiler.

This topic is closed to new replies.
