GLSL: some information

Started by
12 comments, last by _the_phantom_ 18 years, 10 months ago
The two fragment interfaces are targeted at PS2.0-level hardware, thus they won't be exposed on pre-DX9 class cards.
Quote:Original post by Promit
Quote:Original post by Hulag
That's the exact reason why NV's compiler is "dumber": Nvidia doesn't stick to the GLSL spec while ATI does.


When one compiler compiles float f = 1 + FloatVar; and one doesn't, I know which one is the better compiler, specs or not.


Sorry, but you are wrong.
A compiler which conforms to the spec is by definition the better compiler; the fact that you don't want to write conformant code doesn't enter into it.
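For context, the disputed line really is invalid under the GLSL 1.10 spec: there are no implicit int-to-float conversions, so an integer literal cannot be added to a float. A conformant compiler rejects it, and the fix is tiny. A sketch (FloatVar is the variable name from the post above):

```glsl
// Rejected by spec-conformant compilers (ATI, 3DLabs et al.):
// GLSL 1.10 defines no implicit int -> float conversion, so
// "1 + FloatVar" mixes an int and a float operand.
float f_bad = 1 + FloatVar;

// Conformant version: use a float literal instead.
float f_good = 1.0 + FloatVar;
```

NV's Cg-derived front end accepts the first form because Cg, like C, performs the implicit promotion; that is exactly the portability trap being discussed here.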
Also, the NV compiler allows you to use Cg functions in GLSL code, which makes the problem even worse.

There's a good reason compilers for all sorts of things add non-standard extensions. Of course, I'm perfectly well aware that NV simply shoved the Cg compiler in and made the necessary changes to have it compile GLSL.

Anyway, that's really, really annoying about the Gf4.
SlimDX | Ventspace Blog | Twitter | Diverse teams make better games. I am currently hiring capable C++ engine developers in Baltimore, MD.
Quote:Original post by Promit
There's a good reason compilers for all sorts of things add non-standard extensions.


Indeed, and I'm not against the idea of extensions; it's just that if you take advantage of them and your code doesn't compile on another compiler, you can hardly call the standard-compliant one 'dumb', particularly when the one which allows the non-standard stuff is in the minority (both ATI and 3DLabs won't allow it, nor will any of the standard-compliant compilers which exist or will exist).

In GLSL all extensions should be off by default. However, the current default state of the NV GLSL compiler (as I understand it) is as though every shader had been prefixed with #extension CompilerAsCg : enable. In reality, if you want this non-conformant behaviour, you should put a line like that at the start of your shader yourself (which will still fail to compile on the ATI and 3DLabs et al. compilers, but that is expected behaviour, since you've been explicit about your shader's needs).
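(The #extension mechanism referred to above is the spec's standard opt-in for non-standard behaviour: the directive must appear before any non-preprocessor code. A sketch, reusing the poster's hypothetical CompilerAsCg name; real extension names look like GL_ARB_draw_buffers:)

```glsl
// "CompilerAsCg" is a hypothetical extension name standing in for
// NV's Cg-isms; it is not a real registered extension.
#extension CompilerAsCg : enable

void main()
{
    // Non-standard code would go here. A conformant compiler that
    // doesn't recognise the extension now warns (or, with "require",
    // errors out) instead of silently accepting the shader.
    gl_FragColor = vec4(1.0);
}
```

Per the spec, `enable` only warns when the named extension is unsupported, while `require` makes compilation fail outright, so `require` is the stricter way to document a shader's dependencies.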

In short, unless you explicitly request otherwise, standard behaviour should be the default for everyone. If it were, the conversation about which compiler 'is the best' wouldn't be occurring and OpenGL would be a happy land. As it stands, OpenGL has a crack down the middle because one vendor hasn't done things right. GLSL was meant to unify shaders across hardware; instead we still have a split, and it's just dumb.

This topic is closed to new replies.
