What's your opinion on glslang winning over Cg at the ARB?

While D3D has had its HLSL out for a year, shading languages in the OpenGL world are still battling it out. Cg was born before D3D's HLSL and has been with us for a long time, but ATI and 3DLabs are pushing another shading language, glslang, as the OpenGL standard, and it is still an embryo. Although Cg has been made open source, ATI says it won't support it. Doesn't all of this blur the future of OpenGL?
The fact is, nVidia being the first to ship a shading language that supports OpenGL does not mean Cg should become part of the OpenGL standard specification. Moreover, Cg targets D3D as well as OGL.
Personally, I prefer a standard, unified language approved by all the members of the Architecture Review Board rather than a language developed by one company only.

With that said, Cg is great. It's just a bit beside the point to integrate it as an official part of the OpenGL standard.
I'm also working with Cg. It's great because its syntax is essentially the same as C/C++.

But since shader programs aren't really that long or complicated, I don't think it will be a problem to move over to GLslang once it's released. Still, I think it will take a long time before they can really call GLslang a "standard": Cg had a long testing and debugging period and is still in constant development.

You shouldn't assume that Cg will disappear once GLslang is released. "Standard" doesn't mean "this is the best, this has to be used, nothing else will work anymore." Professional game developers always try to squeeze as much performance as possible out of every single graphics card and often use vendor-specific extensions anyway (with different code paths). And because NVidia exposes NV_fragment_program(2), which will always be supported by Cg since Cg comes from NVidia, Cg has pretty good chances of surviving (people will perhaps use GLslang on ATI cards).

Cg is great; I haven't seen the GLslang syntax/specification yet. But if GLslang ever becomes as good a language as Cg, it will really be worth a look. Don't be angry just because they didn't make Cg the standard; be curious and look forward to what GLslang will bring.
Cg is rather useless... at least if you don't want to follow marketing hype.

"take a look around" - limp bizkit
www.google.com
If that's not the help you're after then you're going to have to explain the problem better than what you have. - joanusdmentia

My Page davepermen.net | My Music on Bandcamp and on Soundcloud

Well, Cg, HLSL and GLslang are all pretty much the same language (there are some differences, though, I think). Cg doesn't run well on ATI cards (or so I've heard), so you'll probably want to go for HLSL if you're using DX and GLslang if you're using OGL.
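For what it's worth, here is a rough illustration of how close the three languages are. The shader itself is made up for the example (modulate a texture by the interpolated vertex colour) and the sampler name "tex" is just a placeholder; the differences are mostly cosmetic, like float4/tex2D/explicit semantics in Cg versus vec4/texture2D/built-in gl_* variables in GLslang.

/* Cg fragment program (e.g. compiled with the arbfp1 profile) */
static const char *cg_frag =
    "float4 main(float4 col : COLOR0,            \n"
    "            float2 uv  : TEXCOORD0,         \n"
    "            uniform sampler2D tex) : COLOR  \n"
    "{                                           \n"
    "    return col * tex2D(tex, uv);            \n"
    "}                                           \n";

/* GLslang fragment shader doing the same thing */
static const char *glsl_frag =
    "uniform sampler2D tex;                                    \n"
    "void main(void)                                           \n"
    "{                                                         \n"
    "    gl_FragColor = gl_Color                               \n"
    "                 * texture2D(tex, gl_TexCoord[0].st);     \n"
    "}                                                         \n";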

davepermen: Why do you think Cg is useless?
I agree that it's useless when you want to optimize your 3D engine.
Actually, I don't use Cg because I prefer using the extensions "manually"; that way there's more room for customization.

That said, it's really nice to be able to build a model in, e.g., 3DSMax and see in real time how it will look before importing it into your project, thanks to the 3DSMax Cg plugin.
I've read the draft spec of glslang, but apart from the type names and function names I haven't found many differences from Cg. (The type names look a bit like variable names.) The one good point is that all the input/output uniform parameters are built in, so I don't need to declare them myself. So I feel it isn't worth the time to build the same thing all over again; if they made those modifications to Cg instead, progress would be much faster...
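Here is a minimal sketch of that built-in-parameters point, using a made-up trivial transform-and-colour vertex shader stored as a C string for later compilation:

/* Trivial GLslang vertex shader. Nothing is declared by hand: gl_Vertex,
 * gl_Color, gl_FrontColor, gl_Position and gl_ModelViewProjectionMatrix
 * are all built in. The equivalent Cg shader would declare something like
 * "uniform float4x4 modelViewProj" and the application would have to track
 * and set that parameter itself. */
static const char *glsl_vertex_src =
    "void main(void)                                               \n"
    "{                                                             \n"
    "    gl_FrontColor = gl_Color;                                 \n"
    "    gl_Position   = gl_ModelViewProjectionMatrix * gl_Vertex; \n"
    "}                                                             \n";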
Of course, I guess chances are pretty good that the next release of the Cg DLLs will include a glslang profile, in which case Cg would still be my preferred choice: no code changes would be needed.

I realise that Cg and the last generation of ATI cards (8500/9000/9100/9200) don't mix at all in GL, but then again, they're fine in D3D (although you can forget PS 1.4, despite how good it is). The fact that the Cg compiler is open source would suggest some hope, but I somehow doubt it will ever change...

that said,

Being in a position where I'm building both a GL and a D3D renderer for my project, I simply cannot see another option, especially now that I have some other people producing content for it (for a small side project). Their ability, as mentioned, to build the shaders in 3DSMax is extremely useful.

But easily the most useful thing is having that level of API independence.

Overall I'm impressed. Sure, the performance could probably be better if you rewrote your shaders in the more ASM-like shading languages, but that, in my opinion, is a short-term advantage that would only benefit a few users and more than likely disadvantage many.

| - Project-X - On hold (kindof ).. - | - adDeath - | - email me - |
I have been playing with both Cg and GLslang for a few months now and have found them very similar as languages (i.e. C/C++ style). GLslang is more tightly bound to OpenGL, so it is easier to get at the GL state. The unfortunate part about using GLslang right now is that only the compiler source code is available (go to the 3DLabs website to get it), so you have to build your own executable; it is far from complete and I am finding a number of bugs, so it is still in its infancy. The other problem is that the output from the GLslang compiler is in an intermediate form, so you have to write a back end for it to run on a particular GPU. In other words, it is more of a toy to play with right now, until it is sanctioned by the ARB and vendors write back ends for the compiler.

Cg and HLSL have matured in the last 8 months and I find them far more useful for shader development. Since I am using an ATI 8500 I am still stuck in a bit of a hard place, because Cg doesn't support the card 100%, but I have found ways around that. I use the Cg compiler to output ARB_vertex_program format for vertex shaders, which I can load directly into OpenGL. Pixel shader development initially proved a little trickier, since the ATI 8500 does not support ARB_fragment_program, only ATI_fragment_shader, which is very arcane to develop for but very powerful. I was a little miffed about that, since there are a lot of 8500, 9000, 9100 and 9200 cards out there, but with no good OpenGL tools for pixel shader development most developers shy away or spend time building in-house tools.

Using the ATI_fragment_shader path is labour-intensive and does not lend itself to quick development, since the shader is hard-coded. The 8500 and its derivatives are well supported under DirectX, since there is good developer support for pixel shaders there. To resolve this problem I wrote a quick and dirty PS_1_3 reader (Cg doesn't support PS_1_4) that takes pixel shader source meant for DirectX and generates code for ATI_fragment_shader. The reader takes PS_1_3 and tries to optimize for PS_1_4. It's usable for basic pixel shaders but still has a few bugs to work out before it can take full advantage of PS_1_4. I have been looking at the source of the Cg compiler to see how hard it would be to add PS_1_4 output so it could be loaded directly through the reader; it looks feasible, but is it worth it?
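As a rough sketch of that "compile offline, load the assembly directly" vertex path: assume the shader was compiled with something like "cgc -profile arbvp1 -o lighting.vp lighting.cg" (the file names are only illustrative) and that the ARB_vertex_program entry points have already been fetched via wglGetProcAddress/glXGetProcAddress. Error handling is minimal.

#include <GL/gl.h>
#include <GL/glext.h>
#include <string.h>

/* Entry points assumed to be resolved elsewhere. */
extern PFNGLGENPROGRAMSARBPROC    glGenProgramsARB;
extern PFNGLBINDPROGRAMARBPROC    glBindProgramARB;
extern PFNGLPROGRAMSTRINGARBPROC  glProgramStringARB;
extern PFNGLDELETEPROGRAMSARBPROC glDeleteProgramsARB;

GLuint load_arb_vertex_program(const char *src)
{
    GLuint prog;

    glGenProgramsARB(1, &prog);
    glBindProgramARB(GL_VERTEX_PROGRAM_ARB, prog);

    /* Hand the "!!ARBvp1.0 ..." text produced by cgc straight to the driver. */
    glProgramStringARB(GL_VERTEX_PROGRAM_ARB, GL_PROGRAM_FORMAT_ASCII_ARB,
                       (GLsizei)strlen(src), src);

    if (glGetError() != GL_NO_ERROR) {
        /* The offending character index is available via
           GL_PROGRAM_ERROR_POSITION_ARB if you want to report it. */
        glDeleteProgramsARB(1, &prog);
        return 0;
    }
    return prog;
}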

As a side note, it doesn't look good for the 8500 family being supported by GLslang or GL2.0, because its pixel shader doesn't meet the spec laid out by 3DLabs in their white paper. I've emailed ATI about what kind of support the 8500 will get in GL2.0, and the reply was that GL2.0 does not exist yet so they can't say what hardware will support it (lame), but I may be reading between the lines. ATI did not add even limited GL driver support for ARB_fragment_program on the 8500 and never moved ATI_text_fragment_shader over from the Mac OS driver to the Windows driver. ATI_text_fragment_shader works very similarly to ARB_fragment_program and would have been a better alternative than ATI_fragment_shader.
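To give a flavour of why the ATI_fragment_shader route feels so labour-intensive compared with a text-based interface like ARB_fragment_program or ATI_text_fragment_shader, here is a minimal sketch of a hard-coded "texture times primary colour" shader built through the extension's function-call interface. The entry points are assumed to be fetched through the usual extension-loading mechanism and the token spellings follow glext.h; treat it as an illustration rather than production code.

#include <GL/gl.h>
#include <GL/glext.h>

extern PFNGLGENFRAGMENTSHADERSATIPROC  glGenFragmentShadersATI;
extern PFNGLBINDFRAGMENTSHADERATIPROC  glBindFragmentShaderATI;
extern PFNGLBEGINFRAGMENTSHADERATIPROC glBeginFragmentShaderATI;
extern PFNGLENDFRAGMENTSHADERATIPROC   glEndFragmentShaderATI;
extern PFNGLSAMPLEMAPATIPROC           glSampleMapATI;
extern PFNGLCOLORFRAGMENTOP2ATIPROC    glColorFragmentOp2ATI;

/* Every instruction of the pixel shader is a separate GL call, so changing
 * the shader means changing and recompiling C code. */
GLuint build_modulate_shader(void)
{
    GLuint id = glGenFragmentShadersATI(1);

    glBindFragmentShaderATI(id);
    glBeginFragmentShaderATI();

    /* r0 = texture unit 0 sampled with texture coordinate set 0 */
    glSampleMapATI(GL_REG_0_ATI, GL_TEXTURE0_ARB, GL_SWIZZLE_STR_ATI);

    /* r0 = r0 * primary colour */
    glColorFragmentOp2ATI(GL_MUL_ATI,
                          GL_REG_0_ATI, GL_NONE, GL_NONE,
                          GL_REG_0_ATI, GL_NONE, GL_NONE,
                          GL_PRIMARY_COLOR_ARB, GL_NONE, GL_NONE);

    glEndFragmentShaderATI();
    return id;
}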

So what it comes down to in the end is that I will probably stay with Cg for serious development projects for the next year, or until I decide not to support ATI cards below the 9500.
Well, 3 hours after writing that last post, playing with the Cg source code, converting it to C++ under BCB 3.0 and setting up my first assembler definition for PS_1_4, things are looking good. I compiled my first pixel shader from Cg to PS_1_4, used the reader to bring it into OGL, and it worked. I haven't figured out yet how to generate errors if you use functions not supported by the 8500, e.g. sin(), cos(), etc.

Another kudos to NVidia for Cg, even if some people don't like it. Laying out the assembly instructions for the 8500 in Cg is a lot easier than writing the back end for GLslang. I guess I should have started a new topic for this post; oh well.

