Cg vs GLSL
Hi all,
This post is not meant to start an awful flame war, but as I begin to use vertex/fragment shaders the question comes up: GLSL or Cg?
I have been told, though I don't know if it's true, that only nVidia cards really support Cg (i.e. the back end of the Cg compiler targets nVidia only). Moreover, which would be faster, Cg or GLSL (even just on nVidia cards I'd like to know)? I get the _impression_ that Cg might be better on nVidia, but I'm not sure. From people's experience, which is easier to use, Cg or GLSL? Which has fewer "driver-implementation" issues?
Since compiled Cg shaders are standard OpenGL vertex/fragment program code, ATi users shouldn't have a problem using your shaders.
I still use Cg instead of GLSL, since I believe the driver implementation for GLSL isn't good enough yet (I know this has been discussed for a long time).
Best regards,
Roquqkie
From what I've been told by reliable sources, GLSL is not yet well supported by hardware or software (drivers). Perhaps in a year or so the support for GLSL will have matured enough for it to be really beneficial...
Basically, only 3DLabs' Wildcat cards fully support GLSL to date... not even the new GeForce 6x00 cards support it to its full extent.
I've decided to look into Cg until GLSL can be used without worries, and from the little I've seen the two languages look basically the same. However, GLSL's syntax and design are a lot nicer and "sexier", so I'm really looking forward to when it's safe to use...
You can do with GLSL everything you can with Cg. Drivers are not completely mature yet, but there are no problems with the hardware. Cg will work correctly on both nVidia's and ATi's hardware, and so will GLSL.
ATi has very good GLSL support, while nVidia's is very crappy (it fails more than 50% of the 3DLabs GLSL parser tests), because they internally translate GLSL into Cg, which does not always work properly.
I don't want nVidia forcing me to use their proprietary standards, so I'm trying hard to write GLSL shaders that work with their drivers too.
I wish nVidia would simply drop that Cg shit and concentrate on industry standards, but I don't think that will happen soon.
Most of the failed GLSL tests in nVidia's drivers are tests that shouldn't compile but do. This shouldn't stop working shaders from working on nVidia's drivers.
Can you give some sources for your claims that nVidia are translating GLSL to Cg?
Also what is so wrong with Cg? What industry standards is it missing?!
NVIDIA uses the Cg compiler to compile GLSL shaders. So even if one uses NVIDIA's 'half' data type inside a GLSL shader, it will compile and run fine. But take that shader onto a different card and it will fail, because 'half' is not a standard GLSL type.
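As a sketch of the portability trap described above (a minimal fragment shader, not from the original post): the 'half' type below is a Cg type, merely a reserved word in GLSL 1.10, so nVidia's Cg-based compiler accepts it while a strictly conforming compiler must reject it.

```glsl
uniform sampler2D tex;

void main()
{
    // Accepted by nVidia's driver (GLSL goes through the Cg compiler),
    // but 'half4' is not part of GLSL 1.10 -- a conforming compiler
    // (e.g. ATi's) rejects this line.
    half4 c = texture2D(tex, gl_TexCoord[0].xy);

    // Portable version: declare c as vec4 instead of half4.
    gl_FragColor = c;
}
```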
I have been using GLSL with NVIDIA QuadroFX series and seem to have no problems so far. Even if I had any problems I would like to stick to GLSL rather than to shift to Cg because I do not like to be forced by the industry.
I have been using GLSL with NVIDIA QuadroFX series and seem to have no problems so far. Even if I had any problems I would like to stick to GLSL rather than to shift to Cg because I do not like to be forced by the industry.
Quote:Original post by vNistelrooy
Can you give some sources for your claims that nVidia are translating GLSL to Cg?
It's a very well-known fact that NV do this, and technically there is no problem with it. It makes sense: they have a working Cg compiler, so a translation layer is easier to build than another full compiler, and if they enforced strict GLSL compliance this would be a non-issue. But they don't. As hinted at by dimensionX above, they allow the 'half' type in standard GLSL code, which, while being a reserved word, isn't in use in the GLSL 1.10 spec (the GL 2.0 version). They also allow you to use your favourite Cg functions (such as 'lit', which was used in a shader on the codesampler.com website and thus caused it to blow up on ATI hardware). Now, if they want to extend the interface, fair play to them and good luck; I'd like ATI to do the same if they had features they wanted to expose. HOWEVER, the extensions should be optional and should only be active on request (via a glGet command maybe, or an extension you can query for and a function you have to call to enable it), or exposed as a define (the way some people write C/C++: #ifdef NV_CG_EXT .... #endif), not active by default.
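One way to keep such extras opt-in at the source level, as suggested above, is a preprocessor guard (a sketch only: the NV_CG_EXT define is hypothetical, something your app would inject, and NdotL, NdotH, shininess are assumed to be computed earlier in the shader):

```glsl
#ifdef NV_CG_EXT
    // Cg built-in 'lit' -- only works where GLSL is fed through Cg.
    vec4 lighting = lit(NdotL, NdotH, shininess);
#else
    // Portable GLSL equivalent of Cg's lit():
    // (1, max(N.L, 0), N.L > 0 ? max(N.H, 0)^m : 0, 1)
    vec4 lighting = vec4(1.0,
                         max(NdotL, 0.0),
                         NdotL > 0.0 ? pow(max(NdotH, 0.0), shininess) : 0.0,
                         1.0);
#endif
```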
Quote:
Also what is so wrong with Cg? What industry standards is it missing?!
Cg compiles down to ARB_fragment_program/ARB_vertex_program code. This is both its blessing and its curse: while it lets you write code against this interface for both ATI and NV cards (although it doesn't produce the most optimal ATI output), that interface isn't standard and currently looks like it won't ever get an upgrade. So, if the hardware exposes some fancy new functionality, the GLSL compiler will be able to take advantage of it right away; the other interface, however, isn't being expanded, so you are stuck with the current level of support, and even if it were expanded you'd have to go back and integrate a new Cg back end into your program in order to take advantage of it.
(As a side note, I'm a great fan of GLSL, but I can now see a point to expanding this interface for a while into the future.)
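Since Cg's output depends on those ARB extensions being present, it's worth checking for them at runtime before loading programs. A minimal sketch in C (the function name is my own; in a real app you'd pass it the string returned by glGetString(GL_EXTENSIONS)):

```c
#include <string.h>

/* Check whether `name` appears as a whole token in a space-separated
 * OpenGL extension string. A plain strstr() is not enough, because
 * "GL_ARB_vertex_program" would also match inside
 * "GL_ARB_vertex_program2". */
static int has_extension(const char *extensions, const char *name)
{
    size_t len = strlen(name);
    const char *p = extensions;

    while ((p = strstr(p, name)) != NULL) {
        int starts_token = (p == extensions) || (p[-1] == ' ');
        int ends_token   = (p[len] == ' ') || (p[len] == '\0');
        if (starts_token && ends_token)
            return 1;
        p += len; /* partial match; keep scanning */
    }
    return 0;
}
```

Usage: `has_extension((const char *)glGetString(GL_EXTENSIONS), "GL_ARB_vertex_program")` before loading any Cg-compiled program code.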
Cg is a great tool for R&D. I wouldn't use it in your final build, though. It is open source, and you can modify the back end to do whatever you want.
Alright. I have just switched to my new nVidia card, and I haven't done any shaders coding on it. All my shaders were written on an ATi Radeon 9600 pro.
If I'm not mistaken, ARB_fragment_program and ARB_vertex_program were dropped from OpenGL 2.0, but they are still quite standard for the time being. They are well supported and widely used.