A note on Cg

Started by skiritis; 9 comments, last by skiritis 18 years, 9 months ago
I started reading about shaders and Cg recently and I want to get some things clear about pixel shader languages. Well, I own an ATI Radeon 9800 Pro, Cg is developed by NVIDIA, and it seems that programs written with Cg don't run on my ATI card. Is this for real? Is Cg supposed to work on NVIDIA cards only? And what about HLSL and GLSL? If the above really is true, and if I want to write vertex and pixel shading programs that can run on all cards, should I study GLSL and leave Cg? I really need some help here; I'm new to all this. Thanks. --- The big brother is watching you ---
You'll need to select the correct profile to use on ATI cards; AFAIK the only profile that will work is compiling down to ARB vertex/fragment programs.

HLSL is a DX shading language and thus of no use to OpenGL programmers [wink]
GLSL is OpenGL's native high-level shading language, with good support from both NV and ATI. It's the future for shaders (the ARB vertex/fragment program interfaces aren't going to be updated ever again), so unless Cg gains a GLSL-based target it isn't going to be able to use more advanced features on later ATI cards (NV will expose their own extensions, so it's not an issue from that angle).
Can someone post some sample code of how to choose the correct profile for an ATI card?
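For illustration, here's a minimal sketch of selecting the ARB profiles through the Cg OpenGL runtime. The cgGL calls are the standard Cg 1.x API; the shader filename and entry point name are placeholders:

#include <Cg/cg.h>
#include <Cg/cgGL.h>

CGcontext context;
CGprofile vertexProfile;
CGprogram program;

void initCg(void)
{
    context = cgCreateContext();

    /* Ask the runtime for the best profile the current GPU supports.
       On a Radeon 9800 this comes back as the ARB profile
       (CG_PROFILE_ARBVP1 for vertex, CG_PROFILE_ARBFP1 for fragment). */
    vertexProfile = cgGLGetLatestProfile(CG_GL_VERTEX);
    cgGLSetOptimalOptions(vertexProfile);

    /* Or force the ARB vertex profile explicitly:
       vertexProfile = CG_PROFILE_ARBVP1; */

    /* "shader.cg" and "main" are placeholders for your own file/entry. */
    program = cgCreateProgramFromFile(context, CG_SOURCE, "shader.cg",
                                      vertexProfile, "main", NULL);
    cgGLLoadProgram(program);
}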
The only ATI-related bug I've encountered with regard to Cg is in the ARB_vertex_program support on ATI's side; they had some issues with some PARAM syntax...

something like this...

PARAM some_name[] = { {state.matrix.mvp}, {1.0, 1.0, 0.0, 1.0} };

However, in the latest Cg release NVIDIA claims to have fixed some non-NVIDIA bugs, but that might just be in CgFX for all I know.
Quote: Original post by skiritis
I started reading about shaders and Cg recently and I want to get some things clear about pixel shader languages.
Well, I own an ATI Radeon 9800 Pro, Cg is developed by NVIDIA, and it seems that programs written with Cg don't run on my ATI card.
Is this for real? Is Cg supposed to work on NVIDIA cards only?

Nope, Cg works fine on ATI cards. (Of course, there might be some driver/compatibility issues as always, but nothing related to the Cg language.)
Note that SM3 won't be (or is highly unlikely to be) supported on ATI cards through Cg, as Cg works through the asm interface.
If at first you don't succeed, redefine success.
The reason I'm asking to see a sample Cg program that runs fine on ATI cards is that NeHe lesson 47, which introduces Cg, does not work / does not produce any results on my computer. So I thought that maybe there is something in the code that must be changed. I don't know if I'm right, however.
Lesson 47 should work for you. Just be sure to get the Cg Toolkit 1.2. In version 1.3 the sin/cos functions stopped working on some ATI cards.
Quote: Original post by python_regious
Note that SM3 won't be (or is highly unlikely to be) supported on ATI cards through Cg, as Cg works through the asm interface.


But isn't that ASM interface just a virtual machine instruction set, like it is under DirectX? Or does it compile the shaders at compile time? I haven't used Cg (looked at it briefly, but haven't used it), so of course I could be wrong. :)
It just seems to me that the only sensible solution is to use a standard ASM interface and then compile it to native code for your graphics card when the game is run, just like HLSL does.
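For what it's worth, you can see exactly what that asm looks like: the Cg runtime compiles the source to ARB assembly text at run time, and the driver then translates that assembly into native code for the actual hardware. A minimal sketch (again, the filename and entry point are placeholders):

#include <stdio.h>
#include <Cg/cg.h>

void dumpCompiledAsm(CGcontext context)
{
    /* Compile Cg source against the ARB fragment profile. */
    CGprogram program = cgCreateProgramFromFile(context, CG_SOURCE,
        "shader.cg", CG_PROFILE_ARBFP1, "main", NULL);

    /* CG_COMPILED_PROGRAM returns the generated ARB_fragment_program
       assembly, i.e. the portable "virtual machine" text that the
       driver consumes and recompiles for the GPU. */
    printf("%s\n", cgGetProgramString(program, CG_COMPILED_PROGRAM));
}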
The truth is I downloaded Cg 1.4.
I'll try with 1.2 and see.

