skiritis

A note on Cg


skiritis    144
I started reading about shaders and Cg recently and I want to get some things clear about pixel shader languages. I own an ATI Radeon 9800 Pro, Cg is developed by nVidia, and it seems that programs written with Cg don't run on my ATI card. Is this for real? Is Cg supposed to work on nVidia cards only? And what about HLSL and GLSL? If the above really is true, and if I want to write vertex and pixel shading programs that can run on all cards, should I study GLSL and leave Cg? I'm new to all this and really need some help. Thanks. --- The big brother is watching you ---

_the_phantom_    11250
You'll need to select the correct profile to use on ATI cards; AFAIK the only profiles that will work are the ones that compile down to ARB vertex/fragment programs.

HLSL is a DirectX shading language and thus no use to OpenGL programmers [wink]
GLSL is OpenGL's native high-level shading language, with good support from both NV and ATI. It's the future for shaders (the ARB vertex/fragment program interfaces aren't ever going to be updated), so unless Cg gains a GLSL-based target it won't be able to use more advanced features on later ATI cards (NV will expose their own extensions, so it's not an issue from that angle).
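To make the profile point concrete, here's a minimal sketch of forcing the ARB profiles through the Cg GL runtime. This is not a complete program (it assumes the Cg toolkit headers and an already-created GL context), and "shader.cg" and the "main" entry point are placeholder names:

```c
/* Sketch: target the ARB profiles explicitly so the compiled output
 * loads on ATI cards.  Requires the Cg toolkit and a current GL context;
 * "shader.cg" and "main" are placeholders. */
#include <Cg/cg.h>
#include <Cg/cgGL.h>

CGcontext context;
CGprogram program;

void init_cg(void)
{
    context = cgCreateContext();

    /* Pick CG_PROFILE_ARBFP1 (ARB_fragment_program) directly instead of
     * calling cgGLGetLatestProfile(), which on an NV card can return an
     * NV-only profile (fp30 etc.) that ATI drivers cannot load. */
    program = cgCreateProgramFromFile(context, CG_SOURCE, "shader.cg",
                                      CG_PROFILE_ARBFP1, "main", NULL);
    cgGLLoadProgram(program);
}

void draw(void)
{
    cgGLEnableProfile(CG_PROFILE_ARBFP1);
    cgGLBindProgram(program);
    /* ... issue geometry here ... */
    cgGLDisableProfile(CG_PROFILE_ARBFP1);
}
```

The same idea applies to vertex shaders with CG_PROFILE_ARBVP1.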

kusma    170
The only ATI-related bug I've encountered with regard to Cg is in the ARB_vertex_program support on ATI's side. They had some issues with some PARAM syntax...

something like this...

PARAM some_name[] = { {state.matrix.mvp}, {1.0, 1.0, 0.0, 1.0} };
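For context, a declaration like that would sit inside an ARB_vertex_program along these lines. This is just an illustrative sketch (the array name and instruction choices are made up); the point is that the matrix rows and the literal constant share one parameter array:

```
!!ARBvp1.0
# Parameter array mixing tracked GL state (the MVP matrix expands to
# rows 0-3) with a literal constant vector in element 4.
PARAM some_name[5] = { state.matrix.mvp, {1.0, 1.0, 0.0, 1.0} };
ATTRIB pos = vertex.position;
# Transform the vertex by the tracked modelview-projection matrix.
DP4 result.position.x, some_name[0], pos;
DP4 result.position.y, some_name[1], pos;
DP4 result.position.z, some_name[2], pos;
DP4 result.position.w, some_name[3], pos;
# Use the literal element as a flat color.
MOV result.color, some_name[4];
END
```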

However, in the latest Cg release nVidia claims to have fixed some non-nVidia bugs, but that might just be in CgFX for all I know.

Spoonbender    1258
Quote:
Original post by skiritis
I started reading about shaders and Cg recently and I want to get some things clear about pixel shader languages.
Well, I own an ATI Radeon 9800 Pro, Cg is developed by nVidia, and it seems that programs written with Cg don't run on my ATI card.
Is this for real? Is Cg supposed to work on nVidia cards only?

Nope, Cg works fine on ATI cards. (Of course, there might be some driver/compatibility issues as always, but nothing related to the Cg language itself.)

skiritis    144
The reason I'm asking to see a sample Cg program that runs fine on ATI cards is that NeHe lesson 47, which introduces Cg, doesn't work / doesn't produce any results on my computer. So I thought that maybe there is something in the code that must be changed. I don't know if I'm right, however.

skow    248
Lesson 47 should work for you. Just be sure to get Cg Toolkit 1.2. In version 1.3 the sin/cos functions stopped working on some ATI cards.

Spoonbender    1258
Quote:
Original post by python_regious
Note that SM3 won't be (or is highly unlikely to be) supported on ATI cards through Cg, as Cg works through the asm interface.


But isn't that ASM interface just a virtual machine instruction set, like it is under DirectX? Or does it compile the shaders at compile time? I've only looked at Cg briefly and haven't actually used it, so of course I could be wrong. :)
It just seems to me that the only sensible solution is to use a standard ASM interface and then compile it to native code for your graphics card when the game is run, just like HLSL does.

