Archived

This topic is now archived and is closed to further replies.

FreJa

Why would I want to use Cg?


Recommended Posts

Guest Anonymous Poster
eye candy.
http://developer.nvidia.com/view.asp?IO=cg_testimonials

Cg lets you develop vertex and fragment programs in a high-level shading language that can be used with either OpenGL or DirectX. The benefit of a high-level language is that it's much easier to write and debug your code in a C-like syntax than in assembly.

You get nice features such as functions, a preprocessor, for/while loops, if statements, and a built-in standard library with many common math, geometry, and texture functions.
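
For a flavor of the syntax, here is a minimal Cg vertex program (just a sketch; the entry point and the modelViewProj parameter name are illustrative):

    // Minimal Cg vertex program: transform the vertex into clip space
    // and pass the vertex color through unchanged.
    void main(float4 position : POSITION,
              float4 color    : COLOR,
              out float4 oPosition : POSITION,
              out float4 oColor    : COLOR,
              uniform float4x4 modelViewProj)
    {
        // mul() comes from the built-in standard library
        oPosition = mul(modelViewProj, position);
        oColor    = color;
    }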

At this time, no reason at all. Stick with the asm approach, at least for now. The Cg compiler still lacks things that asm allows, and that goes for both MS HLSL and Cg. I would use glslang once it's released instead of Cg, but that's just me.

I want to make some per-pixel lighting effects and bump mapping. Is Cg the best option?
How is Cg being accepted by the game industry?

[edited by - FreJa on April 22, 2003 6:15:59 AM]

No.

Per-pixel lighting and bump mapping are doable on hardware that Cg does not support: GeForce 1, GeForce 2, Radeons before the 9500, etc.

Cg is used in the gaming industry by companies that let their souls get bought by NVIDIA and add the funny "NVIDIA, the way it's meant to be played" logo in front of their games.

"take a look around" - limp bizkit
www.google.com

quote:
Original post by davepermen
No.
Per-pixel lighting and bump mapping are doable on hardware that Cg does not support: GeForce 1, GeForce 2, Radeons before the 9500, etc.


Should I not have cool effects in my game because someone has a crappy $10 GeForce 1 or 2?

quote:
Original post by davepermen
Cg is used in the gaming industry by companies that let their souls get bought by NVIDIA and add the funny "NVIDIA, the way it's meant to be played" logo in front of their games.


What else do you suggest we do? Code in assembly? Sell our souls to ATI?
OK, you're right about the logo, though.

Cg is NVIDIA's take on how pixel shaders should and can be used. That's fine, except for one thing: DX9 now does it all. Using the DX9 FX shader script (essentially the same as NVIDIA's), there is no need to use Cg anymore in any case.
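
To illustrate, here is a minimal sketch of what such an .fx effect looks like (all names are hypothetical; the shader body is plain HLSL, which is essentially the same syntax as Cg, and the technique/pass block is what the effect framework adds):

    // Hypothetical minimal DX9 .fx effect: one technique, one pass.
    float4x4 WorldViewProj;   // set by the application each frame

    struct VS_OUT {
        float4 pos   : POSITION;
        float4 color : COLOR0;
    };

    VS_OUT DiffuseVS(float4 pos : POSITION, float4 color : COLOR0)
    {
        VS_OUT o;
        o.pos   = mul(pos, WorldViewProj);  // D3D convention: row vector * matrix
        o.color = color;
        return o;
    }

    technique Diffuse
    {
        pass P0
        {
            VertexShader = compile vs_1_1 DiffuseVS();
        }
    }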

My advice: get the DX9 SDK, and be happy :-)

Also, IMHO, people shouldn't get too attached to hand-writing fancy pixel shaders anymore. We use the FX shader tool for 3DS Max, which pretty much (though not entirely) lets the artist design the look of the shader, and that is a great thing, IMHO. We load in the FX shader afterwards, and the artists are very happy with the consistent results :-)

Eventually we hope to pipe pretty much all resource development through Max and let the designers and artists be in control of content, with the tech side just loading in resources (shaders included).



Why? And you people with your obsession with money and crap... NVIDIA is a good company, just like ATI. To get to the point: use Cg so you don't have to worry about extensions in OpenGL.
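
That point is worth illustrating: the Cg GL runtime picks the best profile the hardware supports and hides the raw extension calls. A minimal sketch in C (error handling omitted; the file name and entry point are hypothetical):

    #include <Cg/cg.h>
    #include <Cg/cgGL.h>

    static CGcontext ctx;
    static CGprogram prog;
    static CGprofile profile;

    void initCg(void)
    {
        ctx = cgCreateContext();
        /* ask the runtime for the best vertex profile this GPU supports */
        profile = cgGLGetLatestProfile(CG_GL_VERTEX);
        cgGLSetOptimalOptions(profile);
        /* "simple.cg" and the "main" entry point are hypothetical */
        prog = cgCreateProgramFromFile(ctx, CG_SOURCE, "simple.cg",
                                       profile, "main", NULL);
        cgGLLoadProgram(prog);
    }

    void drawWithCg(void)
    {
        cgGLEnableProfile(profile);
        cgGLBindProgram(prog);
        /* ... issue geometry here ... */
        cgGLDisableProfile(profile);
    }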

