Why would I want to use Cg?

Started by
18 comments, last by FreJa 20 years, 12 months ago
quote:Original post by AxoDosS
Should I not have cool effects in my game because someone has a crappy 10$ gf1, gf2?

if you use cg, yes. those cards aren't supported by it, so no per-pixel lighting on them.

quote:What else do You suggest that we do? Code in assembly? Sell our souls to ATI?
Ok, you're right about the logo

code in assembly, or with cg, or whatever. but don't rely on cg. and definitely don't sell your soul to anyone.

it's annoying enough seeing game companies plan to support only one hw vendor and drop the others (ea - nvidia). that is primitive, and not workable.

for gf1/gf2, there is no asm to code for. you have to use fixed function (register combiners, that is).
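roughly, a one-stage combiner setup for a per-pixel diffuse (n dot l) term might look like the sketch below. just a sketch: it assumes a tangent-space normal map bound to texture unit 0, a normalization cubemap holding the light vector bound to unit 1, and that the NV_register_combiners entry points have already been fetched through your extension loader. alpha and specular are left out.

#include <GL/gl.h>
#include <GL/glext.h>   /* NV_register_combiners enums */

/* one general combiner stage: spare0.rgb = expand(tex0) . expand(tex1)  (N . L) */
void setupPerPixelDiffuseCombiners(void)
{
    glEnable(GL_REGISTER_COMBINERS_NV);
    glCombinerParameteriNV(GL_NUM_GENERAL_COMBINERS_NV, 1);

    /* A = normal map, B = normalized light vector, both expanded from [0,1] to [-1,1] */
    glCombinerInputNV(GL_COMBINER0_NV, GL_RGB, GL_VARIABLE_A_NV,
                      GL_TEXTURE0_ARB, GL_EXPAND_NORMAL_NV, GL_RGB);
    glCombinerInputNV(GL_COMBINER0_NV, GL_RGB, GL_VARIABLE_B_NV,
                      GL_TEXTURE1_ARB, GL_EXPAND_NORMAL_NV, GL_RGB);

    /* AB output is a dot product, written to spare0 */
    glCombinerOutputNV(GL_COMBINER0_NV, GL_RGB,
                       GL_SPARE0_NV, GL_DISCARD_NV, GL_DISCARD_NV,
                       GL_NONE, GL_NONE,
                       GL_TRUE /* A.B as dot product */, GL_FALSE, GL_FALSE);

    /* final combiner computes A*B + (1-A)*C + D: with C and D zeroed, this is
       spare0 (the N.L term) times the primary (diffuse) color */
    glFinalCombinerInputNV(GL_VARIABLE_A_NV, GL_SPARE0_NV,
                           GL_UNSIGNED_IDENTITY_NV, GL_RGB);
    glFinalCombinerInputNV(GL_VARIABLE_B_NV, GL_PRIMARY_COLOR_NV,
                           GL_UNSIGNED_IDENTITY_NV, GL_RGB);
    glFinalCombinerInputNV(GL_VARIABLE_C_NV, GL_ZERO,
                           GL_UNSIGNED_IDENTITY_NV, GL_RGB);
    glFinalCombinerInputNV(GL_VARIABLE_D_NV, GL_ZERO,
                           GL_UNSIGNED_IDENTITY_NV, GL_RGB);
}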

it all depends on what you wanna do.


but the main title is "why would i want to use cg?", and actually, i have no reason. i haven't found any real use for it yet.

"take a look around" - limp bizkit
www.google.com
If that's not the help you're after then you're going to have to explain the problem better than what you have. - joanusdmentia

My Page davepermen.net | My Music on Bandcamp and on Soundcloud

quote:Original post by digitec devil
why? and you people with your obsession with money and crap, nVidia is a good company just like ATI is. To get to the point, use cg so you don't have to worry about extensions in OpenGL.


except that you have to drop hw support for pixel shading on gf1, gf2, radeon 8500, radeon 9000, ... radeon 9200. all of them capable of great pixel shading effects, especially all the named radeons, which have bigger capabilities than every geforce except the gfFX.

and nobody shall ever buy a gfFX anyways :D (check all the tests, benchmarks, etc.).

cg does not free you from worrying about extensions in opengl. that's what it's officially supposed to do, but it's really meant to make people more nvidia dependent.

of course, you can deny all this and say it's not true. reality shows the other side.. :D

anyways, if you need complex high-level shading now, get good hw and use cg.

i just haven't seen much real use for it yet.

"take a look around" - limp bizkit
www.google.com
If that's not the help you're after then you're going to have to explain the problem better than what you have. - joanusdmentia

My Page davepermen.net | My Music on Bandcamp and on Soundcloud

Thanks...

Just one more thing: does anyone have any code that shows per-pixel lighting using the register combiners?


Thanks again...
"Through me the road to the city of desolation,Through me the road to sorrows diuturnal,Through me the road among the lost creation."
The gfFX isn't out yet, so we'll let time tell on that one. He asked why use Cg, and that's one of the valid answers. And settle down with this "sell your soul" crap; at the heart of every corporation (ATI, nVidia, Microsoft, Sun, Oracle) is someone who just wants your money. They couldn't give a crap, so don't act like nVidia is an exception.
quote:Original post by davepermen
especially all the named radeons, which have bigger capabilities than every geforce except the gfFX.

you honestly believe that the radeon 8500 is a better card than my Geforce 4 Ti 4600?

quote:Original post by davepermen
and nobody shall ever buy a gfFX anyways :D (check all the tests, benchmarks, etc.).

This I do agree with. Nvidia, I think, has essentially destroyed themselves with the GFFX. For a card that is supposed to be 6 months more advanced technologically than the 9700, it really has shown no benefit over it. Plus ATI has the 9800 coming out fairly soon. I think Nvidia's time is over.
Not to steer away from the subject too much or anything, but I'd say to give NVidia some time.

I think they took a big risk developing the new chip architecture when they did, and it is hurting them, but it also guaranteed their place in the next generation of cards. ATI will have to do the same research and development and will probably have a card that isn't up to par, just like the FX. From a business perspective, I'd say ATI chose to hold off on that development, let NVidia take the initial risk, and produce a good card and make money in the short term. Long term, however, they'll still need to move to the smaller chip architecture, and it might affect sales later on. I'll be interested in seeing what both companies' cards look like later this year.

As for the shaders, this isn't something I've looked at enough to really form an opinion on yet, and it's probably way above my head at this point anyway. I do remember reading/hearing that there wasn't a real shader standard; each manufacturer was developing something different. Since DirectX 9 has something built in, and it's independent of any specific manufacturer, I'd suggest it may become a standard. Just a guess anyway.

Dotren
Well, I don't know how else you people are writing your GL_ARB_vertex_program shaders, but I find Cg really easy for using hw shaders this way.. I like how they supported OpenGL, not just DX8/9. In any case it sure beats messing with ASM shaders, but when glslang comes out I wonder if Cg will be used at all..
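For what it's worth, here is a rough sketch of driving a trivial Cg vertex program through the CgGL runtime. The shader source, the "main_vp" entry point and the "modelViewProj" parameter are just placeholder names, and error checking is left out.

#include <Cg/cg.h>
#include <Cg/cgGL.h>

static const char *vpSource =
    "void main_vp(float4 pos : POSITION, float4 col : COLOR,           \n"
    "             out float4 oPos : POSITION, out float4 oCol : COLOR, \n"
    "             uniform float4x4 modelViewProj)                      \n"
    "{                                                                 \n"
    "    oPos = mul(modelViewProj, pos);                               \n"
    "    oCol = col;                                                   \n"
    "}                                                                 \n";

static CGcontext   ctx;
static CGprogram   vp;
static CGprofile   vpProfile;
static CGparameter mvp;

void initCg(void)
{
    ctx = cgCreateContext();

    /* pick the best vertex profile the driver exposes (e.g. arbvp1 on non-nvidia hw) */
    vpProfile = cgGLGetLatestProfile(CG_GL_VERTEX);
    cgGLSetOptimalOptions(vpProfile);

    vp = cgCreateProgram(ctx, CG_SOURCE, vpSource, vpProfile, "main_vp", NULL);
    cgGLLoadProgram(vp);
    mvp = cgGetNamedParameter(vp, "modelViewProj");
}

void drawWithCg(void)
{
    cgGLEnableProfile(vpProfile);
    cgGLBindProgram(vp);

    /* track the current GL modelview-projection matrix into the shader */
    cgGLSetStateMatrixParameter(mvp, CG_GL_MODELVIEW_PROJECTION_MATRIX,
                                CG_GL_MATRIX_IDENTITY);

    /* ... submit geometry here ... */

    cgGLDisableProfile(vpProfile);
}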
this place is lame
What is this glSlang anyway?
"Through me the road to the city of desolation,Through me the road to sorrows diuturnal,Through me the road among the lost creation."
glslang is the OpenGL shading language. there's a draft of the spec here: http://www.opengl.org/developers/documentation/gl2_workgroup/
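judging by the draft, it is expected to show up through ARB extension entry points first before moving into the core. a very rough sketch of compiling a trivial fragment shader through that ARB_shader_objects-style interface might look like the following (names are provisional, the shader is a placeholder, and the entry points still have to come from your extension loader):

#include <GL/gl.h>
#include <GL/glext.h>   /* ARB_shader_objects / ARB_fragment_shader enums */

static const char *fsSource =
    "void main()                                  \n"
    "{                                            \n"
    "    gl_FragColor = vec4(1.0, 0.5, 0.0, 1.0); \n"
    "}                                            \n";

GLhandleARB createTrivialGlslangProgram(void)
{
    /* compile the fragment shader from source */
    GLhandleARB fs = glCreateShaderObjectARB(GL_FRAGMENT_SHADER_ARB);
    glShaderSourceARB(fs, 1, &fsSource, NULL);
    glCompileShaderARB(fs);

    /* link it into a program object and make it current */
    GLhandleARB prog = glCreateProgramObjectARB();
    glAttachObjectARB(prog, fs);
    glLinkProgramARB(prog);
    glUseProgramObjectARB(prog);
    return prog;
}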
so... it is going to be something like an opengl extension?
"Through me the road to the city of desolation,Through me the road to sorrows diuturnal,Through me the road among the lost creation."

