benjamin bunny

ATi's GLSL implementation


I thought it might be interesting to discuss the limitations of ATi's GLSL compiler. The 4.1 driver is a lot more stable than the 3.10, but there are still some limitations and bugs. Aside from the well documented ones, I've discovered two bugs:

  • When using VBO combined with GLSL and arbitrary attributes, the program crashes.
  • Non-constant int values cannot be used as array indices - contrary to the spec, which specifically says they can.

I worked around the first bug by using texture coordinates instead of arbitrary attributes. The second bug is more serious; the workaround in this case was to abandon GLSL altogether and use Cg, since vertex skinning can't be done without proper indexing.

The lack of loop constructs is also annoying. I understand the hardware may not be able to cope with arbitrary loops, but it could at least unroll simple loops for you.

GLSL on ATi is a nice idea, but it's not yet usable. Personally I don't see any reason to switch from Cg yet.

[edited by - benjamin bunny on February 8, 2004 3:54:23 AM]
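For anyone unfamiliar with the second bug, the construct it breaks looks roughly like this. This is a minimal matrix-palette skinning sketch, not benjamin bunny's actual code; the name boneMatrices, the palette size of 24, and the choice to pack indices and weights into texture coordinate sets (the workaround for the first bug) are made up for illustration:

    // Hypothetical GLSL vertex shader for matrix-palette skinning (names made up).
    // Bone indices and weights arrive in texture coordinate sets rather than
    // user-defined attributes, as a workaround for the VBO/attribute crash.

    uniform mat4 boneMatrices[24];          // bone palette; size chosen arbitrarily

    void main()
    {
        vec4 indices = gl_MultiTexCoord1;   // four bone indices stored as floats
        vec4 weights = gl_MultiTexCoord2;   // four matching blend weights

        vec4 skinned = vec4(0.0);

        // A simple fixed-count loop like this is also the sort of thing the
        // compiler could reasonably unroll for you.
        for (int i = 0; i < 4; ++i)
        {
            // Non-constant array index: legal per the GLSL spec, but this is
            // the construct the 4.1 compiler reportedly rejects.
            int bone = int(indices[i]);
            skinned += weights[i] * (boneMatrices[bone] * gl_Vertex);
        }

        gl_Position = gl_ModelViewProjectionMatrix * skinned;
    }

On the application side, the indices and weights would then be fed through glClientActiveTexture/glTexCoordPointer on the matching texture units instead of through generic vertex attribute arrays, which matches the crash reported above when VBOs and arbitrary attributes are combined.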

Have you seen the cloth demo, which was written using GLSL?

I have an ATi, and the demo looks really good.

I have not looked at the code for the demo.

BTW, if you have an ATi card, have you checked out RenderMonkey?

Wouldn't it be nice to see RenderMonkey support for GLSL?


[edited by - WilyCoder on February 7, 2004 10:34:29 PM]

Ben - do you think that ATI will treat GLSL like its baby in order to have 'better' appeal for a shading language, by saying they are fully GLSL compliant?

I just see this as being a potential war between nVidia (Cg) and ATI - who can get the most out of their language - while nVidia does a 'good enough' job of implementing GLSL because they have Cg.

I dunno... just random thoughts I guess... maybe it's not going to be a problem.

If nvidia took on GLSL properly, with their experience at developing compilers, I think it would force ATi to fix their implementation. They claim to "fully support" the language, so I guess we'll have to wait and see what they come up with.

Not just you. After reading the specs I thought about giving it a try once the basic stuff is set up, but it seems like at this point it doesn't really matter if ATI's driver supports it.

BTW, will ATI ever give out _useful_ info about new drivers? Every time I check the "fixed" section I find tons of useless stuff like "dreadlocks of character x in game y are rendered correctly now". So what? How about mentioning what _caused_ the bug and what exactly they fixed? I'd rather read "fixed behaviour of function x where...", because it just feels like they spend all day adding hacks and workarounds to make individual games work right.

Or did I miss some special "developer info" where one might someday read "fixed array indices in GLSL where ints couldn't be used"?

ATI are only two driver revisions into their GLSL support, and IMO doing it this way - putting it into the drivers for people to find the bugs in - is a good approach, as people can find things they might have missed.

I assume you have contacted ATI re your findings, Benjamin bunny? Not just via the feedback page, but via email to devrel@ati.com? (I've found them responsive to my questions in the past - 24h turnaround on my last two.)

quote:
Original post by _the_phantom_
ATI are only two driver revisions into their GLSL support, and IMO doing it this way - putting it into the drivers for people to find the bugs in - is a good approach, as people can find things they might have missed.

Well, it's been almost a year since 3DLabs released their compiler. I'd think at least one of the major consumer hardware companies should have a stable driver by now. I agree that letting us get our hands on the beta drivers is useful, but I'd like to see more progress.

quote:

I assume you have contacted ATI re your findings, Benjamin bunny? Not just via the feedback page, but via email to devrel@ati.com? (I've found them responsive to my questions in the past - 24h turnaround on my last two.)

Yep. I contacted devrel about both issues. No response yet.

Yeah, a stable driver would be nice, but then OpenGL has been around for years and both ATI and Nvidia have issues with it :|
Still, I gather from the email I got from devrel that they are putting a lot of effort into it, so it's good to see they are committed. Just as you say, though, it would certainly be nice to have those bugs worked out.

Hmm, out of interest, when was it officially picked as the high-level shading language for OpenGL?

IIRC, when the ARB approved the OpenGL 2 spec sometime last year. I guess it became official when the ARB extensions got approved with the release of OpenGL 1.5. I know there was some debate about whether or not to use Cg (nvidia's suggestion), but that was voted against in one of the ARB meetings last year (not sure which one). They also had a poll on the openGL.org web page, and again Cg lost out to GLSL.
