GLSL on an ATI 9600...

Started by Prozak
9 comments, last by Prozak 20 years, 1 month ago
Tried to run Humus's Portals demo, which uses GLSL (the OpenGL Shading Language), on my ATI 9600 XT, but got an error saying these extensions were not present:

GL_ARB_shader_objects
GL_ARB_vertex_shader
GL_ARB_fragment_shader
GL_ARB_shading_language_100
GL_ARB_occlusion_query

My own engine captures a list of the extensions, and sure enough these weren't present... so, does my board not support it, or is it something I have to install (a GLSL package maybe, or better drivers from ATI)?

Also, what are you guys using for shading (vertex and fragment)? I'm having problems implementing either Cg, which is lame on ATI cards, or GLSL... Thanks for any input on this.

Salsa cooked it, your eyes eat it! [Hugo Ferreira][Positronic Dreams][Colibri 3D Engine][Entropy HL2 MOD][My DevDiary]
[Yann L.][Enginuity] [Penny Arcade] [MSDN][VS RoadMap][Humus][BSPs][UGP][NeHe]
Prozak - The GameDever formally known as pentium3id
I'm not sure, but I thought that GLSL is available only on the R9500 and up, which basically excludes the R9600.

If I'm wrong, then download the newest Catalyst drivers from ATI's web site.
Ok, got the newest drivers from ATI, and GLSL and all the previous extensions seem to be available now.

I installed the "vanilla" drivers that came on the CD with the board, so I guess that was one of the problems...

Yeah, I'm not sure about your card.
I've got a GeForce FX 5950 and it doesn't support any of the OpenGL 2.0 extensions.
My Radeon 9700 didn't either until I updated to the latest Catalyst drivers that just came out.
I'd use glview to test your card.



If God played dice, He''d win.
—Ian Stewart, Does God Play Dice? The Mathematics of Chaos
Any of ATI's DX9 parts will support GLSLang; as you found out, you just need more recent drivers (anything 3.10 and up has the extensions exposed). This is the reason I keep up with driver releases.
All ATI DirectX 9 cards (9500 and up) support OGSL. nVidia's cards don't support OGSL yet.


"C lets you shoot yourself in the foot rather easily. C++ allows you to reuse the bullet!"
Ok, so:
GLSL - not supported by nVidia cards yet
Cg - partially supported on ATI cards, but you lose a lot of the advanced features
HLSL - not an OpenGL option...

Isn't there a tool that will compile vertex and fragment shaders into assembly code to be used under different profiles, where you select a profile for a certain hardware configuration? Pretty much equal to Cg, except that Cg doesn't expose its generated assembly code...

Someone at SourceForge must be working on this...

quote:Original post by Prozak
Ok so:
GLSL - not supported by nvidia cards yet
Cg - partially supported under ATI cards, but you lose a lot of the advanced features
HLSL - not an OpenGL option...

There is, apparently, a way of accessing GLSL through nvidia drivers by changing registry values, although my understanding is that they're at a much earlier stage than ATi's implementation.

With Cg, you can use it with ATi as long as you compile to use the ARB_FP/ARB_VP profiles. AFAIK, the only "advanced feature" you can't use is halfs (16-bit floats), which are treated as full floats (24 bits on ATi) with ARB_VP/FP.

quote:
isn't there a tool that will compile vertex and fragment shaders into assembly code to be used under different profiles, and you select a profile for a certain hardware configuration? pretty much all equal to Cg, besides the fact that Cg doesn't expose its generated assembly code...

I don't think you could emulate GLSL entirely this way, because you'd lose the ability to connect chains of shaders together. Having said that, I'm not entirely sure how this works internally, so I could be wrong.

Oh, and BTW, Cg does expose its generated assembly code. You just have to use the Cg compiler (bin\cgc.exe) rather than the runtime library. In my project I'm using Cg at the build stage to compile the shaders into ARB_FP/ARB_VP programs; this is a pain as far as named parameters are concerned (I had to write my own shader framework to handle this), but it means the shaders are already compiled at runtime, and I don't need the extra lib.

[edited by - benjamin bunny on March 18, 2004 3:11:44 PM]

____________________________________________________________www.elf-stone.com | Automated GL Extension Loading: GLee 5.00 for Win32 and Linux

[HKEY_LOCAL_MACHINE\SOFTWARE\NVIDIA Corporation\Global\OpenGL\Debug]
"ShaderObjects"=dword:00000001


"... we should have such an empire for liberty as she has never surveyed since the creation ..."
Thomas Jefferson
quote:Original post by benjamin bunny With Cg, you can use it with ATi as long as you compile to use the ARB_FP/ARB_VP profiles. AFAIK, the only "advanced feature" you can't use is halfs (16-bit floats), which are treated as full floats (24 bits on ATi) with ARB_VP/FP.
[edited by - benjamin bunny on March 18, 2004 3:11:44 PM]


No for loops with arbfp1


"C lets you shoot yourself in the foot rather easily. C++ allows you to reuse the bullet!"

This topic is closed to new replies.
