3dmodelerguy

OpenGL shader help


I am using RenderMonkey to create my GLSL shaders, but I have some questions. The game engine I am working on is going to support both DirectX and OpenGL, and I don't want to write two shaders that do the same thing, one in GLSL and one in HLSL. (1) So my first question is: is there a way to get the asm version of the shader out of RenderMonkey, and if not, is there a way to turn the GLSL shader code into asm shader code? (2) My second question is: how do I use the asm version of the shader in OpenGL? Any link to content about this would be great.

Long story short, no.

GLSL goes in and graphics-card-specific microcode comes out, which right now you don't have any way to get at (well, I think NV might let you get at it for debug purposes, but it's not to be relied upon).

What you really want, if you want to do an OGL/D3D shader-based game/engine/flying machine, is to use Cg, as it can be made to compile for both OpenGL and D3D.

HOWEVER, it does come with one minor issue. AFAIK Cg on OpenGL can only target the ARB vertex/fragment program extensions and NV's own extensions. The vp/fp extensions never made it to core and aren't going to be updated, so they are basically dead (and with GLSL around that's not a great loss, IMO), but this does lead to problems with Cg, as it means the Cg code can't be compiled to anything more than the vp/fp interface allows.

Now, I hope for your sake that NV updates it with a GLSL target, if they haven't already; if not, you might not have much choice but to write the shaders twice if you need functionality beyond the vp/fp interface.

The alternative is to make your own shader-building backend, which can build GLSL/HLSL from code fragments given to it; however, that is a big job, so Cg is probably your best bet.
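To make the Cg route concrete, here is a minimal sketch of loading a Cg fragment shader on the OpenGL side through the Cg runtime (cg.h/cgGL.h). The file name "shader.cg" and the entry point "main" are just placeholders for your own shader; the same .cg source can also be compiled against a D3D profile with the cgc command-line compiler (e.g. cgc -profile ps_2_0 shader.cg).

```cpp
#include <Cg/cg.h>
#include <Cg/cgGL.h>
#include <cstdio>

// Hypothetical file name; substitute your own shader.
static const char* kShaderFile = "shader.cg";

CGprogram loadCgFragmentProgram(CGcontext context)
{
    // Ask the runtime for the best fragment profile the driver exposes.
    // On ATI hardware under OpenGL this will be the ARB fragment program
    // profile (arbfp1); on NV it may be one of the NV-specific profiles.
    CGprofile profile = cgGLGetLatestProfile(CG_GL_FRAGMENT);
    cgGLSetOptimalOptions(profile);

    CGprogram program = cgCreateProgramFromFile(
        context, CG_SOURCE, kShaderFile, profile, "main", NULL);
    if (!program) {
        fprintf(stderr, "Cg compile error:\n%s\n", cgGetLastListing(context));
        return NULL;
    }

    cgGLLoadProgram(program);   // hand the compiled code to the GL driver
    return program;
}

// At draw time:
//   cgGLEnableProfile(profile);
//   cgGLBindProgram(program);
//   ... draw ...
//   cgGLDisableProfile(profile);
```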

But I have read that only Nvidia video cards support Cg, which means I would need 3 shaders: 1 in Cg for Nvidia cards with OpenGL and DirectX, 1 in HLSL for DirectX on ATI cards, and 1 more in GLSL for OpenGL on ATI cards. If this is wrong, please correct me.

Also, when you say no, do you mean that I can't convert the GLSL shader to ASM, or that OpenGL does not support ASM shaders even with all the extensions?

Quote:
Original post by 3dmodelerguy
But I have read that only Nvidia video cards support Cg, which means I would need 3 shaders: 1 in Cg for Nvidia cards with OpenGL and DirectX, 1 in HLSL for DirectX on ATI cards, and 1 more in GLSL for OpenGL on ATI cards. If this is wrong, please correct me.
Cg works just fine with ATI cards. You just won't be able to compile to any of the NV targets.
Quote:
Original post by 3dmodelerguy
Also, when you say no, do you mean that I can't convert the GLSL shader to ASM, or that OpenGL does not support ASM shaders even with all the extensions?
You can manually convert a GLSL shader to the assembly-like shaders. You might not be able to get the same functionality as you could in GLSL though, because GLSL is continuing to be supported and will be updated, whereas the assembly-like shaders will not be. You can't get assembly-like shaders out of the GLSL compiler because that's not the way it works. It is translated directly into what the graphics card can understand, which is different for all cards.
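For reference, here is a minimal sketch of what those assembly-like shaders look like and how they are loaded through the ARB_fragment_program extension; the program text is a hand-written equivalent of a trivial GLSL fragment shader, and it assumes the extension entry points have already been obtained (e.g. via GLEW).

```cpp
#include <GL/glew.h>   // assumed: provides the ARB_fragment_program entry points
#include <cstring>

// Assembly-like equivalent of: void main() { gl_FragColor = vec4(1,0,0,1); }
static const char* kFragProgram =
    "!!ARBfp1.0\n"
    "MOV result.color, {1.0, 0.0, 0.0, 1.0};\n"
    "END\n";

GLuint loadArbFragmentProgram()
{
    GLuint id;
    glGenProgramsARB(1, &id);
    glBindProgramARB(GL_FRAGMENT_PROGRAM_ARB, id);
    glProgramStringARB(GL_FRAGMENT_PROGRAM_ARB, GL_PROGRAM_FORMAT_ASCII_ARB,
                       (GLsizei)strlen(kFragProgram), kFragProgram);

    // Anything other than -1 means the driver rejected the program text.
    GLint errorPos;
    glGetIntegerv(GL_PROGRAM_ERROR_POSITION_ARB, &errorPos);
    if (errorPos != -1) {
        glDeleteProgramsARB(1, &id);
        return 0;
    }
    return id;
}

// To use it:
//   glEnable(GL_FRAGMENT_PROGRAM_ARB);
//   glBindProgramARB(GL_FRAGMENT_PROGRAM_ARB, id);
```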

Quote:
Original post by Kalidor
It is translated directly into what the graphics card can understand, which is different for all cards.


That comment makes it seem like the shader I create will only work on my card, as if only people with my particular video card can run the shader. Is this true? You say all cards are different; does that mean I need to create a separate shader for every single video card? I really doubt that's true, but it's just the way you put it that makes it seem like that.

That's not what he means.

When you pass the GLSL shader to the driver, it compiles it for that card at that time so you can use it. The same GLSL code will work on different cards; however, the microcode produced by the driver (which isn't the same as the ARB fp/vp code either) is specific to that card and maybe even that driver revision. As such, with the way things are currently done, even if you could read back the compiled code it would be basically useless on anything but the system it was compiled on.
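A minimal sketch of that runtime compile step, using the OpenGL 2.0 entry points (the older ARB_shader_objects functions work the same way); the card-specific compilation happens inside glCompileShader, and all you ever see back is a status flag and an info log.

```cpp
#include <GL/glew.h>   // assumed: loads the GL 2.0 shader entry points
#include <cstdio>

// Compile one GLSL shader at runtime; the driver turns the source into
// whatever microcode the installed card actually runs.
GLuint compileGlslShader(GLenum type, const char* source)
{
    GLuint shader = glCreateShader(type);
    glShaderSource(shader, 1, &source, NULL);
    glCompileShader(shader);        // card/driver-specific compile happens here

    GLint ok = GL_FALSE;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
    if (!ok) {
        char log[1024];
        glGetShaderInfoLog(shader, sizeof(log), NULL, log);
        fprintf(stderr, "GLSL compile failed:\n%s\n", log);
        glDeleteShader(shader);
        return 0;
    }
    return shader;
}
```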

Kinda off topic, but why is it that so many people seem to think that Cg is Nvidia-only? I mean, ATI support hasn't always been great, but it's always been present.

Go figure.

Well, it was created by NV, and ATI has never been shown to give official support to it. The fact that it only works via the fp/vp extension on ATI hardware doesn't help that perception.
