Enrico

OpenGL Shader implementation for a graphics API independent renderer


Hi, I have taken on the task of implementing a shader solution for a graphics-API-independent renderer. So far we have implementations for Direct3D9 and OpenGL, and we do not plan to extend this to e.g. Direct3D10 in the near future. Our material system is very similar to what Quake4 does: we have a meta-material which specifies which texture to use, colours, specular exponents, etc. Quake4 uses only an OpenGL renderer, which makes specifying the shader program easy:
textures/base_gothic/wall {
    shader glsl/diffuse.glsl
    textureMap textures/base_gothic/wall.png
}
For the API-independent approach we have, this is somewhat more complicated: the meta-material must always stay the same (this is a requirement), but then a single shader program cannot be used with all APIs. So my options are:

1) Use an (abstract) shade tree.
2) Use Nvidia's Cg for both D3D9 and OpenGL.
3) Add another level of indirection to the "shader" keyword: instead of selecting a complete file, select only a base path such as "shaders/diffuse", and let each renderer implementation complete the path, e.g. to "shaders/diffuse.glsl".
4) Use ATI's HLSL->GLSL converter and support only HLSL.

Some thoughts:

1) This means a lot of work, and I would need to write an appropriate authoring tool.
2) I found some statements that Cg on D3D9 behaves differently from Cg on OpenGL. I have not verified this, so maybe someone can comment?
3) Quick and a little bit dirty, but it should work well.
4) I do not trust this converter...

Any other options? Does anybody have experience implementing such a system? Any comments?

Thanks,
Enrico

PS: If I could choose again, I would not take the graphics-API-independent approach :-S
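Option 3 can be sketched in a few lines. Here is a minimal Python illustration (the class and method names are hypothetical, not from any real renderer):

```python
# Each renderer backend completes the API-neutral shader path from the
# meta-material with its own file extension (hypothetical sketch).
class GLRenderer:
    SHADER_EXTENSION = ".glsl"

    def resolve_shader(self, base_path):
        # "shaders/diffuse" -> "shaders/diffuse.glsl"
        return base_path + self.SHADER_EXTENSION


class D3D9Renderer:
    SHADER_EXTENSION = ".hlsl"

    def resolve_shader(self, base_path):
        # "shaders/diffuse" -> "shaders/diffuse.hlsl"
        return base_path + self.SHADER_EXTENSION
```

The meta-material stays API-neutral; only the backend knows which concrete file it loads.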

Hi Enrico,

I also looked into this problem a while back while developing an API abstraction library. I basically started with the D3D10 API and began writing an OpenGL backend to support all the basic features; the plan was that the D3D10 backend would then be trivial. This worked well for most basic objects like textures and vertex buffers, and I could map the functionality back to OpenGL without too much trouble.

Shaders, on the other hand, got a bit tricky. I did some basic testing and found that the cores of GLSL and HLSL are so similar that I could make an HLSL function compile in GLSL by adding a bunch of #defines and implementing the functions that were missing (e.g. I would implement lerp() in GLSL so that it just calls mix()).
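A compatibility prelude along these lines might look like the following. This is an illustrative subset added here for clarity, not the actual file from the post; each line maps a common HLSL name onto its GLSL equivalent:

```glsl
// Illustrative HLSL-to-GLSL compatibility defines.
#define float2 vec2
#define float3 vec3
#define float4 vec4
#define lerp(a, b, t) mix(a, b, t)
#define saturate(x) clamp(x, 0.0, 1.0)
#define frac(x) fract(x)
#define tex2D(s, uv) texture2D(s, uv)
```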

The remaining problem was that GLSL uses globals as the inputs and outputs of shaders and always uses a main() function, while HLSL uses semantics. To get around that, I specified my shaders as plain HLSL functions with no semantics or globals. The API then takes as input the basic code as well as an "entrypoint" string which specifies all the input/output information. For example, an entrypoint of "PhongPS( IN.Tex0, false, true, 3, OUT.COLOR0 )" would prepend the extra HLSL compatibility functions to the shader and append the following code at the end:

varying float4 Tex0;
void main()
{
    PhongPS( Tex0, false, true, 3, gl_FragData[0] );
}

That way I could use the same source file for vertex and pixel shaders, like HLSL lets you do with Effects, and the GLSL compiler should be smart enough to optimize for the different bool and int parameters.
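The wrapper generation described above can be sketched in a few lines of Python. This is a minimal sketch with a hypothetical helper name; a real implementation would also parse the argument list and emit the varying declarations for each input:

```python
# Generate a GLSL main() wrapper from an entrypoint string such as
# "PhongPS( IN.Tex0, false, true, 3, OUT.COLOR0 )" (hypothetical sketch).
def make_glsl_wrapper(entrypoint):
    # Map the IN./OUT. placeholders onto GLSL names: inputs become plain
    # varyings, and the first colour output becomes gl_FragData[0].
    call = entrypoint.replace("IN.", "").replace("OUT.COLOR0", "gl_FragData[0]")
    return "void main()\n{\n    " + call + ";\n}\n"
```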

Anyway, I'm not sure if that is what you are looking for, but that was my solution. I haven't tried to do this for geometry shaders yet; I think it will take a bit more generated code, but it should be doable as well.

Cheers!

Eric Penner

[Edited by - WizardOfOzzz on February 8, 2008 5:44:16 PM]

Some other quick comments: I tried HLSL2GLSL, and it worked for basic input but crashed when using certain types as input. I think it couldn't handle passing a sampler to a function, which I do a lot to decouple the code from the input texture.
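For illustration (this snippet is added here and is not from the post), the pattern in question is D3D9-style HLSL that takes a sampler as a function parameter, which some translators cannot handle:

```hlsl
// Passing the sampler as a parameter decouples the sampling code from the
// concrete texture; HLSL2GLSL reportedly chokes on this (illustrative HLSL).
float4 SampleDiffuse(sampler2D diffuseMap, float2 uv)
{
    return tex2D(diffuseMap, uv);
}
```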

I haven't tried it lately, but I've heard reports that Cg doesn't play nicely with ATI cards. This could be different now, though.


You can program all your shaders in HLSL, and then there are translators that translate them into GLSL or Cg, depending on what you need.

Quote:
Original post by WizardOfOzzz
Some other quick comments: I tried HLSL2GLSL, and it worked for basic input but crashed when using certain types as input. I think it couldn't handle passing a sampler to a function, which I do a lot to decouple the code from the input texture.

Thanks for sharing your experience :-)
What do your shader developers think about your approach?

Quote:
You can program all your shaders in HLSL, and then there are translators that translate them into GLSL or Cg, depending on what you need.

Do you have any experience with such a system?
