OpenGL wrapping shader code

Posted by irreversible


I'm currently doing all of my stuff in OpenGL, and so far I have just a couple of shaders that I really need or want. The problem is that I've adapted one of these in Cg, which I haven't read too many positive things about. I know there are profile converters out there that would probably convert it to GLSL for me and save me some headache (as you can probably tell, I'm not too well versed in shader stuff and usually try to avoid shaders if I can help it - especially since I generally don't need them for what I'm doing), but that's not really the issue for me right now.

What I've done is create a video "driver" wrapper (a renderer, not a real driver), which hides API-specific code. Even though I'm using the Cg calls directly right now, bypassing the wrapper, and everything seems to be working fine on my computer, I can't help but wonder whether I should drop either Cg or GLSL (e.g. should I expect any issues to arise from mixing the two?), and if I should, which one should I keep?

Secondly, how similar is HLSL to GLSL, and should I expect any quirks/special cases to crop up when (or if) I decide to implement a D3D renderer? My GLSL code loads shaders directly from a buffer in memory with no additional setup required, which makes it very convenient. I assume HLSL isn't much different.
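Here's roughly what that load path looks like behind the wrapper - a minimal sketch only (it assumes an extension loader such as GLEW is already initialised, and the function name is just a placeholder):

// Minimal sketch: compiling a GLSL shader straight from an in-memory buffer.
// glShaderSource() takes the source string directly, so no file I/O is needed.
#include <GL/glew.h>   // assumes GLEW (or equivalent) provides the GL 2.0 entry points
#include <cstdio>

GLuint CompileShaderFromMemory(GLenum type, const char* source)   // e.g. GL_VERTEX_SHADER
{
    GLuint shader = glCreateShader(type);
    glShaderSource(shader, 1, &source, 0);   // 0 => the string is null-terminated
    glCompileShader(shader);

    GLint ok = GL_FALSE;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
    if (!ok)
    {
        char log[1024];
        glGetShaderInfoLog(shader, sizeof(log), 0, log);
        std::printf("GLSL compile failed: %s\n", log);
        glDeleteShader(shader);
        return 0;
    }
    return shader;   // attach to a program object and link as usual
}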

GLSL only works with GL, so if you want an API-independent solution, go with Cg.

You may not know this, but Cg and HLSL are actually two different implementations of the same language. The difference is that Cg (NVIDIA's implementation) has a compiler that supports both DX and GL.
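E.g. loading a Cg vertex program from memory for the GL back end looks roughly like this (a rough sketch using the Cg runtime - the "main" entry point and the error handling are simplified):

// Rough sketch: compiling a Cg program from an in-memory string for the GL back end.
// The same source string could equally be fed to the D3D-flavoured Cg runtime (cgD3D9).
#include <Cg/cg.h>
#include <Cg/cgGL.h>

CGprogram LoadCgVertexProgram(CGcontext context, const char* source)
{
    CGprofile profile = cgGLGetLatestProfile(CG_GL_VERTEX);   // best profile the GPU/driver offers
    cgGLSetOptimalOptions(profile);

    CGprogram program = cgCreateProgram(context, CG_SOURCE, source,
                                        profile, "main", 0);  // "main" = assumed entry point
    if (!program)
        return 0;   // cgGetLastListing(context) holds the compiler output

    cgGLLoadProgram(program);   // upload; later cgGLEnableProfile()/cgGLBindProgram() to draw with it
    return program;
}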

Thanks for the reply, hodgman!

As for GLSL and HLSL - I know they're API-dependent; what I was driving at was whether I should go with GLSL or Cg for OpenGL (and a separate HLSL pipeline for D3D). I didn't know Cg compiled for both :)

Like I mentioned, though, I've read a few bad things about Cg - along the lines that its reliability/consistency/support on different platforms isn't that good. Additionally, since it's NVIDIA's, would it be rational to assume that Cg shaders won't work on ATI cards?

The most important question right now, though, is whether it's okay to mix Cg and GLSL, or whether there's something I should be aware of when doing so?

Sorry, I've never mixed two shader languages in the one app, so I've got no advice for that :/
Quote:
Original post by irreversible
I've read a few bad things about Cg, though - along the lines that its reliability/consistency/support on different platforms isn't that good.
Well, this doesn't help with GL, only DX, but if you do run into cases of incompatibility: seeing that Cg and HLSL are the same language, you should be able to pass Cg code to DirectX directly (as if it were HLSL code) without using the Cg runtime at all.
For example, the 360 uses Microsoft-HLSL and the PS3 uses NVidia-CG, but we write one shader and it works on both platforms.
""NVIDIA and Microsoft collaborated to develop the Cg language. Microsoft calls its implementation High-Level Shading Language, or HLSL for short. HLSL and Cg are the same language but reflect the different names each company uses to identify the language and its underlying technology.""
Quote:
Additionally, since it's NVIDIA's, would it be rational to assume that Cg shaders won't work on ATI cards?
No, the whole point of Cg is that it just works everywhere, so NVIDIA would be shooting itself in the foot by forcing game developers to write two versions of all their shaders (one for NVIDIA cards and one for non-NVIDIA cards).

Yeah, I'll second the Cg/HLSL approach. There are very few cases where you would need to modify a shader to compile for one or the other, and you can usually fix those with a #define (see the sketch below).
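Something along these lines - the macro name and the UV flip are purely illustrative, not a real case I've hit:

// Illustrative only: one shared Cg/HLSL source, with a preprocessor define
// covering a hypothetical per-API difference (a render-target V flip for GL).
const char* sharedSource =
    "#ifdef TARGET_GL\n"
    "    #define FIX_UV(uv) float2((uv).x, 1.0 - (uv).y)\n"
    "#else\n"
    "    #define FIX_UV(uv) (uv)\n"
    "#endif\n"
    /* ...rest of the shader uses FIX_UV() wherever it samples a render target... */;

// Cg runtime: the define is passed through cgCreateProgram()'s argument array.
const char* cgArgs[] = { "-DTARGET_GL", 0 };

// D3DX/HLSL: defines go in a D3DXMACRO table instead - here you would simply
// omit TARGET_GL, e.g.  D3DXMACRO defines[] = { { 0, 0 } };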

I've also always considered GLSL a no-go, since it lacks a fundamental feature that Cg and HLSL both support: offline compilation.
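That is, with Cg/HLSL you can compile at build time and only ship the result; on the D3D side it ends up looking roughly like this (the fxc command line and the helper are just an illustration):

// Offline step (build time), e.g.:  fxc /T vs_3_0 /E main shader.hlsl /Fo shader.vso
// (Cg ships an equivalent command-line compiler, cgc, for the GL profiles.)
// At runtime you load the ready-made bytecode instead of compiling source:
#include <d3d9.h>
#include <vector>

IDirect3DVertexShader9* LoadPrecompiledVS(IDirect3DDevice9* device,
                                          const std::vector<char>& bytecode)
{
    IDirect3DVertexShader9* vs = 0;
    device->CreateVertexShader(reinterpret_cast<const DWORD*>(&bytecode[0]), &vs);
    return vs;   // null on failure
}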

Alrighty - cheerio, guys! I'll go with the Cg/HLSL approach as well in that case. Incidentally - something I found for GLSL enthusiasts while doing research.

One note, though - I did try to pass a (pretty random) HLSL shader off the net to the Cg compiler, and it choked on it; I didn't try to figure out why, so it may well have been the shader itself.
