HLSL to GLSL

5 comments, last by christian h 17 years, 10 months ago
Does anyone know if there is a tool that will convert HLSL code to GLSL? I'm looking through the ShaderX3 book, and many of the examples are in HLSL. I could pick up a reference on HLSL, but if a tool exists... TIA, Z
I don't know about GLSL, but I know that CgFX is almost the same as HLSL.
I thought I read something about a Cg to GLSL converter/program on these forums but I couldn't find the thread.

"Those who would give up essential liberty to purchase a little temporary safety deserve neither liberty nor safety." --Benjamin Franklin

Quote:Original post by Mike2343
I thought I read something about a Cg to GLSL converter/program on these forums but I couldn't find the thread.



NVIDIA's Cg shading language will compile to HLSL or GLSL. It's nice because you can write your shaders in it once and use them in either OpenGL or DirectX applications.

http://developer.nvidia.com/object/cg_toolkit.html
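For what it's worth, the toolkit's command-line compiler cgc can target GLSL through its glslv/glslf profiles. A rough sketch of the invocations (flag syntax as I remember it from the 1.5 beta; the filenames here are just placeholders):

```
# Compile a Cg vertex shader to a GLSL vertex shader
cgc -profile glslv -entry main shader.cg -o shader.vert

# Compile a Cg fragment shader to a GLSL fragment shader
cgc -profile glslf -entry main shader.cg -o shader.frag
```

Check `cgc -help` for the profiles your version actually supports.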
How are you going to use Cg on ATI? It will compile to ARB_vp and ARB_fp, which are deprecated now.

I remember someone from nVidia said that you can use the Cg compiler to translate between the languages, but sorry, I didn't keep the link. It was on the shading language forum at opengl.org.
Sig: http://glhlib.sourceforge.net
an open source GLU replacement library. Much more modern than GLU.
float matrix[16], inverse_matrix[16];
glhLoadIdentityf2(matrix);
glhTranslatef2(matrix, 0.0, 0.0, 5.0);
glhRotateAboutXf2(matrix, angleInRadians);
glhScalef2(matrix, 1.0, 1.0, -1.0);
glhQuickInvertMatrixf2(matrix, inverse_matrix);
glUniformMatrix4fv(uniformLocation1, 1, GL_FALSE, matrix);
glUniformMatrix4fv(uniformLocation2, 1, GL_FALSE, inverse_matrix);
Umm... I thought Cg worked on ATI. I've run Cg shaders on my 9800 Pro, unless I smoked some really good drugs and just imagined it working ;-)

"Those who would give up essential liberty to purchase a little temporary safety deserve neither liberty nor safety." --Benjamin Franklin

Quote:Original post by V-man
How are you going to use Cg on ATI? It will compile to ARB_vp and ARB_fp which are deprecated now.

I remember someone from nVidia said that you can use the Cg compiler to translate between the languages but sorry, I didn't keep the link. It was on the shading language forum at opengl.org


The new beta version has a profile for GLSL. It can also take GLSL code as input and convert it to any of the other profiles. And as nef stated, Cg is a lot like HLSL, so by carefully designing your code to compile as both, you can use Cg to convert the HLSL to GLSL.
GLSL to HLSL is a little tougher.
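To give a feel for how close the two languages are, here's roughly the same trivial vertex shader written by hand in both (my own sketch, not compiler output; `worldViewProj` is just an example uniform name):

```hlsl
// HLSL: transform a vertex by a world-view-projection matrix
float4x4 worldViewProj;

float4 main(float4 pos : POSITION) : POSITION
{
    return mul(worldViewProj, pos);
}
```

```glsl
// GLSL equivalent, using the fixed-function vertex attribute
uniform mat4 worldViewProj;

void main()
{
    gl_Position = worldViewProj * gl_Vertex;
}
```

The semantics (`: POSITION`) map onto GLSL's built-in variables, and intrinsics like `mul()` become operators; most of a port is mechanical substitutions like these.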

Oh yeah, the link
ch.

