GLSL and Cg shaders

Started by giugio
4 comments, last by AverageJoeSSU 14 years, 1 month ago
What are the differences between the Cg and GLSL shader languages, and which do you advise me to use? Is it possible to debug both with gDEBugger, and with the Nvidia shader debugger? Last question: is it possible to attach the Nvidia shader debugger to an OpenGL .exe? Thanks. [Edited by - giugio on March 18, 2010 10:00:52 AM]
Cg is a different language, but it is "C"-like, much as GLSL is "C"-like.
Cg shaders are compiled (translated) by the Cg runtime (the Cg DLL) into some other language that can be submitted to GL, such as GLSL, arbvp+arbfp, or nvvp+nvfp.

The advantage of Cg is that you can compile the same code to HLSL and use it in D3D. So Cg is cross-API.
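To illustrate the cross-API point, here is a minimal Cg fragment shader sketch (the entry-point name and parameters are made up for illustration); the same source can be handed to an OpenGL profile (e.g. glslf or arbfp1) or a D3D profile (e.g. ps_2_0):

// Minimal Cg fragment shader: samples a texture and applies a tint.
// Compile it for GL or D3D just by choosing a different profile.
float4 tint_main(float2 uv            : TEXCOORD0,
                 uniform sampler2D tex,
                 uniform float4 tint) : COLOR
{
    return tex2D(tex, uv) * tint;
}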
Sig: http://glhlib.sourceforge.net
an open source GLU replacement library. Much more modern than GLU.
float matrix[16], inverse_matrix[16];
glhLoadIdentityf2(matrix);
glhTranslatef2(matrix, 0.0, 0.0, 5.0);
glhRotateAboutXf2(matrix, angleInRadians);
glhScalef2(matrix, 1.0, 1.0, -1.0);
glhQuickInvertMatrixf2(matrix, inverse_matrix);
glUniformMatrix4fv(uniformLocation1, 1, GL_FALSE, matrix);
glUniformMatrix4fv(uniformLocation2, 1, GL_FALSE, inverse_matrix);
I've said it before: In addition to API independence, Cg supports offline compilation. This means you don't have to compile shader source at runtime, you can just load up a pre-compiled shader binary at runtime. This makes it a bit smaller, faster to load, and more obfuscated in the final product.
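For reference, a sketch of that offline step using the cgc command-line compiler that ships with the Cg Toolkit (the file and entry-point names here are made up; -profile, -entry, and -o are the standard cgc options):

# Cross-compile a Cg vertex shader to GLSL offline (glslv profile).
cgc -profile glslv -entry VertexEntry shader.cg -o shader_vs.glsl

# Or target an assembly profile such as arbvp1 for a precompiled artifact.
cgc -profile arbvp1 -entry VertexEntry shader.cg -o shader_vs.arbvp

The resulting file can then be loaded at runtime instead of compiling the Cg source on the user's machine.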
I'm encountering a problem with this actually. I'm so glad other people are familiar with this.

I have a deferred shading pipeline that uses Cg shaders with OpenGL. One of my colleagues got a new ATI card, and I realized we had to move from the gp4v and gp4f profiles to the GLSL profiles. When I did this, we got a black screen.

I have yet to step through them with gDEBugger, but there are a couple of things I noticed.

1) Nvidia calls the GLSL profiles "OpenGL 2.0 GLSL", which is pretty old and limiting.
That excludes the dFdx/dFdy functions (which are GLSL 1.3, I believe); Cg's equivalents are ddx/ddy. But I noticed that the compiled Cg GLSL source did in fact emit those functions.

2) You can cross-compile, but when I went to read the generated GLSL it was understandably confusing. Is there a way to make it compile to a more readable format? I realize this may be asking a lot.

EDIT: To answer your question: yes, you can step through them with gDEBugger when the GLSL profile is used; however, the compiled shaders look gnarly and the variable names are mangled.

[Edited by - AverageJoeSSU on March 18, 2010 2:14:51 PM]

------------------------------

redwoodpixel.com

@giugio
Cg and GLSL are quite different languages, and the biggest difference is in their ecosystems. GLSL is just a language and doesn't provide much else; Cg comes with a runtime compiler, example programs, and so on. If you are going to play around with Cg shaders, my best recommendation is Nvidia's FX Composer.

@AverageJoeSSU
1.) I think I may know your problem, since I encountered the same one. When you use the GLSL profiles, you need to make sure to merge the vertex and fragment CGprograms into one program handle. To do that, use cgCombinePrograms() like this:

CGprogram result;
CGprogram programs[2];

// Compile the vertex and fragment shaders separately...
programs[0] = cgCreateProgramFromFile(cgContext, CG_SOURCE, vs, cgVertexProfile, "VertexEntry", args);
programs[1] = cgCreateProgramFromFile(cgContext, CG_SOURCE, ps, cgPixelProfile, "PixelEntry", args);

// ...then combine them into a single program and load that.
result = cgCombinePrograms(2, programs);
cgGLLoadProgram(result);

// The combined program is independent, so the originals can be destroyed.
cgDestroyProgram(programs[1]);
cgDestroyProgram(programs[0]);

After this, you will need to use "result" as the handle for your combined program. This means you can't mix and match vertex/fragment/etc. programs, but you don't have much choice if you want to support non-Nvidia hardware.

Also, I have found that cgGLGetLatestProfile() returns the arb*p1 profiles on non-Nvidia hardware, so you should set the profile manually on your ATI/Intel/etc. platforms.
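A sketch of that manual selection (this assumes the Cg Toolkit headers are available; cgGLIsProfileSupported, cgGLGetLatestProfile, and cgGLSetOptimalOptions are standard cgGL calls, but the helper function itself is just an illustration):

#include <Cg/cg.h>
#include <Cg/cgGL.h>

/* Prefer the GLSL profile when the driver supports it; otherwise fall
   back to whatever cgGLGetLatestProfile() reports (often arbvp1 on
   non-Nvidia hardware). */
static CGprofile pick_vertex_profile(void)
{
    CGprofile profile = cgGLIsProfileSupported(CG_PROFILE_GLSLV)
                      ? CG_PROFILE_GLSLV
                      : cgGLGetLatestProfile(CG_GL_VERTEX);
    cgGLSetOptimalOptions(profile);
    return profile;
}

The same pattern applies to the fragment side with CG_PROFILE_GLSLF and CG_GL_FRAGMENT.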

2.) No, you can't.

[Hardware:] Falcon Northwest Tiki, Windows 7, Nvidia GeForce GTX 970

[Websites:] Development Blog | LinkedIn
[Unity3D :] Alloy Physical Shader Framework

Cheers, noob... I'll give that a whirl.

------------------------------

redwoodpixel.com

This topic is closed to new replies.
