Pixel/fragment shader tutorials?

Started by
11 comments, last by cowsarenotevil 20 years, 3 months ago
I'm looking for a very simple tutorial for someone who's never used shaders before. I'll try the NeHe one, but it isn't ideal for pixel/fragment shaders, and I'd prefer one that isn't limited to nVidia's Cg.
-~-The Cow of Darkness-~-
A GLSL tutorial would be nice. I just hope NVIDIA will support it soon.

---------------------------------
For an overdose of l33tness, flashbang.nu
http://www.gamedev.net/columns/hardcore/dxshader1/

This series is cool!
Yes. Too bad this is an OpenGL site. I suppose I could use Direct3D, but I really don't like Direct3D.

-~-The Cow of Darkness-~-
quote:I'd prefer one that's not limited to nVidia's CG.


What's wrong with Cg? If you know Cg you can write HLSL and GLslang; only the standard library function names may differ. If you mean you want an ASM one, why? If you code all your pixel shaders in ASM you'll have to write different versions for different cards, to take advantage of different optimisations etc. You can always learn the ASM stuff after you've done a high-level shader language; it will probably be easier to learn as well if you've already coded shaders in a high-level language.

[edited by - Monder on January 16, 2004 4:02:50 PM]
quote:Original post by Monder
quote:I'd prefer one that's not limited to nVidia's Cg.


What's wrong with Cg? If you know Cg you can write HLSL and GLslang; only the standard library function names may differ. If you mean you want an ASM one, why? If you code all your pixel shaders in ASM you'll have to write different versions for different cards, to take advantage of different optimisations etc. You can always learn the ASM stuff after you've done a high-level shader language; it will probably be easier to learn as well if you've already coded shaders in a high-level language.


OK, prepare for complete stupidity:

Can you use Cg with a normal C/OpenGL program? Is it a separate language, or is it used within a C/C++ program?
Can you make your own shaders with Cg?
Does Cg work on non-nVidia cards?

Thanks.

-~-The Cow of Darkness-~-
Yes. Separate language.
Yes.
Yes.

There's a runtime you can use so that a Cg file is compiled when your program is run, or you can compile Cg ahead of time to ARB_vertex_program, ARB_fragment_program, NV_vertex_program, NV_vertex_program1_1, or NV_vertex_program2. The latter is what I do at the moment - it's simpler!

Sorry, can't give a better explanation right now - got to dash!

Enigma
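
To illustrate what such a program looks like, here is a minimal Cg vertex program in the style of NVIDIA's standard examples (a sketch only - the uniform name and structure are illustrative, not taken from this thread). Compiled offline with the cgc compiler, or at run time via the Cg runtime, it targets ARB_vertex_program and the other extensions listed above:

```cg
// A minimal Cg vertex program (sketch; names are illustrative).
// Transforms the vertex and passes the colour straight through.
void main(float4 position      : POSITION,
          float4 color         : COLOR,
          out float4 oPosition : POSITION,
          out float4 oColor    : COLOR,
          uniform float4x4 modelViewProj)
{
    oPosition = mul(modelViewProj, position);
    oColor    = color;
}
```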
quote:OK, prepare for complete stupidity:

Can you use Cg with a normal C/OpenGL program? Is it a separate language, or is it used within a C/C++ program?
Can you make your own shaders with Cg?
Does Cg work on non-nVidia cards?

Thanks.


I haven't actually ever written a Cg program or attempted to use one; however, I have read an entire book on Cg and material about the runtime library, so I may get some details wrong, but I should be mostly correct. Anyway...

Basically, when you write a Cg program you can compile it in two ways. One is to use a separate Cg compiler and end up with something you can give to ARB_fragment_program, ARB_vertex_program, or some other similar extension. The other is to have the Cg runtime library do the compiling at run time; this is the best way, as it means your Cg code will be compiled with specific optimisations for the card in the computer the program is currently running on. The runtime library fits in with OpenGL just fine. Another advantage of Cg is that if, say, the computer your code is running on doesn't have ARB_fragment_program but does have NV_texture_shader, it can compile code that will run on NV_texture_shader, and this is completely invisible to you: you just tell the runtime library what your program is and then bind it, and it sorts the rest out.

With Cg you are making your own shaders; you're just programming them in a high-level language as opposed to an assembly-like one.

Cg will work on non-NVIDIA cards, though I'm guessing it will work better on NVIDIA cards.
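
For the run-time route described above, the host-side flow looks roughly like this. This is a sketch from memory of the Cg runtime API (the cgCreateContext/cgCreateProgramFromFile/cgGLLoadProgram calls are real Cg Toolkit functions, but check the Cg documentation for exact usage); it assumes an OpenGL context is already current, and the file name and entry point are made up:

```c
#include <Cg/cg.h>
#include <Cg/cgGL.h>

/* Sketch: compile and bind a Cg fragment program at run time.
   Error checking omitted for brevity. */
void setupCgFragmentProgram(void)
{
    CGcontext context = cgCreateContext();

    /* Pick the best fragment profile the current card supports
       (e.g. arbfp1 on ARB_fragment_program hardware). */
    CGprofile profile = cgGLGetLatestProfile(CG_GL_FRAGMENT);
    cgGLSetOptimalOptions(profile);

    /* Compile "shade.cg" (hypothetical file) with entry point "main". */
    CGprogram program = cgCreateProgramFromFile(context, CG_SOURCE,
                                                "shade.cg", profile,
                                                "main", NULL);

    cgGLLoadProgram(program);    /* hand the compiled code to the driver */
    cgGLEnableProfile(profile);  /* enable the profile...                */
    cgGLBindProgram(program);    /* ...and bind the program for drawing  */
}
```

This is what "the runtime library fits in with OpenGL just fine" means in practice: a handful of calls before you draw, and the profile selection handles the per-card differences for you.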
I found a pretty nice site with tutorials in GLSL. It is a work in progress, so it doesn't have any advanced stuff like bump, horizon, and offset mapping.
www.clockworkcoders.com/oglsl/

It's fairly good though.
If only NVIDIA would add GLSL support soon.
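
For a flavour of what those GLSL tutorials begin with, here is a minimal fragment shader in the GLSL 1.10 style of the era (a sketch; the sampler name is illustrative):

```glsl
// Minimal GLSL fragment shader (sketch; "tex" is an illustrative name).
// Samples a texture and applies a slight red tint.
uniform sampler2D tex;

void main()
{
    gl_FragColor = texture2D(tex, gl_TexCoord[0].st)
                 * vec4(1.0, 0.8, 0.8, 1.0);
}
```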

---------------------------------
For an overdose of l33tness, flashbang.nu
I've been promising NeHe some Cg tutorials for a while now, but I've been so busy with uni work that I haven't got around to it.
Should I get time in the coming weeks, I would probably use the existing framework created for that vertex tutorial, so a possible series would at least have continuity. What sort of tutorials would people be interested in, though?

This topic is closed to new replies.
