Archived

This topic is now archived and is closed to further replies.

cowsarenotevil

Pixel/fragment shader tutorials?



I'm looking for a very simple tutorial for someone who's never used shaders before. I'll try the NeHe one, but it's not the best for pixel/fragment shaders, and I'd prefer one that's not limited to nVidia's Cg.

quote:
I'd prefer one that's not limited to nVidia's Cg.


What's wrong with Cg? If you know Cg you can write HLSL and GLslang; only the standard library function names may differ. If you mean you want an assembly-level one, why? Coding all your pixel shaders in assembly means writing different versions for different cards, to take advantage of different optimisations etc. You can always learn the assembly stuff after you've done a high-level shader language, and it will probably be easier to learn as well if you've already coded shaders in a high-level language.

[edited by - Monder on January 16, 2004 4:02:50 PM]

quote:
Original post by Monder
quote:
I'd prefer one that's not limited to nVidia's Cg.


What's wrong with Cg? If you know Cg you can write HLSL and GLslang; only the standard library function names may differ. If you mean you want an assembly-level one, why? Coding all your pixel shaders in assembly means writing different versions for different cards, to take advantage of different optimisations etc. You can always learn the assembly stuff after you've done a high-level shader language, and it will probably be easier to learn as well if you've already coded shaders in a high-level language.



OK, prepare for complete stupidity:

Can you use Cg with a normal C/OpenGL program? Is it a separate language, or is it used within a C/C++ program?
Can you make your own shaders with Cg?
Does Cg work on non-nVidia cards?

Thanks.

Yes. Separate language.
Yes.
Yes.

There's a runtime you can use so that a Cg file is compiled when your program is run, or you can compile Cg offline to ARB_vertex_program, ARB_fragment_program, NV_vertex_program, NV_vertex_program1_1, or NV_vertex_program2. The latter is what I do (at the moment - it's simpler!)
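For illustration, the offline path might look like this - a minimal (made-up) Cg fragment program compiled to ARB_fragment_program text with NVIDIA's cgc compiler; the file name and uniform are assumptions:

```cg
// minimal.cg - hypothetical minimal Cg fragment program that
// outputs a constant colour for every fragment.
float4 main(uniform float4 colour) : COLOR
{
    return colour;
}

// Offline compile to ARB_fragment_program assembly text:
//   cgc -profile arbfp1 minimal.cg -o minimal.fp
// The resulting minimal.fp can then be loaded into OpenGL
// with glProgramStringARB.
```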

Sorry, can't give a better explanation right now - got to dash!

Enigma

quote:
OK, prepare for complete stupidity:

Can you use Cg with a normal C/OpenGL program? Is it a separate language, or is it used within a C/C++ program?
Can you make your own shaders with Cg?
Does Cg work on non-nVidia cards?

Thanks.


I haven't actually ever written a Cg program or attempted to use one; however, I have read an entire book on Cg and material about the runtime library, so I may get some details wrong, but I should be mostly correct. Anyway...

Basically, when you write a Cg program you can compile it in two ways. One is to use a separate Cg compiler and end up with something you can give to ARB_fragment_program, ARB_vertex_program, or some other similar extension. The other is to have the Cg runtime library do the compiling at run time. This is the better way, as it means your Cg code will be compiled with specific optimisations for the card in the computer the program is currently running on. The runtime library fits in with OpenGL just fine. Another advantage of Cg is that if, say, the computer your code is running on doesn't have ARB_fragment_program but it does have NV_texture_shader, it can compile code that will run on NV_texture_shader, and this is completely invisible to you: you just tell the runtime library what your program is and then bind it, and it sorts the rest out.
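As a sketch of what that runtime path looks like in a C/OpenGL host program - the file name "shader.cg" and entry point "main" are assumptions, error checking is omitted, and this requires NVIDIA's Cg toolkit headers and an OpenGL context already being current:

```c
#include <Cg/cg.h>
#include <Cg/cgGL.h>

/* Sketch: compile and bind a Cg fragment program at run time. */
void setup_fragment_shader(void)
{
    CGcontext ctx = cgCreateContext();

    /* Pick the best fragment profile the current card supports
       (arbfp1, fp30, ...) - this is where the card-specific
       optimisation happens without any extra work on your part. */
    CGprofile profile = cgGLGetLatestProfile(CG_GL_FRAGMENT);
    cgGLSetOptimalOptions(profile);

    /* Compile shader.cg (hypothetical file) for that profile. */
    CGprogram prog = cgCreateProgramFromFile(
        ctx, CG_SOURCE, "shader.cg", profile, "main", NULL);

    cgGLLoadProgram(prog);       /* hand the compiled program to the driver */
    cgGLEnableProfile(profile);  /* enable the chosen profile              */
    cgGLBindProgram(prog);       /* use this program for rendering         */
}
```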

With Cg you are still making your own shaders; you're just programming them in a high-level language as opposed to an assembly-like one.

Cg will work on non-nVidia cards, though I'm guessing it will work better on nVidia cards.

I've been promising NeHe some Cg tutorials for a while now, but I've been so busy with uni work that I haven't got around to it.
Should I get time in the coming weeks, I would probably use the existing framework created for that vertex tutorial, so a possible series would at least have continuity. What sort of tutorials would people be interested in, though?
