
Archived

This topic is now archived and is closed to further replies.

colinisinhere

per-pixel lighting?


Recommended Posts

I'm interested in adding per-pixel lighting to my 3D engine. I've searched the forums and Google for tutorials but didn't find anything good! Can anyone show me where you learned how to implement per-pixel lighting? (Note: I don't know a thing about extensions — will I have to learn? Where?) Thanks guys!

There are innumerable ways to do pixel lighting, so it really depends. You could use shaders, which are very cool for things like bumpmapping, etc. and give you almost infinite control over your lighting, or you could use projected texture maps.

OK, so much for innumerable; that's all I could think of, but there are lots of ways to go about it.

"Gay marriage will encourage people to be gay, in the same way that hanging around tall people will make you tall." - Grizwald

At the moment, if you want to do per-pixel lighting, you have several options. If you want to code for older graphics cards, you need to use vendor-specific extensions like NV_register_combiners. Newer cards (GeForce 4 and above; not sure about ATi) support the ARB_fragment_program extension, which you can use to write custom shaders. On these newer cards, which have truly programmable pipelines, you can use a high-level shading language such as nVIDIA's Cg or GLslang from 3Dlabs. OpenGL 2.0 will use GLslang, but at the moment only ATi's drivers support it. Cg is compatible with many different graphics cards (nVIDIA, ATi, Matrox, and others apparently) through its use of profiles. The Cg Tutorial, by Randima Fernando and Mark J. Kilgard, is a good introduction to Cg and shading languages in general, and I suppose a GLslang book will be available in the near future. The two websites are also excellent resources, as is www.cgshaders.org. Enjoy...
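To give you an idea of what a high-level shading language buys you, here is a minimal per-pixel Blinn-Phong pair in legacy GLslang (not from the thread — a sketch assuming the old fixed-function gl_* built-ins, light 0, and front material state):

```glsl
// Vertex shader: pass eye-space position and normal to the fragment stage.
varying vec3 normal;
varying vec3 eyePos;

void main()
{
    normal = gl_NormalMatrix * gl_Normal;
    eyePos = vec3(gl_ModelViewMatrix * gl_Vertex);
    gl_Position = ftransform();
}
```

```glsl
// Fragment shader: evaluate Blinn-Phong per fragment instead of per vertex.
varying vec3 normal;
varying vec3 eyePos;

void main()
{
    vec3 N = normalize(normal);                                   // renormalize after interpolation
    vec3 L = normalize(gl_LightSource[0].position.xyz - eyePos);  // to light 0
    vec3 V = normalize(-eyePos);                                  // to the eye
    vec3 H = normalize(L + V);                                    // half vector

    float diff = max(dot(N, L), 0.0);
    float spec = pow(max(dot(N, H), 0.0), gl_FrontMaterial.shininess);

    gl_FragColor = gl_FrontLightProduct[0].ambient
                 + gl_FrontLightProduct[0].diffuse * diff
                 + gl_FrontLightProduct[0].specular * spec;
}
```

The whole point of "per-pixel" is that the lighting equation runs in the fragment shader, so the normal is renormalized and evaluated at every fragment rather than interpolated from the vertices.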


Windows 95 - 32 bit extensions and a graphical shell for a 16 bit patch
to an 8 bit operating system originally coded for a 4 bit microprocessor,
written by a 2 bit company that can't stand 1 bit of competition.

quote:
OpenGL 2.0 will use GLslang, but at the moment only ATi's drivers support it.

Detonator 56.72 supports GLslang...

I use ARB_vertex_program/ARB_fragment_program for per-pixel lighting.
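For anyone curious what the ARB_fragment_program route looks like, here is a sketch of a per-pixel diffuse fragment program (my own illustration, not the poster's code — it assumes a matching ARB_vertex_program has written the surface normal into texcoord 0 and the light vector into texcoord 1):

```
!!ARBfp1.0
# Renormalize the interpolated normal (texcoord 0).
TEMP N, L, NdotL;
DP3 N.w, fragment.texcoord[0], fragment.texcoord[0];
RSQ N.w, N.w;
MUL N.xyz, fragment.texcoord[0], N.w;
# Renormalize the interpolated light vector (texcoord 1).
DP3 L.w, fragment.texcoord[1], fragment.texcoord[1];
RSQ L.w, L.w;
MUL L.xyz, fragment.texcoord[1], L.w;
# Clamped N.L times the diffuse light/material product for light 0.
DP3_SAT NdotL, N, L;
MUL result.color, state.lightprod[0].diffuse, NdotL;
END
```

It is a lot more verbose than GLslang or Cg for the same math, which is exactly why the high-level languages mentioned above are taking over.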

quote:
Original post by 666_1337
detonator 56.72 supports glslang...

Hooray! Can't wait to mess around with GLslang...


If you have a GeForce FX 5600 or higher, or an ATI Radeon 9500 or higher, use ARB_fragment_program or the OpenGL Shading Language.
The best tutorials are the examples on Humus's site.
Also search for and download a paper called "Phong for Dummies".

[edited by - unreal on April 15, 2004 6:32:37 AM]
