'Mixing' shaders, and code-to-shader organization

I've only worked with shaders for a short stint, but I've been exposed to a number of lighting techniques.
I have a couple of questions regarding shaders:
1) What kinds of techniques can be integrated into an existing shader technique?
Example: I have a basic per-pixel lighting shader up and running. Can I integrate shadows into the lighting equation with minimal effort? How about deferred shading, HDR lighting, etc.? Can all of these be combined into one shader that is used for objects that need to be lit? Or are there techniques where, if you choose technique A, you forgo technique B? (Would love it if you could give an example here.)

2) How costly can setting uniform variables per frame be? I'm thinking of supplying some constant variables (example: WorldViewProjectionMatrix) to all shaders so there's no need to write extra code to constantly supply that uniform variable.
1) You're talking about effects that can be combined together, but not in a single shader. To integrate shadows you need to do a shadow pass with a different shader to create a shadow map. Then, in your lighting shader, you can do the lighting calculations as usual and compute a "shadowFactor" (the percentage of the pixel that is in shadow) using the shadow map; see the GLSL sketch below.
-To use deferred rendering you also need multiple passes (a GBuffer pass and a lighting pass), so you will need at least two shaders.
-HDR lighting is a bit different because it is mostly done in post-processing: you render objects using your basic per-pixel lighting shader into a floating-point texture, then in post-processing you calculate the scene luminance and apply tonemapping.
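
Going back to the shadow factor above, a minimal GLSL sketch of that lookup could look like the following; the sampler/varying names and the bias value are just illustrative, and it assumes the vertex shader already outputs the fragment's position in the light's clip space:
[code]
// Fragment shader (GLSL). Assumes the vertex shader passed the fragment's
// position transformed by the light's view-projection matrix.
uniform sampler2D uShadowMap;   // depth map written in the shadow pass
varying vec4 vLightSpacePos;    // fragment position in light clip space

float ComputeShadowFactor()
{
    // Perspective divide and remap from [-1, 1] to [0, 1] texture space.
    vec3 proj = vLightSpacePos.xyz / vLightSpacePos.w;
    proj = proj * 0.5 + 0.5;

    // Compare this fragment's depth (as seen from the light) against the
    // closest depth stored in the shadow map.
    float storedDepth = texture2D(uShadowMap, proj.xy).r;
    float bias = 0.005;   // small offset to reduce shadow acne

    // A single tap gives a binary lit/shadowed answer; averaging several
    // offset taps (PCF) gives the fractional shadow coverage mentioned above.
    return (proj.z - bias > storedDepth) ? 0.0 : 1.0;
}

// In main(): gl_FragColor.rgb = ambient + ComputeShadowFactor() * (diffuse + specular);
[/code]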

You can mix all these effects in an uber-shader in a single file using #ifdef and then create multiple shader permutations at compile time, but you will still have multiple vertex/pixel shaders and a really confusing shader file.
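
For example, a sketch of what such a permutation file can look like; the feature defines (NORMAL_MAPPING, ALPHA_TEST) are made-up names, and the engine would prepend the matching #define lines before compiling each variant:
[code]
// uber_pixel.glsl -- one file, many compiled permutations.
// The engine prepends lines such as "#define NORMAL_MAPPING" before compiling.
uniform sampler2D uDiffuseMap;
#ifdef NORMAL_MAPPING
uniform sampler2D uNormalMap;
#endif

varying vec2 vTexCoord;
varying vec3 vNormal;

void main()
{
    vec4 albedo = texture2D(uDiffuseMap, vTexCoord);

#ifdef ALPHA_TEST
    if (albedo.a < 0.5)
        discard;
#endif

    vec3 normal = normalize(vNormal);
#ifdef NORMAL_MAPPING
    // A real implementation would transform the tangent-space sample by a
    // TBN matrix; omitted here to keep the sketch short.
    normal = normalize(texture2D(uNormalMap, vTexCoord).xyz * 2.0 - 1.0);
#endif

    // ...lighting with 'normal' and 'albedo' as usual...
    gl_FragColor = albedo;
}
[/code]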

What I use in my projects is basically this:
-GBuffer shader permutations file (a single shader file that contains different techniques to generate the GBuffer, because not all objects need parallax mapping, etc.); see the GBuffer sketch after this list.
-Shadow map shader file.
-Lighting shader file.
-Material pass shader permutations file (that also contains multiple techniques).
-Post process shaders.
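
As mentioned for the GBuffer file above, here is a minimal sketch of what a GBuffer-pass fragment shader can output; the two-target layout (albedo + normal) is just one possible arrangement and the names are illustrative:
[code]
// GBuffer pass (GLSL, multiple render targets). Requires a framebuffer with
// at least two color attachments; position is usually reconstructed from depth.
uniform sampler2D uDiffuseMap;

varying vec2 vTexCoord;
varying vec3 vWorldNormal;

void main()
{
    // RT0: albedo in RGB (alpha left free for e.g. specular intensity).
    gl_FragData[0] = vec4(texture2D(uDiffuseMap, vTexCoord).rgb, 1.0);

    // RT1: world-space normal remapped from [-1, 1] to [0, 1].
    gl_FragData[1] = vec4(normalize(vWorldNormal) * 0.5 + 0.5, 1.0);
}
// The lighting pass then samples these targets (plus the depth buffer)
// to light each pixel without re-drawing the scene geometry.
[/code]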
2) Depends greatly on the engine / API that you're using. It can be very cheap if done right, but in some engines it has a very high CPU cost.
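
If you're on OpenGL 3.1+ (or have uniform buffer objects available), one common way to keep that cost down is to put the shared per-frame constants in a uniform block that every shader declares, so the application fills a single buffer once per frame instead of calling glUniform* for each program. A sketch, with the block name and contents chosen for illustration:
[code]
#version 140
// Shared per-frame block (GLSL 1.40+ / uniform buffer objects). Every shader
// that declares this block reads the same buffer, which the application fills
// once per frame (glBufferSubData) and binds to a binding point
// (glBindBufferBase), instead of re-sending the same uniforms to each program.
layout(std140) uniform PerFrame
{
    mat4 ViewProjection;   // shared by all draw calls this frame
    vec3 CameraPosition;
    float Time;
};

// Truly per-object data (the World matrix) still changes per draw call, so a
// combined WorldViewProjection has to be rebuilt per object anyway.
uniform mat4 World;

in vec4 aPosition;

void main()
{
    gl_Position = ViewProjection * World * aPosition;
}
[/code]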


-HDR lighting is a bit different because it is mostly done on post processing , you render objects using your basic per-pixel lighting shader to a Floating-Point Shader, then in post-processing you calculate the scene luminance and apply tonemapping.

Thanks. I think you meant to say 'Floating-Point [s]Shader[/s] textures' there.

That's right ;)
You really don't even need to use floating-point textures, either. There are plenty of fixed-point HDR encoding schemes out there that work very well, such as NAO32/LogLuv, some crazy shared-exponent deal, or just 'cheating' a bit like Valve does and doing tonemapping directly in the lighting shader. Obviously these approaches can complicate your rendering pipeline a bit, but they can certainly result in a performance win or enable fancier effects on hardware that doesn't support floating-point textures (or supports them poorly).
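
For instance, the 'tonemap directly in the lighting shader' route can be as simple as applying a Reinhard-style operator before writing the color, so the result fits into an ordinary 8-bit target; the exposure uniform here is an assumption:
[code]
// End of the lighting fragment shader: map the HDR result into [0, 1]
// before it is written to an ordinary LDR render target.
uniform float uExposure;   // e.g. driven by the previous frame's average luminance

vec3 Tonemap(vec3 hdrColor)
{
    vec3 c = hdrColor * uExposure;
    return c / (1.0 + c);   // simple Reinhard operator
}

// ... gl_FragColor = vec4(Tonemap(lighting), 1.0);
[/code]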

