Visual Effects, Shaders, And Uber-Shaders

Started by
5 comments, last by Yourself 7 years, 8 months ago

Do people write a shader for every single visual effect? (e.g. a particular particle effect)

If so, how is this handled by an uber-shader approach? I know that an uber-shader has lots of #ifdefs and so on, but mixing all of the "visual effects" shaders into it could get very messy, no?

How do coders handle this?


Usually... it depends. Most games will have a handful of shaders that are used for most of the game's content. Then there will be a crap ton of specific shaders for special objects.

I think there will be different shader scripts for specialized things... like showing that a character currently has a power-up or something.

But shaders normally end up getting reused often, and by design the programmable pipeline is meant for reuse.

Now that's not to say there aren't objects that get their own specialized shaders. There will be times when an object's appearance needs to be created analytically, or programmatically.

For example... say you have an object with glowy lines over its surface, and you want a bright halo to run down the lines over time. You'll need a special shader for that object.
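An effect like that mostly boils down to a little per-fragment math. Here's a hypothetical sketch of what that math might look like (all names and parameters are made up for illustration), written as plain C++ so the idea is visible outside any particular shading language; in a real shader this function body would live in the fragment stage:

```cpp
#include <cmath>
#include <algorithm>

// Hypothetical sketch of the per-fragment math for an animated halo
// travelling along glowy lines. "lineCoord" is a 0..1 coordinate,
// baked into the mesh or a texture, that parameterizes position
// along the line.
float haloIntensity(float lineCoord, float timeSeconds,
                    float speed, float haloWidth)
{
    // Position of the halo's center along the line, wrapping around.
    float haloPos = std::fmod(timeSeconds * speed, 1.0f);

    // Distance from this fragment to the halo center (with wrap-around).
    float d = std::fabs(lineCoord - haloPos);
    d = std::min(d, 1.0f - d);

    // Smooth falloff: full brightness at the halo's center, fading
    // to zero at haloWidth away from it (smoothstep-style curve).
    float t = std::min(d / haloWidth, 1.0f);
    return 1.0f - t * t * (3.0f - 2.0f * t);
}
```

The returned intensity would then be multiplied into an emissive color and added on top of the object's regular shading.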

For uber-shaders... it tends to be a big pain in the ass to use one for every visual effect in the game, unless the appearances won't vary all that much. An uber-shader probably shines best at handling the common features of a material.

Uber-shaders have many different permutations... but you're not limited to one uber-shader per game. You can use many different shaders (with each of them being 'uber', or not).
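In practice, a permutation is usually selected by prepending #define lines to one shared source file before compiling it. A minimal sketch in C++ (the feature names and the snippet of shader source are invented for illustration):

```cpp
#include <string>
#include <vector>

// Hypothetical sketch: build one permutation of an uber-shader by
// prepending "#define" lines for the features a material actually
// uses. The shared shader source guards each feature with #ifdef.
std::string buildShaderSource(const std::vector<std::string>& features,
                              const std::string& uberSource)
{
    std::string out;
    for (const std::string& f : features)
        out += "#define " + f + "\n";
    return out + uberSource;
}

// Example uber-shader body (illustrative GLSL-style pseudocode
// stored as a string; only enabled branches survive preprocessing):
const char* kUberSource =
    "vec3 color = baseColor;\n"
    "#ifdef USE_NORMAL_MAP\n"
    "color *= sampleNormalLighting();\n"
    "#endif\n"
    "#ifdef USE_EMISSIVE\n"
    "color += emissive;\n"
    "#endif\n";
```

Calling `buildShaderSource({"USE_NORMAL_MAP"}, kUberSource)` yields a source where only the normal-map branch survives preprocessing; each distinct feature set becomes one compiled program.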

Thanks for the answers. Hodgman said what I thought, but I just wanted to be sure.

Is there any open-source project that arranges shaders the way you guys suggested?

You could have a look at the Doom 3 source code; it's open source and rather neat.

You should avoid having too many GPU programs (commonly referred to as shaders, though that term isn't really accurate anymore): they cost memory, switching between them is still not free (though it gets better every hardware generation), and compiling them costs time too.

If you decide to do the most logical thing, that is, physically based rendering (unless you want a different graphical style, of course), you should end up with just a few GPU programs.

[And deferred shading/lighting/whatever ;)]

-* So many things to do, so little time to spend. *-

You should avoid having too many GPU programs

Yeah, but doesn't that mean skimping on visual effects?

If you decide to do the most logical thing, that is, physically based rendering (unless you want a different graphical style, of course), you should end up with just a few GPU programs.

But PBR can be combined with neat special (particle?) effects as well, no?

You should avoid having too many GPU programs

Yeah, but doesn't that mean skimping on visual effects?


It depends. If it comes down to it, yes.
But a couple of heuristics can help you a long way.
For example, only compile/load the permutations that are actually used; often some permutations are never used or don't make sense.
Also try to group objects that use the same program, so the cost of switching to that program is only paid once.
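Grouping usually just means sorting the frame's draw calls by program before submission, so the renderer binds each program once per group instead of once per object. A hypothetical sketch (the struct fields are invented for illustration):

```cpp
#include <algorithm>
#include <vector>

// Hypothetical draw-call record; a real renderer would also carry
// mesh handles, textures, uniforms, and so on.
struct DrawCall {
    int programId;  // which GPU program this object uses
    int objectId;
};

// Sort so all draws sharing a program are adjacent; the renderer
// then binds each program once and submits its whole group.
void sortByProgram(std::vector<DrawCall>& draws)
{
    std::stable_sort(draws.begin(), draws.end(),
                     [](const DrawCall& a, const DrawCall& b) {
                         return a.programId < b.programId;
                     });
}

// Count how many program switches the renderer would perform
// if it submitted the draws in this order.
int programSwitches(const std::vector<DrawCall>& draws)
{
    int switches = 0;
    for (std::size_t i = 1; i < draws.size(); ++i)
        if (draws[i].programId != draws[i - 1].programId)
            ++switches;
    return switches;
}
```

For example, the interleaved order {1, 2, 1, 2} costs three switches, while the sorted order {1, 1, 2, 2} costs one. Using `stable_sort` keeps submission order deterministic within each group.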

If you decide to do the most logical thing, that is, physically based rendering (unless you want a different graphical style, of course), you should end up with just a few GPU programs.

But PBR can be combined with neat special (particle?) effects as well, no?


of course

This topic is closed to new replies.
