Node-based materials and performance

3 comments, last by MJP 7 years, 1 month ago
I understand the general idea of node-based material systems and that each node may generate one or more lines of shader code. My question is with regards to how this generated code is managed.
If each material generates its own shader, then the rendering pipeline will likely perform many shader changes each frame. If state changes are expensive, then what techniques (if any) are usually used to help performance?
For example, do these engines try to combine the shader code of many materials into one monolithic shader (to minimize shader switching) and then use branches to execute specific logic? Or do they just take the hit and generate a shader per material. Or maybe something else?
How do they manage this complexity if they have separate shaders for models, sprites, particles, etc.? I assume each kind of shader is marked where the generated code should be inserted.
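
For concreteness, here is a minimal sketch of the kind of setup being asked about (all type and marker names are hypothetical, not from any particular engine): each node emits one or more lines of shader code, and the result is spliced into a per-domain template at a marked insertion point.

```cpp
// Minimal sketch: a node graph emitting shader code that is spliced into a
// per-domain template (model, sprite, particle, ...). Names are made up.
#include <string>

struct Node {
    virtual ~Node() = default;
    // Appends one or more lines of shader code and returns the name of the
    // variable holding this node's result.
    virtual std::string Emit(std::string& code) const = 0;
};

struct TextureSampleNode : Node {
    std::string texture, uv;
    std::string Emit(std::string& code) const override {
        std::string var = "v_" + texture;
        code += "float4 " + var + " = " + texture + ".Sample(sLinear, " + uv + ");\n";
        return var;
    }
};

struct MultiplyNode : Node {
    const Node* a; const Node* b;
    std::string Emit(std::string& code) const override {
        std::string va = a->Emit(code), vb = b->Emit(code);
        std::string var = va + "_mul_" + vb;
        code += "float4 " + var + " = " + va + " * " + vb + ";\n";
        return var;
    }
};

// The per-domain template is assumed to contain the marker below exactly once;
// the generated material code is inserted in its place.
std::string BuildShader(const std::string& domainTemplate, const Node& root) {
    std::string generated;
    std::string result = root.Emit(generated);
    generated += "float4 materialOutput = " + result + ";\n";
    std::string out = domainTemplate;
    out.replace(out.find("// %MATERIAL_CODE%"), 18, generated);
    return out;
}
```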

When you open a big Unreal project you can see something like "Compiling Shaders (4,197)", and that takes a very long time.

Node-based materials are very powerful because you can easily create all kinds of shaders with math nodes and the like, but the cost is that they are not as optimized as hand-written shaders.

Material shaders will still be sorted to avoid switching shaders as much as possible, but you will most likely switch a lot more than with hand-written shaders.
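
To illustrate the sorting idea, here is a minimal sketch (types and renderer calls are made up) of ordering draw calls by shader so each shader is bound once per run rather than once per draw:

```cpp
// Minimal sketch: sort draws by shader, then material, so state changes
// happen only when the key actually changes. Renderer calls are hypothetical.
#include <algorithm>
#include <cstdint>
#include <vector>

void BindShader(uint32_t shaderId);     // hypothetical renderer call
void BindMaterial(uint32_t materialId); // hypothetical renderer call
void DrawMesh(uint32_t meshId);         // hypothetical renderer call

struct DrawCall {
    uint32_t shaderId;    // generated or hand-written shader
    uint32_t materialId;  // constants/textures only, no shader change needed
    uint32_t meshId;
};

void SubmitSorted(std::vector<DrawCall>& draws) {
    std::sort(draws.begin(), draws.end(), [](const DrawCall& a, const DrawCall& b) {
        if (a.shaderId != b.shaderId) return a.shaderId < b.shaderId;
        return a.materialId < b.materialId;
    });

    uint32_t boundShader = UINT32_MAX;
    for (const DrawCall& d : draws) {
        if (d.shaderId != boundShader) {
            BindShader(d.shaderId);   // only rebind when the shader changes
            boundShader = d.shaderId;
        }
        BindMaterial(d.materialId);
        DrawMesh(d.meshId);
    }
}
```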

An alternative to node-based materials is a property-based material system with layers.

I considered making a node-based material system, but figured most shaders would be more or less the same anyway - at least for the visuals I try to achieve.

When going for "Realistic Graphics", possibly using PBR (Physically Based Rendering), and maybe doing a Deferred Rendering pipeline, the number of "special cases" isn't that high, because all materials take the same sort of inputs, apply the same sort of tricks, and go through the same sort of lighting (hence, physically based) stages.

At least, that goes for the majority of solid geometry, like your walls, terrains, furniture, and so on. Semi-translucent matter like tissue or plants may need some extra tricks.

What I did is make "Uber Shaders": basically one or a few big all-scenario shaders with a lot of #ifdef's inside them. The artist then toggles (read: #define) options, like NormalMapping on/off, or "Use layer2 diffuse texture". Most important are a small bunch of standard sliders, like the Smoothness/Roughness of a material, the Fresnel, or whether it's a Metal or Non-Metal surface. This works out pretty well for most stuff, and the artist has almost nothing to do, other than drawing kick-ass textures of course.

Depending on the enabled options, the actual shader gets extracted from the Uber-Shader. Before doing so, I check whether it was already made by some other material using the same options. The actual number of unique shaders stays pretty low, making it suitable for sorting.
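
As a rough illustration of that kind of permutation reuse (the option bits, cache, and Compile() entry point below are all made up), the enabled options can be packed into a key and looked up before compiling:

```cpp
// Minimal sketch: cache uber-shader permutations by their option bits so
// materials sharing the same option set reuse one compiled shader.
#include <cstdint>
#include <memory>
#include <string>
#include <unordered_map>

enum MaterialOption : uint32_t {
    OPT_NORMAL_MAPPING = 1u << 0,
    OPT_LAYER2_DIFFUSE = 1u << 1,
    OPT_METAL          = 1u << 2,
    // ...one bit per #ifdef toggle in the uber-shader
};

struct CompiledShader; // opaque handle to compiled bytecode

// Hypothetical: turns the option bits into #define lines prepended to the
// uber-shader source and invokes the platform shader compiler.
std::shared_ptr<CompiledShader> Compile(const std::string& uberSource, uint32_t options);

class ShaderCache {
public:
    std::shared_ptr<CompiledShader> Get(const std::string& uberSource, uint32_t options) {
        auto it = cache_.find(options);
        if (it != cache_.end())
            return it->second;                 // another material already used this permutation
        auto shader = Compile(uberSource, options);
        cache_.emplace(options, shader);
        return shader;
    }
private:
    std::unordered_map<uint32_t, std::shared_ptr<CompiledShader>> cache_;
};
```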

The more special cases are typically alpha-blended materials, like glass, jelly cake, or particles. Sure, a node system would be nice for those, especially from an artist's point of view. Then again, they make up a small fraction in my case, simply toggling techniques on/off still works here, and it's a hell of a lot faster than building your shader with nodes from the ground up. But of course, the artist is bound to whatever options the programmer delivers, which can be limiting.

I think node systems come from times when PBR and such weren't standard yet, so special hacks had to be made to simulate different materials, such as metal or wet pavement. GPUs are strong enough these days to treat them all the same (albeit with some overhead), running the same code over them.

As said above, each node graph can produce a unique shader. It's pretty common to do a pass afterward over all of the final shaders to remove duplicates (i.e. graphs whose only differences are input constants/textures) and compile only the unique bytecode streams. From there, the renderer doesn't handle them any differently from hand-written shaders.
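
A minimal sketch of that deduplication step might look like the following (names are hypothetical): once constants and textures become uniform inputs, graphs that differ only in those inputs produce identical source, so hashing the generated code lets many materials share one compiled shader.

```cpp
// Minimal sketch: deduplicate generated shaders by hashing their source and
// compiling only code that hasn't been seen yet. Names are made up.
#include <functional>
#include <string>
#include <unordered_map>

struct Bytecode;                                   // opaque compiled shader blob
Bytecode* CompileToBytecode(const std::string&);   // hypothetical compiler call

class GeneratedShaderDeduper {
public:
    Bytecode* GetOrCompile(const std::string& generatedSource) {
        size_t key = std::hash<std::string>{}(generatedSource);
        auto it = compiled_.find(key);
        if (it != compiled_.end())
            return it->second;                     // identical code already compiled
        // A real implementation would also compare the sources on hash collision.
        Bytecode* bc = CompileToBytecode(generatedSource);
        compiled_.emplace(key, bc);
        return bc;
    }
private:
    std::unordered_map<size_t, Bytecode*> compiled_;
};
```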

I'd argue slightly against spek's point, as most artists I've worked with use shader graphs specifically with PBR. It's not about having a node graph that modifies the entire shader (the lighting/shadow math typically being hand-optimized), but about making tweaks such as how textures are combined or how texture coordinates are generated, without being blocked waiting on a programmer to implement features. Worse yet is when both the programmer's and the artist's time gets wasted as the artist sits with the programmer to constantly tweak and refine hand-written shaders. But YMMV - if you are doing a solo/small project this isn't typically a major concern. When you have 20 artists to 2 graphics engineers it's a different matter of priorities.

I've been amazed by some of the creativity non-technical artists have produced with simple graphs. On the flip side, I've also been horrified by the same creativity when it comes down to having to optimize these graphs without breaking the end result.

But if you do go down the node-system route, it doesn't have to be completely low level - nodes can be pretty high-level functions, with the graph just dictating order & input/output flow. And once you have an awesome node-graph tool, a lot of the engine can be abstracted this way - render target flow & post-processing, particle effects, even gameplay logic.

My experience is similar to Digitalfragment's: even if you have a fairly standardized set of inputs into your lighting calculations, there can potentially be *many* ways to generate those inputs. In our engine, most of these permutations revolve around our concept of "layer blending", which is essentially the process of combining parameters from multiple sub-materials into a final result. This happens based on maps, vertex color channels, projected decals, normal orientation, etc. Good technical artists can do fairly clever things in this area if you give them a node graph: for instance, they might blend in snow based on the Y component of the vertex normal so that it only shows up on the top of meshes.
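
As a rough illustration of that snow example (plain C++ standing in for generated shader code, all names made up), the blend weight can be derived from the Y component of the vertex normal:

```cpp
// Rough illustration: blend a snow layer in by the Y component of the
// world-space vertex normal, so snow only appears on upward-facing surfaces.
#include <algorithm>

struct Color { float r, g, b; };

static float Saturate(float x) { return std::clamp(x, 0.0f, 1.0f); }

static Color Lerp(const Color& a, const Color& b, float t) {
    return { a.r + (b.r - a.r) * t,
             a.g + (b.g - a.g) * t,
             a.b + (b.b - a.b) * t };
}

// normalY is the Y component of the normalized world-space vertex normal.
Color BlendSnow(const Color& baseAlbedo, const Color& snowAlbedo,
                float normalY, float snowStart = 0.6f, float snowFull = 0.9f) {
    // Remap the up-facing range [snowStart, snowFull] to a 0..1 blend weight.
    float w = Saturate((normalY - snowStart) / (snowFull - snowStart));
    return Lerp(baseAlbedo, snowAlbedo, w);
}
```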

Some engines (like id's engine) side-step this issue by having unique virtual textures for all level geometry. This lets them pre-compute all of the blending into the unique textures, which makes their shaders simpler since their inputs are more uniform.

