Designing a WYSIWYG material composer

2 comments, last by MJP 13 years, 2 months ago
So I've opted to create a materials system for my project's engine that unifies a procedural texture generation engine and GPU shader programs in a Quake 3 texture-shader-style material scripting language.

Now, the reason for this is that I think it's important to try to simplify the process and the system for artists so that they aren't dealing with textures and texture generation procedures as one set of data, then a set of GPU shaders as another, and combining them in yet another set of material scripts where everything comes together. It just seems unnecessarily tedious and confusing for artists and designers to have to deal with when their job is to be creative.

I have only recently started taking advantage of all the neat functionality GPUs have to offer, and I am trying to design a tool for composing materials that allows an artist to design a material without ever needing to see a line of script code or shader code. I would like to modularize the fragment/vertex shader programs in a way that lets the artist simply check off the effects they want their material to exhibit, with the composer tool automatically generating the corresponding vert/frag shaders to send to the GPU at runtime for that material. I hear talk of an 'ubershader' concept, but I'm not entirely sure that's the way I want to go, as I feel it would be more sensible to build up a shader out of pieces or 'modules' that fit together, rather than starting with a fully-fledged shader with all the bells and whistles and removing the portions that won't be needed.
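
To make the modular idea more concrete, here's a rough sketch of the kind of thing I have in mind (the ShaderModule type and function names are just made up for illustration, not from any existing engine): each checked-off effect would contribute its declarations and a chunk of main(), and the composer would simply concatenate them into a full fragment shader.

```cpp
// Minimal sketch of composing a fragment shader from feature "modules".
// ShaderModule and ComposeFragmentShader are hypothetical names.
#include <string>
#include <vector>

struct ShaderModule {
    std::string declarations;  // uniforms/samplers the module needs
    std::string body;          // code appended inside main(), operates on 'color'
};

std::string ComposeFragmentShader(const std::vector<ShaderModule>& modules)
{
    std::string src = "#version 120\nvarying vec2 vUV;\n";
    for (const ShaderModule& m : modules)
        src += m.declarations;
    src += "void main() {\n    vec4 color = vec4(1.0);\n";
    for (const ShaderModule& m : modules)
        src += m.body;
    src += "    gl_FragColor = color;\n}\n";
    return src;
}
```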

Does anyone know of any examples or projects that I could look at to get some ideas? I've decided on a few things after checking out some procedural texture editing programs like werkkzeug and neotexturedit, but I'm especially adamant about making all the GPU shader stuff totally invisible to the artists/designers.


You should take a look at:
- Blender
- MaPZone
- UDK

"I think it's important to try to simplify the process and the system for artists so that they aren't dealing with textures and texture generation procedures as one set of data, then a set of GPU shaders as another, and combining them in yet another set of material scripts where everything comes together. It just seems unnecessarily tedious and confusing for artists ..."
This really shouldn't be how the material workflow is structured. Shaders are some of the most performance-sensitive code in a game, and should not come anywhere near the direct control of artists. Generally that job should be given to someone with a deep understanding of both the hardware and the lighting models/equations. That person should be given art direction as to the types of materials required, and compromises should be found between artistic quality and programming requirements.
A decent way to expose these shaders to artists is simply a drop-down box to select the material/lighting model, and then a set of check-boxes for features.

e.g. most games have one "standard" material/lighting model, such as Blinn/Phong, which would have options such as normal mapping, gloss masks, roughness, displacement maps, etc...
You'd then have specialised models, such as an anisotropic model for hair or metal, etc...

This keeps the material editor simple for artists (they just choose a lighting model and the options they want), programmers stay in control of performance, and new material features can be created through collaboration between disciplines.
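
As a very rough sketch of what that artist-facing data could look like (type and flag names are invented here, not from any particular engine): the editor writes out a small description per material, and the engine maps the (model, features) pair onto a shader permutation that the programmers maintain.

```cpp
// Hypothetical material description filled out by the editor UI.
#include <cstdint>

enum class LightingModel { BlinnPhong, AnisotropicHair, AnisotropicMetal };

enum MaterialFeature : uint32_t {
    Feature_NormalMap    = 1u << 0,
    Feature_GlossMask    = 1u << 1,
    Feature_Displacement = 1u << 2,
};

struct MaterialDesc {
    LightingModel model    = LightingModel::BlinnPhong;  // the drop-down box
    uint32_t      features = 0;                          // OR'd check-box flags
};
```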

[edit] Christer Ericson sums up this viewpoint better than I do.
Yeah, I agree with Hodgman. It would be nice to conceptualize shaders as being composed of little independent blocks of functionality that you can just string together into a full material, since artists are used to such an interface in DCC tools. And some engines (like UE3) have certainly presented their shaders/materials that way. However, once you want to start actually optimizing your shaders, you're likely to find that your little bits of functionality aren't nearly as independent as you'd like them to be. At that point it becomes increasingly impractical to work optimizations into the little chunks that you connect together, since they start to need knowledge of the other functionality. Hence why a lot of people go with the "ubershader" approach, where you write out one big shader and use #ifdefs to control which functionality is active.
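
Just to illustrate the shape of that (the shader text, define names, and helpers like perturbNormal/shade below are placeholders, not anything from a real engine): the big shader is wrapped in #ifdef blocks, and the host code prepends a set of #defines per material before compiling, so each unique define set becomes one permutation.

```cpp
// Placeholder sketch of an #ifdef-driven ubershader plus host-side permutation setup.
#include <string>

static const char* kUberFragment = R"(
uniform sampler2D diffuseMap;
#ifdef USE_NORMAL_MAP
uniform sampler2D normalMap;
#endif
varying vec3 vNormal;
varying vec2 vUV;

void main() {
    vec3 n = normalize(vNormal);
#ifdef USE_NORMAL_MAP
    n = perturbNormal(n, texture2D(normalMap, vUV));
#endif
    gl_FragColor = shade(n, texture2D(diffuseMap, vUV));
}
)";

// Host side: prepend only the defines this material needs, then hand the
// resulting source to the shader compiler.
std::string BuildPermutation(bool useNormalMap)
{
    std::string defines;
    if (useNormalMap)
        defines += "#define USE_NORMAL_MAP\n";
    return defines + kUberFragment;
}
```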

