sBibi

material system / JIT shader generator


mmh, first, I'm mainly posting this to share, get some advice from people who have considered/implemented such a system, get some feedback, and start a discussion on the matter. Like lots of other people on these boards, I've got a dll-based material system based on Yann's, from jamessharpe's original material/shader implementation thread. Such a system is good, but not as good as it could be: if you want specialized shaders optimized for specific hardware, high user configurability, and a wide shader/surface diversity, you'll have to write quite a large number of shaders. It also isn't very artist-friendly... About shader bouncing, Yann described the "holy grail", aka meta bouncing. I got that working quite well, but it doesn't solve everything, far from it...

Around March/April I started thinking about another possible approach that would solve everything I could think of: very artist-friendly (able to directly export the materials from maya/3ds/whatever package without using a different plugin), very modular and flexible, highly scalable to new hardware, with no hardcoded shaders. This was based on dynamic, on-the-fly shader generation/compilation, plus shader setup code generation/JIT assembly. I paused the whole thing until a few days ago.

Note that what I'm going to describe below is one of two main categories in the materials: the one that's only for ingame surfaces that interact with light, not for postprocessing effects (all of those fall into the second category).

The basic concept originates from meta bouncing, where the goal was to find a meta language that described the appearance of a surface in an abstract way, with different variable parameters, each parameter having a weight telling how much its removal will affect the final surface appearance (lower weight, lower importance in the final appearance). Ideally, we would want to represent a surface with its BSSRDF/BRDF.
Representing all ingame surfaces that way would be quite hardcore, and not very lightweight. So instead of considering arbitrary BRDFs, we can consider specific BRDFs: the ones present in modelers' material editors... things like general Blinn/Phong/anisotropic shaders, or Cook-Torrance for microfacet BRDF simulation. All of these have parametric formulas, so we can make a general lighting equation for each of these BRDF types that takes into account all the possible parameters. We just need to add weights to these params, and we've got the base we need. The materials themselves are described by these parameters and their respective weights. Each parameter can be either a simple color/value or a sampler (this allows every parameter to vary per-texel or per-subtexel on a given rendered mesh, so you can have a huge variety of phong shaders, from the basic phong shader with a fixed exponent of 4, fixed ambient and so on, to a supposedly complex one with a gloss map, a specular exponent map, a bump map, etc...).

The material system core is based on a plugin system (deja vu? ;)), but this time the plugin system isn't about effects, it's about B(SS)RDF types, each plugin dll containing the shader generation code (I haven't defined any kind of interface for my system yet, it's still in the design phase; I'll start working on it seriously when I have a bit more time than now...). With these plugins, it's very simple to add a new material type. Want to add Cook-Torrance BRDFs? No problem, just a register command to call from the engine's autoexec script, or even at runtime, with the dll path as argument, and you're done: the material system now recognizes Cook-Torrance surfaces.

So when the material system receives a query for a new material, it first looks up whether it's already cached (standard stuff); if it isn't, it looks through the loaded surface plugins and throws the material script at the appropriate plugin.
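A minimal sketch of that registry-and-cache lookup, assuming a C++ engine (all names here, like IBrdfPlugin and MaterialSystem, are illustrative, not from the actual system, and a real implementation would load the plugins from dlls rather than register C++ objects directly):

```cpp
#include <cassert>
#include <map>
#include <memory>
#include <string>
#include <utility>

// One plugin per B(SS)RDF type; it parses a material script and emits
// shader meta-code (stubbed here as a plain string).
struct IBrdfPlugin {
    virtual ~IBrdfPlugin() = default;
    virtual std::string generateShader(const std::string& script) const = 0;
};

struct PhongPlugin : IBrdfPlugin {
    std::string generateShader(const std::string& script) const override {
        return "phong-meta(" + script + ")";
    }
};

class MaterialSystem {
    std::map<std::string, std::unique_ptr<IBrdfPlugin>> plugins_;
    std::map<std::string, std::string> cache_;  // material script -> generated shader
public:
    // e.g. called from the engine's autoexec script, or at runtime
    void registerPlugin(const std::string& type, std::unique_ptr<IBrdfPlugin> p) {
        plugins_[type] = std::move(p);
    }
    // cache hit first; otherwise hand the script to the matching plugin
    const std::string& getMaterial(const std::string& type, const std::string& script) {
        auto it = cache_.find(script);
        if (it != cache_.end()) return it->second;  // already generated
        return cache_[script] = plugins_.at(type)->generateShader(script);
    }
};
```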
After the plugin has parsed the surface script, it has, for each configurable parameter, its value (texture / fixed value / engine internal var like a time-based var / whatever). Now the shader generator sees which parameters can be neglected: either they're 0 and simplify out of the final lighting equation, or it decides their weight is too low, depending on the hardware caps, on the user settings, and so on... The generator is left with a simplified version of the equation.

The beautiful thing here is that, depending on the hardware and on the number of lights (on non-looping hardware, this is really useful), it can generate multiple shaders for multipass light rendering, and if the hardware is good enough, it can decide to pack multiple passes into one shader (finding the best heuristics here is probably the hardest part of the whole system). This lighting equation should also take shadow maps into account (I've only thought about shadow maps for shadowing, as I don't plan to use shadow volumes anyway; I don't know how this would fit with shadow volumes...).

For example, let's say we've got a parallax/ppx displacement-mapped, shadowed phong surface, with cubemap reflections and refractions. The surface can get lit by up to 5 lights (this is either user-defined, or there can be no limit, and new shaders will be generated as more lights are needed at runtime. For 3.0 hardware, where variable-sized loops are available, there will be no regeneration, as the surface shader generator plugin will detect that a regeneration is useless anyway; and if there are no variable-sized loops, previously generated shaders can be reused for the other passes needed.) We want to render this surface. For each possible light count up to the maximum (or up to a threshold in case of an unbounded light count), for example 5, the generator will generate a specially optimized shader, in a meta-language form (it doesn't need to be human readable, just some form of bytecode).
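The "neglect low-weight parameters" step could be sketched like this (a hedged sketch, not the actual system: Param, cullParams and the threshold semantics are illustrative, with weight standing in for the artist-assigned importance):

```cpp
#include <cassert>
#include <string>
#include <vector>

// One configurable surface parameter after script parsing.
struct Param {
    std::string name;
    float value;   // 0 means the term simplifies out of the lighting equation
    float weight;  // importance of this term in the final appearance
};

// Keep only the parameters that are non-zero and important enough for
// the current hardware caps / user quality setting.
std::vector<Param> cullParams(const std::vector<Param>& in, float threshold) {
    std::vector<Param> out;
    for (const Param& p : in)
        if (p.value != 0.0f && p.weight >= threshold)
            out.push_back(p);
    return out;
}
```

Raising the threshold throws away more terms, which is exactly the shader-LOD mechanism mentioned later in the post.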
This intermediate meta representation allows shader reuse between materials and passes. Let's say that with the hardware we currently have, we can do at most 2 lights per pass with all the surface features, but no displacement bump mapping, only regular parallax mapping. We'll generate:

1 light: 1 pass: 1-light-shader
2 lights: 1 pass: 2-lights-shader
3 lights: 2 passes: 2-lights-shader (already generated in step 2) + 1-light-shader (already generated in step 1)
4 lights: 2 passes: 2-lights-shader (from step 2) + 2-lights-shader (from step 2)
5 lights: 3 passes: 2-lights-shader (from step 2) + 2-lights-shader (from step 2) + 1-light-shader (from step 1)

So in the end, we've generated two different shaders, and 5 shader pass combinations for this material and this hardware. Note that this would change if not all lights cast shadow maps: if only the first two lights cast shadows, we would have generated 4 different shaders (1-light, 1-shadowed-light, 2-shadowed-lights, 2-lights).

The system can also handle shader LODs, and automatically generate them by raising the garbage threshold used to test the param weights, so more parameters can be thrown away. Reflections/refractions and shadow maps, and all rendered maps in general, should not appear in the surface params; this should not be exposed to the artists (IMO; if you see a good reason to, please say so). Only the engine should decide whether or not to use shadows for a given light, or how to generate the reflection/refraction maps (for ex: cubemap or planar? the engine should decide), as this heavily depends on user settings and hardware.
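The pass combinations above amount to a simple greedy split of the light count by the per-pass limit; a minimal sketch (function name is illustrative):

```cpp
#include <cassert>
#include <vector>

// Split a light count into per-pass light counts, given the maximum
// number of lights the hardware can handle in a single pass. Each
// distinct per-pass count maps to one generated shader, so shaders are
// shared between passes (and between materials using the same counts).
std::vector<int> splitIntoPasses(int lightCount, int maxLightsPerPass) {
    std::vector<int> passes;
    while (lightCount > 0) {
        int n = lightCount < maxLightsPerPass ? lightCount : maxLightsPerPass;
        passes.push_back(n);
        lightCount -= n;
    }
    return passes;
}
```

With maxLightsPerPass = 2 this reproduces the table above: 5 lights become passes of 2 + 2 + 1, using only the two shaders generated for the 1-light and 2-light cases.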
As a final preprocessing pass, all the generated shaders will be translated to high-level code (or low-level; the plugin decides what it generates, though low-level would be much trickier to generate...), and thrown at a high/low-level compiler (CGC (the Cg compiler) for example, or an HLSL compiler (I don't know how it works with DirectX, but I suppose that's feasible ;)), or GLSL, or directly the asm compiler...).

The shader connectors that link the shaders to the renderer (all the functions that fell into the dlls in Yann's material/shader system) heavily depend on which parameters have been kept, so they should also be generated and assembled on the fly. I wrote a JIT x86 assembler in my engine to compile engine scripts, and later to generate these connectors. I discovered SoftWire this summer, and although it is around 53 times slower, it still seems fast enough to compile a good amount of shader connectors at load time (they wouldn't be that long anyway... just a bunch of function calls to change states and bind stuff). And even if it were too slow, the material system could save all the generated code to disk, so it only has to load it, without any regeneration, on subsequent runs (it would have to store a hardware caps hash in each file, though, to detect when regeneration is needed because the hardware changed). Anyway, it'll probably be fast enough: in some tests with my assembler, a load from disk + relink actually took longer than a full rebuild ;)

Until now, this aspect of the material system only deals with per-pixel shading. What if there is a skinned/morphed/cloth-simmed/otherwise deformed mesh? (mmh, yes, forgot to say: this system also allows generating per-pixel or per-vertex lighting shaders, or balancing calculations between the pixel and vertex shaders.) I'll have to add another concept to this material system: a material isn't only a surface appearance, it also has physics properties and acoustic properties.
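The hardware-caps hash check for the on-disk cache could look something like this (a sketch under assumptions: FNV-1a is an arbitrary choice of hash, and CachedShader/hashCaps are illustrative names, not the actual system's):

```cpp
#include <cassert>
#include <cstdint>
#include <string>

// Illustrative 64-bit FNV-1a hash over a hardware-caps description string.
uint64_t hashCaps(const std::string& caps) {
    uint64_t h = 1469598103934665603ull;
    for (unsigned char c : caps) { h ^= c; h *= 1099511628211ull; }
    return h;
}

// One cached blob on disk: the caps hash it was generated for, plus the
// generated shader/connector code.
struct CachedShader {
    uint64_t capsHash;
    std::string bytecode;
};

// Reuse the cached shader only if it was built for the same hardware;
// otherwise fall back to a full regeneration.
bool cacheIsValid(const CachedShader& c, const std::string& currentCaps) {
    return c.capsHash == hashCaps(currentCaps);
}
```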
That's why in our current system we had physics and sound shaders, both dll-based like the graphics shaders. Well, now they can be generated too, and although the sound shaders are a bit special and don't have to cohabit with the vertex/pixel shader generation process, the physics shaders do. Consider a skinned character that has a skin shader, with some computations done on the vertex side and other computations on the pixel side. This is a case where there will be active cooperation between physics and graphics shaders (by active, I mean the graphics shaders won't only do parameter passing on the vertex side). You can generate shaders for whatever bone weight count you want (in my case, the engine can optimize skinned meshes to lower the max vertex bone influences, which helps with LOD in addition to skeleton simplification, and the weight counts will determine which shader is used); same as with light counts, you can generate shaders that handle bone weights from 1 to, for ex, 6.

mmh, this is starting to get quite long :| I still had things to say, but I'm probably going to discourage anyone from reading this post if it gets too long. This system is really tricky to set up, and I feel I'm going to have a hard time implementing it, so I'm open to any comments and crits. It's _very_ far from good yet; I've already got some self-crits that I hope to solve in the next days, the hard thing being that it must interact with lots of engine parts, and thus needs a good abstract interface not to get too messy... Also, the intermediate shader representation is still quite blurry in my mind atm (it has to be easily mergeable between physics and graphics shaders).
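Selecting among the pre-generated skinning variants by bone-influence count might look like this (sketch; names and the 1..6 variant set are illustrative, following the example counts in the post):

```cpp
#include <cassert>
#include <vector>

// Given a mesh's max vertex bone influences and the list of influence
// counts for which shaders were already generated, pick the cheapest
// variant that still covers the mesh. A mesh simplified to fewer
// influences (for LOD) automatically selects a cheaper shader.
int selectSkinningVariant(int meshMaxInfluences,
                          const std::vector<int>& generatedCounts) {
    int best = -1;
    for (int c : generatedCounts)
        if (c >= meshMaxInfluences && (best == -1 || c < best))
            best = c;
    return best;  // -1 means a new shader must be generated
}
```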
But I'm quite excited about this material approach: it kind of provides a more or less unified rendering/deforming equation for the whole engine, shaders would be JIT-generated and none would have to be pre-made (I don't feel like making 1000 different shaders optimized for each light/bone/hw/quality configuration, and this could generate them as needed), and exporting artists' materials would be really easy (except for the parameter weights part; I don't really know where the artists could say how important a feature is O_o, perhaps a material editor would be needed... dunno...). Anyway... sorry for this chunk of text, I hope it's readable to some extent, and I apologize for the English mistakes (not my first language, and I need a nap right now, so if you'll forgive me... ;)). Thanks for taking the time to read this post.

argh... if I hadn't written that, I wouldn't find the courage to read it... should have split it up.... :|

oh, btw...
this can also generate fixed function pipeline stuff (with the x86 JIT assembler), it's not limited to hardware shaders...

Excellent ideas there. Reminds me of swShader (somewhat).

I like the idea of different material properties. You can then create your levels just like Lego blocks :-)

Although your idea is not new. Let me find a link... I'll be back

thanks for the link ffs, I wasn't aware of that.
well, I knew the same kind of concepts had been around for some time (UE3 apparently has something quite similar), although their system isn't quite like the one I described up there. From what I've seen of their material editor (which looks a lot like UE3's material editor btw, although theirs (offsetsoft's) looks cooler IMO: http://www.unrealtechnology.com/screens/MaterialEditor.jpg // http://www.offsetsoftware.com/images/shaderbuilder.jpg), they give control over technical details that should depend on hardware caps / user-specific stuff, like choosing between simple bump mapping / parallax bump mapping (which they call virtual displacement mapping, but from what I saw in the swf it seems to be parallax mapping, there is no edge distortion visible) or virtual displacement mapping.
It's not quite the same concept, but you're right, there are similarities, and the material editor idea is much the same :)
interesting link anyway, thanks!

Hmmm, how can you guys tell that UE3 uses a JIT shader compiler similar to what sBibi described above? I can't find any clues to that on their page. To me, the tool in their screenshot looks more like a tool for artists to create HLSL/Cg/GLSL/... shaders, which doesn't have anything to do with that kind of material system.

I'm not sure they use the same kind of system at all, but at least some of those ideas are in their material system too, looking at their material editor (OK, there's just one shot ;)), and judging from this quote on the UE3 technology page:
(under "visual features")
Quote:
The material framework is modular, so programmers can add not just new shader programs, but shader components which artists can connect with other components on-the-fly, resulting in dynamic composition and compilation of shader code.

but you're right Dtag, perhaps I misinterpreted that, and maybe it just means they generate shader code on the fly in the editor, then save the shaders as standard shaders, and don't generate them on the fly in the engine.

btw, looking at this page just made me think of another small point I forgot to mention in the first post (it's almost implied, but well, I'll mention it anyway): of course, this system also allows for fully uniform HDR lighting over the whole lighting pipeline, or 64-bit floating point colors. No need to have an HDR and a non-HDR version of the shaders for lower-end hardware.

This sounds quite useful; however, it seems quite limited by the JIT. For starters, it would have to be continuously updated to support new hardware and shader technologies. Also, you'll need to program some sort of meta-generation routine for every different type of shader you want to use (lighting, other effects, etc.), which is a bit limiting (correct me if I'm wrong). An arbitrary effect could be anything, so generating the shaders for it on the fly without a predisposed "use light shader generation for this effect" is, well, very hard, if not impossible.

sBibi: Yeah, that's what it looked like to me too... What I am wondering, though, is whether this will also work for more complex things (formulas in the shader, etc.). It looked a little too easy to cover the whole variety of possible shaders.
