This is just a curiosity question, by the way.
A few weeks ago I was curious about how to render an object multiple times with different shaders to achieve different results. The only problem is that when I try to use additive blending, I get a black object. Now to the point: because of this, someone said to me:
"You could also just write a single shader that performs the appropriate computations for the diffuse term and the directional lighting term, and reads the color from the render target and modulates it by the lighting terms."
Now this is a good idea, but I wanted to give my engine the freedom to combine any shaders at render time (render 1, render 2, etc.), which would make things easier! But he is right...
So I thought: why not generate a shader, like this:
String Diff = "color = somecolor";
String Light = "...color += something...";
PixelShader += Diff;
PixelShader += Light;
WholeShader = Buffers + Layout + Vertex + Pixel;
Compile(WholeShader);
Now, could something like this be done, for the sake of flexibility, and is this how real material editors work (e.g. the UDK Material Editor)?