Designing A Material System

Driv3MeFar


So, at like 2am last night I started thinking about how I would design a material system for my may-happen-but-probably-won't renderer. The way I see it, materials should contain all the information needed to light an object. Since this is going to be a pretty lightweight renderer, materials should contain any combination of the following:
1) Ambient color
2) Diffuse color
3) Specular color and exponent
4) Texture
5) Normal map

So, materials can contain any combination of those five properties, which gives 2^5 = 32 possible combinations. It should be pretty trivial to write a basic lighting shader that handles all of them, and then it's just a matter of splitting it up into 32 different shaders to cover every material permutation.
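To make that concrete, here's a rough sketch (C++ against D3D9, since I'm thinking in FVF terms anyway) of what a material might look like. The names and the bitmask layout are just placeholders, nothing final:

#include <d3d9.h>
#include <d3dx9.h>

// Each bit marks one optional material property.
enum MaterialFlags
{
    MAT_AMBIENT    = 1 << 0,
    MAT_DIFFUSE    = 1 << 1,
    MAT_SPECULAR   = 1 << 2,
    MAT_TEXTURE    = 1 << 3,
    MAT_NORMAL_MAP = 1 << 4,
};

struct Material
{
    unsigned int       flags;            // any combination of the bits above (2^5 = 32)
    D3DXCOLOR          ambient;
    D3DXCOLOR          diffuse;
    D3DXCOLOR          specular;
    float              specularExponent;
    IDirect3DTexture9* texture;          // NULL unless MAT_TEXTURE is set
    IDirect3DTexture9* normalMap;        // NULL unless MAT_NORMAL_MAP is set
};

// The flags field already enumerates all 32 permutations, so it can double
// as an index into a table of 32 compiled shaders.
inline unsigned int ShaderPermutation(const Material& m) { return m.flags; }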

New materials can be created simply by specifying which of the properties will be used, and then the value for each of those properties. The renderer will contain a vector of all the materials being used.
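Something like this is what I have in mind for registering a material with the renderer (CreateMaterial and the member names are placeholders, not anything I've actually written):

#include <vector>

class Renderer
{
public:
    // Returns the index that geometry will store to refer to this material.
    size_t CreateMaterial(const Material& mat)
    {
        m_materials.push_back(mat);
        return m_materials.size() - 1;
    }

private:
    std::vector<Material> m_materials; // every material currently in use
};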

Geometry will contain all the relevant matrices (scale, rotation, and translation), an index into a vertex/index buffer, and an index into the vector of materials. All vertices will use the same FVF, containing XYZ coordinates, texture coordinates, and a normal.
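Roughly, the per-vertex and per-geometry data would look something like this (the field names and the exact FVF flags are just my guesses for now):

#include <d3d9.h>
#include <d3dx9.h>

struct Vertex
{
    D3DXVECTOR3 position; // XYZ
    D3DXVECTOR3 normal;
    float       u, v;     // texture coordinates
};

// Position, normal, and one set of 2D texture coordinates.
const DWORD VERTEX_FVF = D3DFVF_XYZ | D3DFVF_NORMAL | D3DFVF_TEX1;

struct Geometry
{
    D3DXMATRIX scale;
    D3DXMATRIX rotation;
    D3DXMATRIX translation;
    size_t     bufferIndex;   // which vertex/index buffer pair to use
    size_t     materialIndex; // index into the renderer's material vector
};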

The renderer will hold all the vertex and index buffers. If an object wants to be drawn in a particular frame, it sends a draw request to the renderer. The renderer will then add the piece of geometry to a render queue. There will be a separate queue for each material permutation; the renderer hashes the requesting geometry's material to determine which queue it belongs in. That makes it pretty trivial to batch geometry by shader (since the shader is determined by the material).
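Extending the Renderer sketch from above, the draw request and queueing could look something like this (Submit and the queue layout are placeholders):

class Renderer
{
public:
    // Called by an object that wants to be drawn this frame.
    void Submit(const Geometry& geom)
    {
        // The material's flag bits pick the queue, so everything in one
        // queue uses the same shader permutation and can be batched.
        unsigned int queue = m_materials[geom.materialIndex].flags;
        m_queues[queue].push_back(&geom);
    }

private:
    std::vector<Material>        m_materials;
    std::vector<const Geometry*> m_queues[32]; // one queue per permutation
};

Since the material flags are already a number from 0 to 31, the "hash" is really just the flags themselves.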

I'm hoping that this design also plays nice with a multithreaded architecture. At the beginning of each frame, all objects that want to be drawn send a request to the renderer. Then, the renderer will go about its business drawing all the objects in the various queues, while the objects themselves update. The rendering will be a frame behind, but as I understand it that's pretty normal. And when you're drawing 60 frames a second, one frame behind isn't that big a deal.
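Very roughly, a frame might look like this (World, DrawQueues, and friends are all hypothetical names, and the threading is just to illustrate the idea):

#include <thread>

void Frame(Renderer& renderer, World& world)
{
    // 1) Every visible object submits a draw request for this frame.
    world.SubmitDrawRequests(renderer);

    // 2) Render the queued geometry while the simulation advances, so the
    //    image on screen trails the simulation by one frame.
    std::thread renderThread([&renderer] { renderer.DrawQueues(); });
    world.Update();
    renderThread.join();

    renderer.ClearQueues();
}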

Special effects such as bloom, glow, and HDR can be done as post-processing effects, so I should be able to tack those on as special cases after the objects are lit and drawn to a texture. That's the idea, at least; I really have no idea how well this will work, but I'm (of course) hoping it will. What do you think? Will this work well, or is it fundamentally flawed in some way?
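The render-to-texture part would look roughly like this in D3D9 (ApplyBloom and ApplyToneMapping are hypothetical passes, each just a full-screen quad sampling the scene texture; tone mapping being the step that would handle the HDR part):

// Hypothetical full-screen post-processing passes (not implemented here).
void ApplyBloom(IDirect3DDevice9* device, IDirect3DTexture9* scene);
void ApplyToneMapping(IDirect3DDevice9* device, IDirect3DTexture9* scene);

void RenderFrame(IDirect3DDevice9* device, Renderer& renderer,
                 IDirect3DTexture9* sceneTexture, IDirect3DSurface9* backBuffer)
{
    // Draw every queue into an off-screen texture instead of the back buffer.
    IDirect3DSurface9* sceneSurface = NULL;
    sceneTexture->GetSurfaceLevel(0, &sceneSurface);
    device->SetRenderTarget(0, sceneSurface);
    renderer.DrawQueues();
    sceneSurface->Release();

    // Then run the post-processing passes into the back buffer.
    device->SetRenderTarget(0, backBuffer);
    ApplyBloom(device, sceneTexture);
    ApplyToneMapping(device, sceneTexture);
}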

Also, I'm still undecided as to what I'm really going to work on next. I may not even end up writing this renderer for a while. See my previous post for the options, and chime in on what you'd like to see me work on/write about.