Multiple Materials in a single object

Started by nuno_silva_pt
9 comments, last by Rompa 16 years, 3 months ago
Hello everyone. I've been looking into Unreal Engine 3's editor as my first attempt at understanding how map editors work (I've never looked at any other map editor), and I noticed that they can, for example, take a cube and place multiple materials on that same cube. My question is: is it possible to place multiple materials on a single geometry object without drawing each part that uses a different material separately? (Or can they draw the cube all at once instead of drawing each section separately when it has different materials?) If so, would anyone mind explaining at least a bit of the theory behind it? Thank you very much for your time, and have a nice day!
As far as I know, a material defines the way an object interacts with light sources; for example, in OpenGL it is defined with three values (ambient, specular, diffuse). I don't know for sure, but DirectX probably works the same way.

To me (I could be wrong), applying two or more materials to the same set of geometry doesn't make sense, because it implies that an object could behave in different ways when a light source is illuminating it.

I'm waiting for someone to tell me "You are wrong, and there's a cool application of this..." XD

Netsheik
@Netsheik: In this case, a material is a texture/effect combination, at least from what I saw in most map editors.
Every renderer I've ever worked with (which includes many major commercial products) cuts things up by material. There isn't any reasonable way to render multiple discrete materials simultaneously, without special shaders designed specifically for that purpose. (Array textures on some GPUs might let you do this to an extent, but I haven't heard of anyone doing that).
If you want a "multi layered material" you have to write a loooong shader! Rasterization-based lighting (rasterization is how our graphics cards render models) is limited by a few factors, and there are a few main lighting methods:
BRDF - Standard lighting, which uses a diffuse factor, specular factor, shininess factor and ambient factor. The whole calculation is based on "normal dot light" and "normal dot half", where the half vector lies halfway between the light and view directions (there are several ways to compute the specular term - e.g. Blinn-Phong uses the half vector, Phong uses the reflection vector). All of the factors must be known before the calculation!
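A minimal GLSL fragment-shader sketch of that kind of calculation (the uniform and varying names here are just illustrative, not from any particular engine):

// Blinn-Phong style lighting sketch
uniform vec3 lightDir;        // direction towards the light, normalized
uniform vec3 ambientColor;
uniform vec3 diffuseColor;
uniform vec3 specularColor;
uniform float shininess;

varying vec3 vNormal;         // interpolated surface normal
varying vec3 vViewDir;        // direction towards the camera

void main()
{
    vec3 N = normalize(vNormal);
    vec3 L = normalize(lightDir);
    vec3 V = normalize(vViewDir);
    vec3 H = normalize(L + V);            // half vector between light and view

    float NdotL = max(dot(N, L), 0.0);    // "normal dot light"
    float NdotH = max(dot(N, H), 0.0);    // "normal dot half"

    vec3 color = ambientColor
               + diffuseColor  * NdotL
               + specularColor * pow(NdotH, shininess);

    gl_FragColor = vec4(color, 1.0);
}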

SSS (Sub-Surface Scattering) - This is one of the most difficult lighting methods (as long as you use rasterization); it's used to simulate materials like human skin, ice, ... (materials with translucency - in reality, multi-layered surfaces). There are a few approaches - if you want to get something like this http://www.otte.cz/engine/data/Ch01Pic02.jpg (that image is based on realtime raytracing), you can use the method from the NVIDIA Human Head demo, where you combine several lit textures of the face with colour weights (the whole method is described in GPU Gems 3).

Ambient Occlusion - This method is based on how accessible each point of the model is to light. It's a more complicated lighting model. You can use full-screen buffers (normal and depth) to estimate light accessibility (this is called SSAO - Screen Space Ambient Occlusion), but SSAO isn't physically correct. To get really nice ambient occlusion you have to use disks or spheres and mathematically calculate the light transport between them, or use rays - send a ray in every direction, find the first hit and then calculate the light accessibility (the whole raytracing process is much longer, but that's the principle).

Radiosity - For me, the best method ever; it can use hemicubes, spherical harmonics, rays, or a purely mathematical formulation. I'm using it dynamically in realtime under OpenGL. Try searching the web for this method - there's plenty of info about it.

There are many more methods and you can combine them (this is probably what UE3's material combining means), and you can blend a model's textures using masks. Material combining is possible, but you have to write many looooong shaders, like BRDF.shd, BRDFAO.shd, BRDFSSS.shd, BRDFAOSSS.shd, ... - this is very probably how UE3 does it.

I apologize for the long post.

My current blog on programming, linux and stuff - http://gameprogrammerdiary.blogspot.com

I'll try to explain better:

In UE3, you create, for example, a cube, and then you can apply one texture to one corner, another texture to another corner, etc. I'd like to know if it's possible, and if so, how, to render the now multi-material object as a single piece of geometry instead of splitting it up by materials.

Thank you for your time, and have a nice day!
Yeah, that's possible with one tricky method! Using colour masks: you set green for the first corner, blue for the second corner and red for the third corner (a triangle :) - it's easier to explain). Then in a shader (which must be loooooong) you compute, for example, Glowing Material * red + Reflecting Material * blue + Refracting Material * green. This is one of the tricky methods, but only 4 material combinations are possible with a standard colour map; with HDR colour maps, many more (it'd be much more tricky, but not impossible).
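A minimal GLSL sketch of that colour-mask trick, assuming the per-vertex colour carries the mask and each "material" is represented by a plain texture here (the sampler names are made up for illustration):

// Blend up to three "materials" using the interpolated vertex colour as a mask.
uniform sampler2D materialRed;    // e.g. the glowing material
uniform sampler2D materialGreen;  // e.g. the refracting material
uniform sampler2D materialBlue;   // e.g. the reflecting material

varying vec4 vColor;     // per-vertex mask painted in the editor
varying vec2 vTexCoord;

void main()
{
    vec3 r = texture2D(materialRed,   vTexCoord).rgb;
    vec3 g = texture2D(materialGreen, vTexCoord).rgb;
    vec3 b = texture2D(materialBlue,  vTexCoord).rgb;

    // Each channel of the vertex colour weights one material.
    gl_FragColor = vec4(r * vColor.r + g * vColor.g + b * vColor.b, 1.0);
}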

My current blog on programming, linux and stuff - http://gameprogrammerdiary.blogspot.com

Quote:Original post by nuno_silva_pt
I'll try to explain better:

In UE3, you create, for example, a cube, and then you can apply one texture to one corner, another texture to another corner, etc. I'd like to know if it's possible, and if so, how, to render the now multi-material object as a single piece of geometry instead of splitting it up by materials.

Thank you for your time, and have a nice day!


Yeah, there are loads of ways to achieve that effect, all of which depend on exactly how you wish to use such a feature.

You could perform a preprocess to build a single texture with the appropriate blendings and every face of the cube all on one texture; then it's just a simple texture mapping operation. This is perfectly doable with the fixed-function pipeline, but obviously it doesn't combine effects, only diffuse textures.

Other ways require complex shaders...

You could write a shader that knows about all the materials you wish to blend (or takes the textures as input, if that's all they are) and uses some property of the vertices to blend them (Vilem Otte suggested using the RGBA values); you could introduce your own vertex attributes to get more than 4 material keys.

You could have a shader that uses textures as masks for each material instead of encoding the masks into the vertices; this gives you a wider range of possible ways to combine materials, since you have control at texel granularity instead of just at the vertices.
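A minimal GLSL sketch of the mask-texture variant, assuming one extra greyscale mask per additional material layer (the names are illustrative):

// Blend two material layers using a separate mask texture instead of vertex colours.
uniform sampler2D baseMaterial;
uniform sampler2D secondMaterial;
uniform sampler2D blendMask;      // greyscale mask, painted at texel resolution

varying vec2 vTexCoord;

void main()
{
    vec3 base   = texture2D(baseMaterial,   vTexCoord).rgb;
    vec3 second = texture2D(secondMaterial, vTexCoord).rgb;
    float mask  = texture2D(blendMask,      vTexCoord).r;

    gl_FragColor = vec4(mix(base, second, mask), 1.0);
}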

There's also a meta-shader technique where you have a library of simpler shader effects, you describe what overall effect(s) you want to apply to a single renderable object, and then (either offline or at runtime) these shader effects are automatically combined into one complex super shader.
The UE3 editor does something similar: it lets you dynamically construct shader trees, and these effects are all combined together at runtime - or at least that's the impression I get from this image. Each one of those boxes and/or links can be represented as a piece of shader code, and it all gets built into a single shader for the rendered image on the left.
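One common way to build that kind of super shader is to stitch the pieces together with preprocessor defines; a rough GLSL sketch of the idea (the feature names are made up):

// Super-shader sketch: the engine prepends lines like "#define USE_GLOW" or
// "#define USE_MASK_BLEND" per object before compiling this source.
uniform sampler2D baseMaterial;

#ifdef USE_MASK_BLEND
uniform sampler2D secondMaterial;
uniform sampler2D blendMask;
#endif

#ifdef USE_GLOW
uniform vec3 glowColor;
#endif

varying vec2 vTexCoord;

void main()
{
    vec3 color = texture2D(baseMaterial, vTexCoord).rgb;

#ifdef USE_MASK_BLEND
    vec3 second = texture2D(secondMaterial, vTexCoord).rgb;
    float mask  = texture2D(blendMask, vTexCoord).r;
    color = mix(color, second, mask);
#endif

#ifdef USE_GLOW
    color += glowColor;
#endif

    gl_FragColor = vec4(color, 1.0);
}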
I'll stand by my original response: Yes, it's possible. But it's not really worth the effort, so no one actually does it.

Edit: It's not worth the effort in the general case (e.g., you make a model of a building and use a few different textures for different parts). There are very specific cases where it is worth the effort (e.g., landscape renderers with various terrain layers), but in those cases the requirements are better defined up front. I strongly suspect that UE3 will just cut your object up by material - every other engine I've ever worked with does that.
Quote:You could write a shader that knows about all the materials you wish to blend (or takes the textures as input, if that's all they are) and uses some property of the vertices to blend them (Vilem Otte suggested using the RGBA values); you could introduce your own vertex attributes to get more than 4 material keys.

You could have a shader that uses textures as masks for each material instead of encoding the masks into the vertices; this gives you a wider range of possible ways to combine materials, since you have control at texel granularity instead of just at the vertices.

There's also a meta-shader technique where you have a library of simpler shader effects, you describe what overall effect(s) you want to apply to a single renderable object, and then (either offline or at runtime) these shader effects are automatically combined into one complex super shader.
The UE3 editor does something similar: it lets you dynamically construct shader trees, and these effects are all combined together at runtime - or at least that's the impression I get from this image. Each one of those boxes and/or links can be represented as a piece of shader code, and it all gets built into a single shader for the rendered image on the left.


Well, using RGBA values is pretty nice - you don't need to create super-shaders, just a "shader library" in one file (well, you may have it in more files, but I'm using one for the fragment shader and one for the vertex shader). There is a possible way to get more than 4 material keys, but it's a little tricky :D.
Using 32-bit HDR textures, you can encode 4 values into one channel of the HDR texture (four eight-bit values!) and then decode them, so the maximum with HDR textures is 16 different materials on one object. You can also send it as an attribute for every vertex, so you can get many more possible combinations.
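One common GLSL trick for splitting a single float channel into four 8-bit weights looks roughly like this - just a sketch of the general packing idea, not necessarily the exact scheme meant above:

// Decode four 8-bit weights that were packed into one float channel in [0,1).
vec4 decodeMaterialWeights(float packed)
{
    vec4 w = fract(packed * vec4(1.0, 255.0, 65025.0, 16581375.0));
    w -= w.yzww * vec4(1.0 / 255.0, 1.0 / 255.0, 1.0 / 255.0, 0.0);
    return w;
}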

Anyway, when OpenGL 3.0 Longs Peak (or Mount Evans) comes along, there'll be another possible solution - a shader library (well, in OpenGL). A new #include directive in the upgraded GLSL language will help with this kind of problem.

My current blog on programming, linux and stuff - http://gameprogrammerdiary.blogspot.com

This topic is closed to new replies.
