How are shaders connected with 3D materials?

I'm a beginner in computer graphics with no shader programming experience yet.
I've Googled a lot about shaders, and what I understand is that they provide custom rendering effects through the programmable pipeline's vertex and pixel stages.

But I'm a little confused about how materials are connected with shaders. When I work in the Unity3D game engine and create a material, it gives me the option to apply a shader to the material.
I observe the same thing in 3DS MAX: materials and shaders are somehow bound together. If shaders are the custom rendering effects, then what do materials do?

Since I have no shader programming experience, I don't understand how shaders use materials. Are 3D materials passed to shaders as parameters? And if materials are what gets passed, why can't we just pass textures? Why do we need to create materials from textures and then pass those?

Please help me clear up this confusion.
You could describe it this way: groups of materials use the same shader, but with different parameters (shininess, how much reflection, and so on).
For example, in 3D modeling tools you can often choose between different shading models: Blinn, Phong, Lambert and so on. Each uses a different kind of calculation (a shader) to compute the lighting from the parameters the material gives it.

So a material describes which shader to use and with which parameters.
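
A minimal sketch of that idea in C++ (hypothetical names, not the API of Unity, 3DS MAX, or any real engine): a material is little more than a reference to a shader program plus the parameter values that get fed into it.

    // Hypothetical engine-side types -- a sketch of the material/shader split.
    #include <map>
    #include <string>

    struct Shader {
        std::string vertexSource;  // GPU program part: how vertices are transformed
        std::string pixelSource;   // GPU program part: how each pixel's color is computed
    };

    struct Material {
        Shader*                      shader;         // which GPU program to run
        std::map<std::string, float> parameters;     // e.g. "shininess", "reflectivity"
        std::string                  diffuseTexture; // texture handle/path used by the shader
    };
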
You can think about it this way:

A material is usually a "game world" or "modeller world" description of the appearance of a surface (or object). It defines a set of physical parameters (texture(s), shininess, translucency and so on) for a certain surface.

A shader is a GPU program that implements a specific rendering technique.

Normally materials and shaders are somewhat correlated, as vastly different types of surfaces need different GPU programs for rendering. But if two materials differ only by texture image or color, they can share the same shader program and just change its rendering parameters.
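
For example (reusing the hypothetical Shader/Material sketch from the earlier reply), two materials can point at the same Blinn-Phong shader and differ only in the values they pass to it:

    // Two materials sharing one shader program, differing only in parameters.
    void buildExampleMaterials() {
        Shader blinnPhong;  // one compiled GPU program (sources omitted)

        Material shinyMetal;
        shinyMetal.shader = &blinnPhong;
        shinyMetal.parameters["shininess"] = 96.0f;
        shinyMetal.diffuseTexture = "metal_plate.png";

        Material dullPlastic;
        dullPlastic.shader = &blinnPhong;            // same GPU program
        dullPlastic.parameters["shininess"] = 8.0f;  // different parameter value
        dullPlastic.diffuseTexture = "plastic_red.png";
    }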

Also, one material may have more than one shader. For example, if a surface is both visible and casts a shadow, it may have to be rendered in two different stages: as a visible object, and as a depth map for shadow mapping. These normally use different shader programs, because calculating full color effects would be a huge waste in the depth-map rendering stage.
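
A rough sketch of that, again with made-up names: the material carries one shader per rendering stage, and the renderer picks whichever one the current pass needs.

    // One material, several shaders: a full-color shader for the visible pass
    // and a much cheaper one for the shadow-map (depth-only) pass.
    enum class Pass { Color, ShadowDepth };

    struct MultiPassMaterial {
        Shader* colorShader;  // full lighting and texturing
        Shader* depthShader;  // writes depth only, skips all color math

        Shader* shaderFor(Pass pass) const {
            return (pass == Pass::Color) ? colorShader : depthShader;
        }
    };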

Also, shaders do not only shade pixels, but also calculate vertex (triangle) positions (vertex and geometry shaders). Thus they cover somewhat more than just the visible material. For example, skeletally animated meshes may have a special vertex shader part that transforms their vertices together with the skeleton. A static mesh, even if it uses exactly the same physical material description, does not need that part (only a simple transformation).
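
A tiny illustration of that last point (hypothetical, engine-agnostic): the vertex-shader variant is chosen by mesh type, independently of the surface description the material provides.

    // Pick a vertex-shader variant by mesh type; the material's colors and
    // textures stay the same either way.
    struct Mesh { bool isSkinned; };

    Shader* pickVertexShader(const Mesh& mesh,
                             Shader* staticVariant,
                             Shader* skinnedVariant) {
        // Skinned meshes blend each vertex by its bone weights before the
        // usual world/view/projection transform; static meshes skip that.
        return mesh.isSkinned ? skinnedVariant : staticVariant;
    }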

Lauris Kaplinski

First technology demo of my game Shinya is out: http://lauris.kaplinski.com/shinya
Khayyam 3D - a freeware poser and scene builder application: http://khayyam.kaplinski.com/
Notice that the naming is somewhat inconsistent.
For APIs, a shader is a well-defined thing.
DCC tools use the word "material" for a concept similar to "shader", but because they have far more powerful capabilities, their "materials" are generally much more flexible than an "API shader". A material also typically includes the data associated with the shader (textures, colors, etc.).
Some engines in the past had "engine shaders" which encapsulated the "API shader" as well as engine-specific settings (such as physics or render-texture scripts); not surprisingly, those are often called "materials" as well.

The "shaders to apply to a material", are probably related to the "lighting model", as Danny02 pointed out. There are a few of them and they are basically used as "building blocks" for the underlying "API shader".

Key: understand what the word "shader" means in each context.

Previously "Krohm"

Thanks a lot ... it really helped

