Deferred shading material system

4 comments, last by nini 15 years, 9 months ago
Hi community, I have a deferred renderer working, but I'm wondering why others store a material ID in a volume texture (see the S.T.A.L.K.E.R. chapter in GPU Gems 2). Currently I store the basemap colour and specular power in one render target and the normals in another, reconstruct position in the lighting phase, and finally evaluate a Phong BRDF.

Is the material ID there to optimize the lighting phase (e.g. a fetch from that texture instead of computing Phong), or am I totally wrong? And with the growing computing power of today's GPUs (e.g. the G80), is this optimization useless? Sorry for the dumb question, but I'm lost with this... thanks.
The material ID is usually used in conjunction with an uber shader: a single large shader that is capable of rendering every single material used in your game through conditional branching.

When the fragment shader executes, the material ID is looked up and used as a branch predicate to decide which part of the uber shader to execute.
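Something along these lines, roughly (just a sketch in D3D9-style HLSL; the G-buffer layout and the example material IDs are made up, not from any particular engine):

sampler2D GBufferAlbedo : register(s0); // rgb = albedo, a = material ID packed as id/255
sampler2D GBufferNormal : register(s1); // xyz = view-space normal
float3 LightDir;                        // normalized, pointing towards the light
float3 LightColor;

float4 UberLightingPS(float2 uv : TEXCOORD0) : COLOR0
{
    float4 albedoAndID = tex2D(GBufferAlbedo, uv);
    float3 N           = normalize(tex2D(GBufferNormal, uv).xyz);
    int    matID       = (int)(albedoAndID.a * 255.0f + 0.5f); // unpack the material ID

    float  rawNdotL = dot(N, LightDir);
    float3 result   = 0.0f;

    if (matID == 0)          // standard Lambert diffuse
        result = albedoAndID.rgb * saturate(rawNdotL) * LightColor;
    else if (matID == 1)     // e.g. wrapped diffuse for foliage or skin
        result = albedoAndID.rgb * saturate((rawNdotL + 0.5f) / 1.5f) * LightColor;
    else                     // fallback: treat as unlit/emissive
        result = albedoAndID.rgb;

    return float4(result, 1.0f);
}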

Well, that's what I understand, anyway..
"I am a donut! Ask not how many tris/batch, but rather how many batches/frame!" -- Matthias Wloka & Richard Huddy, (GDC, DirectX 9 Performance)

http://www.silvermace.com/ -- My personal website
Even if you shade all your final pixels using the same uniform shader (like I've done in the past), there are still uses for a material index.

To do any decent shading you need a lot of information:
position, normal, albedo, specular exponent, specular intensity (RGB), diffuse intensity (RGB), Fresnel intensity, ambient occlusion, self-illumination (RGB), etc.

Storing all of this data per pixel during the geometry pass gives you total flexibility, but it's not feasible (due to the amount of memory needed).

Instead you can take some of this data (the values that change infrequently) and move it into a "material" (implemented as shader constants or a texture, which is better IMO).
Per pixel during geometry rendering, you "just" store the material index into the G-buffer.

During shading you look up the material index from the pixel and then fetch the material properties from the constants (or texture).
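A rough sketch of the texture variant (the LUT layout, the names and the 16-material limit are just made up for illustration; point sampling, no mips):

// Small 2D texture holding per-material data: one column per material ID,
// one row per group of properties. Updated only when materials change.
sampler2D MaterialLUT   : register(s2);
sampler2D GBufferAlbedo : register(s0); // a = material index / 255

static const float MATERIAL_COUNT = 16.0f; // width of the LUT in texels

float4 ShadePS(float2 uv : TEXCOORD0) : COLOR0
{
    float4 gbuf = tex2D(GBufferAlbedo, uv);
    float  id   = gbuf.a * 255.0f;               // per-pixel material index from the G-buffer
    float  u    = (id + 0.5f) / MATERIAL_COUNT;  // centre of the matching texel

    float4 spec = tex2D(MaterialLUT, float2(u, 0.25f)); // row 0: rgb = specular intensity, a = exponent
    float4 misc = tex2D(MaterialLUT, float2(u, 0.75f)); // row 1: r = Fresnel, g = self-illumination, ...

    // ...feed gbuf.rgb, spec and misc into the actual lighting equation here...
    return float4(gbuf.rgb * (1.0f + misc.g), 1.0f);
}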
Thanks for your help guys, I really love GameDev for this...
OK, but what type of information does the texture itself contain?
Precomputed N·L? Precomputed specular N·H raised to an exponent? Or maybe albedo?
I can't see why a volume texture is used in S.T.A.L.K.E.R. and other commercial deferred renderer implementations...

I was already branching in the uber shader, but I was setting a boolean from the application each time a material was set during G-buffer creation (e.g. bSpecEnable = true), and executing the specular calculation code in the uber shader depending on the basemap's alpha channel (the spec flag stored in alpha).

My G-buffer currently looks like this:

R32F      linear Z
G16R16F   normal x, y
A8R8G8B8  basemap RGB, A = specular enable
A8R8G8B8  glossmap RGB
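A rough HLSL sketch of the geometry pass that fills this layout (the input and sampler names are just for illustration):

sampler2D BaseMap  : register(s0);
sampler2D GlossMap : register(s1);

struct GBufferOutput
{
    float4 depth  : COLOR0; // R32F     : linear view-space depth
    float4 normal : COLOR1; // G16R16F  : normal x, y (z reconstructed in the lighting pass)
    float4 albedo : COLOR2; // A8R8G8B8 : basemap rgb, a = specular enable flag
    float4 gloss  : COLOR3; // A8R8G8B8 : glossmap rgb
};

GBufferOutput GeometryPS(float3 normalVS : TEXCOORD0,
                         float  linearZ  : TEXCOORD1,
                         float2 uv       : TEXCOORD2)
{
    GBufferOutput o;
    o.depth  = float4(linearZ, 0.0f, 0.0f, 0.0f);
    o.normal = float4(normalize(normalVS).xy, 0.0f, 0.0f);
    o.albedo = tex2D(BaseMap, uv);   // alpha channel already carries the spec flag
    o.gloss  = tex2D(GlossMap, uv);
    return o;
}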

I know this isn't enough for a commercial renderer and isn't fully optimized... that's why I want to understand the material ID system.
Quote: Original post by nini
okay, but what type of information does the texture itself contain?


Completely depends on what kinds of materials you're rendering. The idea is that for any materials that need more than what you've already stored in the G-Buffer (in your case diffuse and specular albedo), you store the extra data in the volume texture. For example, eq mentioned Fresnel intensity... perhaps one slice of your volume map contains a bunch of Fresnel intensities, and maybe another slice contains subsurface-scattering components for translucent materials. There's no set answer, because it's just whatever extra info you decide you need for your rendering pass.
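If it helps, a lookup along those lines might look roughly like this (purely hypothetical names and property layout, nothing specific to S.T.A.L.K.E.R.):

// Material-parameter volume: the w coordinate picks the material's slice,
// the xy coordinates can address whatever varies within that material.
sampler3D MaterialVolume : register(s3);

float4 FetchMaterialParams(float materialID, float2 lookupUV, float sliceCount)
{
    // sample the centre of the slice belonging to this material ID
    float w = (materialID + 0.5f) / sliceCount;
    return tex3D(MaterialVolume, float3(lookupUV, w)); // e.g. r = Fresnel, g = SSS amount, ...
}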
I've read an article in ShaderX3 that explains that the material ID is used to execute certain parts of the uber shader, i.e. to select between different lighting models...

So no need for a 3D texture as such, but I think it's used for precomputed values of a lighting model, like you say mjp: if you want Fresnel it's in the first slice, etc...

I have another question: would this make a big performance difference compared to doing the math of the lighting equation in the shader? (It reduces the uber shader to a few texture fetch instructions.) And does it reduce quality, since it's a texture?
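To make the idea concrete, here's the kind of swap I mean (just a hypothetical sketch; SpecPowLUT and the 0..1 exponent mapping are made-up names, not from any real renderer):

sampler2D SpecPowLUT : register(s4); // u = N.H, v = exponent remapped to 0..1, red channel = pow(N.H, exponent)

// direct math version
float SpecularMath(float3 N, float3 H, float exponent)
{
    return pow(saturate(dot(N, H)), exponent);
}

// lookup version: a single fetch instead of the pow(), limited by the LUT's resolution and precision
float SpecularLookup(float3 N, float3 H, float exponentV)
{
    return tex2D(SpecPowLUT, float2(saturate(dot(N, H)), exponentV)).r;
}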

Thanks anyway for your help guys, and thanks mjp.


This topic is closed to new replies.
