Where to start with physically based shading?

Started by
41 comments, last by Alundra 10 years ago

Where can I start learning how to shade my objects using a physically based approach? I feel lost when it comes to the formulas in papers, and I couldn't find any sample that does the job clearly.


I found this article useful, includes source code:

http://content.gpwiki.org/D3DBook:%28Lighting%29_Cook-Torrance

Aether3D Game Engine: https://github.com/bioglaze/aether3d

Blog: http://twiren.kapsi.fi/blog.html


http://seblagarde.wordpress.com/2011/08/17/hello-world/

http://seblagarde.wordpress.com/2011/08/17/feeding-a-physical-based-lighting-mode/

http://blog.selfshadow.com/publications/s2013-shading-course/

http://www.gamedev.net/topic/639226-your-preferred-or-desired-brdf/

This is interesting! I've been looking a bit at that Cook-Torrance link, but from what I understand, physically based shading is supposed to be "normalized", i.e. the amount of light reflected is less than or equal to the amount of incoming light. Is the BRDF described there really normalized?

I've been thinking about how much space physically based shading requires in my G-buffer. I currently use two 16-bit values (in addition to three for diffuse RGB and two for a packed normal): glossiness and specular intensity. According to this link, physically based shading seems to require diffuse RGB, specular RGB and a gloss value, which, if specular is kept as grayscale, would result in the same values that I need. Cook-Torrance, on the other hand, seems to work with some kind of roughness value instead, plus possibly some other term (something related to an incident angle?).
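For reference, a minimal sketch of how two such [0, 1] channels could be packed into a single 32-bit G-buffer word as 16-bit fixed point (the names and layout are illustrative, not from any particular engine):

```python
def pack_two_16(a, b):
    """Pack two [0, 1] values (e.g. glossiness and specular intensity)
    into one 32-bit word as two 16-bit fixed-point halves."""
    ia = int(round(a * 65535.0)) & 0xFFFF
    ib = int(round(b * 65535.0)) & 0xFFFF
    return (ia << 16) | ib

def unpack_two_16(word):
    """Recover the two [0, 1] values from a packed 32-bit word."""
    return ((word >> 16) & 0xFFFF) / 65535.0, (word & 0xFFFF) / 65535.0
```

The round trip loses at most half a step of 16-bit precision, which is far below what lighting needs.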

Frankly, I'm not too interested in the math behind all this, but it'd still be really interesting to implement it and see the results, and I need to understand it to some extent to be able to explain to our artists how to work with the lighting...

Roughness and glossiness usually refer to the same property. Glossiness is just calculated with some formula from smoothness, which is 1 - roughness. In our material pipeline we always talk about roughness, but artists actually author a smoothness map instead. A usual optimization is to store just the specular intensity and derive the specular color from the albedo whenever the specular intensity is higher than some threshold x (0.2 is a good choice).
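A rough sketch of those two conventions in code (the function names and the exact albedo-tinting rule are my reading of the description above, not any particular engine's implementation):

```python
def smoothness_from_roughness(roughness):
    """Smoothness is simply the complement of roughness."""
    return 1.0 - roughness

def derive_specular_color(albedo, spec_intensity, threshold=0.2):
    """Above the threshold the material is treated as metal-like and the
    specular highlight is tinted by the albedo; below it, the specular
    color stays a neutral gray of the given intensity."""
    if spec_intensity > threshold:
        return tuple(c * spec_intensity for c in albedo)
    return (spec_intensity,) * 3
```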

If you only store a monochromatic specular value, you will not be able to render materials like gold or copper. Since the specular value is generally constant for a particular material, you could instead store a material ID in your G-buffer and use it to look up the RGB specular value from a constant buffer.
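A sketch of that lookup, with a hypothetical palette standing in for the constant buffer (the F0 values for gold and copper are commonly quoted approximations, not taken from this thread):

```python
# Hypothetical specular palette, mimicking a constant buffer indexed by
# the material ID stored in the G-buffer.
SPECULAR_PALETTE = [
    (0.04, 0.04, 0.04),  # 0: generic dielectric
    (1.00, 0.71, 0.29),  # 1: gold (approximate F0)
    (0.95, 0.64, 0.54),  # 2: copper (approximate F0)
]

def specular_from_material_id(material_id):
    """Look up the RGB specular color for a packed material ID."""
    return SPECULAR_PALETTE[material_id]
```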

Copper and gold do not pose a problem, because metals have no diffuse albedo, so the specular color can be stored there; gem stones and other rare materials, on the other hand, can cause problems.

I solved this problem by making a rule that only metals can have colored specularity in our engine, and a material is considered metal if its specular intensity is over 0.5. Metals basically have pure white intensity, and the albedo channels are used for authoring but interpreted as the specular color in the shader.
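The decode step described above might look roughly like this (a sketch under the stated 0.5 rule; the names are illustrative):

```python
def decode_material(albedo, spec_intensity, metal_threshold=0.5):
    """Decode G-buffer channels under the rule above: materials whose
    specular intensity exceeds the threshold are metals, and their
    albedo channels actually hold the specular color."""
    if spec_intensity > metal_threshold:
        diffuse = (0.0, 0.0, 0.0)   # metals have no diffuse albedo
        specular = albedo            # authored "albedo" is the spec color
    else:
        diffuse = albedo
        specular = (spec_intensity,) * 3
    return diffuse, specular
```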

Small video showing an incorrect gem stone (shaded as metal): https://www.facebook.com/photo.php?v=10151931766676766


Roughness and glossiness usually refer to the same property. Glossiness is just calculated with some formula from smoothness, which is 1 - roughness. In our material pipeline we always talk about roughness, but artists actually author a smoothness map instead. A usual optimization is to store just the specular intensity and derive the specular color from the albedo whenever the specular intensity is higher than some threshold x (0.2 is a good choice).

I just want to clear this up: roughness and glossiness aren't the same thing.

The reason people believe them to be the same is partly lazy work and partly that different development apps don't agree on a standard.

So specular works with different maps:

Intensity: how bright the specular highlight is, from 0-1 or 0-255. Referred to as a specular map, intensity map, gloss map, specular grayscale map, or shininess map.

Size: the size of the specular spot, normally ranging from 0-512 or 0-1. Referred to as a gloss map, shininess map, hardness map, or specular size map.

Color: changes the color of the specular spot, and is mainly used for gold and special paints. Referred to as a specular color map.

Usually materials that sharply reflect light also reflect a large amount of light, like plastic, so the specular map will be bright and the gloss map dark; this is why simply inverting one map still gives quick, acceptable results.

Polished metal and car paints reflect small, sharp points of light and will have a bright specular map and a bright gloss map.

Small game engines like Unity only use reflection for metals and other highly reflective materials, meaning that they only need to use the specular map.

Other low-reflectivity materials, like wood or cement, use a simple diffuse shader.

Large game engines like Unreal use all three specular maps, allowing artists to clearly show where the polished metal armour separates from the battle-worn chain mail, and even where a gold bracelet hangs from the hand.
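As an illustration only, the three map types above drive the classic (non-physically-based) Blinn-Phong highlight quite naturally; here is a sketch with hypothetical names:

```python
def blinn_phong_specular(n_dot_h, intensity, size, color):
    """Specular highlight driven by the three map types: 'intensity'
    scales the brightness, 'size' is the exponent that tightens the
    spot, and 'color' tints the result."""
    spot = max(n_dot_h, 0.0) ** size
    return tuple(c * intensity * spot for c in color)
```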


Wikipedia and every physically based rendering source contradict what you are saying. http://en.wikipedia.org/wiki/Gloss_(optics)#Surface_roughness

I found this article useful, includes source code:
http://content.gpwiki.org/D3DBook:(Lighting)_Cook-Torrance

This is interesting! I've been looking a bit at that Cook-Torrance link, but from what I understand, physically based shading is supposed to be "normalized", i.e. the amount of light reflected is less than or equal to the amount of incoming light. Is the BRDF described there really normalized?

To the best of my knowledge, the formula described in that book is normalized. Not all of the other BRDFs in that book are normalized, though.
Note, however, that the Cook-Torrance code later adds the diffuse component "as is", when you in fact need to normalize that sum. A common cheap trick is to weight the diffuse term by the opposite of the Fresnel reflectance at normal incidence, F0: "NdotL * (cSpecular * Rs + cDiffuse * (1 - F0))"
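That trick, written out as a small sketch (Rs is the Cook-Torrance specular term; the function name and scalar F0 are my simplifications):

```python
def shade(n_dot_l, rs, c_specular, c_diffuse, f0):
    """Combine the Cook-Torrance specular term Rs with a diffuse term
    weighted by (1 - F0), the cheap normalization trick quoted above:
    the brighter the Fresnel reflectance, the less diffuse remains."""
    return tuple(n_dot_l * (cs * rs + cd * (1.0 - f0))
                 for cs, cd in zip(c_specular, c_diffuse))
```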

Frankly, I'm not too interested in the math behind all this, but it'd still be really interesting to implement it and see the results, and I need to understand it to some extent to be able to explain to our artists how to work with the lighting...

I'm afraid PBS is all about the math (the tech, which you're focusing on) and about feeding it realistic values (the art).
You asked whether the Cook-Torrance formula was normalized, but in fact we can't know just by looking at it (unless we already know, of course).
To truly check whether it's normalized, you have to calculate the integral, like on this website and in this draft. Either that, or write a Monte Carlo simulation.
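Such a Monte Carlo check can be sketched in a few lines: estimate the hemispherical integral of brdf · cos θ with uniform hemisphere samples and verify it does not exceed one. This is a toy version under simplifying assumptions (fixed view direction, a BRDF taking only the light direction):

```python
import math
import random

def is_normalized(brdf, n_samples=100_000, tol=0.05):
    """Monte Carlo estimate of the hemispherical integral of
    brdf(l) * cos(theta) using uniform hemisphere sampling
    (pdf = 1 / (2*pi)); a normalized BRDF yields at most 1."""
    total = 0.0
    for _ in range(n_samples):
        u1, u2 = random.random(), random.random()
        cos_t = u1                       # uniform solid angle => uniform cos
        sin_t = math.sqrt(max(0.0, 1.0 - cos_t * cos_t))
        phi = 2.0 * math.pi * u2
        l = (sin_t * math.cos(phi), sin_t * math.sin(phi), cos_t)
        total += brdf(l) * cos_t
    estimate = total * 2.0 * math.pi / n_samples
    return estimate <= 1.0 + tol

# A Lambertian BRDF with albedo 1 integrates to exactly 1, so it passes.
lambert = lambda l: 1.0 / math.pi
```

Doubling the Lambertian term makes the integral 2, and the same check rejects it.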

Either approach takes more than just two minutes (and for some formulas it can actually be very hard, even for experienced mathematicians).

Edit: Fabian Giesen's site seems to be down. I've re-uploaded his PDF here.

This topic is closed to new replies.
