
Where to start with physically based shading?



#1 BlackBrain   Members   -  Reputation: 344


Posted 08 April 2014 - 08:26 AM

Where can I start learning how to shade my objects with a physically based approach? I feel lost when it comes to the formulas in papers, and I couldn't find any sample that demonstrates the technique clearly.




#2 bioglaze   Members   -  Reputation: 591


Posted 08 April 2014 - 09:06 AM

I found this article useful, includes source code:

http://content.gpwiki.org/D3DBook:%28Lighting%29_Cook-Torrance



#3 Chris_F   Members   -  Reputation: 2439


Posted 08 April 2014 - 10:03 AM

http://seblagarde.wordpress.com/2011/08/17/hello-world/

http://seblagarde.wordpress.com/2011/08/17/feeding-a-physical-based-lighting-mode/

http://blog.selfshadow.com/publications/s2013-shading-course/

http://www.gamedev.net/topic/639226-your-preferred-or-desired-brdf/



#4 theagentd   Members   -  Reputation: 602


Posted 08 April 2014 - 10:49 AM

I found this article useful, includes source code:

http://content.gpwiki.org/D3DBook:%28Lighting%29_Cook-Torrance

This is interesting! I've been looking a bit at that Cook-Torrance link, but from what I understand physically based shading is supposed to be "normalized", i.e. the amount of light reflected is less than or equal to the amount of incoming light. Is the BRDF described there really normalized?

 

 

I've been thinking about how much space physically based shading requires in my G-buffer. I currently use two 16-bit values (glossiness and specular intensity), in addition to three for diffuse RGB and two for a packed normal. According to this link, physically based shading seems to require diffuse RGB, specular RGB and a gloss value, which, if specular is kept as greyscale, would amount to the same values I already need. Cook-Torrance, on the other hand, seems to work with some kind of roughness value instead, and possibly another term (something related to the incident angle?).
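A minimal sketch of the kind of G-buffer layout being described (the channel split, formats and names here are assumptions for illustration, not the poster's actual setup):

// Hypothetical G-buffer outputs: diffuse RGB, packed normal (2 channels),
// glossiness and specular intensity.
layout(location = 0) out vec4 gbuf0; // e.g. RGBA8:   diffuse.rgb in xyz, w unused
layout(location = 1) out vec4 gbuf1; // e.g. RGBA16F: packed normal in xy, glossiness in z, specular intensity in w

void writeGBuffer(vec3 diffuse, vec2 packedNormal, float glossiness, float specIntensity)
{
    gbuf0 = vec4(diffuse, 0.0);
    gbuf1 = vec4(packedNormal, glossiness, specIntensity);
}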

 

 

Frankly I'm not too interested in the math behind all this, but it'd still be really interesting to implement and see the results, and I need to understand it to some extent to be able to explain how to work with the lighting to our artists...



#5 kalle_h   Members   -  Reputation: 1476


Posted 08 April 2014 - 11:31 AM

Roughness and glossiness are usually just two names for the same property. Glossiness is just calculated with some formula from smoothness, which is 1 - roughness. In our material pipeline we always talk about roughness, but the artists actually author a smoothness map instead. A usual optimization is to store just the specular intensity and calculate the specular color from the albedo if the specular intensity is higher than some threshold x (0.2 is a good choice).
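A minimal sketch of that optimization, assuming a single grey-scale specular intensity is stored in the G-buffer (function name and exact tinting are illustrative, not the poster's actual code):

// Hypothetical decode: derive a specular colour from the albedo once the grey-scale
// specular intensity passes the threshold (0.2, as suggested above).
vec3 decodeSpecularColor(vec3 albedo, float specIntensity)
{
    const float threshold = 0.2;
    if (specIntensity > threshold)
        return albedo * specIntensity; // tint the highlight with the albedo
    return vec3(specIntensity);        // otherwise keep the specular monochromatic
}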



#6 Chris_F   Members   -  Reputation: 2439


Posted 08 April 2014 - 12:04 PM

If you only store a monochromatic specular value you will not be able to render materials like gold or copper. Since the specular value is generally constant for a particular material, you could instead store a material id in your gbuffer and use it to look up the RGB specular value from a constant buffer.
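A hedged sketch of that material-id approach (the id encoding, array size and block name are assumptions for illustration):

// Hypothetical material-id path: the G-buffer stores a small integer id in an 8-bit
// channel, and the lighting pass looks the RGB specular value up in a constant buffer.
const int MAX_MATERIALS = 64;

layout(std140) uniform MaterialSpecular
{
    vec4 specularRGB[MAX_MATERIALS]; // .rgb = specular colour per material, .a unused
};

vec3 lookupSpecular(float materialIdChannel)
{
    int id = clamp(int(materialIdChannel * 255.0 + 0.5), 0, MAX_MATERIALS - 1);
    return specularRGB[id].rgb;
}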



#7 kalle_h   Members   -  Reputation: 1476


Posted 08 April 2014 - 02:36 PM

If you only store a monochromatic specular value you will not be able to render materials like gold or copper. Since the specular value is generally constant for a particular material, you could instead store a material id in your gbuffer and use it to look up the RGB specular value from a constant buffer.

Copper and gold do not pose a problem, because metals have no albedo, so the specular color can be stored there; gemstones and other rare materials, on the other hand, can cause problems.

 

I solved this problem by just making a rule that only metals can have colored specularity in our engine, and a material is considered metal if its specular intensity is over 0.5. Basically metals have pure white intensity, and only the albedo channels are used for authoring but are treated as the specular color in the shader.

 

A small video showing an incorrect gemstone (shaded as metal): https://www.facebook.com/photo.php?v=10151931766676766
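A minimal sketch of the decode implied by that rule (thresholds and parameter names are assumptions, not the engine's actual code):

// Hypothetical decode: above 0.5 specular intensity the surface is treated as metal,
// so the albedo channels are reused as the specular colour and the diffuse term goes
// to black; below it the specular stays monochromatic.
void decodeMaterial(vec3 albedo, float specIntensity,
                    out vec3 diffuseColor, out vec3 specularColor)
{
    if (specIntensity > 0.5) // metal
    {
        diffuseColor  = vec3(0.0);
        specularColor = albedo;
    }
    else                     // dielectric
    {
        diffuseColor  = albedo;
        specularColor = vec3(specIntensity);
    }
}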



#8 Scouting Ninja   Members   -  Reputation: 714


Posted 08 April 2014 - 02:39 PM


Roughness and glossiness are usually just two names for the same property. Glossiness is just calculated with some formula from smoothness, which is 1 - roughness. In our material pipeline we always talk about roughness, but the artists actually author a smoothness map instead. A usual optimization is to store just the specular intensity and calculate the specular color from the albedo if the specular intensity is higher than some threshold x (0.2 is a good choice).

 

I just want to clear this up: roughness and glossiness aren't the same thing.

The reason people believe them to be the same is one part lazy work and another part different development apps not agreeing on a standard.

 

So specular works with different maps:

Intensity: how bright the specular highlight is, from 0-1 or 0-255. This is referred to as a specular map, intensity map, gloss map, specular greyscale map, or shininess map.

Size: the size of the specular spot, normally ranging from 0-512 or 0-1. This is referred to as a gloss map, shininess map, hardness map, or specular size map.

Color: changes the color of the specular spot, and is mainly used for gold and special paints. This is referred to as a specular color map.

 

Usually, materials that sharply reflect light will also reflect a large amount of light (like plastic), so the specular map will be bright and the gloss map dark; this is why inverting one still gives acceptable results quickly.

Polished metal and car paints reflect small sharp points of light and will have a bright specular map and a bright gloss map.

 

Small game engines like Unity only use reflection for metals and other highly reflective materials, meaning that they only need to use the specular map.

Other low-reflectivity materials, like wood or cement, use a simple diffuse shader.

 

Large game engines like Unreal use all three specular maps, allowing artists to clearly show where the polished metal armour separates from the battle-worn chain mail, and even where a gold bracelet hangs from the hand.



#9 kalle_h   Members   -  Reputation: 1476


Posted 08 April 2014 - 03:05 PM

 


[quoting Scouting Ninja's post #8 above in full]

Wikipedia and every physically based rendering source contradict what you are saying. http://en.wikipedia.org/wiki/Gloss_(optics)#Surface_roughness



#10 Matias Goldberg   Crossbones+   -  Reputation: 3575


Posted 08 April 2014 - 04:52 PM

 

I found this article useful, includes source code:
http://content.gpwiki.org/D3DBook:(Lighting)_Cook-Torrance

This is interesting! I've been looking a bit at that Cook-Torrance link, but from what I understand physically based shading is supposed to be "normalized", i.e. the amount of light reflected is less than or equal to the amount of incoming light. Is the BRDF described there really normalized?

 

To the best of my knowledge, the formula described in that book is normalized. Not all of the other BRDFs in that book are normalized, though.
Note, though, that the Cook-Torrance code later adds the diffuse component "as is", when in fact you need to normalize that sum. A common cheap trick is to weight the diffuse term by the opposite of the Fresnel argument F0: "NdotL * (cSpecular * Rs + cDiffuse * (1 - F0))"
 

Frankly I'm not too interested in the math behind all this, but it'd still be really interesting to implement and see the results, and I need to understand it to some extent to be able to explain how to work with the lighting to our artists...

I'm afraid PBS is all about the math (the tech, which is what you're focusing on) and about feeding it with realistic values (the art).
You asked whether the Cook-Torrance formula was normalized, but in fact we can't know just by looking at it (unless we already know, of course).
To truly check whether it's normalized, you have to calculate the integral, like in this website and in this draft, or else write a Monte Carlo simulation.

Either of those takes more than just two minutes to work out (and for some formulas it can actually be very hard even for experienced mathematicians).
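For reference, the condition being checked (the usual statement of "normalized" for a specular BRDF f, with n the surface normal and the integral taken over the hemisphere of light directions l) is:

\int_{\Omega} f(\mathbf{l}, \mathbf{v}) \, (\mathbf{n} \cdot \mathbf{l}) \, \mathrm{d}\omega_{\mathbf{l}} \;\le\; 1 \quad \text{for every view direction } \mathbf{v}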

 

Edit: Fabian Giesen's site seems to be down. I've re-uploaded his PDF here.


Edited by Matias Goldberg, 08 April 2014 - 04:57 PM.


#11 Scouting Ninja   Members   -  Reputation: 714


Posted 08 April 2014 - 05:21 PM


Wikipedia and every physically based rendering source contradict what you are saying. http://en.wikipedia.org/wiki/Gloss_(optics)#Surface_roughness

 

Read this too: http://en.wikipedia.org/wiki/Specular

 

These two wiki pages refer to real-world conditions, not the techniques used by real-time rendering.

Specular maps, gloss maps and specular color maps are nothing but tricks to fool the viewer.

 

If you would like to make shaders that cast real specular reflections and gloss optics, you will need to use ray tracing.

Even though ray tracing is now entering an era where it can be used in games, it will still take time before we can bounce the huge number of rays over a surface and back to the viewer that gloss requires, although I am proud to say that specular is now possible even if it isn't practical.

 

This better explains real time specular, even if it is a bit dated: http://en.wikipedia.org/wiki/Specular_highlight

Here is a guide for materials from the artists view point: https://www.marmoset.co/toolbag/learn/materials (Good read)

 

I have worked for five years as a CG artist; as a rule of thumb the maps are:

Specular map: intensity.

Gloss map: size (yes, it should be roughness, but it really only scales the size of the specular spot over the normals).

Specular color map: color.

 

But these names change from software to software, so it helps to know what each one does.

 

If anyone would like to see how these maps work, you can just download a 3D modeling app and test them yourself, or simply search for "how do specular and gloss maps work".



#12 Hodgman   Moderators   -  Reputation: 31131


Posted 08 April 2014 - 06:48 PM

Scouting ninja - in your first post you've said that "gloss" can be used as a name for two of the maps (intensity and size) -- this is why using roughness/smoothness/etc is now common.

You've bolded the one under size though -- this is the same as roughness, it defines the size/shape of the specular highlight, while the other map (intensity) controls how visible the highlight is. The intensity map can be RGB or monochrome, or both (which will be multiplied together in the shader).

In traditional ad-hoc shaders, artists needed to tweak the two maps in conjunction to get good results -- e.g. increasing intensity while reducing size.
With PBR, this is no longer the case. We're now using real physics (regardless of whether we're ray-tracing or not, the BRDF is the same) -- so the material inputs / maps that we provide are now physical values. A *normalized* specular BRDF means that the intensity is automatically boosted/dulled by the physically correct amount based on the roughness/size value. The intensity is also modified according to Fresnel's law: the value painted in your specular intensity map is the Fresnel reflectance at normal incidence, and Fresnel's equations let us use that value to find the correct specular intensity for other incidence angles.

 

An infinitely smooth surface will reflect 100% of the incoming light out of one single point, while a Lambertian-rough surface will reflect 1/pi of the incoming light from each point -- that amount is then multiplied by the 'intensity' value, which results in small highlights being brighter than large highlights, even if both materials have the same specified 'specular intensity' value.
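As a hedged illustration of those two points (the normalization factor and the Fresnel behaviour), here is a minimal normalized Blinn-Phong specular term with Schlick's Fresnel approximation. The (n + 8) / (8 * pi) factor is the commonly quoted approximate normalization for Blinn-Phong; this is a sketch of the general idea, not necessarily the exact formulation used by anyone in this thread:

// Sketch only: normalized Blinn-Phong specular with Schlick's Fresnel approximation.
// f0 is the value from the specular intensity map (Fresnel reflectance at normal
// incidence); specPower encodes the roughness/size. Because of the normalization
// factor, sharper highlights automatically come out brighter than broad ones.
vec3 specularBlinnPhong(vec3 N, vec3 L, vec3 V, vec3 f0, float specPower)
{
    vec3  H       = normalize(L + V);
    float NdotH   = max(dot(N, H), 0.0);
    float VdotH   = max(dot(V, H), 0.0);
    float NdotL   = max(dot(N, L), 0.0);
    float norm    = (specPower + 8.0) / (8.0 * 3.14159265);
    vec3  fresnel = f0 + (1.0 - f0) * pow(1.0 - VdotH, 5.0); // Schlick's approximation
    return fresnel * norm * pow(NdotH, specPower) * NdotL;
}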

With a PBR renderer, the spec maps are not just values to fool the viewer; they are real, measurable physical values being plugged into a simulation. Also, a lot of the same BRDFs are being used in real-time engines now as are used in offline ray-tracers for VFX. While VFX has the liberty of ray-tracing everything, they still often use IBL for "ambient specular", which still gives real reflections, very quickly, but only for distant objects -- real-time systems now use the same IBL-type systems as film too!

The correct specular intensity value for a material can be calculated if its IOR is measured -- for most non-metal objects, this value is about 3/100 -- and this value does not change depending on whether the surface is smooth/glossy/rough; it only depends on the raw material that the surface is built out of.
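That ~3% figure is just the Fresnel reflectance at normal incidence for a typical dielectric with an IOR of about 1.5 in air (n1 = 1):

F_0 = \left( \frac{n_2 - n_1}{n_2 + n_1} \right)^2 = \left( \frac{1.5 - 1.0}{1.5 + 1.0} \right)^2 = 0.04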

Roughness/smoothness/size/etc is a statistical value that gets plugged into a probability distribution to simulate a highly detailed normal-map that's smaller than a pixel. If you cared to sculpt/scan your surfaces at the micro-meter scale, you could automatically select correct roughness values based on the distribution of the normals within that micro-surface.

So basically - your advice is correct for traditional rendering systems, but not PBR systems.

 

Under a traditional (non-normalized) BRDF:

-- smooth plastic has low roughness / high gloss, and has high intensity.

-- rough plastic has high roughness / low gloss, and low intensity.

Under a PBR BRDF though, they're both the same material, and they're both a common dielectric, so they both have the same intensity of about 3% (AKA the Fresnel reflectance at zero incidence) and only the roughness/gloss value is different.

 

Likewise, smooth plastic and smooth chrome share the same roughness/gloss value, but have completely different 'intensity' values (~3% vs 90+%).

 

[edit] Here's another link dump - not so great for implementation, but just talking about PBR:

http://www.marmoset.co/toolbag/learn/pbr-practice

https://www.fxguide.com/featured/the-state-of-rendering/

http://www.fxguide.com/featured/the-state-of-rendering-part-2/

http://www.fxguide.com/featured/game-environments-parta-remember-me-rendering/


Edited by Hodgman, 09 April 2014 - 12:04 AM.


#13 theagentd   Members   -  Reputation: 602


Posted 08 April 2014 - 09:04 PM

Let me see if I have understood this correctly. To implement Cook-Torrance I need to:

 

1. Modify my G-buffer to store specular intensity (AKA ref_at_norm_incidence in the article) and a roughness value.

2. Normalize the function by multiplying the diffuse term by (1 - (specular intensity AKA ref_at_norm_incidence)).

3. Bathe in the glory of physically based shading.

 

My bet is that this is 100x easier to tweak to realistic results compared to my current lighting.


Edited by theagentd, 09 April 2014 - 07:50 AM.


#14 theagentd   Members   -  Reputation: 602


Posted 09 April 2014 - 05:02 PM

Sorry to double post, but a simple "yes" or "no" is all I need... ._.



#15 Hodgman   Moderators   -  Reputation: 31131


Posted 10 April 2014 - 01:11 AM

Yes.

 

Though even Phong or Blinn-Phong (the typical specular formulas) store intensity and roughness values in the G-buffer too (often called spec-mask or just "specular", and spec-power or spec-exponent).



#16 theagentd   Members   -  Reputation: 602


Posted 10 April 2014 - 05:48 AM

But from what I've understood, the specular power is actually the inverse of roughness, since a roughness value of 0 would correspond to a very high gloss value (over 100 at least, or something?). You're right that the change in the G-buffer isn't much of a change, though.



#17 Hodgman   Moderators   -  Reputation: 31131


Posted 10 April 2014 - 06:19 AM

But from what I've understood, the specular power is actually the inverse of roughness, since a roughness value of 0 would correspond to a very high gloss value (over 100 at least, or something?). You're right that the change in the G-buffer isn't much of a change, though.

Yeah. But in practice, roughness is often actually stored in a "smoothness" map, where 0 = rough and 255 = smooth... but we still call the variable roughness anyway.

 

Specular power is a value that goes from 0 to infinity -- or, in practice, usually from 1 to some large number, maybe 2048 or 8192...

Often to decode spec-power from a texture (which only stores a 0-1 range), you'll use something like:

float power = sqrt( tex2D(...) ) * 8192 + 1; // maps a 0-1 texture value into roughly the 1-8193 range
// or
float power = pow( 2, tex2D(...) * 12 + 1 ); // exponential mapping into roughly the 2-8192 range

Each engine will decide on a different mapping between their textures and how to decode them into spec-power values...

 

Different BRDFs will use different ranges for roughness. In some of them, roughness is in the 0-1 range, where 0=rough and 1=smooth -- in that case, you can just use the texture as is. In others 0=smooth and 1=rough -- in that case you can just use 1-tex2D(...). With the GGX BRDF that I'm using, I'm currently using 0=smooth, 1=rough and 4=extremely-rough, so I use pow(tex2D(...),4.4)*4.

 

The different curves - pow(x, y) or sqrt(x) (which is just pow(x, 1/2.0)) are there to skew the middle grey levels in the texture either towards 1 or 0.

e.g. you might want spec-power values from 1-8000. However, spec values of 1.5, 2, 2.5 and 3 give extremely different appearances, while spec values of 7800, 7900 and 8000 all look pretty much identical. So, in this case, you want to skew the distribution of your middle grey values down towards zero, so that you get a more even distribution of different appearances when the artists try to paint their maps.
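As a purely illustrative example of that skew (assuming the two example curves from above), a mid-grey texture value of 0.5 decodes very differently under each mapping:

// Illustration only, using the two example decode curves shown earlier.
float t = 0.5;                            // mid-grey texture value
float powerA = sqrt(t) * 8192.0 + 1.0;    // ~5794: sqrt pushes mid-greys towards the high end
float powerB = pow(2.0, t * 12.0 + 1.0);  // 128:   the exponential curve keeps mid-greys low,
                                          //        giving a more even spread of appearances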


Edited by Hodgman, 10 April 2014 - 06:29 AM.


#18 oks2024   Members   -  Reputation: 145


Posted 10 April 2014 - 11:13 AM

I was in the same situation recently; after reading a lot of papers I wanted to implement physically based shading.

 

I'll just add one link to the ones provided above: http://graphicrants.blogspot.ca/2013/08/specular-brdf-reference.html

There are a lot of Cook-Torrance specular variations, and I found it very helpful.

 

If you want to see an actual implementation you can dive into the Unreal Engine 4 source code; you'll find those equations, and the parameters used.

 

Before implementing PBS in my (tiled) deferred engine I made a very simple forward viewer, to test the inputs I'll want to store in the G-buffer. If you want to look at it, you can download it here: http://www.alexandre-pestana.com/physically-based-rendering-viewer/

 

It's very basic, but it allows me to test textures and parameters. I haven't released the source code yet (I need to clean up some import mess first), but you have access to the pixel shader, so maybe you'll find that helpful. You can also modify the shader and dynamically recompile it by pressing "Alt Gr". But if it does not compile, I think the viewer will crash miserably (I use this feature for development, with breakpoints to relaunch compilation in case of error).

 

As I said, I'm discovering PBS, so there may still be some mistakes in my implementation. If you find one, I would be happy to know!

 

 

I've made a new version of the viewer, in which I implemented all the specular variations found in Brian Karis's blog post; I think I'll upload it tonight or tomorrow.



#19 theagentd   Members   -  Reputation: 602


Posted 10 April 2014 - 11:40 AM

Thank you very much! I've converted the Cook-Torrance shader to GLSL and I believe I have it working. I'm not quite sure whether the result is correct, though. In my opinion the Fresnel effect is way too strong. Here are my results with a specular intensity of 0.0 and a roughness value of 0.5.

 

Result: [screenshot JYYaVSd.jpg]

Specular only: [screenshot OrckCO4.jpg]

 

Is this really the correct result? Despite the specular intensity being 0, I get a huge amount of specular reflection when the view angle is high. The "problematic" line is 

float fresnel = specularIntensity + (1.0 - specularIntensity) * pow(1.0 - VdotH, 5.0f);

which basically causes the specular intensity to approach 1.0 when VdotH approaches 0, regardless of what the original specular intensity was. Is this really correct? It looks very different from what I am used to. For example, I attempted to reproduce these results in real life using a plain white A4 paper and a lamp, and the specular intensity was pretty much negligible. Perhaps I was just using too low a roughness value (0.5), since a roughness value of >1 produces a very good "papery" result.



#20 Alundra   Members   -  Reputation: 890


Posted 10 April 2014 - 12:17 PM

You can replace your line with this one, which gives better performance for the same result:

float fresnel = specularIntensity + ( 1.0f - specularIntensity ) * exp2( (-5.55473f * VdotH - 6.98316f) * VdotH );

Edited by Alundra, 10 April 2014 - 12:17 PM.




