Lightness1024

R&D [PBR] Renormalize Lambert


Hello, I'd like to ask for your take on Lagarde's renormalization of the Disney BRDF diffuse term, but applied to Lambert. Let me explain.
In this document:
https://seblagarde.files.wordpress.com/2015/07/course_notes_moving_frostbite_to_pbr_v32.pdf
(page 10, listing 1) we see that he uses an energy factor of lerp(1, 1/1.51, perceptualRoughness) to renormalize the diffuse part of the lighting function. OK.
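For reference, that listing amounts to something like this (a minimal Python sketch of Lagarde's listing 1, using Schlick's approximation; function names are mine, not production shader code):

```python
import math

def f_schlick(f0, f90, u):
    # Schlick's interpolation: f0 + (f90 - f0) * (1 - u)^5
    return f0 + (f90 - f0) * (1.0 - u) ** 5

def disney_diffuse_renormalized(n_dot_v, n_dot_l, l_dot_h, roughness):
    # Sketch of Frostbite's renormalized Disney diffuse (Lagarde, listing 1).
    # The 1/1.51 factor is faded in with roughness via a lerp, so smooth
    # surfaces are left untouched and only rough ones are darkened.
    energy_bias = 0.5 * roughness                         # lerp(0, 0.5, r)
    energy_factor = 1.0 + roughness * (1.0 / 1.51 - 1.0)  # lerp(1, 1/1.51, r)
    fd90 = energy_bias + 2.0 * l_dot_h * l_dot_h * roughness
    light_scatter = f_schlick(1.0, fd90, n_dot_l)
    view_scatter = f_schlick(1.0, fd90, n_dot_v)
    return light_scatter * view_scatter * energy_factor / math.pi
```

Note that at roughness 0 this collapses to a plain 1/π diffuse term.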

Now let's take Karis's assertion at the beginning of his famous document:
http://blog.selfshadow.com/publications/s2013-shading-course/karis/s2013_pbs_epic_notes_v2.pdf
Page 2, diffuse BRDF: 

Quote:

"any more sophisticated diffuse model would be difficult to use efficiently with image-based or spherical harmonic lighting"

I think his premise applies and is enough reason to use Lambert (at least in my case).

But from Lagarde's document, page 11 figure 10, we see that Lambert looks practically equivalent to Disney.

From that observation, the question that naturally comes up is: if Disney needs renormalization, doesn't Lambert need it too?
And I'm not talking about the 1/π factor (that one is obvious), but about the roughness-related factor.

My wild guess is that because Lambert has no Schlick term and no dependence on roughness, as long as the 1/π is there the Lambert albedo stays below 1 in all cases, so it shouldn't need further renormalization. So then, where does that extra energy appear in Disney? According to the graph, it's in the high-view-angle, high-roughness zone, which would mean here: (cf. image)

[image: toobright.png]

That is a tiny difference. To my eyes it certainly doesn't justify the heavy darkening introduced by the 1/1.51 factor, which takes effect over a much wider range of the function. But this could be perceptual, or just my ignorance.
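The intuition that Lambert can never exceed unit albedo is easy to check numerically: integrating (albedo/π)·cosθ over the hemisphere gives back exactly the albedo, independent of view direction (a quick Python sanity check, helper name is mine):

```python
import math

def lambert_reflectance(albedo, samples=200):
    # Numerically integrate the Lambert BRDF (albedo / pi) times cos(theta)
    # over the hemisphere (midpoint rule in theta, analytic 2*pi in phi).
    # The result equals the albedo for any view direction, so as long as
    # albedo <= 1, Lambert never emits more energy than it receives.
    d_theta = (math.pi / 2.0) / samples
    total = 0.0
    for i in range(samples):
        theta = (i + 0.5) * d_theta
        total += (albedo / math.pi) * math.cos(theta) \
                 * 2.0 * math.pi * math.sin(theta) * d_theta
    return total
```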

Looking forward to being educated :)
Best



The entire point of renormalisation is to ensure energy conservation. 

When you integrate Lambert times cos(θ) over the hemisphere, you get π. One unit goes in, π units come out... oops, we roughly tripled the energy in the scene! So you divide Lambert by π, and then one unit in gives at most one unit out.

Apparently, if you integrate Disney diffuse over the hemisphere, the result overshoots 1 by up to a factor of about 1.51 at full roughness... so you scale it by lerp(1, 1/1.51, roughness) to make sure energy isn't just summoned up from the ether in violation of conservation of energy.

You can multiply your Lambert by that factor too, but it won't be "normalisation" - it will just be a hack to make Lambert respond to roughness in some way. If that's what you're after, there are a lot of "approximate Oren-Nayar" hacks floating around that you can use instead.

The phenomenological goal of such hacks is to make rough surfaces look flatter (a BRDF of k/pi/dot(n,l)) while also adding a retroreflective (view-dependent) term, but to leave smooth surfaces alone (a BRDF of k/pi), since Lambert is correct for smooth surfaces. I'll post mine when I'm at my PC ;)
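For reference, the full Oren-Nayar "qualitative" model that those hacks approximate can be sketched like this (a minimal Python sketch, not any particular engine's version; sigma is the slope standard deviation in radians):

```python
import math

def oren_nayar(albedo, sigma, n_dot_l, n_dot_v, cos_phi_diff):
    # Oren-Nayar "qualitative" model (Oren & Nayar 1994). cos_phi_diff is
    # the cosine of the azimuth angle between light and view directions.
    # At sigma == 0 this reduces exactly to Lambert (albedo / pi).
    s2 = sigma * sigma
    a = 1.0 - 0.5 * s2 / (s2 + 0.33)
    b = 0.45 * s2 / (s2 + 0.09)
    theta_i = math.acos(max(-1.0, min(1.0, n_dot_l)))
    theta_r = math.acos(max(-1.0, min(1.0, n_dot_v)))
    alpha = max(theta_i, theta_r)
    beta = min(theta_i, theta_r)
    return (albedo / math.pi) * (a + b * max(0.0, cos_phi_diff)
                                 * math.sin(alpha) * math.tan(beta))
```

Note how roughness darkens normal incidence (the A term) and brightens grazing retroreflection (the B term), which is exactly the "flattening" described above.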


@FreneticPonE are you talking about this:

[image: GGXDiffuse.png]

I've never seen this magic, seems interesting though.

This is just confusing me further, unfortunately. Say I choose Lambert for the diffuse and Cook-Torrance for the specular: am I supposed to just add the two? Lambert doesn't even depend on roughness, so mirror surfaces would look half diffuse, half reflective if I simply add both. How would one properly combine a Lambert diffuse with a PBR specular?
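For what it's worth, one common convention is to weight the diffuse lobe by whatever energy the Fresnel term did not send into the specular lobe; a hedged sketch (helper names are made up for illustration, and this is an approximation, not an exact coupling):

```python
def f_schlick(f0, cos_theta):
    # Schlick's Fresnel approximation for a scalar f0
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

def combine_lobes(diffuse, specular, f0, n_dot_v):
    # Weight the diffuse lobe by the energy Fresnel did not reflect into
    # the specular lobe. At f0 -> 1 (a mirror-like or metallic surface)
    # the diffuse contribution vanishes instead of adding on top.
    f = f_schlick(f0, n_dot_v)
    return (1.0 - f) * diffuse + specular
```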


Well, apparently Disney has no qualms about just adding the two:

[image: LocalLightEquation.png]

But it still appears to be a subject of debate:
https://computergraphics.stackexchange.com/questions/2285/how-to-properly-combine-the-diffuse-and-specular-terms
https://gamedev.stackexchange.com/q/87796/35669

This nice paper discusses exactly what I'm concerned with, starting in section 5.1:
http://www.cs.utah.edu/~shirley/papers/pg97.pdf
One page later they propose an equation (equation 5) that looks quite different from Disney's naive (as it seems to me) approach.
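If I remember the paper correctly, their coupled matte term has roughly this shape (a hedged Python sketch, not a verbatim transcription of the paper; r_d is diffuse albedo, r_s is specular reflectance at normal incidence):

```python
import math

def coupled_diffuse(r_d, r_s, n_dot_l, n_dot_v):
    # Sketch of the coupled matte term from Shirley et al.: the diffuse
    # lobe shrinks as the specular (Fresnel) reflectance grows toward
    # grazing, so the sum of the two lobes stays energy conserving.
    return (28.0 * r_d / (23.0 * math.pi) * (1.0 - r_s)
            * (1.0 - (1.0 - n_dot_l / 2.0) ** 5)
            * (1.0 - (1.0 - n_dot_v / 2.0) ** 5))
```

The (1 - r_s) factor is the key difference from simply adding the lobes: a perfect mirror (r_s = 1) gets no diffuse at all.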


