

Geometric term in physically based shading



#1 B_old   Members   -  Reputation: 665


Posted 12 January 2013 - 02:20 PM

I've been looking at two articles about physically based shading:
Sébastien Lagarde
Simon's Tech Blog

The Lagarde article mainly describes the distribution function for the specular term and never mentions a geometric term, while the Simon article compares different geometric terms that are used together with the distribution function.

Now I am curious why one includes a geometric term and the other doesn't. Are these two different approaches, or is one article just focusing on a smaller part of the equation?

Edited by B_old, 12 January 2013 - 02:58 PM.



#2 TiagoCosta   Crossbones+   -  Reputation: 2341


Posted 12 January 2013 - 06:05 PM

The first link uses the implicit geometry term

G_implicit(l, v, h) = dot(n, l) * dot(n, v)

which cancels out with the denominator of the microfacet specular BRDF:

f(l, v) = D(h) * F(v, h) * G(l, v, h) / (4 * dot(n, l) * dot(n, v))

 

For example, you can use the implicit geometry factor on older hardware. 
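
To make the cancellation concrete, here is a minimal HLSL-style sketch (the function and variable names are mine, not from either article). With the implicit term the whole 4 * dot(n, l) * dot(n, v) denominator drops out and only D and F remain:

    // Microfacet specular with an explicit geometry term G.
    float3 SpecularExplicitG(float D, float3 F, float G, float NdotL, float NdotV)
    {
        return D * F * G / (4.0 * NdotL * NdotV);
    }

    // With G_implicit = NdotL * NdotV the BRDF collapses to D * F / 4,
    // so the implicit geometry term costs no ALU at all.
    float3 SpecularImplicitG(float D, float3 F)
    {
        return D * F * 0.25;
    }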



#3 MJP   Moderators   -  Reputation: 11584


Posted 12 January 2013 - 09:27 PM

Having an implicit geometry term (which basically equates to not having one) has been used on current-gen console hardware in order to save performance. If you have the ALU to spare then you definitely want to use one; it's an important part of getting the right look at grazing angles (especially for materials with higher roughness values). It's also an important part of maintaining energy conservation with microfacet BRDFs.
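
For reference, one commonly used explicit choice is the Smith geometry term with Schlick's approximation. This is just my example, since the post above doesn't prescribe a specific term, and the roughness remapping below is only one convention:

    // Smith geometry term with Schlick's approximation (one common choice;
    // the posts above don't prescribe a specific G). The remapping
    // k = alpha / 2 with alpha = roughness^2 is one convention; engines differ.
    float G1_Schlick(float NdotX, float k)
    {
        return NdotX / (NdotX * (1.0 - k) + k);
    }

    float G_Smith(float NdotL, float NdotV, float roughness)
    {
        float alpha = roughness * roughness;
        float k = alpha * 0.5;
        return G1_Schlick(NdotL, k) * G1_Schlick(NdotV, k);
    }

Compared to the implicit term, this one darkens the specular lobe at grazing angles and for rougher surfaces, which is the shadowing/masking effect described above.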



#4 B_old   Members   -  Reputation: 665


Posted 13 January 2013 - 10:27 AM

Thanks!



#5 B_old   Members   -  Reputation: 665


Posted 14 January 2013 - 02:47 PM

I'm wondering about the 4 * dot(n, l) * dot(n, v) term. It can become 0. Are you applying some max(x, epsilon) to the individual dot products, or to the whole term, or something different?



#6 Bacterius   Crossbones+   -  Reputation: 9065


Posted 14 January 2013 - 03:37 PM

I'm wondering about the 4 * dot(n, l) * dot(n, v) term. It can become 0. Are you applying some max(x, epsilon) to the individual dot products, or to the whole term, or something different?

 

It only becomes zero at a very grazing angle between the view vector (or light vector) and the surface, but in practice you'll probably want a few epsilons there just to guard against division by zero, since it'll mess up your entire pipeline if you end up with pixels of infinite brightness. I've seen a good shader implementation of Cook-Torrance somewhere on the internet which shows how to do those checks correctly and efficiently, though I can't find it right now.

 

Of course if you can cancel things out, make sure you do, as the safest and fastest operation is the one you never compute. You'll notice the dot(N, L) one is automatically cancelled out by the Lambertian cosine term, for instance.
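
A minimal sketch of that kind of guard (the epsilon value and structure are illustrative, not taken from a specific implementation). The dot(N, L) factor in the denominator is cancelled against the lighting cosine, so only dot(N, V) needs clamping before the divide:

    // Guarded microfacet specular, already multiplied by the NdotL
    // lighting cosine. The NdotL in the BRDF denominator cancels with
    // that cosine: (D*F*G / (4*NdotL*NdotV)) * NdotL == D*F*G / (4*NdotV).
    float3 SpecularGuarded(float D, float3 F, float G, float3 N, float3 V)
    {
        const float EPS = 1e-4; // illustrative epsilon; tune for your pipeline
        float NdotV = max(dot(N, V), EPS);
        return D * F * G / (4.0 * NdotV);
    }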






