# Rendering Equation, Cosine Law and Inverse PDF Transform

### #1
Crossbones+ - Reputation: **9888**

Posted 27 September 2012 - 03:06 PM

I understand why the BRDF is defined as a ratio of exitant radiance to incident irradiance, and the $\cos\theta_i$ factor is required because of Lambert's cosine law, which states that the radiant intensity observed from a surface is proportional to the cosine of the angle between the viewing direction and the surface normal.

So the differential irradiance arriving from direction $\omega_i$ is $dE = L_i(\omega_i)\cos\theta_i\,d\omega_i$, which is then converted back to radiance through the BRDF. So this $\cos\theta_i$ term has to be there and is independent of the BRDF used.
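For reference, a standard statement of these definitions (the original formula images were lost, so the notation here is an assumption, with $\omega_i$, $\omega_o$ the incident and exitant directions):

```latex
f_r(\omega_i, \omega_o)
  = \frac{dL_o(\omega_o)}{dE(\omega_i)}
  = \frac{dL_o(\omega_o)}{L_i(\omega_i)\,\cos\theta_i\,d\omega_i}
\qquad
L_o(\omega_o) = \int_{\Omega} f_r(\omega_i, \omega_o)\,
                L_i(\omega_i)\,\cos\theta_i\,d\omega_i
```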

Now my question is: when implementing BRDFs for global illumination, say with path tracing, we never use the "raw form" which takes two angles and returns the reflectance; instead we take one angle and probabilistically generate the other. Typically we generate a random angle in the hemisphere and work out the corresponding reflectance. This is often satisfactory, but can be inefficient for spiky distributions. But suppose we importance-sampled the BRDF with inverse transform sampling, such that given one angle $\omega_i$, we returned an angle $\omega_o$ with probability density proportional to $f_r(\omega_i, \omega_o)$.

Then it would seem that with this sampling scheme, evaluating the BRDF becomes unnecessary, and the reflectance per sampled angle becomes a constant equal to the normalization factor (such as $\rho/\pi$ for diffuse), as the correct distribution is already encoded in the inverse transform used to come up with the second angle with the right probability.
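For the diffuse case this intuition can be checked directly. A minimal sketch, assuming a Lambertian BRDF with an assumed albedo `rho = 0.8` and a pdf proportional to $f_r \cos\theta$ (the usual cosine-weighted scheme, obtained by inverting the CDF in $\theta$):

```python
import math, random

def sample_cosine_hemisphere():
    """Sample a direction with pdf(w) = cos(theta) / pi over the hemisphere.
    theta = asin(sqrt(u1)) inverts the marginal CDF in theta."""
    u1, u2 = random.random(), random.random()
    r = math.sqrt(u1)                     # sin(theta)
    phi = 2.0 * math.pi * u2
    x, y = r * math.cos(phi), r * math.sin(phi)
    z = math.sqrt(max(0.0, 1.0 - u1))     # cos(theta), always > 0 since u1 < 1
    return (x, y, z)

rho = 0.8              # diffuse albedo (assumed value for illustration)
f = rho / math.pi      # Lambertian BRDF, constant over the hemisphere

# Per-sample factor f * cos(theta) / pdf: the cosine and the pi cancel,
# leaving just the albedo (up to floating point), whatever direction was drawn.
w = sample_cosine_hemisphere()
cos_theta = w[2]
pdf = cos_theta / math.pi
weight = f * cos_theta / pdf
print(weight)          # equals rho for every sample
```

Note that the whole per-sample factor, cosine included, collapses to a constant only because the pdf was chosen proportional to the BRDF *times* the cosine; this is the estimator-side version of the intuition.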

Is my intuition correct? And if so, can all BRDFs be sampled in this way? I know diffuse, specular, Phong, etc. can, but I'm not sure whether the more complicated ones can be efficiently inverse transform sampled, especially when throwing in additional factors such as position and wavelength. Clearly the inverse CDF can be computed in principle, but is doing so always about as efficient as just evaluating the BRDF itself, or could it become intractable?
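As a concrete case where the inverse CDF exists in closed form, here is a sketch of sampling a normalized Phong lobe (the exponent value and the frame convention are assumptions for illustration):

```python
import math, random

def sample_phong_lobe(n):
    """Inverse-transform sample of the normalized Phong lobe
    pdf(alpha, phi) = (n + 1) / (2 * pi) * cos(alpha)^n, expressed in a
    frame whose z axis is the mirror-reflection direction."""
    u1, u2 = random.random(), random.random()
    cos_a = u1 ** (1.0 / (n + 1))   # inverts CDF(alpha) = 1 - cos^(n+1)(alpha)
    sin_a = math.sqrt(max(0.0, 1.0 - cos_a * cos_a))
    phi = 2.0 * math.pi * u2
    return (sin_a * math.cos(phi), sin_a * math.sin(phi), cos_a)

random.seed(1)
n = 10
samples = [sample_phong_lobe(n) for _ in range(20000)]
mean_cos = sum(s[2] for s in samples) / len(samples)
# Analytically E[cos(alpha)] = (n + 1) / (n + 2) under this pdf,
# so the sampled mean should land close to 11/12 for n = 10.
print(mean_cos)
```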

--------

Secondly, as I understand it, bidirectional path tracing still requires me to connect various intersection points to create a complete light path, and at such a connection both angles are already given, so inverse transform sampling does not apply there and I would still need to keep the original BRDF formula around, correct? I could still importance-sample the camera and light subpaths, though.
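At such a connection the evaluation takes both directions as inputs and just returns the reflectance. A hypothetical sketch for the Lambertian case (`eval_lambertian` and its arguments are illustrative, not from any particular renderer):

```python
import math

def eval_lambertian(rho, n, wi, wo):
    """Evaluate the Lambertian BRDF for two *given* directions, as needed
    when connecting subpath vertices in bidirectional path tracing.
    Returns 0 if either direction lies below the surface."""
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))
    if dot(n, wi) <= 0.0 or dot(n, wo) <= 0.0:
        return 0.0
    return rho / math.pi   # constant, independent of the direction pair

n = (0.0, 0.0, 1.0)        # surface normal
f = eval_lambertian(0.8, n, (0.0, 0.6, 0.8), (0.6, 0.0, 0.8))
print(f)                   # rho / pi
```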

--------

Thirdly and lastly, I do not understand the specular BRDF. It says:

$$f_r(\omega_i, \omega_o) = \rho_s \, \frac{\delta\!\left(\omega_i - R(\omega_o, n)\right)}{\cos\theta_i}$$

So the top part (the delta function) basically states that the reflectance shall be zero whenever the two angles do not obey the law of reflection, but I do not understand the bottom part (the division by $\cos\theta_i$). I see why the cosine term should be cancelled out from a physical point of view but cannot explain it properly: is it just because the area projected by the solid angle becomes smaller and smaller as it grazes the surface, such that the exitant radiance (defined as irradiance per steradian) has to be correspondingly greater? And of course the $\rho_s$ factor is just an absorption factor.
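For what it's worth, substituting the delta form of the specular BRDF, $f_r = \rho_s\,\delta(\omega_i - R(\omega_o, n))/\cos\theta_i$ (a standard formulation, stated here as an assumption since the original formula image was lost), into the reflected-radiance integral shows the cancellation explicitly:

```latex
L_o(\omega_o)
  = \int_{\Omega} \rho_s\,
      \frac{\delta\!\left(\omega_i - R(\omega_o, n)\right)}{\cos\theta_i}\,
      L_i(\omega_i)\,\cos\theta_i\,d\omega_i
  = \rho_s\, L_i\!\left(R(\omega_o, n)\right)
```

Without the $1/\cos\theta_i$, a mirror would reflect $\rho_s\,L_i\cos\theta_i$ and darken at grazing angles, which a perfect mirror should not do.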

--------

The last paragraph made me think: the explanation I gave for the cosine term in the specular BRDF could apply to pretty much any other material, so if it is correct, aren't all BRDFs going to cancel out that factor ultimately? Can it be safely removed for clarity of implementation?
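One way to probe this numerically (a sketch with an assumed albedo): under uniform hemisphere sampling of a diffuse surface, the per-sample factor $f_r \cos\theta / p$ still varies with the cosine, which suggests the cancellation is a property of the chosen sampling pdf rather than of the BRDF itself.

```python
import math, random

random.seed(2)

def sample_uniform_hemisphere():
    """Sample a direction with constant pdf = 1 / (2 * pi)."""
    u1, u2 = random.random(), random.random()
    z = u1                               # cos(theta), uniform in [0, 1)
    r = math.sqrt(max(0.0, 1.0 - z * z))
    phi = 2.0 * math.pi * u2
    return (r * math.cos(phi), r * math.sin(phi), z)

rho = 0.8                  # assumed diffuse albedo
f = rho / math.pi          # Lambertian BRDF
pdf = 1.0 / (2.0 * math.pi)

# The per-sample factor f * cos(theta) / pdf = 2 * rho * cos(theta) varies
# from sample to sample: the cosine survives unless the sampling pdf
# absorbs it (as cosine-weighted sampling does). Its mean still converges
# to rho, the directional-hemispherical reflectance.
weights = [f * sample_uniform_hemisphere()[2] / pdf for _ in range(10000)]
mean_w = sum(weights) / len(weights)
print(min(weights), max(weights), mean_w)
```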

Your thoughts? Thanks!

The slowsort algorithm is a perfect illustration of the multiply and surrender paradigm, which is perhaps the single most important paradigm in the development of reluctant algorithms. The basic multiply and surrender strategy consists in replacing the problem at hand by two or more subproblems, each slightly simpler than the original, and continue multiplying subproblems and subsubproblems recursively in this fashion as long as possible. At some point the subproblems will all become so simple that their solution can no longer be postponed, and we will have to surrender. Experience shows that, in most cases, by the time this point is reached the total work will be substantially higher than what could have been wasted by a more direct approach.

- *Pessimal Algorithms and Simplexity Analysis*

### #2
Crossbones+ - Reputation: **9888**

Posted 04 October 2012 - 12:27 AM
