
# Path Tracing BSDF


## Recommended Posts

*** Follow this project on this journal: http://www.gamedev.net/blog/1455-path-tracing-is-fun/ ***

Hello,
I've been working on my path tracer for a while, and I need some feedback on how to go forward and check if what I've done is correct. Basically what I have is the following (pretty standard):

- generate a camera ray
- for each light ray bounce:
  - intersect the ray with the scene
  - calculate the BSDF at the intersection, updating the radiance and generating a new ray according to the BSDF's density function
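
In rough Python-style pseudocode (my actual code is Pascal; `intersect`, `Hit` and the material's `sample` method are made-up names for illustration), the loop above looks like this:

```python
class Hit:
    """Minimal intersection record: surface normal plus the material hit."""
    def __init__(self, normal, material):
        self.normal = normal
        self.material = material

def trace(intersect, ray, max_bounces=8):
    """One camera ray: accumulate emitted light along the path, attenuating
    it by each surface color, exactly as the steps above describe."""
    radiance = (0.0, 0.0, 0.0)
    throughput = (1.0, 1.0, 1.0)
    for _ in range(max_bounces):
        hit = intersect(ray)
        if hit is None:
            break  # ray escaped the scene
        # the BSDF gives the surface's emittance, its color, and the next ray
        emitted, color, ray = hit.material.sample(ray, hit.normal)
        radiance = tuple(r + t * e for r, t, e in zip(radiance, throughput, emitted))
        throughput = tuple(t * c for t, c in zip(throughput, color))
    return radiance
```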

I've tried to make the path tracer as modular as possible, heavily using classes for primitive types (I've got sphere, plane, triangle, and a vertical, bounded-open, infinitely thin cylinder - the cylinder was mostly for tests and needs to be generalised) and materials (diffuse, specular, plastic, "metal-like", light source, dielectric glass with Fresnel reflection, and a few other test materials). I've got a standard perspective camera with a tweakable FOV setting (hooray). I am a little bummed about my scene graph implementation, though - I seem to be hopeless at implementing octrees, kd-trees, etc. I don't know why, but I keep failing miserably. That's an issue for a later time (sort of).

Now I am mostly wondering: have I actually got the theory right? I've looked online, and BSDFs apparently deal with angles and ratios, but I fail to understand how one implements that. So this is my interpretation of it (with some help from trusty Google):

BSDF (Bidirectional Scattering Distribution Function) -> Input: an incident ray and the normal of the intersected surface. -> Output: a new ray, plus an RGB color/emittance pair for the surface (it should be a spectral result, but I'm approximating here).

So, for instance, a diffuse BSDF would return a random ray within the hemisphere centered on the normal, a color equal to the surface's color, and an emittance of zero (since it emits no light). A "light source BSDF" (which makes no physical sense but standardises the material types) would return a color of zero and an emittance equal to the intensity of the light times its color. Then you get semi-emissive materials, which return nonzero emittance and color (such as a glowy sphere). For glass, you get a BRDF+BTDF (reflection+refraction) combo, the two separated by a random trial controlled by the Fresnel coefficient. And so on... then the color/emittance pairs are combined into a final radiance value which becomes the pixel's color.
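
To make the diffuse case concrete, here is a Python sketch of what I mean (uniform hemisphere sampling; note a properly importance-sampled Lambertian BSDF would use cosine-weighted sampling and fold the pdf into the returned color - this is the naive version):

```python
import math, random

def sample_hemisphere(normal):
    """Uniform direction in the hemisphere around `normal`: sample the unit
    ball by rejection, normalise, and flip to the normal's side."""
    while True:
        d = (random.uniform(-1.0, 1.0),
             random.uniform(-1.0, 1.0),
             random.uniform(-1.0, 1.0))
        n2 = d[0]*d[0] + d[1]*d[1] + d[2]*d[2]
        if 1e-9 < n2 <= 1.0:
            break
    inv = 1.0 / math.sqrt(n2)
    d = (d[0]*inv, d[1]*inv, d[2]*inv)
    if d[0]*normal[0] + d[1]*normal[1] + d[2]*normal[2] < 0.0:
        d = (-d[0], -d[1], -d[2])  # flip into the upper hemisphere
    return d

def diffuse_sample(albedo, normal):
    """Diffuse BSDF as described: zero emittance, the surface color,
    and a random bounce direction."""
    return (0.0, 0.0, 0.0), albedo, sample_hemisphere(normal)
```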

These are the results I get on a few scenes:
Standard Cornell Box
Sphere Pyramid
Cylinder caustics
Random stuff (the dragon on the yellow background is made of glass)

What do you guys think? Is my BSDF implementation physically correct, or is it blatantly wrong? I have nothing to compare it to (except the standard Cornell box scene, but that's the simplest case and only uses a couple of materials). And I have no glass dragon or pocket Cornell box to test it against in real life.

Thanks for the feedback and help - I would like to know that what I've done is correct before moving on to the more complex stuff (more complex models and materials, and perhaps bidirectional path tracing). I can give some code if needed, but be warned that it's written in Pascal (I made sure it was readable, so it should be ok).

Also - how should I go about simulating subsurface scattering (which is needed for realistic metal materials)? I was thinking of passing the primitive object to the BSDF so it can calculate limits on where the ray should exit, but that's not very efficient. Is it good enough to just approximate it and have the ray exit from a random origin in the surface normal's plane, within a small radius?
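
To illustrate, the approximation I have in mind would be something like this (Python sketch with made-up names; it just picks a uniformly distributed exit point on the tangent disc at the entry point, ignoring the actual surface away from it):

```python
import math, random

def tangent_disc_exit(point, normal, radius):
    """Approximate SSS exit point: a uniformly distributed point on the disc
    of the given radius in the tangent plane at `point`."""
    # build an orthonormal basis (t, b) perpendicular to the unit normal
    a = (1.0, 0.0, 0.0) if abs(normal[0]) < 0.9 else (0.0, 1.0, 0.0)
    t = (normal[1]*a[2] - normal[2]*a[1],
         normal[2]*a[0] - normal[0]*a[2],
         normal[0]*a[1] - normal[1]*a[0])
    tlen = math.sqrt(t[0]*t[0] + t[1]*t[1] + t[2]*t[2])
    t = (t[0]/tlen, t[1]/tlen, t[2]/tlen)
    b = (normal[1]*t[2] - normal[2]*t[1],
         normal[2]*t[0] - normal[0]*t[2],
         normal[0]*t[1] - normal[1]*t[0])
    # uniform sample on the disc (the sqrt keeps the density uniform in area)
    r = radius * math.sqrt(random.random())
    phi = 2.0 * math.pi * random.random()
    return tuple(p + r*math.cos(phi)*tc + r*math.sin(phi)*bc
                 for p, tc, bc in zip(point, t, b))
```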

##### Share on other sites
The renders look very nice. What you wrote makes sense to me. You don't have any acceleration structure in place, yet? How long do you trace such scenes then?
Regarding SSS, you'll have to read some papers on that, I'm afraid. I could only help with realtime rasterised SSS (mostly for skin, which is quite fake but nice and fast). However, it does sound reasonable that the reflected (scattered) ray should exit randomly somewhere near the entrance with a changed direction and radiance (colour), depending on surface properties (where to get them from?). That is, intuitively, how it works.

##### Share on other sites
> The renders look very nice. What you wrote makes sense to me. You don't have any acceleration structure in place, yet? How long do you trace such scenes then?

I do have some kind of octree in place, but it's not efficient (only a 10x speedup or so for the 100k-triangle-each dragons) and extremely fragile (bad parameters will cause holes in the meshes). The dragon renders took several hours. I'm desperately trying to implement something efficient, but all the kd-tree material I find online is either a) beginner-level tutorials, b) impossibly academic papers, or c) nearest-neighbour searches, which are irrelevant to my problem.
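
For what it's worth, whatever structure I end up with (kd-tree, BVH, BIH), the traversal inner loop seems to reduce to the same ray/AABB slab test, so here is a Python sketch of that building block (the caller precomputes the reciprocal ray direction; near-zero direction components need clamping to a large finite value, since Python raises on division by zero):

```python
def ray_aabb(origin, inv_dir, box_min, box_max):
    """Slab test: returns (t_near, t_far) if the ray hits the box, else None.
    `inv_dir` is the componentwise reciprocal of the ray direction.
    The caller should additionally reject hits with t_far < 0 (box behind)."""
    t_near, t_far = -float('inf'), float('inf')
    for o, inv, lo, hi in zip(origin, inv_dir, box_min, box_max):
        t0 = (lo - o) * inv
        t1 = (hi - o) * inv
        if t0 > t1:
            t0, t1 = t1, t0  # order the slab entry/exit distances
        t_near = max(t_near, t0)
        t_far = min(t_far, t1)
        if t_near > t_far:
            return None  # the slabs' intervals don't overlap: miss
    return t_near, t_far
```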

Let's just say it's not quite realtime. The simple scenes take a few seconds to render with average noise, and minutes to hours to make them sweet and smooth looking. The more complex scenes take one to two orders of magnitude longer, which is why I need to find a way to accelerate them. I'd be happy with 12-hour very-high-quality renders (since I can parallelize them over multiple computers), but I'm not ok with 2-week renders.

To give an idea, the "Random stuff" scene, such as it is, took exactly 27 hours, 11 minutes and 40 seconds (I logged it for performance comparisons later on).

> However, it does sound reasonable that the reflected (scattered) ray should exit randomly somewhere near the entrance with a changed direction and radiance (colour), depending on surface properties (where to get them from?)
I grab the surface properties from the material "index" of each primitive, which points to a particular material defining the surface properties. What I was wondering is whether it is acceptable to have rays exit at some point which isn't actually part of the primitive's surface, as an approximation (ensuring rays always exit from a point located exactly on the primitive's surface is a much harder problem, as it's not generic and not even local to a given primitive).

##### Share on other sites
I must say that you have indeed beautiful renders, which would deserve more complex scenes (e.g. the Sponza Atrium).

If you have trouble with kd-trees and the like, try something easier: the BIH (bounding interval hierarchy) is not as fast as a kd-tree with SAH, but estimates suggest it reaches 60-70% of its speed, so it may help you a lot, and it is not so hard to implement (I did implement one as well, but I would rather not give you my implementation, as for some reason it is really slow).

You can find the paper by googling, and the ompf.org forum used to have several threads on that topic, with many implementations discussed (including my own). The ompf.org domain no longer points to the forum, but a (temporary?) replacement has been opened. I don't know if those threads have been moved, but if you ask there, they will help you.

And you could consider buying "Physically Based Rendering: From Theory to Implementation": I have the first edition and it is truly amazing. In the second (IIRC) they show how to implement both BVHs and SSS. If you are interested in ray tracing/GI you should buy that book. Hope this helps.

##### Share on other sites
I will definitely need to get my hands on Physically Based Rendering - I've only heard good things about it. I hope to purchase it soon, as it seems to be an excellent resource.

I have seen several mentions of the ompf.org forum scattered around the internet, but as you said, the site is down (and has been for a while, apparently). I wasn't aware a replacement forum was up; I will need to check that out. The ompf people seem to be the experts on all things ray tracing.

As for the SSS - well, not all metals, obviously (my use of the word "metals" was a bit general), but apparently some metals exhibit some form of SSS. I think that has more to do with the actual surface geometry than with the phenomenon itself, though. SSS matters much more for marble, milk, etc. It is still a required ingredient in realistic path tracing - plastic dragons just don't cut it. That said, it will probably require me to rethink my BSDF implementation design.

##### Share on other sites
Offhand, I think metals are the only substances that explicitly *do not* exhibit scattering behavior. On the other hand, all dielectric materials (read: nonmetallic substances) do exhibit at least minor scattering tendencies.

##### Share on other sites

> Offhand, I think metals are the only substances that explicitly *do not* exhibit scattering behavior. On the other hand, all dielectric materials (read: nonmetallic substances) do exhibit at least minor scattering tendencies.

Quite true. I'm currently experimenting with some SSS models for dielectrics, and it is fairly hard to implement. The concept is very simple, but the actual math of how it works is quite tricky. And it slows down the rendering quite a bit, since a physically correct SSS model must do several small ray steps until the ray exits the material.
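
The core of those small steps is just sampling a free-flight distance from the medium's exponential transmittance and then picking a new scattering direction; a Python sketch (homogeneous, isotropic medium assumed):

```python
import math, random

def sample_free_path(sigma_t):
    """Distance to the next scattering event in a homogeneous medium with
    extinction coefficient sigma_t, drawn from the exponential transmittance
    T(t) = exp(-sigma_t * t). The mean free path is 1 / sigma_t."""
    return -math.log(1.0 - random.random()) / sigma_t

def sample_isotropic():
    """Uniform direction on the unit sphere, used as the new scattering
    direction after each step (real media would use a phase function)."""
    z = 2.0 * random.random() - 1.0
    phi = 2.0 * math.pi * random.random()
    s = math.sqrt(max(0.0, 1.0 - z * z))
    return (s * math.cos(phi), s * math.sin(phi), z)
```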

I also noticed that SSS actually reduces naturally to a standard BSDF (diffuse, specular, even dielectric refraction) if the material is completely opaque or completely transparent. But I guess the standard BSDFs can stay as faster, specialized versions of the actual underlying SSS model.

Is the restriction of having closed models not too... restrictive? As far as I know, SSS requires a bounded volumetric shape to work - otherwise it doesn't make sense.

##### Share on other sites
Watertight meshes are pretty par for the course, I don't think you'd earn any ire by placing that restriction.

Also, instead of mucking about with actually bouncing rays around, I would *highly* suggest using some of Jensen's top-notch work on the dipole diffusion approximation. It's really good stuff.
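
From memory, the diffuse reflectance term of Jensen's dipole model looks like the sketch below (Python; I'm writing the constants from memory, so double-check them against the 2001 paper before trusting it):

```python
import math

def dipole_rd(r, sigma_a, sigma_s_prime, eta=1.3):
    """Diffuse reflectance R_d(r) of the dipole model (Jensen et al. 2001),
    written from memory -- verify against the paper. `r` is the distance
    between the entry and exit points on the surface."""
    sigma_t_prime = sigma_a + sigma_s_prime               # reduced extinction
    alpha_prime = sigma_s_prime / sigma_t_prime           # reduced albedo
    sigma_tr = math.sqrt(3.0 * sigma_a * sigma_t_prime)   # effective transport coeff.
    # diffuse Fresnel reflectance fit for relative index of refraction eta
    fdr = -1.440 / (eta * eta) + 0.710 / eta + 0.668 + 0.0636 * eta
    a_coef = (1.0 + fdr) / (1.0 - fdr)
    z_r = 1.0 / sigma_t_prime                 # depth of the real source
    z_v = z_r * (1.0 + 4.0 * a_coef / 3.0)    # height of the virtual source
    d_r = math.sqrt(r * r + z_r * z_r)        # distance to the real source
    d_v = math.sqrt(r * r + z_v * z_v)        # distance to the virtual source
    return (alpha_prime / (4.0 * math.pi)) * (
        z_r * (sigma_tr * d_r + 1.0) * math.exp(-sigma_tr * d_r) / d_r**3
        + z_v * (sigma_tr * d_v + 1.0) * math.exp(-sigma_tr * d_v) / d_v**3)
```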

##### Share on other sites

> Watertight meshes are pretty par for the course, I don't think you'd earn any ire by placing that restriction.
>
> Also, instead of mucking about with actually bouncing some rays around, I would *highly* suggest using some of Jensen's top-notch work on dipole transmittance. It's really good stuff

Thanks for the link, I will have a look. Simulating SSS with tiny ray bounces is far too computationally intensive.

##### Share on other sites
Sorry for the long absence from this thread. I am currently looking at bidirectional path tracing to improve my render speeds; then I will implement a BVH (and then I'll be looking at implementing all the materials I'm missing, i.e. SSS and other stuff).

I like bidirectional path tracing because it completely solves the inconsistency I'm facing with lights - right now I am forced to give them a null BSDF, which is kind of stupid, whereas with BDPT they get their own status, which is more logical.

I have come to the conclusion that BDPT with naive weighting of the sampling strategies (i.e. averaging all the paths) is fairly easy to implement, just a bit tricky with all the special cases. Multiple importance sampling, however, is much more difficult.
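
For reference, the combination weight itself is tiny - the hard part of MIS is computing, for every path, the pdf each other strategy would have assigned to that same path. A Python sketch of Veach's balance heuristic (my naming):

```python
def balance_heuristic(pdfs, i):
    """Veach's balance heuristic: the MIS weight for a sample drawn from
    strategy i, given the pdfs with which every strategy could have produced
    that sample. The weights across all strategies sum to one."""
    return pdfs[i] / sum(pdfs)
```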

Hopefully I can make some time for trying to implement BDPT in between uni work.