Path Tracing BSDF

I just noticed I was missing normal smoothing (my glass buddha I'm currently rendering looks particularly bad because of this). This should be an easy fix provided I can get my hands on some vertex normal data for models I use. I am not sure yet how to handle it in case this information is missing, it would require doing some mesh analysis to see which vertices are shared and averaging the normals of adjacent faces. This would require rethinking how primitives classes are designed (I will look at it when I implement geometry instancing).
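Roughly, the averaging would look something like this (just a sketch, assuming an indexed mesh where shared vertices already share an index, so no welding pass is needed; all names here are illustrative, not from my actual code):

#include <cmath>
#include <cstddef>
#include <vector>

struct Vec3 { float x, y, z; };

static Vec3 sub(Vec3 a, Vec3 b)   { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 add(Vec3 a, Vec3 b)   { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 cross(Vec3 a, Vec3 b) { return {a.y * b.z - a.z * b.y,
                                            a.z * b.x - a.x * b.z,
                                            a.x * b.y - a.y * b.x}; }
static Vec3 normalize(Vec3 v)
{
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return {v.x / len, v.y / len, v.z / len};
}

// One smoothed normal per vertex: accumulate the (un-normalized, hence
// area-weighted) face normal of every triangle that uses the vertex,
// then normalize the sums.
std::vector<Vec3> smoothVertexNormals(const std::vector<Vec3>& positions,
                                      const std::vector<unsigned>& indices)
{
    std::vector<Vec3> normals(positions.size(), Vec3{0.0f, 0.0f, 0.0f});
    for (std::size_t i = 0; i + 2 < indices.size(); i += 3) {
        unsigned i0 = indices[i], i1 = indices[i + 1], i2 = indices[i + 2];
        Vec3 faceN = cross(sub(positions[i1], positions[i0]),
                           sub(positions[i2], positions[i0]));
        normals[i0] = add(normals[i0], faceN);
        normals[i1] = add(normals[i1], faceN);
        normals[i2] = add(normals[i2], faceN);
    }
    for (Vec3& n : normals) n = normalize(n);
    return normals;
}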


Are you randomly choosing diffuse / specular / etc. samples per intersection? Or are you generating N rays (if a surface has diff/spec N = 2) per intersection?

Also, have you experienced any gotchas with importance sampling the specular?
I'm randomly choosing a ray according to the material's BSDF, and only one ray per intersection: if a ray can, for instance, be either refracted or reflected, I choose the outcome with a random trial based on the Fresnel coefficient. I'm going to implement more features soon when I get the time, though.

“Also, have you experienced any gotchas with importance sampling the specular?”
I don't follow you: there is only one possible outcome for a perfectly specular surface (the PDF is a Dirac delta), so as far as I know you can't importance sample it. Or do you mean when the ray can be either diffusely or specularly reflected? I don't use importance sampling in that case either; I do a random trial (the probability describes how specular the surface is).
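Concretely, the random trial mentioned above amounts to something like this (just a sketch, assuming the Fresnel reflectance F has already been computed from the incident angle and the indices of refraction; the names are illustrative, not from the actual code):

#include <random>

enum class DielectricEvent { Reflect, Refract };

// Pick reflection with probability F (the Fresnel reflectance, in [0, 1]),
// refraction otherwise. The path throughput is later multiplied by F or
// (1 - F) and divided by the same selection probability, so the two cancel
// and the estimator stays unbiased.
DielectricEvent chooseDielectricEvent(float F, std::mt19937& rng)
{
    std::uniform_real_distribution<float> u01(0.0f, 1.0f);
    return (u01(rng) < F) ? DielectricEvent::Reflect : DielectricEvent::Refract;
}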


Ah sorry, I meant a microfacet-based specular (i.e. a Phong distribution).

“diffusely reflected or specularly reflected”

By 'specularly reflected' do you mean mirror, and by 'diffusely' glossy? In the glossy case there are direction generators/PDFs for importance sampling the distributions (Phong, etc.). I don't know how they compare to 'random trials'.

“I choose the outcome using a random trial by calculating the Fresnel coefficient”

Ah, okay.

Ah, no, by specular I mean perfectly specular (a mirror). I'm not sure what the correct term is, but yes, glossy would be my "wet plastic" material, which picks a random diffuse ray, computes the perfectly specular ray, and then does a random trial based on a parameter between 0 and 1 to decide which ray to use. It works pretty well (of course there are other generators for this; this is just the most basic and obvious one).
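Roughly, that choice looks like the sketch below (illustrative only, with a cosine-weighted diffuse sampler; s is the 0-to-1 "how specular" parameter, and none of these names come from the actual code):

#include <cmath>
#include <random>

struct Vec3 { float x, y, z; };

static Vec3 add(Vec3 a, Vec3 b)    { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 sub(Vec3 a, Vec3 b)    { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 scale(Vec3 v, float s) { return {v.x * s, v.y * s, v.z * s}; }
static float dot(Vec3 a, Vec3 b)   { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3 cross(Vec3 a, Vec3 b)  { return {a.y * b.z - a.z * b.y,
                                             a.z * b.x - a.x * b.z,
                                             a.x * b.y - a.y * b.x}; }
static Vec3 normalize(Vec3 v)      { return scale(v, 1.0f / std::sqrt(dot(v, v))); }

// Perfect mirror reflection of the outgoing direction wo about the normal n.
static Vec3 reflect(Vec3 wo, Vec3 n) { return sub(scale(n, 2.0f * dot(wo, n)), wo); }

// Cosine-weighted direction in the hemisphere around n (pdf = cos(theta) / pi).
static Vec3 sampleCosineHemisphere(Vec3 n, std::mt19937& rng)
{
    std::uniform_real_distribution<float> u01(0.0f, 1.0f);
    float u1 = u01(rng), u2 = u01(rng);
    float r = std::sqrt(u1), phi = 6.28318531f * u2;
    // Build a tangent frame (t, b, n) around the normal.
    Vec3 t = normalize(cross(std::fabs(n.x) > 0.1f ? Vec3{0, 1, 0} : Vec3{1, 0, 0}, n));
    Vec3 b = cross(n, t);
    return add(add(scale(t, r * std::cos(phi)), scale(b, r * std::sin(phi))),
               scale(n, std::sqrt(1.0f - u1)));
}

// "Wet plastic": pick the mirror ray with probability s, otherwise a
// cosine-weighted diffuse ray; exactly the random trial described above.
Vec3 sampleWetPlastic(Vec3 wo, Vec3 n, float s, std::mt19937& rng)
{
    std::uniform_real_distribution<float> u01(0.0f, 1.0f);
    return (u01(rng) < s) ? reflect(wo, n) : sampleCosineHemisphere(n, rng);
}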

Random trials have the issue that they are quite noisy, so if you can importance sample the distribution instead it's much better, but sometimes that isn't possible (for instance, dielectric refraction cannot be importance sampled; then again, the Fresnel coefficient is in general either very high or very low, so the noise is manageable).


Here is the latest render. As you can see, non-smoothed normals are pretty obvious, and they look especially bad on the glass buddha. I will be using this as my test scene from now on, as it has a fairly large number of triangles (roughly 270k) and lets me test various materials and settings. Later I might wrap an environment map around it, once I get around to implementing that. The lighting is a bit too extreme; I'm trying to put a table lamp model in there, but I can't seem to find a nice high-poly one for free.

[image: zlt3px.png (latest render of the test scene)]


Wow, how long did this render take? Love the stained glass buddha. The indirect lighting in the backside/shadows looks a tad dark.

By the way, what parts of your code are vectorized? Did you experience any tradeoffs between vectorizing an entire computation vs splitting the computation at branches? For example,

k = a() * (condition() ? b() : 1)

vs

k = a()
if (condition()) k *= b()

I don't really know how long it took, because I had to stop and resume it several times (I needed the processor for something else), but it did take a while. It was originally rendered at 1920x1080, but I downsampled it to make it look nicer.

“By the way, what parts of your code are vectorized? Did you experience any tradeoffs between vectorizing an entire computation vs splitting the computation at branches?”
I haven't actually vectorized anything yet because I'm still figuring out a lot of the theory, but in general I've tried to write the code efficiently (lots of loop hoisting, precomputation, and organizing computations so that I can vectorize them nicely when I get around to it). It's mostly simple, standard code at the moment.
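To make the trade-off in the question concrete, the two forms might look roughly like this with SSE (purely illustrative, not code from this project; the blend needs SSE4.1):

#include <smmintrin.h>   // SSE4.1 for _mm_blendv_ps

// Scalar, branched: b is only folded in when the condition holds.
float scalarForm(float a, float b, bool condition)
{
    float k = a;
    if (condition) k *= b;
    return k;
}

// Vectorized, branchless: both sides are computed for all four lanes and
// blended with a mask (all-ones lanes select a * b, zero lanes keep a).
// Per-lane divergence costs nothing, but the "untaken" work is never skipped.
__m128 simdForm(__m128 a, __m128 b, __m128 mask)
{
    __m128 ab = _mm_mul_ps(a, b);
    return _mm_blendv_ps(a, ab, mask);
}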

Note that I will be releasing the code freely, but for that I need more time (I'm quite busy right now, and I can't really get anything done in 10-minute sessions; I need to sit down for a few hours to be productive).

EDIT: I will be creating a journal for this project, because I think people are getting annoyed seeing this thread bubble back to the surface repeatedly. Link here: http://www.gamedev.n...tracing-is-fun/



“I do have some kind of octree in place but it's not efficient (only a 10x speedup or so for the 100k-triangle-each dragons) and extremely fragile (bad parameters will cause holes in the meshes).”


Had the same problem today when I was implementing an acceleration structure (an AABB with a uniform grid inside). It turns out the holes in my renders were caused by:
1. Floating-point precision issues (e.g. axis-aligned planes extremely close to a cell boundary); I have to be fairly conservative now.
2. Not invalidating intersections that occur outside the current cell/voxel: it turns out something in another cell can actually be closer than that intersection (see the sketch below).
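A minimal sketch of that second check, reduced to the part that matters (candidateTs stands in for the hit distances of the triangles stored in the current cell, negative meaning miss; all names here are made up):

#include <vector>

// tCellExit is the distance at which the ray leaves the current cell, and
// tClosest comes in as the closest hit found so far (or a large value).
// A triangle stored in this cell can extend into later cells, so a hit with
// t > tCellExit may still be beaten by geometry registered further along the
// ray; such a hit must not be accepted and must not stop the traversal.
bool closestHitInCell(const std::vector<float>& candidateTs,
                      float tCellExit, float& tClosest)
{
    bool found = false;
    for (float t : candidateTs) {
        if (t > 0.0f && t <= tCellExit && t < tClosest) {
            tClosest = t;
            found = true;
        }
    }
    return found;   // only a hit inside this cell lets the grid walk terminate
}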

