JoeJ

Reference Path Tracer - confused by division by Pi


I made a path tracer to have a reference for the real-time GI I'm working on; the image looks like this:

ray.png.520b93b8af6e40e9d1ca94e037b1e321.png

I had to add a division by Pi to the real-time version to make it match, which means I seem to have done it wrong for many years :)

However, I wrote the path tracer from memory without making sure it is correct. (It is very simple: no light sampling, just random directions that hit the light by luck.)

So I'm unsure, and I tried to make another reference render with Mitsuba (which failed) and with Blender:

blender.png.3908a8fd8c1025252420d75567df8ea7.png

Blender seems unable to handle flat shading, so the edges come out rounded by accident, but the image is surely too bright. I tried to disable all tone mapping / color management and use just gamma 2.2, but it's hard to trust the remaining color transform to match mine.

My confusion grew even larger, because the Blender image looks more like my initial real-time result.

 

Maybe someone can recommend another free path tracer that can handle simple Lambert diffuse and emissive polygons, or can you spot a bug / confirm my path tracer code?:

static vec Radiance (vec &pos, vec &norm, vec &albedo, const int bounce, const int maxBounces, const LinkMesh &mesh, const RaytraceMesh &tracer)
{
	vec rndDir = RandomDirCosineWeighted(norm);

	int hitIndex;
	float t = tracer.TraceRay (hitIndex, pos, rndDir, 0, FLT_MAX);
	if (hitIndex == -1) return vec(0,0,0); // ray escaped the scene

	pos += rndDir * t; // move to the hit point
	norm = mesh.mPolyNormals[hitIndex] * -1;

	vec indirectAlbedo = mesh.mPolyColors[hitIndex];
	float indirectEmission = mesh.mPolyEmission[hitIndex];
	vec emit = indirectAlbedo * indirectEmission;

	if (bounce < maxBounces)
		emit += Radiance (pos, norm, indirectAlbedo, bounce+1, maxBounces, mesh, tracer);

	emit = cmul(albedo, emit) / float(PI); // <- the division in question

	return emit;
}

My only confusion here is again the final division by PI. My thinking was: we project the hemisphere down to the unit circle, which has an area of Pi, so one ray has a weight of 1/Pi. Is this correct?
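The `RandomDirCosineWeighted` helper isn't shown in the post. For reference, a common way to generate such a direction is Malley's method: sample the unit disk uniformly, then project up onto the hemisphere, which yields a pdf of cos(theta)/pi. A minimal sketch, assuming a simple float vector type (all names here are my own, not from the original code):

```cpp
#include <algorithm>
#include <cmath>

// Minimal stand-in for the thread's `vec` type (assumption).
struct Vec3 { float x, y, z; };

static Vec3 cross(const Vec3 &a, const Vec3 &b) {
	return { a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x };
}
static Vec3 normalize(const Vec3 &v) {
	float len = std::sqrt(v.x*v.x + v.y*v.y + v.z*v.z);
	return { v.x/len, v.y/len, v.z/len };
}

// Cosine-weighted hemisphere sample around `norm` (Malley's method):
// u1, u2 are uniform random numbers in [0,1). pdf = cos(theta)/pi.
static Vec3 RandomDirCosineWeightedSketch(const Vec3 &norm, float u1, float u2) {
	const float PI = 3.14159265358979f;
	float r   = std::sqrt(u1);          // uniform disk radius
	float phi = 2.0f * PI * u2;
	float dx  = r * std::cos(phi);
	float dy  = r * std::sin(phi);
	float dz  = std::sqrt(std::max(0.0f, 1.0f - u1)); // = cos(theta)

	// Build an orthonormal basis around the normal.
	Vec3 up = std::fabs(norm.z) < 0.999f ? Vec3{0,0,1} : Vec3{1,0,0};
	Vec3 t  = normalize(cross(up, norm));
	Vec3 b  = cross(norm, t);
	return { t.x*dx + b.x*dy + norm.x*dz,
	         t.y*dx + b.y*dy + norm.y*dz,
	         t.z*dx + b.z*dy + norm.z*dz };
}
```

The returned direction is unit length and always on the `norm` side of the surface.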

To be complete, I call the recursive function above with per-pixel code like this:

vec pos = origin + ray * t; // move to primary hit
vec norm = mesh.mPolyNormals[hitIndex] * -1; // next path direction
vec albedo = mesh.mPolyColors[hitIndex]; // material at hit
float emission = mesh.mPolyEmission[hitIndex];
vec color = Tools::Radiance (pos, norm, albedo, 0, bounces, mesh, tracer); // path tracing
color += albedo * emission; // add emission from primary hit

// accumulate pixel with this color sample...

My color transform is just gamma:

outColor = min(1, pow(accumulatedColor, 1/2.2f));
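One possible source of a remaining mismatch when comparing against Blender: Blender's "Standard" view transform applies the piecewise sRGB curve (IEC 61966-2-1), which is close to, but not identical to, plain gamma 2.2; the two diverge most in the darks because sRGB has a linear toe. A sketch comparing the two encodings (assuming a single linear channel in [0,1]):

```cpp
#include <cmath>

// Plain gamma 2.2, as used in the post.
float Gamma22(float linear) {
	return std::pow(linear, 1.0f / 2.2f);
}

// Piecewise sRGB transfer function (IEC 61966-2-1): a linear segment
// near black, then a 2.4 power curve with offset and scale.
float LinearToSrgb(float linear) {
	if (linear <= 0.0031308f)
		return 12.92f * linear;
	return 1.055f * std::pow(linear, 1.0f / 2.4f) - 0.055f;
}
```

For midtones the two curves almost agree, but near black plain gamma 2.2 outputs noticeably brighter values than sRGB, which can make dark regions look washed out in a side-by-side comparison.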

 

Thanks for the help!

Edit: All renders use 10 bounces - I made sure of that.

I modeled the Cornell Box scene without using any reference for the materials and geometry, so it can't be compared against other, similar renders.


Edited by JoeJ


You have to remove the divide by PI, since it cancels out. In Monte Carlo path tracing you have to weight each contribution by the pdf of the sample taken. Since you only consider perfectly diffuse surfaces, your BRDF is c/pi, where c is your surface color, and due to your cosine-weighted direction sampling your pdf is cos(theta)/pi. This nicely cancels out the cosine of the rendering equation and the PI of your BRDF, leaving you only with the incoming radiance. Look here: http://www.rorydriscoll.com/2009/01/07/better-sampling/
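To spell out the cancellation: the Monte Carlo weight for one bounce is BRDF × cos(theta) / pdf, and with a Lambertian BRDF and cosine-weighted sampling both pi and cos(theta) cancel, leaving just the albedo. A tiny sketch for one color channel (function name is mine, for illustration):

```cpp
#include <cmath>

// Per-bounce Monte Carlo weight for a Lambertian surface under
// cosine-weighted hemisphere sampling.
//   BRDF   = albedo / pi         (Lambertian)
//   pdf    = cos(theta) / pi     (cosine-weighted sampling)
//   weight = BRDF * cos(theta) / pdf = albedo
float DiffuseBounceWeight(float albedoChannel, float cosTheta) {
	const float PI = 3.14159265358979f;
	float brdf = albedoChannel / PI;
	float pdf  = cosTheta / PI;
	return brdf * cosTheta / pdf; // pi and cos(theta) cancel -> albedoChannel
}
```

So the recursive estimator reduces to `L_o = emitted + albedo * L_i`, with no division by pi anywhere.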


@Batzer is correct here.

That being said, comparing images from two different renderers tends to be very hard - you need to set up your cameras, lights, and materials in exactly the same way (having everything PBR and in common units helps a lot here).

I've had some success in the past when I wrote real-time path tracers in CUDA (for my thesis) that used PBR definitions of materials, lights, and cameras, and compared them against a PBRT-generated image (you could also use LuxRender, which is somewhat similar to PBRT).

 


Thanks, guys!

After removing the division and taking the material properties from a Cornell scene that came with LuxRender, my path tracer and Blender already match perfectly :)

I had tried this before, but there is some reflection clamping in Blender that caused a difference because the light was too bright.

