# Photon Mapping : Photon Tracing

## Recommended Posts

It appears my photon map is finally working right. I am now attempting to render an image using "direct visualisation" of the photon map, that is, simply rendering each pixel's color as the product of the material color times the irradiance estimate obtained from the photon map... This produces garbled output (the colors are wrong).
The left wall should be red, the right wall should be blue, and the back wall should be yellow; the floor and ceiling are white. I believe my photon tracing function is what's wrong, because before I implemented the Russian roulette method, I was actually getting somewhat coherent results. If anyone who is knowledgeable about photon mapping would like to help by taking a look at my photon tracing code (I believe it's rather intuitive to follow), it would be greatly appreciated.
```cpp
void CRenderer::TracePhoton(const CScene& Scene, const CRay& Ray, const CColor3f& Power, unsigned int TraceDepth)
{
    // Make sure the max depth was not exceeded
    if (TraceDepth > m_MaxDepth)
        return;

    // Declare a vector for the intersection point
    CVector3 Intersect;

    // Declare a float for the intersection distance
    float Distance;

    // Perform ray casting in the scene
    CSceneObject* pObject = Scene.CastRay(Ray, Intersect, Distance);

    // Make sure an object was intersected
    if (!pObject)
        return;

    // Get the point attributes at the intersection point
    SPointAttributes Point = pObject->PointAttributes(Intersect);

    // Compute the average color at the point
    float ColorAverage = Point.Color.Average();

    // Compute the diffuse reflection probability
    float ProbDiffuse = Point.Diffuse * ColorAverage;

    // Compute the specular reflection probability
    float ProbSpecular = Point.Specular * ColorAverage;

    // Compute the transmission probability
    float ProbTransmit = Point.Transparent * ColorAverage;

    // Obtain a random number in the [0,1] range
    float RandomValue = RandomFloat(0.0f, 1.0f);

    // If this photon is to be diffusely reflected
    if (RandomValue < ProbDiffuse)
    {
        // Generate a diffuse reflection vector
        CVector3 DiffuseDirection = DiffuseVector(Point.Normal);

        // Compute the diffuse power (scaled up to compensate for the
        // photons terminated by the Russian roulette)
        CColor3f DiffusePower = (Power * Point.Color) / ProbDiffuse;

        // Trace the diffuse photon
        TracePhoton(Scene, CRay(Intersect, DiffuseDirection), DiffusePower, TraceDepth + 1);
    }
    // If this photon is to be specularly reflected
    else if (RandomValue < ProbDiffuse + ProbSpecular)
    {
        // Calculate the reflected direction
        CVector3 ReflectedDirection = ReflectedVector(Ray.GetDirection(), Point.Normal);

        // Compute the specular power
        CColor3f SpecularPower = (Power * Point.Color) / ProbSpecular;

        // Trace the reflected photon
        TracePhoton(Scene, CRay(Intersect, ReflectedDirection), SpecularPower, TraceDepth + 1);
    }
    // If this photon is to be transmitted
    else if (RandomValue < ProbDiffuse + ProbSpecular + ProbTransmit)
    {
        // Calculate the refracted direction
        CVector3 RefractedDirection = RefractedVector(Ray.GetDirection(), Point.Normal, Point.Refraction);

        // Compute the transmitted power
        CColor3f TransmitPower = (Power * Point.Color) / ProbTransmit;

        // Trace the refracted photon
        TracePhoton(Scene, CRay(Intersect, RefractedDirection), TransmitPower, TraceDepth + 1);
    }
    // This photon will be absorbed
    else
    {
        // Add a photon at this point
    }
}
```



##### Share on other sites
Looks fine to me (both code and the rendered image). The color results are naturally going to be a bit "off" because of the low quality of direct-visualization (you're going to need several million photons in most cases before a direct visual gives you a decent looking result). In any case everything looks correct; there is noticeable color bleeding near the corners of the walls, and the overall illumination patterns are correct - sphere casts a shadow, ceiling and floor are brightly lit near the light source, etc. At the very least I don't see any particular cause for concern. You may want to experiment with the diffuse direction sampler and constrain it a bit more towards the normal, but other than that it looks fine to me.

##### Share on other sites
Yep, I agree, this is what I get from my photon mapping implementation when I do a direct visualization...

http://www.stanford.edu/~jhuang11/images/cornellgood.jpg

##### Share on other sites
Quote:
 The color results are naturally going to be a bit "off" because of the low quality of direct-visualization (you're going to need several million photons in most cases before a direct visual gives you a decent looking result).

That's the problem. This was rendered with 2.6 million photons, a maximum distance per estimate of 0.25, and 100 photons per estimate. The colors seem all mixed up, and there is this weird color band near each wall intersection... Perhaps a filter for the irradiance estimate would help.

Quote:
 Original post by yellowjon: Yep, I agree, this is what I get from my photon mapping implementation when I do a direct visualization... http://www.stanford.edu/~jhuang11/images/cornellgood.jpg

Well, yours doesn't have the colors as mixed up as mine.

##### Share on other sites
There are a couple of things to try. First, like I mentioned before, you might look at a tighter constraint on the diffuse reflection vector; if the constraint is too wide, photons will tend to bounce into "unlikely" places and create unnatural looking lighting in direct visualization.
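One standard way to "constrain" the diffuse sampler toward the normal is cosine-weighted hemisphere sampling, which matches the Lambertian cos(theta)/pi density. Here is a minimal sketch; the `Vec3` type and helper names are illustrative stand-ins for the poster's `CVector3` utilities, not the original code:

```cpp
#include <cmath>
#include <cstdlib>

// Hypothetical minimal vector type standing in for CVector3.
struct Vec3 { float x, y, z; };

static Vec3 Cross(const Vec3& a, const Vec3& b)
{
    return Vec3{a.y * b.z - a.z * b.y,
                a.z * b.x - a.x * b.z,
                a.x * b.y - a.y * b.x};
}

static Vec3 Normalize(const Vec3& v)
{
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return Vec3{v.x / len, v.y / len, v.z / len};
}

static float RandomUnit() { return std::rand() / (float)RAND_MAX; }

// Cosine-weighted sample about the surface normal: directions near the
// normal are proportionally more likely, so diffusely bounced photons
// stop wandering off at grazing angles.
Vec3 CosineSampleHemisphere(const Vec3& n)
{
    float u1 = RandomUnit();
    float u2 = RandomUnit();
    float r = std::sqrt(u1);
    float phi = 2.0f * 3.14159265f * u2;

    // Local coordinates on the hemisphere (z points along the normal)
    float lx = r * std::cos(phi);
    float ly = r * std::sin(phi);
    float lz = std::sqrt(1.0f - u1); // = cos(theta)

    // Build an orthonormal basis (t, b, n) around the normal
    Vec3 a = (std::fabs(n.x) > 0.9f) ? Vec3{0.0f, 1.0f, 0.0f}
                                     : Vec3{1.0f, 0.0f, 0.0f};
    Vec3 t = Normalize(Cross(a, n));
    Vec3 b = Cross(n, t);

    return Vec3{t.x * lx + b.x * ly + n.x * lz,
                t.y * lx + b.y * ly + n.y * lz,
                t.z * lx + b.z * ly + n.z * lz};
}
```

Note that if `DiffuseVector` currently samples the hemisphere uniformly, swapping in a cosine-weighted sampler also changes the sampling PDF, so the Russian-roulette power scaling should be re-checked against it.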

The other thing to try would be to change the back wall from yellow to white. This won't fix anything, but it will disguise the inaccuracies of direct visualization. I suspect part of what makes your render look less correct is that it has an extra color blending into the lighting, which is unusual for a Cornell box render. However, it is important to remember that even though most renders don't look that mixed up, they actually are - you just can't see it because a large number of photons are white. Making them yellow exposes the inaccuracies.

In reality, though, it is actually accurate for there to be mixed-color photons all over the room. If you think of light being emitted from the light source in "blobs" rather than discrete photons, you can picture how it reflects around the room - after a few bounces it becomes very blurred and mixed. The important thing is not that you have some yellow or red or blue dots mixed into other colors; the important thing is that the average color is, overall, correct.

I don't know what the scale of your scene is, so 0.25 units for the gather means nothing to me, but I generally found that getting a nice direct visualization result takes a much larger gather radius than one might think. A filter obviously can be applied to smooth the results; in fact I've experimented with a pre-pass filter that smoothed the direct map in place (without visualizing it) to get better results in the final render. The results were fairly good but there are better tricks available. In any case, spending time on filtering these results is a waste, because your first-hit radiance estimate from the raytracer and a good final gathering system will take care of the apparent problems.
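For reference, the filter most often used on the radiance estimate is a cone filter: photons near the query point get full weight, photons near the edge of the gather radius get linearly less. A sketch for a single color channel, with illustrative (not original) names:

```cpp
#include <cmath>
#include <vector>

// One gathered photon: its power in one color channel and its distance
// from the query point. (Names are illustrative, not from the original code.)
struct GatheredPhoton { float power; float distance; };

// Cone-filtered irradiance estimate: weight each photon by
// 1 - d/(k*r), and renormalize the disc area by (1 - 2/(3k)).
// k >= 1 controls the filter sharpness.
float ConeFilteredIrradiance(const std::vector<GatheredPhoton>& photons,
                             float radius, float k)
{
    float sum = 0.0f;
    for (const GatheredPhoton& p : photons) {
        float w = 1.0f - p.distance / (k * radius);
        if (w > 0.0f)
            sum += p.power * w;
    }
    const float Pi = 3.14159265f;
    return sum / ((1.0f - 2.0f / (3.0f * k)) * Pi * radius * radius);
}
```

With k large the filter approaches the plain unweighted average over the disc; values just above 1 give the strongest smoothing of the edge photons.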

##### Share on other sites
Well, I will be performing further experiments ;)

In the meantime, is there a better exposure function to use than clamping the color to (1,1,1)? The floor and ceiling look extremely bright, while the walls look dim. I can intensify the light power, but then the floor will just look pure white.
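One common alternative to a hard clamp is an exponential exposure curve, which compresses bright values smoothly toward 1 instead of blowing them out. A minimal sketch; the exposure constant here is an arbitrary tuning knob, not a value from the thread:

```cpp
#include <cmath>

// Exponential exposure: maps [0, inf) smoothly into [0, 1).
// Higher exposure values brighten the image; dim areas keep most of
// their contrast while bright areas roll off instead of clipping.
float ExposureMap(float value, float exposure)
{
    return 1.0f - std::exp(-exposure * value);
}
```

Applied per channel to the estimated radiance before quantizing to 8-bit, this keeps the ceiling from saturating to pure white while still leaving the walls visibly dimmer.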

##### Share on other sites
Bright coloring is a side-effect of direct visualization. Because DV only looks at a very local area of photons to estimate the radiance, it doesn't create a fully accurate result without a huge number of very carefully balanced photons. Without a filtering function this becomes especially pronounced. These results will go away when you just sample the photon map for irradiance rather than doing DV.

##### Share on other sites
Quote:
 Original post by ApochPiQ: Bright coloring is a side-effect of direct visualization. Because DV only looks at a very local area of photons to estimate the radiance, it doesn't create a fully accurate result without a huge number of very carefully balanced photons. Without a filtering function this becomes especially pronounced. These results will go away when you just sample the photon map for irradiance rather than doing DV.

I am averaging the irradiance over the radius. What I actually meant is a better mapping function for the color. To answer your previous question, the radius I gave you earlier is in meters, and the room is at metric scale as well; it is about 4 meters wide.

##### Share on other sites
Try increasing your gather radius then. About 0.5 should do it; you can even play with much higher radii to see how it affects the final scene.

Also, remember that in direct visualization, you want to visualize radiance, not irradiance (unless you're just trying to debug the photon map). This means you need to pass the irradiance through the BRDF for each pixel to get a meaningful result. Visualizing irradiance will naturally look different than visualizing radiance, because a direct visualization of irradiance does not account for the surface properties of the materials in the scene. We're used to seeing radiance, so seeing irradiance of course looks a bit odd.

##### Share on other sites
How would you compute the radiance of a purely diffuse surface?

##### Share on other sites
For comparison purposes, here's a page of direct visualization images I generated with the photon mapper I wrote a year ago: clicky

##### Share on other sites
Quote:
 Original post by Max_Payne: How would you compute the radiance of a purely diffuse surface?

For a pure Lambertian surface, invert each incident photon's direction and weight its energy by the dot product of the inverted direction with the surface normal at the point of incidence.
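A sketch of that suggestion for a single color channel; the types and names here are illustrative, not from the original code, and the diffuse albedo is divided by pi as usual for a Lambertian BRDF:

```cpp
#include <cmath>
#include <vector>

// Hypothetical minimal vector type standing in for CVector3.
struct Vec3 { float x, y, z; };

static float Dot(const Vec3& a, const Vec3& b)
{
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// One gathered photon: its power (one channel for brevity) and the
// direction it was travelling when it hit the surface.
struct Photon { float power; Vec3 direction; };

// Lambertian radiance estimate: invert each photon's incoming direction,
// weight its power by the cosine with the surface normal, then scale by
// the diffuse albedo over pi and divide by the gather disc area.
float DiffuseRadiance(const std::vector<Photon>& photons,
                      const Vec3& normal, float albedo, float radius)
{
    float sum = 0.0f;
    for (const Photon& p : photons) {
        Vec3 toLight{-p.direction.x, -p.direction.y, -p.direction.z};
        float cosine = Dot(normal, toLight);
        if (cosine > 0.0f)
            sum += p.power * cosine;
    }
    const float Pi = 3.14159265f;
    return (albedo / Pi) * sum / (Pi * radius * radius);
}
```

Photons arriving from behind the surface (negative cosine) are skipped, since they contribute no reflected radiance to the front side.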
