
Path tracing - Incorrect lighting



#1 Arjan B   Members   -  Reputation: 621


Posted 03 January 2014 - 06:49 AM

I have started implementing a simple path tracer, but I have run into some problems: placing spheres in the scene leaves most of them not being lit at all.

 

Scene setup

First, let me show the setup of the scene:

[Image: diagram of the scene setup]

 

I have a sphere in the middle with a radius of 4 that emits light. Around it are four spheres with a radius of 1.

 

Code

A path cannot bounce off a surface more than 5 times. A path also terminates when a ray does not intersect any object in the world. The last way a path can terminate is by returning the emitted color of the material we just hit; this happens on 20% of the hits.

Color TraceRay(const Ray& ray, unsigned depth) {
	const unsigned maxDepth = 5;
	if (depth > maxDepth)
		return Color(0.f, 0.f, 0.f);

	// Find the nearest intersection; terminate the path on a miss.
	float t;
	Shape* shape = NULL;
	if (!world.Intersect(ray, t, &shape))
		return Color(0.f, 0.f, 0.f);

	Point p = ray(t);
	Normal n = shape->GetNormal(p);

	// Russian roulette: with probability pEmit, terminate and return the
	// emitted color; otherwise bounce, compensating both branches.
	const float pEmit = 0.2f;
	if (urd(mt) < pEmit) {
		return shape->emittance * (1.f / pEmit);
	}
	else {
		Vector newDir = RandomDirection(n);
		Ray newRay(p, newDir, 0.001f);
		return TraceRay(newRay, depth+1) * Dot(n, newDir) * (1.f / (1.f - pEmit));
	}
}


 

When bouncing off a surface, a random new direction in the same hemisphere as the surface normal must be generated. I generate three random floats, which form a vector v. I normalize this vector and check whether it is in the same hemisphere as the surface normal. If so, I return v; if not, I flip v and return it.

Vector RandomDirection(const Normal& n) {
	Vector v(urd(mt), urd(mt), urd(mt));
	Normalize(v);
	return Dot(v, n) < 0.f ? -v : v;
}


 

After every pixel has been sampled, I present the results so far. The function below is called 500 times to take 500 samples per pixel. All sampled colors are summed up and divided by the number of them for the final resulting color.

void TraceRays(unsigned maxIterations, sf::Texture& texture) {
	// Accumulate one sample per pixel into the film.
	for (unsigned x = 0; x < camera.film.GetWidth(); x++) {
		for (unsigned y = 0; y < camera.film.GetHeight(); y++) {
			Ray ray = camera.GetRay(x, y);
			Color c = camera.film.GetPixel(x, y);
			Color l = TraceRay(ray, 0);
			camera.film.SetPixel(x, y, l + c);
		}
	}

	// Average the accumulated samples and present the image so far.
	ClearImage();
	for (unsigned x = 0; x < camera.film.GetWidth(); x++) {
		for (unsigned y = 0; y < camera.film.GetHeight(); y++) {
			Color c = camera.film.GetPixel(x,y);
			c /= maxIterations;
			image.setPixel(x, y, c.ToSFMLColor());
		}
	}
	texture.update(image);
}


 

Results

The light emitting sphere is clearly visible. You can also see sphere D being slightly lit in the lower right corner.

 

However, none of the other spheres are being lit. I would expect at least a few of the paths that bounce off spheres A and B to head in the direction of the light-emitting sphere, brightening those pixels.

 

[Image: render — only the emitting sphere is clearly visible, with sphere D faintly lit]

 

 

Questions

I'm having a hard time debugging things pixel by pixel. I'm hoping someone here might be able to make an educated guess about what I'm doing wrong, either by seeing the resulting image or browsing through the above code.

 

Any help would be greatly appreciated!




#2 Bacterius   Crossbones+   -  Reputation: 8516


Posted 03 January 2014 - 08:37 AM

One major problem is your RandomDirection function. It turns out you can't just generate three random floats in the unit cube and call it a day; a hemisphere has a different distribution than a unit cube. Your RandomDirection function is very biased: it heavily favors rays toward the corners of the unit cube, and doesn't even appear to reach hemisphere directions with negative components, which would explain why the other spheres aren't getting any light - the function *never* samples rays that point towards the light from their position. One possible fix is:

Vector RandomDirection(const Normal &n, const Vector &t, const Vector &b) {
    // u1, u2 are uniform random variables in [0..1)
    const float u1 = urd(mt);
    const float u2 = urd(mt);

    const float r = sqrtf(1.0f - u1 * u1);
    const float phi = 2.0f * M_PI * u2;

    // assuming Y is up - if not, "u1" is the "up" coordinate
    Vector v = Vector(cosf(phi) * r, u1, sinf(phi) * r); // uniform in normal space

    return t * v.x + n * v.y + b * v.z; // rotate to align with the normal
}

Where t and b are the tangent and bitangent vectors at the surface (their rotation does not matter, since a hemisphere is isotropic about the central axis; all that matters is that t, b, and n form an orthonormal basis with n being the "up" axis). You can calculate those easily through a few cross products. An alternative option is to generate a random ray in the unit sphere and then "fold" it about the normal plane if it falls behind it (the distribution remains the same, though I am not sure it is any cheaper than just calculating t and b). Yet another, more efficient solution is to generate cosine-weighted rays, which also saves you from having to multiply your reflectance with the cosine term (since the ray distribution already factors it in). You can read more about that here.
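For reference, a minimal sketch of that cosine-weighted variant in the same style (Malley's method: sample a unit disk, project up onto the hemisphere). It reuses the urd(mt) generator and the t/b/n basis from above; treat it as an illustration, not drop-in code:

Vector CosineWeightedDirection(const Normal& n, const Vector& t, const Vector& b) {
    const float u1 = urd(mt); // uniform in [0..1)
    const float u2 = urd(mt);

    const float r = sqrtf(u1);         // radius of a uniform sample on the unit disk
    const float phi = 2.f * M_PI * u2;

    // Project the disk sample up onto the hemisphere (Y is "up" in normal space).
    Vector v(cosf(phi) * r, sqrtf(1.f - u1), sinf(phi) * r);

    // pdf(v) = cos(theta) / pi, so the cosine term cancels out of the estimator.
    return t * v.x + n * v.y + b * v.z;
}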

 

Also check that your normals are the right way around. For instance, make sure the ground's normal is actually pointing upwards. It's not getting any light, so maybe it is backwards (causing exitant rays to point downwards and repeatedly self-intersect until they reach the path depth limit). If it's still broken after you check all that, try removing the Russian roulette code for now and go for a naive integrator, to see whether it works then and isolate the problem. Also, this is not important for now, but it is recommended to only enable Russian roulette after the ray has bounced a few times, otherwise you get rather high variance in the first couple of bounces (as evidenced by the noisy light source in your render). A sketch of what that delayed roulette could look like follows below.
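This keeps the same structure as the TraceRay from the first post; minRRDepth is an illustrative choice, not from the original code:

Color TraceRay(const Ray& ray, unsigned depth) {
    // ... depth limit and intersection code as before ...

    // Guarantee the first few bounces before starting Russian roulette.
    const unsigned minRRDepth = 3;
    if (depth < minRRDepth) {
        // No roulette yet: account for the emitted light and the bounce directly.
        Vector newDir = RandomDirection(n);
        Ray newRay(p, newDir, 0.001f);
        return shape->emittance + TraceRay(newRay, depth + 1) * Dot(n, newDir);
    }

    // From here on, the original roulette logic, unchanged.
    const float pEmit = 0.2f;
    if (urd(mt) < pEmit)
        return shape->emittance * (1.f / pEmit);

    Vector newDir = RandomDirection(n);
    Ray newRay(p, newDir, 0.001f);
    return TraceRay(newRay, depth + 1) * Dot(n, newDir) * (1.f / (1.f - pEmit));
}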

 

EDIT: heh, I just realized you don't have a ground plane. I'd suggest adding one for debugging purposes; it will let you see more of what's going on (black images are never helpful). Here's ray-plane intersection code if you need it, where (o, d) is the ray, n is the plane's normal, and p is any point on the plane. Or you can emulate it with a huge sphere; that works too.

bool ray_plane(Vector o, Vector d, Normal n, Vector p, float *dist)
{
    // Assumes d is not parallel to the plane (Dot(d, n) != 0).
    *dist = Dot(p - o, n) / Dot(d, n);
    return *dist > 0;
}

Edited by Bacterius, 03 January 2014 - 08:53 AM.



#3 Arjan B   Members   -  Reputation: 621


Posted 03 January 2014 - 10:02 AM

Thanks for the reply!

 

After you described my random direction function as generating a vector in the unit cube, it makes sense that it is far from uniform. I did not expect it to have this much of an impact, though.

 

This is the new version of the random direction function:

Vector RandomDirection(const Normal& n) {
	Vector vn(n.x, n.y, n.z);
	Vector t, b;
	CoordinateSystem(vn, &t, &b);

	float r = (float)urd(mt);
	float phi = (float)urd(mt) * 2.f * PI;

	Vector v(cosf(phi) * r, r, sinf(phi) * r);
	return t * v.x + vn * v.y + b * v.z;
}

CoordinateSystem() creates three orthonormal vectors, given one of the vectors. I think the above is well-copied from your suggestion. ^^
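A cross-product construction along the lines Bacterius mentioned might look like the sketch below (a plausible shape for CoordinateSystem, assuming Normalize and Cross helpers exist in the math library; not necessarily the poster's actual implementation):

void CoordinateSystem(const Vector& v1, Vector* v2, Vector* v3) {
    // Zero out the smaller component before taking the perpendicular,
    // to avoid a degenerate (near-zero) result.
    if (fabsf(v1.x) > fabsf(v1.y))
        *v2 = Normalize(Vector(-v1.z, 0.f, v1.x));
    else
        *v2 = Normalize(Vector(0.f, v1.z, -v1.y));
    *v3 = Cross(v1, *v2); // completes the orthonormal basis
}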

 

I removed the Russian Roulette part, leaving me with this:

Color TraceRay(const Ray& ray, unsigned depth) {
	const unsigned maxDepth = 5;
	if (depth > maxDepth)
		return Color(0.f, 0.f, 0.f);

	float t;
	Shape* shape = NULL;
	if (!world.Intersect(ray, t, &shape))
		return Color(0.f, 0.f, 0.f);

	Point p = ray(t);
	Normal n = shape->GetNormal(p);

	if (shape->emittance == Color(0.f, 0.f, 0.f)) {
		Vector newDir = RandomDirection(n);
		Ray newRay(p, newDir, 0.001f);
		return TraceRay(newRay, depth+1) * Dot(n, newDir);
	}
	else {
		return shape->emittance;
	}
}

Since I had already added a triangle shape, I added a floor in the form of a large triangle. The normal of this surface turns out to be what I expect: (0, 1, 0). When shooting a ray aimed at the light sphere's center, the normal at the hit point is the reverse of the ray direction, which should be correct. Normalize(pointOfIntersection - centerOfSphere) also checks out, so I'm fairly confident my normals are correct.

 

This is the result when I place the floor triangle a little below all the spheres:

[Image: render with the floor triangle below the spheres — completely black]

Nothing seems to be lit.

 

However, when I place the triangle through the centers of all the spheres, this is what I get:

[Image: render with the floor triangle through the centers of the spheres]

 

Now the triangle gets lit slightly, as do three of the spheres. There are also some weird artefacts that I'm not sure how to explain.


Edited by Arjan B, 03 January 2014 - 10:03 AM.


#4 Arjan B   Members   -  Reputation: 621


Posted 03 January 2014 - 10:11 AM

For debugging purposes, I tried just returning a color.

 

EDIT: When vertically flipping the image, the shapes and perspective suddenly seemed correct.

 

EDIT2: These are the most current results, which still seem wrong:

[Images: most recent debug renders — still look wrong]


Edited by Arjan B, 03 January 2014 - 10:49 AM.


#5 koiava   Members   -  Reputation: 99


Posted 03 January 2014 - 02:20 PM

There are two ways to generate and weight reflected rays.
1. Generate a uniform random vector in the hemisphere and multiply it by the BRDF value, which is effectively the probability of sampling that specific direction. This way you can sample any BRDF (it must be non-zero) and weight the sampled directions so as to approximate a different distribution; this technique is used in Resampled Importance Sampling and is usually applied to difficult BRDFs that have no closed-form sampling procedure.
2. The more commonly used technique is to generate samples directly with a distribution proportional to the BRDF. I think this one is more intuitive.

Your code does not mention the surface color at all. The surface color (along with any specular coefficient) must be combined with the recursively obtained color. I think this is the main problem.

 


Edited by koiava, 03 January 2014 - 02:27 PM.


#6 koiava   Members   -  Reputation: 99


Posted 03 January 2014 - 02:33 PM

like this:

return surfaceColor * TraceRay(sampleBRDF(hitInfo), depth+1);



#7 Arjan B   Members   -  Reputation: 621


Posted 03 January 2014 - 09:05 PM

By multiplying colors with each other, I assume multiplying their r, g and b components is meant - for colors defined by r, g and b, of course. ^^

Not multiplying the result of my ray trace by anything would give the same result as multiplying it by a white surface color, right?

So I'm not sure that is the problem.

 

My understanding of the definition or responsibilities of a BRDF is still pretty vague. I realize that it describes some attributes of the material of whatever you hit, but I'm not sure about which attributes specifically. Does it provide you with a new sample ray and the probability/weight of that ray? Does it provide the color of the material? So far, it seemed like a part of the rendering equation, dependent on the incoming and outgoing rays. For perfectly diffuse materials, which is what I've been going for so far, this would just be a constant?

 

Also, if I need to assign a weight to a randomly generated ray, I'm not sure how to go about this. Since, in theory, there are infinitely many vectors I could generate within a hemisphere, the probability of one specific vector is equal to 0. I would be able to think about small areas on the hemisphere having a small probability of being chosen, but is that the way I should go?
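(For reference, the usual resolution to this: one works with a probability *density* p(ω) over the hemisphere rather than the probability of a single vector, and weights each sample by f_r · cosθ / p(ω). With uniform hemisphere sampling, p(ω) = 1/(2π), and a perfectly diffuse BRDF is indeed a constant, f_r = albedo/π, so the per-bounce weight works out as sketched below; "albedo" is an illustrative name, not a member of the poster's Shape class.)

// Monte Carlo weight for one bounce with uniform hemisphere sampling:
//   f_r    = albedo / pi     (Lambertian BRDF, constant)
//   pdf    = 1 / (2 * pi)    (uniform over the hemisphere)
//   weight = f_r * cos(theta) / pdf = 2 * albedo * cos(theta)
Color weight = shape->albedo * (2.f * Dot(n, newDir));
return TraceRay(newRay, depth + 1) * weight;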


Edited by Arjan B, 04 January 2014 - 04:03 AM.


#8 Arjan B   Members   -  Reputation: 621


Posted 07 January 2014 - 05:00 AM

I had a tool I wrote for school lying around that quickly lets me visualize points, so I generated 500 vectors with RandomDirection for a normal of (0, 1, 0) and viewed them as points. Then I tried normalizing them; the result is shown on the right:

 

[Images: the 500 generated directions as points — raw on the left, normalized on the right]

(The lowest point was added as a reference for the origin)

 

 

EDIT:

[Image: points generated after fixing RandomDirection]

 

So now I'm generating directions uniformly at random correctly. I've also added multiplying the result of a ray by the surface color. However, it still does not look like I think it should. For the image on the right, I also multiplied the color by a constant of 5.

[Images: renders with surface color applied — left as-is, right brightened 5x]

 

I'm kind of getting lost right now... Am I correct that the BRDF is constant for perfectly diffuse materials? Does anyone have more suggestions about what I might be doing wrong? Those horizontal green lines seem wrong. And why would there be some sort of path of light between the lower two spheres and the light-emitting sphere?

 

This is the same scene, but with the light-emitting sphere above the rest and with a 10,000 times larger radius:

[Image: render with the huge emitting sphere overhead]


Edited by Arjan B, 07 January 2014 - 08:58 AM.


#9 Arjan B   Members   -  Reputation: 621


Posted 09 January 2014 - 08:17 AM

I was able to fix it!

 

In my intersection function for the "world" I had an output parameter for the distance to the closest intersection, but I never saved that distance in the output parameter. This fixed the weird pattern you can see in the previous post.
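For illustration, the fixed nearest-hit loop might look like the sketch below; the container and member names are assumptions, not the actual World code:

bool World::Intersect(const Ray& ray, float& tHit, Shape** hitShape) {
    bool hit = false;
    tHit = std::numeric_limits<float>::max(); // requires <limits>
    for (Shape* s : shapes) {
        float t;
        if (s->Intersect(ray, t) && t < tHit) {
            tHit = t;        // the previously missing step: save the closest distance
            *hitShape = s;
            hit = true;
        }
    }
    return hit;
}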

 

I also got horizontal black lines in the image, because my intersection functions did not yet take rounding errors into account. Now I use the ray's minimum t value to decide whether an intersection counts as valid.
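A sketch of the kind of validity check described, using the minimum t that the Ray constructor in the first post already takes as its third argument (0.001f); SolveQuadratic and the mint/maxt members are assumed names, not the actual code:

bool Sphere::Intersect(const Ray& ray, float& tHit) const {
    float t0, t1; // roots of the ray/sphere quadratic, t0 <= t1
    if (!SolveQuadratic(ray, t0, t1)) // hypothetical helper
        return false;
    // Reject hits closer than the epsilon offset to avoid
    // self-intersection caused by floating-point rounding.
    float t = (t0 >= ray.mint) ? t0 : t1;
    if (t < ray.mint || t > ray.maxt)
        return false;
    tHit = t;
    return true;
}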

 

[Image: final, corrected render]

 

Thanks for the help, everyone!


Edited by Arjan B, 09 January 2014 - 08:21 AM.


#10 Krypt0n   Crossbones+   -  Reputation: 2502


Posted 09 January 2014 - 10:27 AM

Grats on your newborn path tracer!


Edited by Krypt0n, 09 January 2014 - 10:28 AM.




